BIG RAPIDS, Mich. (WOOD) – A new trend on TikTok, fueled by artificial intelligence, is causing alarm as individuals mistakenly believe their homes have been invaded, prompting a wave of unnecessary 911 calls.
This unsettling prank involves AI-generated content depicting a “homeless man” entering homes, rummaging through refrigerators, or even lying in beds. The pranksters then send these realistic-looking videos to unsuspecting loved ones, who are often led to believe the scenes are genuine intrusions.
According to The New York Times, police departments across at least four states have been responding to reports of home invasions, only to discover that the “intruder” is an AI-generated fabrication.
Authorities in West Bloomfield, Michigan, near Detroit, have confirmed receiving such reports and have issued warnings about the “AI homeless man prank,” highlighting the strain it places on emergency services.
The Yonkers Police Department in New York expressed concern in a Facebook post, emphasizing the dangers: “Officers respond urgently, with lights and sirens, to what seems to be a real threat. The situation only de-escalates when it becomes clear it was a prank. This not only squanders resources but poses significant safety risks to both responding officers and the families involved.”
Greg Gogolin, a professor and director of cyber security and data science at Ferris State University, remarked on the simplicity of creating such deceptive content. He demonstrated how AI technology can be used to manipulate images within just a few hours.
“This is a natural language processing machine learning program called a face swapping,” Gogolin said.
The program was able to make the images look realistic, taking features from a person’s face and combining them with other images.
Once a technology like this is developed, it often gets used in ways the original creators never intended.
“They share that out or sell it. … It’s dispersed and that’s where the real danger is because people without any technical background can then utilize that the way they wish,” Gogolin said.
In some cases, there are signs you can look for that could indicate an image is AI-generated.
“You might generate something and an arm will be off, the elbows are in the wrong place. It used to be you would often see people with like three arms. A long arm, a long leg, the dynamics were not correct. A lot of that has been corrected or at least drastically improved with the newer versions,” Gogolin said.
Gogolin said investigators and law enforcement also need more advanced training. “There are very few degreed investigators that have a cyber security background, let alone a computer science background, particularly at the local level, even at the state level.”