As more and more people live their lives out on social media, posting and sharing pictures and videos of their families, criminals and scammers are using those images to find their next victims.
“All of us probably have at least one image online, some of us many images online, that makes all of us vulnerable to the usage of A.I. to manufacture these dangerous images,” said FBI Special Agent and Public Affairs Officer Johnson.
Johnson said the federal agency issued an alert over the summer after receiving reports from victims, including both minors and adults. The victims told investigators their pictures and videos had been altered into explicit content without their consent.
“An image that the victim has actually created — so now, you might see the criminal go online, pull an image, and use AI to make it appear that that face is associated with this body, and it’s doing something most people would not appreciate being portrayed doing,” she said.
While Johnson couldn’t discuss the number of specific cases reported in the Midwest, the special agent said the fake content is found on social media and pornographic websites, with victims often facing harassment and extortion.
“A lot of the times we are seeing that people are paying these fees so that these images aren’t put out there even if you hadn’t done it,” she said. “It’s very embarrassing. Children might feel like their life is over. It can be a very heavy problem even for adults.”
Johnson told NBC Chicago the FBI is working with other law enforcement agencies to detect AI-generated images.
“Like any new technology, it’s still in process,” she said. “It’s something we’re going to get better at as we go along, but that’s one of the reasons there’s so much difficulty and so much concern about AI right now — there’s no foolproof way to know for sure.”
So what can you do to protect your children? Outreach workers at the Chicago Children’s Advocacy Center said the best thing you can do is to talk to them.
“You want to be able to build that relationship with your kids so they’re willing to share what their online activity is like, who they’re talking with, who they’re connecting with,” said Liz Baudler.
Baudler is an education specialist at the youth service organization on the city’s Near West Side.
“Realistically, I think AI broadens the threat: things that are perfectly innocuous, perfectly safe, become unsafe when they get into the hands of people who want that kind of material,” said Baudler.
Baudler reminds parents to have ongoing conversations with their children, set boundaries around their tablets and devices, and be aware of the content they’re consuming and sharing online before it’s too late.
“I think with all kinds of abuse prevention, the more we’re able to talk about it and get that information out there the more we can modify our behavior to not give people the material that they need to do the things that they want to do,” said Baudler.
Baudler is trying to raise awareness and educate others about online safety as part of the organization’s training around A.I.
“A.I. is definitely here to stay. I would say in a number of outlets I think A.I. can be a tool for so much, but it’s a neutral tool—you can use it for good, you can use it for bad,” said Baudler.
As the FBI continues to investigate sextortion cases and bring criminals to justice, the federal agency is offering safety recommendations: use discretion when posting online, run frequent online searches of your and your children’s information, apply privacy settings on your social media accounts, and consider doing a reverse image search to see if any pictures or videos are circulating online without your knowledge.
“The biggest thing that we know for sure is we can still investigate these as traditional crimes,” Johnson said. “The A.I. aspect is one characteristic and so it really isn’t a large impediment to us going after these criminals anywhere in the world.”