Cybersecurity experts warn about 'deepfake' political images ahead of election

Determining if an image is fake or real is increasingly difficult, experts say

As the 2024 U.S. elections near, cybersecurity experts are sounding the alarm over the growing threat of AI-generated images spreading online, commonly known as "deepfakes."

Experts say such content is making it increasingly difficult for the general public to determine whether a picture is genuine.

"Trying to ascertain what is real and what is not real is getting very, very difficult because of technology changes," said Jon Clay, VP of Threat Intelligence at Trend Micro.

Earlier this week, pop star Taylor Swift took to social media to debunk a deepfake image that depicted her and her fans seemingly endorsing former President Donald Trump in the 2024 presidential election.

Clay said it's critical to stay vigilant in an election year, as manipulated images grow more sophisticated and realistic.

"They want people to vote one way, and they’ll use technology to try and sway that vote," Clay said.

Elon Musk, the current CEO of X, formerly known as Twitter, shared an edited image of Vice President Kamala Harris wearing a communist militant uniform.

Experts said fake images can often be distinguished from real ones by checking for blurriness or inconsistent lighting, and recommended verifying an image's source before trusting it.

In a video, a telltale sign is audio that is out of sync with the footage.

"Don't take the first video you see as being true. Look at the source, look at where it's coming from, how did you see it," Clay told NBC Chicago. "I think it's going to be harder in this election cycle. It'll be even worse in the next election cycle because technology is always going to improve."

Clay warned that deepfakes extend beyond politics and can also appear in financial and romance scams.