Maryland Today

Produced by the Office of Marketing and Communications

Research

When Seeing Is Not Believing

A UMD Researcher Is Developing Algorithms to Detect Computer-Generated Forgeries, Digital Deepfakery

By Aadit Tambe M. Jour ’22


Photo by Stephanie S. Cordle

A lifetime spent scrutinizing landscapes or other scenes so he can render them in paintings helps to inspire Yaser Yacoob, an associate research scientist at UMD’s Institute for Advanced Computer Studies, in his studies of computer vision, including work on computer algorithms to detect deepfake videos.

As a child, Yaser Yacoob loved to take portraits of people and recreate them in watercolor. He’s still painting decades later, and his art inspires the computer scientist’s work creating algorithms able to detect fakes of a far more sinister nature. 

Yacoob, an associate research scientist at UMD’s Institute for Advanced Computer Studies, is developing computer software to take on the rising threat of deepfake videos, which are increasingly capable of hoodwinking even sophisticated media consumers with digitally doctored scenes.

“We live in a world where there is information—visual information, in particular—that may not be accurate and truthful,” Yacoob said. “As a result of that, it can affect society significantly. Using the technology (available) today, people can be easily fooled.”

The most recent high-profile example of deepfake technology—an uncannily realistic video of actor Tom Cruise seemingly performing a magic trick that garnered more than 11 million views on TikTok—has raised alarm about artificial intelligence and the exploitation of social media platforms.

People are used to believing their eyes, he said. But one of the worrying effects of rapid advances in the field of artificial intelligence is that computers can now create fake identities online that can present themselves to the world as actual humans. 

It’s not hopeless, though. Deepfake videos have certain attributes that allow researchers to distinguish them from real media. When algorithms augment reality, they leave digital fingerprints. Yacoob, along with researchers in computer science, is developing software for computers to automatically identify these “weaknesses” to help flag disinformation. 

Yacoob’s research is focused on computer vision, which deals with how computers acquire and develop the ability to process and analyze media samples such as images and video. His algorithms look for distortions in motion, color and lighting that can flag a possible forgery.
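As a rough illustration of the kind of cue such a detector might use (a minimal sketch, not Yacoob's actual software), the snippet below flags frames whose face-region color statistics jump abruptly between consecutive frames, the sort of lighting or color inconsistency that splicing a synthetic face into real footage can introduce. The face_crops input and the z-score threshold are assumptions for this example; in practice the crops would come from a face detector running upstream.

```python
import numpy as np

def flag_color_jumps(face_crops, z_thresh=3.0):
    """Flag frames whose face-region mean color shifts abruptly.

    face_crops: list of HxWx3 uint8 arrays (one face crop per frame),
    assumed to come from an upstream face detector (not shown here).
    Returns indices of frames whose frame-to-frame color change is an
    outlier, a crude stand-in for lighting/color-consistency checks.
    """
    # Mean RGB color of each face crop.
    means = np.array([crop.reshape(-1, 3).mean(axis=0) for crop in face_crops])
    # Size of the color change between consecutive frames.
    deltas = np.linalg.norm(np.diff(means, axis=0), axis=1)
    # Standardize the changes and flag statistical outliers.
    z = (deltas - deltas.mean()) / (deltas.std() + 1e-8)
    return [i + 1 for i, zi in enumerate(z) if zi > z_thresh]
```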

“The focus is to try to identify when they are real and not real—and then, more importantly, explain why they’re not real and figure out the intent,” he said. “What could be behind the person or entity that is using this image?”

An algorithm might look at a celebrity’s smile, for instance, for subtleties that give away a fake.

“We look at speed and acceleration in mouth movement as a cue, because the (computer-generated) programs can’t replicate the natural attributes of the motion of the mouth,” he said.
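To make that idea concrete, here is a minimal sketch (not the team's actual code) of how the speed and acceleration of mouth movement could be measured from per-frame mouth landmarks; the landmark array, frame rate and the simple notion of "opening" used here are assumptions for illustration only.

```python
import numpy as np

def mouth_motion_features(mouth_landmarks, fps=30.0):
    """Compute speed and acceleration of mouth opening over time.

    mouth_landmarks: array of shape (num_frames, num_points, 2) holding 2-D
    mouth landmark coordinates per frame, assumed to come from a facial
    landmark detector (not shown). Returns per-frame opening, speed and
    acceleration.
    """
    lm = np.asarray(mouth_landmarks, dtype=float)
    # Mouth "opening": vertical spread of the landmark cloud in each frame.
    opening = lm[:, :, 1].max(axis=1) - lm[:, :, 1].min(axis=1)
    speed = np.gradient(opening) * fps   # first derivative, pixels per second
    accel = np.gradient(speed) * fps     # second derivative
    return opening, speed, accel
```

A detector could then compare the distribution of these speed and acceleration values against statistics measured on genuine footage and flag clips whose mouth motion falls outside the natural range.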

Yacoob’s team is working on a project to flag digital tomfoolery in conjunction with the Defense Advanced Research Projects Agency, an arm of the U.S. Department of Defense, which focuses on developing breakthrough technologies to bolster national security.

Disinformation campaigns launched by other countries—fueled by artificial intelligence—have been at the center of a national debate recently, and because of continuing advances in a branch of AI known as machine learning, they’re coming at us too fast for human fact-checking, said Abhinav Shrivastava, an assistant professor of computer science who is working with Yacoob on the DARPA project.

Last year, Shrivastava created an algorithm to fact-check COVID-19-related news articles, making sure they relied on information from legitimate news sources. If an article was accompanied by a photograph, the algorithm reverse-searched the image to confirm it came from a reliable source.
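One common way to implement that kind of check (offered here only as a hedged sketch, not as a description of Shrivastava's algorithm) is to compare a perceptual hash of the article's photograph against hashes of images published by trusted outlets; the hash index and distance threshold below are hypothetical.

```python
import numpy as np
from PIL import Image

def average_hash(path, hash_size=8):
    """Perceptual 'average hash': downscale, grayscale, threshold at the mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = np.asarray(img, dtype=float)
    return (pixels > pixels.mean()).flatten()

def closest_known_source(query_path, known_hashes, max_distance=10):
    """Return the known source whose image hash is nearest the query, if close enough.

    known_hashes: dict mapping a source label to a precomputed hash
    (a hypothetical index built from images published by trusted outlets).
    """
    q = average_hash(query_path)
    label, h = min(known_hashes.items(), key=lambda kv: int((q != kv[1]).sum()))
    return label if int((q != h).sum()) <= max_distance else None
```

Real systems rely on far more robust image fingerprints and much larger indexes, but the principle of tracing an image back to a known source is the same.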

“We’re going up against an adversary who will be able to flood the internet with misinformation, or fake news, which can eventually lead to a threat to national security or loss of life,” Shrivastava said. “Our job is to build floodgates to stop a lot of that false information.”

Decades of working with watercolors and oil paint have helped Yacoob develop an eye for spotting deepfakes. He draws artistic inspiration from Vincent van Gogh, whose vivid colors and bold brushstrokes portrayed larger-than-life ideas.

“His work goes beyond reality—van Gogh wasn’t a photorealistic painter—he was an original person who had his own way of perceiving the world and displaying it for us,” Yacoob said. “Art can be limitless; it doesn’t have to be grounded. And as a result of that, when I look at science, I think outside the box, which widens the scope of my research.”

