Adobe's new artificial intelligence tool automatically pinpoints Photoshopped faces

The world is increasingly worried about the spread of fake videos and photos, and Adobe, a name synonymous with edited imagery, says it shares those concerns. Today, it is sharing new research, conducted with scientists from UC Berkeley, that uses machine learning to automatically detect when images of faces have been manipulated.

It is the latest sign that the company is devoting more resources to this problem. Last year, its engineers created an AI tool that detects edited media created by splicing, cloning, and removing objects.

The company said it has no immediate plans to turn this latest work into a commercial product, but a spokesperson told The Verge it was just one of many "efforts across Adobe to better detect image, video, audio and document manipulations."

"We are proud of the impact that Photoshop and other creative tools have had on the world, but we also recognize the ethical implications of our technology," the company said in a blog post. "Dummy content is a serious problem and more and more urgent."

The research is designed specifically to spot edits made with Photoshop's Liquify tool, which is commonly used to adjust the shape of faces and alter facial expressions. "This feature's effects can be subtle, which made it an intriguing test case for detecting both drastic and slight changes to faces," said Adobe.

To create the software, the engineers trained a neural network on a database of paired faces, containing images both before and after they had been edited with Liquify.
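For readers curious what training a detector like this might look like in practice, here is a minimal sketch of a binary classifier that distinguishes original from warped face crops. It is not Adobe's actual pipeline: the dataset layout, model choice, and hyperparameters are assumptions for illustration only.

```python
# Hypothetical sketch: train a classifier to flag Liquify-style face warps.
# Dataset paths, backbone, and hyperparameters are illustrative assumptions,
# not Adobe's published method.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Assume an ImageFolder layout with two classes: "original" and "warped".
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("data/faces_train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# A standard CNN backbone with a two-way head: real vs. manipulated.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The paired before/after data is what makes this setup work: every warped image has an unedited counterpart, so the network learns to pick up on the warping artifacts rather than on differences between subjects.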

The resulting algorithm is impressively effective. When asked to identify a sample of edited faces, human volunteers got the right answer 53 percent of the time, while the algorithm was correct 99 percent of the time. The tool can even suggest how to restore a photo to its original, unedited appearance, although the results here are often mixed.

"The idea of ​​a universal magic button" cancel "to cancel image changes is still far from reality," said Adobe researcher Richard Zhang in an article published on the company's blog. "But we live in a world where it is becoming increasingly difficult to trust the digital information we consume, and I look forward to further exploring this area of ​​research."

The researchers said this work is the first of its kind designed to identify these facial edits, and an "important step" toward creating tools that can spot more complex changes, including "body manipulations and photometric edits such as skin smoothing."

Although the research is promising, tools like this are not a quick fix for the harmful effects of manipulated media. As we have seen with the spread of fake news, even if content is obviously false or can be quickly debunked, it will still be shared and embraced on social networks. Knowing that something is fake is only half the battle, but at least it's a start.
