Key idea: Even if you think you are good at analysing faces, research shows many people cannot reliably distinguish between photos of real faces and images that have been computer-generated. This is particularly problematic now that computer systems can create realistic-looking photos of people who don’t exist.
Original author and publication date: Manos Tsakiris (The Conversation) – January 23, 2023
Futurizonte Editor’s Note: If “fake” is more real than real, then “fake” is the new reality. (As old as the debate between reality and illusion in early Greek philosophy.)
From the article:
For example, a fake LinkedIn profile with a computer-generated profile picture recently made the news after it successfully connected with US officials and other influential individuals on the networking platform. Counter-intelligence experts even say that spies routinely create phantom profiles with such pictures to home in on foreign targets over social media.
These deepfakes are becoming widespread in everyday culture, which means people should be more aware of how they are being used in marketing, advertising and social media.
The images are also being used for malicious purposes, such as political propaganda, espionage and information warfare.
Making them involves something called a deep neural network, a computer system that mimics the way the brain learns. This is “trained” by exposing it to increasingly large data sets of real faces.
In fact, two deep neural networks are set against each other, competing to produce the most realistic images. As a result, the end products are dubbed GAN images, where GAN stands for Generative Adversarial Networks. The process generates novel images that are statistically indistinguishable from the training images.
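To make the adversarial setup concrete, here is a minimal sketch of a GAN training loop in Python with PyTorch. It pits a generator against a discriminator, as described above, but uses tiny toy tensors in place of a real face dataset; the network sizes, learning rates, and the `real_batch` helper are illustrative assumptions, not the architecture used by the systems discussed in the article.

```python
# Minimal sketch of a Generative Adversarial Network (GAN) training loop.
# Toy 8x8 "images" stand in for a real face dataset; all sizes and
# hyperparameters here are illustrative assumptions only.
import torch
import torch.nn as nn

IMG_SIZE = 8 * 8      # flattened toy image
NOISE_DIM = 16        # size of the random input to the generator

# Generator: maps random noise to a synthetic image.
generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 64), nn.ReLU(),
    nn.Linear(64, IMG_SIZE), nn.Tanh(),
)

# Discriminator: outputs the probability that an image is real.
discriminator = nn.Sequential(
    nn.Linear(IMG_SIZE, 64), nn.LeakyReLU(0.2),
    nn.Linear(64, 1), nn.Sigmoid(),
)

g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

def real_batch(batch_size=32):
    # Stand-in for the "data sets of real faces" mentioned above:
    # random tensors are sampled here instead of real photos.
    return torch.rand(batch_size, IMG_SIZE) * 2 - 1

for step in range(1000):
    real = real_batch()
    noise = torch.randn(real.size(0), NOISE_DIM)
    fake = generator(noise)

    # 1) Train the discriminator to tell real images from generated ones.
    d_opt.zero_grad()
    d_loss = loss_fn(discriminator(real), torch.ones(real.size(0), 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(real.size(0), 1))
    d_loss.backward()
    d_opt.step()

    # 2) Train the generator to fool the discriminator.
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(real.size(0), 1))
    g_loss.backward()
    g_opt.step()
```

The key design point is the competition: the discriminator's loss falls when it separates real from fake, while the generator's loss falls when its outputs are classified as real, which over many iterations pushes the generated images toward being statistically indistinguishable from the training data.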
In our study published in iScience, we showed that a failure to distinguish these artificial faces from the real thing has implications for our online behaviour.
Our research suggests the fake images may erode our trust in others and profoundly change the way we communicate online.