If the technology behind ThisPersonDoesNotExist.com is any indication, AI is going to disrupt the visual sciences. A few days ago, The New York Times wondered if ethical AI is even possible. That's not a rhetorical question given AI's ability to generate fake visuals as persuasive as those on ThisPersonDoesNotExist.com.
Just this past summer, we learned that DeepMind’s AI agents were capable of exceeding “human-level” visual gameplay.
That was followed in the fall by the launch of an automated labeling service for Amazon's SageMaker machine learning tool, which promises to greatly speed up the labeling of large datasets.
This week, the folks behind the Halide photo app launched Spectre, an app that uses AI to create stunning long exposures. Google, meanwhile, is adding a spelling checker to G Suite that was developed using machine learning.
ThisPersonDoesNotExist.com was created by Uber software engineer Philip Wang and uses research by Nvidia to create an endless stream of simulated portraits. The algorithm was refined using a vast dataset of real images, and it uses a generative adversarial network (GAN), a type of neural network, to produce new examples of prototypical people.
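To make the adversarial idea concrete, here is a minimal toy sketch of a GAN in pure Python: a "generator" tries to produce numbers that look like samples from a real distribution, while a "discriminator" tries to tell real from fake, and the two are trained against each other. This is only a hypothetical 1-D illustration of the concept; Nvidia's StyleGAN uses deep convolutional networks and many additional techniques, and all names and parameters below are invented for the example.

```python
import math
import random
import statistics

random.seed(0)

def sigmoid(u):
    # Clamp the input to avoid math.exp overflow.
    return 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, u))))

# "Real data": samples from a Gaussian N(4, 0.5) stand in for real images.
def real_sample():
    return random.gauss(4.0, 0.5)

# Generator: x = a*z + b maps noise z ~ N(0, 1) to a sample.
a, b = 1.0, 0.0
# Discriminator: D(x) = sigmoid(w*x + c) estimates the probability x is real.
w, c = 0.1, 0.0

lr, batch = 0.02, 16
for step in range(3000):
    # --- Discriminator update: push D(real) toward 1 and D(fake) toward 0 ---
    gw = gc = 0.0
    for _ in range(batch):
        xr = real_sample()
        du = -(1.0 - sigmoid(w * xr + c))   # gradient of -log D(xr) w.r.t. pre-activation
        gw += du * xr; gc += du
        z = random.gauss(0.0, 1.0)
        xf = a * z + b
        du = sigmoid(w * xf + c)            # gradient of -log(1 - D(xf))
        gw += du * xf; gc += du
    w -= lr * gw / batch; c -= lr * gc / batch

    # --- Generator update: push D(fake) toward 1 (non-saturating GAN loss) ---
    ga = gb = 0.0
    for _ in range(batch):
        z = random.gauss(0.0, 1.0)
        xf = a * z + b
        du = -(1.0 - sigmoid(w * xf + c))   # gradient of -log D(xf)
        ga += du * w * z; gb += du * w
    a -= lr * ga / batch; b -= lr * gb / batch

# After training, generated samples should drift toward the real mean of 4.
fake_mean = statistics.mean(a * random.gauss(0.0, 1.0) + b for _ in range(1000))
print(round(fake_mean, 2))
```

The same tug-of-war, scaled up to millions of parameters and image-sized outputs, is what lets StyleGAN produce faces that never existed.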
In a statement to Motherboard, Wang said: “Most people do not understand how good AI will be at synthesizing images in the future.” In a January 13, 2019 episode of 60 Minutes, Kai-Fu Lee issued a similar warning: “[AI] will change the world more than anything in the history of mankind, more than electricity. I believe most people have no idea [about AI] and many people have the wrong idea.”
The Verge reports that the underlying AI framework powering the site was originally invented by researcher Ian Goodfellow. Nvidia’s version of the algorithm, dubbed StyleGAN, was recently released as open source and has proven to be very flexible. Although this model was used to generate human faces, it can, in theory, mimic any source.
Somewhere, someone is using StyleGAN to generate deep-fake celebrities. You can’t say that AI didn’t warn you. 😉