[This Gizmodo story is widely cited in coverage of a new tool developed by Disney Research that can more easily, quickly and economically create illusions that age and ‘de-age’ actors. Coverage from Nerdist provides this succinct summary:
“Similar to the system used in The Irishman, which Netflix and ILM developed, this system doesn’t require an actor’s face to be covered in tracking dots. It also doesn’t require visual effects artists to spend weeks erasing (or adding) wrinkles in every frame. The video [in the original story and on YouTube] shows off the simple interface, where all you have to do is enter the actor’s age and how old you want them to look.”
It also points out that aging and de-aging is being applied to voices:
“Recent Star Wars shows also use voice de-aging technology. Artificial intelligence program Respeecher made Luke Skywalker in The Mandalorian sound like Mark Hamill in the original trilogy. James Earl Jones recently retired, but his iconic Darth Vader performance lives on using the same AI in Obi-Wan Kenobi and future Star Wars properties.”
Follow the links in the story below for more information and see the Disney Research YouTube channel for more demonstrations and insights. –Matthew]
[Image: Screenshot from Disney Research showing a woman at her current age and older. Source: Gigazine]
Disney Made a Movie Quality AI Tool That Automatically Makes Actors Look Younger (or Older)
With just a few clicks, actors can look younger or older without the need for expensive visual effects.
By Andrew Liszewski
November 30, 2022
Further demonstrating the power of artificial intelligence when it comes to photorealistically altering footage, researchers from Disney have revealed a new aging/de-aging tool that can make an actor look convincingly older or younger, without the need for weeks of complex and expensive visual effects work.
When watching a blockbuster movie like 2018’s Ant-Man and the Wasp, most viewers can easily spot the work of the many visual effects studios that contribute to these films, with their flashy moments when Ant-Man shrinks or grows to gigantic proportions. But it’s often the subtler VFX work that’s hardest to make photorealistic, like the shots featuring younger versions of actors Michelle Pfeiffer and Michael Douglas. To get results like those seen in the movie, talented artists either need to spend weeks erasing wrinkles and other telltale signs of age from an actor’s face, or replace it entirely with a computer-generated double.
Visual effects are a powerful filmmaking tool, but there are plenty of reasons to find ways to make them easier to create: from lightening the load on already overworked and underpaid artists, to making the tools accessible to filmmakers not working with immense Hollywood-sized budgets. Of course, even for major studios, there’s a profit motive in being able to automate this kind of work, too.
That’s why companies like Disney invest in research to help advance the art of visual effects, but in recent years these researchers have also been exploring how artificial intelligence can simplify VFX work. Two years ago, Disney Research Studios developed AI-powered tools that could generate face swap videos with enough quality and resolution to be used for professional filmmaking (instead of as questionably low-res GIFs shared around the internet). This year, the researchers are demonstrating a new tool that leverages AI tricks to make actors look older or younger, minus the weeks of work usually needed to perfect those kinds of shots.
Using neural networks and machine learning to age or de-age a person has been tried before, and while the results can be convincing when applied to still images, they haven’t produced photorealistic results on moving video: temporal artifacts appear and disappear from frame to frame, and the person’s appearance occasionally becomes unrecognizable as the altered video plays.
To make an age-altering AI tool that was ready for the demands of Hollywood and flexible enough to work on moving footage or shots where an actor isn’t always looking directly at the camera, Disney’s researchers, as detailed in a recently published paper, first created a database of thousands of randomly generated synthetic faces. Existing machine learning aging tools were then used to age and de-age these thousands of non-existent test subjects, and those results were then used to train a new neural network called FRAN (face re-aging network).
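The training setup described above — synthesize random identities, re-age each one with existing still-image tools, and use the resulting pairs as supervision — can be sketched roughly as follows. This is a hypothetical illustration only: the helper functions `synthesize_face` and `reage_with_existing_tool` are placeholders standing in for the paper’s synthetic-face generator and pre-existing aging models, not real APIs.

```python
# Hypothetical sketch of the training-data pipeline: synthetic identities are
# re-aged by an existing tool to create supervised pairs for a new network.
# All function names and the toy "aging" effect are illustrative placeholders.
import random

def synthesize_face(seed):
    """Stand-in for a generator of random synthetic identities."""
    rng = random.Random(seed)
    return {"id": seed, "pixels": [rng.randint(0, 255) for _ in range(16)]}

def reage_with_existing_tool(face, source_age, target_age):
    """Stand-in for an existing still-image aging/de-aging model."""
    shift = (target_age - source_age) // 4  # toy brightness shift as "aging"
    return {**face,
            "pixels": [max(0, min(255, p - shift)) for p in face["pixels"]]}

def build_training_pairs(num_identities, ages=(20, 40, 60, 80)):
    """Create (input face, input age, target age, target face) tuples
    for training a re-aging network on synthetic subjects."""
    pairs = []
    for seed in range(num_identities):
        base = synthesize_face(seed)
        for src in ages:
            src_face = reage_with_existing_tool(base, ages[0], src)
            for tgt in ages:
                if tgt != src:
                    tgt_face = reage_with_existing_tool(base, ages[0], tgt)
                    pairs.append((src_face, src, tgt, tgt_face))
    return pairs

pairs = build_training_pairs(3)
print(len(pairs))  # 3 identities x 4 source ages x 3 target ages = 36
```

The key idea this illustrates is that no real photographs of the same person at different ages are needed: the supervision comes entirely from synthetic subjects re-aged by earlier tools.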
When FRAN is fed an input headshot, instead of generating an altered headshot, it predicts what parts of the face would be altered by age, such as the addition or removal of wrinkles, and those results are then layered over the original face as an extra channel of added visual information. This approach accurately preserves the performer’s appearance and identity, even when their head is moving, when they look around, or when the lighting conditions in a shot change over time. It also allows the AI-generated changes to be adjusted and tweaked by an artist, which is an important part of VFX work: making the alterations perfectly blend back into a shot so the changes are invisible to an audience.
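The delta-prediction idea above — predicting only the per-pixel *changes* and layering them over the original frame, rather than generating a whole new face — can be sketched with a minimal compositing function. This is a toy illustration under stated assumptions, not Disney’s actual FRAN implementation; the `strength` knob stands in for the kind of artist-adjustable control the article describes.

```python
# Hypothetical sketch of delta compositing: the network predicts only signed
# per-pixel changes (e.g. added wrinkles), which are layered over the
# original frame. Names and values here are illustrative, not FRAN's API.

def composite_delta(frame, delta, strength=1.0):
    """Apply a predicted per-pixel age delta to the original frame.

    frame:    2D list of grayscale values in [0, 255]
    delta:    2D list of signed changes predicted by the network
    strength: artist-adjustable knob (0 = no change, 1 = full effect)
    """
    out = []
    for row_f, row_d in zip(frame, delta):
        out.append([
            min(255, max(0, round(p + strength * d)))
            for p, d in zip(row_f, row_d)
        ])
    return out

# Toy 2x2 "frame" and a delta that darkens one pixel (a new wrinkle).
frame = [[100, 100], [100, 100]]
delta = [[0, -30], [0, 0]]

aged = composite_delta(frame, delta, strength=1.0)
print(aged)   # [[100, 70], [100, 100]]
half = composite_delta(frame, delta, strength=0.5)
print(half)   # [[100, 85], [100, 100]]
```

Because the original pixels pass through untouched wherever the delta is zero, the performer’s identity is preserved by construction, and an artist can dial the effect up or down without re-running the network — which is why this design suits VFX workflows better than wholesale face generation.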