Generative AI Won’t Replace Actors, Says Digital Domain CTO – The Hollywood Reporter
Digital Domain CTO Hanno Basse believes that generative AI and machine learning have a lot of potential in visual effects, but “it doesn’t replace the human aspect of acting for actors,” he stated.
“Human viewers want to connect with humans,” he asserted in his keynote speech at the Society of Motion Picture and Television Engineers’ Media Technology Summit on Monday, in which he reviewed VFX studio Digital Domain’s recent work on digital humans, including minutes of footage – hundreds of shots – of actress Tatiana Maslany’s performance as the title character in Marvel’s She-Hulk. But while this and other examples incorporate machine learning in their methodology, “at the heart of every example is the performance of real actors. You get performance. I don’t see that changing.”
The use of generative AI is a thorny topic in SAG-AFTRA’s current strike and contract negotiations, as it was for the WGA, which recently ended its strike and ratified a new contract that includes AI protections.
In his keynote address at Hollywood’s Ray Dolby Ballroom, Basse outlined tools such as DD’s proprietary Charlatan face-swap technology, which incorporates machine learning and was used in the Amazon series Citadel, as well as to create actor-driven performances of figures including JFK, FDR and Winston Churchill. “We can turn anyone into anyone … and you won’t know the difference,” Basse said, while noting that the work isn’t easy and people can’t yet do this from home. “It’s just a matter of time.”
DD’s latest proprietary tools include Charlatan Geo, which can be used to enable markerless motion capture. “It’s very accurate and it’s very fast,” he said, noting that it allows an actor to work without a helmet camera. “It’s easier for actors to shoot that way and it’s more cost-effective.”
Basse believes that generative AI and machine learning can help with other types of VFX work, especially ideation and communication, taking away some of the “boring” work. “Even the most creative people often don’t know what they want,” he said, pointing out that 3D teams can spend weeks or months on prototypes that may never be used in the finished project. “[Now] you can do this in 2D, using generative AI to quickly create 50-100 different [sample] versions. … It makes communication so much easier.”
DD has also developed tools to simulate how fabric (clothing) moves on the human body in certain poses. Basse reported that DC’s Blue Beetle was DD’s first major application of this technique, used in over 300 shots of the film. Without machine learning tools, “it would have been too expensive,” he added.
Looking ahead, Basse also reported during his keynote on the development of a generative AI tool designed by DD to convert “text to animation.” He acknowledged that there is still “much work to be done” in this area. “We believe that using this type of technology, we can quickly accelerate our animation work and take the pain out of it.”
Prototype work shown during the keynote included an autonomous virtual human in development, a digital character (still in the uncanny valley) powered by DD’s Charlatan engine. “If you push her too hard, she can get upset,” Basse said.
The annual SMPTE Media Technology Summit runs through Thursday.