BEFORESANDAFTERS.COM
Here's some ways one visual effects studio is using machine learning tools in production right now
And it's not only with the dedicated pro VFX tools you might think (it's also with ones originally designed for just social media use).

The topic on the top of so many minds in visual effects right now is artificial intelligence and machine learning. There are, quite simply, new developments every day in the area. But how are all these developments finding their way into VFX usage? befores & afters asked one studio owner during the recent VIEW Conference to find out what they are doing.

Wylie Co. founder and CEO Jake Maymudes started his visual effects studio in 2015. He had previously worked at facilities including The Mill, Digital Domain and ILM. Wylie Co. has in recent times contributed to Dune: Part One and Part Two, Alien: Romulus, Uglies, The Killer, Thor: Love and Thunder, The Last of Us and a host of other projects. The boutique studio works on final VFX, sometimes serving as the in-house VFX team, and commonly on aspects such as postvis.

The biggest change to visual effects that Maymudes has seen in recent times has come with the advent of new artificial intelligence (AI) and machine learning (ML) workflows. The studio has utilized deep learning, neural networks and generative adversarial networks (GANs) for projects. Some of this relates to dedicated VFX tools; other work, as discussed below, was even done with tools intended for just social media use.

In terms of the tools now available, Maymudes is adamant that AI and ML workflows will change (and already are changing) the way labor-intensive tasks like rotoscoping, motion capture and beauty work are done in VFX. "There's so much efficiency to be had by using AI tools," argues Maymudes. "I see it as really the only way to survive right now in VFX, by taking advantage of these efficiencies. I think the whole world's going to change in the next couple of years. I think it'll change dramatically in five. I think it'll change significantly in two. I could be wrong, it could be one."

Wylie Co.
has leapt into this AI/ML world in both small and large ways. On She-Hulk: Attorney at Law, for example, Wylie Co. was utilizing machine learning rotoscoping in 2021 for postvis work on the series. "Back then I wasn't aware of a single other company that was diving into machine learning like we were," says Maymudes. "And now, we've all had that capability for years."

[Image: The blue eyes of the Fremen in Dune: Part Two.]

A much larger way Wylie Co. used machine learning tools was on Dune: Part Two, to tint thousands of Fremen characters' eyes blue. That task involved using training data taken directly from the blue-tinting VFX work the studio had already done on Dune: Part One, and feeding it into Nuke's CopyCat node to help produce rotoscope mattes. Production visual effects supervisor Paul Lambert, who is also Wylie's executive creative director, oversaw the training himself. "He's deep into AI and AI research," notes Maymudes. "He's a technologist at heart."

[You can read more about Wylie Co.'s Fremen blue eyes visual effects in issue #23 of befores & afters magazine.]

Then there's a different kind of approach Wylie Co. has taken, with AI and ML tools that were perhaps not initially intended to be used for high-end visual effects work. The example Maymudes provides here relates to the studio's VFX for Uglies. On that film, visual effects supervisor Janelle Ralla tasked Wylie with a range of beauty work to be done on the characters as part of the Ugly/Pretty story point. Ralla demonstrated a social media app, FaceApp, to Maymudes that she was using to concept the beauty work. The app lets users change their appearance on their smartphones.

[Image: Original frame inside FaceApp.]

"She used this app to generate the images to convey what she wanted to see," explains Maymudes. "The results were really good, even for those concepts. So, I researched it, and it was an AI-based app. It had used a neural network to do the beauty work. And it did it fast."

That was an important consideration for Maymudes.
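Stepping back for a moment to the Dune: Part Two eye-matte training described above: in ML terms it is paired supervision. Frames from Dune: Part One, together with the mattes already produced for them, become training pairs, and the trained model then infers mattes for new frames. CopyCat trains a convolutional network; as a minimal, hedged stand-in for the same idea, the sketch below fits a per-pixel logistic regression on toy data (everything here is illustrative, not Wylie Co.'s actual setup):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_matte_model(frames, mattes, lr=0.5, steps=500):
    """frames: list of (H, W, 3) float arrays; mattes: list of (H, W) arrays of 0/1."""
    X = np.concatenate([f.reshape(-1, 3) for f in frames])
    y = np.concatenate([m.reshape(-1) for m in mattes])
    w, b = np.zeros(3), 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)
        grad = p - y                      # d(cross-entropy)/d(logit)
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

def infer_matte(frame, w, b):
    # Returns a soft matte in [0, 1] per pixel.
    return sigmoid(frame.reshape(-1, 3) @ w + b).reshape(frame.shape[:2])

# Toy "training pair": the ground-truth matte flags pixels where blue
# dominates red + green, standing in for hand-made eye mattes.
rng = np.random.default_rng(0)
frame = rng.random((8, 8, 3))
matte = (frame[..., 2] > frame[..., 0] + frame[..., 1]).astype(float)
w, b = train_matte_model([frame], [matte])
pred = (infer_matte(frame, w, b) > 0.5).astype(float)
print(f"training-pair agreement: {np.mean(pred == matte):.2f}")
```

The data plumbing, not the model, is the point: existing finished shots supply ground truth, and inference extends that work across thousands of new frames.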
The beauty work had to be completed on a limited budget and schedule, meaning the visual effects shots had to be turned around quickly.

[Image: After the FaceApp filter was applied.]

Here's what Wylie Co. did using the app as part of its workflow.

"We downloaded FaceApp, then brought in our plates," discusses Maymudes. "I took the app and I made hero frames with the shots. Then I would take those hero frames into Nuke. I would create a dataset with these hero frames. Then I would train overnight on my Lenovo workstation with my NVIDIA GPUs for 12 hours. I'd come back in the morning, click a couple buttons, apply the inference, and it worked."

[Image: Nuke node graph for the beauty work.]

"We figured out a good workflow for this work through trial and error," adds Maymudes. "You have to be very explicit with what you want to tell these neural networks because it's one-to-one. You're basically saying, 'Please do exactly this.' And if the dataset you're training with is messed up, your results are going to be either really bad or not great, but not perfect, no matter what, because it's so one-to-one. It's so black and white. That's why using FaceApp was great in this regard, because it was so consistent between the hero frames."

What excites Maymudes about this particular use of an AI/ML tool is that it was actually designed for something else: just a fun social media purpose. "But," he says, "it has amazing facial tracking for face effects and gags. I mean, a lot of these tools do now. There's a lot of R&D that has gone into these tools, especially ones relating to your face. Because of that, you can pick and pull little tools here and there to use in visual effects. And if you do that, you can find just insane efficiency. That's why we used it."

[Image: Original frame.]

[Image: Final beauty work.]

"What we do love at our company are tools that make us better artists," continues Maymudes. "We have machine learning tools that do re-timing, and upscaling, and morph cuts, beauty work, matte work.
All these little things that kind of take the grunt work out of it, which is nice. But I don't think machine learning is going to stop there. It's going to transform our industry. I don't actually know where it's going to go, even with how much I research it and think about it. Honestly, I think it's completely unpredictable what visual effects, or the world, will look like in five years. But the stuff you can do now, well, it's good, it's useful. We use it."
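The hero-frame loop Maymudes describes, by the way, generalizes: process a few frames in the external app, pair them with the original plates, train on those pairs, then run inference on the whole shot. CopyCat trains a neural network overnight for this; as a hedged sketch of the same pipeline shape, the toy below fits a single affine color transform by least squares to stand in for the trained model (the "look" function and all numbers are synthetic illustrations):

```python
import numpy as np

def fit_color_transform(plates, processed):
    """Least-squares fit of M (3x4) so that processed_rgb ~ M @ [r, g, b, 1]."""
    X = np.concatenate([p.reshape(-1, 3) for p in plates])
    Y = np.concatenate([q.reshape(-1, 3) for q in processed])
    X1 = np.hstack([X, np.ones((len(X), 1))])   # homogeneous coordinates
    M, *_ = np.linalg.lstsq(X1, Y, rcond=None)  # shape (4, 3)
    return M.T                                  # shape (3, 4)

def apply_transform(frame, M):
    px = frame.reshape(-1, 3)
    X1 = np.hstack([px, np.ones((len(px), 1))])
    return (X1 @ M.T).reshape(frame.shape)

rng = np.random.default_rng(1)
# Three "hero frames" processed by a consistent (here: synthetic) look --
# consistency between pairs is exactly the one-to-one point quoted above.
look = lambda f: f * [0.9, 0.95, 0.8] + [0.05, 0.02, 0.1]
heroes = [rng.random((4, 4, 3)) for _ in range(3)]
M = fit_color_transform(heroes, [look(f) for f in heroes])

# "Inference": apply the learned transform to an unseen frame of the shot.
new_frame = rng.random((4, 4, 3))
err = np.abs(apply_transform(new_frame, M) - look(new_frame)).max()
print(f"max error on unseen frame: {err:.2e}")
```

Real beauty work is far from an affine color map, which is why a trained network is used instead, but the plumbing is the same: consistent pairs in, per-frame inference out, and a messed-up pair poisons the fit.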