
Behind the face replacement VFX and **those** lasers in the craziest scene from Sonic the Hedgehog 3
beforesandafters.com
befores & afters goes behind the scenes with Rising Sun Pictures.

In Jeff Fowler's Sonic the Hedgehog 3, Jim Carrey plays both the mad scientist Dr. Ivo Robotnik and Ivo's grandfather, Professor Gerald Robotnik. One particularly memorable sequence in which the two characters appear together occurs when they must navigate a laser grid, something they do in spectacular dance style to the tune of The Chemical Brothers' "Galvanize."

A laser dance with two characters played by the same actor brought with it, of course, a number of visual effects challenges. The first was dealing with seeing both characters on screen at the same time. This twinning work required a combination of Carrey completing A and B plates as Gerald and Ivo in different sets of make-up effects for dialogue moments, as well as body doubles and stunt performers standing in for the actor in the more dynamic dance sections of the laser scene.

Then, Rising Sun Pictures, under visual effects supervisor Christoph Zollinger, working with production visual effects supervisor Ged Wright, relied on its proprietary REVIZE™ machine learning and visual effects tools to carry out detailed face replacement work. For the lasers, the visual effects studio turned to effects simulations, creating a dazzling array of red-toned patterns in the room Gerald and Ivo occupy, as well as on their bodies.

**Two Jim Carreys**

The laser dance was filmed primarily with body doubles in Gerald and Ivo make-up effects for the high-energy dance portions of the scene. Carrey himself performed the key dialogue moments and the beats where the action was more static. That meant, depending on the shot, Rising Sun Pictures had to take different approaches to face replacement, that is, replacing a double with the recognizable face of Carrey.

*Director Jeff Fowler and Jim Carrey on the set of Sonic the Hedgehog 3 from Paramount Pictures and Sega.*

"Sometimes there was a face replacement on just one of the characters, say if it was a traditional back-of-the-head-type gag," shares Rising Sun Pictures CG supervisor Mathew Mackereth. "Sometimes it was face replacement on both characters. In nearly all cases, we had face replacement to do the beauty work of putting Jim's face on both Ivo and Gerald. Plus, we also had to do a full 3D matchmove in order to get the suit and the lasers on. Fortunately, the lasers never hit the heads of Ivo and Gerald, so that meant that the difference in topology of, say, Gerald from a stunt double was a problem that we didn't have to deal with."

For the face replacement work itself, Rising Sun Pictures has for several years been developing a face swapping approach it now calls REVIZE™. The overall methodology was to train a machine learning model of Carrey as his two alter egos that could then be composited on top of the doubles. The VFX crew knew immediately this would be tough, since the actor is so recognizable and had already closely inhabited the part of Ivo in two previous films.

"Jim Carrey is probably one of the harder people that you'll have to do this to," comments Rising Sun Pictures machine learning 2D supervisor Robert Beveridge. "He has such an expressive face with such a big range of motion, and everyone knows what he looks like, how he acts, how he moves."

The process began with capturing Carrey in both of the two different make-up effects designs.
Says Beveridge: "It was a combination between a controlled capture session that we treat as a foundation layer, where we capture a wide gamut of things, and then, because we had him in this environment for a large portion of this dance scene, that's where the majority of our primary material came from. He was playing both sides of this, and in some cases we had a B plate, which fed the machine learning process really well. So, we got Jim in this environment, but we also had to recreate him for both of these dance doubles in an extremely complex lighting setup."

"Just him playing the B side of himself gave us a really big pool of data to source from and a really great library that we could reference," adds Beveridge. "In the end, we tallied it up and we had 650,000 images that went in across that sequence. That amount of frames, and the curation that the machine learning team had to do to be able to hit Jim's expression range, which is, I'd say, larger than the average human's, was a real challenge."

One of the trickier aspects of the Ivo/Gerald shots involved facial hair. Gerald has a large mustache and bushy eyebrows. "We had to make decisions on how we shot him," details Beveridge, "and whether he had that on in the capture, whether he didn't, whether we were going to replace all of that, or maintain it from the double. The prosthetics for the doubles were not quite as involved as Jim's full prosthetic, which definitely made the blend points where the mustache sat under the nose tough. All of these little intricacies were different on the two of them. So, it became a hybrid of the two, where having that source in the plate from the dance doubles was super valuable. But we did train for mustache and eyebrows. Fortunately, he didn't have any hair on either of his characters, which definitely helped us out there."

The result of Rising Sun Pictures' trained model was a full head that then needed to be composited onto the body of the performer. This required, firstly, detailed matchmoving, something further complicated by the fact that different performers with different head and body shapes stood in for Ivo or Gerald. Earlier, in the training process, this use of various dance doubles and stunt doubles meant there were, as Beveridge describes, "multiple different targets that we had to allow for. For each new target that you introduce, you have to work out the same relationships and how the two and their features are different. You have to make that calculation for however many you might be dealing with."

Compositors spent time narrowing in on the particular features unique to Carrey. "We found, for example," says Rising Sun Pictures compositing supervisor Antony Haberl, "that Jim Carrey had different size ears compared to the stuntie. With features like that, if you don't get them right and you don't nail it 100%, you've got an image that doesn't look right and you don't know why. The challenge is being able to narrow in on all the fine detail that makes a person who they are, recognize what those things are, and grab onto it and not let go until it's right."

Rising Sun Pictures' machine learning VFX toolset REVIZE™ is integrated into Foundry's Nuke. Although the resulting model does not necessarily provide artists with the traditional render passes, AOVs or alpha sets, the compositing process remained largely the same as any other project. What Rising Sun Pictures is able to do with its machine learning techniques, however, is adjust elements such as eye direction in the machine learning output.
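To give a general sense of why the compositing process can stay the same even without conventional AOVs: the merge at the heart of any face replacement comp is still the standard premultiplied "over" operation, whatever produced the foreground. The sketch below is a generic Python/NumPy illustration of that operation, not Rising Sun Pictures' REVIZE code; the array names and random placeholder images are invented for the example.

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Premultiplied 'over' composite: foreground over background.

    fg_rgb   -- HxWx3 foreground, already multiplied by its alpha
    fg_alpha -- HxWx1 foreground coverage in [0, 1]
    bg_rgb   -- HxWx3 background plate
    """
    return fg_rgb + bg_rgb * (1.0 - fg_alpha)

# Placeholder data standing in for a scanned plate and a generated head
# (random arrays here; in production these would be image frames).
h, w = 270, 480
plate      = np.random.rand(h, w, 3).astype(np.float32)
face_rgb   = np.random.rand(h, w, 3).astype(np.float32)
face_alpha = np.random.rand(h, w, 1).astype(np.float32)

face_rgb *= face_alpha                    # premultiply before merging
comp = over(face_rgb, face_alpha, plate)  # same math as a Merge (over)
print(comp.shape, comp.dtype)
```

In other words, the absence of traditional render passes changes what the inputs to this merge look like, not the merge itself, which is why downstream controls such as adjusting eye direction in the generated output matter so much.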
Having control over the results, and continuing to allow for performances to be modified rather than putting things through a machine learning black box, is one of the key goals of Rising Sun Pictures' process. "Ultimately, what we are doing is saying, what makes Jim, Jim?" points out Mackereth. "And, what makes Ivo, Ivo? What makes Gerald, Gerald?"

**Also, lasers!**

As Ivo and Gerald dance through the room, their suits are hit by, and reflect, a multitude of (mostly) red lasers. For the shoot with the Carrey doubles, a pre-programmed laser show had been devised and filmed. Rising Sun Pictures' role was to take this much further and provide a larger sense of play and interaction with the choreography of both characters.

"Part of the challenge for us was, exactly what do we want to see on screen?" says Mackereth. "With some visual effects that you put together, there might be a nice reference you can look to. In this case, we did struggle to find something that was analogous. We started with concepts, where we drew some lasers and some suits and how they might interact. We had some motion tests of lasers flying around the place. What was a real strong catalyst for us was a clip that we found where there was a dancer who was dancing inside a laser field, and she'd covered herself with little hexagonal mirrors. That was the aha moment where we realized they were not playing too much with the lasers. They're really just enjoying being in this space and the lasers are working around them."

"Double your VILLAINS, double your FUN!! Here's an early piece of #SonicMovie3 concept art we created to showcase the idea of two Robotniks breaking into G.U.N. headquarters!" Jeff Fowler (@fowltown), December 27, 2024, pic.twitter.com/shUlDbiSAh

Armed with a stronger sense of the look, Rising Sun Pictures then started work on the lasers in the layout department. "Here, we built some tools for previs so that we could fire a laser at matchmoved characters, get the laser to interact with the character, and then fling around the place to reflect in real-time in Maya," explains Mackereth. "We had quite a lot of control over, say, if we had an incident laser coming in, then the reflected lasers: we could pattern them, shape them, move them in a space to be visually pleasing. We had a lot of control over the reflections in an artificial way."

"A turning point also came when we made the decision that we had to throw away the background that had been shot," adds Mackereth. "They meticulously shot this beautiful hallway with the beautiful choreography, but both the client and us decided that it was no longer suitable. That was a huge creative challenge, but it also freed us to then explore how our lasers would operate in the space."

Then, the FX department at Rising Sun Pictures took that initial choreography and realized the lasers with effects simulations. A particularly tricky aspect of the work was treating the characters' bodies and suits as highly reflective surfaces. "It was all about, how do we show a true reflection of the laser coming off the suits?" details Rising Sun Pictures FX lead Kurt Debens. "What we found really quickly was that it was either not enough or too much. We couldn't track what was going on, essentially. So, it became, how do we make it correct for our shots, and how do we adjust those angles of reflection? How much fanning on the reflected lasers?
The overall challenge for effects was making sure it was accurate and well-represented from what layout was producing, but then also having that control to be able to create a picture that was pleasing to look at."

The lasers were designed to emanate from specific points on the wall and ceiling of the space (which was ultimately an entirely synthetic environment). Determining the final look of the lasers also came down to compositing, as Haberl breaks down. "A real laser would be virtually invisible. It's very thin. It comes in, it goes away and it doesn't do much. There was a big exploration of just finding that middle ground of the fantasy of it and the joy of it versus the strict reality of it."
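For readers wanting to picture the geometry Mackereth and Debens describe, the core of a laser bounce is the classic mirror reflection of an incident direction about a surface normal, R = I - 2(I·N)N, with the "fanning" being an art-directed spread applied on top of that single ray. The Python/NumPy sketch below is a generic illustration under those assumptions; the function names, spread parameters and example vectors are invented for this example and are not Rising Sun Pictures' Maya tooling.

```python
import numpy as np

def reflect(incident, normal):
    """Mirror-reflect a unit incident direction about a unit surface normal:
    R = I - 2 (I . N) N."""
    return incident - 2.0 * np.dot(incident, normal) * normal

def fan(reflected, normal, count=5, spread_deg=4.0):
    """Spread a single reflected ray into a small art-directable fan by
    rotating it about an axis perpendicular to it. 'count' and 'spread_deg'
    are illustrative knobs, not values from the film."""
    axis = np.cross(normal, reflected)
    axis /= np.linalg.norm(axis)
    rays = []
    for angle in np.linspace(-spread_deg, spread_deg, count):
        theta = np.radians(angle)
        # Rodrigues rotation of 'reflected' around 'axis'
        r = (reflected * np.cos(theta)
             + np.cross(axis, reflected) * np.sin(theta)
             + axis * np.dot(axis, reflected) * (1.0 - np.cos(theta)))
        rays.append(r / np.linalg.norm(r))
    return rays

# Example: a laser coming down onto an upward-facing patch of mirrored suit.
incident = np.array([0.0, -1.0, 0.3])
incident /= np.linalg.norm(incident)
normal = np.array([0.0, 1.0, 0.0])

bounce = reflect(incident, normal)      # the single "true" reflection
spread = fan(bounce, normal)            # the adjustable fan around it
print(bounce, len(spread))
```

In a previs setup along the lines described, each reflected (or fanned) ray would then be intersected with the matchmoved character or the room geometry to place the next bounce, which is where the layout team's "artificial" control over patterning and shaping the reflections would come in.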