
BEFORESANDAFTERS.COM
Motion bases, mo-co cameras and monofilament wires
Behind the techniques used to craft the dragon riding scenes in season two of House of the Dragon. An excerpt from befores & afters in print.

The many dramatic dragon riding scenes in season two of HBO's House of the Dragon brought audiences incredibly close to the riders atop their beasts. That required developing a clear methodology for bringing those riding scenes to life. Several VFX and virtual production-related techniques were therefore overseen by visual effects supervisor Dai Einarsson and visual effects producer Tom Horton, all the way from previs, to shooting actors on a motion base, using a robotic motion control camera, incorporating interactive lighting from LED lighting panels, completing live composites, and even some old-school wind machines and wires during the shoot.

Planning the dragon rides

"The overall idea was to use previs and take it to a fairly high animation level," Einarsson tells befores & afters, in relation to the preparation required to orchestrate the dragon riding scenes. "We actually had finaling animation artists on the previs, because the point of the previs was to get final cameras and final animation in the saddle area, in order to be able to techvis that and transpose the camera and the saddle movement onto a motion control Bolt X camera and a multi-axis gimbal buck."

The majority of the dragon riding scenes were previsualized by Pixomondo, which was also responsible for several virtual production services on set, as well as being one of the vendors completing final VFX and animation of the CG dragon sequences. Pixomondo also managed the virtual production on a handful of dragon riding scenes that The Third Floor completed previsualization on.

The process started with creative previs. "We were not worrying at all about the technical side of it," notes Einarsson. "We had quite a specific constraint methodology about what kind of camera work we wanted to use on the dragons."
"The creative brief was that we wanted to ground the camera work in realism, that is, avoiding magical cameras. We had two basic shooting styles. One of them we called the dragon-mounted camera, as if there was a camera operator sitting somewhere on the dragon, say, either on the neck or behind the saddle. We also kept this a little bit loose on the saddle, like it was a handheld shot.

"The other shooting style was dragon to dragon, as if our camera would be mounted on another dragon," continues Einarsson. "We imagined, if this was all real, then the camera work would feel like the aerial photography that we are used to seeing, which grounds our fantasy in a little bit of reality."

Pixomondo's approach to the previs was to give each dragon a unique visual identity in animation, such as distinct wing cadences and heightened performance. "We then applied the detailed animation data to the on-set buck rig," discusses Pixomondo virtual production supervisor James Thompson, "enabling actors to closely mimic the dragons' flight movements, synchronized with a motion-controlled camera for more immersive shots."

Einarsson looked to bring the previs animation about 80% of the way to final. "Obviously you're missing facial and overlap of the wings and the head. But we didn't worry too much about that. We just wanted to get it so that the weight was definitely there with the gross movement of the body. Once you've got it in there, you can always do a little bit of tweaking. You can loosen the camera up a bit and add the final process. For instance, when we took it all the way to final comp, then we'd sometimes be widening the camera a little bit and adding more camera shake and stuff that we didn't want to bake into the motion control."

Pixomondo then re-worked the previs into techvis, which involved creating a 3D virtual scene that replicated the on-set hardware. "The idea here was to select the 3D camera and constrain it to our motion control camera," outlines Einarsson.
"You can select the saddle or the saddle constraint of the flying dragon and constrain it in 3D space to the buck. That basically reverse-engineers the movement. It takes out the global movement and puts it into a single space."

Once approved, the techvis data was transferred to the motion control hardware and buck rig for testing. "During this phase," relates Thompson, "shots were tested with stunts and camera crews, and the rushes were sent to the production team for approval. We worked closely with the director of photography and gaffers during pre-lighting to fine-tune the lighting on the LED walls, using a custom Unreal Engine setup for instant feedback."

Shooting the scenes

The riding buck on which an actor sat for filming was positioned on a four-axis motion base. This was orchestrated by special effects supervisor Mike Dawson, with Ian Menzies from Mark Roberts Motion Control handling the control of the motion base. The buck was fitted with an art department-designed saddle.

"We've got an amazing props department, set department and a whole saddle team that built all of these saddles," details Einarsson. "For anything that we are going to shoot a principal cast member on, we'll have the full saddle built. Each one of those saddles is just an amazing design. We built all of the saddles that we needed for the buck work."

Buck shoots for dragon scenes have of course been a part of the Game of Thrones and House of the Dragon universe for several years. But one of the new approaches for this season was the addition of the Bolt X robotic motion control camera (supplied by Mark Roberts Motion Control). "What it gave us was the ability to previs the shot design and animate the shot exactly how we wanted it to look," states Einarsson. "If we didn't have motion control cameras, then everything would just have been an estimation. You could eyeball it, but you need to have control over the buck, control over the camera and control over the world lighting."
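The "reverse-engineering" step Einarsson describes, constraining the virtual camera to the saddle so that only their relative motion survives, is essentially a change of coordinate frame. As a minimal, hypothetical sketch (the names are illustrative, not production code), assuming each object is stored as a 4x4 world-space transform matrix:

```python
import numpy as np

def relative_transform(saddle_world, camera_world):
    """Express the camera in the saddle's local frame.

    Both inputs are 4x4 world-space transform matrices. Removing
    the saddle's global flight motion leaves only the camera
    movement that the motion-control rig must reproduce.
    """
    return np.linalg.inv(saddle_world) @ camera_world

# Illustrative frame: the dragon's saddle has flown to (0, 0, 100),
# with the shot camera trailing 2 units behind it.
saddle = np.eye(4)
saddle[:3, 3] = [0.0, 0.0, 100.0]
camera = np.eye(4)
camera[:3, 3] = [0.0, 0.0, 98.0]

local = relative_transform(saddle, camera)
print(local[:3, 3])  # camera sits at (0, 0, -2) in the saddle's space
```

Evaluating this per frame over the previs animation yields a camera path in a single, stationary space, which is the kind of data a buck and robotic camera arm can then play back physically, since the dragon's global movement has been factored out.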
"Those three elements allow you to deconstruct a dragon that's flying around in 3D space and shoot it precisely the way you want it framed."

"If we had a Technocrane or something like that, then you're always just eyeballing it," reinforces Einarsson. "You can start slipping into designing the shot on the day or tweaking it or wanting to try a different version of it. The schedule can go out the window. Plus, it can mean you completely break the methodology of the dragon-mounted camera or the dragon-to-dragon camera. It becomes like a crane around a flying thing. You lose the grounding of camera work that we've become attuned to expect from actual flying scenes, like a World War II dogfight, where you've got cameras, cockpit cameras, or you've got a plane next to it. That's the film language that we've come to expect from flying scenes."

A further consideration during the buck shoot was adding atmospheric effects, such as wind, on the actors. "We were trying to replicate them flying at 160 miles an hour, so there had to be quite a lot of wind from wind machines to blow around their capes," notes Einarsson. "We actually had to use monofilaments because the capes would wrap under or get caught in the saddles. So we had these wires that held them out, and then wind machines and handheld windblowers to try and get them to swirl in the right way."

Read the full story in issue #21 of the print magazine.

The post Motion bases, mo-co cameras and monofilament wires appeared first on befores & afters.