A brand new visual effects and animation publication from Ian Failes.
Recent Updates
Twinning, Creepers and more VFX covered in the ‘Mickey 17’ issue
Issue #32 of befores & afters magazine is now out in PRINT and DIGITAL!
It’s a deep dive into the visual effects of Bong Joon Ho’s Mickey 17, starring Robert Pattinson.
The film contains creatures, spacecraft, snow-filled landscapes and several scenes where actor Robert Pattinson appears as two ‘expendable’ clone characters—Mickey 17 and Mickey 18—on screen at the same time.
The new issue explores this twinning work, as well as going into detail on the creatures and environment visual effects largely orchestrated by DNEG, Framestore, Rising Sun Pictures and Turncoat Pictures.
You can grab the issue in PRINT from Amazon, or as a DIGITAL EDITION on Patreon.
Remember, you can also subscribe to the DIGITAL EDITION as a tier on the Patreon and get a new issue every time one is released.
Hope you enjoy the latest issue!
Here are the links to the various Amazon stores:
USA: https://www.amazon.com/dp/B0FCYRV86J
UK: https://www.amazon.co.uk/dp/B0FCYRV86J
Canada: https://www.amazon.ca/dp/B0FCYRV86J
Germany: https://www.amazon.de/dp/B0FCYRV86J
France: https://www.amazon.fr/dp/B0FCYRV86J
Spain: https://www.amazon.es/dp/B0FCYRV86J
Italy: https://www.amazon.it/dp/B0FCYRV86J
Australia: https://www.amazon.com.au/dp/B0FCYRV86J
Japan: https://www.amazon.co.jp/dp/B0FCYRV86J
Sweden: https://www.amazon.se/dp/B0FCYRV86J
Poland: https://www.amazon.pl/dp/B0FCYRV86J
Netherlands: https://www.amazon.nl/dp/B0FCYRV86J
The post Twinning, Creepers and more VFX covered in the ‘Mickey 17’ issue appeared first on befores & afters.
How Wētā FX created Seasmoke in ‘House of the Dragon’ s2
A new VFX breakdown is out.
The post How Wētā FX created Seasmoke in ‘House of the Dragon’ s2 appeared first on befores & afters.
The art of two Mickeys
Classic splitscreens, traditional face replacements and new approaches to machine learning-assisted face swapping allowed for twinning shots in ‘Mickey 17’. An excerpt from issue #32 of befores & afters magazine.
The art of representing two characters on screen at the same time has become known as ‘twinning’. For Mickey 17 visual effects supervisor Dan Glass, the effect of seeing both Mickey 17 and 18 together was one he looked to achieve with a variety of methodologies. “With a technique like that,” he says, “you always want to use a range of tricks, because you don’t want people to figure it out. You want to keep them like, ‘Oh, wait a minute. How did they…?’”
“Going back to the way that Director Bong is so prepared and organized,” adds Glass, “it again makes the world of difference with that kind of work, because he thumbnails every shot. Then, some of them are a bit more fleshed out in storyboards. You can look at it and go, ‘Okay, in this situation, this is what the camera’s doing, this is what the actor’s doing,’ which in itself is quite interesting, because he pre-thinks all of this. You’d think that the actors show up and basically just have to follow the steps like robots. It’s not like that. He gives them an environment to work in, but the shots do end up extraordinarily close to what he thumbnails, and it made it a lot simpler to go through.”
Those different approaches to twinning ranged from simple splitscreens, to traditional face replacements, to, most substantially, a machine learning approach now usually termed ‘face swapping’. What made the twinning work a tougher task than usual, suggests Glass, was the fact that the two Pattinson characters are virtually identical.
“Normally, when you’re doing some kind of face replacement, you’re comparing it to a memory of the face. But this was right in front of you as two Mickeys looking strikingly similar.”
Here’s how a typical twinning shot was achieved, as described by Glass. “Because Mickey was mostly dressed the same, with only a slight hair change, we were able to have Robert play both roles and to do them one after another. Sometimes, you have to do these things where hair and makeup or costume has a significant variation, so you’re either waiting a long time, which slows production, or you’re coming back at another time to do the different roles, which always makes the process a lot more complicated to match, but we were able to do that immediately.”
“Based on the design of the shot,” continues Glass, “I would recommend which of Robert’s parts should be shot first. This was most often determined by which role had more impact on the camera movement. A huge credit goes to Robert for his ability to flip between the roles so effortlessly.”
In the film, Mickey 17 is more passive and Mickey 18 is more aggressive. Pattinson reflected the distinct characters in his actions, including for a moment in which they fight. This fight, overseen by stunt coordinator Paul Lowe, represented moments of close interaction between the two Mickeys. It was here that a body double was crucial in shooting. The body double was also relied upon for the classic twinning technique of shooting ‘dirty’ over-the-shoulder out-of-focus shots of the double—i.e. 17 looking at 18. However, it was quickly determined that even these would need face replacement work. “Robert’s jawline is so distinct that even those had to be replaced or shot as split screens,” observes Glass.
When the shot was a moving one, no motion control was employed. “I’ve never been a big advocate for motion control,” states Glass. “To me it’s applicable when you’re doing things like miniatures where you need many matching passes, but I think when performances are involved, it interferes too much. It slows down a production’s speed of movement, but it’s also restrictive. Performance and camera always benefit from more flexibility.”
“It helped tremendously that Director Bong and DOP Darius Khondji shot quite classically with minimal crane and Steadicam moves,” says Glass. “So, a lot of the moves are pan and dolly. There are some Steadicams in there that we were sometimes able to do splitscreens on. I wasn’t always sure that we could get away with the splitscreen as we shot it, but since we were always shooting the two roles, we had the footage to assess the practicality later. We were always prepared to go down a CG or machine learning route, but where we could use the splitscreen, that was the preference.”
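Where a splitscreen could hold up, the composite itself is conceptually simple: one take of Pattinson fills one side of the frame, the second take fills the other, and the two are joined along a soft-edged matte so the seam disappears. As a rough illustration only—the function, names, and linear feathering below are hypothetical, not taken from the production pipeline—a vertical splitscreen blend might be sketched with NumPy:

```python
import numpy as np

def splitscreen(plate_a, plate_b, seam_x, feather=20):
    """Composite two same-sized plates along a vertical seam.

    plate_a fills the left of the frame, plate_b the right. A linear
    ramp `feather` pixels wide softens the join, as in a classic
    locked-off twinning splitscreen. (Illustrative sketch only.)
    """
    h, w = plate_a.shape[:2]
    x = np.arange(w, dtype=np.float32)
    # Matte: 1.0 on plate_a's side, 0.0 on plate_b's side, ramped at the seam.
    matte = np.clip((seam_x + feather / 2 - x) / feather, 0.0, 1.0)
    matte = matte[np.newaxis, :, np.newaxis]  # broadcast over rows and channels
    return plate_a * matte + plate_b * (1.0 - matte)

# Two synthetic stand-in 'plates': one light grey, one dark.
a = np.full((4, 8, 3), 0.8, dtype=np.float32)
b = np.full((4, 8, 3), 0.2, dtype=np.float32)
comp = splitscreen(a, b, seam_x=4, feather=2)
```

In practice the matte is rarely a straight vertical line; compositors draw it around the action, which is why locked-off or simply repeatable camera moves (pans, dollies) make this approach viable while complex moves push the work toward CG or face swapping.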
The Hydralite rig, developed by Volucap. Source: https://volucap.com
Rising Sun Pictures (visual effects supervisor Guido Wolter) handled the majority of twinning visual effects, completing them as splitscreen composites, 2D face replacements, and most notably via its machine learning toolset REVIZE, which utilized facial and body capture of Pattinson to train a model of his face and torso to swap for the double’s. A custom capture rig, dubbed the ‘Crazy Rig’ and now officially the Hydralite, was devised and configured by Volucap to capture multiple angles of Pattinson on set in each lighting environment in order to produce the best possible reference for the machine learning algorithm. “For me, it was a completely legitimate use of the technique,” attests Glass of the machine learning approach. “All of the footage that we used to go into that process was captured on our movie for our movie. There’s nothing historic, or going through past libraries of footage, and it was all with Robert’s approval. I think the results were tremendous.”
“It’s staggering to me as I watch the movie that the performances of each character are so flawlessly consistent throughout the film, because I know how much we were jumping around,” notes Glass. “I did encourage that we rehearse scenes ahead. Let’s say 17 was going to be the first role we captured, I’d have them rehearse it the other way around so that the double knew what he was going to do. Therefore, eyelines, movement, pacing and in instances where we were basically replacing the likeness of his head or even torso, we were still able to use the double’s performance and then map to that.”
Read the full Mickey 17 issue of befores & afters magazine in PRINT from Amazon or as a DIGITAL EDITION on Patreon. Remember, you can also subscribe to the DIGITAL EDITION as a tier on the Patreon and get a new issue every time one is released.
The post The art of two Mickeys appeared first on befores & afters.
Crafty Apes breaks down its work for ‘The Residence’
See how the VFX studio crafted scenes for inside and around the White House.
Watch the breakdown here.
The post Crafty Apes breaks down its work for ‘The Residence’ appeared first on befores & afters.
Watch Image Engine’s VFX breakdown for ‘Skeleton Crew’
The VFX studio breaks down its work for the series.
The post Watch Image Engine’s VFX breakdown for ‘Skeleton Crew’ appeared first on befores & afters.
See how MotionMaker, Maya’s new AI animation tool, works
The Autodesk toolset combines AI, motion capture and keyframing inside of Maya.
The post See how MotionMaker, Maya’s new AI animation tool, works appeared first on befores & afters.
‘Reacher’ fan? I think you’ll enjoy this invisible effects VFX breakdown for season 3
It comes from WeFX.
The post ‘Reacher’ fan? I think you’ll enjoy this invisible effects VFX breakdown for season 3 appeared first on befores & afters.
Check out these invisible effects crafted for ‘The Residence’
See the VFX reel from NEXODUS which showcases digital makeup fixes, set extensions, matte paintings, and editorial polish. More in their case study.
The post Check out these invisible effects crafted for ‘The Residence’ appeared first on befores & afters.
Stunts, previs and VFX in ‘Final Destination: Bloodlines’
Special Effects Coordinator Tony Lazarowich, VFX Supervisor Nordin Rahhali, and Stunt Coordinator Simon Burnett discuss the work in this LA Weekly article, which includes a bunch of fun videos.
The post Stunts, previs and VFX in ‘Final Destination: Bloodlines’ appeared first on befores & afters.
ILM’s 50th anniversary showreel
ILM has released a reel showcasing their work over 50 years.
And don’t forget my upcoming book, Industrial Light & Magic: 50 Years of Innovation.
Pre-order links (but also check out your local Amazon marketplace):
USA: https://amzn.to/3GSyUsW
UK: https://amzn.to/3GX1JEx
CANADA: https://amzn.to/4ktkAWh
The post ILM’s 50th anniversary showreel appeared first on befores & afters.
Behind the ‘Spider Rose’ ep of ‘Love Death + Robots’
Behind the scenes.
The post Behind the ‘Spider Rose’ ep of ‘Love Death + Robots’ appeared first on befores & afters.
‘This Is Animation!’ from Sony Pictures Animation and Sony Pictures Imageworks lets you learn the craft
Director Kris Pearn and a host of key artists deliver this online training.
Sony Pictures Animation and Sony Pictures Imageworks along with Yellowbrick have launched a free educational resource called THIS IS ANIMATION!
It breaks down the process of animation using a whole bunch of real-world examples from Sony projects, including Into The Spider-Verse.
You can sign up for free here: yellowbrick.co/sony
The post ‘This Is Animation!’ from Sony Pictures Animation and Sony Pictures Imageworks lets you learn the craft appeared first on befores & afters.
Fun short promo for ‘Jurassic World Rebirth’ highlights VFX
Includes VFX supervisor David Vickery, who hails from ILM.
The post Fun short promo for ‘Jurassic World Rebirth’ highlights VFX appeared first on befores & afters.
On The Set Pic: ‘The Wheel of Time’
Robert Strange portrays the Eelfinn, an otherworldly being, in ‘The Wheel of Time.’
Credit: Julie Vrabelova/Prime
The post On The Set Pic: ‘The Wheel of Time’ appeared first on befores & afters.
Go behind the scenes of the season 2 finale of ‘The Last of Us’
A new official featurette.
The post Go behind the scenes of the season 2 finale of ‘The Last of Us’ appeared first on befores & afters.
Official 1.5hr making of ‘Final Destination 3’ is posted online
Warner Bros. has posted it on YouTube.
The post Official 1.5hr making of ‘Final Destination 3’ is posted online appeared first on befores & afters.
Pro-Level Mocap Sync: Stream Vicon in iClone for Real-Time Animation
Reallusion combines Vicon technology with industry-standard timecode support in iClone.
Reallusion has announced an official partnership with Vicon, enabling direct motion capture support for iClone. For the first time, Vicon systems can connect seamlessly with Motion LIVE, offering full integration of real-time, high-fidelity body capture, along with facial and hand mocap, all within one unified platform. Jeff Scheetz, founder of Monkey Chow Animation Studio, shares his experience with this iClone-Vicon suite, which he believes delivers studio-quality results faster than ever before.
Jeff Scheetz, Motion Capture Orlando
Since leaving his role as co-founder of The Digital Animation & Visual Effects (DAVE) School, Jeff Scheetz, alongside his wife Anne, has been running Monkey Chow Animation Studio in Orlando, Florida. In 2021, they expanded into motion capture with the launch of Motion Capture Orlando, serving the theme park industry. Their work can be seen at Universal Studios and Walt Disney World, and they’ve also produced various motion capture packs for iClone and ActorCore, such as Run for Your Life, Bank Heist, and Avenging Warriors. Jeff’s team also collaborated with Actor Capture on the Netflix movie The Electric State. Key team members include senior mocap technicians Kaszmere Messbarger and Nelson Escobar.
iClone and Vicon—The New Dynamic Duo
One of the most exciting aspects of Jeff’s relationship with Reallusion is getting to test out cutting-edge technology before it’s officially released. When he was informed that iClone would soon support Vicon integration and timecode synchronization, it was a game-changer. Motion Capture Orlando has been using Vicon’s cameras and software since they started investing in mocap gear, but they were unable to integrate Vicon data directly into Motion LIVE until now.
This breakthrough is especially significant for Jeff, as the previous workflow involved streaming Vicon data into MotionBuilder for retargeting, and then moving that data into Unreal Engine. Integrating facial data and hand motion (via data gloves), along with the inherent latency of this multi-step process, presented challenges. Now, with iClone’s Motion LIVE, Vicon streams directly into iClone, allowing for seamless integration of face, body, and hands all within one platform. This streamlined approach reduces complexity, making it easier to execute live shows and set up production workflows.
Record, Edit, and Render Previz in iClone
When it comes to previz, speed and cost-effectiveness are essential. Traditional mocap workflows can be time-consuming due to the necessary exports and imports between different software tools. However, by recording directly into iClone, you can immediately start blocking shots, working out camera moves, and rendering previews. Jeff’s team recorded a few assassin chase scenes on their stage, using Motion LIVE to capture data, and the resulting iProject files served as the foundation for their productions.
The quality of the capture was impressive due to starting with Vicon data. This allowed them to present high-quality previz to a director quickly. The renders included vital data on the edges, such as version info, lens used, and notes for feedback, helping the team stay organized throughout the process.
From Previz to Polished Video: Cleaning and Syncing with Timecode
After capturing mocap for the previz, the team performed proper data cleaning in Shogun Post (Vicon’s mocap cleaning software), and motion edits were carried out in iClone. They identified the best takes—approximately 20% of what was shot—and cleaned only those pieces. One major challenge was syncing the cleaned data, but the introduction of timecode has made this process much easier.
Using the new Timecode Plugin, Jeff could instantly sync the cleaned motion data with the original timeline. By dropping the new motion (as an FBX) onto their character and selecting “Align Clip to Embedded Timecode”, iClone automatically aligned it with the original recording. This feature works not only for motion data but also for facial capture, hand data, audio, and video.
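The idea behind aligning a clip to its embedded timecode can be illustrated with a small sketch: each recording carries an SMPTE timecode stamp, and the offset between a clip’s start timecode and the master timeline’s start timecode tells you exactly where the clip belongs. The function names and the 30 fps non-drop-frame assumption below are illustrative only, not Reallusion’s or Vicon’s actual API.

```python
FPS = 30  # assumption: a 30 fps, non-drop-frame project

def timecode_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an 'HH:MM:SS:FF' SMPTE timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def align_clip(clip_start_tc: str, master_start_tc: str, fps: int = FPS) -> int:
    """Return the frame offset at which a clip should land on the master
    timeline so that its embedded timecode lines up with the master's."""
    return timecode_to_frames(clip_start_tc, fps) - timecode_to_frames(master_start_tc, fps)

# Example: the master recording started at 01:00:00:00, and the cleaned
# take's first frame is stamped 01:00:02:15.
offset = align_clip("01:00:02:15", "01:00:00:00")
print(offset)  # 75 frames (2 seconds * 30 fps + 15 frames) into the master timeline
```

Because every device (mocap, face capture, camera, audio) embeds the same running timecode, the same offset arithmetic lines them all up against one master, which is exactly what makes the one-click alignment possible.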
Seamless Mocap Sync with Live Action
This seamless integration was especially useful when creating a sitcom scene for Life with Bob! in which a live actress performed against a CGI character. Mocap actors, placed just out of frame, could improvise and interact with the CGI character.
The use of a Tentacle Sync device transmitted timecode across all systems: Vicon, the Live Face app, the video camera, and sound recording devices. This made it incredibly easy to sync everything once the master shot was completed.
Reshooting and Updating with Motion LIVE
One of Jeff’s favorite features of this new workflow is the ability to quickly make adjustments, such as reshooting facial animation without the need for a full reshoot. Using his iPhone, he could capture just the parts that needed improvement, like the lips, while leaving other aspects of the face intact. The flexibility to isolate and reshoot specific elements is a major time-saver and enhances the overall efficiency of the production process.
The New Workflow: Speed, Efficiency, and Creativity
For creators using Reallusion products, the integration of Vicon with iClone opens up new possibilities for creating high-quality animated content in a fraction of the time previously required. For a more detailed insight into the workflow, please find the full article here. The new workflow allows for fast, professional results with minimal overhead. With the addition of the Timecode plugin, keeping everything aligned and organized is simple. Motion LIVE, combined with Vicon’s mocap system, empowers creators to focus on what matters most: creativity and storytelling. And after all, that’s why many of us got into animation in the first place.
Brought to you by Reallusion:
This article is part of the befores & afters VFX Insight series. If you’d like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
The post Pro-Level Mocap Sync: Stream Vicon in iClone for Real-Time Animation appeared first on befores & afters.
How that Red Hot Chili Peppers ep of ‘Love, Death & Robots’ was made
David Fincher breaks it down.
The post How that Red Hot Chili Peppers ep of ‘Love, Death & Robots’ was made appeared first on befores & afters.
New behind the scenes featurette on ‘A Minecraft Movie’ looks at on-set puppeteering and more
The post New behind the scenes featurette on ‘A Minecraft Movie’ looks at on-set puppeteering and more appeared first on befores & afters.
How Steamroller Animation made a 30 min animated pilot episode in Unreal Engine
The full ep is now available to watch for free!
Today on the befores & afters podcast, we’re chatting to Steamroller Animation about their pilot episode of Spice Frontier. Back in 2019, the team released an 8-minute short film as a proof of concept, and now they’ve expanded that into a full pilot. With Adam Meyer, Josh Carroll and Dave Alve, I ask them what went into making an original IP. Developing their own intellectual property is definitely the dream of many VFX and animation studios out there, so it’s really interesting to hear the journey Steamroller has been on to create this property.
We also talk technical, in terms of utilizing Unreal Engine in their pipeline, and what some of the creative and tech hurdles have been along the way. Listen in, above, and watch the pilot episode below.
The post How Steamroller Animation made a 30 min animated pilot episode in Unreal Engine appeared first on befores & afters.