A brand new visual effects and animation publication from Ian Failes.
Recent Updates
-
BEFORESANDAFTERS.COM
How DNEG crafted the troll and the Eregion battle in s2 of The Rings of Power

Today on the befores & afters podcast, we're diving into season 2 of The Lord of the Rings: The Rings of Power with DNEG and visual effects supervisor Greg Butler. This season, DNEG delivered over 900 shots and led the work on some of the biggest battle sequences, which happen around Eregion. That work also included Damrod the Hill Troll. With Greg, we look at what was filmed for these battles, the CG environments, digi-double soldiers and Orcs, CG horses, and the specific approach to the atmosphere in the scenes.

This episode of the befores & afters podcast is sponsored by SideFX. Looking for great customer case studies, presentations and demos? Head to the SideFX YouTube channel. There you'll find tons of Houdini, Solaris and Karma content, including recordings of recent Houdini HIVE sessions from around the world.

Listen in above, and below, check out some fun before and after images and a video breakdown.
-
BEFORESANDAFTERS.COM
Yep, Wētā FX did it, they turned Robbie Williams into a chimpanzee

Behind the scenes of Better Man.

In Michael Gracey's Better Man biopic, pop singer Robbie Williams is portrayed as a chimpanzee. Wētā FX was responsible for the digital character, which ranges in age and also goes through 250 different costume changes and 50 separate hairstyles (and, yes, even sports Williams' trademark tattoos).

On set, actor Jonno Davies performed the role of Williams through the use of performance capture, largely following the workflow Wētā FX has employed on the Apes films and, of course, drawing on its long history of bringing various CG creatures to life.

befores & afters got to chat to Wētā FX visual effects supervisor Luke Millar and animation supervisor David Clayton to walk through, step by step, the making of the ambitious project, which includes several dazzling musical numbers and perhaps the most f-bombs by a CG character in the history of film.

It started with a huge previs effort

The musical moments, which include an ever-growing dance number around London's Regent Street, a 100,000+ audience-filled Knebworth Park concert, and a performance at the Royal Albert Hall, were previsualized by Wētā FX before any other visual effects work commenced, and even before the film was fully greenlit.

"Michael Gracey was very keen to previs the musical numbers, time them all out to the music, and really get the details of all the transitional shots and how the ebb and flow of the visuals and sound would work together, to really chase down that emotional connection," explains Clayton, who oversaw the previs. "It was really fun work because we already had the template of the soundtrack. They'd also storyboarded some moments and used video-vis for others to piece together the sequences. We were then able to layer on more detailed previs and explore camera design, compositions and action."

Wētā FX utilized its own motion capture stage as part of the previs process. Gracey also visited the studio in Wellington during this stage of production, where he was able to help block out action and iterate on the all-important virtual camera and lighting cues. "That previs was the first step of showing people what this movie could be," observes Millar. "It was definitely a catalyst that helped with getting the movie funded and advancing to shooting and production."

The performance capture methodology

Armed with a previs of the key parts of the film, Wētā FX then helped Gracey establish how it would be shot. "The most important thing for me was to shoot this like a regular picture," says Millar. "I said to the team, 'We shoot it like Jonno is in the movie. We light it like Jonno's in the movie. We frame up like Jonno's in the movie. We pull focus like Jonno is the person who will be in the final picture.'"

Plate footage from the shoot on location in Serbia.
Animation pass comparing the digital character's facial and body performance to Robbie Williams from the original Knebworth concert.
Animation pass showing the progression from Jonno Davies' original performance through to the final digital character.
Final render depicting the iconic Knebworth concert.

Davies was captured in an active marker performance capture suit. However, with the handheld camera work in the film, and some close and intimate action that needed to be captured, sometimes decisions were made to rely less on the technology and concentrate on the performance.

"There's a scene where it becomes clear that Robbie's nan (Betty, played by Alison Steadman) is getting dementia," relates Millar. "It was a very powerful scene to watch them shoot. At the end of it, Nan embraces Robbie's head and strokes his hair. In the first couple of takes, Jonno was wearing a motion capture helmet with little bobbles on it, and Alison Steadman was trying to figure out what she could and could not touch. You could see her trying to stroke these plastic bobbles and it was killing the moment. In that instance, we said, 'Let's lose the helmet.' We ended up sacrificing the technology in order to make that moment the incredibly touching and intimate moment it is in the movie. Even though it creates more work on the back-end for animators, who obviously have to translate Jonno's facial performance by hand rather than being able to solve it from a camera rig, we can't fix a performance that doesn't feel convincing in the photography."

For Davies' performance as Williams on set, the actor referenced countless hours of the singer's past performances. He also had the benefit of Williams being on set for the first couple of weeks of the shoot, including for filming of the finale, 'My Way'. "We got Robbie rigged up in a mocap suit and he came out and did the performance," shares Millar. "The performance was incredible. There was the level of engagement from all the extras. Everyone was just so, so good. But he missed half the lines. He wasn't in the right part of the stage. He wasn't looking in the right direction when he should be. It was a very clear thing that he's a great entertainer who absolutely shone on the stage, but he is not an actor. He did run through a few scenes and we did get a lot of great reference material to see how he moved."

Building chimp Robbie

The CG chimpanzee version of Robbie Williams was crafted to resemble the singer, particularly his notable eyes and brows. Wētā FX relied on photogrammetry scans, texture reference shoots and facial poses to build up its model and puppet. "When building him," notes Clayton, "we wanted him to feel a bit like Robbie and have the charisma and some signature looks of the real Williams, but we didn't want it to be a funny monkey face version of Robbie Williams. So, we respected the line of the eyebrows and the shapes of the eyes, but it needed to feel very much like a chimpanzee first, and then the likenesses, we just tried to ease them in there."

Animation pass of the facial and body performance.
Creature pass highlighting the textures of Robbie's outfit, including wig, hat and clothing.
Lighting pass.
Animation pass compared to reference footage from the child and adult actors.
Final render of ape Robbie Williams as a child in a school play.

To test the model, Wētā FX created some side-by-side performances with real footage of Williams from past interviews, including those where, as Clayton notes, "Robbie is being quite genuine and in the moment when responding to questions. When we put that onto our digital Robbie and it really worked, that was a breakthrough moment where it's like, 'Oh, this is going to sell.' I mean, it is true that if Robbie were an animal, he would be a monkey. He's cheeky, he's an entertainer. He's quite sharp and in the moment and spontaneous."

While Wētā FX has extensive experience in crafting apes, chimpanzee Robbie Williams was a different kind of challenge to previous projects. "In the Planet of the Apes franchise," details Millar, "the apes start off as chimps and slowly evolve to become more human. Whereas, in Better Man, we're basically representing a human being as a chimp. Everything a human being needs to do, Robbie needs to do: sing, be emotional, angry, happy. The full range of human emotions."

"There was also a huge amount of dialogue," adds Clayton, mentioning the intense swearing required, too. "He's a chatterbox and he's in pretty much every shot of the movie and driving the narrative, so he's talking a lot, and that needed to feel convincing."

In terms of animating the character, breathing became a central part of the process. "Breathing is a big part of singing and speaking and performing," confirms Clayton. "I was always eyes-on with the breathing controls to make sure that the inhales were happening, then cascading down through the exhales as he's talking or singing, before another intake of breath, and away you go again. That's such a big part of making a digital character, getting that convincing breathing pass in there. Nostrils, too. Humans don't really flare their nostrils a whole lot. Here, we could use it as a way to bring variety and contrast to the movement of the face, and make him flare his nostrils to reflect certain emotional beats and add complexity and nuance to the landscape of his face."

With so much reference of Robbie Williams for everyone to pull from, Wētā FX artists and Jonno Davies became extremely adept at collaborating to craft the performance of ape Robbie. Clayton details: "It is actually fun in that way. We were not inventing this new character. Well, we were inventing a new version of the character, but we were also retelling historical events, something that's happened. Robbie's very cavalier. He's not trying to be famous. He's not trying to pander to people. He's just being himself, so that genuine charisma is always there. As animators, it's one of the first times we've got to try to replicate this reality, this genuine, charming reality. It was very cool."

One of the significant aspects of the chimpanzee build was representing Williams' real-life hairstyles and tattoos on the character. The digital ape model was made up of 1,356,167 strands of fur, with 225,712 of those strands shaved to replicate Williams' tattoos. Says Millar: "We went through and pulled different hairstyles from Robbie's life over the years and mapped them to the eras as they appear in the movie. Initially, we just tried to take a human haircut and block it on his head, which looked terrible. It looked like an ape wearing a wig, which is not where we wanted to be, so we ended up going back to more of the chimpanzee hairline and shaving in the hairstyle. The direction I gave to the team was, 'Imagine a chimp grew out their hair and then went into a barber and said, Make me look like Robbie Williams. What would the barber do?'"

In one particular scene, Williams is shown with bleached blonde hair. "The first pass that the groom artist did for that," discusses Millar, "was that they bleached his hair and then put this line around the back where a human hairline would end. Well, if you're a chimp, why would you stop there? So we ended up bleaching the whole body."

A similar methodology was relied upon for the tattoos, shares Millar. "Rather than just placing ink under the skin for a regular tattoo, we ended up shaving them into the fur, like hair art, because you wouldn't see them under all the fur. It was a very challenging groom situation, not one we are usually presented with. Our artists did a fantastic job of replicating all that detail with different densities and lengths of fur."

The more than 200 Robbie Williams costume changes in the film required Wētā FX to collaborate closely with the costume department. "They sourced, made, borrowed or rented every outfit that you see in the movie," advises Millar. "We scanned them all. We used them on set for reference, but essentially none of those costumes were ever going to actually be in front of the camera. So we had to make all those unique costumes, and then add in more variations for the fight where Robbie is fighting a whole load of different versions of himself."

Rock DJ: crafting the Regent Street oner

In a three-minute-and-42-second oner (5,334 frames), Williams and his band Take That are shown having signed their first record deal and bursting out onto Regent Street in London to celebrate. As the 'Rock DJ' dance and musical sequence progresses, they are initially not that well known, so few people around them react. However, as the group transitions through the different looks of their careers and continues dancing down the street, more and more people are swallowed into the celebration and a flash-mob-like dance ensues.

Vid-ref acquired by animators under Clayton proved critical for imagining the sequence in previs form. "We definitely didn't shy away from embodying the character, let's say," admits Clayton. "Especially in the previs, a bunch of us went into some of the musical numbers. We learned the dance moves that were going to be done in Regent Street. I mean, you could just keyframe that in a simple way, but it doesn't give you bearings the same as if you've got real motion capture, even if it's from computer nerds, such as myself, dancing down Regent Street. We had a great motion capture day where we broke the whole musical number into about 20 parts, and we just captured them."

"Our lead who played Robbie for the previs was Kate Venables," adds Clayton. "She's a dancer, so she nailed it, but the rest of us were making the best of it. When you put that motion capture into the Lidar scan of Regent Street that we had, all of a sudden it just springs to life. You can check your lenses, you can check your camera moves. Everything just starts to feel infinitely more real."

The previs was provided to director of photography Erik A. Wilson. "He went down Regent Street with an iPhone trying to map out the path that Dave had come up with," states Millar. "There were certain ways that he couldn't quite move the camera as in the previs, so we figured out a physical path that we could actually take down the street. There was then a techvis pass after that to further figure out how to move the camera and whether it was going to be crane, human mounted, et cetera, and to figure out the lensing, which varied throughout the sections."

A four-day night shoot in Regent Street followed to film the plates, with Davies performing as Williams and many dancers also on set. To animate the CG Williams over the nearly four-minute sequence, Wētā FX split it into multiple parts. "We were able to do the regular treatment of overlaying our Robbie ape over the top," says Clayton, "but there was a lot of scrutiny from Michael and his team, as there should be. It's one of the high points of the film, a real centerpiece, super ambitious, so it needed to look as perfect as we could make it."

"We had to pay particular attention to some transitional moments, say when he spins around and does a costume change," continues Clayton. "Although here we could have relied on computer graphics cheats to fade things on and off, we didn't want to go that way. We wanted to make it feel like it could have really been done in camera, with all the imperfections that go with that. It was the same with the moment he jumps onto a taxi and then onto the back of an iconic London double-decker bus. The physicality of that needed to be the priority. We never wanted to venture into superhero-looking stuff."

A major effort was also involved in stitching together plates and providing building and set extensions. The action goes inside and outside shops on Regent Street. Interiors were filmed in Melbourne prior to the London shoot, and would need to be married up to the shop fronts on Regent Street. Wētā FX added in digital traffic, including buses and cars. "Combining all the different interior and exterior plates, and CG elements, was a massive task," says Millar. "I think it's got the record for the most roto tasks ever created here at Wētā FX."

The shot also has many period-correct components, adds Millar. "We had control over a few of the shopfronts, so they were dressed, and then there were the ones which we weren't allowed to touch, so they had to be replaced, including right at the end of Regent Street, which is Piccadilly Circus. Back in the 90s, it was a huge advertising board with fluorescent tubes, whereas now it's an LED screen. We had to replace that and take it back to its 90s look. There was one building that just happened to be under renovation. It was covered in scaffolding, so we then had to patch that so it was back to looking pristine again. Because we're transitioning through time as we're going down the street, by the time we get to the end, it's Christmas time. That meant we had to put all of the iconic Christmas lights down Regent Street and put Christmas decorations in the windows. There was a lot of augmentation work to tell that story as well."

Let me entertain you: the Knebworth Park concert

Williams' enormously well-attended 2003 Knebworth Park concert was re-created in Serbia. This location allowed production to film with around 2,000 extras, while a further 123,000 would be added in as digital crowd members by Wētā FX. During the shoot, Millar helped coordinate the filming of small, 50-person chunks of crowd for close-up shots around the stage. "It turned out," he says, "with 2,000 extras, we could actually get most of the medium shots and the close-ups all in camera without needing to go digital. Essentially, what we did was put on a gig in Serbia. The stage wasn't really a set; it was rented from an actual stage company that built it for concerts. The same with the lighting. We were on a studio backlot, but it was essentially a music festival that played half a song and nothing else for the entire four days that we were there. All of the band and stage workers are in-camera, and then it's the wider extension and the big crowds that became digital extensions after that."

Some archival footage from the actual Knebworth Park event was able to be intercut with the Serbia scenes. "That gave us a great goal to match to as well, both in terms of what was in the event at the time, but also the quality of the finished image, which essentially was early-2000s digital video," notes Millar. Clayton adds that the original footage informed Wētā FX about what they called "crowd detritus", objects like inflatable toys, flags and beach balls, that would be included in the scenes. Disposable cameras and film cameras were also added as elements.

She's the One: dancing on a yacht

Another musical number occurs on a yacht in Saint-Tropez, where Williams and Nicole Appleton (Raechelle Banno) dance. This, of course, required a close level of interaction between the two characters. "It's another centerpiece moment of the film where Robbie falls in love with Nicole," describes Clayton, "but it's also a beautiful interweaving of transitions, going forward in time, back in time and back to the yacht. The performance was paramount, and getting that interaction and that feeling that they're there together was very important. We matchmoved really accurately to the actress playing Nicole Appleton, and then it was a lot of careful work to prioritize the feeling that they're right there, interacting seamlessly."

"I'd say it's the hardest work for sure when you've got a real person and a digital person and they're that intertwined," notes Millar. "That was all captured live. It's worth noting that Raechelle, who plays Nicole, did all the dance work herself. Some people have asked us, 'Did you replace her head?', but we didn't. For Robbie, it was a dance double who wore a very tight-fitting mocap suit with blue fabric over the top. That let us get the actual mocap data from that shoot."

The animation team would then animate chimp Robbie, knowing that additional work would be required to simulate clothing and solve for hands and other close interactions. "Once Dave's got it as close as he could," explains Millar, "it then literally came down to going through frame by frame and saying, 'Okay, this bit here needs to be smoothed down on this frame, and we need to pull that bit tight and create a hand impression here when she puts her hand there.' It just becomes very painstaking detail work, which is the detail that you don't notice when you watch it; it just flies past. But if it wasn't there, it would look weird and fake."

Clayton makes a point of mentioning a moment during the yacht dance where Nicole runs her hand through Robbie's hair. "Because we've got the matchmove of her hand, we can have his hair simulating and responding to that hand. Here, too, we had to deal with ape Robbie's ears, which are quite big. That would come up quite often, more often than you might think, actually. He would touch his own ears, and so we had animation controls for them. Other people might brush against them, so we just flopped them out of the way."

My Way: at the Royal Albert Hall

For Williams' performance at the Royal Albert Hall, production filmed in two halves. First, a replica stage and floor area was built at Docklands Studios in Melbourne. "We had a full orchestra, and all the extras sitting around the tables in front of the stage were all part of that shoot that took place in Melbourne," details Millar. "Robbie then had a concert about 10 months later at the real Royal Albert Hall, where it was requested that everyone come wearing black tie, so we shot corresponding plates for everything that we filmed in Melbourne during that concert at the Royal Albert Hall."

Plate footage of the performance on set in Melbourne with Jonno Davies.
Plate footage of the on-set audience at the Royal Albert Hall with the real Robbie Williams (L) and director Michael Gracey (R) centre stage.
Lighting pass showing the detail of ape Robbie's costume and hair.
Final render of ape Robbie Williams' closing performance at the Royal Albert Hall.

This meant that wider views of the concert generally included the London audience, while Davies' performance as Robbie, the audience on the ground, and the orchestra pit were all acquired in Melbourne, complete with extras. Wētā FX then combined the plates to ultimately produce an audience of 5,500 people.

The workflow required a higher level of planning to establish where to place cameras on the Melbourne set so that they could match the shooting of the real Albert Hall concert months later. "We were able to acquire a quick scan of our set and a quick scan of the real Royal Albert Hall and stick them on top of each other to line the two references up," outlines Millar. "Then I could say to the DP, 'Okay, if you stick your camera eight meters up from this point here, then you should end up in box 36.' For a lot of these key angles, when we were in that space, we had to make sure that we were in legit places where we could get a camera. Production needed to know this information right away so those seats didn't get sold when the Royal Albert Hall tickets went on sale!"

After the Melbourne shoot, an edit was done of the performance to help further figure out what real Albert Hall plates needed to be filmed during that concert. "I think it was about 30 or 40 shots that we would need to shoot during the live concert," says Millar. "The way it worked was, Robbie would come out, do half of his set, then he would disappear off. We would get four minutes to do our take with a series of colored lights on poles, which would be for eyelines, because the stages were different heights and shapes. And then there was a voiceover for what the audience had to do, listen and sway or stand up and applaud or things like that, which they would, whilst we went through the different lighting scenarios that take place during the musical number."

The Royal Albert shoot involved some frantic moments, admits Millar. "We only had two nights for this. The first night was a write-off because it was Sunday, and the British public got absolutely blind drunk throughout the whole day, so they didn't look where they were supposed to be looking! It basically left us with one night and one four-minute take to get every single shot that we needed for that scene. In the end we did it, but it was by far the most stressful shoot I've ever been a part of. The beauty of it is, when you watch the scene, you can just tell that we are in the Royal Albert Hall; you can tell that those people are real. There's a certain physicality about that whole thing which you only get when you are actually in that space."

An additional challenge for the Royal Albert Hall sequence came in terms of lighting, that is, Wētā FX needing to match the real stage lighting with its digital lighting. To help do that, Millar recognized that the stage shots were effectively a controlled environment where lighting was timed to a music timecode, and therefore repeatable.

"So," he says, "rather than halt filming to wander out with our balls and charts after every take, we said, 'Well, we could use this.' On previous shows, I've always talked to lighting board operators and said, 'It would be great if your world and our world could somehow combine.' They've always given us these files and I've gone back to base, looked at them, and couldn't make head nor tail of what they are. But on this movie, it was absolutely critical that we could, because there's literally gantries with 50 to a hundred lights in them, and trying to replicate that after the fact without actually having that information is really hard."

"I sat down with the concert lighting board operator and talked through how their world works," continues Millar. "Then we extracted all of this data from them, brought it back to Wētā FX, and then some very clever people took that data and were able to replicate the lighting of the concert within our world. There were certain things that we couldn't do. Things like the brightness or colors of lights, the worlds just don't align. So for those, we shot HDRIs of static lights that we could then use as a source and apply to the moving lights. There's also the physical space of a light. In real life, you place it somewhere, whereas obviously in the computer, you need to know where that light is. Lidar got us the position; HDRIs got us the color and the intensity of the lights. It was very satisfying to see it come to life for the first time, because you see how many lights are in this thing, and suddenly they're all moving and doing exactly the same thing that they did on the day. We had all the components then to recreate the whole concert in the computer for when Robbie's running around on stage, which was really cool."

When your VFX supervisor becomes a star of the film

For several scenes requiring digital extras, Wētā FX happened to place visual effects supervisor Millar into, well, a lot of them. In fact, Millar served as a digital extra 767 times in the film. "It wasn't for any sort of narcissistic tendencies," he protests. "It was literally because we had two distinct needs for digital people. One of them was the underwater fans in 'Come Undone', and the other one was the paparazzi in 'Come Undone' as well. It just so happened that throughout principal photography, whenever we had paparazzi extras, we were always on location, so we weren't able to scan any of them. We'd get back to the studio and go, 'We need a paparazzi!' And the requirement would be someone who's an average-build male, middle-aged, and a bit of a creep. And I was like, 'Oh, I can do that.'"

Millar brought into the studio a collection of different clothes, whereupon he was scanned in different outfits. "Whilst the intention was to just put me in that one scene, once I existed, I ended up everywhere. Normally I'm the security guard, or a bus driver, or a motorcyclist, or paparazzi. I'm in the movie about 700 times."

"What's really funny is," adds Clayton, "sometimes the extras would be my motion capture, but Luke's digital double body, merged together. Our powers combined!"

All images © 2024 PARAMOUNT PICTURES. ALL RIGHTS RESERVED.
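A technical aside on the concert lighting reconstruction Millar describes above. At its core, the idea is a mapping: Lidar supplies each fixture's position, HDRIs supply its color and intensity, and the lighting desk's timecoded cues drive pan, tilt and dimmer over the length of the song. Below is a minimal, hypothetical Python sketch of baking such cues into per-frame CG light keyframes. It is an invented illustration under those stated assumptions, not Wētā FX's actual tools; every data structure, fixture name and value here is made up.

# Hypothetical sketch: baking timecoded lighting-desk cues onto CG lights.
# Not Weta FX's pipeline; data layout and values are invented for illustration.
from dataclasses import dataclass

FPS = 24  # assume the song/edit runs at 24fps, synced to the music timecode

@dataclass
class Cue:
    time: float     # seconds from the start of the song
    fixture: str    # lighting-desk fixture ID
    pan: float      # degrees
    tilt: float     # degrees
    dimmer: float   # 0..1 from the desk

# Positions surveyed from the Lidar scan (meters, stage origin) and
# color sampled from HDRIs of each static fixture -- both invented here.
FIXTURE_POS = {"spot_01": (-6.2, 9.5, 3.1)}
FIXTURE_COLOR = {"spot_01": (1.0, 0.82, 0.64)}

def lerp(a, b, t):
    return a + (b - a) * t

def bake(cues, frame_count):
    """Linearly interpolate desk cues into per-frame light keyframes."""
    cues = sorted(cues, key=lambda c: c.time)
    frames = []
    for f in range(frame_count):
        t = f / FPS
        # find the cues bracketing this frame's timecode
        prev = max((c for c in cues if c.time <= t), key=lambda c: c.time, default=cues[0])
        nxt = min((c for c in cues if c.time > t), key=lambda c: c.time, default=cues[-1])
        span = max(nxt.time - prev.time, 1e-6)
        w = min(max((t - prev.time) / span, 0.0), 1.0)
        frames.append({
            "frame": f,
            "position": FIXTURE_POS[prev.fixture],   # from Lidar
            "color": FIXTURE_COLOR[prev.fixture],    # from HDRI sampling
            "pan": lerp(prev.pan, nxt.pan, w),
            "tilt": lerp(prev.tilt, nxt.tilt, w),
            "intensity": lerp(prev.dimmer, nxt.dimmer, w),
        })
    return frames

if __name__ == "__main__":
    cues = [Cue(0.0, "spot_01", -30, 10, 0.0), Cue(2.0, "spot_01", 30, 25, 1.0)]
    for key in bake(cues, 49)[::24]:  # print a keyframe every second
        print(key)

In a real pipeline the resulting keyframes would be written onto light rigs in the studio's scene format rather than printed; the point of the sketch is simply how timecode-driven desk data, scan positions and HDRI measurements can be combined deterministically, which is what made the repeatable, music-synced lighting so valuable.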
-
BEFORESANDAFTERS.COM
A new challenge: the Offspring

How Legacy Effects crafted the practical creature effects for the Offspring in Alien: Romulus. An excerpt from befores & afters magazine.

The climactic encounter of the film occurs between the remaining characters and the fast-growing human-xenomorph hybrid: the Offspring. Romanian former basketball player Robert Bobroczkyi was brought on to play the tall and skinny creature, owing to his distinctive body features. "Fede called us and said they'd seen Robert in some YouTube clips, and said, 'Do you think we could use this guy? He's not an actor, he is an athlete. Do you think it'd be a good idea?' And we were like, 'Hell, yeah.'"

"When we were first talking about the Offspring and looked at storyboards," adds Mahan, "it just looked like it was supposed to grow very fast and be like a teenage underdeveloped brain, but a big, gawky thing with a lurking presence that doesn't really understand itself. The sub-base of what Robert is just naturally was going to be phenomenal. We could do the same makeup on a six-foot-tall guy and it would just be okay. But Robert made it special and really made the ending tremendous. He's 90% of the success of that creature."

The Offspring make-up effects from Legacy consisted of 13 pieces of translucent silicone appliances, with portions of Bobroczkyi's skin showing through. Mahan and MacGowan were particularly impressed with Bobroczkyi's on-set acting, for someone who had never done this kind of work before. Says Mahan: "Robert was phenomenal because he really took it to heart and really put the effort in to make a character. I think it was his idea to be smiling during some of it. He worked with the acting coach at his school. He worked very hard on creating the movement and character, and he just showed up ready to rock and roll."

"When Chris Swift and I, with the team, did his make-up test for the first time, we took him to second unit to shoot a test," recounts Mahan. "We knew it was very, very special, and we both said it was like when Karloff as Frankenstein walks through the door backwards. It was that magical. Fede and everybody had video monitors over on first unit and they could see us setting it up, and then everyone ran over to come and see it. They just couldn't believe it."

Read the full issue of the magazine.
-
BEFORESANDAFTERS.COM
Behind the visual effects of Mufasa

A new short video showcases the motion capture and visual effects work by MPC.
-
BEFORESANDAFTERS.COM
Behind the scenes of Wallace & Gromit: Vengeance Most Fowl

Nick Park and Merlin Crossingham discuss the film and showcase puppet making and animation.
-
BEFORESANDAFTERS.COM
Here's some ways one visual effects studio is using machine learning tools in production right now

And it's not only with the dedicated pro VFX tools you might think (it's also with ones originally designed just for social media use).

The topic on the top of so many minds in visual effects right now is artificial intelligence and machine learning. There are, quite simply, new developments every day in the area. But how are all these developments finding their way into VFX usage? befores & afters asked one studio owner during the recent VIEW Conference to find out what they are doing.

Wylie Co. founder and CEO Jake Maymudes started his visual effects studio in 2015. He had previously worked at facilities including The Mill, Digital Domain and ILM. Wylie Co. has in recent times contributed to Dune: Part One and Part Two, Alien: Romulus, Uglies, The Killer, Thor: Love and Thunder, The Last of Us and a host of other projects. The boutique studio works on final VFX, sometimes serving as the in-house VFX team, and commonly on aspects such as postvis.

The biggest change to visual effects that Maymudes has seen in recent times has come with the advent of new artificial intelligence (AI) and machine learning (ML) workflows. The studio has utilized deep learning, neural networks and generative adversarial networks (GANs) for projects. Some of this relates to dedicated VFX tools; other work, as discussed below, was even done with tools intended just for social media use.

In terms of the tools now available, Maymudes is adamant that AI and ML workflows will change (and already are changing) the way labor-intensive tasks like rotoscoping, motion capture and beauty work are done in VFX. "There's so much efficiency to be had by using AI tools," argues Maymudes. "I see it as really the only way to survive right now in VFX, by taking advantage of these efficiencies. I think the whole world's going to change in the next couple of years. I think it'll change dramatically in five. I think it'll change significantly in two. I could be wrong, it could be one."

Wylie Co. has leapt into this AI/ML world in both small and large ways. On She-Hulk: Attorney at Law, for example, Wylie Co. was utilizing machine learning rotoscoping in 2021 for postvis work on the series. "Back then I wasn't aware of a single other company that was diving into machine learning like we were," says Maymudes. "And now, we've all had that capability for years."

The blue eyes of the Fremen in Dune: Part Two.

A much larger way Wylie Co. used machine learning tools was on Dune: Part Two, to tint thousands of Fremen characters' eyes blue. That task involved using training data direct from blue-tinting VFX work already done on Dune: Part One by the studio and feeding it into Nuke's CopyCat node to help produce rotoscope mattes. Production visual effects supervisor Paul Lambert, who is also Wylie's executive creative director, oversaw the training himself. "He's deep into AI and AI research," notes Maymudes. "He's a technologist at heart."

[You can read more about Wylie Co.'s Fremen blue eyes visual effects in issue #23 of befores & afters magazine.]

Then there's a different kind of approach Wylie Co. has taken, with AI and ML tools that were not initially intended to be used for high-end visual effects work. The example Maymudes provides here relates to the studio's VFX for Uglies. On that film, visual effects supervisor Janelle Ralla tasked Wylie with a range of beauty work to be done on the characters as part of the Ugly/Pretty story point. Ralla demonstrated a social media app, FaceApp, to Maymudes that she was using to concept the beauty work. The app lets users, on their smartphones, change their appearance.

Original frame inside FaceApp.

"She used this app to generate the images to convey what she wanted to see," explains Maymudes. "The results were really good, even for those concepts. So, I researched it, and it was an AI-based app. It had used a neural network to do the beauty work. And it did it fast."

That was an important consideration for Maymudes. The beauty work had to be completed to a limited budget and schedule, meaning the visual effects shots had to be turned around quickly.

After the FaceApp filter was applied.

Here's what Wylie Co. did using the app as part of its workflow.

"We downloaded FaceApp, then brought in our plates," discusses Maymudes. "I took the app and I made hero frames with the shots. Then I would take those hero frames into Nuke. I would create a dataset with these hero frames. Then I would train overnight on my Lenovo workstation with my NVIDIA GPUs for 12 hours. I'd come back in the morning, click a couple of buttons, apply the inference, and it worked."

Nuke node graph for the beauty work.

"We figured out a good workflow for this work through trial and error," adds Maymudes. "You have to be very explicit with what you want to tell these neural networks because it's one-to-one. You're basically saying, 'Please do exactly this.' And if the dataset you're training with is messed up, your results are going to be either really bad or not great, but not perfect, no matter what, because it's so one-to-one. It's so black and white. That's why using FaceApp was great in this regard, because it was so consistent between the hero frames."

Why Maymudes is excited about this particular use of an AI/ML tool is that it was actually designed for something else: just a fun social media purpose. "But," he says, "it has amazing facial tracking for face effects and gags. I mean, a lot of these tools do now. There's a lot of R&D that has gone into these tools, especially ones relating to your face. Because of that, you can pick and pull little tools here and there to use in visual effects. And if you do that, you can find just insane efficiency. That's why we used it."

Original frame.
Final beauty work.

"What we do love at our company are tools that make us better artists," continues Maymudes. "We have machine learning tools that do re-timing, and upscaling, and morph cuts, beauty work, matte work. All these little things that kind of take the grunt work out of it, which is nice. But I don't think machine learning is going to stop there. It's going to transform our industry. I don't actually know where it's going to go, even with how much I research it and think about it. Honestly, I think it's completely unpredictable what visual effects or the world will look like in five years. But the stuff you can do now, well, it's good, it's useful. We use it."
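To make the "one-to-one" paired training Maymudes describes concrete, here is a minimal, self-contained PyTorch sketch: a tiny residual network fit on (original plate, FaceApp-treated hero frame) pairs, then run on a new frame as the "inference" step. This is our own illustrative stand-in, not Wylie Co.'s actual Nuke/CopyCat graph; the network design, epoch counts and random stand-in images are all invented for the example.

# Illustrative sketch of paired (one-to-one) image-to-image training.
# Stand-in for a CopyCat-style workflow; all sizes and data are invented.
import torch
import torch.nn as nn

class TinyImg2Img(nn.Module):
    """A few conv layers standing in for a much larger production network."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        # predict a residual so the net only has to learn the "beauty" delta
        return x + self.net(x)

def train(pairs, epochs=200, lr=1e-3):
    """pairs: list of (input, target) tensors, shape (3, H, W), values 0..1."""
    model, loss_fn = TinyImg2Img(), nn.L1Loss()
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    x = torch.stack([p[0] for p in pairs])  # plates
    y = torch.stack([p[1] for p in pairs])  # hero-frame targets
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return model

if __name__ == "__main__":
    # random tensors standing in for real (plate, treated hero frame) pairs
    heroes = [(torch.rand(3, 64, 64), torch.rand(3, 64, 64)) for _ in range(8)]
    model = train(heroes, epochs=10)
    with torch.no_grad():
        out = model(torch.rand(1, 3, 64, 64))  # "apply the inference" step
    print(out.shape)

The same shape of workflow covers the Dune: Part Two case mentioned earlier: the training pairs simply become (plate, matte) rather than (plate, beauty-graded frame), which is why a messy or inconsistent dataset, as Maymudes warns, degrades the result no matter how long you train.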
-
BEFORESANDAFTERS.COM
The visual effects of Better Man

A new video featurette on Wētā FX's role in turning Robbie Williams into a chimpanzee.
-
BEFORESANDAFTERS.COM
The big animated features are covered in issue #25 of befores & afters mag!

Issue #25 of befores & afters magazine features candid interviews with the filmmakers behind some of the biggest animated features of 2024. Go behind the scenes of Inside Out 2, Moana 2, The Wild Robot, Ultraman: Rising, Transformers One, That Christmas and Wallace & Gromit: Vengeance Most Fowl.

Here are the filmmakers befores & afters interviewed for this issue, each one at VIEW Conference 2024:

Kelsey Mann, Director, Inside Out 2, Pixar Animation Studios
Chris Sanders, Director, The Wild Robot, DreamWorks Animation
Shannon Tindle, Director, Ultraman: Rising, Netflix
Hayden Jones, Overall VFX Supervisor, Ultraman: Rising, ILM
Simon Otto, Director, That Christmas
Justin Hutchinson-Chatburn, Production Designer, That Christmas
Amy Smeed, Head of Animation, Moana 2, Disney Animation Studios
Will Becher, Supervising Animator and Stop-Motion Lead, Aardman
Rob Coleman, Creative Director & Animation Supervisor, Transformers One, ILM Sydney

Find issue #25 at your local Amazon store: USA, UK, Canada, Germany, France, Spain, Italy, Australia, Japan, Sweden, Poland, Netherlands.
-
BEFORESANDAFTERS.COM
See the CG creature work crafted by Herne Hill for the demonic possession in The Deliverance

Watch the VFX breakdown exclusively here at befores & afters.
-
BEFORESANDAFTERS.COM
Watch Scanline's VFX breakdown for Senna
-
BEFORESANDAFTERS.COM
Here's the 10 films that will go to the VFX Bake-off in early 2025

From the Academy:

Ten films remain in the running in the Visual Effects category for the 97th Academy Awards. The Visual Effects Branch Executive Committee determined the shortlist. All members of the Visual Effects Branch will be invited to view excerpts and interviews with the artists from each of the shortlisted films on Saturday, January 11, 2025. Branch members will vote to nominate five films for final Oscar consideration.

The films, listed in alphabetical order by title, are:

Alien: Romulus
Better Man
Civil War
Deadpool & Wolverine
Dune: Part Two
Gladiator II
Kingdom of the Planet of the Apes
Mufasa: The Lion King
Twisters
Wicked

Nominations voting begins on Wednesday, January 8, 2025, and concludes on Sunday, January 12, 2025. Nominations for the 97th Academy Awards will be announced on Friday, January 17, 2025.

The 97th Oscars will be held on Sunday, March 2, 2025, at the Dolby Theatre at Ovation Hollywood, and will be televised live on ABC, streamed live on Hulu, and air live in more than 200 territories worldwide.
-
BEFORESANDAFTERS.COM
Watch Rodeo's VFX breakdown for s2 of The Rings of Power

A new video breakdown is out!
-
BEFORESANDAFTERS.COM
Behind that Carry-On car fight scene

Watch a behind the scenes bluescreen shoot reel.
-
BEFORESANDAFTERS.COM
The Yard breaks down its VFX for s2 of The Rings of Power

Their video breakdown is here.
-
BEFORESANDAFTERS.COM
How to make practical water and bubble effects

Plus, old-school motion graphics animation and optical effects.

Today on the befores & afters podcast, we're chatting to director, cinematographer and VFX artist Christopher Webb about practical effects. Chris is the founder of FX WRX, an outfit that specializes in in-camera effects. I've talked to him previously about several projects, but today we're narrowing in on a Gatorade Propel spot achieved with some very fun water and bubble effects, and on a Tom Petty and the Heartbreakers video. For that video, FX WRX created some incredible analog motion graphics animation and optical effects, very 80s in style. For each project, we go into detail about the shoot at FX WRX's studio, including the motion control camera equipment and bespoke setups.

This episode of the befores & afters podcast is sponsored by SideFX. Looking for great customer case studies, presentations and demos? Head to the SideFX YouTube channel. There you'll find tons of Houdini, Solaris and Karma content, including recordings of recent Houdini HIVE sessions from around the world.

Check out the chat above, and the final pieces and some behind the scenes images below.
-
BEFORESANDAFTERS.COM
Watch Picma's VFX breakdown for Senna

A new vendor VFX breakdown for the series.
-
BEFORESANDAFTERS.COM
Breaking down a stunt from Black Doves

How the jump from a burning building was filmed, including behind the scenes of the VFX.
-
BEFORESANDAFTERS.COM
Two Ultraman: Rising behind the scenes videos are now here

One, from Netflix, features writer/director Shannon Tindle and co-director John Aoshima, and the other is ILM's VFX breakdown.
-
BEFORESANDAFTERS.COM
On The Set Pic: Skeleton Crew

Jude Law on the set of Lucasfilm's Star Wars: Skeleton Crew. Photo by Matt Kennedy. © 2024 Lucasfilm Ltd. & TM. All Rights Reserved.
-
BEFORESANDAFTERS.COM
How visual effects put ships (and sharks!) in the Colosseum

Visual effects supervisor Mark Bakowski breaks down key sequences from Ridley Scott's Gladiator II, including the Colosseum ship battle, the rhino and the baboons.

One of the most stunning sequences in Ridley Scott's Gladiator II is a naval battle that happens inside the Colosseum. Oh, and also, there are sharks. Visual effects supervisor Mark Bakowski recalls first hearing about the sequence in pre-production. "I remember sitting there with the producer who was showing me pictures and pointing at them and then looking at me, and then pointing at them," he tells befores & afters.

The plan was to shoot the audacious scene wet-for-wet as much as possible, relates Bakowski. "There's a very large water tank in Malta, and so the idea would be, when you're looking down, you put a big ol' bluescreen in the background, and we'd do a very small set build of the Colosseum. We knew we'd need to use a CG Colosseum anyway, but we'd get the water interaction with the tank."

Gladiator II from Paramount Pictures.

"That was the plan," adds Bakowski, "but it didn't come to pass because the actors' strike came along and we had this week and a half where all the actors had gone, but the stunties were still knocking around because they weren't SAG. So, in that time, Ridley said, 'Well, let's just shoot the boats.' But the tank wasn't ready, so we had to shoot the boats dry instead. It was meant to be a rehearsal. Of course, it wasn't a rehearsal."

Ultimately, the director was happy with what had been filmed dry. "There were definite advantages to shooting dry because you shoot so much faster," says Bakowski. "Imagine moving a camera around on a boat; everything just takes time. Ridley could shoot at the pace he wanted to. The die was cast and we didn't really want to intercut them too much. Obviously, if someone falls into the water and is swimming in the water, those really were shot in the wet."

Gladiator II from Paramount Pictures.

In the end, the sequence was filmed dry, then wet in a tank, as well as in an underwater tank, and on stage at Shepperton. Those four shooting scenarios were ultimately brought together by Industrial Light & Magic (also responsible for the sharks).

"What helped us was the burning sail gag," advises Bakowski. "It wasn't planned that way, but it really worked as a glue, because if you put embers and smoke into all these things, it's like, 'Oh yeah, it's the same place.'"

Filling out the Colosseum

The Colosseum itself, seen in that naval battle as well as in many other sequences, was a mix of a practical build and extensive CG extensions. Production designer Arthur Max orchestrated construction of the arena in Malta amounting to about one third of its real height, and approximately one third of the way around. ILM then constructed the Colosseum as a digital asset (in addition to ILM, other VFX vendors on Gladiator II included Framestore, SSVFX, Cheap Shot, Ombrium and Exceptional Minds).

Gladiator II from Paramount Pictures.

Interestingly, Bakowski consulted a professor of history at Oxford about the VFX build for the Colosseum, especially in relation to the awning shades at the top of the arena (the Velarium), and it was built to match the accurate historical data. However, according to Bakowski, something was amiss. "It didn't look like the first Gladiator at all. It was accurate, but not aesthetically pleasing. Our real one just didn't work in the same way, so very quickly we adjusted ours so that it matched Gladiator's look in terms of the design."

Gladiator II from Paramount Pictures.

For crowds, production was often able to film with around 500 extras in Malta. "That was always a fun game," notes Bakowski, "because Ridley shoots with 10, 12, 14 cameras at times. They're all pointing in different directions, and then you've got this 500-person crowd, so you're trying to work out, what the hell do we do with this crowd? You've got to try and place your bets about where to put them and hope the cameras are going to catch enough important bits of the action." Additional crowds were CG, achieved via a separate motion capture shoot and then generated by ILM. Some crowd tiles with the extras were also utilized.

Wider shots of Rome, especially those of characters arriving into the city, made similar use of both practical sets and digital augmentation. "We had a practical build of an archway which went up about 30 feet," details Bakowski. "On top of that, it's a top-up where they couldn't build any more, which had a bit of a bluescreen for shadow casting, and that was it. The suckling wolf is visual effects. There's a practical river that special effects supervisor Neil Corbould built, which we extended a little bit. And then on the hillsides there's Rome, which was all visual effects, instead of the car park which is actually what was there [in the plate]."

The opening battle

Gladiator II begins with the Roman invasion of Numidia. The naval siege against the Numidian fortified walls was actually filmed in Morocco, where there was no water. Neil Corbould orchestrated the movement of Roman galleon ships in the desert using massive 20-axle plant movers, usually relied upon to move oil rigs. ILM then added the ocean and ships, and extended a real-world castle in Ouarzazate, Morocco, for where the Numidian army defends its homeland.

Gladiator II from Paramount Pictures.

"The practical photography gave you this thing to hang onto every time," states Bakowski. "That's what's great about it. Even if you replace it all, you've got the sense of intent. You've got the sense of what these shots should be about, how the light would work, and something to hang on to."

One particular challenge for this battle, from Bakowski's point of view, was oars. "We had to think about things like oars, whether to have them there or not. In the opening battle, we had no oars; there's no point. You can't see the oarsmen because they're all inside. So obviously we didn't want those random oars thrashing about in the desert, so there we went with CG oars. Whereas in the Colosseum set, you had to have oars, because you could see the oarsmen rowing the whole time."

Gladiator II from Paramount Pictures.

Creature features: a rhino, and baboons

Another dramatic sequence that takes place inside the Colosseum was the rhinoceros battle, which sees a mounted rhino take on some new gladiators. A rhino in the Colosseum was imagined for the first film, with some CG tests even carried out by Tippett Studio, but was never shot. On Gladiator II, production filmed with a rhino animatronic made by Neil Corbould's team. "It was driven around via radio control on wheels," explains Bakowski. "It had a gait that wasn't necessarily scientific in how it moved, but we got away with it."

Framestore then delivered a CG rhino (and sometimes a CG rider) for the sequence, with ILM's Colosseum and crowds making up the backgrounds. Framestore would utilize the saddle and parts of the stunt performer rider where they could. "It did work pretty well and we could retrofit [our CG rhino] to what the practical rhino was doing," says Bakowski. "That gave us the bouncing up and down, it gave everyone eyelines, and it was nice reference in terms of kicking up dust. We could keep a lot of that dust. We added more, but it certainly did no harm, and it was pretty damn cool."

Framestore was also responsible for the baboons (suffering from alopecia) that are unleashed on the gladiators in another fight, a sequence that was tricky owing to the level of interaction between gladiator and animal. "It was the toughest sequence, I think," views Bakowski. "Conceptually, because, well, baboon alopecia, it's an unusual thing. No one's seen one until this movie. But the physicality of it was interesting as well, because baboons are small, like four and a half feet. We tried to get the smallest stunties we could, but they're just not four feet tall, as you can imagine. We got the smallest ones we could to do the fighting, but you've got to work with what you've got, so there's some big old stunt men running around there."

Prosthetics designer Conor O'Sullivan built a torso of a baboon for close interaction shots, including for a moment that involves biting. Framestore then meticulously removed the stunt performers and baboon prosthetics from the plates and replaced them with its CG creatures. Bakowski observes that the visual effects work was particularly challenging due to the frenetic action of the baboon fight and the incorporation of a short shutter. "It was a tough, tough, tough sequence. But we got there in the end."
-
BEFORESANDAFTERS.COM
Watch Miagui's VFX breakdown for Senna

The studio recreated the Monaco GPs of 1984 and 1988.
-
BEFORESANDAFTERS.COM
See WeFX's VFX breakdown reel for s3 of From

Including creature transformations, and a cow. (Warning: gross.)
-
BEFORESANDAFTERS.COM
The virtual production behind Daddio

How the Dakota Johnson and Sean Penn film was shot on an LED stage.

Today on the befores & afters podcast, we're chatting to Disguise VP of Virtual Production Addy Ghani about the film Daddio. The Dakota Johnson and Sean Penn film is unique because it is essentially one long taxi ride that was effectively fully shot on an LED stage. In our chat, Addy goes through the shooting process, including the car process plates that were filmed, and how different tools and vendors made it possible to play those back on the LED volume. We also dive into the Disguise side of the process, plus a little history of Disguise itself.

This episode of the befores & afters podcast is sponsored by SideFX. Looking for great customer case studies, presentations and demos? Head to the SideFX YouTube channel. There you'll find tons of Houdini, Solaris and Karma content, including recordings of recent Houdini HIVE sessions from around the world.

Check out the chat above, and some behind the scenes images below.
-
BEFORESANDAFTERS.COMThe new standard for VFX: How Suite Studios is revolutionizing remote collaboration with cloud storage for creative teamsStream data in real-time to access complex working files from anywhere.The VFX industrys biggest challenge isnt creating jaw-dropping visualsits managing the mountains of data that make this work possible. From individual texture plates to high-resolution renders, todays teams are tasked with handling massive, complex files, often sharing them between collaborators in different locations. Ensuring everyone involved in a project maintains secure, reliable access to media is no small feat. Thats where Suite enters the picture.Purpose-built to streamline hybrid, remote, and on-prem workflows, Suites cloud-native filesystem allows teams to store, share, and edit media in real-time, without the need to download or sync files before working. On Suite, individual VFX artists, freelance editors, and supervisors can work directly on files stored in the cloud as if they were stored on a local drive, establishing a central source of truth for every asset, accessible from anywhere in the world.The best VFX pipelines thrive on collaboration. Whether its layering elements in Nuke, or perfecting textures in Maya, teams often experience the biggest bottlenecks when transferring complex assets. Waiting for files to download, waiting for versions to re-upload, or waiting hard drives to arrive in the mailtoo many VFX workflows involve time wasted at every step. Collaboration & creativity falter when the tools cant keep pace.Suite eliminates file sharing headaches by enabling instant access to the most up-to-date media, making remote collaboration and asset management effortless while enabling real-time feedback and faster iterations. Suite also integrates seamlessly with every standard creative tool used by VFX teamslike Unreal Engine, Houdini, the entire Adobe Creative Suite, Flow Production Tracking, and moremaking it a true, all-in-one solution for any style of creative work.Suite is engineered for high-performance, no matter the task. Whether youre focused on fine-tuning a feature-length film requiring petabytes of data, or completing a one-off commercial project with a smaller footprint, VFX teams of any size & scope can experience the benefits of real-time collaboration. Scalable to match you needs, Suite grows with you as you need it. Better yetwith end-to-end encryption, redundant backups in object storage, and easy to manage permissions for individual editors & artists, your assets are always safe, and in the right hands.For any size VFX team, leveraging Suites 24 Gbps read / 10 Gbps write speeds, used in tandem with powerful caching & pre-caching features opens the door to confident remote collaboration. If trouble arises, Suites lightning-fast customer support helps teams get back into the edit with minimal headaches; and if human-error causes a hiccup at any moment, Time Machine lets teams go back in time, down to the millisecond, to recover files exactly as they were before.File transfers and exports are also streamlined on Suite thanks to Suite Connect. This powerful feature allows teams to easily ingest media directly into Suite, or export files for download, directly from the cloud-connected drive. 
This occurs without ever bringing a file local, making Suite a true end-to-end cloud solution for teams working with multiple editors in different places.
VFX work shines brighter, and so do the creatives who make it, when teams are free to focus on their craft, not the logistics of sharing files or managing versions. Suite allows creative teams to collaborate as if they're in the same room, even if they're scattered across the globe. By removing the headaches & clunkiness of traditional storage solutions, Suite gives teams the freedom to focus entirely on their work, offering not only a storage solution, but a strategic advantage for the forward-thinking creative teams that choose to adopt it.
Learn more about how Suite can revamp your VFX workflow here.
Brought to you by Suite Studios: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
The post The new standard for VFX: How Suite Studios is revolutionizing remote collaboration with cloud storage for creative teams appeared first on befores & afters.
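As a rough sense-check of what the quoted throughput figures mean in practice, here is a back-of-envelope sketch; the frame size and shot length are illustrative assumptions, not Suite specifications.

```python
# Back-of-envelope only (not a Suite benchmark): streaming time for a shot
# at the quoted 24 Gbps read speed, assuming a ~50 MB 4K EXR frame.
READ_GBPS = 24      # quoted read speed, in gigabits per second
frame_mb = 50       # assumed per-frame size (illustrative, not a spec)
frames = 1000       # a roughly 40-second shot at 24 fps

total_gbits = frames * frame_mb * 8 / 1000   # megabytes -> gigabits
print(f"{total_gbits / READ_GBPS:.1f} s")    # ~16.7 s to stream 50 GB
```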
-
BEFORESANDAFTERS.COMMeet the practical creatures and droids of Skeleton Crew
A new featurette highlights stop-motion work by Tippett Studio, and also creature work by Legacy Effects.
The post Meet the practical creatures and droids of Skeleton Crew appeared first on befores & afters.
-
BEFORESANDAFTERS.COMAdvanced Motion Technology Reinvents 3D Animation
Experience Prop-Paired Animations and Motion with Facial Expressions, Exclusively on ActorCore.
ActorCore, the premium online content store for 3D productions, releases new types of mocap animations: Prop-Paired Animations and Motions with Facial Expressions. With over 3,600 professional motion assets, many of which stand out in the market, such as Hollywood action sequences and hand-keyed cartoon animations, ActorCore's latest offerings demonstrate its strength in delivering unique content. This expansion continues to simplify workflows and boost productivity for game, film, simulation, and visualization productions.
World's First: Prop-Paired Animations & Motion with Facial Expressions
ActorCore is renowned for its expertise in 3D motion, offering a vast array of high-quality assets that cover a wide range of themes, all while leveraging advanced technology in the digital content market. The latest release includes:
Prop-Paired Animations
This mocap animation synchronizes human motion with animated props, enhancing visual fidelity and simplifying data handling. It allows characters to interact seamlessly with props without the need for separate loading, animation, or manual alignment. Users can experience this firsthand with the Skateboard motion pack, which features professional skateboarding moves that automatically synchronize the skateboard with the character's animations. Additional motion packs featuring animated props, such as biking, will be released in 2025.
Motion with Facial Expressions
ActorCore provides motions that incorporate facial expressions, enhancing interactions by maintaining eye contact during conversations, expressing excitement while watching sports, and conveying situational awareness. This type of motion adds depth and authenticity to scenes, bringing them to life with expressive, lifelike movements. Explore how digital actors are enhanced with natural facial and body motions in the Group Sit & Chat, Group Stand & Chat, and Idle Sit and Stand motion packs.
Evolving 3D Motion Technology
From the outset, ActorCore has provided high-quality mocap motion alongside professional hand-keyed motion. Initially focused on single-character performances, ActorCore now also delivers dual and multi-character interactions, further enriched with facial expression and animated prop integration. This solid advancement in motion technology is designed to simplify 3D production for users while enhancing precision and realism in their projects.
In addition to Prop-Paired Animations and Motion with Facial Expressions, special content can be found in the ActorCore 3D motion store:
Hollywood-caliber Mocap Animation: Collaborating with top stunt actors and leading mocap studios to deliver incredible performances, ActorCore provides top-notch mocap animations in a wide range of themes for action-film productions, game design, and scene simulations.
Handkey Animation: Tailored hand-keyed animations breathe life into diverse 3D cartoon characters, showcasing unique body types and vibrant personalities. Existing projects can be elevated with dynamic, fluid movements that capture the essence of each character.
Group Animation: Paired motions for two or more characters illustrate interpersonal relationships or communication, such as those between a parent and child, friends, lovers, and co-workers.
Perfectly Matched Motions and Props
Virtual actors no longer perform empty-handed!
Hundreds of 3D props and accessories are now available on ActorCore. With the Related Content feature, it is easy to locate the most suitable props to complement selected motions, or vice versa, and preview them alongside the actor. This intelligent matching process allows for convenient access to relevant content, saving significant time on searching and verification.
Additionally, Related Content allows for the application of different props to either hand. Furthermore, the same or different props can be applied to each hand of a group of characters, effortlessly creating variety for crowd scenes, such as street protestors, concert audiences, and sports spectators.
Custom Download
In addition to downloading individual content, it is possible to combine the selected character, motion, and prop from the inventory to create a single FBX file. This allows the character to directly interact with the needed prop in motion after importing to different platforms, eliminating the hassle of manually combining assets in 3D animation software (a workflow sketched after this article).
Easy Crowd Simulation Facilitated by iClone
With a large bank of lightweight characters, 3D motions, and props from ActorCore, crowd simulations can be easily achieved by implementing the powerful crowd generation features of iClone.
Dynamic crowds of any size can be effortlessly generated, engaged in a range of activities such as sitting, standing, conversing, walking, or jogging, all animated to reflect appropriate styles for different genders and age groups.
ActorCore motions and props are accessible directly inside iClone, allowing for the pairing of accessories through motions or vice versa, using ActorCore's Related Content feature.
Motions are auto-loaded with animated props, such as a skateboarding motion that includes the skateboard inside iClone.
Motion and prop animations can be exported from iClone to all major 3D production tools.
Find Any Motion in ActorCore
ActorCore currently offers over 3,600 motions, all compatible with major 3D programs including Unreal Engine, Unity, Maya, MotionBuilder, Blender, 3ds Max, Cinema 4D, Omniverse, and iClone. Step-by-step guides and tutorials are available for each program, along with optimized Auto Setup plugins and Import Presets to streamline workflow. Visit ActorCore today to access a large selection of premium 3D motions.
Brought to you by Reallusion: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
The post Advanced Motion Technology Reinvents 3D Animation appeared first on befores & afters.
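To illustrate the Custom Download idea described above, here is a minimal, hypothetical sketch of bringing such a combined FBX into Blender with the stock importer; the file path is a placeholder, and the object handling is an assumption rather than ActorCore documentation.

```python
# Minimal sketch (hypothetical, not ActorCore docs): import a combined
# character + motion + prop FBX into Blender and list what arrived.
import bpy

# Placeholder path to a Custom Download FBX exported from ActorCore.
bpy.ops.import_scene.fbx(filepath="/path/to/actorcore_custom_download.fbx")

# The armature carries the baked motion; the prop mesh should arrive already
# parented and keyframed, so no manual alignment is needed. List the
# newly imported objects, which the FBX importer leaves selected.
for obj in bpy.context.selected_objects:
    print(obj.name, obj.type)
```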
-
BEFORESANDAFTERS.COMIn-depth on face replacement, environments and that crazy tennis ball POV shot in Challengers
A new podcast with VFX supervisor Brian Drewes.
Today on the befores & afters podcast, we're talking to ZERO VFX visual effects supervisor Brian Drewes about the film, Challengers. Now, if you're a befores & afters reader, you might have already seen some coverage of this fantastic tennis film in the magazine, but I wanted to share a follow-up chat I had with Brian where we go into a lot of fun detail about face replacements, environments, crafting slow-mo shots and a range of invisible effects. This really is one of the most unique visual effects projects from the past year, I think.
This episode of the befores & afters podcast is sponsored by SideFX. Looking for great customer case studies, presentations and demos? Head to the SideFX YouTube channel. There you'll find tons of Houdini, Solaris and Karma content. This includes recordings of recent Houdini HIVE sessions from around the world.
Check out the chat above, and a behind the scenes video below.
The post In-depth on face replacement, environments and that crazy tennis ball POV shot in Challengers appeared first on befores & afters.
-
BEFORESANDAFTERS.COMCyber Week Sale Now On at RE:Vision Effects!
RE:Vision Effects is the creator of unique, Emmy and Academy Award winning software. Their deep, high-end, flexible solutions address retiming, deflickering, denoising, motion blur, texture mapping in post, warping, and morphing.
And it doesn't stop there: if you need color matching and automatic color enhancement, upres and video detail enhancement, and handling of lens distortion, projection and VR 180/360 including stabilization, we have plug-ins for that.
Used worldwide by professional video editors and compositors, RE:Vision Effects plug-ins are supported on After Effects, Autograph, Baselight, Catalyst, Diamant, FCP, Flame, Fusion, Hitfilm, Media Composer, Motion, Natron, Nucoda, Nuke, Premiere, Resolve, Scratch, Silhouette and Vegas.
Take advantage of their Cyber Week sale: 25% off all products!
Use the discount coupon: B&ACyberBlack24
Head to https://revisionfx.com to find your plug-in.
Brought to you by RE:Vision Effects: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
The post Cyber Week Sale Now On at RE:Vision Effects! appeared first on befores & afters.
-
BEFORESANDAFTERS.COMRestoring Reality: 3D Gaussian to Mesh 2.0
Exactly a year ago, KIRI Engine released 3D Gaussian Splatting across all supported platforms, two months later its built-in editing tools, and another two months later 3DGS to Mesh. This brand-new 3D scanning method has seen plenty of use and love as optimizations and dedicated tools are continuously introduced, making this seemingly too-good-to-be-true 3D visualization technique more accessible and practical.
Today, a year later, KIRI Engine is excited to announce two revolutionary updates in its 3.12 release: 3D Gaussian Splatting to Mesh 2.0 and KIRI Engine 3DGS Blender Addon V2.0. Along with other quality-of-life changes and improvements to the overall versatility of the app, this is an update you don't want to miss.
3DGS (3D Gaussian Splatting) to Mesh 2.0
In January 2024, KIRI released 3DGS to Mesh 1.0. This was made possible in collaboration with CJ Ye and his team, and it was the first attempt to make 3DGS more applicable in widely-used applications, as the native PLY point cloud file was hardly accepted or visualized correctly in any rendering engine. This method worked by running a separate algorithm on top of the existing splat, reconstructing surface meshes from the depth map analyzed from the generated gaussian splat result. While this method worked to an extent, the light and depth data was often inaccurate and produced largely inconsistent results, with featureful objects being converted into mesh files of much higher fidelity than their featureless counterparts.
Figure 1: 3DGS to Mesh 1.0 Generated Suitcase
Building on top of the solid foundation provided by the latest gaussian splatting mechanics, the newly updated gaussian splat-to-mesh conversion introduces sophisticated normal prediction and reflection removal techniques. Developed once again by the KIRI Engine team and CJ Ye, the new algorithm draws inspiration from professional 3D scanners, specifically the super-dense point cloud generation methodology. With this change, the algorithm is now able to handle scenarios that used to be impossible for gaussian splat-to-mesh reconstruction, namely scanning reflective and transparent surfaces, which would previously disrupt the reconstruction process by providing inaccurate depth data.
This update has been uploaded to GitHub, along with model viewers that display the differences between 1.0 and 2.0. The model viewers are rigged to be in sync in terms of camera position, making the models' differences stand out clearly.
Figure 2: 3DGS-to-Mesh 2.0 versus 1.0 on GitHub
Figure 3: 3DGS-to-Mesh 2.0 (detected as reflection) versus 1.0 on GitHub
Moreover, 3DGS-to-Mesh 2.0 yields significantly higher quality results and processes files at a much more consistent level.
Figure 4 & 5: 3DGS to Mesh 1.0 vs. 2.0
In actual practice, the 3DGS-to-Mesh data will be processed and stored under the Scans list, as they will be high-quality 3D models with tons of editability within the app, just like KIRI's Photo Scans and Featureless Object Scans.
Needless to say, 3DGS-to-Mesh 2.0 is a state-of-the-art technology and marks a significant leap in research on geometry derived from rasterized radiance fields.
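For readers curious why the native splat format is awkward for conventional engines, here is a small, hypothetical sketch using the third-party plyfile package to peek inside a 3DGS PLY file; the file name is a placeholder, and the exact property list varies between implementations.

```python
# Minimal sketch (assumes the third-party `plyfile` package): inspect a
# 3D gaussian splatting PLY. Each "vertex" here is a gaussian carrying
# position, opacity, anisotropic scale, a rotation quaternion and
# spherical-harmonic color terms -- not ordinary mesh data, which is why
# generic renderers struggle to display these files directly.
from plyfile import PlyData

splat = PlyData.read("scan.ply")  # placeholder file name
vertices = splat["vertex"]
print(vertices.count, "gaussians")
print([prop.name for prop in vertices.properties])
# Typically prints names like: x, y, z, opacity, scale_0..scale_2,
# rot_0..rot_3, f_dc_0..f_dc_2 plus many f_rest_* coefficients.
```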
Additional great news: this technology will be fully open-sourced towards the end of this year or the beginning of 2025.
3DGS Masking
Version 3.12 of KIRI Engine also adds many changes directly to gaussian splatting to aid artists' workflows, introducing both new features and usability improvements.
3DGS now has a masking feature, which isolates the object from the rest of the background during model generation. For those unfamiliar with this technique, auto-masking has been a tried-and-true method for KIRI Engine's Photo Scan, where areas besides the object in focus are automatically cropped out.
Figure 6: 3DGS to Mesh 2.0 Generated Suitcase (Masked)
3DGS Masking will be available both for conventional 3D gaussian splats and for the new 3DGS-to-Mesh production, where in both cases the user can obtain a clean model without noise coming from the background. This will be a toggle feature on the upload menu for both methods: you get the full scene along with its background when it is turned off, and the isolated object by itself when it is turned on.
Figure 7: 3DGS Masking Off vs. On
3DGS Blender Addon V2.0
A couple of months ago, KIRI released its very own 3DGS Blender Addon and fully open-sourced it. Since then, they have been closely monitoring the addon, gathering community feedback, and in turn improving the existing features as well as implementing new ones.
New Features
New Modifiers Edit mode: Several editing modifiers are created and added to the 3DGS object on import. These modifiers speed up workflow and performance significantly. The modifiers include:
Camera Culling
Crop Boxes
Decimation
Remove Artifacts
Figure 8: Crop Tool Demonstration in KIRI's 3DGS Blender Addon V2.0
New Point Edit mode: This mode is for editing point clouds before importing the 3DGS object, which can be used to create an isolated object from a full scene.
New Modify Animate mode: This mode allows the user to add several animation presets in a few clicks:
Noise displacements
To points transformation
To curves transformation
Pixelate transformation
Optimizations / Bugfixes:
Imported objects are now editable
HQ Splat/Render now supports multiple objects, objects can be renamed, and the original file location does not need to be kept
Blender will not freeze when using the addon in orthographic view
Like the initial release of KIRI Engine 3DGS Blender Addon V1.0, the new release will be fully free to download and open-sourced for developers, and it continues to be the most usable addon for 3DGS rendering/editing in Blender.
Learn more about 3DGS Render on GitHub here.
Get 3DGS Render Addon on Blender Market here.
Quality of Life Updates
Featureless Object Scan: Mesh quality improvement
UI: Renamed the Include Mesh option on the 3DGS upload page to 3DGS to Mesh
After selecting 3DGS to Mesh, two files are generated:
A gaussian splat under the 3DGS list
A mesh file based on the splat under the Scans list
A Few Words from the Devs
"This is one of our proudest updates to date, especially in the realm of 3D gaussian splatting. As we have continuously pushed for innovation ever since the initial release of this technology, we have released many different features and tools to make the reality-stealing capture method more accessible, and we could not be happier with how far we've come. There are loads of possibilities heading into the future with 3D gaussian splats, and the release of 3DGS to Mesh 2.0 and the Blender Addon V2.0 is the first step of many.
We are infinitely excited and optimistic to see what wonders lie ahead on the technology's path, and we are so proud of everything our dev team has achieved."
Zion, KIRI Engine's PR Manager
The release of KIRI Engine 3.12 accompanies the program's biggest sales event of the year, offering its premium subscription at the lowest price of the entire year. This is a wonderful opportunity to capitalize on unlimited access to the newly released state-of-the-art features. The subscription will be available on all supported platforms.
Happy scanning!
Check out KIRI Engine's official release video for version 3.12 here.
Download the KIRI Engine Blender Addon on GitHub.
Visit KIRI Engine's Official Website
Download KIRI Engine from the Google Play Store
Download KIRI Engine from the App Store
Don't miss KIRI Engine's Black Friday sale: 55% off!
Brought to you by KIRI Engine: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
The post Restoring Reality: 3D Gaussian to Mesh 2.0 appeared first on befores & afters.
-
BEFORESANDAFTERS.COMGo behind the scenes of Ultraman: Rising in this Foundry showcase
It looks at how ILM used Katana, Mari and Nuke on the animated film.
The post Go behind the scenes of Ultraman: Rising in this Foundry showcase appeared first on befores & afters.
-
BEFORESANDAFTERS.COMDisney Animation breaks down its process on Moana 2
As it has done with so many of its recent films, Walt Disney Animation Studios has a special process section on its website that breaks down many of the different departments and aspects of how it makes CG animated films. This section has been updated to showcase Moana 2.
Go in-depth on storyboarding, layout, rigging, modeling, animation, effects animation, lighting and so much more. This is a real treat each time the studio releases a new project.
Find the site here: https://disneyanimation.com/process
The post Disney Animation breaks down its process on Moana 2 appeared first on befores & afters.
-
BEFORESANDAFTERS.COMActionVFX's Big Black Friday Sale (Free Asteroid Pack Included!)
Spending long hours cleaning up subpar stock footage? ActionVFX has your back, with a growing library of more than 11,000 blockbuster-quality assets. And our Black Friday Sale begins November 26th, offering 30% off all Credit Plans and Credit Packs. Plus, don't miss our bonus gift: a FREE Asteroid Collection during the sale!
Free Asteroid Collection
During our Black Friday Sale, grab a collection of 20 professional-grade asteroid elements for FREE, perfect for:
Sci-fi environments
Space scenes
Impact sequences
Atmospheric effects
To claim your free pack, head to ActionVFX.com and create an account during the sale. It's that simple.
ActionVFX's Black Friday Deal: Unbeatable Savings
Our 2024 Black Friday sale is one of the best deals out there. Beginning on November 26th, you can get 30% off Credit Plans and Credit Packs, giving you access to the largest and most versatile VFX asset library available.
What 30% Off Gets You
ActionVFX's Massive Library at Your Fingertips: With over 11,000 assets, including explosions, fire, debris, and more, you'll find everything you need to burn down a house, blow something up, add a spooky atmosphere, and more.
Access to 1500+ Premium Elements, Instantly: Our Essentials Catalog of premium assets comes free with every level of subscription, and has almost everything you need to get started on a wide variety of projects.
Maximum Flexibility: Our credits roll over for up to 12 months, so you can stock up at lower prices now for freedom to decide what assets you need later, even into 2025.
Simple Licensing: Download an asset, and it's yours forever. Use it in unlimited projects with no restrictions.
Stock up now and build your toolkit at unbeatable prices.
ActionVFX is Perfect for Independent Artists
Trusted by top studios and used in blockbuster films, ActionVFX is a great option for solo creators and small teams. Our new credit system makes large collections more affordable, and helps you adapt to unpredictable project needs. One of the benefits of ActionVFX is that you get to keep any assets you download, even if you cancel your plan. With 30% off all credit plans and credit packs during the Black Friday sale, now is the perfect time to give us a try!
Supporting Your Creative Journey
ActionVFX is more than just an asset library; we're here to help you grow.
Join the Community: Connect with like-minded creators and access direct support from the ActionVFX team by joining our Discord community.
Tutorial Library: Follow along and learn how to seamlessly integrate ActionVFX elements into your workflows with our detailed tutorials.
Free Practice Footage: Experiment and refine your skills with our free library of practice footage.
Don't Miss Out on ActionVFX's Black Friday Deal
This is your chance to access ActionVFX's massive library of 11,000+ assets at the best prices of the year. The 30% Off Black Friday Sale begins November 26th. Head on over to ActionVFX.com to get started, and don't forget to keep an eye out for that free asteroid collection while you're there!
Unleash your creative potential with ActionVFX, because your vision deserves nothing less than the best.
Brought to you by ActionVFX: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
The post ActionVFX's Big Black Friday Sale (Free Asteroid Pack Included!) appeared first on befores & afters.
-
BEFORESANDAFTERS.COMThe Phantom Menace is 25. Let's talk about it
The latest VFX Notes podcast covers Episode I.
In our newest episode of VFX Notes, Hugo and I dive into Star Wars: The Phantom Menace, celebrating its 25th anniversary this year. This is part one of our discussion on the film. In this first episode, we review the film and discuss the fan reaction to Jar Jar Binks, and talk about the digital revolution, digital cinema projection, digital editing, and the many other groundbreaking innovations, pioneered by ILM, that made this film the blueprint for the modern visual effects pipeline.
This episode is sponsored by ActionVFX. If you are working on a visual effects project and need stock footage or assets, don't miss ActionVFX's Black Friday Sale, which begins Tuesday, November 26th. They're offering 30% off on all credit plans and credit packs, plus a free asteroids asset pack. They also have new releases every month, so stock up now so you can make your next project truly outstanding. Visit ActionVFX.com to learn more.
Watch the episode below:
The post The Phantom Menace is 25. Let's talk about it appeared first on befores & afters.
-
BEFORESANDAFTERS.COMHow DNEG made Venom dance
Plus, DNEG's visual effects for Venom Horse and the Venom vs. Xenophage airplane fight in Venom: The Last Dance.
At one point in Kelly Marcel's Venom: The Last Dance, our heroes Eddie (Tom Hardy) and Venom run into their old friend Mrs. Chen (Peggy Lu) in Las Vegas. It culminates in a dance routine between Venom and Mrs. Chen inside a casino. DNEG, which has worked on all three Venom films, took on the dance sequence.
"We had a wonderful choreographer and a couple of dancers who developed that entire dance scene along with Kelly," outlines DNEG visual effects supervisor David Lee, who worked with production visual effects supervisors John Moffatt and Aharon Bourland on the film. "It was filmed at Leavesden and we did some motion capture tests beforehand to test eyelines, and ensure we were staging them correctly to accommodate the size difference. Mrs. Chen has a couple of spins and gets lifted up by Venom, so we just needed to make sure that what we were doing would actually work on the day, and that included just the ability to scale up our dancer and our mocap data."
For the shoot, the stand-in performer for Venom was around 6 feet tall, although Venom is another foot on top of that. "We gave him some extendable forearms to dance with and that allowed him to get a better idea about his reach, particularly around the environment, but also it gave Mrs. Chen a point of reference on the day, and something to hold onto when they were interacting with each other," explains Lee. "The distance between performers was of utmost importance. We didn't want Venom's arms to be forced artificially close to his body when it came time to animate, so this allowed both performers to always have Venom's true stature in mind throughout, and place themselves accordingly."
A separate motion capture session then took place in order to provide data and reference for the final CG Venom. Lee's initial thoughts were that DNEG would need to add a sense of weight and slower speed for the large Venom character, but then, after considering reference of dancers with large frames, it became apparent that some of these big guys are incredibly light on their feet.
"I mean, that's the whole thing with dance, isn't it? It's very much, no matter what your size, you can be incredibly graceful and quick. So it just meant that we kept Venom a little bit pacier than what we were expecting at the start. It was really fun doing that sequence, and I think it went down really, really well. The animators in particular had a great time acting out to that."
Crafting Venom
Having worked on the two previous films, DNEG was certainly familiar with the character. Still, the Venom model was effectively re-rigged and had facial shapes re-developed to deal with changes in the VFX studio's pipeline over the years. DNEG's fresh challenge on The Last Dance was seeing Venom, and Wraith Venom, the tendrily character that emerges out of Eddie, in broad daylight.
"Venom's lighting is driven primarily by reflections," says Lee. "So if you are in a desert, for example, and you've just got the sun as the primary light source, that meant that we wouldn't necessarily be getting all of the detailing and shaping on him that we would get in a night scene, where you might have all of these multiple light sources that can be quite small.
They give us a lot of opportunity to shape and give form to this silhouette."
"We still ended up using quite a bit of creative licence within the daylight scenes and essentially adding multiple small light sources, but still biasing them towards the keylight direction," adds Lee. "That enabled us to get more shape and form on Venom, allowing him to keep a similar look to the previous films and not just losing him to one very hot, bright reflection coming in from the key side, which just wasn't aesthetically very pleasing."
The idea was to ensure that Venom and Wraith Venom did not feel like a black void in daylight scenes. "We would look at imagery of oil as the natural reference," advises Lee, "and if it wasn't getting any reflections, oil almost looks like a black hole into something else entirely. It can look completely unreal. That was something that we were very conscious of trying to avoid. So, we began adding these smaller, additional lights within the rigs that would give us more specular points of interest on Wraith, biasing it towards the key light. This gave us his classic wet look and, when combined with a soft subtle bounce, it really helped get some nice shapes and form without flattening him out from broad, bright sky reflections."
In terms of animating Wraith Venom, DNEG leant heavily into the emotion of the scene with Eddie, allowing Wraith's lips to have a larger range of motion. "It's really challenging from an animation point of view when you've really only got these eyes without any kind of eyebrows or any other kind of facial features and a mouth with very rudimentary lips," admits Lee. "Even though he doesn't have eyebrows, we would treat the top of the eyelids in a very similar way to how we would when we animate with eyebrows."
In addition to building Venom and Wraith Venom for the film, DNEG was also responsible for the CG builds for new characters Venom Horse, Xenophage and the green Symbiote. Builds were shared with other VFX vendors, including Industrial Light & Magic.
Wait, Venom Horse?
Yes, Venom's symbiote abilities transform a horse that he and Eddie ride on in the desert. DNEG referenced Shire and Draft horses to analyze how large muscular horses moved. "They're actually reasonably light on their feet when they start running," states Lee. "Also, when we were doing initial tests on this, you see there's a couple of different ways that you can traverse a distance. You can either move your legs faster or you can have a longer gait. We played around with both, as Kelly wanted to lean into the comical side of the sequence."
Initially, to push this comical aspect, DNEG increased the speed of leg movement, but the result was something too cartoonish. "So, instead, we increased the gait to sell the weight of the horse," says Lee. "Venom's a big guy, and you always want to have that sense of power running through his performance."
For the look of Venom Horse, DNEG had to be careful with its black liquidy sheen. "What we found," reveals Lee, "was that, with the speed of the horse, it would often read a little bit too much like noise, and it would often end up in quite streaky motion blur, which just aesthetically didn't really look particularly great." The studio therefore added a broader sheen like real horses. "It was about leaning into something the audience could grab onto," says Lee. "We still had a touch of that Venom-esque tighter specular running through it, and again, we would bias it onto the key side and allow it to fall off on the shadow side.
I think it looked quite successful in terms of its surface characteristics in the end."
Encounter with the Xenophage
Earlier, Eddie and Venom had hitched a ride on a jet plane, only to fight against a Xenophage mid-air. The sequence was filmed on stage on a partial airplane section, which enabled Hardy to be shown hanging onto the side of the plane. "That enabled us to get a really good grounding in terms of Tom's performance and interaction," notes Lee. "It also enabled us to film the interior of those sequences to a degree, since we had about seven or eight rows of seats, and then could do an extension."
DNEG built a CG plane and environment. One of the principal challenges was selling the right sense of drama and tension with a plane traveling at that speed. "That's where we made sure that we had a lot of clouds that were always visible in the scene," says Lee. "Underneath the plane, we had our wispy clouds that were passing by, but we also really dived into the idea that Venom was essentially something that could liquefy. We really leant into the idea that with this speed of the wind, it was essentially disrupting this solid topology, using additional cloth simulations and giving him all of these ripples all over his body to try and sell that environmental impact."
"Animation took a strong lead in terms of trying to sell the difficulty of moving against that kind of force," continues Lee. "So when the hand comes up to try and gain a new position, for example, you might see it being pulled backwards slightly before he regains his strength and starts pushing it forwards. We looked at a lot of reference; it's fantastic when we have people like Tom Cruise actually doing some of those stunts for real and being able to say, 'Well, that's what your face looks like when you are strapped to the side of the plane.'"
For the Xenophage itself, DNEG considered references of praying mantises, reptiles and even turtles, given the creature was somewhat of a mix of an insect and a shelled beast. Says Lee: "Kelly was really keen to have something that we hadn't seen before, even down to the conveyor-belt teeth effect, where you've got these rows of teeth that are essentially emerging from underneath the gums and pulling its prey back to the back of the teeth, where it then gets crushed by an additional two rollers of teeth."
Venom in attack mode
Fending off an attack in Mexico early in the film, Eddie and Venom unleash their traditional weapons made up of spindly hands, as well as showcasing the classic transformation between Eddie and Venom seen in the previous movies. "Those kinds of transformation shots are just always complex in terms of the feedback between animation, CFX, shot sculpt and effects," outlines Lee. "The ideal situation with all of these departments in a normal film is that everyone just moves on one at a time, and that makes it very, very straightforward. But with these kinds of shots, everyone's standing on each other's toes. It's just not a very linear process, and credit to animation, creature FX, and FX for really working together as a strong unit to get the shots working."
One particular shot sees Eddie almost pushed out of Venom and then thrown across the room via Venom's spidery tendrils. "That was an opportunity for us to really play around with the idea of Venom almost shedding his skin," discusses Lee. "Wraith emerges from Venom while he's still Venom and pushes him out of his own body. And then this allowed the head to actually split in two and start to disintegrate and liquefy as he gets pushed out.
It was quite fun taking these established conventions and then just tweaking them a little bit and playing around with the character."
All images courtesy of DNEG. © 2024 CTMG, Inc. All Rights Reserved.
The post How DNEG made Venom dance appeared first on befores & afters.
-
BEFORESANDAFTERS.COMBlack Friday Deal: Up to 70% OFF Premium VFX Assets at FX Elements!
Maintaining a comprehensive library of high-quality stock footage is essential for professional VFX artists and studios. This Black Friday, FX Elements is offering its most substantial discounts ever, with savings of up to 70% across its entire catalog of professional VFX assets.
With categories of effects ranging from explosions to bullet hits and blood effects, to atmospheric smoke and fog and everything in between, FX Elements delivers a comprehensive library of effects to choose from. Let's take a look at the offer and why this is the VFX sale you don't want to miss!
What's on Sale?
FX Elements is slashing prices across its entire catalog, making it easier than ever to build your VFX asset library:
70% OFF Individual FX Clips: Choose the exact effects you need, with prices starting at less than $3.
40% OFF FX Subscriptions: Lock in lifetime savings and get full access to the entire FX Elements library of assets.
30% OFF All FX Packs: Bundled effects for all your creative needs at a fraction of the cost.
These discounts cover every license type and file type, so you can create worry-free, whether you're producing for the big screen or social media.
Browse FX Elements Library
What Sets FX Elements Apart?
FX Elements has earned a reputation for providing VFX artists with high-quality, flexible, and easy-to-use VFX assets for over 10 years. Here's what you can expect:
No Subscription Required: While our FX Subscription remains the best value available anywhere, we will never force our customers to subscribe or use a credit-based system.
4K to 8K Resolution Footage: Enjoy the clarity and flexibility needed for professional projects, no matter how large the screen.
Broad Compatibility: All effects come in multiple file formats, making them easy to integrate with virtually any workflow.
Immediate Downloads: Start using your purchased assets within minutes, no waiting around.
Versatility for All Projects: From film production to YouTube content, the FX Elements library has something for every artist and creator.
Don't Miss Out!
This Black Friday sale is a rare opportunity to access professional-grade assets at a fraction of the cost. With discounts applied automatically at checkout, there's no hassle, just big savings.
But don't wait! These deals are only available until December 2nd, 2024.
Get started with FX Elements today!
Brought to you by FX Elements: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
The post Black Friday Deal: Up to 70% OFF Premium VFX Assets at FX Elements! appeared first on befores & afters.
-
BEFORESANDAFTERS.COMAnimation producer Jinko Gotoh on getting her start on Space Jam
Plus, a preview of Gotoh's keynote session at SIGGRAPH Asia 2024.
Today on the befores & afters podcast, we're previewing SIGGRAPH Asia 2024 with keynote speaker Jinko Gotoh. Jinko is a producer who has been involved with some of my favorite animated and hybrid animation films over the years, like Space Jam, Finding Nemo, The LEGO Movie 2 and Klaus. She was also part of setting up Disney's The Secret Lab, and she's a key part of the organisation Women in Animation.
At SIGGRAPH Asia in Tokyo, coming up 3 to 6 December 2024, Jinko is a keynote speaker who will be talking about CGI's Evolution and the Power of Diversity in Animation Production. Listen in above to the preview.
Meanwhile, befores & afters is involved in a couple of things during the week in Tokyo:
> Yu Yu Hakusho: How Megalis VFX Created the Show's Most Complex Shot, on Thursday, 5 December at 10:30am JST. I'll be MCing this session, which will discuss and show some cool things from one particular shot.
> Fireside Chat with Paul Debevec on Thursday, 5 December at 3:30pm JST. This is part of Netflix and Eyeline Studios' day of sessions at the conference.
> World VFX Day is also happening right after SIGGRAPH Asia, on Friday 6 December at 4pm JST. I'll pop up with the brilliant Hugo Guerra to chat about all things VFX during the live-stream. World VFX Day also continues on 8 December.
If you're attending SIGGRAPH Asia, I'd love to say hi!
The post Animation producer Jinko Gotoh on getting her start on Space Jam appeared first on befores & afters.
-
BEFORESANDAFTERS.COMOn The Set Pic: The Gorge
From Vanity Fair.
The post On The Set Pic: The Gorge appeared first on befores & afters.
-
BEFORESANDAFTERS.COMRebelway's Black Friday Sale: Unlock 25% Off on VFX Training Courses!
This Black Friday, unlock your full creative potential with 25% off all VFX training courses at Rebelway!
Whether you're a budding VFX artist or a seasoned pro, Rebelway's industry-leading courses will help you master the art of visual effects.
Why Choose Rebelway?
Master industry-standard software: Houdini, Nuke, Unreal and more.
Learn from industry professionals: Gain insights from top VFX artists.
Create stunning visuals: Develop your skills in character animation, environment creation, and dynamic simulations.
Lifetime Access: Learn at your own pace and revisit lessons whenever you need.
Get 25% off any individual course or 10% off any bundle.
Here's how it works. Use one of these coupons at checkout to get your discount:
COURSE25 to get 25% off any course
BUNDLE10 to get 10% off the course bundle; with bundles you save more than $1,000 on courses.
This offer is valid starting from today.
Featured VFX Courses to Take Advantage of This Black Friday
CREATURE CFX IN HOUDINI
HOUDINI FOR 3D ARTISTS
ACTION MOVIE FX IN HOUDINI
CINEMATIC LIGHTING IN HOUDINI
Featured Coding Courses
PYTHON FOR PRODUCTION
MACHINE LEARNING
Other Popular Rebelway Courses to Explore
HOUDINI FUNDAMENTALS
COMPOSITING IN NUKE
INTRO TO UNREAL ENGINE
CITY CREATION IN HOUDINI
REALTIME FX IN HOUDINI
ADVANCED WATER FX
Take a look at some of the standout projects created by Rebelway students, showcasing the impressive VFX skills they've developed.
If you're ready to create incredible VFX projects like these, NOW is the best time. Visit Rebelway's website www.rebelway.net and enjoy 25% off any course with the code COURSE25, or 10% off any bundle with the code BUNDLE10. The offer lasts all week, so act fast!
Not sure which course to choose? Feel free to reach out to them at info@rebelway.net, and they'll be happy to assist you!
Brought to you by Rebelway: This article is part of the befores & afters VFX Insight series. If you'd like to promote your VFX/animation/CG tech or service, you can find out more about the VFX Insight series here.
The post Rebelway's Black Friday Sale: Unlock 25% Off on VFX Training Courses! appeared first on befores & afters.
-
BEFORESANDAFTERS.COMFirst look: see previs and postvis reels for Dial of Destiny
Watch the reels for the first time right here.
James Mangold's Indiana Jones and the Dial of Destiny has been out for some time, but we're now able to bring you a first look at previs and postvis reels from Proof. As you can see, the studio delivered previs for the film in an animated, comic-book style to help inform the technical and storytelling choices. The postvis reel is also fascinating, showing how plates and bluescreen photography were filled in with temporary visual effects to help the editorial process.
Check out the reels, below.
The post First look: see previs and postvis reels for Dial of Destiny appeared first on befores & afters.
-
BEFORESANDAFTERS.COMYou really don't want to miss this latest OTOY short as part of the Roddenberry Archive
It includes the return of William Shatner as James T. Kirk.
You may have already seen some of the intriguing Star Trek-related shorts produced by OTOY as part of The Archive from the Roddenberry Archive.
The latest is 765874: Unification, which celebrates the 30th anniversary of Star Trek: Generations. It launched on the web and via the Apple Vision Pro app. In it, we see live-action footage and CG images, with actors portraying characters like James T. Kirk and Spock during the shoot. According to OTOY's blog, performances came from Sam Witwer as James T. Kirk, with Lawrence Selleck as Spock. Witwer and Selleck were filmed in costume, performing as Kirk and Spock on set, aided by both physical and digital prosthetics, resulting in period-accurate portrayals matching the appearance of the characters as they originally appeared in TV and film at the time.
Watching the short, and seeing a few behind-the-scenes images and videos here and there, it really boggles the mind how they handled the face replacement work (which, as noted above, they call digital prosthetics). The visual effects supervisor was Mark Spatny.
Here's a fun video from production designer Dave Blass:
For folks using terms like "AI" and "Deep Fake" #Unification was all done in camera with @SamWitwer performance captured along with his Kirk version LIVE. This next level of Digital Prosthetic technology used by actors and craftsmen will be huge. It's technology in the hands of pic.twitter.com/OnDXQux3cD Dave Blass (@DaveBlass) November 20, 2024
Head to OTOY's blog post for more info.
The post You really don't want to miss this latest OTOY short as part of the Roddenberry Archive appeared first on befores & afters.