• Sonic The Hedgehog Will Be Back For A Fourth Movie
    www.nintendolife.com
    Is anyone surprised? We're just one day away from Sonic the Hedgehog 3, which introduces Shadow the Hedgehog to the blockbuster franchise, hitting cinemas worldwide. But Variety is reporting that Paramount Pictures has already greenlit a fourth film. Shocked? Nope, not us. According to the report, the movie is aiming for a Spring 2027 release. We know nothing else yet, but given that director Jeff Fowler has stuck around for the first three movies, we wouldn't be surprised if he's also on board for a fourth run. Read the full article on nintendolife.com
  • Cult RPG Hit 'OFF' Makes Its Way To The Switch In 2025
    www.nintendolife.com
    With a physical edition to boot. If you're a fan of Undertale, then chances are you may have heard of OFF, even if you've yet to actually play it. Creator Toby Fox often cites the RPG as a huge influence on Undertale, and now the 2008 cult hit is making its way to the Switch. Arriving in 2025, the new port boasts official translations into multiple languages (including English) alongside numerous quality-of-life improvements. Fans and newcomers alike can look forward to an all-new battle system, balancing improvements, and brand-new areas and enemies from developer Mortis Ghost. Read the full article on nintendolife.com
  • Sam Altman once owned some equity in OpenAI through Sequoia
    techcrunch.com
    OpenAI CEO Sam Altman sat before Congress in 2023 to testify about the dangers of AI. He told American lawmakers at the time that he owns no equity in OpenAI, something he's said many times, claiming he just runs the company because he loves it.

However, Altman recently said he actually did have some equity in OpenAI through a Sequoia fund at one point, a stake he has since sold. In an interview with Bari Weiss released Thursday, Altman was asked what kind of stake he might have if OpenAI successfully converts into a for-profit company. Here's what the OpenAI CEO said: "I have a tiny sliver of equity from an old YC fund. I used to have some via a Sequoia fund, but that one turned out to be easier to, like, sell and not keep the position in, so I have a very small amount that's quite insignificant to me. In terms of what I will or won't have going forward, I don't know. There's no current plan or promise for me to get anything."

While Altman's investment through Y Combinator was known, his investment through Sequoia was not. OpenAI discloses Altman's indirect investment in his own company through YC on its website. The startup says this small investment is the CEO's only interest in the company and was made before he worked full time at OpenAI.

Sequoia first invested in OpenAI in 2021, according to its website, two years after Altman became the full-time CEO of OpenAI. At that time, OpenAI was worth roughly $14 billion, a valuation that has since exploded to $157 billion after the startup's latest funding round earlier this year, a round Sequoia participated in as well.

While Sequoia's stake in OpenAI from 2021 is worth a lot more now, there are several unknowns about Altman's investment through the venture firm. Venture firms like Sequoia aren't required to disclose their limited partner investors.
It's unclear when Altman sold the stake and for how much. An OpenAI spokesperson confirmed Altman's prior exposure in a statement to TechCrunch, but did not offer specifics on these aspects. "Sam has never had any direct ownership in OpenAI. He held a negligible stake, less than a fraction of a percent, in a general Sequoia fund with a broad portfolio, which he later learned included minimal exposure to OpenAI," said OpenAI spokesperson Kayla Wood in a statement to TechCrunch. "Sam no longer has any ongoing commitments to the fund."

Most CEOs do have equity in the companies they run. The biggest percentage of a CEO's pay, if they are running a public company, is equity. And of course, startup founders start their journey owning all of the equity in their companies, until they grant shares to employees and sell off chunks to investors. But OpenAI was founded as a non-profit, has an unusual structure, and Altman has repeatedly said he doesn't own any equity. Just this month, Altman said he had no equity in OpenAI during The New York Times' DealBook Summit.

During a May interview with the All-In podcast, the OpenAI CEO said he originally decided not to take equity in the company because of its corporate structure. According to its charter, OpenAI's non-profit board is required to be filled with a majority of independent directors, meaning they can't have equity in the company. Altman says this led him to not take any equity, in order to be one of those independent directors. However, this has caused many people to question the CEO's motives at the company, Altman said, which is likely one reason the company is shifting away from this structure.

Altman's stake in OpenAI has also become increasingly relevant as the company attempts to transition its for-profit branch, which is currently controlled by the non-profit board, into an independent company.
OpenAI is also reportedly contemplating granting the CEO some equity in this transition, though the company and Altman have denied that there are plans to do so.

OpenAI's for-profit transition is currently at risk of being held up by Elon Musk's lawsuit against the startup. At its core, Musk's lawsuit claims that OpenAI is abandoning its original non-profit mission to make the fruits of its AI research available to all. However, OpenAI recently claimed that Musk wanted to convert the startup into a for-profit from the start.

At one point in Altman's interview with Weiss, the OpenAI CEO called Elon Musk a bully who clearly likes to get in fights. At another point, Altman lashed out at Meta for asking California's attorney general to block OpenAI's for-profit transition. "I don't know why Meta sent that letter, but I do know they know that's not how it works. I know that part is in bad faith," said Altman. "You can imagine lots of other reasons that Meta might have sent this letter. You can imagine they wanted to curry favor with Elon; you can imagine that they felt like it would help them compete with us."

While the company says Altman's exposure to OpenAI through Sequoia was negligible, it's hard to square Altman's earlier comments about having no equity in OpenAI with his most recent remarks on Weiss' podcast.
  • Perplexity has reportedly closed a $500M funding round
    techcrunch.com
    In Brief | Posted: 2:18 PM PST, December 19, 2024

AI-powered search engine Perplexity has reportedly closed a $500 million funding round, valuing the startup at $9 billion. Bloomberg, citing sources familiar with the matter, reports that the round was led by Institutional Venture Partners and that it closed earlier in December. In an email to TechCrunch, a Perplexity spokesperson declined to comment.

The mammoth tranche comes as competition in AI-powered search heats up. OpenAI recently launched ChatGPT Search, its answer to Perplexity. And Google is developing capabilities to rival some of what Perplexity offers, including AI-generated summaries and answers on search results pages. This week, The Information reported that Google is planning a more aggressive move, potentially building a chatbot-like AI mode directly into Google Search.

Perplexity isn't resting on its laurels even as it battles a class action lawsuit over alleged copyright infringement. On Wednesday, the company made its first acquisition, snatching up Carbon, a startup specializing in connecting AI systems to external data sources.
  • Boon raises $20.5M to build agentic AI tools for fleets
    techcrunch.com
    Logistics is the name of the game during the holiday season: companies that can seal the deal and get people and things to the places they need to be, on time, rake it in this time of year. But behind that demand lies a huge amount of inefficiency and fragmentation. Are logistics businesses ready for AI to help run their services better? A startup called Boon thinks the answer is yes. It has now raised $20.5 million to prove that out, by way of a platform that helps them make better use of data from disparate applications to improve their operations, planning, and overall efficiency.

"Think of Boon as the second employee in the back office," said Deepti Yenireddy, the company's founder and CEO, in an interview. "Our AI agent is like another teammate doing critical work so that people can focus on tasks that actually make them more money."

The funding is coming from Marathon and Redpoint, which have backed it in a $15.5 million Series A and a previously undisclosed $5 million seed.

Taking just goods carriers alone, there are more than 60 million fleet vehicles globally, according to research from Berg Insight, with the vast majority of the companies operating them classified as SMEs. Meanwhile, the tools they use are equally scattered: accounting, routing, sales, HR. On average, between 15 and 20 different apps and pieces of software are used to run a logistics or fleet company, all existing in silos surrounded by reams of physical paperwork. As Urvashi Barooah, the partner who led the investment for Redpoint Ventures, described it, first-generation point-solution software tools have added a heavy administrative load to fleet management companies. Boon thinks it can speed up efficiency in those systems tenfold with its AI tooling.

Focusing initially on revenue and operations workflows (for example, helping build more efficient routing and find the best places to fuel up), the plan is to use the funding to expand the kinds of workflows it can cover, such as to help improve how
containers are loaded or how to optimize staffing.

Yenireddy said she came up with the idea for Boon while at her previous job as senior director of product at the fleet operations giant Samsara. "We know this customer deeply, from my past experience leading product, telematics, and international product at Samsara," she said. "These customers want a single place and a single platform. They're doing so many things, and they want simplicity in the technology they adopt. That's the reason and motivation behind building this."

She also has a track record as a founder, having previously built an AI company in the HR sector that she sold to Phenom People, an AI recruitment platform. So rather than consider how she might build this within Samsara, she struck out to build it as Boon. "Once a founder, always a founder," Yenireddy said. She has pulled together alums of Apple, DoorDash, Google, Samsara, and Shell to bolster her vision. (And it's actively hiring now for more go-to-market people and engineers.)

The funding comes on the back of some strong interest. Boon has paying customers that represent 35,000 drivers and 10,000 vehicles on its platform, working out to the company reaching an annual revenue run rate of $1 million after nine months of business.

This is just scratching the surface, and going deeper could come with some bumps. The actual work of building a platform that can work intelligently across different data silos to boost enterprise intelligence has been something of a holy grail in the B2B world, at the heart of what other big (and heavily funded) startups such as H are also trying to do in the arena of agentic AI. At the same time, if such applications actually succeed, they might usher in major efficiency gains, as well as raise questions about what humans will do with the extra time.
  • Proof Shares Something in the Water Previs Reel
    www.awn.com
    The leading visualization studio's work involved animating and editing a number of sequences, including a boat crash and a shark chase, for the StudioCanal film about a group of friends who meet up for a wedding in the Caribbean and become stranded at sea after their boat sinks.
  • DREAMBOX360 Launches A Christmas Dream 5D Projection Experience
    www.awn.com
    The immersive event aims to bring the magic of the holidays to life with 5DX experiences, such as cutting-edge projection visuals, sound, and special effects.
  • DC Studios Drops Superman Official Trailer
    www.awn.com
    He's finally here! After all the rumors, conjecture, peeks, teases, and furtive glances, we finally get to see the new Man of Steel, Lois Lane, Lex Luthor, and superpup Krypto! Hitting theaters next summer.
  • Gladiator II: Christian Kaestner (VFX Supervisor) & Kyle Dunlevy (Animation Supervisor) Framestore
    www.artofvfx.com
    By Vincent Frei - 19/12/2024

In 2022, Christian Kaestner shed light on the visual effects crafted by Framestore for 1899. Today, he shares insights into his work on the highly anticipated sequel to the iconic film Gladiator. With over 25 years of experience in animation, Kyle Dunlevy has built an impressive portfolio, working on renowned projects such as Jurassic World, Ghost in the Shell, Paddington 2, and The Tomorrow War.

How did you and Framestore get involved on this show?

Christian Kaestner (CK) // Framestore is a well-established visual effects house in the industry, and we have longstanding working relationships with several members of the production team for Gladiator II. My producer, Jeanne-Elise Prévost, and I previously worked on Ridley's Alien: Covenant together, where we were responsible for the Facehugger and Chestburster. We have a passion for intricate and challenging creature work, and Gladiator II certainly offered a great opportunity to push our limits.

What was your feeling about working on the sequel of a cult movie?

Kyle Dunlevy (KD) // I think we were all absolutely thrilled at the opportunity to contribute to the Gladiator sequel. I, for one, have a very special place in my heart for the original Gladiator; not only was it an amazing film, but I will never forget seeing it in the theatre. I was in the front row in a cinema in Rome, having just visited the Colosseum and toured the city for the first time that very day! I remember feeling very lucky to be there and feeling so grateful for that film.
What a world we live in where films and VFX can show us what life was like long ago and bring our imaginations to life.

How was the collaboration with Director Ridley Scott and VFX Supervisor Mark Bakowski?

CK // On a production of this size, it is common to collaborate primarily with the production-side visual effects supervisor and have limited direct interaction with the director. I have known Mark for a long time, and we worked on several projects together prior to Gladiator II. While this relationship didn't make the creature work any less challenging, it did make communication and reviews very efficient and productive. Mark was great at guiding us on how to make the most of the limited time we had with Ridley.

How did you organize the work with your VFX Producer?

CK // We approached organizing Gladiator II like any other visual effects show, working closely with the client on timelines, temporary deliveries, and expected levels of execution. We then worked with the resourcing team to lay out a basic schedule based on all available information. What made this project slightly more challenging was the impact of the SAG-AFTRA strike in 2023. Unfortunately, the shoot for Gladiator II was affected by this, and while our sequences were already in the can, it was challenging to confidently work on the editorial context. Luckily, we had enough material to progress with our creature development, and the existing edit gave us sufficient insight into the requirements for our scope of work. While the writers' strike had a bigger impact on the shooting schedule, it didn't significantly affect our work. It was challenging to progress confidently while the strike was ongoing, but as soon as it was resolved, we resumed full swing on our sequences.
This was a crucial step in our process, as it was important to Mark that we build trust and confidence with Ridley, ensuring that these challenging scenes would work as intended in the final edit.

What are the sequences made by Framestore?

CK // Framestore's main focus was the realization of the Baboon Fight sequence, the battle with the rhino in the Colosseum, and the River Styx dream sequence.

Can you walk us through the creative process of bringing the baboons and rhinoceros to life through visual effects?

CK // Both of these sequences presented their own unique challenges, both in terms of the type of creature and how the sequences were captured. Let's start with the Baboon Fight sequence. To ensure sufficient physicality in the actors' performances when fighting CG baboons, Ridley utilized his stunt team to perform the baboon actions and interactions. This provided him with visuals for shot composition and gave the actors something to react to during filming. While this method was particularly useful on set, it posed additional challenges for visual effects in post-production. The stunt performers were excellent references, but it was equally challenging to perfect the timing and actions, as real baboons are smaller and move faster than humans can replicate.

One key to success was the collaboration with Framestore's Pre-Production team (FPS), who provided the post-viz, which not only answered many editorial questions but also allowed our animation team to get involved early in the process. Kyle Dunlevy, our animation supervisor, played a vital role in guiding the performances, physicality, and timing of the baboon actions.

One of the significant differences from usual creature work was the need to match realistic baboons. While having real-life references can make things easier compared to creating fantasy creatures, the devil is in the details. Audiences are much more critical of anything they can directly compare to reality.
They are far less forgiving of any mistakes in motion or physics.

Once we had a general edit with the timings and actions Ridley wanted, we progressed with the detailed baboon animation. This involved extensive research and behavioral studies to ground the animations in reality. Mark and Ridley supplied many specific references for actions they wanted to reflect in the animations.

An additional challenge was the casting choice of Lucius' main opponent: a baboon with a skin condition called alopecia. Ridley had come across a particular reference that he wanted to match precisely. A hairless baboon is a rare sight, and finding suitable references was challenging. The reference baboon was muscular and lean, which required meticulous attention to deformations and skin slides in our asset builds. Since Ridley strongly favored this specific reference, we decided to track and match it exactly, allowing us to directly compare our CG asset to reality. This was crucial to the success of the creature work. For the remaining supporting baboons with fur, we used countless video references of baboons interacting with each other and humans, including jumping on people and pulling on clothing. Real-life footage was invaluable in capturing these behaviors authentically.

The battle between the gladiators and the rhino in the Colosseum came with an entirely different set of challenges. Ridley's special effects supervisor, Neil Corbould, designed and built an animatronic rhino that could articulate and be remote-controlled to choreograph the entire battle. This was invaluable for the camera team, ensuring accurate composition and framing, and gave the actors something tangible to interact with. Additionally, it served as a perfect lighting reference, as the animatronic was highly detailed with accurate textures and colors. In visual effects, we separated the rider from the animatronic and replaced the animatronic with a CG rhino.
Our starting point was to match the highly detailed SFX build and add a CG skeleton, anatomy, fat, and a thick layer of skin to allow for simulation once the animation was finalized. The CG and SFX rhinos were similar in size, so the paintwork was less complex compared to the stunt performer removal required for the baboon sequence.

One of the challenges in replacing the SFX rhino with the CG version was that the dynamic movement of the animation differed significantly from that of the animatronic rig. While the rig could move precisely and quickly, there was a distinct difference in the up-and-down motion, requiring us to add this dynamic to the rider without breaking the believability of the actor riding the CG rhino.

As with the baboons, we relied on extensive real-life references for the rhino to understand its motion, weight, speed, and anatomy, from the skeleton to the outer skin. Everything needed to be accurate for the animation to harmonize with the simulation of muscles and fat beneath the thick skin. Beyond looking realistic, the rhino needed to convey its immense weight and sheer power. The process was detailed, challenging, and immensely rewarding to execute.

What were the biggest challenges in designing realistic fur, skin textures, and muscle movements for the baboons and the rhinoceros?

CK // The level of detail required for CG creatures today is critical to their success. Visual effects have evolved so much over the last few decades that it's nearly impossible to get away with any cheats. Physically accurate shaders only work effectively when every material is meticulously replicated with the correct properties. Each material (fur, skin, muscle, or fat) needed its own set of shaders and texture maps to ensure accurate representation. Fortunately, we had a wealth of references for both the baboons and the rhino. For the hero alopecia baboon, we had a very specific reference that Ridley wanted to match.
This became our anchor point, grounding the CG creature in something tangible and realistic. The intricacy of the hairless baboon presented a unique challenge, as every muscle movement and skin slide needed to be flawless. To achieve this, we camera-tracked the reference footage and match-animated a section of the baboon walking. The lighting team meticulously replicated the lighting from the reference footage, painting out the real baboon from the plate and compositing the CG version into the scene. This allowed us to A/B compare the CG baboon with the reference footage, establishing a solid visual anchor. Only after achieving a perfect match did we make slight adjustments to add scars and other cinematic details. The same principles applied to the rhino, although in this case we matched the carefully crafted SFX build for texturing and rendering while relying on real-life reference footage primarily for animation and simulation.

Motion studies were as critical as physically accurate texturing and rendering. Even the best still frame of a CG creature is useless if the animation, anatomy, and physics simulation lack accuracy. While there's always some creative freedom for cinematic drama, the anchor must remain in realistic reference to ensure the audience finds the creatures believable. Balancing these elements was both a challenge and a rewarding part of the process.

How did you achieve the nuanced facial expressions and behaviors for the baboons to make them feel lifelike and expressive?

KD // For our facial performances, we used baboon video reference to inform everything we did. From how much the jaw opens, to how much the nose lifts, to how high the brows can go, we relied on reference footage to inform all our decisions. We worked with the modellers and riggers to create over a hundred facial shapes that the animators could use.
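Facial shape libraries of this kind are conventionally built as blendshapes: each shape stores per-vertex offsets from a neutral mesh, and the animator dials in a weight per shape. A minimal sketch of that idea (the shape names and values here are illustrative, not Framestore's actual rig):

```python
import numpy as np

def apply_blendshapes(neutral, deltas, weights):
    """Combine facial shapes as a weighted sum of per-vertex offsets.

    neutral: (V, 3) rest mesh; deltas: dict of name -> (V, 3) offsets;
    weights: dict of name -> float, the sliders the animator keys."""
    mesh = neutral.copy()
    for name, w in weights.items():
        mesh += w * deltas[name]
    return mesh

# A tiny two-vertex "face": jawOpen moves vertex 1 down,
# browRaise moves vertex 0 up.
neutral = np.zeros((2, 3))
deltas = {
    "jawOpen": np.array([[0.0, 0.0, 0.0], [0.0, -1.0, 0.0]]),
    "browRaise": np.array([[0.0, 0.5, 0.0], [0.0, 0.0, 0.0]]),
}
posed = apply_blendshapes(neutral, deltas, {"jawOpen": 0.5, "browRaise": 1.0})
print(posed)  # vertex 0 y = 0.5, vertex 1 y = -0.5
```

In a production rig, each of the hundred-plus shapes would be sculpted to the ranges observed in the reference footage, which is why the jaw, nose, and brow limits mentioned above matter so much.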
Each shot used specific references of Chacma baboons to inspire the expression choices and ranges.

Can you share insights on the technical aspects of animating the interaction between the baboons?

KD // When dealing with shots that had our CG baboons interacting with an on-set performer, we first needed to ensure that we had a very tight body track of the actor/digital double. The digital double could then be used in our Maya scene so the animators knew exactly where to place a baboon hand, foot, or set of jaws! We also had to simulate the cloth for a good connection, as well as the baboons' fur.

What role did motion capture or performance capture play in animating the baboons?

KD // No motion capture was used. However, when filming the scene, there were on-set baboon performers that our animators used as inspiration for their shots. The performers wore grey suits with furry gloves and face makeup and did a great job capturing the high-energy, frenetic spirit of angry baboons. On set, this gave the actors something to fight or avoid, but it also gave us a great starting point for our animated performances. There is one piece of background action in particular that we loved so much we literally copied it as closely as we could. Try to spot the baboon who uses two hands to violently push and pull on the clothing of a fallen gladiator.

Were there any specific references or inspirations used for the design and animation of the rhinoceros?

KD // Not only did Ridley Scott provide reference images and concept designs for our rhino, but he also had a life-size animatronic rhino on wheels moving through every shot, with the actor riding on top, of course! Our job was to replace that on-set rhino with a digital version. One of the big challenges of the rhino work was that the on-set rhino on wheels did not move up and down very much, which meant the actor/rider didn't either. We had to be clever about how to connect the movement of both the CG rhino and the rider.
Many of the shots actually contain a fully CG rider; see if you can find them.

How did your team approach the challenge of integrating the rhinoceros seamlessly into scenes with live-action elements?

CK // The entire rhino battle was captured using the SFX animatronic as a stand-in performer, capable of moving at speeds up to 30 mph. This allowed Ridley to shoot the sequence dynamics as he envisioned, with the animatronic generally in the correct position. While some minor adjustments were needed for timing (such as transitions from walking to running or running to trotting), the animatronic's scale matched the intended final size of the CG rhino. This ensured a strong foundation for the integration of the CG element. That said, this process was far more complex than it might initially sound. As with texture and shader accuracy, the devil was in the details. While it was relatively straightforward to block the sequence, integrating the rhino with the environment, live-action actors, and the surrounding elements presented significant challenges.

For instance, sand and dust interactions were crucial to making the rhino feel grounded in the scene. Additionally, nuanced adjustments were required for the live-action performances to maintain a believable spatial relationship between the animatronic and the actors. Safety distances between the animatronic and the actors meant that certain interactions (like the rhino lashing out with its head or knocking a gladiator to the ground) had to be reimagined in post-production. The rider posed a similar challenge, as the animatronic's movements were slightly too smooth to create a convincing riding motion. We had to add dynamic rider motion to match the rough and uneven movements expected when riding a massive animal like a rhino.

To complete the integration, we painted out the animatronic and the tire tracks left on the Colosseum ground. The sandy ground was recreated in CG, and we simulated dust and sand interactions for the rhino's footfalls.
Lastly, we added sand particles to the rhino's body, ensuring that dust would naturally come off as it ran or moved quickly. These details collectively made the CG rhino appear seamlessly embedded in the live-action footage.

What software and tools were primarily used in creating the detailed animations for the baboons and the rhinoceros?

KD // The baboons and rhino, like almost everything we do at Framestore, were all keyframe animated in Maya. We focused our energy on weight and balance, always looking for strong poses and clear silhouettes. For things like ears and tails, trying to be as efficient as possible with our time, we relied on our own Dynamic Curve tool to give us some of that natural overlap for free. A baboon's tail is long, with many controllers, and keyframing that is time-consuming. I'm a big fan of smart, user-friendly animation tools that allow us to improve our workflow and move faster through the shot. At Framestore we have Fiona Kaye and her Anim-Tech team, which has created dozens of amazing animation tools for us over the years.

Were there any unique techniques or breakthroughs that your team developed during the creation of these complex animal sequences?

CK // We are always striving to improve our character pipeline, and for the rhino, we developed and tested a new tool for simulating the fat layer between the muscles and skin. This fat layer plays a crucial role in absorbing and redistributing the motion of the underlying muscle anatomy. In the rhino's case, the skin could be a couple of centimeters thick, while the subcutaneous fat layer could range from three to seven centimeters. Such a thick layer not only absorbs movement but also exhibits its own dynamic behavior, especially when the animal is walking or running. While this approach isn't entirely novel in principle, our team developed a new workflow to achieve more physically accurate simulation results.
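A fat layer that absorbs and redistributes motion behaves, to a first approximation, like a damped spring sitting between the moving muscle surface and the skin. A minimal one-dimensional sketch of that idea (the constants and function names are illustrative, not the proprietary tool described above):

```python
import math

def simulate_skin(muscle_y, stiffness=40.0, damping=6.0, dt=1.0 / 24.0):
    """Skin height per frame, driven by the muscle surface through a
    spring-damper: the skin is pulled toward the muscle but lags and
    jiggles, absorbing and re-emitting the underlying motion."""
    y, v = muscle_y[0], 0.0  # skin starts at rest on the muscle
    out = []
    for target in muscle_y:
        accel = stiffness * (target - y) - damping * v
        v += accel * dt          # semi-implicit Euler step
        y += v * dt
        out.append(y)
    return out

# Muscle surface bobbing sinusoidally, like a running gait at 2 Hz,
# sampled at 24 fps for two seconds; the skin trails behind it.
muscle = [0.05 * math.sin(2 * math.pi * 2.0 * (f / 24.0)) for f in range(48)]
skin = simulate_skin(muscle)
```

Raising the damping makes the skin settle faster, while lowering the stiffness exaggerates the jiggle, which is roughly the trade-off a thicker fat layer introduces.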
The new workflow ensured that the rhino's motion was as lifelike and believable as possible, with realistic interplay between the muscle, fat, and skin layers. It allowed us to push the realism of the animation further, creating subtle yet essential details in how the rhino's body responded to movement and physical forces, enhancing the overall believability of the creature on screen.

How did lighting and rendering play a part in achieving the photorealism of the baboons' fur and the rhinoceros' tough skin?

CK // These days, everything in visual effects (from lighting to rendering, texturing, and simulation) needs to be physically accurate. However, achieving photorealism is about striking the right balance between physical accuracy and visual believability. Just because we can replicate everything to perfection doesn't mean we should. The key is finding the sweet spot where accuracy becomes indistinguishable to the human eye. For the baboons' fur and the rhino's tough skin, we began the look development process with a level of accuracy higher than what would ultimately be used in production renders. This gave us a visual guide, or anchor, to inform the final look. For example, while we might start with simulations that use 100 light bounces or extremely high ray depths, we refine these settings to maintain efficiency while preserving the integrity of the visuals.

The goal was to ensure seamless integration into the live-action plates. The audience should believe the CG baboons and rhino were physically present on set. This required careful adjustments in lighting to match the environment and attention to fine details, such as how light interacted with the textures of fur or the rough, layered structure of rhino skin. Ultimately, visual effects are a craft of smoke and mirrors. The best work is invisible, blending into the scene so naturally that it supports the story without drawing attention to itself.
When the audience never questions whether a CG element is real, we know we've done our job well.

Rhino shots at 00:04

Were there any scenes that proved to be particularly difficult when working with these digital creatures, and how did you overcome them?

CK // The baboon fight sequence was definitely the most challenging. It became apparent early in the post-viz stage that matching the fast-moving, agile, and vicious behavior of the baboons with the live-action performances would be extremely demanding. While the stunt team and actors did an exceptional job replicating the dynamics of the fight, real baboons move far more quickly and fluidly than humans can mimic. This discrepancy was less of an issue for free-moving baboons in the background but presented significant challenges for shots where baboons interacted directly with Lucius. In these cases, we often had to use a trial-and-error approach to anchor the interaction moments. We had to carefully determine which parts of the physical interaction were most crucial for believability and which could be adjusted, such as through retiming or partial CG replacements of costumes or props.

There wasn't a one-size-fits-all solution; each shot came with its own set of challenges and required a tailored approach. Some shots needed more retiming, others more compositing or animation adjustments. Our guiding principle was to retain as much of the original photography as possible while enhancing the plates enough to make the interactions believable. This often involved combining partial CG enhancements with subtle retiming and compositing work to bridge the gap between the live-action footage and the dynamic, fast-paced behavior of the CG baboons.
This meticulous approach ensured the sequence felt grounded and realistic, even with the extraordinary demands of animating such complex creatures.

What are some of the reactions you've received from the filmmakers or audiences regarding the realism and impact of the baboons and rhinoceros sequences?

CK // This is an interesting question because the reactions to the two sequences have been quite distinct. The baboon sequence, particularly with the hairless baboon, was likely the more challenging of the two to execute. It required an extraordinary level of attention to detail to make it look realistic, and the hairless design was inherently polarizing and unusual. Despite these challenges, I feel the sequence turned out particularly well and delivered the impact we were aiming for.

On the other hand, I've heard that everyone loves the rhino. This might be partly because the rhino was already an idea Ridley had envisioned for the original Gladiator, but it was deemed too impractical at the time. Now, with the advancements in visual effects, the rhino could finally take center stage and have its moment of fame in the sequel. The sheer power and presence of the creature seem to have resonated well with both filmmakers and audiences.

Were there any memorable moments or scenes from the film that you found particularly rewarding or challenging to work on from a visual effects standpoint?

CK // As I briefly mentioned earlier, creating realistic animals is often more challenging than designing fantastical or non-existent creatures. With real animals, every aspect (the look, the animation, the physics) has to be flawless. When audiences are familiar with the creature, whether from a visit to the zoo or from documentaries, the risk of falling into the uncanny valley is significantly higher because there's a direct point of comparison to real life.

Both the baboons and the rhino were incredibly challenging to bring to life but equally rewarding.
The baboon sequence required meticulous attention to detail, especially with the hairless baboon, whose unique appearance made even small inaccuracies more noticeable. Similarly, the rhino demanded precision in weight distribution, muscle movement, and interaction with the environment.

What made these challenges rewarding was the satisfaction of overcoming them. Seeing the final sequences come together, after so much effort in research, animation, and integration, was an immensely gratifying experience.

Rhino shots at 00:05

Looking back on the project, what aspects of the visual effects are you most proud of?

CK // I'd have to pick the baboon sequence. Seeing the before-and-after comparisons really highlights just how challenging it was to bring the sequence to life. From the live-action plates with the stunt performers to the hairless baboon and the brutally vicious fight choreography, we forged a plan to execute it all, but there was always uncertainty about whether it would truly work until everything came together.

Almost every step in the process had to be completed before we could feel confident in the result. When it finally did come together, it was incredibly rewarding. I clearly remember the day Mark presented an almost-finished fight sequence and broke the news that Ridley was happy with it. That moment was a huge relief and a proud milestone for the team.

Is there something specific that gives you some really short nights?

CK // Surprisingly, not on this show. Pulling off the rhino battle and the baboon fight was undoubtedly a challenging task, but the goalposts were always clear. There was never any question about what the sequences needed to be: they just needed to feel real. It's when the goalposts aren't clear that sleepless nights begin. Ambiguity, uncertainty, or constantly shifting objectives can make a project stressful.
On Gladiator II, the clarity of vision and expectations allowed us to focus on execution rather than second-guessing the end result.

What is your favorite shot or sequence?

CK // I'd have to say the Baboon Fight sequence. There's something about seeing all the layers of work come together, from the live-action plates with the stunt performers to the hairless baboon and the chaotic fight choreography. It was a sequence where every step of the process felt like it could go either way, and we didn't really know if it was going to work until everything clicked in the end.

The hairless baboon, in particular, was such a unique challenge, and getting it to look right was no small feat. I remember the moment when Mark showed the nearly completed sequence, and we heard that Ridley was happy with it; that was a huge relief. It's one of those sequences where, when you look at the final result, you can't help but feel proud of what the team accomplished.

What is your best memory on this show?

CK // My best memory was a mix of the team effort and a specific moment of recognition. We'd been working tirelessly on the hairless baboon, focusing on every detail: skin deformations, muscle firing, and the overall physicality of the creature. It was an incredibly complex task, but the team came together, solving problems and refining every step until we had something we were really proud of. The moment we presented the final-quality shot to Ridley stands out. It was a simple walk cycle, but every muscle and movement was working in harmony, bringing this unique and intimidating creature to life. When Ridley saw it, his reaction was immediate, something along the lines of, "Oh wow, he's a real force of nature."
That moment captured what this project was about: the team's dedication, the creative challenges, and ultimately the satisfaction of seeing it all come together in a way that resonated with the filmmakers.

How long have you worked on this show?

CK // I have been working on the show for about 12 months, give or take.

What's the VFX shots count?

CK // Framestore's shot count on Gladiator II was 136 final VFX shots.

A big thanks for your time.

WANT TO KNOW MORE?
Framestore: Dedicated page about Gladiator II on Framestore website.
Mark Bakowski: Here's my interview with Production VFX Supervisor Mark Bakowski.

© Vincent Frei – The Art of VFX – 2024
  • Superman
    www.artofvfx.com
Movie & Games Trailers
Superman
By Vincent Frei - 19/12/2024

James Gunn reimagines the legendary Man of Steel in a breathtaking new trailer. Witness the iconic hero like never before in this first trailer of Superman!

The VFX are made by:
Framestore
ILM
Weta FX

The Production VFX Supervisor is Stephane Ceretti.
The Production VFX Producer is Susan Pickett.

Director: James Gunn
Release Date: July 11, 2025 (USA)

© Vincent Frei – The Art of VFX – 2024