• WWW.VFXVOICE.COM
    VISUAL EFFECTS AND ANIMATION BRING HISTORICAL EVENTS TO LIFE FOR DOCUMENTARIES
By TREVOR HOGG

When it comes to covering historical events, documentarians go on a journey to find and acquire rights to archival footage and photographs, or fill in the visual gaps with talking heads or reenactments. In some cases, the reenactments are more about being authentic to the emotion of a moment than to the actual physical details. With technology becoming more affordable and accessible, the ability to use visual effects and animation within a tight budget has allowed for even more creative and innovative ways to bring the past to cinematic life.

Bad River

"We do social justice documentaries," states Andrew Sanderson, Associate Producer at 50 Eggs Films. "Bad River deals with a Native American tribe called the Bad River Band, located in Northern Wisconsin, who are fighting for their sovereignty. Some things are happening now, and some things happened back in 1845 or 1850 that we don't have any photos, footage or music from, so we had to be creative when we were making the film. We want to tell stories the best we can. A lot of the Elders who we interviewed from the band would tell stories of Chief Buffalo, the historic chief of the La Pointe Band of the Ojibwe, and other Ojibwe leaders going to Washington, D.C. in 1852 to try to convince President Millard Fillmore not to remove them from their land. These are stories that have been passed down from generation to generation, and it's important for us to get it right but let the folks doing the interview tell their story."
Illustrations by Bad River Band youths as well as courtroom drawings were the inspiration for the animated sequences created by Punkrobot. (Images courtesy of 50 Eggs Films)

Sanderson employed unique approaches to making the film. He remarks, "There is a great sense of community, so we wanted to include Bad River as much as we could in the filmmaking process. We would identify local youth artists in the area, and they would make sketches for us of different scenes or elements we were trying to capture. Then we take those sketches and give them to Punkrobot, an animation company in Chile, which would bring them to life."

Animated sequences were expanded upon. "There is a scene where one of the interviewees is describing, when he was younger, people from the Bureau of Indian Affairs driving around the reservation trying to catch kids to bring them to boarding schools," Sanderson explains. "We had one of our youth artists draw a man coming out of a car. Then we would have Punkrobot animate that and bring it even a step further into a whole animated sequence. Sometimes, it would transition to another still that we had or another piece of media, so it flowed well. In another example, we had licensed some black-and-white footage of the front lawn of the White House that had sheep eating the grass. We had Punkrobot sketch out what would be the next scene, and from there, it transitioned into the sketch of the interior of the White House where they're plotting to take land from different reservations." A legal battle between the Bad River Band and Canadian oil and gas pipeline operator Enbridge is included. "They had a case that was in Madison Western District Superior Court, so we weren't allowed to have any photographs or recording devices in the court, but we wanted to show what was going on."
"We hired a courtroom sketch artist, told him who the key people were, and had him get a selection of sketches over two days. Then, we had Punkrobot animate those sketches to tell the story of what was going on in the courtroom when we couldn't have told it any other way visually." Sanderson adds, "We basically used different mediums and blended them all together to make sequences that are visually appealing and can help bring people into the story."

Jackie Shane

Machine learning and Stable Diffusion enabled the animated sequences to go from 15 to 40 minutes of screen time in Any Other Way: The Jackie Shane Story. (Images courtesy of Banger Films and the National Film Board of Canada)

Piecing together the life of a trans soul singer, who is revered along with her contemporaries Etta James and Little Richard, and who vanished from public view 40 years ago, is Any Other Way: The Jackie Shane Story, directed by Michael Mabbott and Lucah Rosenberg-Lee and produced by Banger Films and the National Film Board of Canada. "We had to bring Jackie's story to life, and roto seemed like a cost-effective way to do that because we are starting with an actor, not doing animation from scratch, which can be expensive and not look good if you don't have the right team," remarks Director of Animation Luca Tarantini. "We developed an interesting visual effects process where we ended up with something that was shot relatively inexpensively, and through clever piecing together of strange techniques, we made it look as though 2,000 frames were painted by hand." Machine learning and Stable Diffusion were cornerstones of the animation process.
"Stable Diffusion is meant for you to type in a sentence, and it generates an image of that thing. But we were using it where you start with an image, type in a bit of prompt, and it gives you an interpretation of that original image. If you get the settings just right, it doesn't distort the original image too much but stylizes it in the correct way."

Adding flares and working with a virtual set in the animated sequences for Any Other Way: The Jackie Shane Story. (Images courtesy of Banger Films and the National Film Board of Canada)

As the edit evolved, it became clear that the animation was a major component of the storytelling and consequently went from 15 to 40 minutes of screen time. "Not only did the amount of animation and the time we spent on it have to change, it became impossible without experimenting with new techniques to try to make it feasible for a tiny team of two or three people to deal with that volume of content," notes Co-Director of Animation Jared Raab. "We managed to mix a bit of everything that everybody knew, from shooting on an actual soundstage in a scrappy, music video-style way, to greenscreen. Luca pioneered simple camera tracking to get camera position data for when he created the backgrounds, which were made in 3D using Cinema 4D, then I did a ton of Adobe After Effects work to create some of the 2D animation of the space. Last, Luca created entirely 3D lighting using the camera data to get the lens flares and some of the stuff that we loved from early archival music documentaries."
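The image-to-image setting Tarantini describes, where a filmed frame seeds the diffusion process and the settings control how far the result drifts from the source, can be sketched in miniature. The toy NumPy snippet below illustrates only the underlying idea (the function name, the `strength` knob and the simple variance-preserving blend are illustrative stand-ins, not the actual Stable Diffusion noise schedule or the film's pipeline):

```python
import numpy as np

def img2img_start(init_latent, strength, rng):
    """Toy version of the img2img idea: instead of denoising from pure
    noise, start from the source image's latent with only partial noise
    added. `strength` in [0, 1] sets how far toward pure noise the start
    point is pushed: low strength preserves the original composition,
    high strength behaves more like text-to-image."""
    noise = rng.standard_normal(init_latent.shape)
    # Variance-preserving blend, standing in for the real noise schedule.
    return np.sqrt(1.0 - strength) * init_latent + np.sqrt(strength) * noise

rng = np.random.default_rng(0)
frame = rng.standard_normal((4, 64, 64))   # stand-in for an encoded frame
subtle = img2img_start(frame, 0.2, np.random.default_rng(1))
heavy = img2img_start(frame, 0.9, np.random.default_rng(1))

# Correlation with the source frame shows how much structure survives.
corr_subtle = np.corrcoef(frame.ravel(), subtle.ravel())[0, 1]
corr_heavy = np.corrcoef(frame.ravel(), heavy.ravel())[0, 1]
```

At low strength the start point stays strongly correlated with the source frame, which is why, as Tarantini notes, the right settings stylize the footage without distorting it.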
Raab adds, "It was a sprinkling of a little bit of everything that we knew how to make a film into the project, and the chemistry gave us just the right recipe to pull it off."

Pigeon Tunnel

Union VFX made a shift from working on feature films and high-end episodic to contributing to the Errol Morris documentary The Pigeon Tunnel, which explores the life and career of John le Carré through a series of one-on-one interviews with the former intelligence officer turned acclaimed novelist. "Generally, visual effects for documentaries are all about enhancing the audience's understanding of the real-life events and subject matter that the narrator is talking about," observes David Schneider, DFX and Technical Supervisor for Union VFX. "It is important for the work to focus on realism and subtle invisible effects that stay true to the historical moments being described during the interview. The core value of a documentary is to educate, so we generally have to keep augmentation minimal, not exaggerate, and retain a factually accurate depiction of events." Digital augmentation was not confined to one aspect, as there were 154 visual effects shots, and five assets had to be created. Schneider adds, "We handled everything from equipment removal during interview shots to creating CG creatures and augmenting environments. The film's many dramatizations gave Union VFX the chance to shine with standout assets, like an unlucky pigeon and a Soviet freighter. One of the highlights was a nighttime airplane sequence where we delivered several fully CG shots that brought the scene to life."
Union VFX handled everything from equipment removal during interview shots to creating CG creatures and augmenting environments for a total of 154 visual effects shots for The Pigeon Tunnel. (Images courtesy of Union VFX and Apple)

Early on, Union VFX received detailed storyboard animatics. "It helped us get on the same page, and since documentaries don't typically use heavy visual effects, this was invaluable," Schneider states. "Some scenes required complex augmentation. For example, the sequence in which Kim Philby makes his escape to the Soviet Union required us to build the Dolmatova [a Soviet-era freighter], place it into provided plates, and enhance the surrounding dock with cargo and a digital gangway leading to the ship. All of this was integrated into the practical fog that was present on set. For the Monte Carlo pigeon shoot sequence, we needed a close-up of a pigeon being shot out of the sky. To achieve this, we had to create an entirely new feather simulation system that captured the realistic movement of feathers when the pigeon was hit. While we've worked with CG birds before, this was the first time we had been so close to the camera that individual feathers were clearly visible. We meticulously modeled the texture and styled the pigeon's feathers to ensure they moved naturally, both in flight and when they detached from the bird."

Endurance

Cutting back and forth from the ill-fated 1915 Antarctica expedition to the South African research vessel S.A. Agulhas II searching the Weddell Sea in 2022 for the sunken ship captained by renowned Irish explorer Ernest Shackleton is the National Geographic documentary Endurance, directed by Elizabeth Chai Vasarhelyi, Jimmy Chin and Natalie Hewit.
"There were 28 men, and most of them wrote diaries or were able to tell their stories after the fact, so there is a lot of historical detail," states Producer Ruth Johnston. "We used AI voice conversion technology so that every word that you hear is from one of seven guys [from the expedition] who lead us through the story [by reading from their writings]." Virtual content was built for three separate re-creations of three different campsites with various types of ice floes in the backgrounds. "These ice floes were important because it was something we would not have been able to easily recreate in real life," remarks Virtual Production Supervisor Eve Roth. "We color-corrected the virtual snow around the camp to match what the art department ended up putting down. Because we knew what kinds of harsh weather we were trying to recreate for the campsites, the virtual content was created in a way where we could dial up or down the wind and snow effects. We were also able to change the type of clouds in the sky, to dial that up and down."

Stept Studios focused on the reenactments. "We had the urge to chase some fancy camera work, but ultimately, we wanted to shoot it the same way Frank Hurley [Endurance Expedition's official photographer] would have, on sticks with composed frames," explains Nick Martini, Founder and Creative Director of Stept Studios. "This visual approach allowed us to intercut our footage with the archival material seamlessly."
Most of the visual effects work was completed before production. "Our efforts were centered around building the environments where the story takes place using Unreal Engine," Martini states. "Those worlds were then projected on LED volume stages to be used as interactive backgrounds on a stage in Los Angeles. This allows for an organic in-camera look when we shoot and provides more realistic lighting than a traditional greenscreen approach. In post, some additional clean-up and effects were added to sell the gag."

Intercut with contemporary footage of the expedition to find Endurance, the backstory of the sunken ship was told through historical photographs taken by Frank Hurley as well as reenactments taking place in a parking lot and an LED volume. (Photos: Jasper Poore, Frank Hurley, James Blake, Nick Birtwistle and Esther Horvath. Images courtesy of Weddell Sea Pictures, BFI, Falklands Maritime Heritage Trust and National Geographic)

Atmospherics were added to the archival still photographs. "We didn't want effects to overwhelm or take away from the original photography, rather to enhance the imagery or add impact in dramatic moments," states Josh Norton, Executive Creative Director and Founder of BigStar. "Blowing smoke and snow were added only when we felt those moments of drama were necessary or the original photo called for it."

Orienting the audience is a collection of maps showing the progression of both expeditions. "The filmmakers had a desire to make sure the film's graphics didn't feel too expected or conservative," Norton remarks. "We were able to work with colorful type, energetic transitional language and texture while still making sure that we were being accurate to the historical research, especially on the maps."
As for any lessons learned from the project, he replies, "Don't go to the Weddell Sea without a backup plan!"
    CONTEXTUALIZING VIRTUAL PRODUCTION DESIGN
By TREVOR HOGG

As technology advances, there are ramifications that create new skill sets and cause traditional approaches to be reassessed. This includes the role of the production designer in virtual production, where physical set builds and digital backgrounds are combined to create an in-camera shot that at one time could have only been completed in post. A consequence of further entwining the art department and visual effects is the emergence of the virtual art department (VAD) and the virtual production designer. Does this mean that projects will be divided between a production designer and a virtual production designer? "That's the core question," notes Alex McDowell, Co-Owner and Creative Director at Experimental Design. "There is no possible way that having a virtual production designer and an actual production designer is useful. The production designer has always taken command of whatever tools are available to them, so it's evolutionary. It's reasonable to consider virtual production as another aspect of the design space. The production designer's job is to essentially frame the story, which means everything that contextualizes the narrative is the designer's responsibility. There are a lot of divisions to that, like the cinematographer, costume designer, special effects and stunts."

Minority Report (2002) had the first digital art department. (Image courtesy of 20th Century Fox)
An effort to bridge the gap between the art department and visual effects was made during the making of Minority Report. "In 2000, I started to get embedded in the technological, digital aspects of production with Minority Report, which had the first digital art department. From that point forward, we were beginning to mix tools dramatically. Now, when you speak with Andrew Leung, for example, he is using the tools of visual effects from start to finish," McDowell states. "There was a division based on the platforms in which we worked or the tools that we used, particularly when visual effects took command of the back end of the digital. The front end, which the art department created for the virtual, fell into a black hole in the center of production and had to be rebuilt by visual effects. There was this incredible inefficiency where the production designer was designing the film, but the production system did not understand that the visual effects in the back end were an extension of the environment that the designer created. What we did was to send 14 books of material to visual effects so there was a through-line of intent, content and design."

Andrew Leung started off as a visual effects artist before becoming a concept designer. (Image courtesy of Andrew Leung)

Aiding in maintaining the visual aesthetic throughout the filmmaking process is virtual production, as the focus is on creating content before shooting commences rather than relying entirely on post-production. "There is still this idea that visual effects, the art department and the virtual art department are somehow different as opposed to a through-line of execution," McDowell remarks.
"You wouldn't separate the construction coordinator, painters and all of the people who make the in-camera physical sets from the art department. Visual effects is doing exactly the same as construction; they're taking design intent and building it until it's finished for camera. When I started Star Wars: Episode IX with Colin Trevorrow, ILM, the production designer and the art team worked together from the beginning. Even though we weren't doing virtual production, because this was pre-LED screens in the frame, we were building virtual assets for the director to scout in VR. It was very efficient because the flow was continuous then. Visual effects is giving all of their knowledge to the front end so there is no waste to the assets being created."

Added stress is placed upon the visual effects team. "They're taking stuff from the end and moving it to the front, and that becomes an issue in terms of scheduling," observes Concept Designer Andrew Leung. "With post, you can always extend it, but there is a huge pressure to get everything designed before shooting. There are some advantages to that, as we get a lot of stuff for free in lighting." At the center of the design process is Unreal Engine. "If you want to put a few trees in there, usually the application will quit on you. With Unreal Engine, I can put in millions of trees and it won't complain. That alone opens up huge design opportunities." Unreal Engine has given the ability to design whole worlds. Leung notes, "I did a pitch for a Paramount film where I built a whole map for a huge fantasy film, so we were able to talk about it in such a way with the director that you could travel around and talk about how characters travel. It was the same process when I worked on The Lord of the Rings: The Rings of Power, where I built whole sections of Middle-earth, and we were able to talk about travel times for the characters."
"You couldn't do that before simply because the technology wasn't there."

Andrew Leung conceptualizes a battle and the arrival of the witch in Mulan (2020). (Image courtesy of Walt Disney Studios Motion Pictures)

Size matters when it comes to shooting in a volume. "The biggest complaint is that people tend to overbuild the volume, where they go, 'We have the biggest volume on stage,'" Leung states. "Most of the time you want to use it for small sets, which comes with its own host of issues. I did a volume shoot with Alex McDowell over at Amazon, and we were constantly butting heads with people running the stage who were telling us that the set was too small. We didn't want to redesign the set much larger than it was because that wasn't part of the story. We kept going back and forth. Eventually, the set was made bigger than it should be and looked funky. A portable volume is what we want. We find that more useful because we're not locked to a particular stage. For an LED stage volume to move forward in the future, the portable solution is going to be the best."

At the center of the design process for the virtual art department is Unreal Engine. Concept art by Andrew Leung for Black Panther. (Image courtesy of Andrew Leung and Marvel)

Being proficient with 3D software is critical. "If you are already working in concept design, you should be familiar with set design tools like SketchUp, Vectorworks and Rhino," Leung remarks. "I don't know a single concept designer right now who does not have any kind of 3D skills. The contemporary concept designer, at the bare minimum, should know Blender. If you are talking about me specifically, I'm slightly unusual in that I came from visual effects before going into the art department. I am well aware of the post process and use that as part of my design. What matters is the final result, not the medium. I remember working with Jan Roelfs, who designed Gattaca, and one of the things that he said that always stuck in my head was, 'Pixels are plywood.'"
"What a lot of production designers now love is, 'Wow, more of my stuff is making it into post,' instead of it being this contentious relationship between post and the art department. Now, it's more tied together."

Being proficient in 3D software is critical nowadays for concept artists. Concept art of Shuri's Lab in Black Panther: Wakanda Forever. (Image courtesy of Andrew Leung and Marvel)

Virtual production excels when dealing with visual effects-heavy films where locations are limited and not explored extensively. "You build two pieces of a set that is then extended in the volume, and you have this wonderful view of Tatooine," states Supervising Art Director Chris Farmer. "Your foreground is a practical set, but everything else beyond that is built in Unreal Engine, as opposed to going out on location and building all of that stuff. These are the types of things that are fantastical or not something you could capture by going out with a 360 camera, shooting on location and setting your scene in front of it. It's the kind of fantasy and visual effects-heavy work that you would otherwise do in front of a greenscreen, with all of the post done later. It's something you can build before you shoot, and that eliminates a lot of visual effects work." Process work is also elevated. "We did a lot of putting cars in the volume and running 360 moving plates behind them. These cars driving down the highway? You would never know. A police car parked on the highway with a moving plate in the background. It's flawless."

Virtual production excels when dealing with visual effects-heavy films and shows that have limited locations that are not explored extensively, such as The Mandalorian.
(Image courtesy of Disney+ and Lucasfilm Ltd.)

Practical sets for The Mandalorian are constructed in the foreground while everything else is built in Unreal Engine. (Image courtesy of Disney+ and Lucasfilm Ltd.)

The flexibility to create controllable environments, ranging from Shanghai in the 1930s to the futuristic ones found in The Mandalorian, is a major advantage of virtual production. (Image courtesy of Disney+ and Lucasfilm Ltd.)

"There is going to be a shift where designers have to start working with and understanding a virtual art department, and getting up in front and convincing producers that it can be done," believes Farmer. "You can design sets, environments and locations. You can scan locations, and import and manipulate them to get what you want. You can build Shanghai in the 1930s in Unreal Engine, light it, give it the atmosphere it needs, and shoot it without spending weeks on end in a location and then having to do visual effects on top." Twilight is not restricted to a couple of hours within a day. Farmer adds, "You have total control of the location and the scene on the fly. You can change it and do whatever you want. I've been talking to a friend who is a designer about promoting the idea that the art department, with cinematographers, designers and directors, can deliver almost completed scenes lit and ready to go. You just add your foreground pieces and actors. Rightly or wrongly, a lot of decisions get made in the back end by people who are not necessarily the creative forces driving the picture. The designer and cinematographer should be the ones making all of those decisions."

Virtual production is about giving production designers a flexible 3D workspace where they can interactively make requests and expect a rapid turnaround. Behind the scenes on the Netflix show 1899. (Images courtesy of Netflix)

"When I started The Mandalorian with Andrew Jones and ILM, we had a regular art department with set designers that Doug Chiang would send down."
"We would get our designs down to ILM, who would break them down, turn them into sets and draw them up in 3D," Farmer recalls. "At the same time, we began to build a virtual art department, which housed modelers, lighters and texture artists in a separate room because, at the time, they were technically non-union, and the other part of the art department was union. I do think that the push is to embrace the virtual art department and bring it into the art department, [especially] when you get to a bigger scale doing virtual art department work like Fallout, which has got a lot of volume work. It is my understanding, from the coordinator I had previously worked with, that she was a virtual art department coordinator. It was a separate department managed by a different team, but also reporting to the production designer and art directors."

These days, it's hard to find a feature or episodic show that does not utilize some form of CG in its planning phase. From Barbie. (Image courtesy of Warner Bros.)

As with most elements in production, proper planning ensures the greatest chance of success for virtual production. From Tim Webber's 2023 short film FLITE. (Image courtesy of Framestore Films and Inflammable Films)

"Virtual production for production designers is about giving them a flexible 3D workspace where they can interactively make requests and expect a rapid turnaround," states Michael Zaman, Realtime Supervisor at Framestore. "It requires artists to be mindful of the way we build these virtual sets and make sure we consider the way the production designers, directors and other stakeholders might want to change things on the fly, and be prepared for these changes. The artists need to make sure assets are prepared with optimization and customization in mind, breaking elements into well-thought-out chunks that allow for quick movement. Without this, the process becomes very similar to CG production design."
Zaman continues, "Virtual production for production designers is very flexible. You can use it as just a live session with them directing the cameras and making requests in the engine while sitting with an operator. We can also offer more control to the viewer, allowing them to view the sets in VR, giving them a sense of scale and flow, or, with virtual cameras and mocap, allowing framing within virtual sets, testing story beats early and making changes to the set accordingly."

"I have not heard of a virtual production designer," admits Connor Ling, Virtual Production Supervisor at Framestore. "On a typical show, it's simply an extension of a traditional production designer. When working with a production, our real-time supervisor becomes more of a virtual art director, falling underneath the production designer. These days, you will be stretched to find a feature or episodic show that doesn't utilize some form of CG in its planning phase. Commercials vary a little more due to the time they have, but even if we compare to a fully animated CG film, the role of a production designer is still used, or they may lean into an art director more." Extra time is required in pre-production. Ling notes, "As with most elements in production, proper planning ensures the greatest chance for success. I would love to continue to see productions adopt and commit to the practices earlier in the production process and fully utilize the technology to the best of their ability. Of course, this is dependent on need and whether virtual production is correct for their project. When it is, and planned appropriately, that's when you see the best results."
    OSCAR PREVIEW: NEXT-LEVEL VFX ELEVATES STORYTELLING TO NEW HEIGHTS
By OLIVER WEBB

Dune: Part Two has significantly more action and effects than Dune: Part One, totaling 2,147 VFX shots. (Image courtesy of Warner Bros. Pictures)

Godzilla Minus One made history at last year's 96th Academy Awards when it became the first Japanese film to be nominated for and win an Oscar for Best Visual Effects, and the first film in the Godzilla franchise's 70-year history to be nominated for an Oscar. Will the 97th Academy Awards produce more VFX Oscar history? Certainly, VFX will again take center stage, with a number of pedigree franchises and dazzling sequels hitting movie screens in the past year. From collapsing dunes to vast wastelands, battling primates and America at war with itself, visual effects played a leading role in making 2024 a memorable, mesmerizing year for global audiences.

Dune: Part One won six Academy Awards in 2022, including Best Achievement in Visual Effects, marking Visual Effects Supervisor Paul Lambert's third Oscar. Released in March, Dune: Part Two is an outstanding sequel and has significantly more action and effects than the first installment, totaling a staggering 2,147 visual effects shots. The film is a strong contender at this year's Awards. "It was all the same people from Part One, so our familiarity with Denis's [Villeneuve] vision and his direction allowed us to push the boundaries of visual storytelling even further," Lambert says.

The production spent a lot more time in the desert on Dune: Part Two than on Dune: Part One. Cranes were brought in, and production built roads into the deep deserts of Jordan and Abu Dhabi. Concrete slabs were also built under the sand so that the team could hold cranes in place for the big action sequences. "A lot of meticulous planning was done by Cinematographer Greig Fraser to work out where the sun was going to be relative to particular dunes," Lambert explains.

Editorial and postvis collaborated with the VFX team to create a truly unique George Miller action sequence for Furiosa: A Mad Max Saga.
(Image courtesy of Warner Bros. Pictures)

"We had an interactive view in the desert via an iPad that gave us a virtual view of these enormous machines at any time of day. This allowed us, for example, to figure out the shadows for the characters running underneath the spice crawler legs and the main body of the machine. VFX was then able to extend the CG out realistically, making it all fit in the same environment. Dune: Part One was a collaborative experience, but Dune: Part Two was even more so, as we went for a much bigger scale with lots more action."

The first topic discussed during pre-production among department heads and Villeneuve was the worm-riding scenes. Villeneuve envisaged Paul Atreides mounting the worm from a collapsing dune, an idea that immediately struck the team as visually stunning and unique. The challenge lay in making this concept, and the rest of the worm-riding, appear believable. Filming for the worm sequences took place in both Budapest and the UAE. A dedicated worm unit was established in Budapest for the months-long shoot. The art department built a section of the worm on an SFX gimbal surrounded by a massive 270-degree sand-colored cone. "This setup allowed the sun to bounce sand-colored light onto the actors and stunt riders, who were constantly blasted with dust and sand," Lambert describes. Shooting only occurred on sunny days to maintain the desert atmosphere. Most of the actual worm-riding shots were captured here, except for the widest shots, which were later augmented with CG. In post-production, the sand-colored cone was replaced with extended, sped-up, low- and high-flying helicopter footage of the desert.

The VFX team at Framestore delivered 420 shots for Deadpool & Wolverine, while Framestore's pre-production services (FPS) delivered 900-plus shots spanning previs, techvis and postvis. (Image courtesy of Marvel Studios)

Wētā FX delivered 1,521 VFX shots for Kingdom of the Planet of the Apes. 
Remarkably, there are only 38 non-VFX shots in the film. (Image courtesy of Walt Disney Studios Motion Pictures)

Earlier footage from Gladiator was blended into Gladiator II's flashbacks and live-action, especially original Gladiator crowd footage and in the arenas. The Colosseum set for Gladiator II was detailed as closely as possible to the first film. (Photo: Aidan Monaghan. Courtesy of Paramount Pictures)

Blowing up the Lincoln Memorial for Civil War was shot in a parking lot in Atlanta. The single-story set was extended with VFX and the explosion grounded in real footage. Soldiers fired at a bluescreen with a giant hole in the middle. (Image courtesy of A24)

For the collapsing dune scene, an area was scouted in the desert, and then a 10-foot-high proxy dune crest was created on flat desert. Three concrete tubes attached to industrial tractors were buried in this proxy dune and were used to create the collapsing effect: while a stunt performer, secured by a safety line, ran across and descended into the collapsing sand, the tubes were pulled out. The team could only attempt this once a day because of the need to match the light to the real dune, and the reset to rebuild the crest took a few hours. On the fourth day, Denis had the shot he wanted. Post-production work extended the dune's apparent height to match the real dune landscape. The sequence was completed with extensive CG sand simulations of the worm moving through dunes, all contributing to the believability of this extraordinary scene.

Mad Max: Fury Road was nominated for Best Visual Effects at the 2016 Academy Awards. Spin-off prequel/origin story Furiosa: A Mad Max Saga, the fifth installment in the Mad Max franchise, is the first of the films not to focus on the eponymous Max Rockatansky. DNEG completed 867 visual effects shots for the finished film. When DNEG came onboard with the project, the main conversations focused on the scope of the film and the variety of terrains and environments. 
"Furiosa covers much more of the Wasteland than Fury Road did and details a lot of places that had only been touched on previously," notes DNEG VFX Supervisor Dan Bethell. "It was really important that each environment have its own look, so as we travel through the Wasteland with these characters, the look is constantly changing and unique; in effect, each environment is its own character."

Twisters features six tornadoes for which ILM built 10 models. (Images courtesy of Universal Pictures)

The Stowaway sequence was particularly challenging for the visual effects team to complete. "Apart from being 240 shots long and lasting 16 minutes, it had a lot of complex moving parts; vehicles that drive, vehicles that fly, dozens of digi-doubles, plenty of explosions and, of course, the Octoboss Kite!" says Bethell. "Underneath it all, a lot of effort also went into the overall crafting of the sequence, with editorial and postvis collaborating with our VFX team to create a truly unique George Miller action piece. The Bullet Farm Ambush was also a big challenge, although one of my favorites. Choreographing the action to flow from the gates of Bullet Farm down into the quarry as we follow Jack, then culminating with the destruction of, well, everything, was very complex. We often work on individual shots, but to have over a hundred of them work together to create a seamless sequence is tough."

Working on a George Miller project is always a unique experience for Bethell. "Everything is story-driven, so the VFX has to be about serving the characters, their stories and the world they inhabit. It's also a collaboration; the use of VFX to support and enhance work from the other film departments such as stunts, SFX, action vehicles, etc. I enjoy that approach to our craft. Then, for me, it's all about the variety and scope of the work. It's rare to get to work on a film with such a vast amount of fresh and interesting creative and technical challenges. On Furiosa, every day was something new, from insane environments and FX to the crazy vehicles of the Wasteland; this movie had it all!"

Robert Zemeckis' Here follows multiple generations of couples and families that have inhabited the same home for over a century. The movie required de-aging Tom Hanks and Robin Wright. Nearly the entire movie was touched by VFX in some form or another. (Images courtesy of TriStar Pictures/Sony)

Alex Garland's Civil War required over 1,000 visual effects shots as Garland pushed the importance of realism. "The more grounded and believable we could make Civil War, the scarier it would be," notes Production VFX Supervisor David Simpson. "We deliberately avoided Hollywood conventions and set a rule that all inspiration should be sourced from the real world. Every element VFX brought to the film had a real-world reference attached to it, drawing from documentaries, news footage, ammunition tests and war photography."

Due to the strict rules about shooting from the skies above Washington, D.C., capturing the aerial shots of the Capitol would have been impossible to do for real. This resulted in full CG aerial angles over D.C., with the visual effects team building its own digital version of the city, which covered 13 square miles and 75 distinct landmarks, with thousands of trees, buildings and lampposts and a fully functioning system of traffic lights spread over 800 miles of roads. "Plus, there are roadworks, buildings covered in scaffolding, parked cars, tennis courts and golf courses," Simpson adds. 
"One of my favorite touches is that our city has cranes, because all major cities are constantly under construction!"

The visual effects team went even further, building a procedural system to populate the insides of offices. "When the camera sees inside a building, you can make out desks, computers, potted plants, emergency exit signs, water coolers. The buildings even have different ceiling-tile configurations and lightbulbs with slight tint variations. We literally built inside and out!" Once the city was complete, it was turned into a war zone with mocap soldier skirmishes, tanks, police cars, explosions, gunfire, helicopters, debris, shattered windows and barricades.

Here follows multiple generations of couples and families that have inhabited the same home for over a century. Three sequences in the film were particularly CG-dominant, the first being the neighborhood reveal, which was the last shot in the movie. "It was challenging mainly because it was subject to several interpretations, compositions and lighting scenarios, and the build was vast," says DNEG VFX Supervisor John Gibson. "The sequence surrounding the house's destruction was also incredibly complex due to the interdependence of multiple simulations and elements, which made making changes difficult and time-consuming."

Godzilla x Kong: The New Empire was directed by Adam Wingard, who developed a distinctive and appealing visual style for the film. Compelling VFX work was completed by Wētā FX, Scanline VFX, DNEG and Luma Pictures, among others. (Images courtesy of Warner Bros. Pictures and Legendary Entertainment. GODZILLA TM & © Toho Co., Ltd.)

Dune: Part Two was even more of a collaborative experience than Dune: Part One, on a bigger scale with more action. (Image courtesy of Warner Bros. Pictures)

The biggest challenge was the grand montage, which required seamless transitions through various time periods and environments. 
"The Jurassic Era beat was especially challenging in that we needed to flesh out a brand-new world that had real-time elements mixed with accelerated-time elements, and they all had to be set up to transition smoothly into the superheated environment and maintain a consistent layout," Gibson details. "By far the most challenging aspect of the grand montage was the tree and plant growth. As it would have been very difficult to modify existing plant growth systems to match our cornucopia of plant species using the existing software available for foliage animation and rendering, we had to develop a host of new techniques to achieve the realistic results we were after."

Gibson lauds the collaborative spirit of the team, citing their willingness to experiment, learn new techniques and support each other as instrumental in overcoming the challenges of the condensed production schedule. "Boundaries between departments dissolved, folks seized work to which they thought they could contribute, there was little hesitation to bring in and learn new software or techniques, and we brainstormed together, constantly looking for better and better ways to get results. That's what stood out to me: the cohesion within the team."

Cassandra inserting her hand through Mr. Paradox's head was one of the many challenging VFX shots required for Deadpool & Wolverine. (Image courtesy of Marvel Studios)

Framestore VFX Supervisor Robert Allman praises Marvel's collaborative approach to VFX on Deadpool & Wolverine, which he describes as a "melting pot" for filmmakers and artists. (Images courtesy of Marvel Studios)

"I love Marvel's collaborative approach to VFX; things are often hectic at the end, but that is because stuff is still being figured out, largely because it's complicated! In this melting pot, the filmmakers look to the artists for answers, so your ideas can end up in the film. For hard-working VFX artists, nothing is better than that."
Robert Allman, VFX Supervisor, Deadpool & Wolverine

Wētā FX delivered 1,521 VFX shots for Kingdom of the Planet of the Apes. Remarkably, there are only 38 non-VFX shots in the film. VFX Supervisor Erik Winquist ran through a gauntlet of challenges, "from a cast of 12 new high-res characters whose facial animation needed to support spoken dialogue, to a minute-long oner set in an FX extravaganza with 175 apes and 24 horses to choreograph," he notes. "The scenes that I'd say were the most challenging were those that featured large water simulations integrating with on-set practical water, digital apes and a human actor. The bar for reality was incredibly high, not only for the water itself but also in having to sell that water's interaction with hairy apes, often in close-ups. It was an incredibly satisfying creative partnership for me and the whole team, working with [director] Wes Ball. From the start, he had a clear vision of what we were trying to achieve together, and the challenge was about executing that vision. It gave us unshifting goalposts that we could plan to, and we knew that we were in safe hands working on something special together. That knowledge created a great vibe among the crew."

More shooting time was spent in the desert on Dune: Part Two than on Dune: Part One. Cranes were brought in and production built roads deep into the deserts of Jordan and Abu Dhabi, UAE. (Image courtesy of Warner Bros. Pictures)

Strict rules about shooting from the skies above Washington, D.C. prevented capturing aerial shots of the Capitol for Civil War, which resulted in full CG aerial angles over D.C. and the VFX team building a digital version covering 13 square miles and 75 distinct landmarks. (Image courtesy of A24)

Deadpool & Wolverine has grossed more than $1.264 billion at the box office, a staggering feat. 
The VFX team at Framestore delivered 420 shots, while Framestore's pre-production services (FPS) delivered 900-plus shots spanning previs, techvis and postvis. Robert Allman served as Framestore VFX Supervisor on the film. "I love Deadpool, so it was tremendously exciting to be involved in making one," he explains. "However, more than this, I love Marvel's collaborative approach to VFX; things are often hectic at the end, but that is because stuff is still being figured out, largely because it's complicated! In this melting pot, the filmmakers look to the artists for answers, so your ideas really can end up in the film. For hard-working VFX artists, nothing is better than that."

The atomizing of Cassandra in the final sequence was technically tough to achieve. Making a completely convincing digital human, with the atomizing effects as detailed and dynamic as the shots demanded, was a huge challenge. Most problematic was creating an effect within the borders of good taste when the brief, "disintegrate the face and body of a human," seems to call for the gory and horrifying. Many takes of this now lie on the digital cutting-room floor. An early wrong turn was to reference sandblasted meat and fruit, for which there are a surprisingly large number of videos on YouTube. However, this real-world physics gave rise to some stomach-churning simulations for which there was little appetite among filmmakers and artists alike. In the end, the added element of searingly hot, glowing embers sufficiently covered the more visceral elements of the gore to make the whole thing, while still violent, more palatable to all concerned.

Traveling through the Wasteland with the characters of Furiosa: A Mad Max Saga, the look is constantly changing and unique. Each environment had to have its own look and, in effect, became its own character. (Images courtesy of Warner Bros. Pictures)

Ridley Scott's Gladiator was met with critical acclaim upon its release in 2000. It won five awards at the 73rd Academy Awards, including Best Visual Effects. Nearly 25 years later, Gladiator II hits screens as one of the most anticipated releases of the year. Last year, Scott's highly anticipated Napoleon was also nominated for Best Visual Effects, and Scott's films are, more often than not, strong contenders at the Awards.

Work for Gladiator II was split between Industrial Light & Magic, Framestore, Ombrium, Screen Scene, Exceptional Minds and Cheap Shot, with 1,154 visual effects shots required for the film. For Visual Effects Supervisor Mark Bakowski, the baboon fight sequence was particularly daunting. "Conceptually, this was a tough one," he explains. "Very early on, Ridley saw a picture of a hairless baboon with alopecia. It looked amazing and terrifying but also somewhat unnatural. Most people know what a baboon looks like, but a baboon with alopecia looks a bit like a dog. Framestore did a great job and built a baboon that looked and moved just like the reference, but viewed from certain angles and in action, unfortunately, it didn't immediately sell 'baboon.' It's one thing to see one in a nature documentary, but to have one in an action sequence with no introduction or explanation was a visual challenge."

One of the biggest challenges facing the VFX team on Kingdom of the Planet of the Apes was the cast of 12 new high-res characters whose facial animation needed to support spoken dialogue. (Images courtesy of Walt Disney Studios Motion Pictures)

Bakowski explains that working with Ridley Scott was a crazy and unique experience. "So many cameras and such scale; it's a real circus, and Ridley's very entertaining. He talks to everyone on Channel 1 on the radio, so you can follow along with his thought process, which is by turns educational, inspirational and hilarious. A lovely man. I enjoyed working with him. The VFX team was all fantastic and so capable, both on our production side and vendor side. I've never worked with such an amazing bunch on both sides. Our production team was a well-oiled machine, sometimes in both senses, but mainly in terms of efficiency, and, vendor side, it's great just being served up these beautiful images by such talented people. Both made my job so much easier. The locations were stunning, both scouting and shooting; 99% of the film was shot in Malta and Morocco, so you're there for a long time; you get to immerse yourself in it. That was multiplied by the fact we got impacted by the strikes, so we ended up going back to Malta multiple times. I felt I got to know the island quite well and loved it and the people. That said, I won't be going back to Malta or Morocco for a holiday soon. I feel like I've had my fill for a while!"

Other outstanding releases that could potentially compete for Best Visual Effects include Twisters, which took everyone by storm earlier in 2024 (with ILM as the main vendor); Godzilla x Kong: The New Empire, featuring compelling work by Wētā FX, Scanline VFX, DNEG and Luma Pictures, among others; and A Quiet Place: Day One, a fresh, frightening addition to the Quiet Place series.
  • WWW.POLYGON.COM
    2025 may be the year of the historical fiction game
Games can transport players to any number of sci-fi, fantasy, or horror settings, but some of the best and biggest aim for realism and simulation. A notable number of highly anticipated AAA games coming in 2025 are opting for something in between imagined worlds and reality: Historical fiction games will be big in the coming year.

In games like Assassin's Creed Shadows, players will be able to live out the fantasy of time-traveling to feudal Japan and living their own story as real-world samurai Yasuke and the shinobi Naoe. Players of Sid Meier's Civilization 7 will write their own histories, thinking daily about the Roman empire of their dreams as Augustus, as freedom fighter and Union Army heroine Harriet Tubman, or as a dozen other military and thought leaders as they shape civilization to their liking.

In a surprising number of 2025 games, players will experience original fiction stories set against historical settings and events. Here are some of the highlights.

Assassin's Creed Shadows

Assassin's Creed Shadows is set in 16th-century feudal Japan, during the final stage of the Sengoku period. Set against the backdrop of civil war and at the height of daimyo Oda Nobunaga's power, players will assume dual roles as samurai Yasuke, based on the real-world person, and kunoichi Naoe.

Like the TV series Shōgun, Assassin's Creed Shadows will showcase the influence of Portuguese traders and Christian missionaries, who helped bring technologies like cannons and long guns to the island nation. It may also show key historical events like Nobunaga's attack on the ikki of the Iga province, where Naoe is from. Those events will be woven into Assassin's Creed's ongoing conflict between the Assassin Brotherhood and the Templar Order, adding a layer of fiction to real-life historical events.

Assassin's Creed Shadows will be released on Feb. 14 on PlayStation 5, Windows PC, and Xbox Series X. 
The game will also be available for Mac and iPad.

Sid Meier's Civilization 7

Civilization 7 will let players mix and match leaders and civilizations, or "civs," in ways that distort history. Benjamin Franklin and Confucius can meet, and potentially go to war, as players rewrite history, blending the cultural traditions, architecture, and technology of wide-ranging civilizations as they evolve across eras. Unlike past Civilization games, players will have a broader selection of leaders, which now includes philosophers, scientists, politicians, and religious figures, not simply heads of state.

While much of Civ 7's historical fiction writing will be done by players themselves as they grow their settlements into sprawling civilizations, Firaxis' strategy game may offer the most varied and interesting version of turning history into (fan) fiction.

Sid Meier's Civilization 7 will be released on Feb. 11. It's launching simultaneously on PC and consoles (PS4, PS5, Switch, Xbox One, and Xbox Series X) for the first time in the franchise.

Kingdom Come: Deliverance 2

Set in 15th-century medieval Europe, Kingdom Come: Deliverance 2 will let players explore an open-world Bohemian Paradise and the city of Kuttenberg as blacksmith Henry of Skalitz.

Henry will undertake a journey to avenge his murdered parents, taking him from aspiring warrior to rebel in Warhorse Studios' sequel to Kingdom Come: Deliverance. The game's fictional story will be set against the Hussite Wars, and Henry will face Holy Roman Emperor Sigismund of Luxembourg and his allies.

Kingdom Come: Deliverance 2 will be released on Feb. 4 on PS5, Windows PC, and Xbox Series X.

Ghost of Yōtei

Sucker Punch Productions' Ghost of Tsushima was grounded in the real-world story and setting of Tsushima Island. 
The game was set during the first Mongol invasion of Japan, and starred Jin Sakai, a fictional samurai.

Ghost of Yōtei will be set more than 300 years after the events of the original game, and will once again use a real-world Japanese setting for its story. But Sucker Punch hasn't said much about what Ghost of Yōtei protagonist Atsu will be doing in the sequel. Here's what the studio has said, as it relates to the PlayStation 5 game's historical fiction setting.

"Our story is set in the lands surrounding Mount Yōtei, a towering peak in the heart of Ezo, an area of Japan known as Hokkaido in present day," wrote senior communications manager Andrew Goldfarb on the PlayStation Blog. "In 1603, this area was outside the rule of Japan, and filled with sprawling grasslands, snowy tundras, and unexpected dangers. It's a far cry from the organized samurai clans who lived in Tsushima, and it's the setting for an original story we can't wait to tell."

Ghost of Yōtei will be released sometime in 2025.

Marvel 1943: Rise of Hydra

Featuring an original story set during World War II, Marvel 1943: Rise of Hydra will star two superheroes of the era: Captain America and Black Panther. Fighting alongside Howling Commandos member Gabriel Jones and Wakandan Spy Network leader Nanali, the two WWII-era heroes will battle the forces of Hydra, Marvel's Nazi stand-ins, in Occupied Paris.

Marvel 1943: Rise of Hydra will be the first game from Skydance New Media, which is led by Amy Hennig, former creative director of the Uncharted series, and Electronic Arts veteran Julian Beak. The studio announced its Marvel project in 2021 as a narrative-driven, blockbuster action-adventure game featuring a completely original story and take on the Marvel Universe. Expect era-appropriate warfare boosted by the magical technology of Wakanda and inventor Howard Stark.

Players will get their hands on Marvel 1943: Rise of Hydra sometime in 2025.
  • WWW.POLYGON.COM
    All the Last of Us season 2 news weve heard so far
The Last of Us' first season has come to an end, but there's more to Joel and Ellie's story than one season of TV could tell. Unfortunately, HBO hasn't revealed too much about the next season of the show, beyond the fact that it's happening. We don't even know when season 2 might arrive, but given HBO's habit of waiting two years between the first two seasons of its biggest shows (like Westworld or, likely, House of the Dragon), it seems like we could be waiting awhile to return to The Last of Us' Cordyceps apocalypse.

The show's first season adapted the story of The Last of Us Part 1, the first game in the franchise, and one thing we do know is that season 2 will focus on the story of The Last of Us Part 2. Aside from that, here's everything you should know about The Last of Us season 2 after watching the first season's finale.

When will The Last of Us season 2 debut on HBO?

We know for sure the second season of The Last of Us is arriving this year, and while we still don't know the specific date, HBO has given us some idea of when the show will return. In the season's second teaser trailer, HBO revealed that the show comes back sometime in April. The trailer also gave us a look at some of the season's new cast members, including Jeffrey Wright as Isaac and, of course, Kaitlyn Dever as Abby.

What is season 2 (and The Last of Us Part 2) about?

In the vaguest terms, The Last of Us Part 2 deals with the consequences of Joel's actions at the end of Part 1. If the (very general) theme of the first game is love, the second game is more about hatred, with most of its run time dedicated to vengeance. If you thought season 1 was bleak, get ready!

Is there a time skip before The Last of Us season 2?

The Last of Us Part 2 takes place a few years after the events of the first game, which means that The Last of Us season 2 will likely have a time skip as well. 
However, Part 2 does include some flashbacks from the intervening years, so it all depends on how HBO chooses to adapt the story, and in what order.

Will Pedro Pascal and Bella Ramsey be back for season 2?

Probably! Their characters will be, at the very least. And while some fans online have expressed a bit of concern about Ramsey being too young for Ellie's storyline in The Last of Us Part 2, she's actually exactly the same age as the character in the game already. More importantly, showrunner and co-creator Craig Mazin has already said that there are no plans at all to recast Ramsey, who was one of the best parts of season 1.

Will The Last of Us season 2 be its last season?

It's possible, but probably not. Officially, The Last of Us has only been renewed for season 2 and not anything further than that, but given its popularity, it's likely to get as many seasons as Mazin and co-creator Neil Druckmann would like. As for how many that might be, the two confirmed to GQ after the first season finale that they don't feel the story of The Last of Us Part 2 can be adapted into just one season of television. Mazin is tight-lipped about just how many seasons the pair plan to adapt the game into, only saying that it's more than one season.

What does the first teaser poster for The Last of Us season 2 mean?

"No TLoU on HBO tonight. But Season 2 is already on its way! Endure & survive!" pic.twitter.com/87bKKCDBeO
Neil Druckmann (@Neil_Druckmann), March 19, 2023

The Last of Us season 2's first big tease was an image tweeted out by Druckmann on March 19, 2023. The image features an unknown person's arm holding a hammer. 
While this may seem like a big mystery to show-watchers, anyone who has already played The Last of Us Part 2 will know that this is likely a tease of Abby, one of the most important characters in the game.

Supposing Mazin and Druckmann follow the plot of the second game at all, Abby will have a massive role to play in the story, and one that might complicate the moral universe of the series and our ideas about who the heroes and villains of this story might be. While this tease is pretty overt, there still hasn't been an official announcement about who will play Abby in HBO's series, or if she'll even be part of the second season.

Any discussion of who exactly Abby is, what she's like, or her importance to the plot would involve quite a few spoilers from the game that are better left unsaid while we wait for season 2, but for now we can just say that she's a member of the Fireflies with a very direct personal connection to Joel.

Who is playing Abby in The Last of Us season 2?

On Jan. 9, 2024, HBO announced that Kaitlyn Dever, known for her roles in No One Will Save You and Booksmart, was officially joining the cast of The Last of Us as Abby.

"Our casting process for season two has been identical to season one: we look for world-class actors who embody the souls of the characters in the source material," Mazin and Druckmann said in a joint statement to Variety. "Nothing matters more than talent, and we're thrilled to have an acclaimed performer like Kaitlyn join Pedro, Bella and the rest of our family."

Again, we're not going to get into spoilers about Abby (though you can read about it in Polygon's coverage of the game), but suffice it to say: HBO's description of her as "a skilled soldier whose black-and-white view of the world is challenged as she seeks vengeance for those she loved" feels pretty apt. 
And Dever's gonna have to get jacked.

Who else is joining The Last of Us season 2 cast?

Along with the casting of Dever as Abby, HBO has announced that Catherine O'Hara, of Schitt's Creek, Beetlejuice, and general legendary character-actor fame, will join the cast in an undisclosed role. (Which means there's no word yet on whether her take on The Last of Us universe will include the softness of The Nightmare Before Christmas' Sally, or the exaggerated pronunciations of Schitt's Creek's Moira.)

Also joining The Last of Us season 2: Isabela Merced as Dina and Young Mazino as Jesse. Several more members of the cast were announced in March 2024, including Danny Ramirez (Top Gun: Maverick) as Manny, Ariela Barer (How to Blow Up a Pipeline) as Mel, Tati Gabrielle (You) as Nora, and Spencer Lord (Riverdale) as Owen. Most of this most recent batch of characters are key players in Abby's part of the story, though that means they'll cross paths with Ellie eventually.
  • WWW.POLYGON.COM
    What If? is at its best and worst in its most comedic episode
An ongoing problem throughout Marvel Studios' animated multiverse show What If...? comes from the baffling way many of its episodes are framed. Each installment of the series has a "What If" title, but those titles rarely get at the core of what a given episode is doing, or what might be engaging or thrilling about it. Season 2 was a particular low point in that regard, with even the best storylines hidden behind question titles seemingly designed to produce an apathetic shrug. Season 3 has the same problem, which comes to a head with what's simultaneously the season's best and worst episode: "What If... Howard the Duck Got Hitched?"

It's hard to imagine even the most dedicated Marvel fan caring about Howard the Duck's marital status one way or the other. But it's also bizarre to frame the episode that way, since that isn't what it's about at all. A more accurate title might be "What If We Made an Episode of This Show Solely for the Most Dedicated Deep-Cut MCU Fans?" This episode is slight and silly almost to the point of stupidity. Like so much of the series, it doesn't meaningfully interrogate anything that happened in the primary MCU timeline, or bring any insight to the Marvel Universe. It's just a weird extended chase montage that feels closer to a Scooby-Doo episode than a Marvel movie.

But that gives the story and writing team a freedom they didn't have with much of the rest of What If...? And it let them put together an episode that's all fan service and manic comedy, with no stakes, rules, or meaningful boundaries. It's a featherweight experience compared to nearly everything else in the show. It's also a weirdly satisfying experience for obsessive MCU-heads, who'll get to play the Rick Dalton pointing game in practically every shot, and nod along with the callbacks in every other line. In other words, it's perfect for what What If...? could've been.

The premise requires the tiniest amount of previous What If...? knowledge, though it's mostly covered in a quick recap in the episode itself. 
In the season 1 episode "What If Thor Were an Only Child?", Collector escapee and anthropomorphic duck Howard (Seth Green) idly suggests to Dr. Darcy Lewis (Kat Dennings, from the various Thor movies) that they should go get half-price nachos together via a local happy-hour bar special. Smash-cut to the present, and they're married and have just had their first child, which is, naturally, an egg.

That egg turns out to have been produced during the Cosmic Convergence, an event previously only of import to Thor: The Dark World. Because of the power the Convergence may have conveyed to the egg, a wide variety of MCU factions want to claim it, for purposes ranging from trivial (Grandmaster wants to eat it) to galaxy-threatening (Kaecilius, Mads Mikkelsen's villain from Doctor Strange, wants to offer whatever's in the egg as a host body to Dormammu, the Cosmic Conqueror, the Destroyer of Worlds). Darcy and Howard seek help from other factions, from SHIELD boss Nick Fury to Loki, and the whole thing turns into a frantic planet-hopping McGuffin chase that also feels like an MCU trivia contest.

Throughout its three seasons, What If? always operated along two separate tracks: grimdark and goofy. As with the MCU movies themselves, comedy bits sometimes pop up even in the straight-faced stories, and vice versa. But mostly, there's a sharp and sometimes uneasy line between the show's comedy and its attempt to pull off action on a scale (and with a level of Jack Kirby visual referencing) that live action can't handle.

In the show's first season, the gradual buildup from seemingly stand-alone speculative stories to a season-concluding crossover has a respectable dramatic weight. That season's final episode, "What If the Watcher Broke His Oath?", is the only place where the show fully reaches its potential, by bringing seemingly unrelated threads together into a meaningful conflict. But seasons 2 and 3 struggle to pull off the same hat trick, and repeat too many of the first season's beats.
The gigantic combats get repetitive. The characters get less stage time and less development. (The X-Men's Storm as Thor is a particular waste: a visual design and power set that doesn't get a backstory or any meaningful character depth.)

But in season 3, the failure of the drama leaves more space for the comedy to land. In "What If Howard the Duck Got Hitched?", misleading title and all, the writers finally abandon the idea of building toward a big picture, and just go whole hog into comedy. It isn't great or insightful humor; it's lowest-common-denominator "Hey, I recognize that guy!" referential humor, as one character after another from some of the MCU's least-loved movies pops up to demand the stage. And yet there's a real escalating wit to the way the MCU's familiar villainous power-grabs and the villains' vast battalions of generic CG extras play into the story's frantic escalation. By the end, Darcy and Howard are facing a The Hobbit-style battle of armies, as one half-forgotten MCU magical mob after another charges into the fray to try and grab their kid-to-be.

None of this will land with casual MCU fans who have no idea why it's funny when Black Maw shows up, or when Malekith and Kaecilius yell at each other about who's the darkest. Or why a space vacation's excursion pitch for couples cliff-diving on Vormir is ironic and alarming. (Other laugh lines are a little more obvious, like Kaecilius telling Guardians of the Galaxy antihero Yondu, "Dormammu does not come to bargain.")

The episode is directly aimed at the most in-the-know MCU completists. But chasing such a specific audience also lets the writers get as nerdy and narrow as they want, in stark contrast to some of the more dramatic What If?
episodes, which try to thread the needle of asking meaningful questions about departures for the MCU, while telling broad, familiar, easily accessible hero-versus-villains stories that don't feel meaningfully different from the canon versions they overwrite.

In a season as well aligned and well assembled as What If?'s first batch of episodes, "What If Howard the Duck Got Hitched?" might just come across as inane: a grab-bag of references that barely connect with each other, a collection of sweepings from the bottom of the MCU toybox. But by this point in the series, with so many of the dramatic episodes looking so similar to each other, the full-force leap into comic escalation actually has a lot going for it. It's self-indulgent, but that winds up feeling better than the rest of the season's restraint. It's a weird blend of continuity-bound, in the way it draws from so many different parts of the MCU über-narrative, and continuity-free, in that it imagines a weird world where a lady and a ducklike alien can make an egg-baby together, and have that baby become a cosmic singularity.

And if nothing else, this season 3 episode is daring in a way not enough of What If? ever was. It isn't the series at its best, but it's certainly the series at its wildest, weirdest, and most go-for-broke. In a show that's entirely about alternate narrative paths for familiar stories, it's the episode that most displays an alternate path for What If? It's a peek into a version of the show where the creators aren't trying so hard to produce a series of plausible, meaningful MCU mini-movies to add to the multiverse canon. With this one episode, the What If? team takes full advantage of low-stakes thinking, animation's ability to create unimagined new worlds, and a "What the hell, let's do one for the fans" mentality, upending a sprawling franchise that creators mostly take a little more seriously than they need to.

What If? is now streaming on Disney Plus.
  • DESIGN-MILK.COM
    Good Vibes Only in This Coimbatore Hacienda by MuseLAB
    What makes a house a home? What imbues structure with meaning? The ancient Indian tradition, nay science, of Vastu Shastra outlines architectural guidelines to ensure each habitation fosters positivity and that the appropriate vibrations resonate for those who dwell within. Designers Huzefa Rangwala and Jasem Pirani, the minds behind Mumbai-based MuseLAB, applied those principles to the design of this sprawling Coimbatore Hacienda, tucked away in the city of the same name in the south Indian state of Tamil Nadu.

Equal parts first home and forever home, the residence was commissioned by owners who brought the design team on board to satisfy unique conceptual goals: contextualize the hacienda typology within the Chettinad vernacular, calculate Manaiyadi Shastram according to Vastu Shastra tenets, and create the perfect city-center oasis. Just as thoughtful glazing can create sightlines, so too can the geometry of each room and the placement of furnishings elicit a visceral reaction.

The 12,000-square-foot palatial construct is situated at the end of an internal road on a nearly 40,000-square-foot plot of lush green land in an otherwise urban landscape. Spatially and programmatically, the two-story structure is parsed by floor, with a dynamic interplay between the two through a generous central courtyard, a feature true to local building traditions. The ground floor unfurls horizontally such that the outdoors is in constant dialogue with the interior, just as social spaces easily commingle. Guests may easily meander about the formal living and dining rooms, wet and dry kitchens, lounge, pool, gym, and even guest quarters.

The first floor comprises private areas including the primary ensuite bedroom, children's rooms, and library.
While the upper level is connected by an elevator for convenience, the elaborate staircase, generous hallways, and sun-drenched walls beckon family and friends to ascend slowly, enjoying the circulation between the two levels.

The structure itself pays homage to the rich architectural heritage of Spain and Mexico while appealing to local tastes and responding to Coimbatore's tropical climate. To meet the moment, MuseLAB opted for elements like recessed fenestration for shelter from the harsh sun, rather than traditional overhangs. Pared-back decorative tile work in the courtyard calls back to historical aesthetics, similar to the restrained floral pattern lining the tremendous vaulted pool room, which looks to the future with its pixel-like composition.

Design choices throughout the home's common spaces make statements through rich surface texture, color, and bold home furnishings, which contrast with a palette more sensitive to those occupying the secluded areas. Daylight also plays a pivotal role in creating an emotive experience, from the way shadows track the sun's movement to the reflections created by bold material choices.

"This home is not a distinct departure from our usual gestures, but a conscious one where we have tried to champion the floor, focus on the vaulted forms, and select thoughtful elements within each space," the firm shares. "The greatest challenge was marrying all these needs to create a modern, comfortable, personal home for a family of four. In a recent conversation with the client, they proclaimed that every day in this home is akin to a vacation. I think this statement is a testament to the fact that we nailed the brief."

To learn more about the creative duo and their talented team, visit muselab.in. Photography by Ishi Sitwala.
  • LIFEHACKER.COM
    CES 2025: Lenovo's Rollable Laptop Can Unfurl Like a Scroll
    Forget foldable phones: Lenovo's got a rollable laptop. First revealed as a concept two years ago, the device is finally making its way to market, although with a decidedly uncatchy name: the ThinkBook Plus Gen 6 Rollable.

The idea here is that you can get some extra screen space without needing to carry around a secondary display or buy a device with multiple screens, like the Asus Zenbook Duo. Instead, the ThinkBook Plus Gen 6 Rollable will fit in your bag like any normal 14-inch laptop would, but if you press a dedicated key or signal your webcam with a special gesture, its motorized screen will roll up to reveal 50% additional screen space.

Credit: Michelle Ehrhardt

That's possible thanks to the OLED panel, since OLED actually has a side benefit beyond its excellent contrast: it's far more flexible than other display technologies. That's why you can get folding phones, although this scroll-like device is far bigger than the more pocketable flexible OLEDs you're probably used to. When fully unfurled, the display reaches 16.7 inches, so while you're not exactly doubling your resolution here, the expanded vertical room should be great for coders or people who might need a reference document up while working. You could also take meetings in the upper part of your screen while taking notes in the lower part, and your webcam will actually be a little higher up when the screen is at its full length, which could bring it closer to your face.

Credit: Michelle Ehrhardt

Aside from the screen, which also tops out at 120Hz for smooth input, the ThinkBook Plus Gen 6 Rollable seems to be a fairly typical Copilot+ Windows productivity PC. CPUs are available up to an Intel Core Ultra 7, while memory tops out at 32GB and storage can be configured up to 1TB. There are two Thunderbolt 4 ports as well as a fingerprint reader, but sadly, only one color option.
There's also no dedicated GPU.

Given those specs, the starting price point of $3,500 might be a bit rich for my blood, but as someone who's desperately missing my home setup's secondary monitor while writing all these CES articles from my hotel room, I see the vision.
  • LIFEHACKER.COM
    CES 2025: Finally, SteamOS Runs on a Handheld Other Than the Steam Deck
    Ever since it kicked off a renewed interest in handheld PC gaming in 2022, the Steam Deck has been my go-to recommendation for anyone with even a passing interest in taking their PC games on the go. It now has plenty of competition, some with better chips and some with better screens, but none have matched the convenience of Valve's SteamOS operating system, until now.

The Lenovo Legion Go S runs SteamOS

While most Steam Deck competitors use Windows, Lenovo announced during this year's CES that its new Lenovo Legion Go S will be the first PC handheld not from Valve to offer SteamOS at launch. That means it'll have a dedicated Steam button, easy access to Steam's store and your Steam Library (although it can still play games from other stores), immediate remote play functionality, and, perhaps my favorite feature, Valve's unmatched quick menu support (for adjusting everything from brightness to power consumption).

SteamOS is also a bit more lightweight than Windows, so games could theoretically run better, although, since it is based on Linux, some games will face limited compatibility. Still, the tradeoff is well worth it to me. The Proton tech SteamOS uses to help Windows games run on Linux has been generally reliable across my library, even with games that are supposed to have issues, and SteamOS is much easier to navigate with a controller than Windows.

That said, the Legion Go S will have a Windows version as well, which will accommodate anyone who needs to play a game with anti-cheat, since those have difficulties running on SteamOS. Sadly, it's limited to white, while the SteamOS version comes in what looks like black to me, but Lenovo assures me it's a type of purple.

Credit: Michelle Ehrhardt

Legion Go S vs. Steam Deck

So, why get the Legion Go S over a Steam Deck? Primarily, the chip and the screen. The Steam Deck is still an impressive bit of kit for its starting price point of $400 (or cheaper when buying refurbished), but it's getting a little long in the tooth when it comes to performance and visuals. The more expensive OLED upgrade can help a little bit with the latter, but the Legion Go S is generally more modern, and more akin to other competitors like the Asus ROG Ally.

While I've previously been skeptical about these competitors due to their lack of SteamOS, the Legion Go S promises the best of both worlds: modern hardware and Valve's convenient software. Let's start with the chip: over the base Steam Deck, it has either the existing AMD Z1 Extreme processor or the new, exclusive AMD Z2 Go processor. The Z1 Extreme has already proven itself in devices like the aforementioned Asus ROG Ally, putting out a dozen or so extra frames over the Steam Deck in the latest AAA games, while the as-yet-unreleased Z2 Go seems to be a more modest improvement over the custom AMD Steam Deck chip. According to AMD's CES press conference yesterday, you can expect the Z2 Go to have roughly the same compute core count as the Steam Deck's chip, but with a higher clock speed, the ability to use up to 30W of power, and four extra graphics cores. Both should perform better than the Steam Deck, although the latter will certainly be a little cheaper.

That extra performance plays into the other reason to upgrade: the improved screen. While the Steam Deck tops out at an 800p OLED panel with a 90Hz refresh rate, the Legion Go S has a higher-resolution 1920 x 1200 panel with a 120Hz refresh rate. It is LCD, so you'll lose out on the crisp OLED color contrast, but if you're like me and still running a Steam Deck LCD rather than the newer Steam Deck OLED, it'll be a definitive upgrade. If you do have the OLED, it'll largely be a matter of whether you prefer fidelity or speed.
The Legion Go S will be able to show a higher frames-per-second count (hence the need for extra performance), but the Steam Deck OLED will probably have better colors, even if it has a slightly lower resolution.

Credit: Michelle Ehrhardt

You can configure the RAM to be twice as high as any Steam Deck's, which should help make the handheld smoother to use. The remaining factors, then, are a matter of comfort and looks. The Legion Go S is a little thinner and lighter than the Steam Deck, and also marks an overhaul of the first Legion Go design, ditching the detachable controllers for a singular body. The right-hand touchpad from the first Legion Go is also still there, albeit smaller, although it doesn't have the dual touchpad setup that the Steam Deck has. On the plus side, it does have an extra USB-C port that the Steam Deck doesn't, as well as a slightly larger battery and optional trigger stops for a shorter pull.

All of that makes for a compelling combo for me: the first device to really tempt me away from the Steam Deck since I got it. You'll want to wait for reviews to make a final decision, but Lenovo has put itself in a comfortable position here, especially with the Legion Go S price.

Pricing

The Legion Go S Powered by Steam, which is the model that runs SteamOS, will launch in May starting at $500. That will get you a Z2 Go processor, 16GB of RAM, and 512GB of storage. Meanwhile, a 512GB Steam Deck OLED is $50 more expensive, at $550. Upgrades to storage and RAM will also be available, which also brings me to the Windows version of the device.

The Windows Legion Go S is actually getting a head start out the door, launching this month for $730. It comes with the Z2 Go processor, 32GB of RAM, and 1TB of storage. That's a lot to pay for some extra memory and storage, so it should put you at ease to know a Windows model that's otherwise identical to the SteamOS version will also drop in May for $600, alongside additional upgrade options for both models.
Those upgrades will include models with the Z1 Extreme chip, although specs for those configurations aren't fully available yet.

Credit: Lenovo

Note that Lenovo isn't considering the Legion Go S a full-on next-generation version of its existing Legion handheld. That would be the Legion Go 2, which is also at this year's CES, but in a prototype stage. The latter is set to ship with the new Ryzen Z2 Extreme processor, an OLED screen, a new finish, and a fingerprint reader. There's still no word on pricing or whether it'll include SteamOS, but Lenovo did tell me that it's aiming to ship within 2025. Fans of the first version can also rest easy knowing that the detachable controllers are still there, too.
  • LIFEHACKER.COM
    CES 2025: This New Smart Barbecue Is the Tech I'm Most Excited About
    This summer, I tested most of the smart grills on the market, but one rose above the rest: the Brisk It Origin. It turned me from a very occasional griller into a barbecue enthusiast by removing every pain point of outdoor cooking. Somehow, Brisk It has topped itself at CES this year, introducing a new grill, the Zelos, which brings the same hardware to the market at a more affordable price. Moreover, the Zelos carries the second generation of the AI engine that makes Brisk It great: Vera 2.0.

Brisk It grills are powered by wood pellets, so they have the ability to smoke your food with exceptional precision, but also to cook your food at higher temperatures if you're looking for more of a true grill experience without as much smoke. You load the grill hopper with pellets, much like any other smoker on the market, and an auger inside the hopper feeds the grill. You can operate the grill two ways: manually, from the display on the grill itself, or from the app. You'll likely end up doing a mixture of both as you get more comfortable with the grill, but I'll say this: Regardless of which method I used, the Origin has lit every single time I've pushed the button this year, easily 50-60 times. I expect the same from the Zelos. There is also a certain amount of giddiness one gets from being able to light a grill from the couch via the app.

While most AI-powered cooking devices have apps with recipe generation and cook-time suggestions, you'll actually use the advice that the Brisk It app gives you. It offers an impressively deep recipe bank, so you're likely to get a hit on any barbecue recipe you're looking for, and then it tells you precisely how to prep it, and, regardless of whether you took that advice or not, will perfectly cook it for you. You choose the recipe and send it to the grill. The grill will tell you when to put the food on; you insert a temperature probe and walk away.
The grill will execute the perfect smoke or grilling program, raising temperatures when appropriate, injecting higher smoke, and telling you when to flip the food or when to take it off. It will even turn itself off.

So, how could Brisk It get better? Vera 2.0 allows you to take snapshots of your ingredients on your phone, and Vera will identify the ingredients, suggest recipes, and, of course, the perfect way to cook them. Snap a picture of the contents of your fridge, grocery cart, or pantry. Brisk It even claims you can take a picture of cooked food you enjoy, and Vera will try to give you a recipe for it (I'd need to test this to believe it; it feels like the sort of AI promise that often fails).

Perhaps more importantly, the Zelos is made to be more affordable than the Origin models, which, though currently on sale, are regularly priced at $849 and $1,099. The Zelos promises to be $399, which is dramatically cheaper, but from the looks of it, doesn't sacrifice much on the build of the grill itself. It still looks solid, and I actually might prefer the placement of the display on the Zelos to that on the Origin (it looks to be placed much lower on the hopper).

I called my Brisk It one of my favorite pieces of smart tech this year: it's one of those unique pieces of hardware technology that lives up to the hype. Whether you are a confident cook or easily intimidated, great tech like the Brisk It can make you more competent at barbecuing.

The Zelos will be available for purchase in the next few months online at Amazon, Home Depot, Walmart, Lowes, and BriskItGrills.com.