• GAMERANT.COM
    What to Expect From Wuthering Waves in 2025
    Wuthering Waves has continued to find success in the open-world gacha genre since its release in May 2024. On January 2, 2025, Wuthering Waves released its 2.0 update, which marked its first-ever major patch. As one of the most popular free-to-play anime-style ARPGs on the market and a nominee at The Game Awards 2024, the game is set for a massive year in 2025.
  • GAMERANT.COM
    Hyper Light Breaker's Cursed Outpost, Explained
    Many games use the concept of a hub city: an eye-of-the-storm location that players can retreat to between expeditions into a broader, more dangerous world. In the upcoming roguelike Hyper Light Breaker, that hub city is known as the Cursed Outpost. Hyper Light Breaker players can return to the Cursed Outpost at the end of their runs, upgrading their equipment and taking in the world and lore around them.
  • GAMERANT.COM
    Facial Tracking Gone Wrong! (Game Fails #203)
    A player gets a bird's eye view in Assassin's Creed Odyssey and more funny fails.
  • GAMEDEV.NET
    How to integrate angular velocity to point particles which are connected by springs?
    Working on a simple 3D physics engine. I connected 8 point particles with springs so that they form a cuboid. Moving the particles via linear forces is easy, but integrating angular forces is something I have no idea how to do, since point particles don't have an orientation. However, if the particles are connected via springs, each particle should have an orientation from the center of mass to itself, right? What I'm wondering is how do I integrate the angular velocity to the orientation of t
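Not an answer from the thread, but a minimal sketch of the approach the poster is circling, under stated assumptions (equal particle masses; all function names here are invented for illustration): estimate the cluster's angular velocity from its angular momentum and inertia tensor about the center of mass, then integrate a quaternion orientation with it.

```python
import numpy as np

def angular_velocity(positions, velocities, mass=1.0):
    """Rigid-body angular velocity of a particle cluster about its center of mass."""
    com = positions.mean(axis=0)
    com_vel = velocities.mean(axis=0)
    r = positions - com                    # offsets from center of mass
    v = velocities - com_vel               # velocities relative to the COM
    L = mass * np.cross(r, v).sum(axis=0)  # angular momentum L = sum m (r x v)
    # Inertia tensor I = sum m * ((r.r) Id - r r^T)
    inertia = mass * sum(np.dot(ri, ri) * np.eye(3) - np.outer(ri, ri)
                         for ri in r)
    return np.linalg.solve(inertia, L)     # omega = I^-1 L

def integrate_orientation(q, omega, dt):
    """One explicit Euler step of dq/dt = 0.5 * (0, omega) * q, with q = (w, x, y, z)."""
    w, x, y, z = q
    ox, oy, oz = omega
    dq = 0.5 * np.array([
        -ox * x - oy * y - oz * z,
         ox * w + oy * z - oz * y,
        -ox * z + oy * w + oz * x,
         ox * y - oy * x + oz * w,
    ])
    q = q + dt * dq
    return q / np.linalg.norm(q)           # renormalize to fight drift
```

For stiff spring setups a sturdier integrator (semi-implicit Euler, RK4) would replace the plain Euler step, but the momentum-based omega above is the usual starting point for bodies that have no intrinsic orientation per particle.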
  • WWW.VFXVOICE.COM
    AI/VFX ROUNDTABLE: REVOLUTIONIZING IMAGERY - THE FUTURE OF AI AND NEWER TECH IN VFX
By JIM McCULLAUGH

Here features a de-aged Tom Hanks and Robin Wright. Their transformations were accomplished using a new generative AI-driven tool called Metaphysic Live. (Image courtesy of Metaphysic and TriStar Pictures/Sony)

The VFX industry is still in the formative stage of a revolutionary transformation, driven by rapid advancements in artificial intelligence (AI) and its tech cousins: VR, virtual production, AR, immersive and others. As we begin 2025, AI promises to redefine both the creative and technical workflows within this dynamic field. To explore the potential impacts and necessary preparations, a roundtable of leading experts from diverse corners of the global VFX industry brings insights from their experiences and visions for the future, addressing the critical questions.

Q. VFX VOICE: How do you foresee AI transforming the creative and technical workflows in the visual effects industry by 2025, and what steps should professionals in the industry take today to prepare for these changes? Are we entering AI and Film 3.0, the phase where filmmakers are figuring out workflows that string together specialized AI tools to serially generate an actual project? There is still a lot of fear (era 1.0) and cautious experimentation (era 2.0), but the most forward-looking are figuring out actual production processes.

With the help of Metaphysic AI, Eminem's music video "Houdini" created a version of Eminem from 20 years ago. Metaphysic offers tools that allow artists to create and manage digital versions of themselves that can be manipulated. (Images courtesy of Metaphysic and Interscope Records)

Blue Beetle marked the first feature film where Digital Domain used its proprietary ML Cloth tool, which captures how Blue Beetle's rubber-like suit stretches and forms folds and wrinkles in response to Blue Beetle's movements. (Image courtesy of Digital Domain and Warner Bros. Pictures)

A. Ed Ulbrich, Chief Content Officer & President of Production, Metaphysic: By 2025, AI will profoundly reshape the visual effects industry, enabling creators to achieve what was once deemed impossible. AI-powered tools are unlocking new levels of creativity, allowing artists to produce highly complex imagery and effects that were previously out of reach. These innovations are not only pushing the boundaries of visual storytelling but also drastically cutting costs by automating labor-intensive tasks and streamlining workflows.

Moreover, AI will accelerate production and post-production schedules, transforming the entire filmmaking process. With AI handling time-consuming tasks, teams can focus more on the creative elements, leading to faster, more dynamic productions. To stay ahead, professionals should embrace AI, continuously learning and adapting to rapid advancements, ensuring they are prepared to harness these tools to their fullest potential. AI-powered filmmaking tools are like jet fuel for creativity.

Fuzzy Door Tech's ViewScreen in action from the Ted TV series. ViewScreen Studio is a visualization tool that enables real-time simulcam of visual effects, while ViewScreen Scout is an app for iPhone. ViewScreen Studio visualizes and animates a complete scene, including digital assets, in real time and for multiple cameras simultaneously. (Image courtesy of Fuzzy Door Tech)

Harrison Ford transforms into Red Hulk for Captain America: Brave New World. (Image courtesy of Marvel Studios)

A. Lala Gavgavian, Global President & COO, Digital Domain: AI tools are already making strides in automating rotoscoping, keying and motion capture cleanup, which are traditionally labor-intensive and time-consuming tasks. In 2025, these tools will be more sophisticated, making post-production processes quicker and more accurate. The time saved here can be redirected to refining the quality of the visual effects and pushing the boundaries of what's possible in storytelling.
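Keying, one of the labor-intensive tasks Gavgavian cites, is easy to illustrate with a deliberately naive non-AI baseline; ML keyers earn their keep exactly where a rule like this fails (spill, motion blur, hair detail). The function name and threshold below are illustrative, not any studio's tool.

```python
import numpy as np

def green_matte(rgb, threshold=0.2):
    """Naive chroma-key matte: a pixel is 'green screen' when its green
    channel dominates red and blue by more than `threshold`.
    Returns alpha in [0, 1]: 0 = keyed out, 1 = foreground."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Linear ramp from fully keyed (g >> max(r, b)) to fully kept.
    return np.clip((np.maximum(r, b) + threshold - g) / threshold, 0.0, 1.0)
```

A composite then blends with the matte, e.g. `alpha[..., None] * fg + (1 - alpha[..., None]) * bg`; the hand-tuned threshold is what learned mattes replace.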
AI has the possibility of being added to the artist's palette, allowing expansion to experiment with different styles in a rapid-prototyping way. By harnessing the power of AI, VFX professionals can unlock new levels of creativity and efficiency, leading to more immersive and personalized storytelling experiences. We are indeed moving into what could be considered the AI and Film 3.0 era. This phase is characterized by transitioning from fear (1.0) and cautious experimentation (2.0) to practical application. Filmmakers and VFX professionals are now figuring out workflows integrating specialized AI tools to create full-fledged projects. These tools can handle everything from pre-visualization and script breakdowns to real-time rendering and post-production enhancements. However, this transition is not without its challenges. There will be concerns about job displacement and the ethical implications of AI-generated content. To address these issues, the industry must adopt a balanced approach where AI augments human creativity rather than replacing it. Transparent discussions about the role of AI and its ethical implications should be held, ensuring that the technology is used responsibly.

A. Brandon Fayette, Co-Founder & Chief Product Officer, Fuzzy Door Tech: By 2025, AI is poised to significantly transform both creative and technical workflows in the visual effects industry. AI's impact is already evident in the entertainment sector, and it is set to become the standard for automating repetitive tasks such as shot creation and rendering. This automation is not limited to VFX; we can see AI's efficiency in code generation, optimization, testing and de-noising audio, images and video. Technical workflows will become more flow-driven, utilizing AI to dynamically adapt and drive the desired creative results. This means AI will assist in creating templates for workflows and provide contextual cues that help automate and enhance various stages of the creative process. AI is advancing rapidly, with new tools and techniques emerging almost daily. To stay ahead of these changes, VFX professionals should remain aware of new trends in AI and generative content. Continuous learning and adaptation will be crucial. However, the industry needs to establish standards and guidelines to ensure AI complements rather than compromises the artistic process. Our focus with the ViewScreen family of ProVis tools is on using AI to support and enhance human creativity, not replace it. By improving processes across production workflows, AI can make jobs easier while respecting and preserving the craft and expertise of entertainment professionals.

With GPU-accelerated NVIDIA-Certified Systems combined with NVIDIA RTX Virtual Workstation (vWS) software, professionals can do their work with advanced graphics capabilities from anywhere, able to tackle workloads ranging from interactive rendering to graphics-rich design and visualization applications or game development. (Image courtesy of NVIDIA)

Examples of joint deformations before and after AI training shapes. (Image courtesy of SideFX)

A. Nick Hayes, ZEISS Director of Cinema Sales, U.S. & Canada: This past year, we have already seen fingerprints left by AI on both the technical and creative sides of the film industry. Companies like Strada are building AI-enabled production and post-production toolsets to complete tasks widely considered mundane, or that nobody wants to do. In turn, this new technology will allow VFX artists and post-production supervisors more freedom to focus on the finer details and create out-of-this-world visuals never seen before. I see this resulting in a higher grade of content, more imagination and even better storytelling. Recently, Cinema Synthetica held an AI-generated film contest.
The competition founders argued that the use of generative AI empowers filmmakers to bring their stories to life at a much lower cost, and faster, than traditional filmmaking methods. Now, creatives can use software tools from companies like Adobe and OpenAI to create content from their mind's eye by simply describing their vision in just a few sentences. In a way, the use of AI can be inspiring, especially for filmmakers with lower budgets and less experience. In fact, in the next 12-24 months, we will see a surge of highly entertaining, imaginative content created by humans, assisted by AI.

Character poses created in Houdini and used for AI training of joints. (Image courtesy of SideFX)

Final result of posed character after AI training of joints, created and rendered in Houdini by artist Bogdan Lazar. (Image courtesy of SideFX)

A. Neishaw Ali, Founder, President, Executive Producer, Spin VFX: AI is set to transform the VFX industry by automating repetitive tasks, enhancing creativity and enabling real-time rendering. By staying up-to-date with AI tools, collaborating across disciplines, experimenting with new technologies and focusing on creative skills, professionals can effectively prepare for and leverage these advancements to enhance their workflows and deliver more innovative and compelling visual effects. We have been working with AI for many years in VFX, and only now is it available at a consumer level and poised to significantly transform both creative and technical workflows in the visual effects industry, in several key areas:

Concept Development: Allows for visual ideation among the director, creative team and VFX to solidify a vision in hours rather than weeks. It enables real-time alignment of the creative vision through text-to-image generation, a process not unlike Google image searches but far more targeted and effective.

Automation of Repetitive Tasks: Automation of repetitive and non-creative tasks such as rotoscoping and tracking will significantly reduce the time and effort required for these laborious processes, allowing our artists to concentrate more on the creative aspects of the scene, which is both energizing and inspiring for them.

Face Replacement: AI is revolutionizing face replacement by enhancing accuracy and realism, increasing speed and efficiency, and improving accessibility and cost-effectiveness, allowing for high-quality face replacement for a wide range of applications. Proper authorization and clearance are necessary to ensure we do no harm to any likeness or person.

Real-Time Rendering: Though not only AI-driven, real-time rendering is most certainly changing the VFX workflow. As the quality of final renders becomes more photorealistic, and AI-enabled technologies like denoising and up-resing allow more complex scenes to be scalable in software like Unreal Engine, the design and iteration process will accelerate.
Changes can be instantly viewed and assessed by everyone.

Steps for Professionals to Prepare: I believe one of the biggest challenges for some VFX artists and professionals is understanding that embracing AI does not mean sacrificing anything. Instead, it allows you to work smarter and more effectively, dedicating more time to creative tasks rather than monotonous, repetitive ones.

A. Antoine Moulineau, CEO & Creative Director, Light Visual Effects: AI feels like the beginning of CGI 30 years ago, when a new software or tool was out every week. There is a lot of different tech available, and it's very hard to focus on one thing or invest in specific workflows. At LIGHT, we are focusing on better training artists with Nuke's CopyCat and new tools such as ComfyUI. Up-res and frame interpolation are already huge time-savers in producing high-res renders or textures. AI tools like Midjourney and FLUX have already massively disrupted concept art and art direction; they now play a major part in the workflow. 2025 will be about animated concepts, and possibly postvis, if tools such as Runway Gen-3 mature enough to offer the control required. A major blocker for final use remains controlling the AI and the lack of consistency of the tools. As said earlier, there is so much happening now that it is hard to keep up, or to rely on the tools being stable enough to integrate into a pipeline. I don't know if it will be in 2025, but I can see AI disrupting the CGI pipelines in the very short term; generative AI could replace traditional rendering in many scenarios and reduce the need for texturing or high-resolution modeling in the near future, specifically for wide environments. Lip-sync is also a process where AI is really shining and will disrupt traditional workflows in 2025. We will start seeing directors preparing an AI version of their films, with an edit of animated concepts and music, during the pitching/concept phase, especially for advertising. This is such a helpful process for understanding and communicating their vision. It's kind of a Moodboard 3.0, and I can certainly imagine this process becoming the norm very quickly. For very short-form social content, it will probably replace traditional workflows entirely. That being said, I think long-form remains an art form where actors and performance remain central, and I don't see AI taking over anytime soon. It is hard for me to see the point of that. We need real people to identify with so we can connect to the content. Art is about the vision; it captures society and the world as it is at the time it is made. In other words, AI remains a gigantic database of the past, but we still need the human creation process to create new art. A good example: AI wouldn't be able to generate a cartoon version of a character if someone hadn't invented cartoons previously. It will accelerate processes for sure, but not replace them.

A. Christian Nielsen, Creative Director, The Mill: Predicting the future is challenging, especially given AI's rapid advancement. However, I anticipate an increasing integration of AI tools into the VFX pipeline. We're already seeing this to some degree with AI-powered rotoscoping and paint tools, which address some of the most common repetitive tasks in VFX. Additionally, inpainting and outpainting techniques are emerging as powerful tools for removing elements from shots and creating set extensions. ComfyUI has already become an integral part of many AI pipelines, and I foresee its integration expanding across most VFX studios. I strongly recommend that everyone in the VFX industry familiarize themselves with AI to better understand its capabilities and implications. The integration of AI into VFX is both inevitable and unstoppable. There's still progress to be made before full text-to-video tools like Runway Gen-3 or Sora can be used to create complete AI commercials or movies. The main challenge is the lack of precise control with AI. If a director dislikes a specific element in a shot or wants to make changes, there's currently no way to control that. As a result, AI tools are generally not very director-friendly. At present, these tools work best for ideation and concept development, much as we use Midjourney or Stable Diffusion for still concepts. Initially, AI could be used for creating stock elements, but I'm confident that OpenAI and others are working on giving users more control. Over the past 12 months, we've used AI for several commercials and experiences, learning as we go. This technology is so new in the VFX industry that there's little experience to draw from, which can lead to some long workdays.

A. Mark Finch, Chief Technology Officer, Vicon: The industry is going through considerable change, as audience preferences and consumer habits have evolved significantly in recent years. More people are staying in than going out, tentpole IPs are reporting decreased excitement and financial returns, and we've seen a period of continuous layoffs. As a result, there's a lot of caution and anticipation as to what's next. In a transitional period like this, people are looking at the industry around them with a degree of trepidation, but I think there's also a significant amount of opportunity waiting to be exploited. Consumer hunger for new worlds and stories powered by VFX and new technologies is there, along with plenty of companies wanting to meet that demand. For the immediate future, I predict we're going to see a spike in experimentation as people search for the most effective ways of utilizing these technologies to serve an audience whose appetite knows no bounds. Vicon is fueling that experimentation with our work in ML/AI, for example, which is the foundation of our markerless technology. Our markerless solution is lowering the barriers to entry to motion capture, paving the way for non-technical experts to leverage motion capture in their industries. An example we've come to recognize is giving direct access to motion capture to animators who historically would have had access to it only through mocap professionals on the performance capture stage, which is expensive and in high demand.
This unfettered access shortens the creative iteration loop, which ultimately leads to a faster final product that is representative of their creative dream. There's a lot of excitement and noise surrounding the rapid growth of AI and ML-powered tech. It's impossible to look anywhere without seeing tools that encourage new workflows or provide enhancements to existing ones. A consequence of this is that you can fall into the mindset of, "This is the way everything is going to be done, so I need to know about it all." When technology is moving so fast, you risk spreading yourself thin across a wealth of tools that are still finding their feet and may themselves be made redundant, replaced or improved beyond recognition in the future. The best preparation comes from understanding the problem before the solution; in other words, identifying the obstacle you need to overcome first. You get this by focusing on people: speaking to them about their challenges, researching those that exist across their industry in general, and gaining an understanding of why a certain tool, workflow or enhancement might exist.

A. Paul Salvini, Global CTO, DNEG: AI, especially machine learning, is poised to significantly impact the visual effects industry, transforming both creative and technical workflows. At DNEG, we are investing in the development of new AI-enabled tools and workflows to empower artists and enhance the creative process. For us, storytelling remains paramount, so our use of AI is directed toward activities that provide better feedback for artists and deeper creative control. In terms of artist-facing tools, some of the areas likely to see early adoption of AI and ML techniques throughout 2025 include: improving rendering performance (providing faster artist feedback); automating repetitive tasks; procedurally creating content; generating variations; and processing, manipulating and generating 2D images. AI techniques and tools are increasingly being used to generate ideas, explore creative alternatives and build early stand-ins for various locations, characters and props. As with all new tools, professionals can prepare by learning the basics of AI and seeing how these tools are already being explored, developed and deployed in existing industry-standard packages. Some AI and ML tools work invisibly, while others require direct user involvement. An abundance of publicly available and user-friendly websites has emerged, allowing artists and the general public to experiment with various ML models to better understand their current capabilities and limitations. These new tools, while impressive, further emphasize the importance of human creativity, communication and collaboration. Our collective job of developing and bringing great stories to life remains unchanged. However, as our tools improve, we can dedicate more time to creative endeavors and less to mundane tasks. This is truly a better way to create better content.

A. Christopher Nichols, Director, Chaos Labs: Machine learning has been transforming the industry for years, so it's nothing new to VFX artists, especially when it comes to digital humans, rotoscoping, fluid sims and analyzing data and camera-tracking information. AI will continue to take on a bigger piece of the workflow and replace a lot of traditional VFX techniques in time.
The industry will just continue to adapt. Creating high-level content is going to become much more accessible, though. Soon, independent filmmakers will create shots that would once have been the sole domain of high-end VFX houses. This will free the latter to experiment with more ambitious work. Currently, Chaos is trying to help artists get to LED screens faster via Project Arena and NVIDIA AI technology; you'll likely see AI solutions become commonplace in the years ahead. You'll also probably see fewer artists per project and more projects in general, too, as AI makes things more affordable. So instead of 10 movies a year with 1,000 VFX artists on each movie, it'll be more like 1,000 films with 100 names per project. The elephant in the room is generative AI. However, the big movie studios are reluctant to use it due to copyright issues. Right now, the matter of where the data is coming from is being worked out through the court system, and those decisions will influence what happens next. That said, I don't think an artist will be replaced by a prompt engineer anytime soon. The best work you see coming out of the generative AI world is being done by artists who add it to their toolsets. You still must know what to feed these tools, and artists know that better than anyone.

A. Greg Anderson, COO, Scanline VFX and Eyeline Studios: In 2025, AI tools and technology are poised to significantly transform how visual effects are created, from automating the most mundane of tasks to expanding the possibilities of the most complex visual effects sequences. Several compositing packages already incorporate AI-based features that greatly improve rotoscoping, tracking, cleanup speed and quality. These features will continue to improve in 2025, allowing artists to spend more time on the final quality of shot production. The ongoing and fast-moving development of generative AI tools and features will change the process, efficiency and quality of everything from digital environments to effects and character animation. From a technical and production workflow standpoint, AI will continue to optimize render processes, allowing for more iterations and leading to more convincing imagery that is faster and more cost-effective.
New tools will assist VFX teams in organizing, managing and accessing vast libraries of digital assets, making it easier for artists to find and reuse elements across different projects. Data-driven insights will also allow AI tools to predict which assets might be needed based on project requirements. Overall, AI technology is poised to revolutionize the VFX industry next year and beyond, as we've only begun to scratch the surface of what will be possible. In preparation, anyone working in the VFX industry should lean heavily toward curiosity, continuous learning and skill development. Time spent experimenting with AI tools and technologies in current workflows will heighten the understanding of AI's capabilities and limitations. Additionally, while AI can enhance many technical aspects, creativity remains a human domain. Artists should focus on developing artistic vision, storytelling skills and creative problem-solving abilities.

A. David Lebensfeld, President and VFX Supervisor, Ingenuity Studios and Ghost VFX: In 2025, we will see a continuation of idea genesis happening through generative AI tools. We will also find that our clients use generative AI tools to communicate their ideas, leveraging easy-to-use tools they have never had before. The sacrifice is controllability, but the benefit is ease of communication. Most of our studio clients have a real sensitivity to how AI is being used on their projects, and they want it to be additive to the projects rather than a threat to the ecosystem. In the short term, generative AI will be used more as a tool for communication than for execution. We'll continue to see AI-based tools in our existing software packages, giving both in-house and vendor tool developers and software developers room to expand their offerings. While AI advancements will continue to improve existing toolsets, they won't replace team members at scale, especially in the high-end part of the market. Looking ahead, I think the best professionals in our industry are already dialed in to developing toolsets and new technologies. It's always been the case that you have to be agile and stay aware of continual software and hardware developments. VFX is the intersection of technology and art; you must know and constantly improve both to stay competitive. Also, on a professional level, I don't think we'll see meaningful changes in 2025 to how VFX final pixels get made on the studio side, for a multitude of reasons, two being a lack of granular control and sour optics. How people are talking about AI can often feel like a marketing trick. Everyone is using the same basic technology layer, and that always gets better as all boats rise. Like anything else, the people who know and leverage advanced technology the best and the most creatively will continue to win.

A. Mathieu Raynault, Founder, Raynault VFX: When I first thought about how AI might affect the visual effects industry, I felt both skeptical and anxious. But since I started in computer graphics in 1996, I haven't seen anything with this much potential for exciting transformation. At Raynault VFX, AI is set to significantly boost our efficiency by automating routine tasks and letting our team focus more on the creative parts of our projects. We're a small team of 55, and creativity is at the heart of what we do. We've started using AI to increase our productivity without sacrificing our artistic integrity. With seven full-time developers, we're heavily invested in research and development, including AI, to improve our workflows. Looking ahead, I see AI enhancing our current tools, helping us keep control over the creative process and refine our work with client feedback.
This blend of AI and human creativity is crucial because filmmakers will still rely on creative teams to bring their visions to life. Although there's some worry about AI's ability to create entire films or TV shows on its own, I think these tools won't replace human-driven filmmaking anytime soon. AI will certainly transform our workflows and could lead to shifts in employment within our industry. VFX artists will become more productive, able to deliver more work in less time, which might lead to a reduction in job numbers compared to pre-strike highs. For VFX professionals, integrating AI into their workflows is essential, yet it's crucial to preserve and enhance our existing skills. In the field of concept art, for example, AI can assist in drafting initial designs, but the intricate process of refining these concepts to align with a director's vision will still require human expertise. Artists who can both direct AI and iterate while creating concept art themselves will be invaluable. In summary, I'm quite optimistic. As we move toward 2025, adopting AI requires us to change our skills and approaches to stay competitive and innovative. As a business owner in the VFX industry, it's incredibly motivating!

A. Viktor Müller, CEO, Universal Production Partners (UPP): To some extent, AI has already begun to transform the industry. We see demonstrations of its growing capabilities almost on a weekly basis, and there seems to be a lot of fear around that. Honestly, I'm not worried about it at all. I could sense it coming long before it started turning up in the media, which is why UPP has been quietly building out our VP and AI departments for the last six years. I know some people look at AI and its use as being somehow catastrophic for our business but, at the end of the day, I think it'll be just another tool in our arsenal and, used wisely, a great one. The faster artists and companies embrace it and learn to use it in their workflows, the better, and we're already seeing that adaptation now.

A. Kim Davidson, President & CEO, SideFX: Over the past year, we have seen several advancements in AI in the visual effects industry, and we expect this to continue in 2025. So far, the advancements have been more evolutionary than revolutionary.
AI is not replacing creatives or the production pipeline but is greatly speeding up many of the more mundane tasks while not fully eliminating them yet. Tracking and rotoscoping are key examples of tasks that have been improved and sped up. We predict that 2025 will see more AI-based tools being used throughout the pipeline, with improved AI implementations and some brand-new tools. These AI-enhanced workflows will include design concepting, asset (model and texture) creation, motion stabilization, improved character animation and deformation (e.g. clothing, hair, skin), matching real-world lights, style transfer, temporal denoising and compositing. Of course, there will be improvements (and more releases) of prompt-based generative video applications. But for a variety of reasons, we don't see this as the best workflow for creative professionals, and certainly not the be-all and end-all for art-directed content creators. We believe in providing artists with AI/ML-enhanced toolsets to bring their creative visions to life more quickly and efficiently, allowing for more iterations that should lead to higher quality. We are at an exciting stage in the confluence of powerful hardware and AI-enhanced software where creative talent will be more important than ever and able to harness creative platforms to tell stories in truly extraordinary new ways.

A. Dade Orgeron, Vice President of Innovation, Shutterstock
2025 is here, but with generative AI technology moving so quickly, I think we can expect to see AI continue to transform the visual effects industry, particularly through advancements in generative video and 3D tools. As AI models continue to improve, we can expect notable enhancements in temporal consistency and reduced distortion, along with compositing tools to help seamlessly integrate AI-generated content into live-action footage or easily remove/replace unwanted people or objects.
In the next wave of generative video models, complex mechanical devices and other intricate details will be represented with unprecedented precision, and advanced dynamics and fluid simulations will start to become achievable with generative video rather than traditional, time-consuming simulation engines. Will it be perfect? Maybe not in the next six months, but perhaps within the next year. To prepare for these advancements, VFX professionals should invest in upskilling themselves in AI and machine learning technologies. Understanding the capabilities, and particularly the limitations, of AI-driven tools will be essential. They should experiment with generative image and video technologies, as well as 3D tools that leverage AI, to streamline their workflows and enhance their creative skills. That's something we are actively enabling at Shutterstock through partnerships with NVIDIA and Databricks. For instance, we've developed our own GenAI models to accelerate authentic creative output, all with ethically sourced data. Early adoption and a shift towards embracing new technologies and methodologies will enable artists and technicians to remain competitive and innovative in these rapidly evolving times.

A. Gary Mundell, CEO, Tippett Studio
The big question is: What will AI mean to us in 2025? As we move through the Gartner Hype Cycle, AI seems to be transitioning from the Trough of Disillusionment into the Slope of Enlightenment, much like the early days of the dot-com era. AI is poised to bring a suite of tools that handle obvious tasks (roto, match move, res-up, FX), but that's just the tip of the iceberg. Anything described by a massive database can use AI. If you can articulate your prompts, and there's a database to train the answers, you're set. Forget influencers; soon, prompters will drive production with AI-generated insights. By 2025, AI will fundamentally change VFX production. Imagine a system capable of generating an entire schedule and budget through prompts.
AI could create a VFX schedule for a 1,200-shot project, complete with budgets, storyboards, 3D layouts and animatic blocking, all tailored to a director's style and the level of complexity. However, where today's AI falls short is in the temporal dimension: it struggles with believable, complex animation. Current engines tend to produce flowy, slow visuals lacking continuity, and while many tools claim to address this, it will take time before AI excels at high-quality animation. At Tippett Studio, we leverage AI for previsualization, conceptualization and project management. Using TACTIC Resource, we integrate AI into planning and resource management, handling vast production data to predict outcomes and streamline workflows. As we move into 2025 and beyond, AI's data management capabilities will be key to future productivity and financial success, even as we await more advanced animation tools. As AI continues through the Peak of Inflated Expectations and towards the Plateau of Productivity, its role in VFX production will become increasingly significant.
  • WWW.VFXVOICE.COM
    STREAMING AND VFX: CULTIVATING THE ABILITY TO ADAPT TO CONSTANT CHANGE
By CHRIS McGOWAN

Shōgun (Image courtesy of FX Network)

Despite the lingering effects of 2023's writers' and actors' strikes, the streamers continue to disrupt the industry. "Streaming has increased the demand for VFX work and accelerated the growth of all parts of the production and post-production industries," says Tom Williams, Managing Director of DNEG Episodic.

Among the leading streamers, Netflix had 277.65 million paid subscribers worldwide as of the second quarter of 2024, according to Statista research, an increase of over eight million subscribers compared with the previous quarter, and Netflix's expenditures on content were expected to stabilize at roughly 17 billion U.S. dollars by 2024. Also by 2024, the number of Amazon Prime members in the United States was projected to reach more than 180 million users. In Q2 2024, the number of Disney+ subscribers stood at around 153.6 million, according to Statista, while the combined number of subscribers to Warner Bros. Discovery's Max (formerly HBO Max) and Discovery+ services surpassed 103 million. Apple TV+, Hulu, Paramount+ and Peacock are among the others with significant viewership.

Such subscriber numbers have bankrolled a lot of visual effects and animation. "Streaming has been a game-changer for the VFX industry. It has significantly increased demand. With platforms constantly producing new content, visual effects studios have more opportunities than ever before," comments Valérie Clément, VFX Producer at Raynault VFX Visual Effects & Environments. "The rise of streaming has also shifted the focus from traditional films to high-budget series, which has diversified the types of projects we work on at Raynault." Jennie Zeiher, President of Rising Sun Pictures (RSP), remarks, "The advent of streaming had a huge impact that we're still feeling today, not only for global consumers, but studios, production companies, TV channels, post houses, VFX studios; the entire industry was impacted. [It was a major disruption in the industry] that changed how content was consumed."

The Last of Us (Image courtesy of HBO)

BUDGETS & MODELS

Streaming changed the way the industry was divided up and took away market share from broadcast and theatrical, according to Zeiher. She explains, "In 2017, RSP's work was still wholly theatrical. We predicted that over the course of that year, we would be progressively taking on more streaming projects and that the year following, our work would be distributed 50/50. This indeed played out, and it tells the story of how a disruptive change can affect a business model. Fast forward to today, the industry is more complex than ever, made more so by the fact that streaming opened up distribution to a global, multi-generational audience, which is more diverse than ever."

"Everyone is more budget-conscious at the moment, which is not a bad thing for VFX as it encourages more planning and the use of previs and postvis, which helps everyone deliver the best possible end product," Williams says. "We are a technology-driven industry that is always moving forward, combined with incredible artists, so I think we will always see improvements in quality." Zeiher adds, "I think studios are still trying to settle on their model. There are fewer big hits due to diversity in taste, and there are more risks around greenlighting productions at a higher price point. What made a hit five or 10 years ago isn't the same as it is today. There is more diverse product in the pipeline to attract more diverse audiences. The streamers are producing high-end series, but they are more concentrated among a handful of studios."

3 Body Problem (Image courtesy of Netflix)
House of the Dragon (Image courtesy of HBO)
Foundation (Image courtesy of Apple TV+)
The Lord of the Rings: The Rings of Power (Image courtesy of Prime Video)
The Boys (Image courtesy of Prime Video. Photo: Jan Thijs)

SHARING WORK

"Productions normally split work between multiple vendors," Zeiher notes. "This work can be sensitive to timing and schedule changes. Therefore, VFX vendors need to have a plan on how they manage and mitigate any changes in schedule or type of work. Besides capability and the quality of the creative, this is the biggest singular challenge for VFX vendors and is the secret to a successful studio!" Zeiher adds, "Studios have always split work between multiple vendors, and only in limited scenarios kept whole shows with single vendors, and this continues to be the trend. The studios are splitting work among their trusted vendors who have the capability in terms of crew and pipeline to hit schedules and manage risks."

"The increase in work has meant that more shows than ever before are being shared between different VFX houses, so that will add to the cooperation. Being a relatively young industry, it doesn't take long to find a mutual connection or 10 when you meet someone else from VFX at an event," Williams says. Comments Wayne Stables, Wētā FX's VFX Supervisor on House of the Dragon Season 2, "I'm not sure that I've seen a big change [in business and production models]. We bring the same level of creativity and quality to everything we do, be it for feature film or streaming, and use the same tools and processes. I approach it the same way as I would working on a film. I think episodic television has always pushed boundaries. I remember when Babylon 5 came out [and] being amazed at what they were doing, and then seeing that ripple through to other work such as Star Trek: Deep Space Nine."

Fallout (Image courtesy of Prime Video)
In Your Dreams. Coming in 2025. (Image courtesy of Netflix)
The Wheel of Time (Image courtesy of Prime Video)

HIGHER EPISODIC QUALITY

Working with the VFX studios, the streamers have set the visual effects bar high by bringing feature film quality to episodic television. Game of Thrones comes to mind, despite starting before the streaming boom. "It revolutionized what viewers could expect from a series in terms of production value and storytelling. Later seasons had blockbuster-level budgets and cinematic visuals that rivaled anything you'd see in theaters," Clément says. "Netflix has also made significant strides with shows like Stranger Things, which combines appealing aesthetics and compelling storytelling, and The Crown, known for its luxurious production design and attention to detail. Also, series like Westworld and Chernobyl both deliver sophisticated narratives with stunning visuals that feel more like feature films than traditional TV. These are just a few examples, of course. The range of projects that have made a significant impact in the streaming world is vast."

Zeiher also points to the streaming titles The Rings of Power, Avatar: The Last Airbender, Shōgun, Monarch: Legacy of Monsters, Loki Season 2, Fallout [and] the Star Wars universe, with recent series such as Andor, Ahsoka and The Acolyte, as having brought feature-film quality to episodic. Stables comments, "As the techniques used on big visual effects films have become more common, we have seen more high-end work appear everywhere. Looking at work in Game of Thrones and then, more recently, Foundation and through to shows like Shōgun. And, of course, I am proud of our recent work on House of the Dragon Season 2, Ripley and The Last of Us."

EXPECTATIONS

"The expectation of quality never changes; showrunners, writers and directors can spend years getting their visions greenlit, and no one is looking to cut corners. We all want to do our best work, regardless of the end platform," Williams says. Regarding the delivery dates for series episodes, Stables comments, "I haven't ever found the timeframes to be short. The shows tend to be very structured with the fact that you have to deliver for each episode, but that just brings about a practicality as to what is important. As with everything, the key is good planning and working with the studio to work out the best solution to problems." Clément says, "While the compressed timelines can be challenging, the push for high-quality content from streaming platforms means that we are constantly striving to deliver top-notch visuals, even within tighter schedules. This is always exciting for our team."

Sakamoto Days. Coming in 2025. (Image courtesy of Netflix)
A Knight of the Seven Kingdoms: The Hedge Knight. Coming in 2025. (Image courtesy of HBO)

CHANGES IN THE STREAMER/VFX RELATIONSHIP

"I think that showrunners and studios are seeing that it is now possible to create shows that perhaps in the past were not financially feasible. So, we are developing the same relationships [with the streamers] that we have had with the film studios, seeing what we can offer them to help tell their stories," Stables states. "Relationships can be reciprocal, or they can be transactional," Zeiher observes. "In VFX, we very much operate in a reciprocal relationship with the studios and their production teams; it's a partnership at every level. Our success is based on their success and theirs on ours."

Knuckles (Image courtesy of Paramount+ and Nickelodeon Network)

GLOBAL COOPERATION

Streaming is enhancing global cooperation among VFX studios by creating a greater need for diverse talent and resources. Clément says, "As streaming platforms produce more content, studios around the world are teaming up to manage the growing amount and complexity of VFX work. Advances in remote work technology and cloud tools make it easier for teams from different regions to collaborate smoothly and effectively." Zeiher explains, "RSP's work on Knuckles is a great example of global, inter-company collaboration. Instead of using a single vendor, the work was split between several, mostly mid-size, vendors. The assets were built to a specification and shared using Universal Scene Description, allowing asset updates to be rolled out simultaneously across vendors and providing a consistent look across the characters. Paramount's approach to Knuckles was very smart and could be indicative of future workflows."

The Witcher: Sirens of the Deep. Coming in 2025. (Image courtesy of Netflix)

"VFX is a tumultuous industry and, off the back of the WGA and SAG-AFTRA strikes, we've entered a time of consolidation," says Zeiher. "Studios, often backed by private equity, are acquiring small to mid-size studios. This is helping them to distribute work globally across many jurisdictions. Dream Machine is an example of this new collaborative model with its recent acquisition of Important Looking Pirates and Cumulus VFX, joining Zero, Mavericks and Fin Design. Likewise, RSP has its sister studios FuseFX, FOLKS and El Ranchito under its parent company Pitch Black; it's a new form of global collaboration: mid-size studios, with different offerings across brands and locations, who can collaborate under one banner."

"I think that the streaming distribution model was the first disruption, and that distribution continues to evolve," Zeiher comments. "The production model may now be disrupted through the use of GAI. Combining the distribution evolution, audience consumer changes and using GAI in production, we're in for lots more changes in the year(s) to come." Clément states, "As streaming platforms experiment with new content formats and distribution methods, VFX studios will adapt to different types of media and storytelling approaches."
    WICKED IS A COLLAGE OF DIFFERENT LENSES AND TALENTS
By TREVOR HOGG

Images courtesy of Universal Studios.

Elphaba (Cynthia Erivo) and Glinda (Ariana Grande) get a private tour of the Royal Palace of Oz.

In recent years, exploring the backstories of iconic villains has become more in vogue with the release of Maleficent, Joker and now Wicked, a Universal Pictures production that brings the Broadway musical adaptation of Wicked: The Life and Times of the Wicked Witch of the West by Gregory Maguire to the big screen. No stranger to musicals is filmmaker Jon M. Chu, who has been making them ever since he was a USC film school student, but this time around, the scale is a throwback to Hollywood classics such as The Wizard of Oz, with the added benefit of the visual effects industry, which didn't exist back then.

"There is this grandiose nature to Wicked, but from the beginning, we always wanted it to feel touchable and immersive," director Jon M. Chu explains. "We wanted to break the matte painting of Oz that we have in our mind. What happens if you could live in it? What happens if you can touch the dirt and textures? Visual effects are extremely powerful to be able to do that. Of course, we worked hand in hand with building as well, by planting nine million tulips and having a real train and Wizard's head, but we're all in it together."

Jonathan Bailey as Prince Fiyero performs in front of the rotating university library set, which was constructed by the special effects team led by Paul Corbould.

Massive sets were built. "I firmly believe you've got to build as much set as you possibly physically can, or do as much for real as you possibly physically can, because the real photography on that set informs visual effects on how everything should look," states Production Designer Nathan Crowley. "That is fundamental. You can't just put a bluescreen up because you're going to get enough of that anyway. You've got to try to balance it. The act of physical construction is extremely informative."
Crowley says, "The thing is, if you do a concept and don't build it, then you miss out on the art direction of making it. Doing concept art in 3D was imperative. We will build, paint and finish a 3D model and will deliver it rendered to Pablo Helman [Visual Effects Supervisor]. Pablo has to rebuild it because visual effects have to go into a lot more specific areas, but at least he knows what it should look like. We also go scout places, and even if we don't film that place, we'll say to Pablo and Framestore, which does a lot of the environments, 'That's what we need it to look like. We need to go to the south coast down to Poole and Bournemouth and get that set of cliffs, and that becomes Shiz.' Emerald City is a hard one because you're going much higher [digitally]. I would try to build enough below 50 feet so he would have textures."

Cinematographer Alice Brooks stands behind director Jon M. Chu as he discusses a shot she captured with custom-made lenses by Panavision.

Cynthia Erivo decided to go with practical makeup for the green skin of Elphaba, which was then finessed digitally in post-production.

Special Effects Supervisor Paul Corbould built The Emerald Express, the personal motorized carriage designed for the Wizard of Oz.

Unreal Engine influenced the shot design. "Emerald City was the last set that was built and was behind schedule," states Cinematographer Alice Brooks. "We had this idea for when Elphaba and Glinda get off of the train, and we start to push down the stairs, and it all becomes this one long Steadicam shot that ends on a crane that lifts up. We had been working on this for months but couldn't get into the set to start rehearsals because all of the construction cranes and painters were in there. What we did do was take the game controller, go into Unreal Engine and start designing the shot. When walking the set in Unreal Engine, we realized that this big stepping-onto-crane move didn't show off the city in any spectacular way; that being low was the way you saw the city in this amazing way. Then, we threw out our amazing Steadicam idea, which our A camera operator was bummed out about, and we created something new in Unreal Engine that was perfect."

Glinda, portrayed by Ariana Grande, makes use of her bubble wand.
This aerial shot of Munchkinland showcases the nine million tulips that were planted.

Numerous production meetings were held to discuss how to deal with the green skin of the future Wicked Witch of the West, Elphaba, portrayed by Cynthia Erivo. "We wanted to have all of the options on the table, then work with Cynthia herself to know what she needed as an actor," Chu explains. "We did a lot of tests with a double to show Cynthia real makeup, semi-makeup where you only do the main areas, and completely non-green makeup, because we knew that makeup every day for that long of a shoot could be grueling and would also take away time from actually shooting. Cynthia was like, 'I need the makeup.' Of course, there is some cleanup that we needed to do because sometimes her hands were thinner on certain days than others." The green skin had to look believable and work in any lighting condition. "David Stoneman, who is a chemist who makes products for our industry, took my green design, which was from products called Creamy Air and Illustrator, and the discontinued product that I had found, and put three drops of yellow neon into the base," explains Hair Designer/Makeup Designer/Prosthetics Designer Frances Hannon. "It reflected off the dark skin tone and made it look like it was her skin, not like it was green painted on the surface, and more than that, it worked in every light."

A lens flare, rainbow and the Yellow Brick Road are incorporated into an establishing shot of the Emerald City.
The head of the Wizard of Oz was a massive animatronic puppet hung from the ceiling of the studio.

Prosthetic makeup was required to show the characters of Boq (Ethan Slater) and Fiyero (Jonathan Bailey) being transformed into Tin Man and Scarecrow. "One of my most important things was working with Mark Coulier [Prosthetic Makeup Designer] again," Hannon remarks. "For Tin Man, we wanted to achieve something sympathetic because it should have never happened to Boq. In our story, Elphaba's spell goes wrong in Nessarose's [Marissa Bode] office, and everything metal in that room attaches to Boq; his breastplate would be the tray on the table, and his hands become the thimbles, salt and peppers. Then, the visual effects took over because all the joints were blue. With Scarecrow, Jon and Mark particularly wanted to keep Jonathan Bailey's face shape. We also kept his nice teeth and natural eye color for Scarecrow. I used contact lenses on Jonathan for Fiyero, so we had a nice change there. Then, for his head element, I put masses of gold blonde through his look as Fiyero, which carried onto Scarecrow in a straw-colored wig; that kept Fiyero attractive because Elphaba and he fall in love."

Most of the 2,500 visual effects shots were divided between ILM and Framestore, with other contributors being OPSIS, Lola VFX, Outpost VFX and BOT VFX. "The CG creatures were difficult because they also talk, but they are mainly animals," Helman remarks. "They don't walk on two legs. If it's a goat that talks and is a teacher, it's basically a goat if you look at it; then he talks. It was a fine line stylizing the talking so that it doesn't feel like a completely stylized character, but also finding the expression, the eyebrows, eyes and mouth, the phonemes, and how articulate those creatures are. We had an animal unit of about 10 people or so that would play animals on set, and we would shoot a take or a few takes with them. We had a transformation scene where the monkey transforms and gets wings, so we had the whole animal unit performing and being directed by Jon. Sometimes, the second unit would stay there to shoot plates." Besides the music, dancers, choreography and huge sets, then there were the animals.

The mandate was to capture as much in-camera as possible, which gave Nathan Crowley the freedom to construct massive sets.

Magic was always treated in a grounded manner. "It's not a cutesy, glowing, sparkling thing," Helman notes. "There is nothing wrong with those kinds of things; it's just that this version of Oz is not magical. You have to remember, when you go back to the original story, the Wizard of Oz is not really a wizard." Creative solutions had to be applied to achieve the desired effect. How do you make a book glow without making it look completely fantastical and cartoony? Helman explains, "Maybe what you do is provide a language inside of the book with words that may become golden that weren't golden in the beginning. So, you see a transition between a word that is on a black ink parchment to something golden that produces a glow and is completely grounded." Broomsticks are a form of aerial transportation. "We worked with the stunt department to get the center of gravity correct and to be able to move the actors around. Cynthia Erivo wanted to do her own stunts, so she did. All of that wirework was closely planned. There are two things: There's the center of gravity and what the body is doing in the air, and the lighting. If we get those two things right, then we're fine," Helman says.

Water was seen as the key method of transportation to Shiz.
Elphaba (Cynthia Erivo) begins to master the art of flying a broom.
An establishing shot of Shiz University.
Elphaba (Cynthia Erivo) and Glinda (Ariana Grande) decide to take a more relaxed approach to flying by taking off in a hot air balloon.

A major accomplishment was the practical realization of the Emerald City Express, the personal train of the Wizard of Oz. "It was Nathan Crowley's vision," states Special Effects Supervisor Paul Corbould. "We built the running gear and the track and motorized it. It's hydraulically driven. Construction clad it with all of the fiberglass panels." The motion was repeatable. "The train could be programmed to drive to a particular spot, run down and stop at a position, and when told to start again, make its next move and run to the end of the track. You can take it back to the beginning and keep on doing that," remarks Special Effects Design Supervisor Jason Leinster. Equally impressive was the construction of the Wizard's head. "Jason reverse-engineered the scale model and changed the electric servos to hydraulic rams and a whole control system," Corbould explains. "It progressed from that." The head was suspended from the ceiling of the stage. "It was a civil engineering project to have something like that floating in the middle of space," Leinster notes. "It was 22 axes and puppeteered by one person. Most of it was done live and was often changing."

Anti-gravity architecture serves as the basis for Kiamo Ko, a castle located on the peak of Knobblehead Pike.

Other complicated rigs included the rotating library. "Because it was first up in the schedule, we built one wheel and ladder, and the dancers with Chris Scott [Choreographer] rehearsed on that one," Corbould states. "As we built another one, they were rehearsing on that with more dancers, and we built a third one. It took three months."
An amazing prop was the fountains. The petals opened up, and they wanted water to come out, Corbould remarks. Weve got these hydraulic motion bases, and in the middle is a slit ring that allows you to turn a rig round and round without winding the cable up. We had to take a slit ring off, which you normally run a hydraulic oil through, and put that on the fountain. It ruined it because we were running water through it; that was quite a challenge. A bricklaying machine gets pulled by a bison. There was no bison to pull it, so the machine was self-driven, Leinster reveals. You could sit back and steer it. We had a roadway of foam bricks rolled up inside, and as the machine drove forward, it unrolled the Yellow Brick Road. Eventually, it drove off into the sunset, being pulled by the bison. You probably wont realize that is an effect.Madame Morrible (Michelle Yeoh) has the ability to control the weather, so there is a cloud motif to her hairstyle.To convey the impression of a floating castle, the concept of anti-gravity architecture was developed. Kiamo Ko isnt just a castle, Crowley observes. Its a defiant emblem of a bygone era, a testament to the forgotten magic that once pulsed through Oz. Its architecture, though ancient, utilizes lost principles of levitation, defying gravity yet remaining grounded in a sense of order and purpose. The key to Kiamo Kos defiance lies not in defying gravity entirely but in manipulating it subtly. Imagine a series of inverted arches, their points reaching skyward. These arches wouldnt be perfect mirrors of one another; instead, they possess a slight asymmetry, a calculated tilt that interacts with the forgotten magic of the land, generating a gentle, constant lift. This subtle slant would also provide a visual cue, hinting at the castles orientation even from a distance. By incorporating these design elements, Kiamo Ko transcends the trope of a generic floating castle. 
"It becomes a character itself, a silent testament to a forgotten age and a beacon of hope for Elphaba and Fiyero's new beginning."

Skies played a major role in setting the proper tone for scenes. Throughout the whole movie, there is this idea that the sun is always rising for Glinda (Ariana Grande) and setting for Elphaba (Cynthia Erivo).
Jonathan Bailey plays the role of Fiyero, who goes on to be transformed into the iconic Scarecrow.

Lenses were developed specifically for the production (which evolved into the new series of Ultra Panatar II) and were paired with the ARRI ALEXA 65 cameras. "Jon told me that he wanted Wicked to be unlike anything anyone had ever seen before, and the photography needed to represent that," Brooks states. "I was on the movie so early I was able to design them with Dan Sasaki at Panavision in Woodland Hills. We called them the 'Unlimiteds,' after Elphaba singing 'Unlimited' in Wicked, because at the time they didn't have a name. Those lenses capture all of the pictures that Nathan, Jon and I put together for so many months, and they wrap the light beautifully on our actors. Usually, you're matching close-ups on the same lens, but on Elphaba, we shot her on a 65mm lens and Glinda on a 75mm lens, and we matched the size, but those two lenses did different things to their faces. Oz is a different place, and something is a little bit off everywhere. Our A and B 65mm lenses were not the same. It was a collage of lenses. Each one had such a different characteristic, and that made our movie feel different. Elphaba even has one line in the movie that goes, 'Some of us are just different.' That's what we want our Oz to be."

Musical numbers were as complicated to plan and execute as action sequences.
Various animals are part of the faculty at Shiz University, with Peter Dinklage doing facial capture and the voice of Dr. Dillamond.

Apple Vision Pro is an essential part of the editorial process. "I am overseeing the edit in the Vision Pro," Chu explains.
"Instead of being trapped in a monitor on a desk, which isn't the most creative, I can be like I am in the room with Myron Kerstein [Editor], where I'm walking around or sitting on the couch. We can do visual effects approvals there too. I can bring it on and draw with my finger where certain areas need to be improved or whatnot."

Hannon looks forward to seeing everything being brought together. "For me, it's seeing those finishing touches. The sets were 60 feet high, then we would have bluescreen. I do believe Paul Tazewell [Costume Designer] and myself, to the best of our abilities, gave Jon the spectacular, extraordinary and timeless look that he was after."

Wicked spans two movies, with the first one centered around the song "Defying Gravity" and the second around "For Good." "It's in two parts, but we shot the whole movie in one lifetime!" Helman laughs. "I look at every project as a traumatic project where you develop these scars and learn from those scars, but you wear them proudly."

Teamwork reigns supreme for Chu. "Each department can make everything, but the reality is that we need to work together to make the thing that none of us can make alone. I feel lucky to be working with a team at the highest level, with the bar at the highest place for us to cross. It has been an amazing journey."
  • WWW.POLYGON.COM
    All Pocket Promo A cards in Pokémon TCG Pocket
    Promo cards have a lot of variance in Pokémon TCG Pocket. Some can be easily purchased from the shop, while others can only be obtained from limited-time events. Right now, all promo cards are part of the Promo A series.

If you were a late adopter of Pokémon TCG Pocket, you may have missed out on some of the Promo cards from the early events and Premium Passes. Unfortunately, there has been no indication that any cards no longer available will return to the game, but for posterity's sake, here's every promo card in the Promo A series.

Here's a list of every Promo A card in Pokémon TCG Pocket and how to get them. Or, if you have all of these, see the full list of Mythical Island cards.

All Promo A cards list in Pokémon TCG Pocket

Here's every Promo A card listed in a handy table:

Card # | Card | Type | HP | Attack/Ability | Obtained from
1 / P-A | Potion | Item | — | Heal 20 damage from 1 of your Pokémon. | Shop
2 / P-A | X Speed | Item | — | During this turn, the retreat cost of your active Pokémon is 1 less. | Shop
3 / P-A | Hand Scope | Item | — | Your opponent reveals their hand. | Shop
4 / P-A | Pokédex | Item | — | Look at the top 3 cards of your deck. | Shop
5 / P-A | Poké Ball | Item | — | Put 1 random basic Pokémon from your deck into your hand. | Shop
6 / P-A | Red Card | Item | — | Your opponent shuffles their hand into their deck and draws 3 cards. | Shop
7 / P-A | Professor's Research | Supporter | — | Draw 2 cards. | Shop
8 / P-A | Pokédex | Item | — | Look at the top 3 cards of your deck. | Not available
9 / P-A | Pikachu | Lightning | 60 | Gnaw (20) | Premium shop
10 / P-A | Mewtwo | Psychic | 120 | Power Blast (120) — Discard 2 psychic energy from this Pokémon. | Premium mission reward
11 / P-A | Chansey | Colorless | 120 | Gentle Slap (60) | Meowth and Chansey Wonder Pick Event
12 / P-A | Meowth | Colorless | 60 | Pay Day (10) — Draw a card. | Meowth and Chansey Wonder Pick Event
13 / P-A | Butterfree | Grass | 120 | Gust (60); Powder Heal (Ability) — Once during your turn, you may heal 20 damage from each of your Pokémon. | Lapras ex Drop Event
14 / P-A | Lapras ex | Water | 140 | Bubble Drain (80) — Heal 20 damage from this Pokémon. | Lapras ex Drop Event
15 / P-A | Pikachu | Lightning | 60 | Gnaw (20) | Lapras ex Drop Event
16 / P-A | Clefairy | Psychic | 60 | Slap (20) | Lapras ex Drop Event
17 / P-A | Mankey | Fighting | 50 | Reckless Charge (30) — This Pokémon also does 10 damage to itself. | Lapras ex Drop Event
18 / P-A | Venusaur | Grass | 160 | Mega Drain (80) — Heal 30 damage from this Pokémon. | Venusaur Drop Event
19 / P-A | Greninja | Water | 120 | Mist Slash (60); Water Shuriken (Ability) — Once during your turn, you may do 20 damage to 1 of your opponent's Pokémon. | Venusaur Drop Event
20 / P-A | Haunter | Psychic | 70 | Surprise Attack (50) — Flip a coin. If tails, this attack does nothing. | Venusaur Drop Event
21 / P-A | Onix | Fighting | 110 | Land Crush (70) | Venusaur Drop Event
22 / P-A | Jigglypuff | Colorless | 50 | Sing — Your opponent's active Pokémon is now Asleep. | Venusaur Drop Event
23 / P-A | Bulbasaur | Grass | 70 | Vine Whip (40) | Bulbasaur and Magnemite Wonder Pick Event
24 / P-A | Magnemite | Lightning | 60 | Lightning Ball (20) | Bulbasaur and Magnemite Wonder Pick Event
25 / P-A | Moltres ex | Fire | 140 | Inferno Dance — Flip 3 coins. Take an amount of fire energy from your energy zone equal to the number of heads and attach it to your benched fire Pokémon in any way you like. Heat Blast (70) | Premium mission reward
26 / P-A | Pikachu | Lightning | 60 | Gnaw (20) | New Year 2025 Event mission
27 / P-A | Snivy | Grass | 60 | Tackle (20) | Blastoise Drop Event
28 / P-A | Volcarona | Fire | 120 | Volcanic Ash — Discard 2 fire energy from this Pokémon. This attack does 80 damage to 1 of your opponent's Pokémon. | Blastoise Drop Event
29 / P-A | Blastoise | Water | 150 | Hydro Pump (80+) — If this Pokémon has at least 2 extra water energy attached, this attack does 60 more damage. | Blastoise Drop Event
30 / P-A | Eevee | Colorless | 60 | Growl — During your opponent's next turn, attacks used by the defending Pokémon do –20 damage. | Blastoise Drop Event
31 / P-A | Cinccino | Colorless | 90 | Do the Wave (30x) — This attack does 30 damage for each of your benched Pokémon. | Blastoise Drop Event
32 / P-A | Charmander | Fire | 60 | Ember (30) — Discard a fire energy from this Pokémon. | Charmander and Squirtle Wonder Pick Event
33 / P-A | Squirtle | Water | 60 | Water Gun (20) | Charmander and Squirtle Wonder Pick Event
  • WWW.POLYGON.COM
    League of Legends' new cinematic is a sequel to Arcane's wild ending
    Riot has released a new teaser for the upcoming League of Legends competitive season, and the developer is once again collaborating with Fortiche. The new trailer, "Welcome to Noxus," shows an assortment of Noxian champions scheming and battling, including Mel, the council member from Piltover, who is travelling to the Immortal Bastion in search of answers.

"Welcome to Noxus" is, in some ways, exactly what you might expect from a pre-season cinematic. There are some cool battles between champions, with Katarina and Elise duking it out at a fancy ball in Noxus while Trundle and Darius engage in a bare-knuckle brawl up in the icy region of the Freljord. But there's a startling amount of story development here, and the cinematic ends with the leaders of the Black Rose scheming about their next steps.

The next season of League of Legends, which begins on Jan. 9, will bring a Noxus-themed season to the map, changing the terrain and turrets to fit the militant nation's themes. There's also a new skin line, the Masque of the Black Rose, which has a handful of champions attending a fancy dance ball hosted by the Black Rose. "Welcome to Noxus" shows us bits and pieces of this party, which is rudely crashed by Katarina. Katarina is an assassin loyal to Swain, the current leader of Noxus, so it seems there's still plenty of in-fighting amongst the Noxian power players.

Intriguingly, we get to see LeBlanc and Vladimir appear at the end of the cinematic. They don't match their in-game appearances, unfortunately, but they're still recognizable as characters. This may be a teaser for a future visual update for these champions. It also seems to confirm LeBlanc as the mysterious mage who confronted Mel in season 2 of Arcane. In fact, this new season, the Masque of the Black Rose, and the conversation between LeBlanc and Vladimir seem to be a direct continuation of the end of Arcane season 2.
This suggests that Arcane is definitely the mainline canon, which is an interesting revelation.

If you're interested in learning more about the Black Rose, League of Legends lore, and how Arcane ties into it all, the above video by Necrit is a solid deep dive into the topic from a long-time curator of the game's complex canon.

We'll have to see more of Noxus and how this cinematic struggle plays out throughout the first season. There's plenty of narrative potential around Noxus: the ticking time bomb of Mordekaiser hidden within the Immortal Bastion, Swain's bargain with a devious demon named Raum, Thresh's pilfering of souls from unlucky Noxian bar staff, and the ongoing conflict led by Darius into the Freljord.

The new season of League of Legends releases on Jan. 9 at 12 p.m. PST/3 p.m. EST.
  • WWW.POLYGON.COM
    Pokémon Go Color Cup: Great League Edition best team recommendations
    The Color Cup: Great League Edition is a limited-time cup in Pokémon Go's Go Battle League that only allows four types of Pokémon to compete. These types are inspired by the colors red, blue, green, and yellow, which just so happen to be the names of the original Pokémon games.

As with most themed cups, putting together the best Color Cup team possible is going to take quite a bit of consideration. Many of your go-to favorites won't be eligible here.

If you're looking for some inspiration, we've rounded up the top Pokémon for the Color Cup in Pokémon Go and their optimal movesets.

Color Cup: Great League Edition restrictions

Only four types of Pokémon are allowed to enter the Color Cup: fire-, water-, grass-, and electric-type Pokémon. There's also a 1,500 CP limit for each member of your team.

This is one of the more restrictive cups in the Go Battle League, although all four eligible types of Pokémon are fairly common, so chances are you'll already have a decent collection to choose from. The ability to use dual types also increases the pool of eligible Pokémon.

Because the Color Cup has the same CP limit as the Great League, you'll find that there's some crossover between the two when it comes to the top-ranking Pokémon. If you want to compete in the Color Cup on a budget, try using Pokémon you've already optimized for the Great League rather than powering up new ones specifically for this cup.

Color Cup best team

Here's one of the best teams you can use for the Color Cup: Great League Edition in Pokémon Go:

Alolan Marowak
Morpeko
Qwilfish

All three of these Pokémon have dual typings, which gives them unique resistances and access to some excellent moves. Between them, you'll have access to fire-, ghost-, ground-, electric-, dark-, psychic-, poison-, ice-, and water-type attacks.
There aren't many Pokémon you won't have an answer for there!

Alolan Marowak is a Great League favorite that looks like it will perform even better in the Color Cup, especially as many of the Pokémon that could exploit its weaknesses aren't eligible to compete here. It's got great stats and an excellent, varied moveset that can target plenty of fierce opponents.

Morpeko's not-so-secret weapon is its signature charged move, Aura Wheel. It's relatively cheap to use, alternates between electric- and dark-type damage for extra coverage, and increases the Pokémon's Attack by one stage every time it's used. Morpeko might be fragile on the battlefield, but it can dish out some serious damage while it lasts.

Finally, we have Qwilfish. With a total of eight resistances (which include fire and water), it's got an advantage against many opponents in the Color Cup. Its poison sub-typing also means Qwilfish doesn't have a weakness to grass-type moves, which most water-type Pokémon do.

If you don't have those Pokémon in your collection (or you don't have enough resources to optimize them), there are plenty more great Pokémon to try on your team. These include Salazzle, Hisuian Electrode, Jumpluff, Gastrodon, Toxapex, Emolga, and Magmar.
You can find more details about these Pokémon in the next section.

Color Cup best Pokémon moves and IVs

Here are some of the top Pokémon to use in the Color Cup, as well as their optimal movesets and IVs:

Pokémon | Type | Best Fast Move | Best Charged Moves | Perfect IVs
Alolan Marowak | Fire/Ghost | Fire Spin | Bone Club and Shadow Bone | 0 / 14 / 14
Morpeko | Electric/Dark | Thunder Shock | Aura Wheel and Psychic Fangs | 1 / 15 / 15
Qwilfish | Water/Poison | Poison Sting | Aqua Tail and Ice Beam | 0 / 12 / 14
Salazzle | Poison/Fire | Incinerate | Poison Fang and Dragon Pulse | 2 / 15 / 13
Hisuian Electrode | Electric/Grass | Thunder Shock | Wild Charge and Swift | 1 / 14 / 14
Jumpluff | Grass/Flying | Fairy Wind | Aerial Ace and Energy Ball | 0 / 14 / 14
Gastrodon | Water/Ground | Mud Slap | Body Slam and Earth Power | 1 / 15 / 14
Toxapex | Poison/Water | Poison Jab | Brine and Sludge Wave | 0 / 15 / 15
Emolga | Electric/Flying | Thunder Shock | Acrobatics and Discharge | 0 / 13 / 15
Magmar | Fire | Karate Chop | Fire Punch and Scorching Sands | 0 / 15 / 15

We've not included Shadow Pokémon in our recommendations, as they can be quite rare and not every trainer will have caught them during their limited appearances. If you do have a Shadow Magmar or a Shadow Jumpluff, though, it's worth trying them out with the same movesets listed above.

When running the Color Cup format through PvPoke's simulator, Toxapex comes out on top of the rankings. Things don't always work out the same in reality as they do on paper, of course, but it would be wise to have at least one strong Toxapex counter to hand. Our team recommendation has two: Morpeko and Alolan Marowak.

The best thing about the Color Cup is that there are loads of promising Pokémon to try out, so hopefully, we won't be in for a run that's dominated by just one or two Pokémon. Whatever team you end up using, power them up, make sure you've got plenty of coverage options, and give it your best shot!
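The cup's two eligibility rules (at least one of the four allowed types, and a CP of 1,500 or less) can be sketched as a simple filter over your collection. This is a minimal illustration only: the roster entries, CP values, and function names below are made-up examples for checking the rules, not data exported from the game.

```python
# Color Cup eligibility sketch: a Pokémon may enter if its CP is at most
# 1,500 and at least one of its types is fire, water, grass, or electric.
# Dual-typed Pokémon qualify through either of their types.

ALLOWED_TYPES = {"fire", "water", "grass", "electric"}
CP_LIMIT = 1500

def is_eligible(types, cp):
    """Return True if a Pokémon with these types and this CP may enter."""
    return cp <= CP_LIMIT and any(t in ALLOWED_TYPES for t in types)

# Illustrative roster: (name, types, current CP) — CP values are examples.
roster = [
    ("Alolan Marowak", ("fire", "ghost"), 1480),      # eligible via fire
    ("Morpeko", ("electric", "dark"), 1350),          # eligible via electric
    ("Medicham", ("fighting", "psychic"), 1500),      # no allowed type
    ("Toxapex", ("poison", "water"), 1600),           # over the CP limit
]

eligible = [name for name, types, cp in roster if is_eligible(types, cp)]
print(eligible)  # ['Alolan Marowak', 'Morpeko']
```

Note how Alolan Marowak passes despite its ghost sub-typing: one qualifying type is enough, which is exactly why dual types widen the eligible pool so much.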