• How To Find And Use Minecraft Slimeballs, Defeat Slimes, And Farm Slime Blocks

    The Slime is one of the Minecraft mobs that initially appears hostile, but upon killing it you will find useful items for many sought-after crafting recipes in the survival game. We've got all you need to know on how to find and kill Slimes in Minecraft, as well as the items they drop, Slimeball crafting recipes, and more.

    How to find Slimes in Minecraft

    Slimes spawn in the Overworld only, in specific slime chunks. These are all below layer 40, and you can show your Minecraft coordinates to see how close you are. Unlike most mobs, it doesn't matter what light level the environment is at for them to spawn. They can also spawn in swamp biomes between layers 51 and 69 if the light level is seven or less. Slimes spawn regardless of weather conditions. In swamps and mangrove swamps, slimes spawn most often on a full moon, and never on a new moon. Slimes will never spawn in mushroom fields or deep dark biomes.

    The Slime is a green cube in Minecraft, and is a hostile mob.

    Slimes do not spawn within 24 blocks of any player, and they despawn over time if no player is within 32 blocks. They despawn instantly if no player is within 128 blocks in Java Edition, or 44 to 128 blocks in Bedrock, depending on the simulation distance setting.

    Continue Reading at GameSpot
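The spawn conditions described above can be collected into a small predicate. This is a minimal sketch of the article's rules, not Mojang's actual spawning code; the seed-dependent slime-chunk test is taken as a plain boolean input, and all names here are hypothetical.

```python
def can_slime_spawn(y, in_slime_chunk, biome, light_level, is_new_moon=False):
    """Check the slime spawn rules described above (hypothetical helper).

    y: height layer of the candidate spawn block
    in_slime_chunk: whether the chunk passes the seed-based slime-chunk test
    biome / light_level / is_new_moon: as described in the article
    """
    # Slimes never spawn in mushroom fields or deep dark biomes.
    if biome in ("mushroom_fields", "deep_dark"):
        return False
    # Rule 1: slime chunks, below layer 40, at any light level.
    if in_slime_chunk and y < 40:
        return True
    # Rule 2: swamps and mangrove swamps between layers 51 and 69,
    # light level seven or less, and never on a new moon.
    if biome in ("swamp", "mangrove_swamp"):
        return 51 <= y <= 69 and light_level <= 7 and not is_new_moon
    return False
```

So a dark cave at layer 30 inside a slime chunk qualifies, while the same cave in a neighboring non-slime chunk does not.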
  • NVIDIA and Partners Highlight Next-Generation Robotics, Automation and AI Technologies at Automatica

    From the heart of Germany’s automotive sector to manufacturing hubs across France and Italy, Europe is embracing industrial AI and advanced AI-powered robotics to address labor shortages, boost productivity and fuel sustainable economic growth.
    Robotics companies are developing humanoid robots and collaborative systems that integrate AI into real-world manufacturing applications. Supported by a $200 billion investment initiative and coordinated efforts from the European Commission, Europe is positioning itself at the forefront of the next wave of industrial automation, powered by AI.
    This momentum is on full display at Automatica — Europe’s premier conference on advancements in robotics, machine vision and intelligent manufacturing — taking place this week in Munich, Germany.
    NVIDIA and its ecosystem of partners and customers are showcasing next-generation robots, automation and AI technologies designed to accelerate the continent’s leadership in smart manufacturing and logistics.
    NVIDIA Technologies Boost Robotics Development 
    Central to advancing robotics development is Europe’s first industrial AI cloud, announced at NVIDIA GTC Paris at VivaTech earlier this month. The Germany-based AI factory, featuring 10,000 NVIDIA GPUs, provides European manufacturers with secure, sovereign and centralized AI infrastructure for industrial workloads. It will support applications ranging from design and engineering to factory digital twins and robotics.
    To help accelerate humanoid development, NVIDIA released NVIDIA Isaac GR00T N1.5 — an open foundation model for humanoid robot reasoning and skills. This update enhances the model’s adaptability and ability to follow instructions, significantly improving its performance in material handling and manufacturing tasks.
    To help post-train GR00T N1.5, NVIDIA has also released the Isaac GR00T-Dreams blueprint — a reference workflow for generating vast amounts of synthetic trajectory data from a small number of human demonstrations — enabling robots to generalize across behaviors and adapt to new environments with minimal human demonstration data.
    In addition, early developer previews of NVIDIA Isaac Sim 5.0 and Isaac Lab 2.2 — open-source robot simulation and learning frameworks optimized for NVIDIA RTX PRO 6000 workstations — are now available on GitHub.
    Image courtesy of Wandelbots.
    Robotics Leaders Tap NVIDIA Simulation Technology to Develop and Deploy Humanoids and More 
    Robotics developers and solutions providers across the globe are integrating NVIDIA’s three computers to train, simulate and deploy robots.
    NEURA Robotics, a German robotics company and pioneer for cognitive robots, unveiled the third generation of its humanoid, 4NE1, designed to assist humans in domestic and professional environments through advanced cognitive capabilities and humanlike interaction. 4NE1 is powered by GR00T N1 and was trained in Isaac Sim and Isaac Lab before real-world deployment.
    NEURA Robotics is also presenting Neuraverse, a digital twin and interconnected ecosystem for robot training, skills and applications, fully compatible with NVIDIA Omniverse technologies.
    Delta Electronics, a global leader in power management and smart green solutions, is debuting two next-generation collaborative robots: D-Bot Mar and D-Bot 2 in 1 — both trained using Omniverse and Isaac Sim technologies and libraries. These cobots are engineered to transform intralogistics and optimize production flows.
    Wandelbots, the creator of the Wandelbots NOVA software platform for industrial robotics, is partnering with SoftServe, a global IT consulting and digital services provider, to scale simulation-first automation using NVIDIA Isaac Sim, enabling virtual validation and real-world deployment with maximum impact.
    Cyngn, a pioneer in autonomous mobile robotics, is integrating its DriveMod technology into Isaac Sim to enable large-scale, high-fidelity virtual testing of advanced autonomous operations. Purpose-built for industrial applications, DriveMod is already deployed on vehicles such as the Motrec MT-160 Tugger and BYD Forklift, delivering sophisticated automation to material handling operations.
    Doosan Robotics, a company specializing in AI robotic solutions, will showcase its “sim-to-real” solution, using NVIDIA Isaac Sim and cuRobo, demonstrating how to seamlessly transfer tasks from simulation to real robots across a wide range of applications — from manufacturing to service industries.
    Franka Robotics has integrated Isaac GR00T N1.5 into a dual-arm Franka Research 3 (FR3) robot for robotic control. The integration of GR00T N1.5 allows the system to interpret visual input, understand task context and autonomously perform complex manipulation — without the need for task-specific programming or hardcoded logic.
    Image courtesy of Franka Robotics.
    Hexagon, the global leader in measurement technologies, launched its new humanoid, dubbed AEON. With its unique locomotion system and multimodal sensor fusion, and powered by NVIDIA’s three-computer solution, AEON is engineered to perform a wide range of industrial applications, from manipulation and asset inspection to reality capture and operator support.
    Intrinsic, a software and AI robotics company, is integrating Intrinsic Flowstate with Omniverse and OpenUSD for advanced visualization and digital twins that can be used in many industrial use cases. The company is also using NVIDIA foundation models to enhance robot capabilities like grasp planning through AI and simulation technologies.
    SCHUNK, a global leader in gripping systems and automation technology, is showcasing its innovative grasping kit powered by the NVIDIA Jetson AGX Orin module. The kit intelligently detects objects and calculates optimal grasping points. Schunk is also demonstrating seamless simulation-to-reality transfer using IGS Virtuous software — built on Omniverse technologies — to control a real robot through simulation in a pick-and-place scenario.
    Universal Robots is showcasing UR15, its fastest cobot yet. Powered by the UR AI Accelerator — developed with NVIDIA and running on Jetson AGX Orin using CUDA-accelerated Isaac libraries — UR15 helps set a new standard for industrial automation.

    Vention, a full-stack software and hardware automation company, launched its Machine Motion AI, built on CUDA-accelerated Isaac libraries and powered by Jetson. Vention is also expanding its lineup of robotic offerings by adding the FR3 robot from Franka Robotics to its ecosystem, enhancing its solutions for academic and research applications.
    Image courtesy of Vention.
    Learn more about the latest robotics advancements by joining NVIDIA at Automatica, running through Friday, June 27. 
    BLOGS.NVIDIA.COM
  • BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4

    By TREVOR HOGG
    Images courtesy of Prime Video.

    For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know if we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”

    When Splinter splits in two, the cloning effect was inspired by cellular mitosis.

    “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”
    —Stephan Fleet, VFX Supervisor

    A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. “We have John Griffith, who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” Founding Director of Federal Bureau of Superhuman Affairs, Victoria Neuman, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”

    Multiple plates were shot to enable Simon Pegg to phase through the actor lying in a hospital bed.

    Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, ‘I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie-talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’ And I get exploded with blood. I wanted to see what it was like, and it’s intense.”

    “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.”
    —Stephan Fleet, VFX Supervisor

    The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be, so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a human is that you tend to want to give it human gestures and eyebrows. Erik Kripke said, ‘No. We have to find things that an octopus could do that conveys the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.”

    A building is replaced by a massive crowd attending a rally being held by Homelander.

    In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that. Obviously, you don’t want to put any performer in any situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter-totter. The Deep was in one shot and Sister Sage reacted in the other shot. To marry the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool. Then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.”

    In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside of the water tank that she could wrap a tentacle around.

    “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”
    —Stephan Fleet, VFX Supervisor

    Sheep and chickens embark on a violent rampage courtesy of Compound V, with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,” Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in a green suit gently tossing a chicken. We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it in this narrow corridor of fencing. When they run, I always equated it to the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”

    The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes.

    Once injected with Compound V, Hugh Campbell Sr. (Simon Pegg) develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing high-frequency vibration on the X axis loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals. We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.”

    Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination.

    Homelander (Anthony Starr) breaks a mirror, which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Anthony Starr moving his head doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Anthony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clear mirror and gave Anthony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.”

    “For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”
    —Stephan Fleet, VFX Supervisor

    Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution.

    A different spin on the bloodbath occurs during a fight when a drugged Frenchie (Tomer Capone) hallucinates as Kimiko Miyashiro (Karen Fukuhara) goes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.”

    Ambrosius, voiced by Tilda Swinton, gets a lot more screentime in Season 4.

    The moment when Splinter (Rob Benedict) splits in two was achieved heavily in CG. “Erik threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions. One clone comes out to the right and the other pulls backwards.” What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker. “It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them interweaving each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic it to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and be him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.”
    BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4
    By TREVOR HOGG. Images courtesy of Prime Video.

    For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory, where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know if we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”

    When Splinter (Rob Benedict) splits in two, the cloning effect was inspired by cellular mitosis.

    A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. “We have John Griffith [Previs Director], who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.”

    Founding Director of the Federal Bureau of Superhuman Affairs, Victoria Neuman, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of the superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”

    Multiple plates were shot to enable Simon Pegg to phase through the actor lying in a hospital bed.

    Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, ‘I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic, waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie-talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’ And I get exploded with blood. I wanted to see what it was like, and it’s intense.”

    “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.”
    —Stephan Fleet, VFX Supervisor
    WWW.VFXVOICE.COM
    BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4
    By TREVOR HOGG Images courtesy of Prime Video. For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” When Splinter (Rob Benedict) splits in two, the cloning effect was inspired by cellular mitosis. “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” —Stephan Fleet, VFX Supervisor A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. 
“We have John Griffith [Previs Director], who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” Founding Director of Federal Bureau of Superhuman Affairs, Victoria Neuman, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.” Multiple plates were shot to enable Simon Pegg to phase through the actor laying in a hospital bed. Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, “I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’  And I get exploded with blood. 
I wanted to see what it was like, and it’s intense.” “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” —Stephan Fleet, VFX Supervisor The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be [from Season 3], so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a human [is] you tend to want to give it human gestures and eyebrows. Erik Kripke [Creator, Executive Producer, Showrunner, Director, Writer] said, ‘No. We have to find things that an octopus could do that conveys the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.” A building is replaced by a massive crowd attending a rally being held by Homelander. In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that. 
Obviously, you don’t want to put any performer in any situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter totter. The Deep was [acting in a certain way] in one shot and Sister Sage reacted in the other shot. To marry the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool. Then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.” In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside of the water tank that she could wrap a tentacle around. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.” —Stephan Fleet, VFX Supervisor Sheep and chickens embark on a violent rampage courtesy of Compound V with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,’ Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in green suit gently tossing a chicken. 
We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it is in this narrow corridor of fencing. When they run, I always equated it as the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.” The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes. Once injected with Compound V, Hugh Campbell Sr. (Simon Pegg) develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing high-frequency vibration on the X axis loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals. 
We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall, or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.”

Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination.

Homelander (Anthony Starr) breaks a mirror, which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Anthony Starr moving his head doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Anthony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clean mirror and gave Anthony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.”

“For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.” —Stephan Fleet, VFX Supervisor

Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution.

A different spin on the bloodbath occurs during a fight when a drugged Frenchie (Tomer Capone) hallucinates as Kimiko Miyashiro (Karen Fukuhara) goes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.”

Ambrosius, voiced by Tilda Swinton, gets a lot more screentime in Season 4.

The moment when Splinter (Rob Benedict) splits in two was achieved heavily in CG. “Erik threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions. One clone comes out to the right and the other pulls backwards.”

What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker (Valorie Curry). “It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them weaving between each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic it to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and be him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.”
  • Ah, UGREEN: marking 13 years since its founding, it seems we need a big celebration! Are we talking about accessories that have turned into artifacts in the world of technology? Our review of UGREEN's latest accessories reminds us that "the best" is just a term we apply to the things we use so we don't feel guilty when we buy more.

    A special salute to those cables that look like they will live forever, while we live in a world that changes every second! If you are looking for something that makes you feel ahead of your time while racing to keep up with the latest tech trends, it's simple:
    ARABHARDWARE.NET
    Marking 13 years since its founding: our review of UGREEN's latest accessories
    The post "Marking 13 years since its founding: our review of UGREEN's latest accessories" appeared first on Arab Hardware.
  • The world's most recognizable desktop wallpaper has undergone a dramatic transformation, and let me tell you, it’s not the vibrant paradise we all remember. Gone are the days of lush green hills and a sky so blue it could make your eyes water. Now, it looks more like a sad attempt at a watercolor painting left out in the rain.

    Who knew nostalgia could be so... dull? It’s almost as if Mother Nature took a permanent vacation and left a mediocre intern in charge of color correction. So, if you’re still clinging to that pixelated dream, it might be time to update your wallpaper and face the reality that sometimes, even iconic images fade into the background.

    #DesktopWallpaper #Nostalgia #PixelatedDreams
  • Exciting news on the horizon! Scientists have developed a groundbreaking 3D material that can capture CO₂, inspired by the incredible cyanobacteria that have thrived on our planet for billions of years! This innovative solution reminds us that nature often holds the keys to our technological challenges. Let's embrace this symbiotic relationship and work towards a cleaner, greener future together!

    Every step we take towards sustainability is a step towards a brighter tomorrow! Let's be the change we wish to see!

    #Sustainability #Innovation #GreenTechnology #Cyanobacteria #FutureIsBright
    A new 3D material that captures CO₂ has been developed
    On countless occasions, the answers to technological challenges have been found not in a laboratory but in nature. Cyanobacteria, microorganisms that have inhabited the Earth for billions of years, could be the…
  • In the shadows of creation, I find myself yearning for connection, yet feeling so incredibly alone. The anticipation of "King of Meat" is overshadowed by the weight of silence, as I await the promises made by Mike Green and Jonny Hopper of Glowmade. Their words echo in my mind, yet the distance feels insurmountable. The excitement of the Summer Game Fest now feels like a distant mirage, leaving me feeling hollow. I wonder if others feel this ache of hope intertwined with despair, as we all navigate the dark corners of our passions.

    #KingOfMeat #Glowmade #GamingCommunity #Loneliness #Heartache
    WWW.ACTUGAMING.NET
    King of Meat: our interview with Mike Green and Jonny Hopper of the studio Glowmade
    ActuGaming.net King of Meat: our interview with Mike Green and Jonny Hopper of the studio Glowmade After a promising preview of King of Meat at the Summer Game Fest, we […] The article King of Meat: our interview with Mike Green and
  • Hello, wonderful people! Today, I want to take a moment to celebrate the incredible advancements happening in the world of 3D printing, especially highlighted at the recent Paris Air Show!

    What an exciting week it has been for the additive manufacturing industry! The #3DExpress has been buzzing with news, showcasing how innovation and creativity are taking flight together! The Paris Air Show is not just a platform for the latest planes; it’s a stage for groundbreaking technologies that promise to revolutionize our future!

    Imagine a world where designing and producing complex aircraft parts becomes not only efficient but also sustainable! The use of 3D printing is paving the way for a greener future, reducing waste and making manufacturing more accessible than ever before. The possibilities are endless, and it’s invigorating to witness how these technologies can transform entire industries! 💪🏽

    During the show, we saw some amazing demonstrations of 3D printed components that are not only lightweight but also incredibly strong. This is a game-changer for aerospace engineering! Every layer printed brings us closer to smarter, more efficient air travel, and who wouldn’t want to be part of that journey?

    Let’s not forget the talented minds behind these innovations! The engineers, designers, and creators are the true superheroes, pushing boundaries and inspiring the next generation to dream bigger! Their passion and dedication remind us that with hard work and determination, we can reach for the stars!

    If you’ve ever doubted the power of creativity and technology, let this be your reminder: the future is bright, and we have the tools to shape it! So, let’s stay curious, keep pushing forward, and embrace every opportunity that comes our way! Together, we can soar to new heights!

    Let’s keep the conversation going about how #3D printing and additive manufacturing can change our world. What are your thoughts on these incredible innovations? Share your ideas and let’s inspire each other!

    #3DPrinting #Innovation #ParisAirShow #AdditiveManufacturing #FutureOfFlight
    #3DExpress: Additive manufacturing at the Paris Air Show
    What has happened this week in the 3D printing industry? In today's 3DExpress we bring you a quick roundup of the most notable news of the past few days. First of all, the Paris Air Show is this…
  • In the silence of my room, I find myself staring at the empty corners where dreams once blossomed. The thought of nurturing life, of watching something grow under my care, feels like a distant memory. The **Gardyn Indoor Hydroponic Garden** promised hope—a way to cultivate green even when the world outside is barren. But here I am, clutching my heart, feeling the weight of disappointment.

    They say even those with the blackest thumbs can become master gardeners with this ingenious creation. Yet, I can’t help but feel that the very act of reaching for this technology only magnifies my solitude. Each subscription I pay feels like a reminder of my failures, echoing through my mind like a haunting melody. The joy of growing, of watching tiny seeds transform into vibrant life, is overshadowed by an overwhelming sense of inadequacy.

    As I browse through the reviews, I see others thriving, their gardens bursting with color and vitality. It’s a sharp contrast to my own barren reality. I feel like an outsider looking in, my heart heavy with the knowledge that I cannot replicate their success, even with the help of AI. The world tells me that I should be able to grow something beautiful—something that reflects life and warmth. Yet, I can only muster the courage to reach out for a lifeline that just keeps slipping away.

    In moments of quiet despair, I question my worth. What is the point of investing in something that only serves to highlight my shortcomings? The **better growing through AI** feels like a cruel joke. It’s as if the universe is reminding me that no amount of technology can bridge the chasm of my isolation. I yearn for the simple joy of nurturing life, yet here I stand, a weary soul wrapped in the chains of disappointment.

    Every time I see the bright greens and vibrant reds of thriving plants online, it cuts deeper. I wonder if I will ever know that feeling, or if I will remain alone in this garden of shadows. The promise of a flourishing indoor garden now feels like a mirage, a fleeting glimpse of what could have been if only I were capable of growing beyond my sorrow.

    Perhaps it’s not just about gardening; perhaps it’s about connection—seeking companionship in a world that often feels cold. I long for someone who understands the weight of this solitude, who knows the struggle of wanting to cultivate something beautiful but feeling lost in the process. With every passing day, I realize that the seeds I wish to plant go beyond soil and water; they are a testament to my desire for companionship, for growth, for life.

    And so, I sit here, clutching my dreams tightly, hoping that someday I will learn to grow not just plants, but the courage to embrace the beauty around me despite the shadows that linger.

    #Gardyn #IndoorGarden #Hydroponics #Loneliness #Heartbreak
    Gardyn Indoor Hydroponic Garden Review: Better Growing Through AI
    Even those with the blackest thumbs can become master gardeners—as long as they’re willing to shell out for a subscription.
  • Ah, California! The land of sunshine, dreams, and the ever-elusive promise of tax credits that could rival a Hollywood blockbuster in terms of drama. Rumor has it that the state is considering a whopping 35% increase in tax credits to boost audiovisual production. Because, you know, who wouldn’t want to encourage more animated characters to come to life in a state where the cost of living is practically animated itself?

    Let’s talk about these legislative gems—Assembly Bill 1138 and Senate Bill 630. Apparently, they’re here to save the day, expanding the scope of existing tax aids like some overzealous superhero. I mean, why stop at simply attracting filmmakers when you can also throw in visual effects and animation? It’s like giving a kid a whole candy store instead of a single lollipop. Who can say no to that?

    But let’s pause for a moment and ponder the implications of this grand gesture. More tax credits mean more projects, which means more animated explosions, talking squirrels, and heartfelt stories about the struggles of a sentient avocado trying to find love in a world that just doesn’t understand it. Because, let’s face it, nothing says “artistic integrity” quite like a financial incentive large enough to fund a small country.

    And what do we have to thank for this potential windfall? Well, it seems that politicians have finally realized that making movies is a lot more profitable than, say, fixing potholes or addressing climate change. Who knew? Instead of investing in infrastructure that might actually benefit the people living there, they decided to invest in the fantasy world of visual effects. Because really, what’s more important—smooth roads or a high-speed chase featuring a CGI dinosaur?

    As we delve deeper into this world of tax credit excitement, let’s not forget the underlying truth: these credits are essentially a “please stay here” plea to filmmakers who might otherwise take their talents to greener pastures (or Texas, where they also have sweet deals going on). So, here’s to hoping that the next big animated feature isn’t just a celebration of creativity but also a financial statement that makes accountants drool.

    So get ready, folks! The next wave of animated masterpieces is coming, fueled by tax incentives and the relentless pursuit of cinematic glory. Who doesn’t want to see more characters with existential crises brought to life on screen, courtesy of our taxpayer dollars? Bravo, California! You’ve truly outdone yourself. Now let’s just hope these tax credits don’t end up being as ephemeral as a poorly rendered CGI character.

    #CaliforniaTaxCredits #Animation #VFX #Hollywood #TaxIncentives
    35% tax credits soon in California? Expected impact on animation and VFX
    California could increase its tax credits to encourage audiovisual production. A change that would also have an impact on visual effects and animation. Two legislative bills (Assembly Bill 1138 & Senate Bill