• The future is here! Did you ever think you would get to enjoy a 3D display without glasses? Well, welcome to the world of "magic" with the Looking Glass 3D monitor. Because, of course, who needs glasses when you can enjoy graphics that feel so real they should come with an instruction manual so you don't confuse them with reality?

    Now we can view our memes in 3D, because that is exactly what we needed in our lives. Is technology advancing, or are we simply being prepared for a world where reality will no longer be enough?

    #TecnologíaFuturista
    #3DSinGafas
    #RealidadAument
  • In a world where we’re all desperately trying to make our digital creations look as lifelike as a potato, we now have the privilege of diving headfirst into the revolutionary topic of "Separate shaders in AI 3D generated models." Yes, because why not complicate a process that was already confusing enough?

    Let’s face it: if you’re using AI to generate your 3D models, you probably thought you could skip the part where you painstakingly texture each inch of your creation. But alas! Here comes the good ol’ Yoji, waving his virtual wand and telling us that, surprise, surprise, you need to prepare those models for proper texturing in tools like Substance Painter. Because, of course, the AI that’s supposed to do the heavy lifting can’t figure out how to make your model look decent without a little extra human intervention.
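    The preparation step being poked at here boils down to something concrete: an AI-generated mesh usually arrives with all of its faces lumped into one or two material slots, while texturing tools like Substance Painter want each logical part in its own material. As a minimal, tool-agnostic sketch of that grouping (the mesh representation and `material_index` field are hypothetical stand-ins, not any particular exporter's format):

```python
from collections import defaultdict

# Hypothetical minimal mesh: each face carries a material slot index,
# the way AI exporters often lump everything into one or two slots.
faces = [
    {"verts": (0, 1, 2), "material_index": 0},
    {"verts": (2, 3, 0), "material_index": 0},
    {"verts": (4, 5, 6), "material_index": 1},
]

def split_by_material(faces):
    """Group faces by material slot so each group can become its own
    mesh / texture set before export to a texturing tool."""
    groups = defaultdict(list)
    for face in faces:
        groups[face["material_index"]].append(face)
    return dict(groups)

groups = split_by_material(faces)
print({idx: len(fs) for idx, fs in groups.items()})  # → {0: 2, 1: 1}
```

    In a real pipeline you would run the equivalent step inside your DCC tool (Blender's edit-mode Separate "By Material" does this) before exporting each group for texturing.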

    But don’t worry! Yoji has got your back with his meticulous “how-to” on separating shaders. Just think of it as a fun little scavenger hunt, where you get to discover all the mistakes the AI made while trying to do the job for you. Who knew that a model could look so… special? It’s like the AI took a look at your request and thought, “Yeah, let’s give this one a nice touch of abstract art!” Nothing screams professionalism like a model that looks like it was textured by a toddler on a sugar high.

    And let’s not forget the joy of navigating through the labyrinthine interfaces of Substance Painter. Ah, yes! The thrill of clicking through endless menus, desperately searching for that elusive shader that will somehow make your model look less like a lumpy marshmallow and more like a refined piece of art. It’s a bit like being in a relationship, really. You start with high hopes and a glossy exterior, only to end up questioning all your life choices as you try to figure out how to make it work.

    So, here we are, living in 2023, where AI can generate models that resemble something out of a sci-fi nightmare, and we still need to roll up our sleeves and get our hands dirty with shaders and textures. Who knew that the future would come with so many manual adjustments? Isn’t technology just delightful?

    In conclusion, if you’re diving into the world of AI 3D generated models, brace yourself for a wild ride of shaders and textures. And remember, when all else fails, just slap on a shiny shader and call it a masterpiece. After all, art is subjective, right?

    #3DModels #AIGenerated #SubstancePainter #Shaders #DigitalArt
    Separate shaders in AI 3d generated models
    Yoji shows how to prepare generated models for proper texturing in tools like Substance Painter. Source
  • Air-Conditioning Can Help the Power Grid instead of Overloading It

    June 13, 2025 · 6 min read

    Air-Conditioning Can Surprisingly Help the Power Grid during Extreme Heat

    Switching on air-conditioning during extreme heat doesn't have to make us feel guilty – it can actually boost power grid reliability and help bring more renewable energy online.

    By Johanna Mathieu & The Conversation US

    The following essay is reprinted with permission from The Conversation, an online publication covering the latest research.

    As summer arrives, people are turning on air conditioners in most of the U.S. But if you're like me, you always feel a little guilty about that. Past generations managed without air conditioning – do I really need it? And how bad is it to use all this electricity for cooling in a warming world?

    If I leave my air conditioner off, I get too hot. But if everyone turns on their air conditioner at the same time, electricity demand spikes, which can force power grid operators to activate some of the most expensive, and dirtiest, power plants. Sometimes those spikes can ask too much of the grid and lead to brownouts or blackouts.

    Research I recently published with a team of scholars makes me feel a little better, though. We have found that it is possible to coordinate the operation of large numbers of home air-conditioning units, balancing supply and demand on the power grid – and without making people endure high temperatures inside their homes.

    Studies along these lines, using remote control of air conditioners to support the grid, have for many years explored theoretical possibilities like this. However, few approaches have been demonstrated in practice, and never for such a high-value application at this scale.
    The system we developed not only demonstrated the ability to balance the grid on timescales of seconds, but also proved it was possible to do so without affecting residents' comfort. The benefits include increasing the reliability of the power grid, which makes it easier for the grid to accept more renewable energy. Our goal is to turn air conditioners from a challenge for the power grid into an asset, supporting a shift away from fossil fuels toward cleaner energy.

    Adjustable equipment

    My research focuses on batteries, solar panels and electric equipment – such as electric vehicles, water heaters, air conditioners and heat pumps – that can adjust how much energy it consumes at different times.

    Originally, the U.S. electric grid was built to transport electricity from large power plants to customers' homes and businesses. Those power plants were large, centralized operations that burned coal or natural gas, or harvested energy from nuclear reactions. They were typically always available and could adjust how much power they generated in response to customer demand, so the grid would be balanced between power coming in from producers and power being used by consumers.

    But the grid has changed. There are more renewable energy sources, from which power isn't always available – like solar panels at night or wind turbines on calm days. And there are the devices and equipment I study. These newer options, called "distributed energy resources," generate or store energy near where consumers need it – or adjust how much energy they're using in real time.

    One aspect of the grid hasn't changed, though: there's not much storage built into the system. So every time you turn on a light, for a moment there's not enough electricity to supply everything that wants it right then: the grid needs a power producer to generate a little more power.
    And when you turn off a light, there's a little too much: a power producer needs to ramp down.

    The way power plants know what real-time power adjustments are needed is by closely monitoring the grid frequency. The goal is to provide electricity at a constant frequency – 60 hertz – at all times. If more power is needed than is being produced, the frequency drops, and a power plant boosts output. If too much power is being produced, the frequency rises, and a power plant slows production a little. These actions, a process called "frequency regulation," happen in a matter of seconds to keep the grid balanced. This output flexibility, primarily from power plants, is key to keeping the lights on for everyone.

    Finding new options

    I'm interested in how distributed energy resources can improve flexibility in the grid. They can release more energy, or consume less, in response to changing supply or demand, helping balance the grid and keep the frequency near 60 hertz.

    Some people fear that doing so might be invasive, giving someone outside your home the ability to control your battery or air conditioner. So we wanted to see if we could help balance the grid with frequency regulation using home air-conditioning units rather than power plants – without affecting how residents use their appliances or how comfortable they are in their homes.

    From 2019 to 2023, my group at the University of Michigan tried this approach, in collaboration with researchers at Pecan Street Inc., Los Alamos National Laboratory and the University of California, Berkeley, with funding from the U.S. Department of Energy Advanced Research Projects Agency-Energy.

    We recruited 100 homeowners in Austin, Texas, to do a real-world test of our system. All the homes had whole-house forced-air cooling systems, which we connected to custom control boards and sensors the owners allowed us to install in their homes.
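    The frequency-regulation feedback loop described above can be sketched as a toy simulation. This is an illustration of the idea only, not a model of any real grid: the `gain` and `droop` constants are made-up numbers chosen so the loop settles, and the point is simply that an imbalance between generation and load pulls the frequency off 60 hertz until regulation closes the gap.

```python
NOMINAL_HZ = 60.0

def simulate(load, steps=50, gain=20.0, droop=0.01):
    """Toy frequency-regulation loop: a supply/demand imbalance shifts
    the frequency away from 60 Hz, and a proportional controller
    adjusts generation to pull it back."""
    generation = 100.0  # start 5 units short of the load used below
    freq = NOMINAL_HZ
    for _ in range(steps):
        # Quasi-steady droop: surplus raises frequency, shortfall lowers it.
        freq = NOMINAL_HZ + droop * (generation - load)
        # Regulation: boost output when frequency sags, trim it when it rises.
        generation += gain * (NOMINAL_HZ - freq)
    return freq, generation

freq, generation = simulate(load=105.0)
print(round(freq, 3))  # → 60.0: regulation has closed the shortfall
```

    A power plant performing frequency regulation runs essentially this loop; the study's insight is that a fleet of air conditioners can play the same role on the demand side.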
    This equipment let us send instructions to the air-conditioning units based on the frequency of the grid.

    Before I explain how the system worked, I first need to explain how thermostats work. When people set thermostats, they pick a temperature, and the thermostat switches the air-conditioning compressor on and off to maintain the air temperature within a small range around that set point. If the temperature is set at 68 degrees, the thermostat turns the AC on when the temperature is, say, 70, and turns it off when it has cooled down to, say, 66.

    Every few seconds, our system slightly changed the timing of air-conditioning compressor switching for some of the 100 air conditioners, causing the units' aggregate power consumption to change. In this way, our small group of home air conditioners reacted to grid changes the way a power plant would – using more or less energy to balance the grid and keep the frequency near 60 hertz. Moreover, our system was designed to keep home temperatures within the same small range around the set point.

    Testing the approach

    We ran our system in four tests, each lasting one hour. We found two encouraging results.

    First, the air conditioners were able to provide frequency regulation at least as accurately as a traditional power plant, showing that air conditioners could play a significant role in increasing grid flexibility. But perhaps more importantly – at least in terms of encouraging people to participate in these types of systems – we found that we were able to do so without affecting people's comfort in their homes. Home temperatures did not deviate more than 1.6 degrees Fahrenheit from their set point. Homeowners were allowed to override the controls if they got uncomfortable, but most didn't. For most tests, we received zero override requests.
    In the worst case, we received override requests from two of the 100 homes in our test.

    In practice, this sort of technology could be added to commercially available internet-connected thermostats. In exchange for credits on their energy bills, users could choose to join a service run by the thermostat company, their utility provider or some other third party.

    Then people could turn on the air conditioning in the summer heat without that pang of guilt, knowing they were helping to make the grid more reliable and more capable of accommodating renewable energy sources – without sacrificing their own comfort in the process.

    This article was originally published on The Conversation. Read the original article.
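    The thermostat deadband and the retimed compressor switching described in the article can be sketched in a few lines. This is a schematic illustration, not the study's actual control algorithm: the deadband logic mirrors the 68/70/66-degree example above, and the `nudge` input is a hypothetical stand-in for the small, comfort-bounded timing adjustments a grid coordinator could send.

```python
class Thermostat:
    """Deadband (hysteresis) control around a set point: the compressor
    switches on above the band and off below it, so the temperature
    drifts within a small comfort range instead of chattering."""

    def __init__(self, set_point, band=2.0):
        self.set_point = set_point
        self.band = band
        self.ac_on = False

    def step(self, temp, nudge=0.0):
        # `nudge` shifts the switching thresholds slightly, clamped so the
        # temperature stays inside the comfort band. A positive nudge makes
        # the compressor switch on earlier (consume power sooner); this is
        # the knob a grid coordinator could use to retime switching.
        nudge = max(-self.band / 2, min(self.band / 2, nudge))
        if temp >= self.set_point + self.band - nudge:
            self.ac_on = True
        elif temp <= self.set_point - self.band - nudge:
            self.ac_on = False
        # Between the thresholds, the previous state is held (hysteresis).
        return self.ac_on

t = Thermostat(set_point=68.0)
print(t.step(70.0))  # → True: hit the top of the band, AC switches on
print(t.step(67.0))  # → True: inside the band, state is held
print(t.step(66.0))  # → False: reached the bottom, AC switches off
```

    Aggregating many such units and sending them small, offsetting nudges is what lets the fleet's total consumption track a frequency-regulation signal while each individual home stays inside its comfort band.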
    #airconditioning #can #help #power #grid
    WWW.SCIENTIFICAMERICAN.COM
  • One of the most versatile action cameras I've tested isn't from GoPro - and it's on sale

    WWW.ZDNET.COM
    One of the most versatile action cameras I've tested isn't from GoPro - and it's on sale
DJI Osmo Action 4. Adrian Kingsley-Hughes/ZDNET

Multiple DJI Osmo Action 4 packages are on sale at Amazon. Both the Essential and Standard Combos have been discounted to $249, while the Adventure Combo has dropped to $349.

DJI might not be the first name on people's lips when it comes to action cameras, but the company that's better known for its drones also has a really solid line of action cameras. And its latest device, the Osmo Action 4 camera, has some very impressive tricks up its sleeve.

Also: One of the most versatile cameras I've used is not from Sony or Canon and it's on sale

So, what sets this action camera apart from the competition? Let's take a look.

First off, this is not just an action camera -- it's a pro-grade action camera.

From a hardware point of view, the Osmo Action 4 features a 1/1.3-inch image sensor that can record 4K at up to 120 frames per second (fps). This sensor is combined with a wide-angle f/2.8 aperture lens that provides an ultra-wide field of view of up to 155°. And that's wide.

Build quality and fit and finish are second to none. Adrian Kingsley-Hughes/ZDNET

For when the going gets rough, the Osmo Action 4 offers 360° HorizonSteady stabilization modes, including RockSteady 3.0/3.0+ for first-person video footage and HorizonBalancing/HorizonSteady modes for horizontal shots. That's pro-grade hardware right there.

Also: This new AI video editor is an all-in-one production service for filmmakers - how to try it

The Osmo Action 4 also features a 10-bit D-Log M color mode. This mode allows the sensor to record over one billion colors and offers a wider dynamic range, giving you a video that is more vivid and that offers greater detail in the highlights and shadows. This mode, combined with an advanced color temperature sensor, means that the colors have a true-to-life feel regardless of whether you're shooting outdoors, indoors, or even underwater.

The DJI Osmo Action 4 ready for action. Adrian Kingsley-Hughes/ZDNET

I've added some video output from the Osmo Action 4 below. There are examples in both 1080p and 4K. To test the stabilization, I attached the camera to the truck and took it on some roads, some of which are pretty rough. The Osmo Action 4 had no problem with that terrain. I also popped the camera into the sea, just because. And again, no problem.

I've also captured a few time-lapses with the camera -- not because I like clouds (well, actually, I do like clouds), but pointing a camera at a sky can be a good test of how it handles changing light.

Also: I recommend this action camera to beginners and professional creators. Here's why

Timelapses with action cameras can suffer from unsightly exposure changes that cause the image to pulse, a condition known as exposure pumping. This issue can also cause the white balance to change noticeably in a video, but the Osmo Action 4 handled this test well.

All the footage I've shot is what I've come to expect from a DJI camera, whether it's from an action camera or drone -- crisp, clear, vivid, and also nice and stable.

The Osmo Action 4 is packed with various electronic image-stabilization (EIS) tech to ensure that your footage is smooth and on the horizon. It's worth noting the limitations of EIS -- it's not supported in slow-motion and timelapse modes, and the HorizonSteady and HorizonBalancing features are only available for video recorded at 1080p (16:9) or 2.7K (16:9) with a frame rate of 60fps or below.

On the durability front, I've no concerns. I've subjected the Osmo Action 4 to a hard few days of testing, and it's not let me down or complained once. It takes impacts like a champ, and being underwater or in dirt and sand is no problem at all.

Also: I'm a full-time Canon photographer, but this Nikon camera made me wonder if I'm missing out

You might think that this heavy-duty testing would be hard on the camera's tiny batteries, but you'd be wrong. Remember I said the Osmo Action 4 offered hours of battery life? Well, I wasn't kidding.

The Osmo Action 4's ultra-long life batteries are incredible. Adrian Kingsley-Hughes/ZDNET

DJI says that a single battery can deliver up to 160 minutes of 1080p/24fps video recording (at room temperature, with RockSteady on, Wi-Fi off, and screen off). That's over two and a half hours of recording time. In the real world, I was blown away by how much a single battery can deliver. I shot video and timelapse, messed around with a load of camera settings, and then transferred that footage to my iPhone, and still had 16% battery left. No action camera has delivered so much for me on one battery.

The two extra batteries and the multifunction case that come as part of the Adventure Combo are worth the extra $100. Adrian Kingsley-Hughes/ZDNET

And when you're ready to recharge, a 30W USB-C charger can take a battery from zero to 80% in 18 minutes. That's also impressive.

What's more, the batteries are resistant to cold, offering up to 150 minutes of 1080p/24fps recording in temperatures as low as -20°C (-4°F). This resistance also blows the competition away.

Even taking into account all these strong points, the Osmo Action 4 offers even more. The camera has 2x digital zoom for better composition, Voice Prompts that let you know what the camera is doing without looking, and Voice Control that lets you operate the device without touching the screen or using the app. The Osmo Action 4 also digitally hides the selfie stick from a variety of different shots, and you can even connect the DJI Mic to the camera via the USB-C port for better audio capture.

Also: Yes, an Android tablet finally made me reconsider my iPad Pro loyalty

As for price, the Osmo Action 4 Standard Combo bundle comes in at $399, while the Osmo Action 4 Adventure Combo, which comes with two extra Osmo Action Extreme batteries, an additional mini Osmo Action quick-release adapter mount, a battery case that acts as a power bank, and a 1.5-meter selfie stick, is $499.

I'm in love with the Osmo Action 4. It's hands down the best, most versatile, most powerful action camera on the market today, offering pro-grade features at a price that definitely isn't pro-grade.

Everything included in the Action Combo bundle. DJI

DJI Osmo Action 4 tech specs

Dimensions: 70.5×44.2×32.8mm
Weight: 145g
Waterproof: 18m, up to 60m with the optional waterproof case
Microphones: 3
Sensor: 1/1.3-inch CMOS
Lens: FOV 155°, aperture f/2.8, focus distance 0.4m to ∞
Max Photo Resolution: 3648×2736
Max Video Resolution: 4K (4:3): 3840×2880@24/25/30/48/50/60fps and 4K (16:9): 3840×2160@24/25/30/48/50/60/100/120fps
ISO Range: 100-12800
Front Screen: 1.4-inch, 323ppi, 320×320
Rear Screen: 2.25-inch, 326ppi, 360×640
Front/Rear Screen Brightness: 750±50 cd/m²
Storage: microSD (up to 512GB)
Battery: 1770mAh, lab tested to offer up to 160 minutes of runtime (tested at room temperature - 25°C/77°F - and 1080p/24fps, with RockSteady on, Wi-Fi off, and screen off)
Operating Temperature: -20° to 45°C (-4° to 113°F)

This article was originally published in August of 2023 and updated in March 2025.
  • What happens to DOGE without Elon Musk?

Elon Musk may be gone from the Trump administration — and his friendship status with President Donald Trump may be at best uncertain — but his whirlwind stint in government certainly left its imprint. The Department of Government Efficiency (DOGE), his pet government-slashing project, remains entrenched in Washington. During his 130-day tenure, Musk led DOGE in eliminating about 260,000 federal employee jobs and gutting agencies supporting scientific research and humanitarian aid. But to date, DOGE claims to have saved the government $180 billion — well short of its ambitious (and frankly never realistic) target of cutting at least $2 trillion from the federal budget. And with Musk's departure still fresh, there are reports that the federal government is trying to rehire federal workers who quit or were let go.

For Elaine Kamarck, senior fellow at the Brookings Institution, DOGE's tactics will likely end up being disastrous in the long run. "DOGE came in with these huge cuts, which were not attached to a plan," she told Today, Explained co-host Sean Rameswaram.

Kamarck knows all about making government more efficient. In the 1990s, she ran the Clinton administration's Reinventing Government program. "I was Elon Musk," she told Today, Explained. With the benefit of that experience, she assesses Musk's record at DOGE, and what, if anything, the billionaire's loud efforts at cutting government spending added up to.

Below is an excerpt of the conversation, edited for length and clarity. There's much more in the full podcast, so listen to Today, Explained wherever you get podcasts, including Apple Podcasts, Pandora, and Spotify.

What do you think Elon Musk's legacy is?

Well, he will not have totally, radically reshaped the federal government. Absolutely not. In fact, there's a high probability that on January 20, 2029, when the next president takes over, the federal government is about the same size as it is now, and is probably doing the same stuff that it's doing now. What he did manage to do was insert chaos, fear, and loathing into the federal workforce.

There was reporting in the Washington Post late last week that these cuts were so ineffective that the White House is actually reaching out to various federal employees who were laid off and asking them to come back, from the FDA to the IRS to even USAID. Which cuts are sticking at this point and which ones aren't?

First of all, in a lot of cases, people went to court and the courts have reversed those earlier decisions. So the first thing that happened is, courts said, "No, no, no, you can't do it this way. You have to bring them back." The second thing that happened is that Cabinet officers started to get confirmed by the Senate. And remember that a lot of the most spectacular DOGE stuff was happening in February. In February, these Cabinet secretaries were preparing for their Senate hearings. They weren't on the job. Now that the Cabinet secretaries are home, what's happening is they're looking at these cuts and they're saying, "No, no, no! We can't live with these cuts because we have a mission to do."

As the government tries to hire back the people they fired, they're going to have a tough time, and they're going to have a tough time for two reasons. First of all, they treated them like dirt, and they've said a lot of insulting things. Second, most of the people who work for the federal government are highly skilled. They're not paper pushers. We have computers to push our paper, right? They're scientists. They're engineers. They're people with high skills, and guess what? They can get jobs outside the government. So there's going to be real lasting damage to the government from the way they did this. And it's analogous to the lasting damage that they're causing at universities, where we now have top scientists who used to invent great cures for cancer and things like that, deciding to go find jobs in Europe because this culture has gotten so bad.

What happens to this agency now? Who's in charge of it?

Well, what they've done is DOGE employees have been embedded in each of the organizations in the government, okay? And they basically — and the president himself has said this — they basically report to the Cabinet secretaries. So if you are in the Transportation Department, you have to make sure that Sean Duffy, who's the secretary of transportation, agrees with you on what you want to do. And Sean Duffy has already had a fight during a Cabinet meeting with Elon Musk. You know that he has not been thrilled with the advice he's gotten from DOGE. So from now on, DOGE is going to have to work hand in hand with Donald Trump's appointed leaders.

And just to bring this around to what we're here talking about now, they're in this huge fight over wasteful spending with the so-called big, beautiful bill. Does this just look like the government as usual, ultimately?

It's actually worse than normal. Because the deficit impacts are bigger than normal. It's adding more to the deficit than previous bills have done. And the second reason it's worse than normal is that everybody is still living in a fantasy world. And the fantasy world says that somehow we can deal with our deficits by cutting waste, fraud, and abuse. That is pure nonsense. Let me say it: pure nonsense.

Where does most of the government money go? Does it go to some bureaucrats sitting on Pennsylvania Avenue? It goes to us. It goes to your grandmother and her Social Security and her Medicare. It goes to veterans in veterans benefits. It goes to Americans. That's why it's so hard to cut it. It's so hard to cut it because it's us. And people are living on it.

Now, there's a whole other topic that nobody talks about, and it's called entitlement reform, right? Could we reform Social Security? Could we make the retirement age go from 67 to 68? That would save a lot of money. Could we change the cost of living? Nobody, nobody, nobody is talking about that. And that's because we are in this crazy, polarized environment where we can no longer have serious conversations about serious issues.
    WWW.VOX.COM
    What happens to DOGE without Elon Musk?
    Elon Musk may be gone from the Trump administration — and his friendship status with President Donald Trump may be at best uncertain — but his whirlwind stint in government certainly left its imprint. The Department of Government Efficiency (DOGE), his pet government-slashing project, remains entrenched in Washington. During his 130-day tenure, Musk led DOGE in eliminating about 260,000 federal employee jobs and gutting agencies supporting scientific research and humanitarian aid. But to date, DOGE claims to have saved the government $180 billion — well short of its ambitious (and frankly never realistic) target of cutting at least $2 trillion from the federal budget. And with Musk’s departure still fresh, there are reports that the federal government is trying to rehire federal workers who quit or were let go.

    For Elaine Kamarck, senior fellow at the Brookings Institution, DOGE’s tactics will likely end up being disastrous in the long run. “DOGE came in with these huge cuts, which were not attached to a plan,” she told Today, Explained co-host Sean Rameswaram. Kamarck knows all about making government more efficient. In the 1990s, she ran the Clinton administration’s Reinventing Government program. “I was Elon Musk,” she told Today, Explained. With the benefit of that experience, she assesses Musk’s record at DOGE, and what, if anything, the billionaire’s loud efforts at cutting government spending added up to.

    Below is an excerpt of the conversation, edited for length and clarity. There’s much more in the full podcast, so listen to Today, Explained wherever you get podcasts, including Apple Podcasts, Pandora, and Spotify.

    What do you think Elon Musk’s legacy is?

    Well, he will not have totally, radically reshaped the federal government. Absolutely not. In fact, there’s a high probability that on January 20, 2029, when the next president takes over, the federal government is about the same size as it is now, and is probably doing the same stuff that it’s doing now. What he did manage to do was insert chaos, fear, and loathing into the federal workforce.

    There was reporting in the Washington Post late last week that these cuts were so ineffective that the White House is actually reaching out to various federal employees who were laid off and asking them to come back, from the FDA to the IRS to even USAID. Which cuts are sticking at this point and which ones aren’t?

    First of all, in a lot of cases, people went to court and the courts have reversed those earlier decisions. So the first thing that happened is, courts said, “No, no, no, you can’t do it this way. You have to bring them back.” The second thing that happened is that Cabinet officers started to get confirmed by the Senate. And remember that a lot of the most spectacular DOGE stuff was happening in February. In February, these Cabinet secretaries were preparing for their Senate hearings. They weren’t on the job. Now that the Cabinet secretaries are home, what’s happening is they’re looking at these cuts and they’re saying, “No, no, no! We can’t live with these cuts because we have a mission to do.”

    As the government tries to hire back the people they fired, they’re going to have a tough time, and they’re going to have a tough time for two reasons. First of all, they treated them like dirt, and they’ve said a lot of insulting things. Second, most of the people who work for the federal government are highly skilled. They’re not paper pushers. We have computers to push our paper, right? They’re scientists. They’re engineers. They’re people with high skills, and guess what? They can get jobs outside the government. So there’s going to be real lasting damage to the government from the way they did this. And it’s analogous to the lasting damage that they’re causing at universities, where we now have top scientists who used to invent great cures for cancer and things like that, deciding to go find jobs in Europe because this culture has gotten so bad.

    What happens to this agency now? Who’s in charge of it?

    Well, what they’ve done is DOGE employees have been embedded in each of the organizations in the government, okay? And they basically — and the president himself has said this — they basically report to the Cabinet secretaries. So if you are in the Transportation Department, you have to make sure that Sean Duffy, who’s the secretary of transportation, agrees with you on what you want to do. And Sean Duffy has already had a fight during a Cabinet meeting with Elon Musk. You know that he has not been thrilled with the advice he’s gotten from DOGE. So from now on, DOGE is going to have to work hand in hand with Donald Trump’s appointed leaders.

    And just to bring this around to what we’re here talking about now, they’re in this huge fight over wasteful spending with the so-called big, beautiful bill. Does this just look like the government as usual, ultimately?

    It’s actually worse than normal, because the deficit impacts are bigger than normal. It’s adding more to the deficit than previous bills have done. And the second reason it’s worse than normal is that everybody is still living in a fantasy world. And the fantasy world says that somehow we can deal with our deficits by cutting waste, fraud, and abuse. That is pure nonsense. Let me say it: pure nonsense.

    Where does most of the government money go? Does it go to some bureaucrats sitting on Pennsylvania Avenue? It goes to us. It goes to your grandmother and her Social Security and her Medicare. It goes to veterans in veterans benefits. It goes to Americans. That’s why it’s so hard to cut it. It’s so hard to cut it because it’s us. And people are living on it.

    Now, there’s a whole other topic that nobody talks about, and it’s called entitlement reform, right? Could we reform Social Security? Could we make the retirement age go from 67 to 68? That would save a lot of money. Could we change the cost of living? Nobody, nobody, nobody is talking about that. And that’s because we are in this crazy, polarized environment where we can no longer have serious conversations about serious issues.
  • FBC: Firebreak developers discuss the inspiration and challenges creating their first multiplayer title

    Things are warming up as Remedy’s FBC: Firebreak approaches its June 17 launch on PlayStation 5 as part of the PlayStation Plus Game Catalog. We chatted with Communications Director Thomas Puha, Lead Level Designer Teemu Huhtiniemi, Lead Designer/Lead Technical Designer Anssi Hyytiainen, and Game Director/Lead Writer Mike Kayatta about some of the fascinating and often hilarious development secrets behind the first-person shooter.

    PlayStation Blog: First, what PS5 and PS5 Pro features did you utilize?

    Thomas Puha: We’ll support 3D Audio, and we’re prioritising 60 FPS on both formats. We’re aiming for FSR2 with an output resolution of 2560 x 1440 (1440p) on PS5, and PSSR with an output resolution of 3840×2160 (4K) on PS5 Pro.

    Some of the DualSense wireless controller’s features are still a work in progress, but we’re looking to use haptic feedback in a similar way to our previous titles, such as Control and Alan Wake 2. For example, we want to differentiate the weapons to feel unique from each other using the adaptive triggers.

    Going into the game itself, were there any other influences on its creation outside of Control?

    Mike Kayatta: We looked at different TV shows that had lots of tools for going into a place and dealing with a crisis. One was a reality show called Dirty Jobs, where the host Mike Rowe finds these terrible, dangerous, or unexpected jobs that you don’t know exist, like cleaning out the inside of a water tower.

    We also looked at PowerWash Simulator. Cleaning dirt is oddly meditative and really fulfilling. It made me wish a zombie attacked me to break the Zen, and then I’d go right back to cleaning. And we were like, that would be pretty fun in the game.


    Were there specific challenges you faced given it’s your first multiplayer game and first-person shooter?

    Anssi Hyytiainen: It’s radically different from a workflow point of view. You can’t really test it alone, necessarily, which is quite a different experience. And then there are times when one player is missing things on their screen that others are seeing. It was like, “What are you shooting at?”

    What have been your favorite moments developing the game so far?

    Teemu Huhtiniemi: There were so many. But I like when we started seeing all of these overlapping systems kind of click, because there’s a long time in the development where you talk about things on paper and have some prototypes, but you don’t really see it all come together until a point. Then you start seeing the interaction between the systems and all the fun that comes out of that.

    Kayatta: I imagine there’s a lot of people who probably are a little skeptical about Remedy making something so different. Even internally, when the project was starting. And once we got the trailer out there, everyone was so nervous, but it got a pretty positive reaction. Exposing it to the public is very motivating, because with games, for a very long time, there is nothing, or it is janky and it’s ugly and you don’t find the fun immediately.

    Were there any specific ideals you followed while you worked on the game?

    Kayatta: Early on we were constantly asking ourselves, “Could this only happen in Control or at Remedy?” Because the first thing you hear is, “Okay, this is just another co-op multiplayer shooter” – there’s thousands of them, and they’re all good. So what can we do to make it worth playing our game? We were always saying we’ve got this super weird universe and really interesting studio, so we’re always looking at what we could do that nobody else can.

    Huhtiniemi: I think for me it was when we chose to just embrace the chaos. Like, that’s the whole point of the game. It’s supposed to feel overwhelming and busy at times, so that was great to say it out loud.

    Kayatta: Yeah, originally we had a prototype where there were only two Hiss in the level, but it just didn’t work, it wasn’t fun. Then everything just accidentally went in the opposite direction, where it was super chaos. At some point we actually started looking at Overcooked quite a bit, and saying, “Look, just embrace it. It’s gonna be nuts.”

    How did you finally decide on the name FBC: Firebreak, and were there any rejected, alternate, or working titles?

    Kayatta: So Firebreak is named after real world firebreaks, where you deforest an area to prevent a fire from spreading, but firebreaks are also topographical features of the Oldest House. And so we leaned into the term being a first responder who stops fires from spreading. The FBC part came from not wanting to put ‘Control’ in the title, so Control players wouldn’t feel like they had to detour to this before Control 2, but we didn’t want to totally detach from it either as that felt insincere.

    An external partner pitched a title. They were very serious about talking up the game being in the Oldest House, and then dramatically revealed the name: Housekeepers. I got what they were going for, but I was like, we cannot call it this. It was like you were playing as a maid!  

    FBC: Firebreak launches on PS5 June 17 as a day one PlayStation Plus Game Catalog title.
    BLOG.PLAYSTATION.COM
  • The art of two Mickeys

    Classic splitscreens, traditional face replacements and new approaches to machine learning-assisted face swapping allowed for twinning shots in ‘Mickey 17’. An excerpt from issue #32 of befores & afters magazine.
    The art of representing two characters on screen at the same time has become known as ‘twinning’. For Mickey 17 visual effects supervisor Dan Glass, the effect of seeing both Mickey 17 and 18 together was one he looked to achieve with a variety of methodologies. “With a technique like that,” he says, “you always want to use a range of tricks, because you don’t want people to figure it out. You want to keep them like, ‘Oh, wait a minute. How did they…?’”
    “Going back to the way that Director Bong is so prepared and organized,” adds Glass, “it again makes the world of difference with that kind of work, because he thumbnails every shot. Then, some of them are a bit more fleshed out in storyboards. You can look at it and go, ‘Okay, in this situation, this is what the camera’s doing, this is what the actor’s doing,’ which in itself is quite interesting, because he pre-thinks all of this. You’d think that the actors show up and basically just have to follow the steps like robots. It’s not like that. He gives them an environment to work in, but the shots do end up extraordinarily close to what he thumbnails, and it made it a lot simpler to go through.”

    Those different approaches to twinning ranged from simple splitscreens, to traditional face replacements, and then substantially with a machine learned AI approach, now usually termed ‘face swapping’. What made the twinning work a tougher task than usual, suggests Glass, was the fact that the two Pattinson characters are virtually identical.
    “Normally, when you’re doing some kind of face replacement, you’re comparing it to a memory of the face. But this was right in front of you as two Mickeys looking strikingly similar.”
    Here’s how a typical twinning shot was achieved, as described by Glass. “Because Mickey was mostly dressed the same, with only a slight hair change, we were able to have Robert play both roles and to do them one after another. Sometimes, you have to do these things where hair and makeup or costume has a significant variation, so you’re either waiting a long time, which slows production, or you’re coming back at another time to do the different roles, which always makes the process a lot more complicated to match, but we were able to do that immediately.”

    “Based on the design of the shot,” continues Glass, “I would recommend which of Robert’s parts should be shot first. This was most often determined by which role had more impact on the camera movement. A huge credit goes to Robert for his ability to flip between the roles so effortlessly.”
    In the film, Mickey 17 is more passive and Mickey 18 is more aggressive. Pattinson reflected the distinct characters in his actions, including for a moment in which they fight. This fight, overseen by stunt coordinator Paul Lowe, represented moments of close interaction between the two Mickeys. It was here that a body double was crucial in shooting. The body double was also relied upon for the classic twinning technique of shooting ‘dirty’ over-the-shoulder out-of-focus shots of the double — i.e., 17 looking at 18. However, it was quickly determined that even these would need face replacement work. “Robert’s jawline is so distinct that even those had to be replaced or shot as split screens,” observes Glass.

    When the shot was a moving one, no motion control was employed. “I’ve never been a big advocate for motion control,” states Glass. “To me it’s applicable when you’re doing things like miniatures where you need many matching passes, but I think when performances are involved, it interferes too much. It slows down a production’s speed of movement, but it’s also restrictive. Performance and camera always benefit from more flexibility.”
    “It helped tremendously that Director Bong and DOP Darius Khondji shot quite classically with minimal crane and Steadicam moves,” says Glass. “So, a lot of the moves are pan and dolly. There are some Steadicams in there that we were sometimes able to do splitscreens on. I wasn’t always sure that we could get away with the splitscreen as we shot it, but since we were always shooting the two roles, we had the footage to assess the practicality later. We were always prepared to go down a CG or machine learning route, but where we could use the splitscreen, that was the preference.”
    The Hydralite rig, developed by Volucap.
    Rising Sun Pictures handled the majority of twinning visual effects, completing them as splitscreen composites, 2D face replacements, and most notably via their machine learning toolset REVIZE, which utilized facial and body capture of Pattinson to train a model of his face and torso to swap for the double’s. A custom capture rig, dubbed the ‘Crazy Rig’ and now officially The Hydralite, was devised and configured by Volucap to capture multiple angles of Robert on set in each lighting environment in order to produce the best possible reference for the machine learning algorithm. “For me, it was a completely legitimate use of the technique,” attests Glass, in terms of the machine learning approach. “All of the footage that we used to go into that process was captured on our movie for our movie. There’s nothing historic, or going through past libraries of footage, and it was all with Robert’s approval. I think the results were tremendous.”
    “It’s staggering to me as I watch the movie that the performances of each character are so flawlessly consistent throughout the film, because I know how much we were jumping around,” notes Glass. “I did encourage that we rehearse scenes ahead. Let’s say 17 was going to be the first role we captured, I’d have them rehearse it the other way around so that the double knew what he was going to do. Therefore, eyelines, movement, pacing and in instances where we were basically replacing the likeness of his head or even torso, we were still able to use the double’s performance and then map to that.”

    Read the full Mickey 17 issue of befores & afters magazine in PRINT from Amazon or as a DIGITAL EDITION on Patreon. Remember, you can also subscribe to the DIGITAL EDITION as a tier on the Patreon and get a new issue every time one is released.
    The post The art of two Mickeys appeared first on befores & afters.
    #art #two #mickeys
    The art of two Mickeys
    Classic splitscreens, traditional face replacements and new approaches to machine learning-assisted face swapping allowed for twinning shots in ‘Mickey 17’. An excerpt from issue #32 of befores & afters magazine. The art of representing two characters on screen at the same time has become known as ‘twinning’. For Mickey 17 visual effects supervisor Dan Glass, the effect of seeing both Mickey 17 and 18 together was one he looked to achieve with a variety of methodologies. “With a technique like that,” he says, “you always want to use a range of tricks, because you don’t want people to figure it out. You want to keep them like, ‘Oh, wait a minute. How did they…?” “Going back to the way that Director Bong is so prepared and organized,” adds Glass, “it again makes the world of difference with that kind of work, because he thumbnails every shot. Then, some of them are a bit more fleshed out in storyboards. You can look at it and go, ‘Okay, in this situation, this is what the camera’s doing, this is what the actor’s doing,’ which in itself is quite interesting, because he pre-thinks all of this. You’d think that the actors show up and basically just have to follow the steps like robots. It’s not like that. He gives them an environment to work in, but the shots do end up extraordinarily close to what he thumbnails, and it made it a lot simpler to go through.” Those different approaches to twinning ranged from simple splitscreens, to traditional face replacements, and then substantially with a machine learned AI approach, now usually termed ‘face swapping’. What made the twinning work a tougher task than usual, suggests Glass, was the fact that the two Pattinson characters are virtually identical. “Normally, when you’re doing some kind of face replacement, you’re comparing it to a memory of the face. But this was right in front of you as two Mickeys looking strikingly similar.” Here’s how a typical twinning shot was achieved, as described by Glass. 
“Because Mickey was mostly dressed the same, with only a slight hair change, we were able to have Robert play both roles and to do them one after another. Sometimes, you have to do these things where hair and makeup or costume has a significant variation, so you’re either waiting a long time, which slows production, or you’re coming back at another time to do the different roles, which always makes the process a lot more complicated to match, but we were able to do that immediately.” “Based on the design of the shot,” continues Glass, “I would recommend which of Robert’s parts should be shot first. This was most often determined by which role had more impact on the camera movement. A huge credit goes to Robert for his ability to flip between the roles so effortlessly.” In the film, Mickey 17 is more passive and Mickey 18 is more aggressive. Pattinson reflected the distinct characters in his actions, including for a moment in which they fight. This fight, overseen by stunt coordinator Paul Lowe, represented moments of close interaction between the two Mickeys. It was here that a body double was crucial in shooting. The body double was also relied upon for the classic twinning technique of shooting ‘dirty’ over-the- shoulder out of focus shots of the double—ie. 17 looking at 18. However, it was quickly determined that even these would need face replacement work. “Robert’s jawline is so distinct that even those had to be replaced or shot as split screens,” observes Glass. When the shot was a moving one, no motion control was employed. “I’ve never been a big advocate for motion control,” states Glass. “To me it’s applicable when you’re doing things like miniatures where you need many matching passes, but I think when performances are involved, it interferes too much. It slows down a production’s speed of movement, but it’s also restrictive. 
Performance and camera always benefit from more flexibility.”

“It helped tremendously that Director Bong and DOP Darius Khondji shot quite classically with minimal crane and Steadicam moves,” says Glass. “So, a lot of the moves are pan and dolly. There are some Steadicams in there that we were sometimes able to do splitscreens on. I wasn’t always sure that we could get away with the splitscreen as we shot it, but since we were always shooting the two roles, we had the footage to assess the practicality later. We were always prepared to go down a CG or machine learning route, but where we could use the splitscreen, that was the preference.”

The Hydralite rig, developed by Volucap. Source: https://volucap.com

Rising Sun Pictures (visual effects supervisor Guido Wolter) handled the majority of twinning visual effects, completing them as splitscreen composites, 2D face replacements, and most notably via their machine learning toolset REVIZE, which utilized facial and body capture of Pattinson to train a model of his face and torso to swap for the double’s. A custom capture rig, dubbed the ‘Crazy Rig’ and now officially the Hydralite, was devised and configured by Volucap to capture multiple angles of Robert on set in each lighting environment in order to produce the best possible reference for the machine learning algorithm.

“For me, it was a completely legitimate use of the technique,” attests Glass, in terms of the machine learning approach. “All of the footage that we used to go into that process was captured on our movie for our movie. There’s nothing historic, or going through past libraries of footage, and it was all with Robert’s approval. I think the results were tremendous.”

“It’s staggering to me as I watch the movie that the performances of each character are so flawlessly consistent throughout the film, because I know how much we were jumping around,” notes Glass. “I did encourage that we rehearse scenes ahead.
Let’s say 17 was going to be the first role we captured, I’d have them rehearse it the other way around so that the double knew what he was going to do. Therefore, eyelines, movement, pacing and, in instances where we were basically replacing the likeness of his head or even torso, we were still able to use the double’s performance and then map to that.”

Read the full Mickey 17 issue of befores & afters magazine in PRINT from Amazon or as a DIGITAL EDITION on Patreon. Remember, you can also subscribe to the DIGITAL EDITION as a tier on the Patreon and get a new issue every time one is released.

The post The art of two Mickeys appeared first on befores & afters.
AN EXPLOSIVE MIX OF SFX AND VFX IGNITES FINAL DESTINATION BLOODLINES

    By CHRIS McGOWAN

    Images courtesy of Warner Bros. Pictures.

Final Destination Bloodlines, the sixth installment in the graphic horror series, kicks off with the film’s biggest challenge – deploying an elaborate, large-scale set piece involving the 400-foot-high Skyview Tower restaurant. While there in 1968, young Iris Campbell has a premonition about the Skyview burning, cracking, crumbling and collapsing. Then, when she sees these events actually starting to happen around her, she intervenes and causes an evacuation of the tower, thus thwarting death’s design and saving many lives. Years later, her granddaughter, Stefani Reyes, inherits the vision of the destruction that could have occurred and realizes death is still coming for the survivors.

“I knew we couldn’t put the whole thing on fire, but Tony tried and put as much fire as he could safely, and then we just built off that and added a lot more. Even when it’s just a little bit of real fire, the lighting and interaction can’t be simulated, so I think it was a success in terms of blending that practical with the visual.”
    —Nordin Rahhali, VFX Supervisor

    The film opens with an elaborate, large-scale set piece involving the 400-foot-high Skyview Tower restaurant – and its collapse. Drone footage was digitized to create a 3D asset for the LED wall so the time of day could be changed as needed.

    “The set that the directors wanted was very large,” says Nordin Rahhali, VFX Supervisor. “We had limited space options in stages given the scale and the footprint of the actual restaurant that they wanted. It was the first set piece, the first big thing we shot, so we had to get it all ready and going right off the bat. We built a bigger volume for our needs, including an LED wall that we built the assets for.”

“We were outside Vancouver at Bridge Studios in Burnaby. The custom-built LED volume was a little over 200 feet in length,” states Christian Sebaldt, ASC, the movie’s DP. The volume was 98 feet in diameter and 24 feet tall. Rahhali explains, “Pixomondo was the vendor that we contracted to come in and build the volume. They also built the asset that went on the LED wall, so they were part of our filming team and production shoot. Subsequently, they were also the main vendor doing post, which was by design. By having them design and take care of the asset during production, we were able to leverage their assets, tools and builds for some of the post VFX.” Rahhali adds, “It was really important to make sure we had days with the volume team and with Christian and his camera team ahead of the shoot so we could dial it in.”

Built at Bridge Studios in Burnaby outside Vancouver, the custom-built LED volume for events at the Skyview restaurant was over 200 feet long, 98 feet wide and 24 feet tall. Extensive previs with Digital Domain was done to advance key shots.

Zach Lipovsky and Adam Stein directed Final Destination Bloodlines, a New Line film distributed by Warner Bros., in which chain reactions of small and big events lead to bloody catastrophes befalling those who have cheated death at some point. Pixomondo was the lead VFX vendor, followed by FOLKS VFX. Picture Shop also contributed. There were around 800 VFX shots. Tony Lazarowich was the Special Effects Supervisor.

“The Skyview restaurant involved building a massive set that was fire retardant, which meant the construction took longer than normal because they had to build it with certain materials and coat it with certain things because, obviously, it serves for the set piece. As it’s falling into chaos, a lot of that fire was practical. I really jived with what Christian and the directors wanted and how Tony likes to work – to augment as much real practical stuff as possible,” Rahhali remarks. “I knew we couldn’t put the whole thing on fire, but Tony tried and put as much fire as he could safely, and then we just built off that and added a lot more. Even when it’s just a little bit of real fire, the lighting and interaction can’t be simulated, so I think it was a success in terms of blending that practical with the visual.”

The Skyview restaurant required building a massive set that was fire retardant. Construction on the set took longer because it had to be built and coated with special materials. As the Skyview restaurant falls into chaos, much of the fire was practical.

“We got all the Vancouver skyline so we could rebuild our version of the city, which was based a little on the Vancouver footprint. So, we used all that to build a digital recreation of a city that was in line with what the directors wanted, which was a coastal city somewhere in the States that doesn’t necessarily have to be Vancouver or Seattle, but it looks a little like the Pacific Northwest.”
    —Christian Sebaldt, ASC, Director of Photography

For drone shots, the team utilized a custom heavy-lift drone with three RED Komodo Digital Cinema cameras “giving us almost 180 degrees with overlap that we would then stitch in post and have a ridiculous amount of resolution off these three cameras,” Sebaldt states. “The other drone we used was a DJI Inspire 3, which was also very good. And we flew these drones up at the height. We flew them at different times of day. We flew full 360s, and we also used them for photogrammetry. We got all the Vancouver skyline so we could rebuild our version of the city, which was based a little on the Vancouver footprint. So, we used all that to build a digital recreation of a city that was in line with what the directors wanted, which was a coastal city somewhere in the States that doesn’t necessarily have to be Vancouver or Seattle, but it looks a little like the Pacific Northwest.” Rahhali adds, “All of this allowed us to figure out what we were going to shoot. We had the stage build, and we had the drone footage that we then digitized and created a 3D asset to go on the wall so we could change the times of day.”

Pixomondo built the volume and the asset that went on the LED wall for the Skyview sequence. They were also the main vendor during post. FOLKS VFX and Picture Shop contributed.

“We did extensive previs with Digital Domain,” Rahhali explains. “That was important because we knew the key shots that the directors wanted. With a combination of those key shots, we then kind of reverse-engineered while we did techvis off the previs and worked with Christian and the art department so we would have proper flexibility with the set to be able to pull off some of these shots. Some of these shots required the Skyview restaurant ceiling to be lifted and partially removed for us to get a crane to shoot Paul as he’s about to fall and the camera’s going through a roof, that we then digitally had to recreate. Had we not done the previs to know those shots in advance, we would not have been able to build that in time to accomplish the look. We had many other shots that were driven off the previs that allowed the art department, construction and camera teams to work out how they would get those shots.”

Some shots required the Skyview’s ceiling to be lifted and partially removed to get a crane to shoot Paul Campbell as he’s about to fall.

The character Iris lived in a fortified house, isolating herself methodically to avoid the Grim Reaper. Rahhali comments, “That was a beautiful location in the GVRD, very cold. It was a long, hard shoot, because it was all nights. It was just this beautiful pocket out in the middle of the mountains. We in visual effects didn’t do a ton other than a couple of clean-ups of the big establishing shots when you see them pull up to the compound. We had to clean up small roads we wanted to make look like one road and make the road look like dirt.” There were flames involved. Sebaldt says, “The explosion was unbelievably big. We had eight cameras on it at night and shot it at high speed, and we’re all going ‘Whoa.’” Rahhali notes, “There was some clean-up, but the explosion was 100% practical. Our Special Effects Supervisor, Tony, went to town on that. He blew up the whole house, and it looked spectacular.”

The tattoo shop piercing scene is one of the most talked-about sequences in the movie, where a dangling chain from a ceiling fan attaches itself to the septum nose piercing of Erik Campbell and drags him toward a raging fire. Rahhali observes, “That was very Final Destination and a great Rube Goldberg build-up event. Richard was great. He was tied up on a stunt line for most of it, balancing on top of furniture. All of that was him doing it for real with a stunt line.” Some effects solutions can be surprisingly simple. Rahhali continues, “Our producer came up with a great gag for the septum ring.” Richard’s nose was connected with just a nose plug that went inside his nostrils. “All that tugging and everything that you’re seeing was real. For weeks and weeks, we were all trying to figure out how to do it without it being a big visual effects thing. ‘How are we gonna pull his nose for real?’ Craig said, ‘I have these things I use to help me open up my nose and you can’t really see them.’ They built it off of that, and it looked great.”

Filmmakers spent weeks figuring out how to execute the harrowing tattoo shop scene. A dangling chain from a ceiling fan attaches itself to the septum nose ring of Erik Campbell – with the actor’s nose being tugged by the chain connected to a nose plug that went inside his nostrils.

“Some of these shots required the Skyview restaurant ceiling to be lifted and partially removed for us to get a crane to shoot Paul as he’s about to fall and the camera’s going through a roof, that we then digitally had to recreate. Had we not done the previs to know those shots in advance, we would not have been able to build that in time to accomplish the look. We had many other shots that were driven off the previs that allowed the art department, construction and camera teams to work out how they would get those shots.”
    —Nordin Rahhali, VFX Supervisor

Most of the fire in the tattoo parlor was practical. “There are some fire bars and stuff that you’re seeing in there from SFX and the big pool of fire on the wide shots.” Sebaldt adds, “That was a lot of fun to shoot because it’s so insane when he’s dancing and balancing on all this stuff – we were laughing and laughing. We were convinced that this was going to be the best scene in the movie up to that moment.” Rahhali says, “They used the scene wholesale for the trailer. It went viral – people were taking out their septum rings.” Erik survives the parlor blaze only to meet his fate in a hospital when he is pulled by a wheelchair into an out-of-control MRI machine at its highest magnetic level. Rahhali comments, “That is a good combination of a bunch of different departments. Our Stunt Coordinator, Simon Burnett, came up with this hard pull-wire line for when Erik flies and hits the MRI. That’s a real stunt with a double, and he hit hard. All the other shots are all CG wheelchairs because the directors wanted to art-direct how the crumpling metal was snapping and bending to show pressure on him as his body starts going into the MRI.”

To augment the believability that comes with reality, the directors aimed to capture as much practically as possible, then VFX Supervisor Nordin Rahhali and his team built on that result.

A train derailment concludes the film after Stefani and her brother, Charlie, realize they are still on death’s list. A train goes off the tracks, and logs from one of the cars fly through the air and kill them. “That one was special because it’s a hard sequence and was also shot quite late, so we didn’t have a lot of time. We went back to Vancouver and shot the actual street, and we shot our actors performing. They fell onto stunt pads, and the moment they get touched by the logs, it turns into CG, as it was the only way to pull that off – and the train, of course. We had to add all that. The destruction of the houses and everything was done in visual effects.”

    Erik survives the tattoo parlor blaze only to meet his fate in a hospital when he is crushed by a wheelchair while being pulled into an out-of-control MRI machine.

Erik appears about to be run over by a delivery truck at the corner of 21A Ave. and 132A St., but he’s not – at least not then. The truck is actually on the opposite side of the road, and the person being run over is Howard.

A rolling penny plays a major part in the catastrophic chain reactions and seems to be a character itself. “The magic penny was a mix from two vendors, Pixomondo and FOLKS; both had penny shots,” Rahhali says. “All the bouncing pennies you see going through the vents and hitting the fan blade are all FOLKS. The bouncing penny at the end as a lady takes it out of her purse, that goes down the ramp and into the rail – that’s FOLKS. The big explosion shots in the Skyview with the penny slowing down after the kid throws it are all Pixomondo shots. It was a mix. We took a little time to find that balance between readability and believability.”

Approximately 800 VFX shots were required for Final Destination Bloodlines. Chain reactions of small and big events lead to bloody catastrophes befalling those who have cheated Death at some point in the Final Destination films.

From left: Kaitlyn Santa Juana as Stefani Reyes, director Adam Stein, director Zach Lipovsky and Gabrielle Rose as Iris.

Rahhali adds, “The film is a great collaboration of departments. Good visual effects are always a good combination of special effects, makeup effects and cinematography; it’s all the planning of all the pieces coming together. For a film of this size, I’m really proud of the work. I think we punched above our weight class, and it looks quite good.”
I think we punched above our weight class, and it looks quite good.” #explosive #mix #sfx #vfx #ignites
    WWW.VFXVOICE.COM
    AN EXPLOSIVE MIX OF SFX AND VFX IGNITES FINAL DESTINATION BLOODLINES
By CHRIS McGOWAN. Images courtesy of Warner Bros. Pictures. Final Destination Bloodlines, the sixth installment in the graphic horror series, kicks off with the film’s biggest challenge – deploying an elaborate, large-scale set piece involving the 400-foot-high Skyview Tower restaurant. While there in 1968, young Iris Campbell (Brec Bassinger) has a premonition about the Skyview burning, cracking, crumbling and collapsing. Then, when she sees these events actually starting to happen around her, she intervenes and causes an evacuation of the tower, thus thwarting death’s design and saving many lives. Years later, her granddaughter, Stefani Reyes (Kaitlyn Santa Juana), inherits the vision of the destruction that could have occurred and realizes death is still coming for the survivors. “I knew we couldn’t put the whole [Skyview restaurant] on fire, but Tony [Lazarowich, Special Effects Supervisor] tried and put as much fire as he could safely and then we just built off that [in VFX] and added a lot more. Even when it’s just a little bit of real fire, the lighting and interaction can’t be simulated, so I think it was a success in terms of blending that practical with the visual.” —Nordin Rahhali, VFX Supervisor The film opens with an elaborate, large-scale set piece involving the 400-foot-high Skyview Tower restaurant – and its collapse. Drone footage was digitized to create a 3D asset for the LED wall so the time of day could be changed as needed. “The set that the directors wanted was very large,” says Nordin Rahhali, VFX Supervisor. “We had limited space options in stages given the scale and the footprint of the actual restaurant that they wanted. It was the first set piece, the first big thing we shot, so we had to get it all ready and going right off the bat. We built a bigger volume for our needs, including an LED wall that we built the assets for.” “We were outside Vancouver at Bridge Studios in Burnaby.
The custom-built LED volume was a little over 200 feet in length,” states Christian Sebaldt, ASC, the movie’s DP. The volume was 98 feet in diameter and 24 feet tall. Rahhali explains, “Pixomondo was the vendor that we contracted to come in and build the volume. They also built the asset that went on the LED wall, so they were part of our filming team and production shoot. Subsequently, they were also the main vendor doing post, which was by design. By having them design and take care of the asset during production, we were able to leverage their assets, tools and builds for some of the post VFX.” Rahhali adds, “It was really important to make sure we had days with the volume team and with Christian and his camera team ahead of the shoot so we could dial it in.” Built at Bridge Studios in Burnaby outside Vancouver, the custom-built LED volume for events at the Skyview restaurant was over 200 feet long, 98 feet wide and 24 feet tall. Extensive previs with Digital Domain was done to advance key shots. (Photo: Eric Milner) Zach Lipovsky and Adam Stein directed Final Destination Bloodlines for New Line Cinema, distributed by Warner Bros., in which chain reactions of small and big events lead to bloody catastrophes befalling those who have cheated death at some point. Pixomondo was the lead VFX vendor, followed by FOLKS VFX. Picture Shop also contributed. There were around 800 VFX shots. Tony Lazarowich was the Special Effects Supervisor. “The Skyview restaurant involved building a massive set [that] was fire retardant, which meant the construction took longer than normal because they had to build it with certain materials and coat it with certain things because, obviously, it serves for the set piece. As it’s falling into chaos, a lot of that fire was practical. I really jived with what Christian and directors wanted and how Tony likes to work – to augment as much real practical stuff as possible,” Rahhali remarks.
“I knew we couldn’t put the whole thing on fire, but Tony tried and put as much fire as he could safely, and then we just built off that [in VFX] and added a lot more. Even when it’s just a little bit of real fire, the lighting and interaction can’t be simulated, so I think it was a success in terms of blending that practical with the visual.” The Skyview restaurant required building a massive set that was fire retardant. Construction on the set took longer because it had to be built and coated with special materials. As the Skyview restaurant falls into chaos, much of the fire was practical. (Photo: Eric Milner) “We got all the Vancouver skyline [with drones] so we could rebuild our version of the city, which was based a little on the Vancouver footprint. So, we used all that to build a digital recreation of a city that was in line with what the directors wanted, which was a coastal city somewhere in the States that doesn’t necessarily have to be Vancouver or Seattle, but it looks a little like the Pacific Northwest.” —Christian Sebaldt, ASC, Director of Photography For drone shots, the team utilized a custom heavy-lift drone with three RED Komodo Digital Cinema cameras “giving us almost 180 degrees with overlap that we would then stitch in post and have a ridiculous amount of resolution off these three cameras,” Sebaldt states. “The other drone we used was a DJI Inspire 3, which was also very good. And we flew these drones up at the height [we needed]. We flew them at different times of day. We flew full 360s, and we also used them for photogrammetry. We got all the Vancouver skyline so we could rebuild our version of the city, which was based a little on the Vancouver footprint. 
So, we used all that to build a digital recreation of a city that was in line with what the directors wanted, which was a coastal city somewhere in the States that doesn’t necessarily have to be Vancouver or Seattle, but it looks a little like the Pacific Northwest.” Rahhali adds, “All of this allowed us to figure out what we were going to shoot. We had the stage build, and we had the drone footage that we then digitized and created a 3D asset to go on the wall [so] we could change the times of day.” Pixomondo built the volume and the asset that went on the LED wall for the Skyview sequence. They were also the main vendor during post. FOLKS VFX and Picture Shop contributed. (Photo: Eric Milner) “We did extensive previs with Digital Domain,” Rahhali explains. “That was important because we knew the key shots that the directors wanted. With a combination of those key shots, we then kind of reverse-engineered [them] while we did techvis off the previs and worked with Christian and the art department so we would have proper flexibility with the set to be able to pull off some of these shots. [For example,] some of these shots required the Skyview restaurant ceiling to be lifted and partially removed for us to get a crane to shoot Paul [Max Lloyd-Jones] as he’s about to fall and the camera’s going through a roof, that we then digitally had to recreate. Had we not done the previs to know those shots in advance, we would not have been able to build that in time to accomplish the look. We had many other shots that were driven off the previs that allowed the art department, construction and camera teams to work out how they would get those shots.” Some shots required the Skyview’s ceiling to be lifted and partially removed to get a crane to shoot Paul Campbell (Max Lloyd-Jones) as he’s about to fall. The character Iris lived in a fortified house, isolating herself methodically to avoid the Grim Reaper.
Rahhali comments, “That was a beautiful location [in] GVRD [Greater Vancouver], very cold. It was a long, hard shoot, because it was all nights. It was just this beautiful pocket out in the middle of the mountains. We in visual effects didn’t do a ton other than a couple of clean-ups of the big establishing shots when you see them pull up to the compound. We had to clean up small roads we wanted to make look like one road and make the road look like dirt.” There were flames involved. Sebaldt says, “The explosion [of Iris’s home] was unbelievably big. We had eight cameras on it at night and shot it at high speed, and we’re all going ‘Whoa.’” Rahhali notes, “There was some clean-up, but the explosion was 100% practical. Our Special Effects Supervisor, Tony, went to town on that. He blew up the whole house, and it looked spectacular.” The tattoo shop piercing scene is one of the most talked-about sequences in the movie, where a dangling chain from a ceiling fan attaches itself to the septum nose piercing of Erik Campbell (Richard Harmon) and drags him toward a raging fire. Rahhali observes, “That was very Final Destination and a great Rube Goldberg build-up event. Richard was great. He was tied up on a stunt line for most of it, balancing on top of furniture. All of that was him doing it for real with a stunt line.” Some effects solutions can be surprisingly simple. Rahhali continues, “Our producer [Craig Perry] came up with a great gag [for the] septum ring.” Richard’s nose was connected with just a nose plug that went inside his nostrils. “All that tugging and everything that you’re seeing was real. For weeks and weeks, we were all trying to figure out how to do it without it being a big visual effects thing.
‘How are we gonna pull his nose for real?’ Craig said, ‘I have these things I use to help me open up my nose and you can’t really see them.’ They built it off of that, and it looked great.” Filmmakers spent weeks figuring out how to execute the harrowing tattoo shop scene. A dangling chain from a ceiling fan attaches itself to the septum nose ring of Erik Campbell (Richard Harmon) – with the actor’s nose being tugged by the chain connected to a nose plug that went inside his nostrils. “[S]ome of these shots required the Skyview restaurant ceiling to be lifted and partially removed for us to get a crane to shoot Paul [Campbell] as he’s about to fall and the camera’s going through a roof, that we then digitally had to recreate. Had we not done the previs to know those shots in advance, we would not have been able to build that in time to accomplish the look. We had many other shots that were driven off the previs that allowed the art department, construction and camera teams to work out how they would get those shots.” —Nordin Rahhali, VFX Supervisor Most of the fire in the tattoo parlor was practical. “There are some fire bars and stuff that you’re seeing in there from SFX and the big pool of fire on the wide shots.” Sebaldt adds, “That was a lot of fun to shoot because it’s so insane when he’s dancing and balancing on all this stuff – we were laughing and laughing. We were convinced that this was going to be the best scene in the movie up to that moment.” Rahhali says, “They used the scene wholesale for the trailer. It went viral – people were taking out their septum rings.” Erik survives the parlor blaze only to meet his fate in a hospital when he is pulled by a wheelchair into an out-of-control MRI machine at its highest magnetic level. Rahhali comments, “That is a good combination of a bunch of different departments. Our Stunt Coordinator, Simon Burnett, came up with this hard pull-wire line [for] when Erik flies and hits the MRI. 
That’s a real stunt with a double, and he hit hard. All the other shots are all CG wheelchairs because the directors wanted to art-direct how the crumpling metal was snapping and bending to show pressure on him as his body starts going into the MRI.” To augment the believability that comes with reality, the directors aimed to capture as much practically as possible, then VFX Supervisor Nordin Rahhali and his team built on that result. (Photo: Eric Milner) A train derailment concludes the film after Stefani and her brother, Charlie, realize they are still on death’s list. A train goes off the tracks, and logs from one of the cars fly through the air and kill them. “That one was special because it’s a hard sequence and was also shot quite late, so we didn’t have a lot of time. We went back to Vancouver and shot the actual street, and we shot our actors performing. They fell onto stunt pads, and the moment they get touched by the logs, it turns into CG as it was the only way to pull that off and the train of course. We had to add all that. The destruction of the houses and everything was done in visual effects.” Erik survives the tattoo parlor blaze only to meet his fate in a hospital when he is crushed by a wheelchair while being pulled into an out-of-control MRI machine. Erik (Richard Harmon) appears about to be run over by a delivery truck at the corner of 21A Ave. and 132A St., but he’s not – at least not then. The truck is actually on the opposite side of the road, and the person being run over is Howard. A rolling penny plays a major part in the catastrophic chain reactions and seems to be a character itself. “The magic penny was a mix from two vendors, Pixomondo and FOLKS; both had penny shots,” Rahhali says. “All the bouncing pennies you see going through the vents and hitting the fan blade are all FOLKS. The bouncing penny at the end as a lady takes it out of her purse, that goes down the ramp and into the rail – that’s FOLKS.
The big explosion shots in the Skyview with the penny slowing down after the kid throws it [off the deck] are all Pixomondo shots. It was a mix. We took a little time to find that balance between readability and believability.” Approximately 800 VFX shots were required for Final Destination Bloodlines. (Photo: Eric Milner) Chain reactions of small and big events lead to bloody catastrophes befalling those who have cheated Death at some point in the Final Destination films. From left: Kaitlyn Santa Juana as Stefani Reyes, director Adam Stein, director Zach Lipovsky and Gabrielle Rose as Iris. (Photo: Eric Milner) Rahhali adds, “The film is a great collaboration of departments. Good visual effects are always a good combination of special effects, makeup effects and cinematography; it’s all the planning of all the pieces coming together. For a film of this size, I’m really proud of the work. I think we punched above our weight class, and it looks quite good.”
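The rig and volume figures quoted in this piece lend themselves to a quick back-of-envelope sanity check. The sketch below is editorial arithmetic, not production data: the per-camera field of view and stitch overlap for the three-camera drone rig are illustrative assumptions, and the LED wall is treated as a simple circular arc using only the stated 200-foot length and 98-foot diameter.

```python
import math

def combined_hfov(per_camera_hfov_deg, n_cameras, overlap_deg):
    """Total horizontal coverage of n fanned cameras whose adjacent
    frustums overlap by a fixed angle (the overlap is what allows
    the three RED Komodo feeds to be stitched together in post)."""
    return n_cameras * per_camera_hfov_deg - (n_cameras - 1) * overlap_deg

# Illustrative assumption: ~70 degrees per camera with ~10 degrees of
# overlap between neighbors, giving roughly the "almost 180 degrees
# with overlap" described for the heavy-lift drone rig.
print(combined_hfov(70, 3, 10))  # 190

# LED volume: a ~200-ft wall curved along a 98-ft-diameter circle
# subtends s / r radians of arc.
radius_ft = 98 / 2
theta_deg = math.degrees(200 / radius_ft)
print(round(theta_deg))  # 234 -- most of the way around the circle
```

With these assumed numbers, the rig comfortably exceeds a 180-degree stitched panorama, and the wall wraps roughly two-thirds of a full cylinder around the set, consistent with the enveloping volume the production describes.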