• BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4

    By TREVOR HOGG
    Images courtesy of Prime Video.

For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know if we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”

    When Splinter (Rob Benedict) splits in two, the cloning effect was inspired by cellular mitosis.

    “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!”
    —Stephan Fleet, VFX Supervisor

    A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. “We have John Griffith [Previs Director], who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” Victoria Neuman, Founding Director of the Federal Bureau of Superhuman Affairs, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”

    Multiple plates were shot to enable Simon Pegg to phase through the actor lying in a hospital bed.

    Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, ‘I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’ And I get exploded with blood. I wanted to see what it was like, and it’s intense.”

    “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.”
    —Stephan Fleet, VFX Supervisor

    The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be, so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a human [is] you tend to want to give it human gestures and eyebrows. Erik Kripke [Creator, Executive Producer, Showrunner, Director, Writer] said, ‘No. We have to find things that an octopus could do that conveys the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.”

    A building is replaced by a massive crowd attending a rally being held by Homelander.

    In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon, for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that. Obviously, you don’t want to put any performer in any situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter-totter. The Deep was [acting in a certain way] in one shot and Sister Sage reacted in the other shot. To marry the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography, I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool. Then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.”

    In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside of the water tank that she could wrap a tentacle around.

    “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”
    —Stephan Fleet, VFX Supervisor

    Sheep and chickens embark on a violent rampage courtesy of Compound V, with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,” Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in a green suit gently tossing a chicken. We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it in this narrow corridor of fencing. When they run, I always equated it to the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”

    The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes.
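The “Battlestar Galactica theory” Fleet describes, deliberately sloppy, handheld-feeling camera movement to help sell CG elements, is often approximated in comp or layout as a smoothed random walk applied to the camera. A minimal sketch of that idea, with made-up parameter values rather than anything from the production:

```python
import random

def handheld_shake(num_frames, max_offset_px=4.0, smoothing=0.8, seed=7):
    """Generate a smoothed 2D camera-shake path, one (x, y) offset per frame.

    A raw random walk reads as white noise; low-pass filtering it
    (keeping momentum from the previous frame) reads more like a
    handheld operator chasing the action.
    """
    rng = random.Random(seed)  # fixed seed so the shake is repeatable per shot
    x = y = 0.0
    path = []
    for _ in range(num_frames):
        # pull each axis toward a fresh random target, keeping 80% momentum
        x = smoothing * x + (1 - smoothing) * rng.uniform(-max_offset_px, max_offset_px)
        y = smoothing * y + (1 - smoothing) * rng.uniform(-max_offset_px, max_offset_px)
        path.append((x, y))
    return path
```

Because each new offset is a convex blend of the previous value and a bounded random target, the shake never exceeds `max_offset_px`, and the per-frame deltas it produces are also what you would feed a motion-blur node.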

    Once injected with Compound V, Hugh Campbell Sr. develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing high-frequency vibration on the X axis loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals. We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.”
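The phasing recipe Fleet outlines, a light opacity mix plus high-frequency X-axis shake, maps to two simple per-frame curves. A minimal sketch under assumed numbers (amplitude, vibration frequency and the depth of the opacity dip are invented for illustration, not taken from the show’s comp):

```python
import math
import random

def phase_jitter(frame_index, fps=24, amplitude_px=6.0, freq_hz=48.0):
    """Horizontal offset for a high-frequency X-axis vibration.

    The carrier wave oscillates faster than the frame rate, so each
    rendered frame samples it at an effectively arbitrary phase;
    on screen this reads as a buzzing horizontal smear.
    """
    t = frame_index / fps
    carrier = math.sin(2 * math.pi * freq_hz * t)
    noise = random.uniform(-0.3, 0.3)  # small break-up so it never loops visibly
    return amplitude_px * (carrier + noise)

def phase_opacity(progress):
    """Light opacity mix across the phase-through.

    `progress` runs 0..1 over the move: fully solid at entry and exit,
    dipping partway (never to zero, to avoid reading as a dissolve).
    """
    p = min(max(progress, 0.0), 1.0)
    return 1.0 - 0.6 * math.sin(math.pi * p)
```

Keeping the opacity dip shallow and the move fast is the code-level version of Fleet’s point that lingering on a phase effect makes it look cheesy.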

    Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination.

    Homelander breaks a mirror, which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Anthony Starr moving his head doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Anthony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clean mirror and gave Anthony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.”

    “For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”
    —Stephan Fleet, VFX Supervisor

    Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution.

    A different spin on the bloodbath occurs during a fight when a drugged Frenchie hallucinates as Kimiko Miyashiro goes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.”

    Ambrosius, voiced by Tilda Swinton, gets a lot more screentime in Season 4.

    Splinter splitting in two was achieved heavily in CG. “Erik threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions. One clone comes out to the right and the other pulls backwards.” What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker. “It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them interweaving with each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic it to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and be him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.”
    #bouncing #rubber #duckies #flying #sheep
    BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4
    By TREVOR HOGG Images courtesy of Prime Video. For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” When Splintersplits in two, the cloning effect was inspired by cellular mitosis. “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” —Stephan Fleet, VFX Supervisor A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. 
“We have John Griffith, who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” Founding Director of Federal Bureau of Superhuman Affairs, Victoria Neuman, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.” Multiple plates were shot to enable Simon Pegg to phase through the actor laying in a hospital bed. Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, “I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’  And I get exploded with blood. 
I wanted to see what it was like, and it’s intense.” “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” —Stephan Fleet, VFX Supervisor The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be, so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a humanyou tend to want to give it human gestures and eyebrows. Erik Kripkesaid, ‘No. We have to find things that an octopus could do that conveys the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.” A building is replaced by a massive crowd attending a rally being held by Homelander. In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that. 
Obviously, you don’t want to put any performer in any situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter totter. The Deep wasin one shot and Sister Sage reacted in the other shot. To marry the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool. Then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.” In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside of the water tank that she could wrap a tentacle around. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.” —Stephan Fleet, VFX Supervisor Sheep and chickens embark on a violent rampage courtesy of Compound V with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,’ Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in green suit gently tossing a chicken. 
We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it is in this narrow corridor of fencing. When they run, I always equated it as the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.” The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes. Once injected with Compound V, Hugh Campbell Sr.develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing high-frequency vibration on the X axis loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals. 
We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.” Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination. Homelanderbreaks a mirror which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Anthony Starr moving his head doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Anthony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clear mirror and gave Anthony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.” “For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. 
Comedy also helps sell visual effects.” —Stephan Fleet, VFX Supervisor Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution. A different spin on the bloodbath occurs during a fight when a drugged Frenchiehallucinates as Kimiko Miyashirogoes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.” Ambrosius, voiced by Tilda Swinton, gets a lot more screentime in Season 4. When Splintersplits in two was achieved heavily in CG. “Erik threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions. One clone comes out to the right and the other pulls backwards.” What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker. 
“It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them interweaving each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic it to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and be him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.” #bouncing #rubber #duckies #flying #sheep
    WWW.VFXVOICE.COM
    BOUNCING FROM RUBBER DUCKIES AND FLYING SHEEP TO CLONES FOR THE BOYS SEASON 4
    By TREVOR HOGG Images courtesy of Prime Video. For those seeking an alternative to the MCU, Prime Video has two offerings of the live-action and animated variety that take the superhero genre into R-rated territory where the hands of the god-like figures get dirty, bloodied and severed. “The Boys is about the intersection of celebrity and politics using superheroes,” states Stephan Fleet, VFX Supervisor on The Boys. “Sometimes I see the news and I don’t even know we can write to catch up to it! But we try. Invincible is an intense look at an alternate DC Universe that has more grit to the superhero side of it all. On one hand, I was jealous watching Season 1 of Invincible because in animation you can do things that you can’t do in real life on a budget.” Season 4 does not tone down the blood, gore and body count. Fleet notes, “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” When Splinter (Rob Benedict) splits in two, the cloning effect was inspired by cellular mitosis. “The writers almost have this dialogue with us. Sometimes, they’ll write in the script, ‘And Fleet will come up with a cool visual effect for how to kill this person.’ Or, ‘Chhiu, our fight coordinator, will make an awesome fight.’ It is a frequent topic of conversation. We’re constantly trying to be inventive and create new ways to kill people!” —Stephan Fleet, VFX Supervisor A total of 1,600 visual effects shots were created for the eight episodes by ILM, Pixomondo, MPC Toronto, Spin VFX, DNEG, Untold Studios, Luma Pictures and Rocket Science VFX. Previs was a critical part of the process. 
“We have John Griffith [Previs Director], who owns a small company called CNCPT out of Texas, and he does wonderful Unreal Engine level previs,” Fleet remarks. “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.” Founding Director of Federal Bureau of Superhuman Affairs, Victoria Neuman, literally gets ripped in half by two tendrils coming out of Compound V-enhanced Billy Butcher, the leader of superhero resistance group The Boys. “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.” Multiple plates were shot to enable Simon Pegg to phase through the actor laying in a hospital bed. Testing can get rather elaborate. “For that end scene with Butcher’s tendrils, the room was two stories, and we were able to put the camera up high along with a bunch of blood cannons,” Fleet recalls. “When the body rips in half and explodes, there is a practical component. We rained down a bunch of real blood and guts right in front of Huey. It’s a known joke that we like to douse Jack Quaid with blood as much as possible! In this case, the special effects team led by Hudson Kenny needed to test it the day before, and I said, “I’ll be the guinea pig for the test.’ They covered the whole place with plastic like it was a Dexter kill room because you don’t want to destroy the set. I’m standing there in a white hazmat suit with goggles on, covered from head to toe in plastic and waiting as they’re tweaking all of these things. It sounds like World War II going on. They’re on walkie talkies to each other, and then all of a sudden, it’s ‘Five, four, three, two, one…’  And I get exploded with blood. 
    I wanted to see what it was like, and it’s intense.”

    “On set, we have a cartoon of what is going to be done, and you’ll be amazed, specifically for action and heavy visual effects stuff, how close those shots are to the previs when we finish.”
    —Stephan Fleet, VFX Supervisor

    The Deep has a love affair with an octopus called Ambrosius, voiced by Tilda Swinton. “It’s implied bestiality!” Fleet laughs. “I would call it more of a romance. What was fun from my perspective is that I knew what the look was going to be [from Season 3], so then it’s about putting in the details and the animation. One of the instincts that you always have when you’re making a sea creature that talks to a human [is] you tend to want to give it human gestures and eyebrows. Eric Kripke [Creator, Executive Producer, Showrunner, Director, Writer] said, ‘No. We have to find things that an octopus could do that convey the same emotion.’ That’s when ideas came in, such as putting a little The Deep toy inside the water tank. When Ambrosius is trying to have an intimate moment or connect with him, she can wrap a tentacle around that. My favorite experience doing Ambrosius was when The Deep is reading poetry to her on a bed. CG creatures touching humans is one of the more complicated things to do and make look real. Ambrosius’ tentacles reach for his arm, and it becomes an intimate moment. More than touching the skin, displacing the bedsheet as Ambrosius moved ended up becoming a lot of CG, and we had to go back and forth a few times to get that looking right; that turned out to be tricky.”

    A building is replaced by a massive crowd attending a rally being held by Homelander.

    In a twisted form of sexual foreplay, Sister Sage has The Deep perform a transorbital lobotomy on her. “Thank you, Amazon, for selling lobotomy tools as novelty items!” Fleet chuckles. “We filmed it with a lobotomy tool on set. There is a lot of safety involved in doing something like that.
    Obviously, you don’t want to put any performer in any situation where they come close to putting anything real near their eye. We created this half lobotomy tool and did this complicated split screen with the lobotomy tool on a teeter-totter. The Deep was [acting in a certain way] in one shot and Sister Sage reacted in the other shot. To marry the two ended up being a lot of CG work. Then there are these close-ups which are full CG. I always keep a dummy head that is painted gray that I use all of the time for reference. In macrophotography, I filmed this lobotomy tool going right into the eye area. I did that because the tool is chrome, so it’s reflective and has ridges. It has an interesting reflective property. I was able to see how and what part of the human eye reflects onto the tool. A lot of that shot became about realistic reflections and lighting on the tool. Then heavy CG for displacing the eye and pushing the lobotomy tool into it. That was one of the more complicated sequences that we had to achieve.”

    In order to create an intimate moment between Ambrosius and The Deep, a toy version of the superhero was placed inside of the water tank that she could wrap a tentacle around.

    “The word that we like to use on this show is ‘grounded,’ and I like to say ‘grounded’ with an asterisk in this day and age because we’re grounded until we get to killing people in the craziest ways. In this case, having someone floating in the air and being ripped in half by two tendrils was all CG.”
    —Stephan Fleet, VFX Supervisor

    Sheep and chickens embark on a violent rampage courtesy of Compound V, with the latter piercing the chest of a bodyguard belonging to Victoria Neuman. “Weirdly, that was one of our more traditional shots,” Fleet states. “What is fun about that one is I asked for real chickens as reference. The chicken flying through his chest is real. It’s our chicken wrangler in a green suit gently tossing a chicken.
    We blended two real plates together with some CG in the middle.” A connection was made with a sci-fi classic. “The sheep kill this bull, and we shot it in this narrow corridor of fencing. When they run, I always equated it to the Trench Run in Star Wars and looked at the sheep as TIE fighters or X-wings coming at them.” The scene was one of the scarier moments for the visual effects team. Fleet explains, “When I read the script, I thought this could be the moment where we jump the shark. For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG. I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”

    The sheep injected with Compound V develop the ability to fly and were shot in an imperfect manner to help ground the scenes.

    Once injected with Compound V, Hugh Campbell Sr. (Simon Pegg) develops the ability to phase through objects, including human beings. “We called it the Bro-nut because his name in the script is Wall Street Bro,” Fleet notes. “That was a complicated motion control shot, repeating the move over and over again. We had to shoot multiple plates of Simon Pegg and the guy in the bed. Special effects and prosthetics created a dummy guy with a hole in his chest with practical blood dripping down. It was meshing it together and getting the timing right in post. On top of that, there was the CG blood immediately around Simon Pegg.” The phasing effect had to avoid appearing as a dissolve. “I had this idea of doing a high-frequency vibration on the X axis, loosely based on how The Flash vibrates through walls. You want everything to have a loose motivation that then helps trigger the visuals.
    We tried not to overcomplicate that because, ultimately, you want something like that to be quick. If you spend too much time on phasing, it can look cheesy. In our case, it was a lot of false walls. Simon Pegg is running into a greenscreen hole which we plug in with a wall or coming out of one. I went off the actor’s action, and we added a light opacity mix with some X-axis shake.”

    Providing a different twist to the fights was the replacement of spurting blood with photoreal rubber duckies during a drug-induced hallucination.

    Homelander (Antony Starr) breaks a mirror, which emphasizes his multiple personality disorder. “The original plan was that special effects was going to pre-break a mirror, and we were going to shoot Antony Starr moving his head doing all of the performances in the different parts of the mirror,” Fleet reveals. “This was all based on a photo that my ex-brother-in-law sent me. He was walking down a street in Glendale, California, came across a broken mirror that someone had thrown out, and took a photo of himself where he had five heads in the mirror. We get there on the day, and I’m realizing that this is really complicated. Antony has to do these five different performances, and we have to deal with infinite mirrors. At the last minute, I said, ‘We have to do this on a clean mirror.’ We did it on a clean mirror and gave Antony different eyelines. The mirror break was all done in post, and we were able to cheat his head slightly and art-direct where the break crosses his chin. Editorial was able to do split screens for the timing of the dialogue.”

    “For the shots where the sheep are still and scream to the camera, Untold Studios did a bunch of R&D and came up with baboon teeth. I tried to keep anything real as much as possible, but, obviously, when sheep are flying, they have to be CG.
    I call it the Battlestar Galactica theory, where I like to shake the camera, overshoot shots and make it sloppy when they’re in the air so you can add motion blur. Comedy also helps sell visual effects.”
    —Stephan Fleet, VFX Supervisor

    Initially, the plan was to use a practical mirror, but creating a digital version proved to be the more effective solution.

    A different spin on the bloodbath occurs during a fight when a drugged Frenchie (Tomer Capone) hallucinates as Kimiko Miyashiro (Karen Fukuhara) goes on a killing spree. “We went back and forth with a lot of different concepts for what this hallucination would be,” Fleet remarks. “When we filmed it, we landed on Frenchie having a synesthesia moment where he’s seeing a lot of abstract colors flying in the air. We started getting into that in post and it wasn’t working. We went back to the rubber duckies, which goes back to the story of him in the bathtub. What’s in the bathtub? Rubber duckies, bubbles and water. There was a lot of physics and logic required to figure out how these rubber duckies could float out of someone’s neck. We decided on bubbles when Kimiko hits people’s heads. At one point, we had water when she got shot, but it wasn’t working, so we killed it. We probably did about 100 different versions. We got really detailed with our rubber duckie modeling because we didn’t want it to look cartoony. That took a long time.”

    Ambrosius, voiced by Tilda Swinton, gets a lot more screentime in Season 4.

    The moment when Splinter (Rob Benedict) splits in two was achieved largely in CG. “Eric threw out the words ‘cellular mitosis’ early on as something he wanted to use,” Fleet states. “We shot Rob Benedict on a greenscreen doing all of the different performances for the clones that pop out. It was a crazy amount of CG work with Houdini and particle and skin effects. We previs’d the sequence so we had specific actions.
    One clone comes out to the right and the other pulls backwards.” What tends to go unnoticed by many is Splinter’s clones setting up for a press conference being held by Firecracker (Valorie Curry). “It’s funny how no one brings up the 22-hour motion control shot that we had to do with Splinter on the stage, which was the most complicated shot!” Fleet observes. “We have this sweeping long shot that brings you into the room and follows Splinter as he carries a container to the stage and hands it off to a clone, and then you reveal five more of them interweaving with each other and interacting with all of these objects. It’s like a minute-long dance. First off, you have to choreograph it. We previs’d it, but then you need to get people to do it. We hired dancers and put different colored armbands on them. The camera is like another performer, and a metronome is going, which enables you to find a pace. That took about eight hours of rehearsal. Then Rob has to watch each one of their performances and mimic it to the beat. When he is handing off a box of cables, it’s to a double who is going to have to be erased and be him on the other side. They have to be almost perfect in their timing and lineup in order to take it over in visual effects and make it work.”
  • Inside the thinking behind Frontify Futures' standout brand identity

    Who knows where branding will go in the future? However, for many of us working in the creative industries, it's our job to know. So it's something we need to start talking about, and Frontify Futures wants to be the platform where that conversation unfolds.
    This ambitious new thought leadership initiative from Frontify brings together an extraordinary coalition of voices—CMOs who've scaled global brands, creative leaders reimagining possibilities, strategy directors pioneering new approaches, and cultural forecasters mapping emerging opportunities—to explore how effectiveness, innovation, and scale will shape tomorrow's brand-building landscape.
    But Frontify Futures isn't just another content platform. Excitingly, from a design perspective, it's also a living experiment in what brand identity can become when technology meets craft, when systems embrace chaos, and when the future itself becomes a design material.
    Endless variation
    What makes Frontify Futures' typography unique isn't just its custom foundation: it's how that foundation enables endless variation and evolution. This was primarily achieved, reveals developer and digital art director Daniel Powell, by building bespoke tools for the project.

    "Rather than rely solely on streamlined tools built for speed and production, we started building our own," he explains. "The first was a node-based design tool that takes our custom Frame and Hairline fonts as a base and uses them as the foundations for our type generator. With it, we can generate unique type variations for each content strand—each article, even—and create both static and animated type, exportable as video or rendered live in the browser."
    Each of these tools included what Daniel calls a "chaos element: a small but intentional glitch in the system. A microstatement about the nature of the future: that it can be anticipated but never fully known. It's our way of keeping gesture alive inside the system."
    One of the clearest examples of this is the colour palette generator. "It samples from a dynamic photo grid tied to a rotating colour wheel that completes one full revolution per year," Daniel explains. "But here's the twist: wind speed and direction in St. Gallen, Switzerland—Frontify's HQ—nudges the wheel unpredictably off-centre. It's a subtle, living mechanic; each article contains a log of the wind data in its code as a kind of Easter Egg."
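    Frontify has not published the generator's internals, but the mechanic Daniel describes can be sketched roughly as follows. The function name `wheel_hue`, the `nudge_scale` factor and the way the wind vector is projected onto the wheel are all assumptions for illustration, not the actual implementation:

    ```python
    import math
    from datetime import date

    def wheel_hue(today: date, wind_speed_ms: float, wind_dir_deg: float,
                  nudge_scale: float = 0.5) -> float:
        """Hue angle in degrees for the palette wheel.

        The wheel completes one full revolution per calendar year; a wind
        reading (speed in m/s, direction in degrees) nudges it off-centre.
        """
        year_start = date(today.year, 1, 1)
        year_days = (date(today.year + 1, 1, 1) - year_start).days
        base = (today - year_start).days / year_days * 360.0
        # Stronger wind pushes the wheel further; direction decides how much
        # of that push lands on the wheel (a crude vector projection).
        nudge = wind_speed_ms * nudge_scale * math.cos(math.radians(wind_dir_deg))
        return (base + nudge) % 360.0
    ```

    On a calm day the wheel sits exactly where the calendar puts it; any breeze in the live feed drifts it off that position, which is what makes each article's palette slightly unrepeatable.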

    Another favourite of Daniel's—yet to be released—is an expanded version of Conway's Game of Life. "It's been running continuously for over a month now, evolving patterns used in one of the content strand headers," he reveals. "The designer becomes a kind of photographer, capturing moments from a petri dish of generative motion."
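    Frontify's expanded version is its own, but the automaton underneath is the classic Game of Life, whose step rule is simple enough to sketch. This set-based version on an unbounded grid is only an illustration of the rules, not Frontify's tool:

    ```python
    from collections import Counter

    def life_step(live: set[tuple[int, int]]) -> set[tuple[int, int]]:
        """Advance one generation of Conway's Game of Life on an unbounded grid."""
        # Count live neighbours for every cell adjacent to a live cell.
        counts = Counter(
            (x + dx, y + dy)
            for x, y in live
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        # A cell lives next generation with exactly 3 neighbours,
        # or with 2 neighbours if it is already alive.
        return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}
    ```

    Iterating `life_step` for a month and sampling frames at arbitrary generations is, in effect, the "photographer" role Daniel describes.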
    Core Philosophy
    In developing this unique identity, two phrases stood out to Daniel as guiding lights from the outset. The first was, 'We will show, not tell.'
    "This became the foundation for how we approached the identity," recalls Daniel. "It had to feel like a playground: open, experimental, and fluid. Not overly precious or prescriptive. A system the Frontify team could truly own, shape, and evolve. A platform, not a final product. A foundation, just as the future is always built on the past."

    The second guiding phrase, pulled directly from Frontify's rebrand materials, felt like "a call to action," says Daniel. "'Gestural and geometric. Human and machine. Art and science.' It's a tension that feels especially relevant in the creative industries today. As technology accelerates, we ask ourselves: how do we still hold onto our craft? What does it mean to be expressive in an increasingly systemised world?"
    Stripped back and skeletal typography
    The identity that Daniel and his team created reflects these themes through typography that literally embodies the platform's core philosophy. "It really started from this idea of the future being built upon the 'foundations' of the past," he explains. "At the time Frontify Futures was being created, Frontify itself was going through a rebrand. With that, they'd started using a new variable typeface called Cranny, a custom cut of Azurio by Narrow Type."
    Daniel's team took Cranny and "pushed it into a stripped-back and almost skeletal take". The result was Cranny-Frame and Cranny-Hairline. "These fonts then served as our base scaffolding," he continues. "They were never seen in design, but instead, we applied decoration to them to produce new typefaces for each content strand, giving the identity the space to grow and allow new ideas and shapes to form."

    As Daniel saw it, the demands on the typeface were pretty simple. "It needed to set an atmosphere. We needed it to feel alive. We wanted it to be something shifting and repositioning. And so, while we have a bunch of static cuts of each base style, we rarely use them; the typefaces you see on the website and social only exist at the moment as a string of parameters to create a general style that we use to create live animating versions of the font generated on the fly."
    In addition to setting the atmosphere, it needed to be extremely flexible and feature live inputs, as a significant part of the branding is about the unpredictability of the future. So Daniel's team built in those aforementioned chaos moments, "where everything from user interaction to live windspeeds can affect the font."
    Design Process
    The process of creating the typefaces is a fascinating one. "We started by working with the custom cut of Azurio from Narrow Type. We then redrew it to take inspiration from how a frame and a hairline could be produced from this original cut. From there, we built a type generation tool that uses them as a base.
    "It's a custom node-based system that lets us really get in there and play with the overlays for everything from grid-sizing, shapes and timing for the animation," he outlines. "We used this tool to design the variants for different content strands. We weren't just designing letterforms; we were designing a comprehensive toolset that could evolve in tandem with the content.
    "That became a big part of the process: designing systems that designers could actually use, not just look at; again, it was a wider conversation and concept around the future and how designers and machines can work together."

    In short, the evolution of the typeface system reflects the platform's broader commitment to continuous growth and adaptation. "The whole idea was to make something open enough to keep building on," Daniel stresses. "We've already got tools in place to generate new weights, shapes and animated variants, and the tool itself still has a ton of unused functionality.
    "I can see that growing as new content strands emerge; we'll keep adapting the type with them," he adds. "It's less about version numbers and more about ongoing movement. The system's alive; that's the point."
    A provocation for the industry
    In this context, the Frontify Futures identity represents more than smart visual branding; it's also a manifesto for how creative systems might evolve in an age of increasing automation and systematisation. By building unpredictability into their tools, embracing the tension between human craft and machine precision, and creating systems that grow and adapt rather than merely scale, Daniel and the Frontify team have created something that feels genuinely forward-looking.
    For creatives grappling with similar questions about the future of their craft, Frontify Futures offers both inspiration and practical demonstration. It shows how brands can remain human while embracing technological capability, how systems can be both consistent and surprising, and how the future itself can become a creative medium.
    This clever approach suggests that the future of branding lies not in choosing between human creativity and systematic efficiency but in finding new ways to make them work together, creating something neither could achieve alone.
    WWW.CREATIVEBOOM.COM
    Inside the thinking behind Frontify Futures' standout brand identity
    Who knows where branding will go in the future? However, for many of us working in the creative industries, it's our job to know. So it's something we need to start talking about, and Frontify Futures wants to be the platform where that conversation unfolds. This ambitious new thought leadership initiative from Frontify brings together an extraordinary coalition of voices—CMOs who've scaled global brands, creative leaders reimagining possibilities, strategy directors pioneering new approaches, and cultural forecasters mapping emerging opportunities—to explore how effectiveness, innovation, and scale will shape tomorrow's brand-building landscape. But Frontify Futures isn't just another content platform. Excitingly, from a design perspective, it's also a living experiment in what brand identity can become when technology meets craft, when systems embrace chaos, and when the future itself becomes a design material. Endless variation What makes Frontify Futures' typography unique isn't just its custom foundation: it's how that foundation enables endless variation and evolution. This was primarily achieved, reveals developer and digital art director Daniel Powell, by building bespoke tools for the project. "Rather than rely solely on streamlined tools built for speed and production, we started building our own," he explains. "The first was a node-based design tool that takes our custom Frame and Hairline fonts as a base and uses them as the foundations for our type generator. With it, we can generate unique type variations for each content strand—each article, even—and create both static and animated type, exportable as video or rendered live in the browser." Each of these tools included what Daniel calls a "chaos element: a small but intentional glitch in the system. A microstatement about the nature of the future: that it can be anticipated but never fully known. It's our way of keeping gesture alive inside the system." 
One of the clearest examples of this is the colour palette generator. "It samples from a dynamic photo grid tied to a rotating colour wheel that completes one full revolution per year," Daniel explains. "But here's the twist: wind speed and direction in St. Gallen, Switzerland—Frontify's HQ—nudges the wheel unpredictably off-centre. It's a subtle, living mechanic; each article contains a log of the wind data in its code as a kind of Easter Egg." Another favourite of Daniel's—yet to be released—is an expanded version of Conway's Game of Life. "It's been running continuously for over a month now, evolving patterns used in one of the content strand headers," he reveals. "The designer becomes a kind of photographer, capturing moments from a petri dish of generative motion." Core Philosophy In developing this unique identity, two phrases stood out to Daniel as guiding lights from the outset. The first was, 'We will show, not tell.' "This became the foundation for how we approached the identity," recalls Daniel. "It had to feel like a playground: open, experimental, and fluid. Not overly precious or prescriptive. A system the Frontify team could truly own, shape, and evolve. A platform, not a final product. A foundation, just as the future is always built on the past." The second guiding phrase, pulled directly from Frontify's rebrand materials, felt like "a call to action," says Daniel. "'Gestural and geometric. Human and machine. Art and science.' It's a tension that feels especially relevant in the creative industries today. As technology accelerates, we ask ourselves: how do we still hold onto our craft? What does it mean to be expressive in an increasingly systemised world?" Stripped back and skeletal typography The identity that Daniel and his team created reflects these themes through typography that literally embodies the platform's core philosophy. It really started from this idea of the past being built upon the 'foundations' of the past," he explains. 
"At the time Frontify Futures was being created, Frontify itself was going through a rebrand. With that, they'd started using a new variable typeface called Cranny, a custom cut of Azurio by Narrow Type." Daniel's team took Cranny and "pushed it into a stripped-back and almost skeletal take". The result was Crany-Frame and Crany-Hairline. "These fonts then served as our base scaffolding," he continues. "They were never seen in design, but instead, we applied decoration them to produce new typefaces for each content strand, giving the identity the space to grow and allow new ideas and shapes to form." As Daniel saw it, the demands on the typeface were pretty simple. "It needed to set an atmosphere. We needed it needed to feel alive. We wanted it to be something shifting and repositioning. And so, while we have a bunch of static cuts of each base style, we rarely use them; the typefaces you see on the website and social only exist at the moment as a string of parameters to create a general style that we use to create live animating versions of the font generated on the fly." In addition to setting the atmosphere, it needed to be extremely flexible and feature live inputs, as a significant part of the branding is about the unpredictability of the future. "So Daniel's team built in those aforementioned "chaos moments where everything from user interaction to live windspeeds can affect the font." Design Process The process of creating the typefaces is a fascinating one. "We started by working with the custom cut of Azurio (Cranny) from Narrow Type. We then redrew it to take inspiration from how a frame and a hairline could be produced from this original cut. From there, we built a type generation tool that uses them as a base. "It's a custom node-based system that lets us really get in there and play with the overlays for everything from grid-sizing, shapes and timing for the animation," he outlines. 
"We used this tool to design the variants for different content strands. We weren't just designing letterforms; we were designing a comprehensive toolset that could evolve in tandem with the content. That became a big part of the process: designing systems that designers could actually use, not just look at. Again, it was a wider conversation and concept around the future and how designers and machines can work together."

In short, the evolution of the typeface system reflects the platform's broader commitment to continuous growth and adaptation. "The whole idea was to make something open enough to keep building on," Daniel stresses. "We've already got tools in place to generate new weights, shapes and animated variants, and the tool itself still has a ton of unused functionality. I can see that growing as new content strands emerge; we'll keep adapting the type with them," he adds. "It's less about version numbers and more about ongoing movement. The system's alive; that's the point."

A provocation for the industry

In this context, the Frontify Futures identity represents more than smart visual branding; it's also a manifesto for how creative systems might evolve in an age of increasing automation and systematisation. By building unpredictability into their tools, embracing the tension between human craft and machine precision, and creating systems that grow and adapt rather than merely scale, Daniel and the Frontify team have created something that feels genuinely forward-looking. For creatives grappling with similar questions about the future of their craft, Frontify Futures offers both inspiration and practical demonstration. It shows how brands can remain human while embracing technological capability, how systems can be both consistent and surprising, and how the future itself can become a creative medium.
This clever approach suggests that the future of branding lies not in choosing between human creativity and systematic efficiency but in finding new ways to make them work together, creating something neither could achieve alone.
  • Meet Martha Swope, the Legendary Broadway Photographer Who Captured Iconic Moments From Hundreds of Productions and Rehearsals

    She spent nearly 40 years taking theater and dance pictures, providing glimpses behind the scenes and creating images that the public couldn’t otherwise access

    Stephanie Rudig

    - Freelance Writer

    June 11, 2025

    Photographer Martha Swope sitting on a floor covered with prints of her photos in 1987
    Andrea Legge / © NYPL

    Martha Swope wanted to be a dancer. She moved from her home state of Texas to New York to attend the School of American Ballet, hoping to start a career in dance. Swope also happened to be an amateur photographer. So, in 1957, a fellow classmate invited her to bring her camera and document rehearsals for a little theater show he was working on. The classmate was director and choreographer Jerome Robbins, and the show was West Side Story.
    One of those rehearsal shots ended up in Life magazine, and Swope quickly started getting professional bookings. It’s notoriously tough to make it on Broadway, but through photography, Swope carved out a career capturing theater and dance. Over the course of nearly four decades, she photographed hundreds more rehearsals, productions and promotional studio shots.

    Unidentified male chorus members dancing during rehearsals for musical West Side Story in 1957

    Martha Swope / © NYPL

    At a time when live performances were not often or easily captured, Swope’s photographs caught the animated moments and distilled the essence of a show into a single image: André De Shields clad in a jumpsuit as the title character in The Wiz, Patti LuPone with her arms raised overhead in Evita, the cast of Cats leaping in feline formations, a close-up of a forlorn Sheryl Lee Ralph in Dreamgirls and the row of dancers obscuring their faces with their headshots in A Chorus Line were all captured by Swope’s camera. She was also the house photographer for the New York City Ballet and the Martha Graham Dance Company and photographed other major dance companies such as the Ailey School.
    Her vision of the stage became fairly ubiquitous, with Playbill reporting that in the late 1970s, two-thirds of Broadway productions were photographed by Swope, meaning her work dominated theater and dance coverage. Carol Rosegg was early in her photography career when she heard that Swope was looking for an assistant. “I didn't frankly even know who she was,” Rosegg says. “Then the press agent who told me said, ‘Pick up any New York Times and you’ll find out.’”
    Swope’s background as a dancer likely equipped her to press the shutter at the exact right moment to capture movement, and to know when everyone on stage was precisely posed. She taught herself photography and early on used a Brownie camera, a simple box model made by Kodak. “She was what she described as ‘a dancer with a Brownie,’” says Barbara Stratyner, a historian of the performing arts who curated exhibitions of Swope’s work at the New York Public Library.

    An ensemble of dancers in rehearsal for the stage production Cats in 1982

    Martha Swope / © NYPL

    “Dance was her first love,” Rosegg says. “She knew everything about dance. She would never use a photo of a dancer whose foot was wrong; the feet had to be perfect.”
    According to Rosegg, once the photo subjects knew she was shooting, “the anxiety level came down a little bit.” They knew that they’d look good in the resulting photos, and they likely trusted her intuition as a fellow dancer. Swope moved with the bearing of a dancer and often stood with her feet in ballet’s fourth position while she shot. She continued to take dance classes throughout her life, including at the prestigious Martha Graham School. Stratyner says, “As Graham got older, [Swope] was, I think, the only person who was allowed to photograph rehearsals, because Graham didn’t want rehearsals shown.”
    Photographic technology and the theater and dance landscapes evolved greatly over the course of Swope’s career. Rosegg points out that at the start of her own career, cameras didn’t even automatically advance the film after each shot. She explains the delicate nature of working with film, saying, “When you were shooting film, you actually had to compose, because you had 35 shots and then you had to change your film.” Swope also worked during a period of changing over from all black-and-white photos to a mixture of black-and-white and color photography. Rosegg notes that simultaneously, Swope would shoot black-and-white, and she herself would shoot color. Looking at Swope’s portfolio is also an examination of increasingly crisp photo production. Advances in photography made shooting in the dark or capturing subjects under blinding stage lights easier, and they allowed for better zooming in from afar.

    Martha Graham rehearses dancer Takako Asakawa and others in Heretic, a dance work choreographed by Graham, in 1986

    Martha Swope / © NYPL

    It’s much more common nowadays to get a look behind the curtain of theater productions via social media. “The theater photographers of today need to supply so much content,” Rosegg says. “We didn’t have any of that, and getting to go backstage was kind of a big deal.”
    Photographers coming to document a rehearsal once might have been seen as an intrusion, but now, as Rosegg puts it, “everybody is desperate for you to come, and if you’re not there, they’re shooting it on their iPhone.”
    Even with exclusive behind-the-scenes access to the hottest tickets in town and the biggest stars of the day, Swope remained unpretentious. She lived and worked in a brownstone with her apartment above her studio, where the film was developed in a closet and the bathroom served as a darkroom. Rosegg recalls that a phone sat in the darkroom so they could be reached while printing, and she would be amazed at the big-name producers and theater glitterati who rang in while she was making prints in an unventilated space.

    From left to right: Paul Winfield, Ruby Dee, Marsha Jackson and Denzel Washington in the stage production Checkmates in 1988

    Martha Swope / © NYPL

    Swope’s approachability extended to how she chose to preserve her work. She originally sold her body of work to Time Life, and, according to Stratyner, she was unhappy with the way the photos became relatively inaccessible. She took back the rights to her collection and donated it to the New York Public Library, where many photos can be accessed by researchers in person, and the entire array of photos is available online to the public in the Digital Collections. Searching “Martha Swope” yields over 50,000 items from more than 800 productions, featuring a huge variety of figures, from a white-suited John Travolta busting a disco move in Saturday Night Fever to Andrew Lloyd Webber with Nancy Reagan at a performance of Phantom of the Opera.
    Swope’s extensive career was recognized in 2004 with a special Tony Award, a Tony Honors for Excellence in Theater, which are given intermittently to notable figures in theater who operate outside of traditional awards categories. She also received a lifetime achievement award from the League of Professional Theater Women in 2007. Though she retired in 1994 and died in 2017, her work still reverberates through dance and Broadway history today. For decades, she captured the fleeting moments of theater that would otherwise never be seen by the public. And her passion was clear and straightforward. As she once told an interviewer: “I’m not interested in what’s going on on my side of the camera. I’m interested in what’s happening on the other side.”

    WWW.SMITHSONIANMAG.COM
  • How a planetarium show discovered a spiral at the edge of our solar system

    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half-mile-long spiral of material drifting along the edge of our solar system.

    “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist.

    Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune and extends as far as one and a half light-years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years. 

    The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?” 

    To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data.

    “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says. 

    The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars.

    “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.”

    She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’” 

    While fine-tuning a simulation of the Oort cloud, a vast expanse of icy material left over from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show.

    More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the solar system’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves. 
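The ingredients listed above (the Sun's gravity, the galaxy's influence, the comets' own motion) can be illustrated with a toy integrator. This is a deliberately minimal sketch, nothing like the Pleiades runs: it integrates a single body, and the vertical tide coefficient is an invented placeholder, included only to show how a galactic-tide term enters the equations of motion alongside the Sun's pull.

```python
import math

# Units: AU, years, solar masses, so GM_sun = 4 * pi^2.
GM_SUN = 4.0 * math.pi ** 2
TIDE = 1e-13  # illustrative vertical galactic-tide coefficient, not a fitted value

def accel(x, y, z):
    """Solar gravity plus a simple vertical tide term; the tide is the
    ingredient that slowly twists distant orbits out of the planetary plane."""
    r = math.sqrt(x * x + y * y + z * z)
    a = -GM_SUN / r ** 3
    return a * x, a * y, a * z - TIDE * z

def integrate(pos, vel, dt, steps):
    """Semi-implicit (symplectic) Euler: stable enough for long orbit demos."""
    x, y, z = pos
    vx, vy, vz = vel
    for _ in range(steps):
        ax, ay, az = accel(x, y, z)
        vx += ax * dt; vy += ay * dt; vz += az * dt
        x += vx * dt; y += vy * dt; z += vz * dt
    return (x, y, z), (vx, vy, vz)
```

In the published work the galactic tide is treated far more carefully (and coupled to the Sun's own galactic orbit); here it is reduced to a single restoring term on z purely to make the structure of the force model visible.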

    In each simulation, the spiral persisted.

    “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. We just needed the visuals.’” 

    An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system.

    As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system.

    “As the galactic tide acts to decouple bodies from the scattered disk it creates a spiral structure in physical space that is roughly 15,000 astronomical units in length,” or around 1.4 trillion miles from one end to the other, the researchers write in a paper that was published in March in the Astrophysical Journal. “The spiral is long-lived and persists in the inner Oort Cloud to the present time.”
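As a quick sanity check on the figures in that quote, using the standard value of about 93 million miles per astronomical unit:

```python
AU_MILES = 92_955_807.3   # miles in one astronomical unit (standard IAU value)

spiral_au = 15_000                    # spiral length reported in the paper, in AU
spiral_miles = spiral_au * AU_MILES   # about 1.4e12, i.e. roughly 1.4 trillion miles
```

So 15,000 AU does indeed come out at around 1.4 trillion miles end to end, matching the researchers' description.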

    “The physics makes sense,” says Faherty. “Scientists, we’re amazing at what we do, but it doesn’t mean we can see everything right away.”

    It helped that the team behind the space show was primed to look for something, says Carter Emmart, the museum’s director of astrovisualization and director of Encounters. Astronomers had described Nesvorný’s model as having “a structure,” which intrigued the team’s artists. “We were also looking for structure so that it wouldn’t just be sort of like a big blob,” he says. “Other models were also revealing this—but they just hadn’t been visualized.”

    The museum’s attempts to simulate nature date back to its first habitat dioramas in the early 1900s, which brought visitors to places that hadn’t yet been captured by color photos, TV, or the web. The planetarium, a night sky simulator for generations of would-be scientists and astronauts, got its start after financier Charles Hayden bought the museum its first Zeiss projector. The planetarium now boasts one of the world’s few Zeiss Mark IX systems.

    Still, these days the star projector is rarely used, Emmart says, now that fulldome laser projectors can turn the old static starfield into 3D video running at 60 frames per second. The Hayden boasts six custom-built Christie projectors, part of what the museum’s former president called “the most advanced planetarium ever attempted.”

     In about 1.3 million years, the star system Gliese 710 is set to pass directly through our Oort cloud, an event visualized in a dramatic scene in ‘Encounters in the Milky Way.’ During its flyby, the two systems will swap icy comets, flinging some out on new paths.

    Emmart recalls how in 1998, when he and other museum leaders were imagining the future of space shows at the Hayden—now with the help of digital projectors and computer graphics—there were questions over how much space they could try to show.

    “We’re talking about these astronomical data sets we could plot to make the galaxy and the stars,” he says. “Of course, we knew that we would have this star projector, but we really wanted to emphasize astrophysics with this dome video system. I was drawing pictures of this just to get our heads around it and noting the tilt of the solar system to the Milky Way is about 60 degrees. And I said, ‘What are we gonna do when we get outside the Milky Way?’

    “Then [planetarium’s director] Neil deGrasse Tyson goes, ‘whoa, whoa, whoa, Carter, we have enough to do. And just plotting the Milky Way, that’s hard enough.’ And I said, ‘well, when we exit the Milky Way and we don’t see any other galaxies, that’s sort of like astronomy in 1920—we thought maybe the entire universe is just a Milky Way.’”

    “And that kind of led to a chaotic discussion about, well, what other data sets are there for this?” Emmart adds.

    The museum worked with astronomer Brent Tully, who had mapped 3,500 galaxies beyond the Milky Way, in collaboration with the National Center for Supercomputing Applications. “That was it,” he says, “and that seemed fantastical.”

    By the time the first planetarium show opened at the museum’s new Rose Center for Earth and Space in 2000, Tully had broadened his survey “to an amazing” 30,000 galaxies. The Sloan Digital Sky Survey followed—it’s now at data release 18—with six million galaxies.

    To build the map of the universe that underlies Encounters, the team also relied on data from the European Space Agency’s space observatory, Gaia. Launched in 2013 and powered down in March of this year, Gaia brought an unprecedented precision to our astronomical map, plotting the distances to 1.7 billion stars. To visualize and render the simulated data, Jon Parker, the museum’s lead technical director, relied on Houdini, a 3D animation tool by Toronto-based SideFX.

    The goal is immersion, “whether it’s in front of the buffalo downstairs, and seeing what those herds were like before we decimated them, to coming in this room and being teleported to space, with an accurate foundation in the science,” Emmart says. “But the art is important, because the art is the way to the soul.” 

    The museum, he adds, is “a testament to wonder. And I think wonder is a gateway to inspiration, and inspiration is a gateway to motivation.”

    3D visuals aren’t just powerful tools for communicating science; they are increasingly crucial for science itself. Software like OpenSpace, an open-source simulation tool developed by the museum, along with the growing availability of high-performance computing, is making it easier to build highly detailed visuals of ever larger and more complex collections of data.

    “Anytime we look, literally, from a different angle at catalogs of astronomical positions, simulations, or exploring the phase space of a complex data set, there is great potential to discover something new,” says Brian R. Kent, an astronomer and director of science communications at the National Radio Astronomy Observatory. “There is also a wealth of astronomical data in archives that can be reanalyzed in new ways, leading to new discoveries.”
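Kent’s point about viewing angles can be illustrated with a toy catalog: the same 3D point set can look like a featureless blob from one projection and an obvious spiral from another. A minimal sketch with synthetic data (all names and values here are illustrative, not from the show’s actual pipeline):

```python
import numpy as np

# Synthetic "catalog": a flat spiral in the x-y plane with scatter along z.
rng = np.random.default_rng(0)
t = rng.uniform(0, 4 * np.pi, 5000)
x = t * np.cos(t)
y = t * np.sin(t)
z = rng.normal(0, 4.0, t.size)
catalog = np.column_stack([x, y, z])

def project(points: np.ndarray, view_axis: int) -> np.ndarray:
    """Drop one coordinate to mimic viewing the catalog along that axis."""
    return np.delete(points, view_axis, axis=1)

edge_on = project(catalog, 1)  # viewed along y: the arms overlap into a blob
face_on = project(catalog, 2)  # viewed along z: the spiral is obvious
```

Plotting `face_on` versus `edge_on` with any 2D scatter tool shows the difference: structure that is invisible in one projection pops out in the other, which is essentially what happened under the planetarium dome.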

    As the instruments grow in size and sophistication, so does the data, and the challenge of understanding it. Like all scientists, astronomers are facing a deluge of data, ranging from gamma rays and X-rays to ultraviolet, optical, infrared, and radio bands.

    Our Oort cloud, a shell of icy bodies that surrounds the solar system and extends one-and-a-half light-years in every direction, is shown in this scene from ‘Encounters in the Milky Way’ along with the Oort clouds of neighboring stars. The more massive the star, the larger its Oort cloud. [Image: © AMNH]

    “New facilities like the Next Generation Very Large Array here at NRAO or the Vera Rubin Observatory and LSST survey project will generate large volumes of data, so astronomers have to get creative with how to analyze it,” says Kent.

    More data—and new instruments—will also be needed to prove the spiral itself is actually there: there’s still no known way to even observe the Oort cloud. 

    Instead, the paper notes, the structure will have to be measured from “detection of a large number of objects” in the radius of the inner Oort cloud or from “thermal emission from small particles in the Oort spiral.” 

    The Vera C. Rubin Observatory, a powerful, U.S.-funded telescope that recently began operation in Chile, could possibly observe individual icy bodies within the cloud. But researchers expect the telescope will likely discover only dozens of these objects, maybe hundreds, not enough to meaningfully visualize any shapes in the Oort cloud. 

    For us, here and now, the 1.4 trillion mile-long spiral will remain confined to the inside of a dark dome across the street from Central Park.
    How a planetarium show discovered a spiral at the edge of our solar system

    If you’ve ever flown through outer space, at least while watching a documentary or a science fiction film, you’ve seen how artists turn astronomical findings into stunning visuals. But in the process of visualizing data for their latest planetarium show, a production team at New York’s American Museum of Natural History made a surprising discovery of their own: a trillion-and-a-half-mile-long spiral of material drifting along the edge of our solar system.

    “So this is a really fun thing that happened,” says Jackie Faherty, the museum’s senior scientist. Last winter, Faherty and her colleagues were beneath the dome of the museum’s Hayden Planetarium, fine-tuning a scene that featured the Oort cloud, the big, thick bubble surrounding our Sun and planets that’s filled with ice and rock and other remnants from the solar system’s infancy. The Oort cloud begins far beyond Neptune, around one and a half light-years from the Sun. It has never been directly observed; its existence is inferred from the behavior of long-period comets entering the inner solar system. The cloud is so expansive that the Voyager spacecraft, our most distant probes, would need another 250 years just to reach its inner boundary; to reach the other side, they would need about 30,000 years.

    The 30-minute show, Encounters in the Milky Way, narrated by Pedro Pascal, guides audiences on a trip through the galaxy across billions of years. For a section about our nascent solar system, the writing team decided “there’s going to be a fly-by” of the Oort cloud, Faherty says. “But what does our Oort cloud look like?”

    To find out, the museum consulted astronomers and turned to David Nesvorný, a scientist at the Southwest Research Institute in San Antonio. He provided his model of the millions of particles believed to make up the Oort cloud, based on extensive observational data. “Everybody said, go talk to Nesvorný. He’s got the best model,” says Faherty. And “everybody told us, ‘There’s structure in the model,’ so we were kind of set up to look for stuff,” she says.

    The museum’s technical team began using Nesvorný’s model to simulate how the cloud evolved over time. Later, as the team projected versions of the fly-by scene into the dome, with the camera looking back at the Oort cloud, they saw a familiar shape, one that appears in galaxies, Saturn’s rings, and disks around young stars. “We’re flying away from the Oort cloud and out pops this spiral, a spiral shape to the outside of our solar system,” Faherty marveled. “A huge structure, millions and millions of particles.” She emailed Nesvorný to ask for “more particles,” with a render of the scene attached. “We noticed the spiral of course,” she wrote. “And then he writes me back: ‘what are you talking about, a spiral?’”

    While fine-tuning a simulation of the Oort cloud, a vast expanse of icy material left over from the birth of our Sun, the ‘Encounters in the Milky Way’ production team noticed a very clear shape: a structure made of billions of comets and shaped like a spiral-armed galaxy, seen here in a scene from the final Space Show. [Image: © AMNH]

    More simulations ensued, this time on Pleiades, a powerful NASA supercomputer. In high-performance computer simulations spanning 4.6 billion years, starting from the solar system’s earliest days, the researchers visualized how the initial icy and rocky ingredients of the Oort cloud began circling the Sun, in the elliptical orbits that are thought to give the cloud its rough disc shape. The simulations also incorporated the physics of the Sun’s gravitational pull, the influences from our Milky Way galaxy, and the movements of the comets themselves. In each simulation, the spiral persisted. “No one has ever seen the Oort structure like that before,” says Faherty. Nesvorný “has a great quote about this: ‘The math was all there. We just needed the visuals.’”

    An illustration of the Kuiper Belt and Oort Cloud in relation to our solar system. [Image: NASA]

    As the Oort cloud grew with the early solar system, Nesvorný and his colleagues hypothesize that the galactic tide, or the gravitational force from the Milky Way, disrupted the orbits of some comets. Although the Sun pulls these objects inward, the galaxy’s gravity appears to have twisted part of the Oort cloud outward, forming a spiral tilted roughly 30 degrees from the plane of the solar system.
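The simulations described in the article combined the Sun’s gravity, the galactic tide, and the comets’ own motion. A heavily simplified sketch of that kind of orbit integration (toy units and coefficients, not the paper’s actual model):

```python
import numpy as np

# Toy version of the physics the Pleiades runs combined: solar gravity plus
# a crude vertical galactic-tide term. All values are illustrative.
GM_SUN = 4 * np.pi**2   # AU^3/yr^2 (G * M_sun with distances in AU, time in yr)
TIDE = 1e-15            # toy vertical tide coefficient, 1/yr^2

def acceleration(r: np.ndarray) -> np.ndarray:
    a = -GM_SUN * r / np.linalg.norm(r) ** 3  # the Sun's inward pull
    a[2] -= TIDE * r[2]                       # restoring vertical galactic tide
    return a

def leapfrog(r, v, dt, steps):
    """Kick-drift-kick integrator; returns the comet's trajectory."""
    traj = np.empty((steps, 3))
    a = acceleration(r)
    for i in range(steps):
        v = v + 0.5 * dt * a
        r = r + dt * v
        a = acceleration(r)
        v = v + 0.5 * dt * a
        traj[i] = r
    return traj

# One comet on a wide, slightly inclined, near-circular orbit (toy numbers).
r0 = np.array([10_000.0, 0.0, 500.0])                 # AU
v0 = np.array([0.0, np.sqrt(GM_SUN / 10_000), 0.0])   # AU/yr
path = leapfrog(r0, v0, dt=1_000.0, steps=2_000)      # ~2 Myr of evolution
```

The real model evolves millions of such particles for 4.6 billion years with a full three-dimensional tide; run over an ensemble and rendered in physical space, orbits decoupled by the tide are what trace out the spiral.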
    WWW.FASTCOMPANY.COM
  • The Drinking Fountain of La Arboleda by Luis Barragán: Water, Memory, and Geometry

    Drinking Fountain of La Arboleda | 1970s Photograph
    Luis Barragán’s work is often celebrated for its profound dialogue between form, memory, and landscape. In the Drinking Fountain of La Arboleda, Barragán channels these core principles into a singular architectural gesture. Situated at the culmination of the Paseo de los Gigantes, this fountain transcends utilitarian function to become a space of contemplation and poetic reflection.

    Drinking Fountain of La Arboleda Technical Information

    Architects: Luis Barragán
    Location: Avenida Paseo de los Gigantes, Las Arboledas, Mexico
    Height: 14.6 meters
    Width: 10.4 meters
    Project Years: 1960s
    Plans by: Enrique Delgado Camara

    In Las Arboledas I had the pleasure of building a large rectangular pond among eucalyptus trees; however, while doing so, I thought of Persian gardens, I also thought of De Chirico, I also thought that water is a mirror, and I liked that it reflected the branches of the trees. You know, popular architecture has always impressed me because it is pure truth and because the spaces that occur in plazas, in porticos, in courtyards, are always given with generosity.
    – Luis Barragán

    Drinking Fountain of La Arboleda Photographs

    Drinking Fountain of La Arboleda | 1970s Photograph

    Spatial Composition and Geometric Manipulation
    The project extends Barragán’s broader explorations in Las Arboledas and Los Clubes, developments marked by an intimate relationship with nature and a restrained formal language. Here, water becomes material and metaphor, shaping a spatial experience that is as much about the mind as the body.
    The Drinking Fountain of La Arboleda is defined by the dynamic interplay of two elements: a towering white wall and a long, linear water trough. The wall, rising to a height of 14.6 meters, asserts its presence in the landscape as a vertical marker. It competes with, yet does not overshadow, the surrounding eucalyptus trees. The water trough, measuring 44 meters in length, 2.55 meters in width, and 0.67 meters in height, extends along the path in a measured horizontal counterpoint.
    This juxtaposition of vertical and horizontal geometries establishes a composition of duality. The white wall commands attention from afar, while the dark basin of water, offset to the side, quietly draws in the viewer’s gaze. The deliberate misalignment of these two forms prevents a static symmetry, generating a subtle sense of movement and tension within the space.
    Barragán’s manipulation of circulation further reinforces this dynamic quality. Rather than a direct approach, entry to the plaza is orchestrated through a series of turns. These indirect paths obscure the view and gradually reveal the fountain, heightening the sense of arrival and emphasizing the experiential choreography of the approach.
    Materiality and Sensory Qualities
    Material choices are critical in the fountain’s ability to evoke stillness and dynamism. The white stucco of the wall acts as a canvas for the interplay of light and shadow, particularly as the sun filters through the towering eucalyptus canopy. This shifting luminosity imbues the space with a living quality, constantly animated by the rhythms of the day.
    The basin of the fountain is constructed from dark anthracite, lending the water a reflective depth that absorbs and mirrors the surrounding environment. The edge of the water, defined by precisely cut, sharp-edged walls, creates an illusion of the water as a freestanding volume. This interplay of light, shadow, and reflection intensifies the perception of depth, dissolving the boundary between container and contained.
    The gentle sound of water flowing over the basin’s edge adds a sonic dimension to the experience. It serves as a subtle counterpoint to the plaza’s otherwise hushed atmosphere, enhancing the sensory richness without disrupting the meditative calm.
    Drinking Fountain of La Arboleda Cultural Resonance
    In this project, Barragán evokes a memory of rural Mexico that resonates with personal nostalgia and collective cultural imagery. The trough recalls the water basins of his childhood, echoing the hacienda landscapes and the enduring significance of water in Mexican life. Yet, by abstracting these elements into minimalist forms, he situates them within a modern architectural discourse that transcends mere historicism.
    Barragán’s insistence on the evocative power of space is evident in every aspect of the Drinking Fountain. It is a site of transition, marking the end of the linear paseo while simultaneously inviting introspection and pause. The project’s restrained materiality and precise spatial articulation distill Barragán’s belief in architecture as a vehicle for personal reflection and cultural continuity.
    His 1980 Pritzker Prize acceptance speech, in which he described his enduring fascination with water and the memories of fountains and acequias, underscores this deep personal connection. The Drinking Fountain of La Arboleda can be read as an architectural meditation on that theme. This work bridges the abstraction of modernism with the rich, elemental forces of the Mexican landscape.
    Drinking Fountain of La Arboleda Plans

    Floor Plan | © Enrique Delgado Camara

    Axonometric View | © Enrique Delgado Camara
    Drinking Fountain of La Arboleda Image Gallery

    About Luis Barragán
    Luis Barragán (1902–1988) was a Mexican architect renowned for his masterful integration of light, color, and landscape into architecture. His work blends modernist abstraction with deeply rooted Mexican traditions, crafting spaces that evoke memory, contemplation, and poetic resonance.
    Credits and Additional Notes

    Water Trough (Bebedero) Length: 44 meters
    Water Trough (Bebedero) Width: 2.55 meters
    Water Trough (Bebedero) Height: 0.67 meters
    Material: Anthracite-colored stone (dark tone to enhance reflections)

    Delgado Cámara, Enrique. La Geometría del Agua: Mecanismos Arquitectónicos de Manipulación Espacial. Enrique Delgado Cámara, 2024.
    Ambasz, Emilio. The Architecture of Luis Barragán. Museum of Modern Art, New York, 1976.
    The Drinking Fountain of La Arboleda by Luis Barragán: Water, Memory, and Geometry
    Luis Barragán’s work is often celebrated for its profound dialogue between form, memory, and landscape. In the Drinking Fountain of La Arboleda, Barragán channels these core principles into a singular architectural gesture. Situated at the culmination of the Paseo de los Gigantes, this fountain transcends utilitarian function to become a space of contemplation and poetic reflection.
    Drinking Fountain of La Arboleda Technical Information
    Architect: Luis Barragán
    Location: Avenida Paseo de los Gigantes, Las Arboledas, Mexico
    Height: 14.6 meters
    Width: 10.4 meters
    Project Years: 1960s
    Plans by: Enrique Delgado Camara
    “In Las Arboledas I had the pleasure of building a large rectangular pond among eucalyptus trees; however, while doing so, I thought of Persian gardens, I also thought of De Chirico, I also thought that water is a mirror, and I liked that it reflected the branches of the trees. You know, popular architecture has always impressed me because it is pure truth and because the spaces that occur in plazas, in porticos, in courtyards, are always given with generosity.” – Luis Barragán
    Drinking Fountain of La Arboleda Photographs
    Drinking Fountain of La Arboleda (Bebedero) | 1970s Photograph
    Spatial Composition and Geometric Manipulation
    The project extends Barragán’s broader explorations in Las Arboledas and Los Clubes, developments marked by an intimate relationship with nature and a restrained formal language. Here, water becomes material and metaphor, shaping a spatial experience that is as much about the mind as the body.
  • The Invisible Visual Effects Secrets of ‘Severance’ with ILM’s Eric Leven

    ILM teams with Ben Stiller and Apple TV+ to bring thousands of seamless visual effects shots to the hit drama’s second season.
    By Clayton Sandell
    There are mysterious and important secrets to be uncovered in the second season of the wildly popular Apple TV+ series Severance.
    About 3,500 of them are hiding in plain sight.
    That’s roughly the number of visual effects shots helping tell the Severance story over 10 gripping episodes in the latest season, a collaborative effort led by Industrial Light & Magic.
    ILM’s Eric Leven served as the Severance season two production visual effects supervisor. We asked him to help pull back the curtain on some of the show’s impressive digital artistry that most viewers will probably never notice.
    “This is the first show I’ve ever done where it’s nothing but invisible effects,” Leven tells ILM.com. “It’s a really different calculus because nobody talks about them. And if you’ve done them well, they are invisible to the naked eye.”
    With so many season two shots to choose from, Leven helped us narrow down a list of his favorite visual effects sequences to five. Before we dig in, a word of caution: this article contains plot spoilers for Severance.
    Severance tells the story of Mark Scout, department chief of the secretive Severed Floor located in the basement level of Lumon Industries, a multinational biotech corporation. Mark S., as he’s known to his co-workers, heads up Macrodata Refinement, a department where employees help categorize numbers without knowing the true purpose of their work.
    Mark and his team – Helly R., Dylan G., and Irving B. – have all undergone a surgical procedure to “sever” their personal lives from their work lives. The chip embedded in their brains effectively creates two personalities that are sometimes at odds: an “Innie” during Lumon office hours and an “Outie” at home.
    1. The Running Man
    The season one finale ends on a major cliffhanger. Mark S. learns that his Outie’s wife, Gemma – believed killed in a car crash years ago – is actually alive somewhere inside the Lumon complex. Season two opens with Mark S. arriving at the Severed Floor in a desperate search for Gemma, whom he knows only as her Innie persona, Ms. Casey.
    The fast-paced sequence is designed to look like a single, two-minute shot. It begins with the camera making a series of rapid and elaborate moves around a frantic Mark S. as he steps out of the elevator, into the Severed Floor lobby, and begins running through the hallways.
    “The nice thing about that sequence was that everyone knew it was going to be difficult and challenging,” Leven says, adding that executive producer and Episode 201 director Ben Stiller began by mapping out the hallway run with his team. Leven recommended a previsualization sequence – provided by The Third Floor – to help the filmmakers refine their plan before cameras rolled.
    “While prevising it, we didn’t worry about how we would actually photograph anything. It was just, ‘These are the visuals we want to capture,’” Leven says. “‘What does it look like for this guy to run down this hallway for two minutes? We’ll figure out how to shoot it later.’”
    The previs process helped determine how best to shoot the sequence, and also informed which parts of the soundstage set would have to be digitally replaced. The first shot was captured by a camera mounted on a Bolt X Cinebot motion-control arm provided by The Garage production company. The size of the motion-control setup, however, meant it could not fit in the confined space of an elevator or the existing hallways.
    “We couldn’t actually shoot in the elevator,” Leven says. “The whole elevator section of the set was removed and was replaced with computer graphics.” In addition to the elevator, ILM artists replaced portions of the floor, furniture, and an entire lobby wall, even adding a reflection of Adam Scott into the elevator doors.
    As Scott begins running, he’s picked up by a second camera mounted on a more compact, stabilized gimbal that allows the operator to quickly run behind and sometimes in front of the actor as he darts down different hallways. ILM seamlessly combined the first two Mark S. plates in a 2D composite.
    “Part of that is the magic of the artists at ILM who are doing that blend. But I have to give credit to Adam Scott because he ran the same way in both cameras without really being instructed,” says Leven. “Lucky for us, he led with the same foot. He used the same arm. I remember seeing it on the set, and I did a quick-and-dirty blend right there and thought, ‘Oh my gosh, this is going to work.’ So it was really nice.”
    The action continues at a frenetic pace, ultimately combining ten different shots to complete the sequence.
    “We didn’t want the very standard sleight of hand that you’ve seen a lot where you do a wipe across the white hallway,” Leven explains. “We tried to vary that as much as possible because we didn’t want to give away the gag. So, there are times when the camera will wipe across a hallway, and it’s not a computer graphics wipe. We’d hide the wipe somewhere else.”
    A slightly more complicated illusion comes as the camera sweeps around Mark S. from back to front as he barrels down another long hallway. “There was no way to get the camera to spin around Mark while he is running because there’s physically not enough room for the camera there,” says Leven.
    To capture the shot, Adam Scott ran on a treadmill placed on a green screen stage as the camera maneuvered around him. At that point, the entire hallway environment is made with computer graphics. Artists even added a few extra frames of the actor to help connect one shot to the next, selling the illusion of a single continuous take. “We painted in a bit of Adam Scott running around the corner. So if you freeze and look through it, you’ll see a bit of his heel. He never completely clears the frame,” Leven points out.
    Leven says ILM also provided Ben Stiller with options when it came to digitally changing up the look of Lumon’s sterile hallways: sometimes adding extra doors, vents, or even switching door handles. “I think Ben was very excited about having this opportunity,” says Leven. “He had never had a complete, fully computer graphics version of these hallways before. And now he was able to do things that he was never able to do in season one.”
    2. Let it Snow
    The MDR team – Mark, Helly, Dylan, and Irving – unexpectedly find themselves in the snowy wilderness as part of a two-day Lumon Outdoor Retreat and Team-Building Occurrence, or ORTBO.
    Exterior scenes were shot on location at Minnewaska State Park Preserve in New York. Throughout the ORTBO sequence, ILM performed substantial environment enhancements, making trees and landscapes appear far snowier than they were during the shoot. “It’s really nice to get the actors out there in the cold and see their breath,” Leven says. “It just wasn’t snowy during the shoot. Nearly every exterior shot was either replaced or enhanced with snow.”
    For a shot of Irving standing on a vast frozen lake, for example, virtually every element in the location plate – including an unfrozen lake, mountains, and trees behind actor John Turturro – was swapped out for a CG environment. Wide shots of a steep, rocky wall Irving must scale to reach his co-workers were also completely digital.
    Eventually, the MDR team discovers a waterfall that marks their arrival at a place called Woe’s Hollow. The location – the state park’s real-life Awosting Falls – also got extensive winter upgrades from ILM, including much more snow covering the ground and trees, an ice-covered pond, and hundreds of icicles clinging to the rocky walls. “To make it fit in the world of Severance, there’s a ton of work that has to happen,” Leven tells ILM.com.
    3. Welcome to Lumon
    The historic Bell Labs office complex, now known as Bell Works in Holmdel Township, New Jersey, stands in as the fictional Lumon Industries headquarters building.
    Exterior shots often underwent a significant digital metamorphosis, with artists transforming areas of green grass into snow-covered terrain, inserting a CG water tower, and rendering hundreds of 1980s-era cars to fill the parking lot.
    “We’re always adding cars, we’re always adding snow. We’re changing, subtly, the shape and the layout of the design,” says Leven. “We’re seeing new angles that we’ve never seen before. On the roof of Lumon, for example, the air conditioning units are specifically designed and created with computer graphics.”
    In real life, the complex is surrounded by dozens of houses, requiring the digital erasure of entire neighborhoods. “All of that is taken out,” Leven explains. “CG trees are put in, and new mountains are put in the background.”
    Episodes 202 and 203 feature several night scenes shot from outside the building looking in. In one sequence, a camera drone flying outside captured a long tracking shot of Helena Eagan making her way down a glass-enclosed walkway. The building’s atrium can be seen behind her, complete with a massive wall sculpture depicting company founder Kier Eagan.
    “We had to put the Kier sculpture in with the special lighting,” Leven reveals. “The entire atrium was computer graphics.” Artists completed the shot by adding CG reflections of the snowy parking lot to the side of the highly reflective building.
    “We have to replace what’s in the reflections because the real reflection is a parking lot with no snow or a parking lot with no cars,” explains Leven. “We’re often replacing all kinds of stuff that you wouldn’t think would need to be replaced.”
    Another nighttime scene shot from outside the building features Helena in a conference room overlooking the Lumon parking lot, which sits empty except for Mr. Milchick riding in on his motorcycle.
    “The top story, where she is standing, was practical,” says Leven, noting the shot was also captured using a drone hovering outside the window. “The second story below her was all computer graphics. Everything other than the building is computer graphics. They did shoot a motorcycle on location, getting as much practical reference as possible, but then it had to be digitally replaced after the fact to make it work with the rest of the shot.”
    4. Time in Motion
    Episode seven reveals that MDR’s progress is being monitored by four doppelgänger-ish observers in a control room one floor below – a reveal accomplished via a complex move that has the camera traveling downward through a mass of data cables.
    “They built an oversize cable run, and they shot with small probe lenses. Visual effects helped by blending several plates together,” explains Leven. “It was a collaboration between many different departments, which was really nice. Visual effects helped with stuff that just couldn’t be shot for real. For example, when the camera exits the thin holes of the metal grate at the bottom of the floor, that grate is computer graphics.”
    The sequence continues with a sweeping motion-control time-lapse shot that travels around the control-room observers in a spiral pattern, a feat pulled off with an ingenious mix of technical innovation and old-school sleight of hand.
    A previs sequence from The Third Floor laid out the camera move, but because the Bolt arm motion-control rig could only travel on a straight track and cover roughly one-quarter of the required distance, The Garage came up with a way to break the shot into multiple passes. The passes would later be stitched together into one seemingly uninterrupted movement.
    The symmetrical set design – including the four identical workstations – helped complete the illusion, along with a clever solution that kept the four actors in the correct position relative to the camera.
    “The camera would basically get to the end of the track,” Leven explains. “Then everybody would switch positions 90 degrees. Everyone would get out of their chairs and move. The camera would go back to one, and it would look like one continuous move around in a circle because the room is perfectly symmetrical, and everything in it is perfectly symmetrical. We were able to move the actors, and it looks like the camera was going all the way around the room.”
    The final motion-control move switches from time-lapse back to real time as the camera passes by a workstation and reveals Mr. Drummondand Dr. Mauerstanding behind it. Leven notes that each pass was completed with just one take.
    5. Mark vs. Mark
    The Severance season two finale begins with an increasingly tense conversation between Innie Mark and Outie Mark, as the two personas use a handheld video camera to send recorded messages back and forth. Their encounter takes place at night in a Lumon birthing cabin equipped with a severance threshold that allows Mark S. to become Mark Scout each time he steps outside and onto the balcony.
    The cabin set was built on a soundstage at York Studios in the Bronx, New York. The balcony section consisted of the snowy floor, two chairs, and a railing, all surrounded by a blue screen background. Everything else was up to ILM to create.
    “It was nice to have Ben’s trust that we could just do it,” Leven remembers. “He said, ‘Hey, you’re just going to make this look great, right?’ We said, ‘Yeah, no problem.’”
    Artists filled in the scene with CG water, mountains, and moonlight to match the on-set lighting and of course, more snow. As Mark Scout steps onto the balcony, the camera pulls back to a wide shot, revealing the cabin’s full exterior. “They built a part of the exterior of the set. But everything other than the windows, even the railing, was digitally replaced,” Leven says.
    Bonus: Marching Band Magic
    Finally, our bonus visual effects shot appears roughly halfway through the season finale. To celebrate Mark S. completing the Cold Harbor file, Mr. Milchick orders up a marching band from Lumon’s Choreography and Merriment department. Band members pour into MDR, but Leven says roughly 15 to 20 shots required adding a few more digital duplicates. “They wanted it to look like MDR was filled with band members. And for several of the shots there were holes in there. It just didn’t feel full enough,” he says.
    In a shot featuring a God’s-eye view of MDR, band members hold dozens of white cards above their heads, forming a giant illustration of a smiling Mark S. with text that reads “100%.”
    “For the top shot, we had to find a different stage because the MDR ceiling is only about eight feet tall,” recalls Leven. “And Ben really pushed to have it done practically, which I think was the right call because you’ve already got the band members, you’ve made the costumes, you’ve got the instruments. Let’s find a place to shoot it.”
    To get the high shot, the production team set up on an empty soundstage, placing signature MDR-green carpet on the floor. A simple foam core mock-up of the team’s desks occupied the center of the frame, with the finished CG versions added later.
    Even without the restraints of the practical MDR walls and ceiling, the camera could only get enough height to capture about 30 band members in the shot. So the scene was digitally expanded, with artists adding more green carpet, CG walls, and about 50 more band members.
    “We painted in new band members, extracting what we could from the practical plate,” Leven says. “We moved them around; we added more, just to make it look as full as Ben wanted.” Every single white card in the shot, Leven points out, is completely digital.
    A Mysterious and Important Collaboration
    With fans now fiercely debating the many twists and turns of Severance season two, Leven is quick to credit ILM’s two main visual effects collaborators: east side effects and Mango FX INC, as well as ILM studios and artists around the globe, including San Francisco, Vancouver, Singapore, Sydney, and Mumbai.
    Leven also believes Severance ultimately benefited from a successful creative partnership between ILM and Ben Stiller.
    “This one clicked so well, and it really made a difference on the show,” Leven says. “I think we both had the same sort of visual shorthand in terms of what we wanted things to look like. One of the things I love about working with Ben is that he’s obviously grounded in reality. He wants to shoot as much stuff real as possible, but then sometimes there’s a shot that will either come to him late or he just knows is impractical to shoot. And he knows that ILM can deliver it.”

    Clayton Sandell is a Star Wars author and enthusiast, TV storyteller, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell), Bluesky (@claytonsandell.com), or X (@Clayton_Sandell).
    WWW.ILM.COM
    The Invisible Visual Effects Secrets of ‘Severance’ with ILM’s Eric Leven
    ILM teams with Ben Stiller and Apple TV+ to bring thousands of seamless visual effects shots to the hit drama’s second season. By Clayton Sandell There are mysterious and important secrets to be uncovered in the second season of the wildly popular Apple TV+ series Severance (2022-present). About 3,500 of them are hiding in plain sight. That’s roughly the number of visual effects shots helping tell the Severance story over 10 gripping episodes in the latest season, a collaborative effort led by Industrial Light & Magic. ILM’s Eric Leven served as the Severance season two production visual effects supervisor. We asked him to help pull back the curtain on some of the show’s impressive digital artistry that most viewers will probably never notice. “This is the first show I’ve ever done where it’s nothing but invisible effects,” Leven tells ILM.com. “It’s a really different calculus because nobody talks about them. And if you’ve done them well, they are invisible to the naked eye.” With so many season two shots to choose from, Leven helped us narrow down a list of his favorite visual effects sequences to five. (As a bonus, we’ll also dive into an iconic season finale shot featuring the Mr. Milchick-led marching band.) Before we dig in, a word of caution. This article contains plot spoilers for Severance. (And in case you’re already wondering: No, the goats are not computer-graphics.) Severance tells the story of Mark Scout (Adam Scott), department chief of the secretive Severed Floor located in the basement level of Lumon Industries, a multinational biotech corporation. Mark S., as he’s known to his co-workers, heads up Macrodata Refinement (MDR), a department where employees help categorize numbers without knowing the true purpose of their work.  Mark and his team – Helly R. (Britt Lower), Dylan G. (Zach Cherry), and Irving B. (John Turturro), have all undergone a surgical procedure to “sever” their personal lives from their work lives. 
The chip embedded in their brains effectively creates two personalities that are sometimes at odds: an “Innie” during Lumon office hours and an “Outie” at home. “This is the first show I’ve ever done where it’s nothing but invisible effects. It’s a really different calculus because nobody talks about them. And if you’ve done them well, they are invisible to the naked eye.”Eric Leven 1. The Running Man (Episode 201: “Hello, Ms. Cobel”) The season one finale ends on a major cliffhanger. Mark S. learns that his Outie’s wife, Gemma – believed killed in a car crash years ago – is actually alive somewhere inside the Lumon complex. Season two opens with Mark S. arriving at the Severed Floor in a desperate search for Gemma, who he only knows as her Innie persona, Ms. Casey. The fast-paced sequence is designed to look like a single, two-minute shot. It begins with the camera making a series of rapid and elaborate moves around a frantic Mark S. as he steps out of the elevator, into the Severed Floor lobby, and begins running through the hallways. “The nice thing about that sequence was that everyone knew it was going to be difficult and challenging,” Leven says, adding that executive producer and Episode 201 director, Ben Stiller, began by mapping out the hallway run with his team. Leven recommended that a previsualization sequence – provided by The Third Floor – would help the filmmakers refine their plan before cameras rolled. “While prevising it, we didn’t worry about how we would actually photograph anything. It was just, ‘These are the visuals we want to capture,’” Leven says. “‘What does it look like for this guy to run down this hallway for two minutes? We’ll figure out how to shoot it later.’” The previs process helped determine how best to shoot the sequence, and also informed which parts of the soundstage set would have to be digitally replaced. 
The first shot was captured by a camera mounted on a Bolt X Cinebot motion-control arm provided by The Garage production company. The size of the motion-control setup, however, meant it could not fit in the confined space of an elevator or the existing hallways. “We couldn’t actually shoot in the elevator,” Leven says. “The whole elevator section of the set was removed and was replaced with computer graphics [CG].” In addition to the elevator, ILM artists replaced portions of the floor, furniture, and an entire lobby wall, even adding a reflection of Adam Scott into the elevator doors. As Scott begins running, he’s picked up by a second camera mounted on a more compact, stabilized gimbal that allows the operator to quickly run behind and sometimes in front of the actor as he darts down different hallways. ILM seamlessly combined the first two Mark S. plates in a 2D composite. “Part of that is the magic of the artists at ILM who are doing that blend. But I have to give credit to Adam Scott because he ran the same way in both cameras without really being instructed,” says Leven. “Lucky for us, he led with the same foot. He used the same arm. I remember seeing it on the set, and I did a quick-and-dirty blend right there and thought, ‘Oh my gosh, this is going to work.’ So it was really nice.” The action continues at a frenetic pace, ultimately combining ten different shots to complete the sequence. “We didn’t want the very standard sleight of hand that you’ve seen a lot where you do a wipe across the white hallway,” Leven explains. “We tried to vary that as much as possible because we didn’t want to give away the gag. So, there are times when the camera will wipe across a hallway, and it’s not a computer graphics wipe. We’d hide the wipe somewhere else.” A slightly more complicated illusion comes as the camera sweeps around Mark S. from back to front as he barrels down another long hallway. 
“There was no way to get the camera to spin around Mark while he is running because there’s physically not enough room for the camera there,” says Leven. To capture the shot, Adam Scott ran on a treadmill placed on a green screen stage as the camera maneuvered around him. At that point, the entire hallway environment is made with computer graphics. Artists even added a few extra frames of the actor to help connect one shot to the next, selling the illusion of a single continuous take. “We painted in a bit of Adam Scott running around the corner. So if you freeze and look through it, you’ll see a bit of his heel. He never completely clears the frame,” Leven points out. Leven says ILM also provided Ben Stiller with options when it came to digitally changing up the look of Lumon’s sterile hallways: sometimes adding extra doors, vents, or even switching door handles. “I think Ben was very excited about having this opportunity,” says Leven. “He had never had a complete, fully computer graphics version of these hallways before. And now he was able to do things that he was never able to do in season one.” (Credit: Apple TV+). 2. Let it Snow (Episode 204: “Woe’s Hollow”) The MDR team – Mark, Helly, Dylan, and Irving – unexpectedly find themselves in the snowy wilderness as part of a two-day Lumon Outdoor Retreat and Team-Building Occurrence, or ORTBO.  Exterior scenes were shot on location at Minnewaska State Park Preserve in New York. Throughout the ORTBO sequence, ILM performed substantial environment enhancements, making trees and landscapes appear far snowier than they were during the shoot. “It’s really nice to get the actors out there in the cold and see their breath,” Leven says. “It just wasn’t snowy during the shoot. 
Nearly every exterior shot was either replaced or enhanced with snow.” For a shot of Irving standing on a vast frozen lake, for example, virtually every element in the location plate – including an unfrozen lake, mountains, and trees behind actor John Turturro – was swapped out for a CG environment. Wide shots of a steep, rocky wall Irving must scale to reach his co-workers were also completely digital. Eventually, the MDR team discovers a waterfall that marks their arrival at a place called Woe’s Hollow. The location – the state park’s real-life Awosting Falls – also got extensive winter upgrades from ILM, including much more snow covering the ground and trees, an ice-covered pond, and hundreds of icicles clinging to the rocky walls. “To make it fit in the world of Severance, there’s a ton of work that has to happen,” Leven tells ILM.com. (Credit: Apple TV+). 3. Welcome to Lumon (Episode 202: “Goodbye, Mrs. Selvig” & Episode 203: “Who is Alive?”) The historic Bell Labs office complex, now known as Bell Works in Holmdel Township, New Jersey, stands in as the fictional Lumon Industries headquarters building. Exterior shots often underwent a significant digital metamorphosis, with artists transforming areas of green grass into snow-covered terrain, inserting a CG water tower, and rendering hundreds of 1980s-era cars to fill the parking lot. “We’re always adding cars, we’re always adding snow. We’re changing, subtly, the shape and the layout of the design,” says Leven. “We’re seeing new angles that we’ve never seen before. On the roof of Lumon, for example, the air conditioning units are specifically designed and created with computer graphics.” In real life, the complex is surrounded by dozens of houses, requiring the digital erasure of entire neighborhoods. “All of that is taken out,” Leven explains. “CG trees are put in, and new mountains are put in the background.” Episodes 202 and 203 feature several night scenes shot from outside the building looking in. 
In one sequence, a camera drone flying outside captured a long tracking shot of Helena Eagan (Helly R.’s Outie) making her way down a glass-enclosed walkway. The building’s atrium can be seen behind her, complete with a massive wall sculpture depicting company founder Kier Eagan. “We had to put the Kier sculpture in with the special lighting,” Leven reveals. “The entire atrium was computer graphics.” Artists completed the shot by adding CG reflections of the snowy parking lot to the side of the highly reflective building. “We have to replace what’s in the reflections because the real reflection is a parking lot with no snow or a parking lot with no cars,” explains Leven. “We’re often replacing all kinds of stuff that you wouldn’t think would need to be replaced.” Another nighttime scene shot from outside the building features Helena in a conference room overlooking the Lumon parking lot, which sits empty except for Mr. Milchick (Tramell Tillman) riding in on his motorcycle. “The top story, where she is standing, was practical,” says Leven, noting the shot was also captured using a drone hovering outside the window. “The second story below her was all computer graphics. Everything other than the building is computer graphics. They did shoot a motorcycle on location, getting as much practical reference as possible, but then it had to be digitally replaced after the fact to make it work with the rest of the shot.” (Credit: Apple TV+). 4. Time in Motion (Episode 207: “Chikhai Bardo”) Episode seven reveals that MDR’s progress is being monitored by four dopplegang-ish observers in a control room one floor below, revealed via a complex move that has the camera traveling downward through a mass of data cables. “They built an oversize cable run, and they shot with small probe lenses. Visual effects helped by blending several plates together,” explains Leven. “It was a collaboration between many different departments, which was really nice. 
Visual effects helped with stuff that just couldn’t be shot for real. For example, when the camera exits the thin holes of the metal grate at the bottom of the floor, that grate is computer graphics.” The sequence continues with a sweeping motion-control time-lapse shot that travels around the control-room observers in a spiral pattern, a feat pulled off with an ingenious mix of technical innovation and old-school sleight of hand. A previs sequence from The Third Floor laid out the camera move, but because the Bolt arm motion-control rig could only travel on a straight track and cover roughly one-quarter of the required distance, The Garage came up with a way to break the shot into multiple passes. The passes would later be stitched together into one seemingly uninterrupted movement. The symmetrical set design – including the four identical workstations – helped complete the illusion, along with a clever solution that kept the four actors in the correct position relative to the camera. “The camera would basically get to the end of the track,” Leven explains. “Then everybody would switch positions 90 degrees. Everyone would get out of their chairs and move. The camera would go back to one, and it would look like one continuous move around in a circle because the room is perfectly symmetrical, and everything in it is perfectly symmetrical. We were able to move the actors, and it looks like the camera was going all the way around the room.” The final motion-control move switches from time-lapse back to real time as the camera passes by a workstation and reveals Mr. Drummond (Ólafur Darri Ólafsson) and Dr. Mauer (Robby Benson) standing behind it. Leven notes that each pass was completed with just one take. 5. Mark vs. Mark (Episode 210: “Cold Harbor”) The Severance season two finale begins with an increasingly tense conversation between Innie Mark and Outie Mark, as the two personas use a handheld video camera to send recorded messages back and forth. 
Their encounter takes place at night in a Lumon birthing cabin equipped with a severance threshold that allows Mark S. to become Mark Scout each time he steps outside and onto the balcony. The cabin set was built on a soundstage at York Studios in the Bronx, New York. The balcony section consisted of the snowy floor, two chairs, and a railing, all surrounded by a blue screen background. Everything else was up to ILM to create. “It was nice to have Ben’s trust that we could just do it,” Leven remembers. “He said, ‘Hey, you’re just going to make this look great, right?’ We said, ‘Yeah, no problem.’” Artists filled in the scene with CG water, mountains, and moonlight to match the on-set lighting and of course, more snow. As Mark Scout steps onto the balcony, the camera pulls back to a wide shot, revealing the cabin’s full exterior. “They built a part of the exterior of the set. But everything other than the windows, even the railing, was digitally replaced,” Leven says. “It was nice to have Ben [Stiller’s] trust that we could just do it. He said, ‘Hey, you’re just going to make this look great, right?’ We said, ‘Yeah, no problem.’”Eric Leven Bonus: Marching Band Magic (Episode 210: “Cold Harbor”) Finally, our bonus visual effects shot appears roughly halfway through the season finale. To celebrate Mark S. completing the Cold Harbor file, Mr. Milchick orders up a marching band from Lumon’s Choreography and Merriment department. Band members pour into MDR, but Leven says roughly 15 to 20 shots required adding a few more digital duplicates. “They wanted it to look like MDR was filled with band members. And for several of the shots there were holes in there. It just didn’t feel full enough,” he says. In a shot featuring a God’s-eye view of MDR, band members hold dozens of white cards above their heads, forming a giant illustration of a smiling Mark S. 
with text that reads “100%.” “For the top shot, we had to find a different stage because the MDR ceiling is only about eight feet tall,” recalls Leven. “And Ben really pushed to have it done practically, which I think was the right call because you’ve already got the band members, you’ve made the costumes, you’ve got the instruments. Let’s find a place to shoot it.” To get the high shot, the production team set up on an empty soundstage, placing signature MDR-green carpet on the floor. A simple foam core mock-up of the team’s desks occupied the center of the frame, with the finished CG versions added later. Even without the restraints of the practical MDR walls and ceiling, the camera could only get enough height to capture about 30 band members in the shot. So the scene was digitally expanded, with artists adding more green carpet, CG walls, and about 50 more band members. “We painted in new band members, extracting what we could from the practical plate,” Leven says. “We moved them around; we added more, just to make it look as full as Ben wanted.” Every single white card in the shot, Leven points out, is completely digital. (Credit: Apple TV+)

A Mysterious and Important Collaboration

With fans now fiercely debating the many twists and turns of Severance season two, Leven is quick to credit ILM’s two main visual effects collaborators: east side effects and Mango FX INC, as well as ILM studios and artists around the globe, including San Francisco, Vancouver, Singapore, Sydney, and Mumbai. Leven also believes Severance ultimately benefited from a successful creative partnership between ILM and Ben Stiller. “This one clicked so well, and it really made a difference on the show,” Leven says. “I think we both had the same sort of visual shorthand in terms of what we wanted things to look like. One of the things I love about working with Ben is that he’s obviously grounded in reality.
He wants to shoot as much stuff for real as possible, but then sometimes there’s a shot that will either come to him late or he just knows is impractical to shoot. And he knows that ILM can deliver it.” — Clayton Sandell is a Star Wars author and enthusiast, TV storyteller, and a longtime fan of the creative people who keep Industrial Light & Magic and Skywalker Sound on the leading edge of visual effects and sound design. Follow him on Instagram (@claytonsandell), Bluesky (@claytonsandell.com) or X (@Clayton_Sandell).
  • Airstream’s new Frank Lloyd Wright trailer is a match made in midcentury heaven

    Like a good pair of Basquiat Crocs, there are innumerable bad ways to license an artist’s work. So when Airstream looked to partner up on a project with the Frank Lloyd Wright Foundation, the aluminum-clad trailer brand could have just printed one of the architect’s famous patterns on a limited run of its vehicles and called it a day. It probably would have even sold well. But that is decidedly what Bob Wheeler, Airstream’s president and CEO, did not want to do. 

    “We said, ‘All right, let’s make sure that everything has a purpose and a function—that way it’s not just a pastiche, or some kind of lame attempt to mimic something,’” Wheeler recalls. “We didn’t want it to seem overdone or kitschy.”

    Instead, the brand embarked on a multiyear collaboration with the experts at Wright’s Taliesin West home and studio in Scottsdale, Arizona, and today the two are rolling out the 28-foot Airstream Frank Lloyd Wright Usonian Limited Edition Travel Trailer. With just 200 numbered vehicles on offer, retailing for $184,900, you—like me—might not be able to afford one at the moment, but they just might also restore your faith in the art of the artist collab at large.

    BETTER LATE THAN NEVER

    Wheeler has a passion for midcentury design, so it tracks that he’d be a natural fan of Wright’s organic architecture.

    “Honestly, this has been a dream of mine for the last 20 years, which is about as long as I’ve been president of Airstream,” he says. “Why are Wright’s designs so celebrated today? It’s because they’re timeless. I think there are values there that incentivize someone to buy an Airstream that overlap in some meaningful ways.”

    Though Wright and Airstream founder Wally Byam were active at the same time and likely shared some of the same design fan base, there’s no record of them ever meeting. But a collaboration between the two ultimately proved inevitable when Wheeler reached out to Wright’s foundation in 2022. Foundation historian Sally Russell says her team wasn’t initially sure how robust a joint project could be. They eventually toured the Airstream factory in Ohio where the trailers are handmade using 3,000 rivets over the course of 350 hours, and saw how much customization was truly possible. Then she realized that it could be a great showcase of Wright’s work. 

    Beyond an Airstream’s signature aluminum exterior, Wheeler says the trailer is essentially a blank canvas. “And that’s where we can really flex some design muscle and allow others to do so.” 

    Russell says the foundation first explored whether to make the trailer feel like an adaptation of a specific Frank Lloyd Wright home. “The answer to that was no,” she says. “We didn’t want to try to re-create the Rosenbaum House and shove it into the size of a trailer. It didn’t make sense, because Frank Lloyd Wright certainly designed for each of his individual projects—he created something new, something that expressed the individual forms of the project, the needs of the client. So there was a great awareness of wanting to continue that legacy through the work that we did on the trailer.”

    The two teams ultimately homed in on the concept of Usonian design, a style that aimed to democratize design via small, affordable homes with a focus on efficient floor plans, functionality, and modularity. 

    In other words: an ideal fit for an Airstream.

    COLLAPSIBLE CHAIRS AND CLERESTORY WINDOWS

    When you approach the trailer, the connection to Wright is immediate on the custom front door featuring the Gordon leaf pattern, which the architect commissioned his apprentice Eugene Masselink to design in 1956. It’s a tip of the hat to nature, presumably an Airstreamer’s destination, and can be found subtly throughout the trailer in elements like sconces and cabinet pulls—but not too much, per the design mission at the outset. (“At one point we had a lot more of that Gordon leaf in there,” Wheeler notes. “We dialed that way back.”)

    With the push of a button, the bench seating converts into a king-size bed—one of Wheeler’s favorite elements. It is the largest bed in any Airstream, and is a first for the company, he says. Another convertible element, in line with that focus on modularity, is the living space at the front of the trailer. Here, a dining table, desk, and seating inspired by the slant-back chairs that Wright used throughout his career collapse into a wall cabinet. Wheeler says Airstream used to deploy clever features like this in the midcentury era, before modern preferences trended toward built-in furniture. “So in some ways, this is a bit of a flashback to an earlier design in the ’50s, which is appropriate.”

    The teams also honored Wright’s focus on natural light, relocating Airstream’s usual overhead storage in favor of clerestory windows, which are prominent in Usonian homes. Meanwhile, the overall color palette comes from a 1955 Wright-curated Martin-Senour paint line. Russell says the team selected it for its harmonious blend with the natural settings where the trailer is likely headed, featuring ocher, red, and turquoise. 

    Ultimately, “It’s like a Frank Lloyd Wright home, where you walk into it, and it’s a completely different experience from any other building,” Russell says. “I hope that he would be very happy to see that design legacy continue, because he certainly did that with his own fellowship and the apprentices that he worked with.”

    USONIAN LIFE

    Starting today, the limited-edition, numbered trailers will be available for order at Airstream dealerships. Wheeler says the company was originally going to release just 100 of them, but got so much positive feedback from dealers and others that they doubled the run. 

    On the whole, the collaboration comes in the wake of a boom time for Airstream, which is owned by Thor Industries. Airstream experienced a surge during the pandemic, resulting in a 22% jump in sales in 2021 as people embraced remote work or realigned their relationship to the world. 

    “We’ve come back to earth now, and now we’re much more tied to actual market retail rates, which is what we know,” Wheeler says.

    In its third-quarter financials, Thor reported $2.89 billion in revenue, up 3.3% from the previous year. While the company declined to provide Airstream-specific numbers, its overall North American towable RV division is up 9.1% from the same period in 2024.

    But there’s a problem afoot: The current administration’s tariffs, which Wheeler says made settling on the price for the Frank Lloyd Wright collaboration tricky. He adds that the company is struggling with shortages caused by the disruption in the supply chain, and high interest rates are also a problem. “Look, we’re 94 years old,” he says. “We’ve been through more of these cycles than we can count, so we’re fine, and we’ll continue to trade on authenticity, quality, great service and support, a great dealer network, and a brand that really has become part of the fabric of the U.S. traveling adventure.”
    WWW.FASTCOMPANY.COM
  • FROM SET TO PIXELS: CINEMATIC ARTISTS COME TOGETHER TO CREATE POETRY

    By TREVOR HOGG

    Denis Villeneuve finds that the difficulty of working with visual effects is sometimes the intermediaries between him and the artists, and therefore the need to be precise with directions to keep things on track.

    If post-production has any chance of going smoothly, there must be a solid on-set relationship between the director, cinematographer and visual effects supervisor. “It’s my job to have a vision and to bring it to the screen,” notes Denis Villeneuve, director of Dune: Part Two. “That’s why working with visual effects requires a lot of discipline. It’s not like you work with a keyboard and can change your mind all the time. When I work with a camera, I commit to a mise-en-scène. I’m trying to take the risk, move forward in one direction and enhance it with visual effects. I push it until it looks perfect. It takes a tremendous amount of time and preparation. Paul Lambert is a perfectionist, and I love that about him. We will never put a shot on the screen that we don’t feel has a certain level of quality. It needs to look as real as the face of my actor.”

    A legendary cinematographer had a significant influence on how Villeneuve approaches digital augmentation. “Someone I have learned a lot from about visual effects is Roger Deakins. I remember that at the beginning, when I was doing Blade Runner 2049, some artwork was not defined enough, and I was like, ‘I will correct that later.’ Roger said, ‘No. Don’t do that. You have to make sure right at the start.’ I’ve learned the hard way that you need to be as precise as you can, otherwise it goes in a lot of directions.”

    Motion capture is visually jarring because your eye is always drawn to the performer in the mocap suit, but it worked out well on Better Man because the same thing happens when he gets replaced by a CG monkey.

    Visual effects enabled the atmospherics on Wolfs to be art directed, which is not always possible with practical snow.

    One of the most complex musical numbers in Better Man is “Rock DJ,” which required LiDAR scans of Regent Street and doing full 3D motion capture with the dancers dancing down the whole length of the street to work out how best to shoot it.

    Cinematographer Dan Mindel favors on-set practical effects because the reactions from the cast come across as being more genuine, which was the case for Twisters.

    Storyboards are an essential part of the planning process. “When I finish a screenplay, the first thing I do is to storyboard, not just to define the visual element of the movie, but also to rewrite the movie through images,” Villeneuve explains. “Those storyboards inform my crew about the design, costumes, accessories and vehicles, and create a visual inner rhythm of the film. This is the first step towards visual effects where there will be a conversation that will start from the boards. That will be translated into previs to help the animators know where we are going because the movie has to be made in a certain timeframe and needs choreography to make sure everybody is moving in the same direction.” The approach towards filmmaking has not changed over the years. “You have a camera and a couple of actors in front of you, and it’s about finding the right angle; the rest is noise. I try to protect the intimacy around the camera as much as possible and focus on that because if you don’t believe the actor, then you won’t believe anything.”

    Before transforming singer Robbie Williams into a CG primate, Michael Gracey started as a visual effects artist. “I feel so fortunate to have come from a visual effects background early on in my career,” recalls Gracey, director of Better Man. “I would sit down and do all the post myself because I didn’t trust anyone to care as much as I did. Fortunately, over the years I’ve met people who do. It’s a huge part of how I even scrapbook ideas together. Early on, I was constantly throwing stuff up in Flame, doing a video test and asking, ‘Is this going to work?’ Jumping into 3D was something I felt comfortable doing. I’ve been able to plan out or previs ideas. It’s an amazing tool to be armed with if you are a director and have big ideas and you’re trying to convey them to a lot of people.” Previs was pivotal in getting Better Man financed. “Off the page, people were like, ‘Is this monkey even going to work?’ Then they were worried that it wouldn’t work in a musical number. We showed them the previs for Feel, the first musical number, and My Way at the end of the film. I would say, ‘If you get any kind of emotion watching these musical numbers, just imagine what it’s going to be like when it’s filmed and is photoreal.’”

    Several shots had to be stitched together to create a ‘oner’ that features numerous costume changes and 500 dancers. “For Rock DJ, we were doing LiDAR scans of Regent Street and full 3D motion capture with the dancers dancing down the whole length of the street to work out all of the transition points and how best to shoot it,” Gracey states. “That process involved Erik Wilson, the Cinematographer; Luke Millar, the Visual Effects Supervisor; Ashley Wallen, the Choreographer; and Patrick Correll, Co-Producer. Patrick would sit on set and, in DaVinci Resolve, take the feed from the camera and check every take against the blueprint that we had already previsualized.” Motion capture is visually jarring to shoot. “Everything that is in-camera looks perfect, then a guy walks in wearing a mocap suit and your eye zooms onto him. But the truth is, your eye does that the moment you replace him with a monkey as well. It worked out quite well because that idea is true to what it is to be famous. A famous person walks into the room and your eye immediately goes to them.”

    Digital effects have had a significant impact on a particular area of filmmaking. “Physical effects were a much higher art form then, or were allowed to be, than they are now,” notes Dan Mindel, Cinematographer on Twisters. “People will decline a real pyrotechnic explosion and do a digital one. But you get a much bigger reaction when there’s actual noise and flash.” It is all about collaboration. Mindel explains, “The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys, because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, especially with the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world. When we made Twisters, it was an analog movie with digital effects, and it worked great. That’s because everyone on set doing the technical work understood both formats, and we were able to use them well.”

    Digital filmmaking has caused a generational gap. “The younger directors don’t think holistically,” Mindel notes. “It’s much more post-driven because they want to manipulate on the Avid or whatever platform it is going to be. What has happened is that the overreaching nature of these tools has left very little to the imagination. A movie that is heavy on visual effects is mostly conceptualized on paper using computer-generated graphics and color; that insidiously sneaks into the look and feel of the movie before you know it. You see concept art blasted all over production offices. People could get used to looking at those images, and before you know it, that’s how the movie looks. That’s a very dangerous place to be, not to have the imagination to work around an issue that perhaps doesn’t manifest itself until you’re shooting.” There has to be a sense of purpose. Mindel remarks, “The ability to shoot in a way that doesn’t allow any manipulation in post is the only way to guarantee that there’s just one direction the look can go in. But that could be a little dangerous for some people. Generally, the crowd I’m working with is part of a team, and there’s little thought of taking the movie to a different place than what was shot. I work in the DI with the visual effects supervisor, and we look at our work together so we’re all in agreement that it fits into the movie.”

    “All of the advances in technology are a push for greater control,” notes Larkin Seiple, Cinematographer on Everything Everywhere All at Once. “There are still a lot of things that we do with visual effects that we could do practically, but a lot of times it’s more efficient, or we have more attempts at it later in post, than if we had tried to do it practically. I find today, there’s still a debate about what we do on set and what we do later digitally. Many directors have been trying to do more on set, and the best visual effects supervisors I work with push to do everything in-camera as much as possible to make it as realistic as possible.” Storytelling is about figuring out where to invest your time and effort. Seiple states, “I like the adventure of filmmaking. I prefer to go to a mountain top and shoot some of the scenes, get there and be inspired, as opposed to recreate it. Now, if it’s a five-second cutaway, I don’t want production to go to a mountain top and do that. For car work, we’ll shoot the real streets, figure out the time of day and even light the plates for it. Then, I’ll project those on LED walls with actors in a car on a stage. I love doing that because then I get to control how that looks.”

    Visual effects have freed Fallout Cinematographer Stuart Dryburgh to shoot quicker and in places that in the past would have been deemed imperfect because of power lines, out-of-period buildings or the sky.

    Visual effects assist in achieving the desired atmospherics. Seiple says, “On Wolfs, we tried to bring in our own snow for every scene. We would shoot one take, the snow would blow left, and the next take would blow right. Janek Sirrs is probably the best visual effects supervisor I’ve worked with, and he was like, ‘Please turn off the snow. It’ll be a nightmare trying to remove the snow from all these shots then add our own snow back for continuity because you can’t have the snow changing direction every other cut.’ Or we’d have to ‘snow’ a street, which would take ages. Janek would say, ‘Let’s put enough snow on the ground to see the lighting on it and where the actors walk. We’ll do the rest of the street later because we have a perfect reference of what it should look like.’” Certain photographic principles have to be carried over into post-production to make shots believable to the eye. Seiple explains, “When you make all these amazing details that should be out of focus sharper, then the image feels like a visual effect because it doesn’t work the way a lens would work.” Familiarity with the visual effects process is an asset in being able to achieve the best result. “I inadvertently come from a lot of visual effects-heavy shoots and shows, so I’m quick to have an opinion about it. Many directors love to reference the way David Fincher uses visual effects because there is such great behind-the-scenes imagery that showcases how they were able to do simple things. Also, I like to shoot tests even on an iPhone to see if this comp will work or if this idea is a good one.”

    Cinematographer Fabian Wagner and VFX Supervisor John Moffatt spent a lot of time in pre-production for Venom: The Last Dance discussing how to bring out the texture of the symbiote through lighting and camera angles.

    Game of Thrones Director of Photography Fabian Wagner had to make key decisions while prepping and breaking down the script so visual effects had enough time to meet the deadline.

    Twisters was an analog movie with digital effects that worked well because everyone on set doing the technical work understood both formats.

    For Cinematographer Larkin Seiple, storytelling is about figuring out where to invest your time and effort. Scene from the Netflix series Beef.

    Cinematographer Larkin Seiple believes that all of the advances in technology are a push for greater control, which occurred on Everything Everywhere All at Once.

    Nothing beats reality when it comes to realism. “Every project I do, I talk more about the real elements to bring into the shoot than the visual effects element because the more practical stuff that you can do on set, the more it will embed the visual effects into the image, and, therefore, they’re more real,” observes Fabian Wagner, Cinematographer on Venom: The Last Dance. “It also depends on the job you’re doing in terms of how real or unreal you want it to be. Game of Thrones was a good example because it was a visual effects-heavy show, but they were keen on pushing the reality of things as much as possible. We were doing interactive lighting and practical on-set things to embed the visual effects. It was successful.” Television has a significantly compressed schedule compared to feature films. “There are fewer times to iterate. You have to be much more precise. On Game of Thrones, we knew that certain decisions had to be made early on while we were still prepping and breaking down the script. Because of their due dates, to be ready in time, they had to start the visual effects process for certain dragon scenes months before we even started shooting.”

    “Like everything else, it’s always about communication,” Wagner notes. “I’ve been fortunate to work with extremely talented and collaborative visual effects supervisors, visual effects producers and directors. I have become friends with most of those visual effects departments throughout the shoot, so it’s easy to stay in touch. Even when Venom: The Last Dance was posting, I would be talking to John Moffatt, who was our talented visual effects supervisor. We would exchange emails, text messages or phone calls once a week, and he would send me updates, which we would talk about. If I gave any notes or thoughts, John would listen, and if it was possible to do anything about it, he would. In the end, it’s about those personal relationships, and if you have those, that can go a long way.” Wagner has had to deal with dragons, superheroes and symbiotes. “They’re all the same to me! For the symbiote, we had two previous films to see what they had done, where they had succeeded and where we could improve it slightly. While prepping, John and I spent a lot of time talking about how to bring out the texture of the symbiote and help it with the lighting and camera angles. One of the earliest tests was to see what would happen if we backlit or side lit it as well as trying different textures for reflections. We came up with something we all were happy with, and that’s what we did on set. It was down to trying to speak the same language and aiming for the same thing, which in this case was, ‘How could we make the symbiote look the coolest?’”

    Visual effects has become a crucial department throughout the filmmaking process. “The relationship with the visual effects supervisor is new,” states Stuart Dryburgh, Cinematographer on Fallout. “We didn’t really have that. On The Piano, the extent of the visual effects was having somebody scribbling in a lightning strike over a stormy sky and a little flash of an animated puppet. Runaway Bride had a two-camera setup where one of the cameras pushed into the frame, and that was digitally removed, but we weren’t using it the way we’re using it now. For East of Eden, we’re recreating 19th and early 20th century Connecticut, Boston and Salinas, California in New Zealand. While we have some great sets built and historical buildings that we can use, there is a lot of set extension and modification, and some complete bluescreen scenes, which allow us to more realistically portray a historical environment than we could have done back in the day.” The presence of a visual effects supervisor simplified principal photography. Dryburgh adds, “In many ways, using visual effects frees you to shoot quicker and in places that might otherwise be deemed imperfect because of one little thing, whether it’s power lines or out-of-period buildings or sky. All of those can be easily fixed. Most of us have been doing it for long enough that we have a good idea of what can and can’t be done and how it’s done so that the visual effects supervisor isn’t the arbiter.”

    Lighting cannot be arbitrarily altered in post as it never looks right. “Whether you set the lighting on the set and the background artist has to match that, or you have an existing background and you, as a DP, have to match that – that is the lighting trick to the whole thing,” Dryburgh observes. “Everything has to be the same, a soft or hard light, the direction and color. Those things all need to line up in a composited shot; that is crucial.” Every director has his or her own approach to filmmaking. “Harold Ramis told me, ‘I’ll deal with the acting and the words. You just make it look nice, alright?’ That’s the conversation we had about shots, and it worked out well. Garth Davis, who I’m working with now, is a terrific photographer in his own right and has a great visual sense, so he’s much more involved in anything visual, whether it be the designs of the sets, creation of the visual effects, my lighting or choice of lenses. It becomes much more collaborative. And that applies to the visual effects department as well.” Recreating vintage lenses digitally is an important part of the visual aesthetic. “As digital photography has become crisper, better and sharper, people have chosen to use fewer perfect optics, such as lenses that are softer on the edges or give a flare characteristic. Before production, we have the camera department shoot all of these lens grids of different packages and ranges, and visual effects takes that information so they can model every lens. If they’re doing a fully CG background, they can apply that lens characteristic,” remarks Dryburgh.
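The lens-grid workflow Dryburgh describes — fit a per-lens profile from test charts, then apply it to clean CG renders — can be sketched in a few lines. This is a minimal illustration, not a production pipeline: the quadratic falloff model and every name in it are hypothetical, and a real match would also convolve with a spatially varying blur kernel fitted from the grids.

```python
import numpy as np

def radial_lens_profile(h, w, edge_softness=0.35, vignette=0.25):
    """Hypothetical lens profile: per-pixel blur radius and brightness
    falloff that grow toward the frame edges, as might be fitted from
    shot lens grids. The quadratic model is illustrative only."""
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # normalized distance from the optical center: 0 at center, 1 at corners
    r = np.hypot((ys - cy) / cy, (xs - cx) / cx) / np.sqrt(2)
    blur_radius = edge_softness * r**2   # softer toward the edges
    gain = 1.0 - vignette * r**2         # darker toward the edges
    return blur_radius, gain

def apply_profile(img, gain):
    # apply only the brightness falloff here; a real pipeline would also
    # blur by blur_radius to reproduce the edge softness
    return img * gain

img = np.ones((64, 64))                  # flat gray "CG background" stand-in
blur, gain = radial_lens_profile(64, 64)
out = apply_profile(img, gain)           # center stays bright, corners darken
```

The same fitted profile can then be applied to every fully CG background so it degrades exactly the way the production glass does.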

    Television schedules for productions like House of the Dragon do not allow a lot of time to iterate, so decisions have to be precise.

    Bluescreen and stunt doubles on Twisters.

    “The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world.”
    —Dan Mindel, Cinematographer, Twisters

    Cinematographers like Greig Fraser have adopted Unreal Engine. “Greig has an incredible curiosity about new technology, and that helped us specifically with Dune: Part Two,” Villeneuve explains. “Greig was using Unreal Engine to capture natural environments. For example, if we decide to shoot in that specific rocky area, we’ll capture the whole area with drones to recreate the terrain in the computer. If I said, ‘I want to shoot in that valley on November 3rd and have the sun behind the actors. At what time is it? You have to be there at 9:45 am.’ We built the whole schedule like a puzzle to maximize the power of natural light, but that came through those studies, which were made with the software usually used for video games.” Technology is essentially a tool that keeps evolving. Villeneuve adds, “Sometimes, I don’t know if I feel like a dinosaur or if my last movie will be done in this house behind the computer alone. It would be much less tiring to do that, but seriously, the beauty of cinema is the idea of bringing many artists together to create poetry.”
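The scheduling puzzle Villeneuve describes — knowing the sun will be behind the actors in a given valley at 9:45 a.m. — rests on a sun-position calculation. The sketch below uses the standard declination/hour-angle approximation as a back-of-the-envelope stand-in; real previs and scheduling tools use precise ephemerides plus the captured terrain, and the function name and interface here are illustrative.

```python
import math

def sun_position(lat_deg, day_of_year, solar_hour):
    """Rough solar elevation and azimuth (degrees) from latitude, day of
    year and local solar time, via the textbook declination/hour-angle
    approximation. Good to a degree or two -- illustration only."""
    lat = math.radians(lat_deg)
    # approximate solar declination for the given day of the year
    decl = math.radians(
        -23.44 * math.cos(math.radians(360 / 365 * (day_of_year + 10))))
    hour_angle = math.radians(15 * (solar_hour - 12))  # 15 deg per hour
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    el = math.asin(sin_el)
    cos_az = ((math.sin(decl) - sin_el * math.sin(lat))
              / (math.cos(el) * math.cos(lat)))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if hour_angle > 0:        # afternoon: the sun sits west of due south
        az = 360 - az
    return math.degrees(el), az

# near the equinox at latitude 40N, solar noon: sun due south, ~49 deg up
el, az = sun_position(40.0, 80, 12.0)
```

Sweeping `solar_hour` until the azimuth lines up with the valley's camera axis gives exactly the kind of "be there at 9:45 am" call time the schedule was built around.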
    FROM SET TO PIXELS: CINEMATIC ARTISTS COME TOGETHER TO CREATE POETRY
    By TREVOR HOGG

    Denis Villeneuve (Dune: Part Two) finds that the difficulty of working with visual effects is the intermediaries who sometimes sit between him and the artists, hence the need to be precise with directions to keep things on track. (Image courtesy of Warner Bros. Pictures)

    If post-production has any chance of going smoothly, there must be a solid on-set relationship between the director, cinematographer and visual effects supervisor. “It’s my job to have a vision and to bring it to the screen,” notes Denis Villeneuve, director of Dune: Part Two. “That’s why working with visual effects requires a lot of discipline. It’s not like you work with a keyboard and can change your mind all the time. When I work with a camera, I commit to a mise-en-scène. I’m trying to take the risk, move forward in one direction and enhance it with visual effects. I push it until it looks perfect. It takes a tremendous amount of time and preparation. [VFX Supervisor] Paul Lambert is a perfectionist, and I love that about him. We will never put a shot on the screen that we don’t feel has a certain level of quality. It needs to look as real as the face of my actor.” A legendary cinematographer had a significant influence on how Villeneuve approaches digital augmentation. “Someone I have learned a lot from about visual effects is [Cinematographer] Roger Deakins. I remember that at the beginning, when I was doing Blade Runner 2049, some artwork was not defined enough, and I was like, ‘I will correct that later.’ Roger said, ‘No. Don’t do that. You have to make sure right at the start.’ I’ve learned the hard way that you need to be as precise as you can, otherwise it goes in a lot of directions.”

    Motion capture is visually jarring because your eye is always drawn to the performer in the mocap suit, but it worked out well on Better Man because the same thing happens when he gets replaced by a CG monkey. (Image courtesy of Paramount Pictures)

    Visual effects enabled the atmospherics on Wolfs to be art directed, which is not always possible with practical snow. (Image courtesy of Apple Studios)

    One of the most complex musical numbers in Better Man is “Rock DJ,” which required LiDAR scans of Regent Street and full 3D motion capture of the dancers dancing down the whole length of the street to work out how best to shoot it. (Image courtesy of Paramount Pictures)

    Cinematographer Dan Mindel favors on-set practical effects because the reactions from the cast come across as being more genuine, which was the case for Twisters. (Image courtesy of Universal Pictures)

    Storyboards are an essential part of the planning process. “When I finish a screenplay, the first thing I do is storyboard, not just to define the visual element of the movie, but also to rewrite the movie through images,” Villeneuve explains. “Those storyboards inform my crew about the design, costumes, accessories and vehicles, and [they] create a visual inner rhythm of the film. This is the first step towards visual effects where there will be a conversation that will start from the boards. That will be translated into previs to help the animators know where we are going because the movie has to be made in a certain timeframe and needs choreography to make sure everybody is moving in the same direction.” The approach towards filmmaking has not changed over the years. “You have a camera and a couple of actors in front of you, and it’s about finding the right angle; the rest is noise. I try to protect the intimacy around the camera as much as possible and focus on that because if you don’t believe the actor, then you won’t believe anything.”

    Before transforming singer Robbie Williams into a CG primate, Michael Gracey started as a visual effects artist. “I feel so fortunate to have come from a visual effects background early on in my career,” recalls Gracey, director of Better Man. “I would sit down and do all the post myself because I didn’t trust anyone to care as much as I did. Fortunately, over the years I’ve met people who do. It’s a huge part of how I even scrapbook ideas together. Early on, I was constantly throwing stuff up in Flame, doing a video test and asking, ‘Is this going to work?’ Jumping into 3D was something I felt comfortable doing. I’ve been able to plan out or previs ideas. It’s an amazing tool to be armed with if you are a director and have big ideas and you’re trying to convey them to a lot of people.” Previs was pivotal in getting Better Man financed. “Off the page, people were like, ‘Is this monkey even going to work?’ Then they were worried that it wouldn’t work in a musical number. We showed them the previs for Feel, the first musical number, and My Way at the end of the film. I would say, ‘If you get any kind of emotion watching these musical numbers, just imagine what it’s going to be like when it’s filmed and is photoreal.’” Several shots had to be stitched together to create a ‘oner’ that features numerous costume changes and 500 dancers. “For Rock DJ, we were doing LiDAR scans of Regent Street and full 3D motion capture with the dancers dancing down the whole length of the street to work out all of the transition points and how best to shoot it,” Gracey states. “That process involved Erik Wilson, the Cinematographer; Luke Millar, the Visual Effects Supervisor; Ashley Wallen, the Choreographer; and Patrick Correll, Co-Producer. Patrick would sit on set and, in DaVinci Resolve, take the feed from the camera and check every take against the blueprint that we had already previsualized.” Motion capture is visually jarring to shoot. “Everything that is in-camera looks perfect, then a guy walks in wearing a mocap suit and your eye zooms onto him. But the truth is, your eye does that the moment you replace him with a monkey as well. It worked out quite well because that idea is true to what it is to be famous. A famous person walks into the room and your eye immediately goes to them.”
    WWW.VFXVOICE.COM
    FROM SET TO PIXELS: CINEMATIC ARTISTS COME TOGETHER TO CREATE POETRY
    By TREVOR HOGG Denis Villeneuve (Dune: Part Two) finds the difficulty of working with visual effects are sometimes the intermediaries between him and the artists and therefore the need to be precise with directions to keep things on track. (Image courtesy of Warner Bros. Pictures) If post-production has any chance of going smoothly, there must be a solid on-set relationship between the director, cinematographer and visual effects supervisor. “It’s my job to have a vision and to bring it to the screen,” notes Denis Villeneuve, director of Dune: Part Two. “That’s why working with visual effects requires a lot of discipline. It’s not like you work with a keyboard and can change your mind all the time. When I work with a camera, I commit to a mise-en-scène. I’m trying to take the risk, move forward in one direction and enhance it with visual effects. I push it until it looks perfect. It takes a tremendous amount of time and preparation. [VFX Supervisor] Paul Lambert is a perfectionist, and I love that about him. We will never put a shot on the screen that we don’t feel has a certain level of quality. It needs to look as real as the face of my actor.” A legendary cinematographer had a significant influence on how Villeneuve approaches digital augmentation. “Someone I have learned a lot from about visual effects is [Cinematographer] Roger Deakins. I remember that at the beginning, when I was doing Blade Runner 2049, some artwork was not defined enough, and I was like, ‘I will correct that later.’ Roger said, ‘No. Don’t do that. You have to make sure right at the start.’ I’ve learned the hard way that you need to be as precise as you can, otherwise it goes in a lot of directions.” Motion capture is visually jarring because your eye is always drawn to the performer in the mocap suit, but it worked out well on Better Man because the same thing happens when he gets replaced by a CG monkey. 
(Image courtesy of Paramount Pictures)

Visual effects enabled the atmospherics on Wolfs to be art directed, which is not always possible with practical snow. (Image courtesy of Apple Studios)

One of the most complex musical numbers in Better Man is “Rock DJ,” which required LiDAR scans of Regent Street and doing full 3D motion capture with the dancers dancing down the whole length of the street to work out how best to shoot it. (Image courtesy of Paramount Pictures)

Cinematographer Dan Mindel favors on-set practical effects because the reactions from the cast come across as being more genuine, which was the case for Twisters. (Image courtesy of Universal Pictures)

Storyboards are an essential part of the planning process. “When I finish a screenplay, the first thing I do is to storyboard, not just to define the visual element of the movie, but also to rewrite the movie through images,” Villeneuve explains. “Those storyboards inform my crew about the design, costumes, accessories and vehicles, and [they] create a visual inner rhythm of the film. This is the first step towards visual effects where there will be a conversation that will start from the boards. That will be translated into previs to help the animators know where we are going because the movie has to be made in a certain timeframe and needs choreography to make sure everybody is moving in the same direction.” The approach towards filmmaking has not changed over the years. “You have a camera and a couple of actors in front of you, and it’s about finding the right angle; the rest is noise. I try to protect the intimacy around the camera as much as possible and focus on that because if you don’t believe the actor, then you won’t believe anything.” Before transforming singer Robbie Williams into a CG primate, Michael Gracey started as a visual effects artist. “I feel so fortunate to have come from a visual effects background early on in my career,” recalls Michael Gracey, director of Better Man.
“I would sit down and do all the post myself because I didn’t trust anyone to care as much as I did. Fortunately, over the years I’ve met people who do. It’s a huge part of how I even scrapbook ideas together. Early on, I was constantly throwing stuff up in Flame, doing a video test and asking, ‘Is this going to work?’ Jumping into 3D was something I felt comfortable doing. I’ve been able to plan out or previs ideas. It’s an amazing tool to be armed with if you are a director and have big ideas and you’re trying to convey them to a lot of people.” Previs was pivotal in getting Better Man financed. “Off the page, people were like, ‘Is this monkey even going to work?’ Then they were worried that it wouldn’t work in a musical number. We showed them the previs for Feel, the first musical number, and My Way at the end of the film. I would say, ‘If you get any kind of emotion watching these musical numbers, just imagine what it’s going to be like when it’s filmed and is photoreal.’” Several shots had to be stitched together to create a ‘oner’ that features numerous costume changes and 500 dancers. “For Rock DJ, we were doing LiDAR scans of Regent Street and full 3D motion capture with the dancers dancing down the whole length of the street to work out all of the transition points and how best to shoot it,” Gracey states. “That process involved Erik Wilson, the Cinematographer; Luke Millar, the Visual Effects Supervisor; Ashley Wallen, the Choreographer; and Patrick Correll, Co-Producer. Patrick would sit on set and, in DaVinci Resolve, take the feed from the camera and check every take against the blueprint that we had already previsualized.” Motion capture is visually jarring to shoot. “Everything that is in-camera looks perfect, then a guy walks in wearing a mocap suit and your eye zooms onto him. But the truth is, your eye does that the moment you replace him with a monkey as well. It worked out quite well because that idea is true to what it is to be famous.
A famous person walks into the room and your eye immediately goes to them.” Digital effects have had a significant impact on a particular area of filmmaking. “Physical effects were a much higher art form, or at least were allowed to be, than they are now,” notes Dan Mindel, Cinematographer on Twisters. “People will decline a real pyrotechnic explosion and do a digital one. But you get a much bigger reaction when there’s actual noise and flash.” It is all about collaboration. Mindel explains, “The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys, because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world. When we made Twister, it was an analog movie with digital effects, and it worked great. That’s because everyone on set doing the technical work understood both formats, and we were able to use them well.”

Digital filmmaking has caused a generational gap. “The younger directors don’t think holistically,” Mindel notes. “It’s much more post-driven because they want to manipulate on the Avid or whatever platform it is going to be. What has happened is that the overreaching nature of these tools has left very little to the imagination. A movie that is heavy on visual effects is mostly conceptualized on paper using computer-generated graphics and color; that insidiously sneaks into the look and feel of the movie before you know it. You see concept art blasted all over production offices. People could get used to looking at those images, and before you know it, that’s how the movie looks.
That’s a very dangerous place to be, not to have the imagination to work around an issue that perhaps doesn’t manifest itself until you’re shooting.” There has to be a sense of purpose. Mindel remarks, “The ability to shoot in a way that doesn’t allow any manipulation in post is the only way to guarantee that there’s just one direction the look can go in. But that could be a little dangerous for some people. Generally, the crowd I’m working with is part of a team, and there’s little thought of taking the movie to a different place than what was shot. I work in the DI with the visual effects supervisor, and we look at our work together so we’re all in agreement that it fits into the movie.”

“All of the advances in technology are a push for greater control,” notes Larkin Seiple, Cinematographer on Everything Everywhere All at Once. “There are still a lot of things that we do with visual effects that we could do practically, but a lot of times it’s more efficient, or we have more attempts at it later in post, than if we had tried to do it practically. I find today, there’s still a debate about what we do on set and what we do later digitally. Many directors have been trying to do more on set, and the best visual effects supervisors I work with push to do everything in-camera as much as possible to make it as realistic as possible.” Storytelling is about figuring out where to invest your time and effort. Seiple states, “I like the adventure of filmmaking. I prefer to go to a mountain top and shoot some of the scenes, get there and be inspired, as opposed to recreating it. Now, if it’s a five-second cutaway, I don’t want production to go to a mountain top and do that. For car work, we’ll shoot the real streets, figure out the time of day and even light the plates for it. Then, I’ll project those on LED walls with actors in a car on a stage.
I love doing that because then I get to control how that looks.”

Visual effects have freed Fallout Cinematographer Stuart Dryburgh to shoot quicker and in places that in the past would have been deemed imperfect because of power lines, out-of-period buildings or the sky. (Image courtesy of Prime Video)

Visual effects assist in achieving the desired atmospherics. Seiple says, “On Wolfs, we tried to bring in our own snow for every scene. We would shoot one take, the snow would blow left, and the next take would blow right. Janek Sirrs is probably the best visual effects supervisor I’ve worked with, and he was like, ‘Please turn off the snow. It’ll be a nightmare trying to remove the snow from all these shots then add our own snow back for continuity because you can’t have the snow changing direction every other cut.’ Or we’d have to ‘snow’ a street, which would take ages. Janek would say, ‘Let’s put enough snow on the ground to see the lighting on it and where the actors walk. We’ll do the rest of the street later because we have a perfect reference of what it should look like.’” Certain photographic principles have to be carried over into post-production to make shots believable to the eye. Seiple explains, “When you make all these amazing details that should be out of focus sharper, then the image feels like a visual effect because it doesn’t work the way a lens would work.” Familiarity with the visual effects process is an asset in being able to achieve the best result. “I inadvertently come from a lot of visual effects-heavy shoots and shows, so I’m quick to have an opinion about it. Many directors love to reference the way David Fincher uses visual effects because there is such great behind-the-scenes imagery that showcases how they were able to do simple things.
Also, I like to shoot tests even on an iPhone to see if this comp will work or if this idea is a good one.”

Cinematographer Fabian Wagner and VFX Supervisor John Moffatt spent a lot of time in pre-production for Venom: The Last Dance discussing how to bring out the texture of the symbiote through lighting and camera angles. (Image courtesy of Columbia Pictures)

Game of Thrones Director of Photography Fabian Wagner had to make key decisions while prepping and breaking down the script so visual effects had enough time to meet deadline. (Image courtesy of HBO)

Twisters was an analog movie with digital effects that worked well because everyone on set doing the technical work understood both formats. (Image courtesy of Universal Pictures)

For Cinematographer Larkin Seiple, storytelling is about figuring out where to invest your time and effort. Scene from the Netflix series Beef. (Image courtesy of Netflix)

Cinematographer Larkin Seiple believes that all of the advances in technology are a push for greater control, which occurred on Everything Everywhere All at Once. (Image courtesy of A24)

Nothing beats reality when it comes to realism. “Every project I do I talk more about the real elements to bring into the shoot than the visual effect element because the more practical stuff that you can do on set, the more it will embed the visual effects into the image, and, therefore, they’re more real,” observes Fabian Wagner, Cinematographer on Venom: The Last Dance. “It also depends on the job you’re doing in terms of how real or unreal you want it to be. Game of Thrones was a good example because it was a visual effects-heavy show, but they were keen on pushing the reality of things as much as possible. We were doing interactive lighting and practical on-set things to embed the visual effects. It was successful.” Television has a significantly compressed schedule compared to feature films. “There are fewer chances to iterate. You have to be much more precise.
On Game of Thrones, we knew that certain decisions had to be made early on while we were still prepping and breaking down the script. Because of their due dates, to be ready in time, they had to start the visual effects process for certain dragon scenes months before we even started shooting.” “Like everything else, it’s always about communication,” Wagner notes. “I’ve been fortunate to work with extremely talented and collaborative visual effects supervisors, visual effects producers and directors. I have become friends with most of those visual effects departments throughout the shoot, so it’s easy to stay in touch. Even when Venom: The Last Dance was posting, I would be talking to John Moffatt, who was our talented visual effects supervisor. We would exchange emails, text messages or phone calls once a week, and he would send me updates, which we would talk about. If I gave any notes or thoughts, John would listen, and if it were possible to do anything about them, he would. In the end, it’s about those personal relationships, and if you have those, that can go a long way.” Wagner has had to deal with dragons, superheroes and symbiotes. “They’re all the same to me! For the symbiote, we had two previous films to see what they had done, where they had succeeded and where we could improve it slightly. While prepping, John and I spent a lot of time talking about how to bring out the texture of the symbiote and help it with the lighting and camera angles. One of the earliest tests was to see what would happen if we backlit or side lit it as well as trying different textures for reflections. We came up with something we all were happy with, and that’s what we did on set. It was down to trying to speak the same language and aiming for the same thing, which in this case was, ‘How could we make the symbiote look the coolest?’” Visual effects has become a crucial department throughout the filmmaking process.
“The relationship with the visual effects supervisor is new,” states Stuart Dryburgh, Cinematographer on Fallout. “We didn’t really have that. On The Piano, the extent of the visual effects was having somebody scribbling in a lightning strike over a stormy sky and a little flash of an animated puppet. Runaway Bride had a two-camera setup where one of the cameras pushed into the frame, and that was digitally removed, but we weren’t using it the way we’re using it now. For [the 2026 Netflix limited series] East of Eden, we’re recreating 19th and early 20th century Connecticut, Boston and Salinas, California in New Zealand. While we have some great sets built and historical buildings that we can use, there is a lot of set extension and modification, and some complete bluescreen scenes, which allow us to more realistically portray a historical environment than we could have done back in the day.” The presence of a visual effects supervisor simplified principal photography. Dryburgh adds, “In many ways, using visual effects frees you to shoot quicker and in places that might otherwise be deemed imperfect because of one little thing, whether it’s power lines or out-of-period buildings or sky. All of those can be easily fixed. Most of us have been doing it for long enough that we have a good idea of what can and can’t be done and how it’s done so that the visual effects supervisor isn’t the arbiter.” Lighting cannot be arbitrarily altered in post as it never looks right. “Whether you set the lighting on the set and the background artist has to match that, or you have an existing background and you, as a DP, have to match that – that is the lighting trick to the whole thing,” Dryburgh observes. “Everything has to be the same, a soft or hard light, the direction and color. Those things all need to line up in a composited shot; that is crucial.” Every director has his or her own approach to filmmaking. “Harold Ramis told me, ‘I’ll deal with the acting and the words. 
You just make it look nice, alright?’ That’s the conversation we had about shots, and it worked out well. [Director] Garth Davis, who I’m working with now, is a terrific photographer in his own right and has a great visual sense, so he’s much more involved in anything visual, whether it be the designs of the sets, creation of the visual effects, my lighting or choice of lenses. It becomes much more collaborative. And that applies to the visual effects department as well.” Recreating vintage lenses digitally is an important part of the visual aesthetic. “As digital photography has become crisper, better and sharper, people have chosen to use less-than-perfect optics, such as lenses that are softer on the edges or give a flare characteristic. Before production, we have the camera department shoot all of these lens grids of different packages and ranges, and visual effects takes that information so they can model every lens. If they’re doing a fully CG background, they can apply that lens characteristic,” remarks Dryburgh.

Television schedules for productions like House of the Dragon do not allow a lot of time to iterate, so decisions have to be precise. (Image courtesy of HBO)

Bluescreen and stunt doubles on Twisters. (Image courtesy of Universal Pictures)

“The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world.”
—Dan Mindel, Cinematographer, Twisters

Cinematographers like Greig Fraser have adopted Unreal Engine.
“Greig has an incredible curiosity about new technology, and that helped us specifically with Dune: Part Two,” Villeneuve explains. “Greig was using Unreal Engine to capture natural environments. For example, if we decide to shoot in that specific rocky area, we’ll capture the whole area with drones to recreate the terrain in the computer. If I said, ‘I want to shoot in that valley on November 3rd and have the sun behind the actors, at what time is that?’ The answer would come back: ‘You have to be there at 9:45 am.’ We built the whole schedule like a puzzle to maximize the power of natural light, but that came through those studies, which were made with the software usually used for video games.” Technology is essentially a tool that keeps evolving. Villeneuve adds, “Sometimes, I don’t know if I feel like a dinosaur or if my last movie will be done in this house behind the computer alone. It would be much less tiring to do that, but seriously, the beauty of cinema is the idea of bringing many artists together to create poetry.”
  • Popular Chrome Extensions Leak API Keys, User Data via HTTP and Hard-Coded Credentials

    Cybersecurity researchers have flagged several popular Google Chrome extensions that have been found to transmit data over plain HTTP and hard-code secrets in their code, exposing users to privacy and security risks.
    "Several widely used extensions [...] unintentionally transmit sensitive data over simple HTTP," Yuanjing Guo, a security researcher in Symantec's Security Technology and Response team, said. "By doing so, they expose browsing domains, machine IDs, operating system details, usage analytics, and even uninstall information, in plaintext."
    The fact that the network traffic is unencrypted also means that it is susceptible to adversary-in-the-middle (AitM) attacks, allowing malicious actors on the same network, such as a public Wi-Fi network, to intercept and, even worse, modify this data, which could lead to far more serious consequences.

    The list of identified extensions is below -

    SEMRush Rank (extension ID: idbhoeaiokcojcgappfigpifhpkjgmab) and PI Rank (ID: ccgdboldgdlngcgfdolahmiilojmfndl), which call the URL "rank.trellian[.]com" over plain HTTP
    Browsec VPN (ID: omghfjlpggmjjaagoclmmobgdodcjboh), which uses HTTP to call an uninstall URL at "browsec-uninstall.s3-website.eu-central-1.amazonaws[.]com" when a user attempts to uninstall the extension
    MSN New Tab (ID: lklfbkdigihjaaeamncibechhgalldgl) and MSN Homepage, Bing Search & News (ID: midiombanaceofjhodpdibeppmnamfcj), which transmit a unique machine identifier and other details over HTTP to "g.ceipmsn[.]com"
    DualSafe Password Manager & Digital Vault (ID: lgbjhdkjmpgjgcbcdlhkokkckpjmedgc), which constructs an HTTP-based URL request to "stats.itopupdate[.]com" along with information about the extension version, user's browser language, and usage "type"
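
    The plaintext-telemetry problem in the list above can be illustrated with a short sketch. This is not the actual code of any extension named here; the endpoint, payload fields and helper name are all hypothetical. The only point is that the identical request sent over "https://" is encrypted in transit, while "http://" leaves it readable to anyone on the same network.

```javascript
// Hypothetical sketch -- endpoint and payload are illustrative, not taken
// from any of the flagged extensions.

// Upgrade a plain-HTTP endpoint to HTTPS before sending anything over it.
function toHttps(url) {
  const u = new URL(url);
  if (u.protocol === "http:") u.protocol = "https:";
  return u.toString();
}

// The kind of data Symantec observed leaving in cleartext: a persistent
// machine identifier, OS details and the extension version.
const payload = {
  machineId: "0f8c2c1e-demo", // stand-in identifier
  os: "Win32",
  version: "1.2.3",
};

// Before: "http://stats.example.com/telemetry" is readable by an on-path
// observer. After toHttps(): the same request, encrypted in transit.
const endpoint = toHttps("http://stats.example.com/telemetry");
// fetch(endpoint, { method: "POST", body: JSON.stringify(payload) });
```

    Note that switching the scheme only protects the data in transit; it does not answer the separate question of whether a unique machine identifier should be collected at all.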

    "Although credentials or passwords do not appear to be leaked, the fact that a password manager uses unencrypted requests for telemetry erodes trust in its overall security posture," Guo said.
    Symantec said it also identified another set of extensions with API keys, secrets, and tokens directly embedded in the JavaScript code, which an attacker could weaponize to craft malicious requests and carry out various malicious actions -

    Online Security & Privacy extension (ID: gomekmidlodglbbmalcneegieacbdmki), AVG Online Security (ID: nbmoafcmbajniiapeidgficgifbfmjfo), Speed Dial [FVD] - New Tab Page, 3D, Sync (ID: llaficoajjainaijghjlofdfmbjpebpa), and SellerSprite - Amazon Research Tool (ID: lnbmbgocenenhhhdojdielgnmeflbnfb), which expose a hard-coded Google Analytics 4 (GA4) API secret that an attacker could use to bombard the GA4 endpoint and corrupt metrics
    Equatio – Math Made Digital (ID: hjngolefdpdnooamgdldlkjgmdcmcjnc), which embeds a Microsoft Azure API key used for speech recognition that an attacker could use to inflate the developer's costs or exhaust their usage limits
    Awesome Screen Recorder & Screenshot (ID: nlipoenfbbikpbjkfpfillcgkoblgpmj) and Scrolling Screenshot Tool & Screen Capture (ID: mfpiaehgjbbfednooihadalhehabhcjo), which expose the developer's Amazon Web Services (AWS) access key used to upload screenshots to the developer's S3 bucket
    Microsoft Editor – Spelling & Grammar Checker (ID: gpaiobkfhnonedkhhfjpmhdalgeoebfa), which exposes a telemetry key named "StatsApiKey" to log user data for analytics
    Antidote Connector (ID: lmbopdiikkamfphhgcckcjhojnokgfeo), which incorporates a third-party library called InboxSDK that contains hard-coded credentials, including API keys
    Watch2Gether (ID: cimpffimgeipdhnhjohpbehjkcdpjolg), which exposes a Tenor GIF search API key
    Trust Wallet (ID: egjidjbpglichdcondbcbdnbeeppgdph), which exposes an API key associated with the Ramp Network, a Web3 platform that offers wallet developers a way to let users buy or sell crypto directly from the app
    TravelArrow – Your Virtual Travel Agent (ID: coplmfnphahpcknbchcehdikbdieognn), which exposes a geolocation API key when making queries to "ip-api[.]com"
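
    Why hard-coded keys like these are trivially recoverable: an extension's code ships to every user, so extracting an embedded secret takes nothing more than downloading the package and pattern-matching the bundled JavaScript. A hedged sketch follows; the sample source and variable names are invented, and "AKIAIOSFODNN7EXAMPLE" is AWS's own documentation placeholder, not a live credential.

```javascript
// Hypothetical sketch: client-side "secrets" are readable by anyone.
// The sample source below is invented for illustration.
const bundledSource = `
  const GA4_API_SECRET = "aBcDeF123_demo";
  const AWS_ACCESS_KEY_ID = "AKIAIOSFODNN7EXAMPLE";
`;

// Two simple patterns that catch common credential shapes in plain text.
const patterns = {
  ga4Secret: /GA4_API_SECRET\s*=\s*"([^"]+)"/,
  awsAccessKey: /\bAKIA[0-9A-Z]{16}\b/, // canonical AWS access key ID shape
};

// Scan the shipped code exactly the way a researcher (or attacker) would.
const findings = Object.entries(patterns)
  .filter(([, re]) => re.test(bundledSource))
  .map(([name]) => name);

console.log(findings); // -> [ 'ga4Secret', 'awsAccessKey' ]
```

    Both "secrets" fall out of two regular expressions with no exploitation required, which is why obfuscating or renaming a client-side key buys essentially nothing.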

    Attackers who end up finding these keys could weaponize them to drive up API costs, host illegal content, send spoofed telemetry data, and mimic cryptocurrency transaction orders, some of which could even see the developer getting banned.
    Adding to the concern, Antidote Connector is just one of over 90 extensions that use InboxSDK, meaning the other extensions are susceptible to the same problem. The names of the other extensions were not disclosed by Symantec.

    "From GA4 analytics secrets to Azure speech keys, and from AWS S3 credentials to Google-specific tokens, each of these snippets demonstrates how a few lines of code can jeopardize an entire service," Guo said. "The solution: never store sensitive credentials on the client side."
    Developers are recommended to switch to HTTPS whenever they send or receive data, store credentials securely in a backend server using a credentials management service, and regularly rotate secrets to further minimize risk.
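
    The "store credentials securely in a backend server" recommendation amounts to a thin proxy: the extension ships no key at all and calls a server the developer controls, which attaches the secret server-side. The sketch below is illustrative only; the backend hostname, route and environment-variable name are invented, and the GIF-search scenario merely echoes the Watch2Gether case in spirit.

```javascript
// Hedged sketch of the recommended fix. Hostname, route and env-var name
// are hypothetical; the point is that no secret ever ships in the extension.

// --- extension side: build a request to the developer's own backend ---
function buildBackendUrl(query) {
  // No API key in the URL and none in the shipped code.
  return `https://api.extension-backend.example/gifs?q=${encodeURIComponent(query)}`;
}

// const results = await fetch(buildBackendUrl("rubber duck")).then(r => r.json());

// --- backend side (Express-style pseudocode, kept as comments) ---
// app.get("/gifs", async (req, res) => {
//   const key = process.env.GIF_API_KEY; // held server-side, rotated regularly
//   const upstream = await fetch(
//     `https://gif-provider.example/search?q=${encodeURIComponent(req.query.q)}&key=${key}`
//   );
//   res.json(await upstream.json());
// });
```

    Besides hiding the key, the proxy gives the developer one place to rate-limit abusive callers and to rotate the credential without shipping an extension update.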
    The findings show how even popular extensions with hundreds of thousands of installations can suffer from trivial misconfigurations and security blunders like hard-coded credentials, leaving users' data at risk.
    "Users of these extensions should consider removing them until the developers address the insecure [HTTP] calls," the company said. "The risk is not just theoretical; unencrypted traffic is simple to capture, and the data can be used for profiling, phishing, or other targeted attacks."
    "The overarching lesson is that a large install base or a well-known brand does not necessarily ensure best practices around encryption. Extensions should be scrutinized for the protocols they use and the data they share, to ensure users' information remains truly safe."

    Found this article interesting? Follow us on Twitter  and LinkedIn to read more exclusive content we post.
    THEHACKERNEWS.COM
    Popular Chrome Extensions Leak API Keys, User Data via HTTP and Hard-Coded Credentials
    Cybersecurity researchers have flagged several popular Google Chrome extensions that have been found to transmit data in HTTP and hard-code secrets in their code, exposing users to privacy and security risks. "Several widely used extensions [...] unintentionally transmit sensitive data over simple HTTP," Yuanjing Guo, a security researcher in the Symantec's Security Technology and Response team, said. "By doing so, they expose browsing domains, machine IDs, operating system details, usage analytics, and even uninstall information, in plaintext." The fact that the network traffic is unencrypted also means that they are susceptible to adversary-in-the-middle (AitM) attacks, allowing malicious actors on the same network such as a public Wi-Fi to intercept and, even worse, modify this data, which could lead to far more serious consequences. The list of identified extensions are below - SEMRush Rank (extension ID: idbhoeaiokcojcgappfigpifhpkjgmab) and PI Rank (ID: ccgdboldgdlngcgfdolahmiilojmfndl), which call the URL "rank.trellian[.]com" over plain HTTP Browsec VPN (ID: omghfjlpggmjjaagoclmmobgdodcjboh), which uses HTTP to call an uninstall URL at "browsec-uninstall.s3-website.eu-central-1.amazonaws[.]com" when a user attempts to uninstall the extension MSN New Tab (ID: lklfbkdigihjaaeamncibechhgalldgl) and MSN Homepage, Bing Search & News (ID: midiombanaceofjhodpdibeppmnamfcj), which transmit a unique machine identifier and other details over HTTP to "g.ceipmsn[.]com" DualSafe Password Manager & Digital Vault (ID: lgbjhdkjmpgjgcbcdlhkokkckpjmedgc), which constructs an HTTP-based URL request to "stats.itopupdate[.]com" along with information about the extension version, user's browser language, and usage "type" "Although credentials or passwords do not appear to be leaked, the fact that a password manager uses unencrypted requests for telemetry erodes trust in its overall security posture," Guo said. 
Symantec said it also identified another set of extensions with API keys, secrets, and tokens directly embedded in the JavaScript code, which an attacker could weaponize to craft malicious requests and carry out various malicious actions - Online Security & Privacy extension (ID: gomekmidlodglbbmalcneegieacbdmki), AVG Online Security (ID: nbmoafcmbajniiapeidgficgifbfmjfo), Speed Dial [FVD] - New Tab Page, 3D, Sync (ID: llaficoajjainaijghjlofdfmbjpebpa), and SellerSprite - Amazon Research Tool (ID: lnbmbgocenenhhhdojdielgnmeflbnfb), which expose a hard-coded Google Analytics 4 (GA4) API secret that an attacker could use to bombard the GA4 endpoint and corrupt metrics Equatio – Math Made Digital (ID: hjngolefdpdnooamgdldlkjgmdcmcjnc), which embeds a Microsoft Azure API key used for speech recognition that an attacker could use to inflate the developer's costs or exhaust their usage limits Awesome Screen Recorder & Screenshot (ID: nlipoenfbbikpbjkfpfillcgkoblgpmj) and Scrolling Screenshot Tool & Screen Capture (ID: mfpiaehgjbbfednooihadalhehabhcjo), which expose the developer's Amazon Web Services (AWS) access key used to upload screenshots to the developer's S3 bucket Microsoft Editor – Spelling & Grammar Checker (ID: gpaiobkfhnonedkhhfjpmhdalgeoebfa), which exposes a telemetry key named "StatsApiKey" to log user data for analytics Antidote Connector (ID: lmbopdiikkamfphhgcckcjhojnokgfeo), which incorporates a third-party library called InboxSDK that contains hard-coded credentials, including API keys. 
- Watch2Gether (ID: cimpffimgeipdhnhjohpbehjkcdpjolg), which exposes a Tenor GIF search API key
- Trust Wallet (ID: egjidjbpglichdcondbcbdnbeeppgdph), which exposes an API key associated with the Ramp Network, a Web3 platform that offers wallet developers a way to let users buy or sell crypto directly from the app
- TravelArrow – Your Virtual Travel Agent (ID: coplmfnphahpcknbchcehdikbdieognn), which exposes a geolocation API key when making queries to "ip-api[.]com"

Attackers who find these keys could weaponize them to drive up API costs, host illegal content, send spoofed telemetry data, and mimic cryptocurrency transaction orders, some of which could even get the developer banned. Adding to the concern, Antidote Connector is just one of over 90 extensions that use InboxSDK, meaning the other extensions are susceptible to the same problem. Symantec did not disclose the names of the other extensions.

"From GA4 analytics secrets to Azure speech keys, and from AWS S3 credentials to Google-specific tokens, each of these snippets demonstrates how a few lines of code can jeopardize an entire service," Guo said. "The solution: never store sensitive credentials on the client side."

Developers are advised to switch to HTTPS whenever they send or receive data, store credentials securely on a backend server using a credentials management service, and rotate secrets regularly to further minimize risk.

The findings show how even popular extensions with hundreds of thousands of installations can suffer from trivial misconfigurations and security blunders like hard-coded credentials, leaving users' data at risk.

"Users of these extensions should consider removing them until the developers address the insecure [HTTP] calls," the company said. "The risk is not just theoretical; unencrypted traffic is simple to capture, and the data can be used for profiling, phishing, or other targeted attacks."
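The backend pattern Guo recommends can be sketched briefly: the extension talks only to the developer's own server, and the server attaches the secret when forwarding the request upstream. The minimal sketch below uses the GA4 Measurement Protocol as the upstream example; the backend URL in the comment and the function name are illustrative assumptions, not part of Symantec's report.

```python
import os
from urllib.parse import urlencode

# Real GA4 Measurement Protocol collection endpoint.
GA4_ENDPOINT = "https://www.google-analytics.com/mp/collect"

def build_ga4_url(measurement_id: str) -> str:
    """Build the GA4 collection URL on the server side.

    The api_secret is read from the server's environment (in practice,
    a credentials management service), so it never ships inside the
    extension's JavaScript bundle.
    """
    api_secret = os.environ["GA4_API_SECRET"]  # raises KeyError if unset
    query = urlencode({"measurement_id": measurement_id,
                       "api_secret": api_secret})
    return f"{GA4_ENDPOINT}?{query}"

# The extension itself would only ever POST its event payload to the
# developer's own backend, e.g. https://api.example-backend.com/telemetry
# (illustrative URL), which then calls build_ga4_url() and forwards.
```

Rotating the secret then becomes a server-side configuration change rather than a new extension release, which is exactly why the advice pairs backend storage with regular rotation.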
"The overarching lesson is that a large install base or a well-known brand does not necessarily ensure best practices around encryption. Extensions should be scrutinized for the protocols they use and the data they share, to ensure users' information remains truly safe."