NOSIPHO MAKETO-VAN DEN BRAGT ALTERED HER CAREER PATH TO LAUNCH CHOCOLATE TRIBE
By TREVOR HOGG
Images courtesy of Chocolate Tribe.
Nosipho Maketo-van den Bragt, Owner and CEO, Chocolate Tribe
After initially pursuing a career as an attorney, Nosipho Maketo-van den Bragt discovered her true calling was to apply her legal knowledge in a more artistic endeavor with her husband, Rob van den Bragt, who had forged a career as a visual effects supervisor. The couple co-founded Chocolate Tribe, the Johannesburg and Cape Town-based visual effects and animation studio that has done work for Netflix, BBC, Disney and Voltage Pictures.
“It was following my passion and my passion finding me,” observes Maketo-van den Bragt, Owner and CEO of Chocolate Tribe and Founder of AVIJOZI. “I grew up in Soweto, South Africa, and we had this old-fashioned television. I was always fascinated by how those people got in there to perform and entertain us. Living in the townships, you become the funnel for your parents’ aspirations and dreams. My dad was a judge’s registrar, so he was writing all of the court cases coming up for a judge. My dad would come home and tell us stories of what happened in court. I found this enthralling, funny and sometimes painful because it was about people’s lives. I did law and to some extent still practice it. My legal career and entertainment media careers merged because I fell in love with the storytelling aspect of it all. There are those who say that lawyers are failed actors!”
Chocolate Tribe hosts what has become the annual AVIJOZI festival with Netflix. AVIJOZI is a two-day, free-access event in Johannesburg focused on Animation/Film, Visual Effects and Interactive Technology. This year’s AVIJOZI is scheduled for September 13-14 in Johannesburg. Photo: Casting Director and Actor Spaces Founder Ayanda Sithebe (center in black T-shirt) and friends at AVIJOZI 2024.
A personal ambition was to find a way to merge married life into a professional partnership. “I never thought that a lawyer and a creative would work together,” admits Maketo-van den Bragt. “However, Rob and I had this great love for watching films together and music; entertainment was the core fabric of our relationship. That was my first gentle schooling into the visual effects and animation content development space. Starting the company was due to both of us being out of work. I had quit my job without any sort of plan B. I actually incorporated Chocolate Tribe as a company without knowing what we would do with it. As time went on, there was a project that we were asked to come to do. The relationship didn’t work out, so Rob and I decided, ‘Okay, it seems like we can do this on our own.’ I’ve read many books about visual effects and animation, and I still do. I attend a lot of festivals. I am connected with a lot of the guys who work in different visual effects spaces because it is all about understanding how it works and, from a business side, how can we leverage all of that information?”
Chocolate Tribe provided VFX and post-production for Checkers supermarket’s “Planet” ad promoting environmental sustainability. The Chocolate Tribe team pushed photorealism for the ad, creating three fully CG creatures: a polar bear, orangutan and sea turtle.
With a population of 1.5 billion, there is no shortage of consumers and content creators in Africa. “Nollywood is great because it shows us that even with minimal resources, you can create a whole movement and ecosystem,” Maketo-van den Bragt remarks. “Maybe the question around Nollywood is making sure that the caliber and quality of work is high end and speaks to a global audience. South Africa has the same dynamics. It’s a vibrant traditional film and animation industry that grows in leaps and bounds every year. More and more animation houses are being incorporated or started with CEOs or managing directors in their 20s. There’s also an eagerness to look for different stories which haven’t been told. Africa gives that opportunity to tell stories that ordinary people, for example, in America, have not heard or don’t know about. There’s a huge rise in animation, visual effects and content in general.”
Rob van den Bragt served as Creative Supervisor and Nosipho Maketo-van den Bragt as Studio Executive for the “Surf Sangoma” episode of the Disney+ series Kizazi Moto: Generation Fire.
Rob van den Bragt, CCO, and Nosipho Maketo-van den Bragt, CEO, Co-Founders of Chocolate Tribe, in an AVIJOZI planning meeting.
Stella Gono, Software Developer, working on the Chocolate Tribe website.
Family photo of the Maketos. Maketo-van den Bragt has two siblings.
Film tax credits have contributed to The Woman King, Dredd, Safe House, Black Sails and Mission: Impossible – Final Reckoning shooting in South Africa. “People understand principal photography, but there is confusion about animation and visual effects,” Maketo-van den Bragt states. “Rebates pose a challenge because now you have to go above and beyond to explain what you are selling. It’s taken time for the government to realize this is a viable career.” The streamers have had a positive impact. “For the most part, Netflix localizes, and that’s been quite a big hit because it speaks to the demographics and local representation and uplifts talent within those geographical spaces. We did one of the shorts for Disney’s Kizazi Moto: Generation Fire, and there was huge global excitement to that kind of anthology coming from Africa. We’ve worked on a number of collaborations with the U.K., and often that melding of different partners creates a fusion of universality. We need to tell authentic stories, and that authenticity will be dictated by the voices in the writing room.”
AVIJOZI was established to support the development of local talent in animation, visual effects, film production and gaming. “AVIJOZI stands for Animation Visual Effects Interactive in JOZI [nickname for Johannesburg],” Maketo-van den Bragt explains. “It is a conference as well as a festival. The conference part is where we have networking sessions, panel discussions and behind-the-scenes presentations to draw the curtain back and show what happens when people create avatars. We want to show the next generation that there is a way to do this magical craft. The festival part is where people have film screenings and music as well. We’ve brought in gaming as an integral aspect, which attracts many young people because that’s something they do at an early age. Gaming has become the common sport. AVIJOZI is in its fourth year now. It started when I got irritated by people constantly complaining, ‘Nothing ever happens in Johannesburg in terms of animation and visual effects.’ Nobody wanted to do it. So, I said, ‘I’ll do it.’ I didn’t know what I was getting myself into, and four years later I have lots of gray hair!”
Rob van den Bragt served as Animation Supervisor/Visual Effects Supervisor and Nosipho Maketo-van den Bragt as an Executive Producer on iNumber Number: Jozi Gold (2023) for Netflix. (Image courtesy of Chocolate Tribe and Netflix)

Mentorship and internship programs have been established with various academic institutions, and while there are times when specific skills are being sought, like rigging, the field of view tends to be much wider. “What we are finding is that the people who have done other disciplines are much more vibrant,” Maketo-van den Bragt states. “Artists don’t always know how to communicate because it’s all in their heads. Sometimes, somebody with a different background can articulate that vision a bit better because they have those other skills. We also find with those who have gone to art school that the range within their artistry and craftsmanship has become a ‘thing.’ When you have mentally traveled where you have done other things, it allows you to be a more well-rounded artist because you can pull references from different walks of life and engage with different topics without being constrained to one thing. We look for people with a plethora of skills and diverse backgrounds. It’s a lot richer as a Chocolate Tribe. There are multiple flavors.”
South African director/producer/cinematographer and drone cinematography specialist FC Hamman, Founder of FC Hamman Films, at AVIJOZI 2024.
There is a particular driving force when it comes to mentoring. “I want to be the mentor I hoped for,” Maketo-van den Bragt remarks. “I have silent mentors in that we didn’t formalize the relationship, but I knew they were my mentors because every time I would encounter an issue, I would be able to call them. One of the people who not only mentored but pushed me into different spaces is Jinko Gotoh, who is part of Women in Animation. She brought me into Women in Animation, and I had never mentored anybody. Here I was, sitting with six women who wanted to know how I was able to build up Chocolate Tribe. I didn’t know how to structure a presentation to tell them about the journey because I had been so focused on the journey. It’s a sense of grit and feeling that I cannot fail because I have a whole community that believes in me. Even when I felt my shoulders sagging, they would be there to say, ‘We need this. Keep it moving.’ This isn’t just about me. I have a whole stream of people who want this to work.”
Netflix VFX Manager Ben Perry, who oversees Netflix’s VFX strategy across Africa, the Middle East and Europe, at AVIJOZI 2024. Netflix was a partner in AVIJOZI with Chocolate Tribe for three years.
Zama Mfusi, Founder of IndiLang, and Isabelle Rorke, CEO of Dreamforge Creative and Deputy Chair of Animation SA, at AVIJOZI 2024.
Numerous unknown factors had to be accounted for, which made predicting how the journey would unfold extremely difficult. “What it looks like and what I expected it to be, you don’t have the full sense of what it would lead to in this situation,” Maketo-van den Bragt states. “I can tell you that there have been moments of absolute joy where I was so excited we got this project or won that award. There are other moments where you feel completely lost and ask yourself, ‘Am I doing the right thing?’ The journey is to have the highs, lows and moments of confusion. I go through it and accept that not every day will be an award-winning day. For the most part, I love this journey. I wanted to be somewhere where there was a purpose. What has been a big highlight is when I’m signing a contract for new employees who are excited about being part of Chocolate Tribe. Also, when you get a new project and it’s exciting, especially from a service or visual effects perspective, we’re constantly looking for that dragon or big creature. It’s about being mesmerizing, epic and awesome.”
Maketo-van den Bragt has two major career-defining ambitions. “Fostering the next generation of talent and making sure that they are ready to create these amazing stories properly – that is my life work, and relating the African narrative to let the world see the human aspect of who we are because for the longest time we’ve been written out of the stories and narratives.”
DISCOVERING ELIO
By TREVOR HOGG
Images courtesy of Pixar.
The character design of Glordon is based on a tardigrade, which is a microscopic water bear.
Rather than look at the unknown as something to be feared, Pixar has decided to do some wish fulfillment with Elio, where a lonely adolescent astrophile gets abducted by aliens and is mistaken for the leader of Earth. Originally conceived and directed by Adrian Molina, the coming-of-age science fiction adventure was shepherded by Domee Shi and Madeline Sharafian, who had previously worked together on Turning Red.
“Space is often seen as dark, mysterious and scary, but there is also so much hope, wonder and curiosity,” notes Shi, director of Elio. “It’s like anything is ‘out there.’ Elio captures how a lot of us feel at different points of our lives, when we were kids like him, or even now wanting to be off of this current planet because it’s just too much. For Elio, it’s a rescue. I feel that there’s something so universal about that feeling of wanting to be taken away and taken care of. To know that you’re not alone and somebody chose you and picked you up.”
The character design of Glordon is based on a tardigrade, which is a microscopic water bear.
There is a stark contrast between how Earth and the alien world, known as the Communiverse, are portrayed. “The more we worked with the animators on Glordon and Helix, they began to realize that Domee and I respond positively when those [alien] characters are exaggerated, made cute, round and chubby,” states Sharafian, director of Elio. “That automatically started to differentiate the way the Earth and space feel.” A certain question had to be answered when designing the United Nations-inspired Communiverse. “It was coming from a place of this lonely kid who feels like no one wants him on Earth,” Shi explains. “What would be heaven and paradise for him? The Communiverse was built around that idea.” A sense of belonging is an important theme. “It’s also inspired by Adrian Molina’s backstory, and our backstories too, of going to animation college,” Sharafian remarks. “For the first time, we said, ‘This is where everybody like me is!’”
Green is the thematic color for Elio.
Visual effects are an important storytelling tool. “Especially for our movie, which is about this boy going to this crazy incredible world of the Communiverse,” Shi observes. “It has to be dazzling and look spectacular on the big screen and feel like paradise. Elio is such a visual feast, and you do feel like, ‘I want to stay here no matter what. I can’t believe that this place even exists.’ Visual effects are a powerful tool to help you feel what the characters are feeling.” A wishlist became a reality for the directors. “Claudia Chung Sanii [Visual Effects Supervisor] gave Domee and me carte blanche for wish fulfillment for ourselves,” Sharafian remarks. “What do you want Elio’s outfit in space to look like? It was a difficult costume, but now when we watch the movie, we’re all so proud of it. Elio looks fabulous, and he’s so happy to be wearing that outfit. Who would want to take that off?”
The Communiverse was meant to feel like a place that a child would love to visit and explore.
Methodology rather than technology went through the biggest change for the production. “The Communiverse is super complex and has lots of moving pieces. But there’s not much CG can’t do anymore,” notes Claudia Chung Sanii. “Elemental did effects characters. We did long curly hair, dresses, capes, water and fire. What we hadn’t done before was be a part of that design process. How do we get lighting into layout? How do we see the shaders in animation in layout? The tools department was working on a software called Luna which does that. I went to the tools department and asked, ‘Can I play around with it?’ They were like, ‘Okay. But it’s not ready yet.’ Tools will basically be bringing RenderMan and an interactive lighting workflow to the pipeline across all of these DCCs. Because we light in Katana, you can’t get back upstream. The conceit that we were dipping our toe in on Elio was, ‘Whatever you do in lighting, anyone on the pipeline can see it.’”
The influence of microscopic forms and macro photography grounded the Communiverse in natural phenomena.
The variety in the Communiverse is a contrast to the regimented world on the military base.
There were no departmental borders, in particular with cinematography. “We had our layout and lighting DPs start on the same day. Derek Williams wouldn’t shoot anything without Jordan Rempel, our lighting DP, seeing it,” Sanii states. “Jordan would drop in lighting and start doing key lighting as Derek’s team was laying out. It wasn’t like you had to hit the render button, wait for the render to come up and go, ‘Oh, my god, it’s dark! I didn’t know that it was nighttime.’” A new term was adopted. “Meredith Hom [Production Manager] and I pulled the entire crew and leadership into this mental concept that we called the ‘college project.’ For some of us, college was a time when we didn’t have titles and crafts. You begged, borrowed and stole to hit that deadline. So much of our world has become linear in our process that I wanted to break that down to, ‘No. We’re all working together. The scope of this film is too large for us to wait for each other to finish our piece. If this person is slammed, fine. Figure out a different idea to do it with what tools you have.’”
Directors Domee Shi and Madeline Sharafian are drawn to chubby, exaggerated and cute characters.
Forgoing the word ‘no’ led to the technology breaking down. “I remember times when crowds [department] is dressing all of the aliens and because of forgetting to constrain it to the Communiverse, they all show up at the origin, and you’re going, ‘Why is there a whole party going on over there?’” Sanii laughs. “On Elio, it was always forward. There were no rules about locking things down or not installing over the weekend. It was always like, ‘Put it all in, and we’ll deal with it on Monday.’ There would be some funny stuff. We never QC’d something before walking it into the room. Everyone saw how the sausage was made. It was fun and not fun for Harley Jessup [Production Designer] because sometimes there would be a big thing in the middle screen, and he would say, ‘Is that finished?’ There was no way we could get through this film if we kept trying to fix the thing that broke.”
An aerial image of Elio as he attempts to get abducted by aliens.
Part of the design of the Communiverse was inspired by Chinese puzzle balls.
A former visual effects art director at ILM, Harley Jessup found his previous experiences on projects like Innerspace to be helpful on Elio. “I liked that the directors wanted to build on the effects films from the 1980s and early 1990s,” reflects Jessup. “I was there and part of that. It was fun to look back. At the time, the techniques were all practical, matte paintings and miniatures, which are fun to work with, but without the safety net of CG. One thing Dennis Muren [VES] was keen on was how people see things like the natural phenomenon you might see in a microscopic or macro photography form. We were using that. I was looking at the mothership of Close Encounters of the Third Kind, which Dennis shot when he was a young artist. It was nice to be able to bring all of that history to this film.”
Earth was impacted by a comment made by Pete Docter (CCO, Pixar). “He said, ‘The military base should feel like a parking lot,’” Jessup reveals. “You should know why Elio wants to be anywhere else. And the Communiverse needs to be inviting. We built a lot of contrast into those two worlds. The brutalist architecture on the military base, with its hard edges and heavy horizontal forms close to the earth, needed to be harsh but beautiful in its own way, so we tried for that. The Communiverse would be in contrast and be all curves, translucent surfaces and stained-glass backlit effects. Things were wide open about what it could be because each of the aliens is from a different climate and gravity. There are some buildings that are actually upside down on it, and the whole thing is rotating inside like clockwork. It is hopefully an appealing, fun world. It’s not a dystopian outer space.”
Exploring various facial expressions for Elio.
A tough character to get right was Aunt Olga, who struggles to be the guardian of her nephew.
Character designs of Elio and Glordon, which show them interacting with each other.
Architecture was devised to reflect the desired tone for scenes. “In the Grand Assembly Hall where each alien has a desk and booth, the booth is shaped like an eyelid that can close or open,” Jessup explains. “It increases the feeling that they’re evaluating and observing Elio and each of the candidates that have come to join the Communiverse.” A couple of iconic cinematic franchises were avoided for aesthetic reasons. “As much as I love Star Wars and Star Trek, we wanted to be different from those kinds of aliens that are often more humanoid.” Ooooo was the first alien to be designed. “We did Ooooo in collaboration with the effects team, which was small at that time. She was described as a liquid supercomputer. We actually used the wireframe that was turning up and asked, what if it ended up being this network of little lights that are moving around and can express how much she was thinking? Ooooo is Elio’s guide to the Communiverse; her body would deform, so she could become a big screen or reach out and pluck things. Ooooo has an ability like an amoeba to stretch.”
Flexibility is important when figuring out shot design. “On Elio, we provided the layout department with a rudimentary version of our environments,” states David Luoh, Sets Supervisor. “It might be simple geometry. We’re not worried necessarily about shading, color and material yet. Things are roughly in place but also built in a way that is flexible. As they’re sorting out the camera and testing out staging, they can move elements of the set around. Maybe this architectural piece needs to be shifted or larger or smaller. There was a variation on what was typically expected of set deliveries of environments to our layout department. That bar was lowered to give the layout department something to work with sooner and also with more flexibility. From their work we get context as to how we partner with our art and design department to build and finalize those environments.”
Regional biomes known as disks are part of the Communiverse. “There are aquatic, lush forest, snow and ice, and hot lava disks,” Luoh remarks. “The hot disk is grounded in the desert, volcanic rock and lava, while for the lush disk we looked at interesting plant life found in the world around us.” The Communiverse is a complex geometric form. “We wanted these natural arrangements of alien districts, and that was all happening on this twisting and curving terrain in a way that made traditional dressing approaches clunky. Oftentimes, you’re putting something on the ground or mounted, and the ground is always facing upward. But if you have to dress the wall or ceiling, it becomes a lot more difficult to manipulate and place on something with that dynamic and shape. You have stuff that casts light, is see-through and shifting over time. Ooooo is a living character that looks like electronic circuitry that is constantly moving, and we also have that element in the walls, floors and bubble transport that carry the characters around.”
Sets were adjusted throughout the production. “We try to anticipate situations that might come up,” Luoh states. “What if we have a series of shots where you’re getting closer and closer to the Communiverse and you have to bridge the distance between your hero and set extension background? There is a partnership with story, but certainly with our layout camera staging department. As we see shots come out of their work, we know where we need to spend the time to figure out, are we going to see the distant hills in this way? We’re not going to build it until we know because it can be labor-intensive. There is a responsiveness to what we are starting to see as shots get made.” Combining the familiar into something unfamiliar was a process. “There was this curation of being inspired by existing alien sci-fi depictions, but also reaching back into biological phenomena or interesting material because we wanted to ground a lot of those visual elements and ideas in something that people could intuitively grasp on to, even if they were combined or arranged in a way that is surprising, strange and delightful.”
#discovering #elioDISCOVERING ELIOBy TREVOR HOGG Images courtesy of Pixar. The character design of Glordon is based on a tardigrade, which is a microscopic water bear. Rather than look at the unknown as something to be feared, Pixar has decided to do some wish fulfillment with Elio, where a lonely adolescent astrophile gets abducted by aliens and is mistaken as the leader of Earth. Originally conceived and directed by Adrian Molina, the coming-of-age science fiction adventure was shepherded by Domee Shi and Madeline Sharafian, who had previously worked together on Turning Red. “Space is often seen as dark, mysterious and scary, but there is also so much hope, wonder and curiosity,” notes Shi, director of Elio. “It’s like anything is ‘out there.’ Elio captures how a lot of us feel at different points of our lives, when we were kids like him, or even now wanting to be off of this current planet because it’s just too much. For Elio, it’s a rescue. I feel that there’s something so universal about that feeling of wanting to be taken away and taken care of. To know that you’re not alone and somebody chose you and picked you up.” The character design of Glordon is based on a tardigrade, which is a microscopic water bear. There is a stark contrast between how Earth and the alien world, known as the Communiverse, are portrayed. “The more we worked with the animators on Glordon and Helix, they began to realize that Domee and I respond positively when thosecharacters are exaggerated, made cute, round and chubby,” states Sharafian, director of Elio. “That automatically started to differentiate the way the Earth and space feel.” A certain question had to be answered when designing the United Nations-inspired Communiverse. “It was coming from a place of this lonely kid who feels like no one wants him on Earth,” Shi explains. “What would be heaven and paradise for him? The Communiverse was built around that idea.” A sense of belonging is an important theme. 
“It’s also inspired by Adrian Molina’s backstory, and our backstories too, of going to animation college,” Sharafian remarks. “For the first time, we said, ‘This is where everybody like me is!’” Green is the thematic color for Elio. Visual effects are an important storytelling tool. “Especially, for our movie, which is about this boy going to this crazy incredible world of the Communiverse,” Shi observes. “It has to be dazzling and look spectacular on the big screen and feel like paradise. Elio is such a visual feast, and you do feel like, ‘I want to stay here no matter what. I can’t believe that this place even exists.’ Visual effects are a powerful tool to help you feel what the characters are feeling.” A wishlist became a reality for the directors. “Claudia Chung Saniigave Domee and me carte blanche for wish fulfillment for ourselves,” Sharafian remarks. “What do you want Elio’s outfit in space to look like? It was a difficult costume, but now when we watch the movie, we’re all so proud of it. Elio looks fabulous, and he’s so happy to be wearing that outfit. Who would want to take that off?” The Communiverse was meant to feel like a place that a child would love to visit and explore. Methodology rather than technology went through the biggest change for the production. “The Communiverse is super complex and has lots of moving pieces. But there’s not much CG can’t do anymore,” notes Claudia Chung Sanii. “Elemental did effects characters. We did long curly hair, dresses, capes, water and fire. What we hadn’t done before was be a part of that design process. How do we get lighting into layout? How do we see the shaders in animation in layout? The tools department was working on a software called Luna which does that. I went to the tools department and asked, ‘Can I play around with it?’ They were like, ‘Okay. But it’s not ready yet.’ Tools will basically be bringing RenderMan and an interactive lighting workflow to the pipeline across all of these DCCs. 
Because we light in Katana, you can’t get back upstream. The conceit that we were dipping our toe in on Elio was, ‘Whatever you do in lighting, anyone on the pipeline can see it.’” The influence of microscopic forms and macro photography grounded the Communiverse in natural phenomena. The variety in the Communiverse is a contrast to the regimented world on the military base. There were no departmental borders, in particular with cinematography. “We had our layout and lighting DPs start on the same day. Derek Williams wouldn’t shoot anything without Jordan Rempel, our lighting DP, seeing it,” Sanii states. “Jordan would drop in lighting and start doing key lighting as Derek’s team was laying out. It wasn’t like you had to hit the render button, wait for the render to come up and go, ‘Oh, my god, it’s dark! I didn’t know that it was nighttime.’” A new term was adopted. “Meredith Homand I pulled the entire crew and leadership into this mental concept that we called the ‘college project.’ For some of us, college was a time when we didn’t have titles and crafts. You begged, borrowed and stole to hit that deadline. So much of our world has become linear in our process that I wanted to break that down to, ‘No. We’re all working together. The scope of this film is too large for us to wait for each other to finish our piece. If this person is slammed, fine. Figure out a different idea to do it with what tools you have.’” Directors Domee Shi and Madeline Sharafian are drawn to chubby, exaggerated and cute characters. Forgoing the word ‘no’ led to the technology breaking down. “I remember times when crowdsis dressing all of the aliens and because of forgetting to constrain it to the Communiverse, they all show up at the origin, and you’re going, ‘Why is there a whole party going on over there?’” Sanii laughs. “On Elio, it was always forward. There were no rules about locking things down or not installing over the weekend. 
It was always like, ‘Put it all in, and we’ll deal with it on Monday.’ There would be some funny stuff. We never QC’d something before walking it into the room. Everyone saw how the sausage was made. It was fun and not fun for Harley Jessupbecause sometimes there would be a big thing in the middle screen, and he would say, ‘Is that finished?’ There was no way we could get through this film if we kept trying to fix the thing that broke.” An aerial image of Elio as he attempts to get abducted by aliens. Part of the design of the Coummuniverse was inspired by Chinese puzzle balls. A former visual effects art director at ILM, Harley Jessup found his previous experiences on projects like Innerspace to be helpful on Elio. “I liked that the directors wanted to build on the effects films from the 1980s and early 1990s,” reflects Jessup. “I was there and part of that. It was fun to look back. At the time, the techniques were all practical, matte paintings and miniatures, which are fun to work with, but without the safety net of CG. One thing Dennis Murenwas keen on, was how people see things like the natural phenomenon you might see in a microscopic or macro photography form. We were using that. I was looking at the mothership of Close Encounters of the Third Kind, which Dennis shot when he was a young artist. It was nice to be able to bring all of that history to this film.” Earth was impacted by a comment made by Pete Docter. “He said, ‘The military base should feel like a parking lot,” Jessup reveals. “You should know why Elio wants to be anywhere else. And the Communiverse needs to be inviting. We built a lot of contrast into those two worlds. The brutalist architecture on the military base, with its hard edges and heavy horizontal forms close to the earth, needed to be harsh but beautiful in its own way, so we tried for that. The Communiverse would be in contrast and be all curves, translucent surfaces and stained-glass backlit effects. 
Things were wide open about what it could be because each of the aliens are from a different climate and gravity. There are some buildings that are actually upside down on it, and the whole thing is rotating inside like clockwork. It is hopefully an appealing, fun world. It’s not a dystopian outer space.” Exploring various facial expressions for Elio. A tough character to get right was Aunt Olga, who struggles to be the guardian of her nephew. Character designs of Elio and Glordon. which shows them interacting with each other. Architecture was devised to reflect the desired tone for scenes. “In the Grand Assembly Hall where each alien has a desk and booth, the booth is shaped like an eyelid that can close or open,” Jessup explains. “It increases the feeling that they’re evaluating and observing Elio and each of the candidates that have come to join the Communiverse.” A couple of iconic cinematic franchises were avoided for aesthetic reasons. “As much as I love Star Wars and Star Trek, we wanted to be different from those kinds of aliens that are often more humanoid.” Ooooo was the first alien to be designed. “We did Ooooo in collaboration with the effects team, which was small at that time. She was described as a liquid supercomputer. We actually used the wireframe that was turning up and asked, what if it ended up being this network of little lights that are moving around and can express how much she was thinking? Ooooo is Elio’s guide to the Communiverse; her body would deform, so she could become a big screen or reach out and pluck things. Ooooo has an ability like an amoeba to stretch.” Flexibility is important when figuring out shot design. “On Elio, we provided the layout department with a rudimentary version of our environments,” states David Luoh, Sets Supervisor. “It might be simple geometry. We’re not worried necessarily about shading, color and material yet. Things are roughly in place but also built in a way that is flexible. 
As they’re sorting out the camera and testing out staging, they can move elements of the set around. Maybe this architectural piece needs to be shifted or larger or smaller. There was a variation on what was typically expected of set deliveries of environments to our layout department. That bar was lowered to give the layout department something to work with sooner and also with more flexibility. From their work we get context as to how we partner with our art and design department to build and finalize those environments.” Regional biomes known as disks are part of the Communiverse. “There are aquatic, lush forest, snow and ice, and hot lava disks,” Luoh remarks. “The hot disk is grounded in the desert, volcanic rock and lava, while for the lush disk we looked at interesting plant life found in the world around us.” The Communiverse is a complex geometric form. “We wanted these natural arrangements of alien districts, and that was all happening on this twisting and curving terrain in a way that made traditional dressing approaches clunky. Oftentimes, you’re putting something on the ground or mounted, and the ground is always facing upward. But if you have to dress the wall or ceiling, it becomes a lot more difficult to manipulate and place on something with that dynamic and shape. You have stuff that casts light, is see-through and shifting over time. Ooooo is a living character that looks like electronic circuitry that is constantly moving, and we also have that element in the walls, floors and bubble transport that carry the characters around.” Sets were adjusted throughout the production. “We try to anticipate situations that might come up,” Luoh states. “What if we have a series of shots where you’re getting closer and closer to the Communiverse and you have to bridge the distance between your hero and set extension background? There is a partnership with story, but certainly with our layout camera staging department. 
As we see shots come out of their work, we know where we need to spend the time to figure out, are we going to see the distant hills in this way? We’re not going to build it until we know because it can be labor-intensive. There is a responsiveness to what we are starting to see as shots get made.” Combining the familiar into something unfamiliar was a process. “There was this curation of being inspired by existing alien sci-fi depictions, but also reaching back into biological phenomena or interesting material because we wanted to ground a lot of those visual elements and ideas in something that people could intuitively grasp on to, even if they were combined or arranged in a way that is surprising, strange and delightful.” #discovering #elioWWW.VFXVOICE.COMDISCOVERING ELIOBy TREVOR HOGG Images courtesy of Pixar. The character design of Glordon is based on a tardigrade, which is a microscopic water bear. Rather than look at the unknown as something to be feared, Pixar has decided to do some wish fulfillment with Elio, where a lonely adolescent astrophile gets abducted by aliens and is mistaken as the leader of Earth. Originally conceived and directed by Adrian Molina, the coming-of-age science fiction adventure was shepherded by Domee Shi and Madeline Sharafian, who had previously worked together on Turning Red. “Space is often seen as dark, mysterious and scary, but there is also so much hope, wonder and curiosity,” notes Shi, director of Elio. “It’s like anything is ‘out there.’ Elio captures how a lot of us feel at different points of our lives, when we were kids like him, or even now wanting to be off of this current planet because it’s just too much. For Elio, it’s a rescue. I feel that there’s something so universal about that feeling of wanting to be taken away and taken care of. To know that you’re not alone and somebody chose you and picked you up.” The character design of Glordon is based on a tardigrade, which is a microscopic water bear. 
There is a stark contrast between how Earth and the alien world, known as the Communiverse, are portrayed. “The more we worked with the animators on Glordon and Helix, they began to realize that Domee and I respond positively when those [alien] characters are exaggerated, made cute, round and chubby,” states Sharafian, director of Elio. “That automatically started to differentiate the way the Earth and space feel.” A certain question had to be answered when designing the United Nations-inspired Communiverse. “It was coming from a place of this lonely kid who feels like no one wants him on Earth,” Shi explains. “What would be heaven and paradise for him? The Communiverse was built around that idea.” A sense of belonging is an important theme. “It’s also inspired by Adrian Molina’s backstory, and our backstories too, of going to animation college,” Sharafian remarks. “For the first time, we said, ‘This is where everybody like me is!’” Green is the thematic color for Elio. Visual effects are an important storytelling tool. “Especially, for our movie, which is about this boy going to this crazy incredible world of the Communiverse,” Shi observes. “It has to be dazzling and look spectacular on the big screen and feel like paradise. Elio is such a visual feast, and you do feel like, ‘I want to stay here no matter what. I can’t believe that this place even exists.’ Visual effects are a powerful tool to help you feel what the characters are feeling.” A wishlist became a reality for the directors. “Claudia Chung Sanii [Visual Effects Supervisor] gave Domee and me carte blanche for wish fulfillment for ourselves,” Sharafian remarks. “What do you want Elio’s outfit in space to look like? It was a difficult costume, but now when we watch the movie, we’re all so proud of it. Elio looks fabulous, and he’s so happy to be wearing that outfit. Who would want to take that off?” The Communiverse was meant to feel like a place that a child would love to visit and explore. 
Methodology rather than technology went through the biggest change for the production. “The Communiverse is super complex and has lots of moving pieces. But there’s not much CG can’t do anymore,” notes Claudia Chung Sanii. “Elemental did effects characters. We did long curly hair, dresses, capes, water and fire. What we hadn’t done before was be a part of that design process. How do we get lighting into layout? How do we see the shaders in animation in layout? The tools department was working on a software called Luna which does that. I went to the tools department and asked, ‘Can I play around with it?’ They were like, ‘Okay. But it’s not ready yet.’ Tools will basically be bringing RenderMan and an interactive lighting workflow to the pipeline across all of these DCCs. Because we light in Katana, you can’t get back upstream. The conceit that we were dipping our toe in on Elio was, ‘Whatever you do in lighting, anyone on the pipeline can see it.’” The influence of microscopic forms and macro photography grounded the Communiverse in natural phenomena. The variety in the Communiverse is a contrast to the regimented world on the military base. There were no departmental borders, in particular with cinematography. “We had our layout and lighting DPs start on the same day. Derek Williams wouldn’t shoot anything without Jordan Rempel, our lighting DP, seeing it,” Sanii states. “Jordan would drop in lighting and start doing key lighting as Derek’s team was laying out. It wasn’t like you had to hit the render button, wait for the render to come up and go, ‘Oh, my god, it’s dark! I didn’t know that it was nighttime.’” A new term was adopted. “Meredith Hom [Production Manager] and I pulled the entire crew and leadership into this mental concept that we called the ‘college project.’ For some of us, college was a time when we didn’t have titles and crafts. You begged, borrowed and stole to hit that deadline. 
So much of our world has become linear in our process that I wanted to break that down to, ‘No. We’re all working together. The scope of this film is too large for us to wait for each other to finish our piece. If this person is slammed, fine. Figure out a different idea to do it with what tools you have.’” Directors Domee Shi and Madeline Sharafian are drawn to chubby, exaggerated and cute characters. Forgoing the word ‘no’ led to the technology breaking down. “I remember times when crowds [department] is dressing all of the aliens and because of forgetting to constrain it to the Communiverse, they all show up at the origin, and you’re going, ‘Why is there a whole party going on over there?’” Sanii laughs. “On Elio, it was always forward. There were no rules about locking things down or not installing over the weekend. It was always like, ‘Put it all in, and we’ll deal with it on Monday.’ There would be some funny stuff. We never QC’d something before walking it into the room. Everyone saw how the sausage was made. It was fun and not fun for Harley Jessup [Production Designer] because sometimes there would be a big thing in the middle screen, and he would say, ‘Is that finished?’ There was no way we could get through this film if we kept trying to fix the thing that broke.” An aerial image of Elio as he attempts to get abducted by aliens. Part of the design of the Coummuniverse was inspired by Chinese puzzle balls. A former visual effects art director at ILM, Harley Jessup found his previous experiences on projects like Innerspace to be helpful on Elio. “I liked that the directors wanted to build on the effects films from the 1980s and early 1990s,” reflects Jessup. “I was there and part of that. It was fun to look back. At the time, the techniques were all practical, matte paintings and miniatures, which are fun to work with, but without the safety net of CG. 
One thing Dennis Muren [VES] was keen on was how people see things like the natural phenomena you might see in microscopic or macro photography form. We were using that. I was looking at the mothership of Close Encounters of the Third Kind, which Dennis shot when he was a young artist. It was nice to be able to bring all of that history to this film.” Earth was impacted by a comment made by Pete Docter (CCO, Pixar). “He said, ‘The military base should feel like a parking lot,’” Jessup reveals. “You should know why Elio wants to be anywhere else. And the Communiverse needs to be inviting. We built a lot of contrast into those two worlds. The brutalist architecture on the military base, with its hard edges and heavy horizontal forms close to the earth, needed to be harsh but beautiful in its own way, so we tried for that. The Communiverse would be in contrast and be all curves, translucent surfaces and stained-glass backlit effects. Things were wide open about what it could be because each of the aliens is from a different climate and gravity. There are some buildings that are actually upside down on it, and the whole thing is rotating inside like clockwork. It is hopefully an appealing, fun world. It’s not a dystopian outer space.” Exploring various facial expressions for Elio. A tough character to get right was Aunt Olga, who struggles to be the guardian of her nephew. Character designs of Elio and Glordon, which show them interacting with each other. Architecture was devised to reflect the desired tone for scenes. “In the Grand Assembly Hall where each alien has a desk and booth, the booth is shaped like an eyelid that can close or open,” Jessup explains. “It increases the feeling that they’re evaluating and observing Elio and each of the candidates that have come to join the Communiverse.” A couple of iconic cinematic franchises were avoided for aesthetic reasons. 
“As much as I love Star Wars and Star Trek, we wanted to be different from those kinds of aliens that are often more humanoid.” Ooooo was the first alien to be designed. “We did Ooooo in collaboration with the effects team, which was small at that time. She was described as a liquid supercomputer. We actually used the wireframe that was turning up and asked, what if it ended up being this network of little lights that are moving around and can express how much she was thinking? Ooooo is Elio’s guide to the Communiverse; her body would deform, so she could become a big screen or reach out and pluck things. Ooooo has an ability like an amoeba to stretch.” Flexibility is important when figuring out shot design. “On Elio, we provided the layout department with a rudimentary version of our environments,” states David Luoh, Sets Supervisor. “It might be simple geometry. We’re not worried necessarily about shading, color and material yet. Things are roughly in place but also built in a way that is flexible. As they’re sorting out the camera and testing out staging, they can move elements of the set around. Maybe this architectural piece needs to be shifted or larger or smaller. There was a variation on what was typically expected of set deliveries of environments to our layout department. That bar was lowered to give the layout department something to work with sooner and also with more flexibility. From their work we get context as to how we partner with our art and design department to build and finalize those environments.” Regional biomes known as disks are part of the Communiverse. “There are aquatic, lush forest, snow and ice, and hot lava disks,” Luoh remarks. “The hot disk is grounded in the desert, volcanic rock and lava, while for the lush disk we looked at interesting plant life found in the world around us.” The Communiverse is a complex geometric form. 
“We wanted these natural arrangements of alien districts, and that was all happening on this twisting and curving terrain in a way that made traditional dressing approaches clunky. Oftentimes, you’re putting something on the ground or mounted, and the ground is always facing upward. But if you have to dress the wall or ceiling, it becomes a lot more difficult to manipulate and place on something with that dynamic and shape. You have stuff that casts light, is see-through and shifting over time. Ooooo is a living character that looks like electronic circuitry that is constantly moving, and we also have that element in the walls, floors and bubble transport that carry the characters around.” Sets were adjusted throughout the production. “We try to anticipate situations that might come up,” Luoh states. “What if we have a series of shots where you’re getting closer and closer to the Communiverse and you have to bridge the distance between your hero and set extension background? There is a partnership with story, but certainly with our layout camera staging department. As we see shots come out of their work, we know where we need to spend the time to figure out, are we going to see the distant hills in this way? We’re not going to build it until we know because it can be labor-intensive. There is a responsiveness to what we are starting to see as shots get made.” Combining the familiar into something unfamiliar was a process. “There was this curation of being inspired by existing alien sci-fi depictions, but also reaching back into biological phenomena or interesting material because we wanted to ground a lot of those visual elements and ideas in something that people could intuitively grasp on to, even if they were combined or arranged in a way that is surprising, strange and delightful.”
AN EXPLOSIVE MIX OF SFX AND VFX IGNITES FINAL DESTINATION BLOODLINES
By CHRIS McGOWAN
Images courtesy of Warner Bros. Pictures.
Final Destination Bloodlines, the sixth installment in the graphic horror series, kicks off with the film’s biggest challenge – deploying an elaborate, large-scale set piece involving the 400-foot-high Skyview Tower restaurant. While there in 1968, young Iris Campbell (Brec Bassinger) has a premonition about the Skyview burning, cracking, crumbling and collapsing. Then, when she sees these events actually starting to happen around her, she intervenes and causes an evacuation of the tower, thus thwarting death’s design and saving many lives. Years later, her granddaughter, Stefani Reyes (Kaitlyn Santa Juana), inherits the vision of the destruction that could have occurred and realizes death is still coming for the survivors.
“I knew we couldn’t put the whole [Skyview restaurant] on fire, but Tony [Lazarowich, Special Effects Supervisor] tried and put as much fire as he could safely, and then we just built off that [in VFX] and added a lot more. Even when it’s just a little bit of real fire, the lighting and interaction can’t be simulated, so I think it was a success in terms of blending that practical with the visual.”
—Nordin Rahhali, VFX Supervisor
The film opens with an elaborate, large-scale set piece involving the 400-foot-high Skyview Tower restaurant – and its collapse. Drone footage was digitized to create a 3D asset for the LED wall so the time of day could be changed as needed.
“The set that the directors wanted was very large,” says Nordin Rahhali, VFX Supervisor. “We had limited space options in stages given the scale and the footprint of the actual restaurant that they wanted. It was the first set piece, the first big thing we shot, so we had to get it all ready and going right off the bat. We built a bigger volume for our needs, including an LED wall that we built the assets for.”
“We were outside Vancouver at Bridge Studios in Burnaby. The custom-built LED volume was a little over 200 feet in length,” states Christian Sebaldt, ASC, the movie’s DP. The volume was 98 feet in diameter and 24 feet tall. Rahhali explains, “Pixomondo was the vendor that we contracted to come in and build the volume. They also built the asset that went on the LED wall, so they were part of our filming team and production shoot. Subsequently, they were also the main vendor doing post, which was by design. By having them design and take care of the asset during production, we were able to leverage their assets, tools and builds for some of the post VFX.” Rahhali adds, “It was really important to make sure we had days with the volume team and with Christian and his camera team ahead of the shoot so we could dial it in.”
Built at Bridge Studios in Burnaby outside Vancouver, the custom-built LED volume for events at the Skyview restaurant was over 200 feet long, 98 feet wide and 24 feet tall. Extensive previs with Digital Domain was done to advance key shots. Zach Lipovsky and Adam Stein directed Final Destination Bloodlines for New Line, distributed by Warner Bros., in which chain reactions of small and big events lead to bloody catastrophes befalling those who have cheated death at some point. Pixomondo was the lead VFX vendor, followed by FOLKS VFX. Picture Shop also contributed. There were around 800 VFX shots. Tony Lazarowich was the Special Effects Supervisor.
“The Skyview restaurant involved building a massive set that was fire retardant, which meant the construction took longer than normal because they had to build it with certain materials and coat it with certain things because, obviously, it serves for the set piece. As it’s falling into chaos, a lot of that fire was practical. I really jibed with what Christian and the directors wanted and how Tony likes to work – to augment as much real practical stuff as possible,” Rahhali remarks. “I knew we couldn’t put the whole thing on fire, but Tony tried and put as much fire as he could safely, and then we just built off that [in VFX] and added a lot more. Even when it’s just a little bit of real fire, the lighting and interaction can’t be simulated, so I think it was a success in terms of blending that practical with the visual.”
The Skyview restaurant required building a massive set that was fire retardant. Construction on the set took longer because it had to be built and coated with special materials. As the Skyview restaurant falls into chaos, much of the fire was practical. “We got all the Vancouver skyline so we could rebuild our version of the city, which was based a little on the Vancouver footprint. So, we used all that to build a digital recreation of a city that was in line with what the directors wanted, which was a coastal city somewhere in the States that doesn’t necessarily have to be Vancouver or Seattle, but it looks a little like the Pacific Northwest.”
—Christian Sebaldt, ASC, Director of Photography
For drone shots, the team utilized a custom heavy-lift drone with three RED Komodo Digital Cinema cameras “giving us almost 180 degrees with overlap that we would then stitch in post and have a ridiculous amount of resolution off these three cameras,” Sebaldt states. “The other drone we used was a DJI Inspire 3, which was also very good. And we flew these drones up at the height. We flew them at different times of day. We flew full 360s, and we also used them for photogrammetry. We got all the Vancouver skyline so we could rebuild our version of the city, which was based a little on the Vancouver footprint. So, we used all that to build a digital recreation of a city that was in line with what the directors wanted, which was a coastal city somewhere in the States that doesn’t necessarily have to be Vancouver or Seattle, but it looks a little like the Pacific Northwest.” Rahhali adds, “All of this allowed us to figure out what we were going to shoot. We had the stage build, and we had the drone footage that we then digitized and created a 3D asset to go on the wall so we could change the times of day.”
Pixomondo built the volume and the asset that went on the LED wall for the Skyview sequence. They were also the main vendor during post. FOLKS VFX and Picture Shop contributed. “We did extensive previs with Digital Domain,” Rahhali explains. “That was important because we knew the key shots that the directors wanted. With a combination of those key shots, we then kind of reverse-engineered while we did techvis off the previs and worked with Christian and the art department so we would have proper flexibility with the set to be able to pull off some of these shots. Some of these shots required the Skyview restaurant ceiling to be lifted and partially removed for us to get a crane to shoot Paul as he’s about to fall and the camera’s going through a roof, that we then digitally had to recreate. Had we not done the previs to know those shots in advance, we would not have been able to build that in time to accomplish the look. We had many other shots that were driven off the previs that allowed the art department, construction and camera teams to work out how they would get those shots.”
Some shots required the Skyview’s ceiling to be lifted and partially removed to get a crane to shoot Paul Campbell as he’s about to fall.
The character Iris lived in a fortified house, isolating herself methodically to avoid the Grim Reaper. Rahhali comments, “That was a beautiful location in the GVRD, very cold. It was a long, hard shoot, because it was all nights. It was just this beautiful pocket out in the middle of the mountains. We in visual effects didn’t do a ton other than a couple of clean-ups of the big establishing shots when you see them pull up to the compound. We had to clean up small roads we wanted to make look like one road and make the road look like dirt.” There were flames involved. Sebaldt says, “The explosion was unbelievably big. We had eight cameras on it at night and shot it at high speed, and we’re all going ‘Whoa.’” Rahhali notes, “There was some clean-up, but the explosion was 100% practical. Our Special Effects Supervisor, Tony, went to town on that. He blew up the whole house, and it looked spectacular.”
The tattoo shop piercing scene is one of the most talked-about sequences in the movie, where a dangling chain from a ceiling fan attaches itself to the septum nose piercing of Erik Campbell and drags him toward a raging fire. Rahhali observes, “That was very Final Destination and a great Rube Goldberg build-up event. Richard was great. He was tied up on a stunt line for most of it, balancing on top of furniture. All of that was him doing it for real with a stunt line.” Some effects solutions can be surprisingly simple. Rahhali continues, “Our producer came up with a great gag for the septum ring.” Richard’s nose was connected with just a nose plug that went inside his nostrils. “All that tugging and everything that you’re seeing was real. For weeks and weeks, we were all trying to figure out how to do it without it being a big visual effects thing. ‘How are we gonna pull his nose for real?’ Craig said, ‘I have these things I use to help me open up my nose and you can’t really see them.’ They built it off of that, and it looked great.”
Filmmakers spent weeks figuring out how to execute the harrowing tattoo shop scene. A dangling chain from a ceiling fan attaches itself to the septum nose ring of Erik Campbell – with the actor’s nose being tugged by the chain connected to a nose plug that went inside his nostrils.
“Some of these shots required the Skyview restaurant ceiling to be lifted and partially removed for us to get a crane to shoot Paul as he’s about to fall and the camera’s going through a roof, that we then digitally had to recreate. Had we not done the previs to know those shots in advance, we would not have been able to build that in time to accomplish the look. We had many other shots that were driven off the previs that allowed the art department, construction and camera teams to work out how they would get those shots.”
—Nordin Rahhali, VFX Supervisor
Most of the fire in the tattoo parlor was practical. “There are some fire bars and stuff that you’re seeing in there from SFX and the big pool of fire on the wide shots.” Sebaldt adds, “That was a lot of fun to shoot because it’s so insane when he’s dancing and balancing on all this stuff – we were laughing and laughing. We were convinced that this was going to be the best scene in the movie up to that moment.” Rahhali says, “They used the scene wholesale for the trailer. It went viral – people were taking out their septum rings.” Erik survives the parlor blaze only to meet his fate in a hospital when he is pulled by a wheelchair into an out-of-control MRI machine at its highest magnetic level. Rahhali comments, “That is a good combination of a bunch of different departments. Our Stunt Coordinator, Simon Burnett, came up with this hard pull-wire line for when Erik flies and hits the MRI. That’s a real stunt with a double, and he hit hard. All the other shots are all CG wheelchairs because the directors wanted to art-direct how the crumpling metal was snapping and bending to show pressure on him as his body starts going into the MRI.”
To augment the believability that comes with reality, the directors aimed to capture as much practically as possible, then VFX Supervisor Nordin Rahhali and his team built on that result. A train derailment concludes the film after Stefani and her brother, Charlie, realize they are still on death’s list. A train goes off the tracks, and logs from one of the cars fly through the air and kill them. “That one was special because it’s a hard sequence and was also shot quite late, so we didn’t have a lot of time. We went back to Vancouver and shot the actual street, and we shot our actors performing. They fell onto stunt pads, and the moment they get touched by the logs, it turns into CG as it was the only way to pull that off – and the train, of course. We had to add all that. The destruction of the houses and everything was done in visual effects.”
Erik survives the tattoo parlor blaze only to meet his fate in a hospital when he is crushed by a wheelchair while being pulled into an out-of-control MRI machine.
Erik appears about to be run over by a delivery truck at the corner of 21A Ave. and 132A St., but he’s not – at least not then. The truck is actually on the opposite side of the road, and the person being run over is Howard.
A rolling penny plays a major part in the catastrophic chain reactions and seems to be a character itself. “The magic penny was a mix from two vendors, Pixomondo and FOLKS; both had penny shots,” Rahhali says. “All the bouncing pennies you see going through the vents and hitting the fan blade are all FOLKS. The bouncing penny at the end, as a lady takes it out of her purse, that goes down the ramp and into the rail – that’s FOLKS. The big explosion shots in the Skyview with the penny slowing down after the kid throws it are all Pixomondo shots. It was a mix. We took a little time to find that balance between readability and believability.”
Approximately 800 VFX shots were required for Final Destination Bloodlines. Chain reactions of small and big events lead to bloody catastrophes befalling those who have cheated Death at some point in the Final Destination films.
From left: Kaitlyn Santa Juana as Stefani Reyes, director Adam Stein, director Zach Lipovsky and Gabrielle Rose as Iris. Rahhali adds, “The film is a great collaboration of departments. Good visual effects are always a good combination of special effects, makeup effects and cinematography; it’s all the planning of all the pieces coming together. For a film of this size, I’m really proud of the work. I think we punched above our weight class, and it looks quite good.”
#explosive #mix #sfx #vfx #ignitesAN EXPLOSIVE MIX OF SFX AND VFX IGNITES FINAL DESTINATION BLOODLINESBy CHRIS McGOWAN Images courtesy of Warner Bros. Pictures. Final Destination Bloodlines, the sixth installment in the graphic horror series, kicks off with the film’s biggest challenge – deploying an elaborate, large-scale set piece involving the 400-foot-high Skyview Tower restaurant. While there in 1968, young Iris Campbellhas a premonition about the Skyview burning, cracking, crumbling and collapsing. Then, when she sees these events actually starting to happen around her, she intervenes and causes an evacuation of the tower, thus thwarting death’s design and saving many lives. Years later, her granddaughter, Stefani Reyes, inherits the vision of the destruction that could have occurred and realizes death is still coming for the survivors. “I knew we couldn’t put the wholeon fire, but Tonytried and put as much fire as he could safely and then we just built off thatand added a lot more. Even when it’s just a little bit of real fire, the lighting and interaction that can’t be simulated, so I think it was a success in terms of blending that practical with the visual.” —Nordin Rahhali, VFX Supervisor The film opens with an elaborate, large-scale set piece involving the 400-foot-high Skyview Tower restaurant – and its collapse. Drone footage was digitized to create a 3D asset for the LED wall so the time of day could be changed as needed. “The set that the directors wanted was very large,” says Nordin Rahhali, VFX Supervisor. “We had limited space options in stages given the scale and the footprint of the actual restaurant that they wanted. It was the first set piece, the first big thing we shot, so we had to get it all ready and going right off the bat. We built a bigger volume for our needs, including an LED wall that we built the assets for.” “We were outside Vancouver at Bridge Studios in Burnaby. 
The custom-built LED volume was a little over 200 feet in length” states Christian Sebaldt, ASC, the movie’s DP. The volume was 98 feet in diameter and 24 feet tall. Rahhali explains, “Pixomondo was the vendor that we contracted to come in and build the volume. They also built the asset that went on the LED wall, so they were part of our filming team and production shoot. Subsequently, they were also the main vendor doing post, which was by design. By having them design and take care of the asset during production, we were able to leverage their assets, tools and builds for some of the post VFX.” Rahhali adds, “It was really important to make sure we had days with the volume team and with Christian and his camera team ahead of the shoot so we could dial it in.” Built at Bridge Studios in Burnaby outside Vancouver, the custom-built LED volume for events at the Skyview restaurant was over 200 feet long, 98 feet wide and 24 feet tall. Extensive previs with Digital Domain was done to advance key shots.Zach Lipovsky and Adam Stein directed Final Destination Bloodlines for New Line film, distributed by Warner Bros., in which chain reactions of small and big events lead to bloody catastrophes befalling those who have cheated death at some point. Pixomondo was the lead VFX vendor, followed by FOLKS VFX. Picture Shop also contributed. There were around 800 VFX shots. Tony Lazarowich was the Special Effects Supervisor. “The Skyview restaurant involved building a massive setwas fire retardant, which meant the construction took longer than normal because they had to build it with certain materials and coat it with certain things because, obviously, it serves for the set piece. As it’s falling into chaos, a lot of that fire was practical. I really jived with what Christian and directors wanted and how Tony likes to work – to augment as much real practical stuff as possible,” Rahhali remarks. 
“I knew we couldn’t put the whole thing on fire, but Tony tried and put as much fire as he could safely, and then we just built off thatand added a lot more. Even when it’s just a little bit of real fire, the lighting and interaction can’t be simulated, so I think it was a success in terms of blending that practical with the visual.” The Skyview restaurant required building a massive set that was fire retardant. Construction on the set took longer because it had to be built and coated with special materials. As the Skyview restaurant falls into chaos, much of the fire was practical.“We got all the Vancouver skylineso we could rebuild our version of the city, which was based a little on the Vancouver footprint. So, we used all that to build a digital recreation of a city that was in line with what the directors wanted, which was a coastal city somewhere in the States that doesn’t necessarily have to be Vancouver or Seattle, but it looks a little like the Pacific Northwest.” —Christian Sebaldt, ASC, Director of Photography For drone shots, the team utilized a custom heavy-lift drone with three RED Komodo Digital Cinema cameras “giving us almost 180 degrees with overlap that we would then stitch in post and have a ridiculous amount of resolution off these three cameras,” Sebaldt states. “The other drone we used was a DJI Inspire 3, which was also very good. And we flew these drones up at the height. We flew them at different times of day. We flew full 360s, and we also used them for photogrammetry. We got all the Vancouver skyline so we could rebuild our version of the city, which was based a little on the Vancouver footprint. 
So, we used all that to build a digital recreation of a city that was in line with what the directors wanted, which was a coastal city somewhere in the States that doesn’t necessarily have to be Vancouver or Seattle, but it looks a little like the Pacific Northwest.” Rahhali adds, “All of this allowed us to figure out what we were going to shoot. We had the stage build, and we had the drone footage that we then digitized and created a 3D asset to go on the wallwe could change the times of day” Pixomondo built the volume and the asset that went on the LED wall for the Skyview sequence. They were also the main vendor during post. FOLKS VFX and Picture Shop contributed.“We did extensive previs with Digital Domain,” Rahhali explains. “That was important because we knew the key shots that the directors wanted. With a combination of those key shots, we then kind of reverse-engineeredwhile we did techvis off the previs and worked with Christian and the art department so we would have proper flexibility with the set to be able to pull off some of these shots.some of these shots required the Skyview restaurant ceiling to be lifted and partially removed for us to get a crane to shoot Paulas he’s about to fall and the camera’s going through a roof, that we then digitally had to recreate. Had we not done the previs to know those shots in advance, we would not have been able to build that in time to accomplish the look. We had many other shots that were driven off the previs that allowed the art department, construction and camera teams to work out how they would get those shots.” Some shots required the Skyview’s ceiling to be lifted and partially removed to get a crane to shoot Paul Campbellas he’s about to fall. The character Iris lived in a fortified house, isolating herself methodically to avoid the Grim Reaper. Rahhali comments, “That was a beautiful locationGVRD, very cold. It was a long, hard shoot, because it was all nights. 
It was just this beautiful pocket out in the middle of the mountains. We in visual effects didn’t do a ton other than a couple of clean-ups of the big establishing shots when you see them pull up to the compound. We had to clean up small roads we wanted to make look like one road and make the road look like dirt.” There were flames involved. Sebaldt says, “The explosionwas unbelievably big. We had eight cameras on it at night and shot it at high speed, and we’re all going ‘Whoa.’” Rahhali notes, “There was some clean-up, but the explosion was 100% practical. Our Special Effects Supervisor, Tony, went to town on that. He blew up the whole house, and it looked spectacular.” The tattoo shop piercing scene is one of the most talked-about sequences in the movie, where a dangling chain from a ceiling fan attaches itself to the septum nose piercing of Erik Campbelland drags him toward a raging fire. Rahhali observes, “That was very Final Destination and a great Rube Goldberg build-up event. Richard was great. He was tied up on a stunt line for most of it, balancing on top of furniture. All of that was him doing it for real with a stunt line.” Some effects solutions can be surprisingly extremely simple. Rahhali continues, “Our producercame up with a great gagseptum ring.” Richard’s nose was connected with just a nose plug that went inside his nostrils. “All that tugging and everything that you’re seeing was real. For weeks and weeks, we were all trying to figure out how to do it without it being a big visual effects thing. ‘How are we gonna pull his nose for real?’ Craig said, ‘I have these things I use to help me open up my nose and you can’t really see them.’ They built it off of that, and it looked great.” Filmmakers spent weeks figuring out how to execute the harrowing tattoo shop scene. 
AN EXPLOSIVE MIX OF SFX AND VFX IGNITES FINAL DESTINATION BLOODLINES
By CHRIS McGOWAN
Images courtesy of Warner Bros. Pictures.
Final Destination Bloodlines, the sixth installment in the graphic horror series, kicks off with the film’s biggest challenge – deploying an elaborate, large-scale set piece involving the 400-foot-high Skyview Tower restaurant. While there in 1968, young Iris Campbell (Brec Bassinger) has a premonition about the Skyview burning, cracking, crumbling and collapsing. Then, when she sees these events actually starting to happen around her, she intervenes and causes an evacuation of the tower, thus thwarting death’s design and saving many lives. Years later, her granddaughter, Stefani Reyes (Kaitlyn Santa Juana), inherits the vision of the destruction that could have occurred and realizes death is still coming for the survivors. “I knew we couldn’t put the whole [Skyview restaurant] on fire, but Tony [Lazarowich, Special Effects Supervisor] tried and put as much fire as he could safely, and then we just built off that [in VFX] and added a lot more.
Even when it’s just a little bit of real fire, the lighting and interaction can’t be simulated, so I think it was a success in terms of blending that practical with the visual.” —Nordin Rahhali, VFX Supervisor The film opens with an elaborate, large-scale set piece involving the 400-foot-high Skyview Tower restaurant – and its collapse. Drone footage was digitized to create a 3D asset for the LED wall so the time of day could be changed as needed. “The set that the directors wanted was very large,” says Nordin Rahhali, VFX Supervisor. “We had limited space options in stages given the scale and the footprint of the actual restaurant that they wanted. It was the first set piece, the first big thing we shot, so we had to get it all ready and going right off the bat. We built a bigger volume for our needs, including an LED wall that we built the assets for.” “We were outside Vancouver at Bridge Studios in Burnaby. The custom-built LED volume was a little over 200 feet in length,” states Christian Sebaldt, ASC, the movie’s DP. The volume was 98 feet in diameter and 24 feet tall. Rahhali explains, “Pixomondo was the vendor that we contracted to come in and build the volume. They also built the asset that went on the LED wall, so they were part of our filming team and production shoot. Subsequently, they were also the main vendor doing post, which was by design. By having them design and take care of the asset during production, we were able to leverage their assets, tools and builds for some of the post VFX.” Rahhali adds, “It was really important to make sure we had days with the volume team and with Christian and his camera team ahead of the shoot so we could dial it in.” Built at Bridge Studios in Burnaby outside Vancouver, the custom-built LED volume for events at the Skyview restaurant was over 200 feet long, 98 feet wide and 24 feet tall. Extensive previs with Digital Domain was done to advance key shots.
(Photo: Eric Milner) Zach Lipovsky and Adam Stein directed Final Destination Bloodlines for New Line, distributed by Warner Bros., in which chain reactions of small and big events lead to bloody catastrophes befalling those who have cheated death at some point. Pixomondo was the lead VFX vendor, followed by FOLKS VFX. Picture Shop also contributed. There were around 800 VFX shots. Tony Lazarowich was the Special Effects Supervisor. “The Skyview restaurant involved building a massive set [that] was fire retardant, which meant the construction took longer than normal because they had to build it with certain materials and coat it with certain things because, obviously, it serves for the set piece. As it’s falling into chaos, a lot of that fire was practical. I really jived with what Christian and the directors wanted and how Tony likes to work – to augment as much real practical stuff as possible,” Rahhali remarks. “I knew we couldn’t put the whole thing on fire, but Tony tried and put as much fire as he could safely, and then we just built off that [in VFX] and added a lot more. Even when it’s just a little bit of real fire, the lighting and interaction can’t be simulated, so I think it was a success in terms of blending that practical with the visual.” The Skyview restaurant required building a massive set that was fire retardant. Construction on the set took longer because it had to be built and coated with special materials. As the Skyview restaurant falls into chaos, much of the fire was practical. (Photo: Eric Milner) “We got all the Vancouver skyline [with drones] so we could rebuild our version of the city, which was based a little on the Vancouver footprint.
So, we used all that to build a digital recreation of a city that was in line with what the directors wanted, which was a coastal city somewhere in the States that doesn’t necessarily have to be Vancouver or Seattle, but it looks a little like the Pacific Northwest.” —Christian Sebaldt, ASC, Director of Photography For drone shots, the team utilized a custom heavy-lift drone with three RED Komodo Digital Cinema cameras “giving us almost 180 degrees with overlap that we would then stitch in post and have a ridiculous amount of resolution off these three cameras,” Sebaldt states. “The other drone we used was a DJI Inspire 3, which was also very good. And we flew these drones up at the height [we needed]. We flew them at different times of day. We flew full 360s, and we also used them for photogrammetry. We got all the Vancouver skyline so we could rebuild our version of the city, which was based a little on the Vancouver footprint. So, we used all that to build a digital recreation of a city that was in line with what the directors wanted, which was a coastal city somewhere in the States that doesn’t necessarily have to be Vancouver or Seattle, but it looks a little like the Pacific Northwest.” Rahhali adds, “All of this allowed us to figure out what we were going to shoot. We had the stage build, and we had the drone footage that we then digitized and created a 3D asset to go on the wall [so] we could change the times of day.” Pixomondo built the volume and the asset that went on the LED wall for the Skyview sequence. They were also the main vendor during post. FOLKS VFX and Picture Shop contributed. (Photo: Eric Milner) “We did extensive previs with Digital Domain,” Rahhali explains. “That was important because we knew the key shots that the directors wanted.
With a combination of those key shots, we then kind of reverse-engineered [them] while we did techvis off the previs and worked with Christian and the art department so we would have proper flexibility with the set to be able to pull off some of these shots. [For example,] some of these shots required the Skyview restaurant ceiling to be lifted and partially removed for us to get a crane to shoot Paul [Max Lloyd-Jones] as he’s about to fall and the camera’s going through a roof, that we then digitally had to recreate. Had we not done the previs to know those shots in advance, we would not have been able to build that in time to accomplish the look. We had many other shots that were driven off the previs that allowed the art department, construction and camera teams to work out how they would get those shots.” Some shots required the Skyview’s ceiling to be lifted and partially removed to get a crane to shoot Paul Campbell (Max Lloyd-Jones) as he’s about to fall. The character Iris lived in a fortified house, isolating herself methodically to avoid the Grim Reaper. Rahhali comments, “That was a beautiful location [in] GVRD [Greater Vancouver], very cold. It was a long, hard shoot, because it was all nights. It was just this beautiful pocket out in the middle of the mountains. We in visual effects didn’t do a ton other than a couple of clean-ups of the big establishing shots when you see them pull up to the compound. We had to clean up small roads we wanted to make look like one road and make the road look like dirt.” There were flames involved. Sebaldt says, “The explosion [of Iris’s home] was unbelievably big. We had eight cameras on it at night and shot it at high speed, and we’re all going ‘Whoa.’” Rahhali notes, “There was some clean-up, but the explosion was 100% practical. Our Special Effects Supervisor, Tony, went to town on that. 
He blew up the whole house, and it looked spectacular.” The tattoo shop piercing scene is one of the most talked-about sequences in the movie, where a dangling chain from a ceiling fan attaches itself to the septum nose piercing of Erik Campbell (Richard Harmon) and drags him toward a raging fire. Rahhali observes, “That was very Final Destination and a great Rube Goldberg build-up event. Richard was great. He was tied up on a stunt line for most of it, balancing on top of furniture. All of that was him doing it for real with a stunt line.” Some effects solutions can be surprisingly simple. Rahhali continues, “Our producer [Craig Perry] came up with a great gag [for the] septum ring.” Richard’s nose was connected with just a nose plug that went inside his nostrils. “All that tugging and everything that you’re seeing was real. For weeks and weeks, we were all trying to figure out how to do it without it being a big visual effects thing. ‘How are we gonna pull his nose for real?’ Craig said, ‘I have these things I use to help me open up my nose and you can’t really see them.’ They built it off of that, and it looked great.” Filmmakers spent weeks figuring out how to execute the harrowing tattoo shop scene. A dangling chain from a ceiling fan attaches itself to the septum nose ring of Erik Campbell (Richard Harmon) – with the actor’s nose being tugged by the chain connected to a nose plug that went inside his nostrils. “[S]ome of these shots required the Skyview restaurant ceiling to be lifted and partially removed for us to get a crane to shoot Paul [Campbell] as he’s about to fall and the camera’s going through a roof, that we then digitally had to recreate. Had we not done the previs to know those shots in advance, we would not have been able to build that in time to accomplish the look.
We had many other shots that were driven off the previs that allowed the art department, construction and camera teams to work out how they would get those shots.” —Nordin Rahhali, VFX Supervisor Most of the fire in the tattoo parlor was practical. “There are some fire bars and stuff that you’re seeing in there from SFX and the big pool of fire on the wide shots.” Sebaldt adds, “That was a lot of fun to shoot because it’s so insane when he’s dancing and balancing on all this stuff – we were laughing and laughing. We were convinced that this was going to be the best scene in the movie up to that moment.” Rahhali says, “They used the scene wholesale for the trailer. It went viral – people were taking out their septum rings.” Erik survives the parlor blaze only to meet his fate in a hospital when he is pulled by a wheelchair into an out-of-control MRI machine at its highest magnetic level. Rahhali comments, “That is a good combination of a bunch of different departments. Our Stunt Coordinator, Simon Burnett, came up with this hard pull-wire line [for] when Erik flies and hits the MRI. That’s a real stunt with a double, and he hit hard. All the other shots are all CG wheelchairs because the directors wanted to art-direct how the crumpling metal was snapping and bending to show pressure on him as his body starts going into the MRI.” To augment the believability that comes with reality, the directors aimed to capture as much practically as possible, then VFX Supervisor Nordin Rahhali and his team built on that result. (Photo: Eric Milner) A train derailment concludes the film after Stefani and her brother, Charlie, realize they are still on death’s list. A train goes off the tracks, and logs from one of the cars fly through the air and kill them. “That one was special because it’s a hard sequence and was also shot quite late, so we didn’t have a lot of time. We went back to Vancouver and shot the actual street, and we shot our actors performing.
They fell onto stunt pads, and the moment they get touched by the logs, it turns into CG as it was the only way to pull that off and the train of course. We had to add all that. The destruction of the houses and everything was done in visual effects.” Erik survives the tattoo parlor blaze only to meet his fate in a hospital when he is crushed by a wheelchair while being pulled into an out-of-control MRI machine. Erik (Richard Harmon) appears about to be run over by a delivery truck at the corner of 21A Ave. and 132A St., but he’s not – at least not then. The truck is actually on the opposite side of the road, and the person being run over is Howard. A rolling penny plays a major part in the catastrophic chain reactions and seems to be a character itself. “The magic penny was a mix from two vendors, Pixomondo and FOLKS; both had penny shots,” Rahhali says. “All the bouncing pennies you see going through the vents and hitting the fan blade are all FOLKS. The bouncing penny at the end as a lady takes it out of her purse, that goes down the ramp and into the rail – that’s FOLKS. The big explosion shots in the Skyview with the penny slowing down after the kid throws it [off the deck] are all Pixomondo shots. It was a mix. We took a little time to find that balance between readability and believability.” Approximately 800 VFX shots were required for Final Destination Bloodlines. (Photo: Eric Milner) Chain reactions of small and big events lead to bloody catastrophes befalling those who have cheated Death at some point in the Final Destination films. From left: Kaitlyn Santa Juana as Stefani Reyes, director Adam Stein, director Zach Lipovsky and Gabrielle Rose as Iris. (Photo: Eric Milner) Rahhali adds, “The film is a great collaboration of departments. Good visual effects are always a good combination of special effects, makeup effects and cinematography; it’s all the planning of all the pieces coming together. For a film of this size, I’m really proud of the work. 
I think we punched above our weight class, and it looks quite good.”
FROM SET TO PIXELS: CINEMATIC ARTISTS COME TOGETHER TO CREATE POETRY
By TREVOR HOGG
Denis Villeneuve finds that one difficulty of working with visual effects is the intermediaries who sometimes stand between him and the artists, and therefore the need to be precise with directions to keep things on track. If post-production has any chance of going smoothly, there must be a solid on-set relationship between the director, cinematographer and visual effects supervisor. “It’s my job to have a vision and to bring it to the screen,” notes Denis Villeneuve, director of Dune: Part Two. “That’s why working with visual effects requires a lot of discipline. It’s not like you work with a keyboard and can change your mind all the time. When I work with a camera, I commit to a mise-en-scène. I’m trying to take the risk, move forward in one direction and enhance it with visual effects. I push it until it looks perfect. It takes a tremendous amount of time and preparation. Paul Lambert is a perfectionist, and I love that about him. We will never put a shot on the screen that we don’t feel has a certain level of quality. It needs to look as real as the face of my actor.”
A legendary cinematographer had a significant influence on how Villeneuve approaches digital augmentation. “Someone I have learned a lot from about visual effects is Roger Deakins. I remember that at the beginning, when I was doing Blade Runner 2049, some artwork was not defined enough, and I was like, ‘I will correct that later.’ Roger said, ‘No. Don’t do that. You have to make sure right at the start.’ I’ve learned the hard way that you need to be as precise as you can, otherwise it goes in a lot of directions.”
Motion capture is visually jarring because your eye is always drawn to the performer in the mocap suit, but it worked out well on Better Man because the same thing happens when he gets replaced by a CG monkey. Visual effects enabled the atmospherics on Wolfs to be art directed, which is not always possible with practical snow. One of the most complex musical numbers in Better Man is “Rock DJ,” which required LiDAR scans of Regent Street and doing full 3D motion capture with the dancers dancing down the whole length of the street to work out how best to shoot it. Cinematographer Dan Mindel favors on-set practical effects because the reactions from the cast come across as being more genuine, which was the case for Twisters. Storyboards are an essential part of the planning process. “When I finish a screenplay, the first thing I do is to storyboard, not just to define the visual element of the movie, but also to rewrite the movie through images,” Villeneuve explains. “Those storyboards inform my crew about the design, costumes, accessories and vehicles, and create a visual inner rhythm of the film. This is the first step towards visual effects where there will be a conversation that will start from the boards. That will be translated into previs to help the animators know where we are going because the movie has to be made in a certain timeframe and needs choreography to make sure everybody is moving in the same direction.” The approach towards filmmaking has not changed over the years. “You have a camera and a couple of actors in front of you, and it’s about finding the right angle; the rest is noise. I try to protect the intimacy around the camera as much as possible and focus on that because if you don’t believe the actor, then you won’t believe anything.”
Before transforming singer Robbie Williams into a CG primate, Michael Gracey started as a visual effects artist. “I feel so fortunate to have come from a visual effects background early on in my career,” recalls Michael Gracey, director of Better Man. “I would sit down and do all the post myself because I didn’t trust anyone to care as much as I did. Fortunately, over the years I’ve met people who do. It’s a huge part of how I even scrapbook ideas together. Early on, I was constantly throwing stuff up in Flame, doing a video test and asking, ‘Is this going to work?’ Jumping into 3D was something I felt comfortable doing. I’ve been able to plan out or previs ideas. It’s an amazing tool to be armed with if you are a director and have big ideas and you’re trying to convey them to a lot of people.” Previs was pivotal in getting Better Man financed. “Off the page, people were like, ‘Is this monkey even going to work?’ Then they were worried that it wouldn’t work in a musical number. We showed them the previs for Feel, the first musical number, and My Way at the end of the film. I would say, ‘If you get any kind of emotion watching these musical numbers, just imagine what it’s going to be like when it’s filmed and is photoreal.’”
Several shots had to be stitched together to create a ‘oner’ that features numerous costume changes and 500 dancers. “For Rock DJ, we were doing LiDAR scans of Regent Street and full 3D motion capture with the dancers dancing down the whole length of the street to work out all of the transition points and how best to shoot it,” Gracey states. “That process involved Erik Wilson, the Cinematographer; Luke Millar, the Visual Effects Supervisor; Ashley Wallen, the Choreographer; and Patrick Correll, Co-Producer. Patrick would sit on set and, in DaVinci Resolve, take the feed from the camera and check every take against the blueprint that we had already previs’d.” Motion capture is visually jarring to shoot. “Everything that is in-camera looks perfect, then a guy walks in wearing a mocap suit and your eye zooms onto him. But the truth is, your eye does that the moment you replace him with a monkey as well. It worked out quite well because that idea is true to what it is to be famous. A famous person walks into the room and your eye immediately goes to them.”
Digital effects have had a significant impact on a particular area of filmmaking. “Physical effects were a much higher art form than they are now, or were allowed to be then,” notes Dan Mindel, Cinematographer on Twisters. “People will decline a real pyrotechnic explosion and do a digital one. But you get a much bigger reaction when there’s actual noise and flash.” It is all about collaboration. Mindel explains, “The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys, because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world. When we made Twisters, it was an analog movie with digital effects, and it worked great. That’s because everyone on set doing the technical work understood both formats, and we were able to use them well.”
Digital filmmaking has caused a generational gap. “The younger directors don’t think holistically,” Mindel notes. “It’s much more post-driven because they want to manipulate on the Avid or whatever platform it is going to be. What has happened is that the overreaching nature of these tools has left very little to the imagination. A movie that is heavy on visual effects is mostly conceptualized on paper using computer-generated graphics and color; that insidiously sneaks into the look and feel of the movie before you know it. You see concept art blasted all over production offices. People could get used to looking at those images, and before you know it, that’s how the movie looks. That’s a very dangerous place to be, not to have the imagination to work around an issue that perhaps doesn’t manifest itself until you’re shooting.” There has to be a sense of purpose. Mindel remarks, “The ability to shoot in a way that doesn’t allow any manipulation in post is the only way to guarantee that there’s just one direction the look can go in. But that could be a little dangerous for some people. Generally, the crowd I’m working with is part of a team, and there’s little thought of taking the movie to a different place than what was shot. I work in the DI with the visual effects supervisor, and we look at our work together so we’re all in agreement that it fits into the movie.”
“All of the advances in technology are a push for greater control,” notes Larkin Seiple, Cinematographer on Everything Everywhere All at Once. “There are still a lot of things that we do with visual effects that we could do practically, but a lot of times it’s more efficient, or we have more attempts at it later in post, than if we had tried to do it practically. I find today, there’s still a debate about what we do on set and what we do later digitally. Many directors have been trying to do more on set, and the best visual effects supervisors I work with push to do everything in-camera as much as possible to make it as realistic as possible.” Storytelling is about figuring out where to invest your time and effort. Seiple states, “I like the adventure of filmmaking. I prefer to go to a mountain top and shoot some of the scenes, get there and be inspired, as opposed to recreate it. Now, if it’s a five-second cutaway, I don’t want production to go to a mountain top and do that. For car work, we’ll shoot the real streets, figure out the time of day and even light the plates for it. Then, I’ll project those on LED walls with actors in a car on a stage. I love doing that because then I get to control how that looks.”
Visual effects have freed Fallout Cinematographer Stuart Dryburgh to shoot quicker and in places that in the past would have been deemed imperfect because of power lines, out-of-period buildings or the sky. Visual effects assist in achieving the desired atmospherics. Seiple says, “On Wolfs, we tried to bring in our own snow for every scene. We would shoot one take, the snow would blow left, and the next take would blow right. Janek Sirrs is probably the best visual effects supervisor I’ve worked with, and he was like, ‘Please turn off the snow. It’ll be a nightmare trying to remove the snow from all these shots then add our own snow back for continuity because you can’t have the snow changing direction every other cut.’ Or we’d have to ‘snow’ a street, which would take ages. Janek would say, ‘Let’s put enough snow on the ground to see the lighting on it and where the actors walk. We’ll do the rest of the street later because we have a perfect reference of what it should look like.’” Certain photographic principles have to be carried over into post-production to make shots believable to the eye. Seiple explains, “When you make all these amazing details that should be out of focus sharper, then the image feels like a visual effect because it doesn’t work the way a lens would work.” Familiarity with the visual effects process is an asset in being able to achieve the best result. “I inadvertently come from a lot of visual effects-heavy shoots and shows, so I’m quick to have an opinion about it. Many directors love to reference the way David Fincher uses visual effects because there is such great behind-the-scenes imagery that showcases how they were able to do simple things. Also, I like to shoot tests even on an iPhone to see if this comp will work or if this idea is a good one.”
Cinematographer Fabian Wagner and VFX Supervisor John Moffatt spent a lot of time in pre-production for Venom: The Last Dance discussing how to bring out the texture of the symbiote through lighting and camera angles. Game of Thrones Director of Photography Fabian Wagner had to make key decisions while prepping and breaking down the script so visual effects had enough time to meet deadlines. Twisters was an analog movie with digital effects that worked well because everyone on set doing the technical work understood both formats. For Cinematographer Larkin Seiple, storytelling is about figuring out where to invest your time and effort. Scene from the Netflix series Beef. Cinematographer Larkin Seiple believes that all of the advances in technology are a push for greater control, which occurred on Everything Everywhere All at Once. Nothing beats reality when it comes to realism. “Every project I do I talk more about the real elements to bring into the shoot than the visual effect element because the more practical stuff that you can do on set, the more it will embed the visual effects into the image, and, therefore, they’re more real,” observes Fabian Wagner, Cinematographer on Venom: The Last Dance. “It also depends on the job you’re doing in terms of how real or unreal you want it to be. Game of Thrones was a good example because it was a visual effects-heavy show, but they were keen on pushing the reality of things as much as possible. We were doing interactive lighting and practical on-set things to embed the visual effects. It was successful.” Television has a significantly compressed schedule compared to feature films. “There are fewer times to iterate. You have to be much more precise. On Game of Thrones, we knew that certain decisions had to be made early on while we were still prepping and breaking down the script.
Because of their due dates, to be ready in time, they had to start the visual effects process for certain dragon scenes months before we even started shooting.”
“Like everything else, it’s always about communication,” Wagner notes. “I’ve been fortunate to work with extremely talented and collaborative visual effects supervisors, visual effects producers and directors. I have become friends with most of those visual effects departments throughout the shoot, so it’s easy to stay in touch. Even when Venom: The Last Dance was posting, I would be talking to John Moffatt, who was our talented visual effects supervisor. We would exchange emails, text messages or phone calls once a week, and he would send me updates, which we would talk about. If I gave any notes or thoughts, John would listen, and if it were possible to do anything about it, he would. In the end, it’s about those personal relationships, and if you have those, that can go a long way.” Wagner has had to deal with dragons, superheroes and symbiotes. “They’re all the same to me! For the symbiote, we had two previous films to see what they had done, where they had succeeded and where we could improve it slightly. While prepping, John and I spent a lot of time talking about how to bring out the texture of the symbiote and help it with the lighting and camera angles. One of the earliest tests was to see what would happen if we backlit or side lit it as well as trying different textures for reflections. We came up with something we all were happy with, and that’s what we did on set. It was down to trying to speak the same language and aiming for the same thing, which in this case was, ‘How could we make the symbiote look the coolest?’”
Visual effects has become a crucial department throughout the filmmaking process. “The relationship with the visual effects supervisor is new,” states Stuart Dryburgh, Cinematographer on Fallout. “We didn’t really have that. On The Piano, the extent of the visual effects was having somebody scribbling in a lightning strike over a stormy sky and a little flash of an animated puppet. Runaway Bride had a two-camera setup where one of the cameras pushed into the frame, and that was digitally removed, but we weren’t using it the way we’re using it now. For East of Eden, we’re recreating 19th and early 20th century Connecticut, Boston and Salinas, California in New Zealand. While we have some great sets built and historical buildings that we can use, there is a lot of set extension and modification, and some complete bluescreen scenes, which allow us to more realistically portray a historical environment than we could have done back in the day.” The presence of a visual effects supervisor simplified principal photography. Dryburgh adds, “In many ways, using visual effects frees you to shoot quicker and in places that might otherwise be deemed imperfect because of one little thing, whether it’s power lines or out-of-period buildings or sky. All of those can be easily fixed. Most of us have been doing it for long enough that we have a good idea of what can and can’t be done and how it’s done so that the visual effects supervisor isn’t the arbiter.”
Lighting cannot be arbitrarily altered in post as it never looks right. “Whether you set the lighting on the set and the background artist has to match that, or you have an existing background and you, as a DP, have to match that – that is the lighting trick to the whole thing,” Dryburgh observes. “Everything has to be the same, a soft or hard light, the direction and color. Those things all need to line up in a composited shot; that is crucial.” Every director has his or her own approach to filmmaking. “Harold Ramis told me, ‘I’ll deal with the acting and the words. You just make it look nice, alright?’ That’s the conversation we had about shots, and it worked out well. Garth Davis, who I’m working with now, is a terrific photographer in his own right and has a great visual sense, so he’s much more involved in anything visual, whether it be the designs of the sets, creation of the visual effects, my lighting or choice of lenses. It becomes much more collaborative. And that applies to the visual effects department as well.” Recreating vintage lenses digitally is an important part of the visual aesthetic. “As digital photography has become crisper, better and sharper, people have chosen to use less-perfect optics, such as lenses that are softer on the edges or give a flare characteristic. Before production, we have the camera department shoot all of these lens grids of different packages and ranges, and visual effects takes that information so they can model every lens. If they’re doing a fully CG background, they can apply that lens characteristic,” remarks Dryburgh.
Television schedules for productions like House of the Dragon do not allow a lot of time to iterate, so decisions have to be precise.Bluescreen and stunt doubles on Twisters.“The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world.”
—Dan Mindel, Cinematographer, Twisters
Cinematographers like Greig Fraser have adopted Unreal Engine. “Greig has an incredible curiosity about new technology, and that helped us specifically with Dune: Part Two,” Villeneuve explains. “Greig was using Unreal Engine to capture natural environments. For example, if we decide to shoot in that specific rocky area, we’ll capture the whole area with drones to recreate the terrain in the computer. If I said, ‘I want to shoot in that valley on November 3rd and have the sun behind the actors. At what time is it? You have to be there at 9:45 am.’ We built the whole schedule like a puzzle to maximize the power of natural light, but that came through those studies, which were made with the software usually used for video games.” Technology is essentially a tool that keeps evolving. Villeneuve adds, “Sometimes, I don’t know if I feel like a dinosaur or if my last movie will be done in this house behind the computer alone. It would be much less tiring to do that, but seriously, the beauty of cinema is the idea of bringing many artists together to create poetry.”
FROM SET TO PIXELS: CINEMATIC ARTISTS COME TOGETHER TO CREATE POETRY
By TREVOR HOGG
Denis Villeneuve (Dune: Part Two) finds that the difficulty of working with visual effects is that intermediaries sometimes stand between him and the artists, hence the need to be precise with directions to keep things on track. (Image courtesy of Warner Bros. Pictures)
If post-production has any chance of going smoothly, there must be a solid on-set relationship between the director, cinematographer and visual effects supervisor. “It’s my job to have a vision and to bring it to the screen,” notes Denis Villeneuve, director of Dune: Part Two. “That’s why working with visual effects requires a lot of discipline. It’s not like you work with a keyboard and can change your mind all the time. When I work with a camera, I commit to a mise-en-scène. I’m trying to take the risk, move forward in one direction and enhance it with visual effects. I push it until it looks perfect. It takes a tremendous amount of time and preparation. [VFX Supervisor] Paul Lambert is a perfectionist, and I love that about him. We will never put a shot on the screen that we don’t feel has a certain level of quality. It needs to look as real as the face of my actor.” A legendary cinematographer had a significant influence on how Villeneuve approaches digital augmentation. “Someone I have learned a lot from about visual effects is [Cinematographer] Roger Deakins. I remember that at the beginning, when I was doing Blade Runner 2049, some artwork was not defined enough, and I was like, ‘I will correct that later.’ Roger said, ‘No. Don’t do that. 
You have to make sure right at the start.’ I’ve learned the hard way that you need to be as precise as you can, otherwise it goes in a lot of directions.”
Motion capture is visually jarring because your eye is always drawn to the performer in the mocap suit, but it worked out well on Better Man because the same thing happens when he gets replaced by a CG monkey. (Image courtesy of Paramount Pictures)
Visual effects enabled the atmospherics on Wolfs to be art directed, which is not always possible with practical snow. (Image courtesy of Apple Studios)
One of the most complex musical numbers in Better Man is “Rock DJ,” which required LiDAR scans of Regent Street and full 3D motion capture with the dancers dancing down the whole length of the street to work out how best to shoot it. (Image courtesy of Paramount Pictures)
Cinematographer Dan Mindel favors on-set practical effects because the reactions from the cast come across as being more genuine, which was the case for Twisters. (Image courtesy of Universal Pictures)
Storyboards are an essential part of the planning process. “When I finish a screenplay, the first thing I do is to storyboard, not just to define the visual element of the movie, but also to rewrite the movie through images,” Villeneuve explains. “Those storyboards inform my crew about the design, costumes, accessories and vehicles, and [they] create a visual inner rhythm of the film. This is the first step towards visual effects where there will be a conversation that will start from the boards. That will be translated into previs to help the animators know where we are going because the movie has to be made in a certain timeframe and needs choreography to make sure everybody is moving in the same direction.” The approach towards filmmaking has not changed over the years. “You have a camera and a couple of actors in front of you, and it’s about finding the right angle; the rest is noise. 
I try to protect the intimacy around the camera as much as possible and focus on that because if you don’t believe the actor, then you won’t believe anything.” Before transforming singer Robbie Williams into a CG primate, Michael Gracey started as a visual effects artist. “I feel so fortunate to have come from a visual effects background early on in my career,” recalls Michael Gracey, director of Better Man. “I would sit down and do all the post myself because I didn’t trust anyone to care as much as I did. Fortunately, over the years I’ve met people who do. It’s a huge part of how I even scrapbook ideas together. Early on, I was constantly throwing stuff up in Flame, doing a video test and asking, ‘Is this going to work?’ Jumping into 3D was something I felt comfortable doing. I’ve been able to plan out or previs ideas. It’s an amazing tool to be armed with if you are a director and have big ideas and you’re trying to convey them to a lot of people.” Previs was pivotal in getting Better Man financed. “Off the page, people were like, ‘Is this monkey even going to work?’ Then they were worried that it wouldn’t work in a musical number. We showed them the previs for Feel, the first musical number, and My Way at the end of the film. I would say, ‘If you get any kind of emotion watching these musical numbers, just imagine what it’s going to be like when it’s filmed and is photoreal.’” Several shots had to be stitched together to create a ‘oner’ that features numerous costume changes and 500 dancers. “For Rock DJ, we were doing LiDAR scans of Regent Street and full 3D motion capture with the dancers dancing down the whole length of the street to work out all of the transition points and how best to shoot it,” Gracey states. “That process involved Erik Wilson, the Cinematographer; Luke Millar, the Visual Effects Supervisor; Ashley Wallen, the Choreographer; and Patrick Correll, Co-Producer. 
Patrick would sit on set and, in DaVinci Resolve, take the feed from the camera and check every take against the blueprint that we had already previsualized.” Motion capture is visually jarring to shoot. “Everything that is in-camera looks perfect, then a guy walks in wearing a mocap suit and your eye zooms onto him. But the truth is, your eye does that the moment you replace him with a monkey as well. It worked out quite well because that idea is true to what it is to be famous. A famous person walks into the room and your eye immediately goes to them.” Digital effects have had a significant impact on a particular area of filmmaking. “Physical effects were a much higher art form then, or were allowed to be, than they are now,” notes Dan Mindel, Cinematographer on Twisters. “People will decline a real pyrotechnic explosion and do a digital one. But you get a much bigger reaction when there’s actual noise and flash.” It is all about collaboration. Mindel explains, “The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys, because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world. When we made Twister, it was an analog movie with digital effects, and it worked great. That’s because everyone on set doing the technical work understood both formats, and we were able to use them well.” Digital filmmaking has caused a generational gap. “The younger directors don’t think holistically,” Mindel notes. “It’s much more post-driven because they want to manipulate on the Avid or whatever platform it is going to be. 
What has happened is that the overreaching nature of these tools has left very little to the imagination. A movie that is heavy on visual effects is mostly conceptualized on paper using computer-generated graphics and color; that insidiously sneaks into the look and feel of the movie before you know it. You see concept art blasted all over production offices. People could get used to looking at those images, and before you know it, that’s how the movie looks. That’s a very dangerous place to be, not to have the imagination to work around an issue that perhaps doesn’t manifest itself until you’re shooting.” There has to be a sense of purpose. Mindel remarks, “The ability to shoot in a way that doesn’t allow any manipulation in post is the only way to guarantee that there’s just one direction the look can go in. But that could be a little dangerous for some people. Generally, the crowd I’m working with is part of a team, and there’s little thought of taking the movie to a different place than what was shot. I work in the DI with the visual effects supervisor, and we look at our work together so we’re all in agreement that it fits into the movie.” “All of the advances in technology are a push for greater control,” notes Larkin Seiple, Cinematographer on Everything Everywhere All at Once. “There are still a lot of things that we do with visual effects that we could do practically, but a lot of times it’s more efficient, or we have more attempts at it later in post, than if we had tried to do it practically. I find today, there’s still a debate about what we do on set and what we do later digitally. Many directors have been trying to do more on set, and the best visual effects supervisors I work with push to do everything in-camera as much as possible to make it as realistic as possible.” Storytelling is about figuring out where to invest your time and effort. Seiple states, “I like the adventure of filmmaking. 
I prefer to go to a mountain top and shoot some of the scenes, get there and be inspired, as opposed to recreating it. Now, if it’s a five-second cutaway, I don’t want production to go to a mountain top and do that. For car work, we’ll shoot the real streets, figure out the time of day and even light the plates for it. Then, I’ll project those on LED walls with actors in a car on a stage. I love doing that because then I get to control how that looks.”
Visual effects have freed Fallout Cinematographer Stuart Dryburgh to shoot quicker and in places that in the past would have been deemed imperfect because of power lines, out-of-period buildings or the sky. (Image courtesy of Prime Video)
Visual effects assist in achieving the desired atmospherics. Seiple says, “On Wolfs, we tried to bring in our own snow for every scene. We would shoot one take, the snow would blow left, and the next take would blow right. Janek Sirrs is probably the best visual effects supervisor I’ve worked with, and he was like, ‘Please turn off the snow. It’ll be a nightmare trying to remove the snow from all these shots then add our own snow back for continuity because you can’t have the snow changing direction every other cut.’ Or we’d have to ‘snow’ a street, which would take ages. Janek would say, ‘Let’s put enough snow on the ground to see the lighting on it and where the actors walk. We’ll do the rest of the street later because we have a perfect reference of what it should look like.’” Certain photographic principles have to be carried over into post-production to make shots believable to the eye. Seiple explains, “When you make all these amazing details that should be out of focus sharper, then the image feels like a visual effect because it doesn’t work the way a lens would work.” Familiarity with the visual effects process is an asset in being able to achieve the best result. “I inadvertently come from a lot of visual effects-heavy shoots and shows, so I’m quick to have an opinion about it. 
Many directors love to reference the way David Fincher uses visual effects because there is such great behind-the-scenes imagery that showcases how they were able to do simple things. Also, I like to shoot tests even on an iPhone to see if this comp will work or if this idea is a good one.”
Cinematographer Fabian Wagner and VFX Supervisor John Moffatt spent a lot of time in pre-production for Venom: The Last Dance discussing how to bring out the texture of the symbiote through lighting and camera angles. (Image courtesy of Columbia Pictures)
Game of Thrones Director of Photography Fabian Wagner had to make key decisions while prepping and breaking down the script so visual effects had enough time to meet the deadline. (Image courtesy of HBO)
Twisters was an analog movie with digital effects that worked well because everyone on set doing the technical work understood both formats. (Image courtesy of Universal Pictures)
For Cinematographer Larkin Seiple, storytelling is about figuring out where to invest your time and effort. Scene from the Netflix series Beef.
Cinematographer Larkin Seiple believes that all of the advances in technology are a push for greater control, which occurred on Everything Everywhere All at Once.
Nothing beats reality when it comes to realism. “Every project I do, I talk more about the real elements to bring into the shoot than the visual effects element because the more practical stuff that you can do on set, the more it will embed the visual effects into the image, and, therefore, they’re more real,” observes Fabian Wagner, Cinematographer on Venom: The Last Dance. “It also depends on the job you’re doing in terms of how real or unreal you want it to be. Game of Thrones was a good example because it was a visual effects-heavy show, but they were keen on pushing the reality of things as much as possible. We were doing interactive lighting and practical on-set things to embed the visual effects. It was successful.” Television has a significantly compressed schedule compared to feature films. “There are fewer times to iterate. 
You have to be much more precise. On Game of Thrones, we knew that certain decisions had to be made early on while we were still prepping and breaking down the script. Because of their due dates, to be ready in time, they had to start the visual effects process for certain dragon scenes months before we even started shooting.” “Like everything else, it’s always about communication,” Wagner notes. “I’ve been fortunate to work with extremely talented and collaborative visual effects supervisors, visual effects producers and directors. I have become friends with most of those visual effects departments throughout the shoot, so it’s easy to stay in touch. Even when Venom: The Last Dance was posting, I would be talking to John Moffatt, who was our talented visual effects supervisor. We would exchange emails, text messages or phone calls once a week, and he would send me updates, which we would talk about. If I gave any notes or thoughts, John would listen, and if it was possible to do anything about them, he would. In the end, it’s about those personal relationships, and if you have those, that can go a long way.” Wagner has had to deal with dragons, superheroes and symbiotes. “They’re all the same to me! For the symbiote, we had two previous films to see what they had done, where they had succeeded and where we could improve it slightly. While prepping, John and I spent a lot of time talking about how to bring out the texture of the symbiote and help it with the lighting and camera angles. One of the earliest tests was to see what would happen if we backlit or side lit it as well as trying different textures for reflections. We came up with something we all were happy with, and that’s what we did on set. It was down to trying to speak the same language and aiming for the same thing, which in this case was, ‘How could we make the symbiote look the coolest?’” Visual effects has become a crucial department throughout the filmmaking process. 
“The relationship with the visual effects supervisor is new,” states Stuart Dryburgh, Cinematographer on Fallout. “We didn’t really have that. On The Piano, the extent of the visual effects was having somebody scribbling in a lightning strike over a stormy sky and a little flash of an animated puppet. Runaway Bride had a two-camera setup where one of the cameras pushed into the frame, and that was digitally removed, but we weren’t using it the way we’re using it now. For East of Eden, we’re recreating 19th and early 20th century Connecticut, Boston and Salinas, California in New Zealand. While we have some great sets built and historical buildings that we can use, there is a lot of set extension and modification, and some complete bluescreen scenes, which allow us to more realistically portray a historical environment than we could have done back in the day.” The presence of a visual effects supervisor simplified principal photography. Dryburgh adds, “In many ways, using visual effects frees you to shoot quicker and in places that might otherwise be deemed imperfect because of one little thing, whether it’s power lines or out-of-period buildings or sky. All of those can be easily fixed. Most of us have been doing it for long enough that we have a good idea of what can and can’t be done and how it’s done so that the visual effects supervisor isn’t the arbiter.” Lighting cannot be arbitrarily altered in post as it never looks right. “Whether you set the lighting on the set and the background artist has to match that, or you have an existing background and you, as a DP, have to match that – that is the lighting trick to the whole thing,” Dryburgh observes. “Everything has to be the same, a soft or hard light, the direction and color. Those things all need to line up in a composited shot; that is crucial.” Every director has his or her own approach to filmmaking. “Harold Ramis told me, ‘I’ll deal with the acting and the words. 
You just make it look nice, alright?’ That’s the conversation we had about shots, and it worked out well. Garth Davis, who I’m working with now, is a terrific photographer in his own right and has a great visual sense, so he’s much more involved in anything visual, whether it be the designs of the sets, creation of the visual effects, my lighting or choice of lenses. It becomes much more collaborative. And that applies to the visual effects department as well.” Recreating vintage lenses digitally is an important part of the visual aesthetic. “As digital photography has become crisper, better and sharper, people have chosen to use less perfect optics, such as lenses that are softer on the edges or give a flare characteristic. Before production, we have the camera department shoot all of these lens grids of different packages and ranges, and visual effects takes that information so they can model every lens. If they’re doing a fully CG background, they can apply that lens characteristic,” remarks Dryburgh.
Television schedules for productions like House of the Dragon do not allow a lot of time to iterate, so decisions have to be precise.
Bluescreen and stunt doubles on Twisters.
“The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world.”
—Dan Mindel, Cinematographer, Twisters
Cinematographers like Greig Fraser have adopted Unreal Engine. 
“Greig has an incredible curiosity about new technology, and that helped us specifically with Dune: Part Two,” Villeneuve explains. “Greig was using Unreal Engine to capture natural environments. For example, if we decide to shoot in that specific rocky area, we’ll capture the whole area with drones to recreate the terrain in the computer. If I said, ‘I want to shoot in that valley on November 3rd and have the sun behind the actors. At what time is it? You have to be there at 9:45 am.’ We built the whole schedule like a puzzle to maximize the power of natural light, but that came through those studies, which were made with the software usually used for video games.” Technology is essentially a tool that keeps evolving. Villeneuve adds, “Sometimes, I don’t know if I feel like a dinosaur or if my last movie will be done in this house behind the computer alone. It would be much less tiring to do that, but seriously, the beauty of cinema is the idea of bringing many artists together to create poetry.” #set #pixels #cinematic #artists #comeWWW.VFXVOICE.COMFROM SET TO PIXELS: CINEMATIC ARTISTS COME TOGETHER TO CREATE POETRYBy TREVOR HOGG Denis Villeneuve (Dune: Part Two) finds the difficulty of working with visual effects are sometimes the intermediaries between him and the artists and therefore the need to be precise with directions to keep things on track. (Image courtesy of Warner Bros. Pictures) If post-production has any chance of going smoothly, there must be a solid on-set relationship between the director, cinematographer and visual effects supervisor. “It’s my job to have a vision and to bring it to the screen,” notes Denis Villeneuve, director of Dune: Part Two. “That’s why working with visual effects requires a lot of discipline. It’s not like you work with a keyboard and can change your mind all the time. When I work with a camera, I commit to a mise-en-scène. I’m trying to take the risk, move forward in one direction and enhance it with visual effects. 
I push it until it looks perfect. It takes a tremendous amount of time and preparation. [VFX Supervisor] Paul Lambert is a perfectionist, and I love that about him. We will never put a shot on the screen that we don’t feel has a certain level of quality. It needs to look as real as the face of my actor.” A legendary cinematographer had a significant influence on how Villeneuve approaches digital augmentation. “Someone I have learned a lot from about visual effects is [Cinematographer] Roger Deakins. I remember that at the beginning, when I was doing Blade Runner 2049, some artwork was not defined enough, and I was like, ‘I will correct that later.’ Roger said, ‘No. Don’t do that. You have to make sure right at the start.’ I’ve learned the hard way that you need to be as precise as you can, otherwise it goes in a lot of directions.” Motion capture is visually jarring because your eye is always drawn to the performer in the mocap suit, but it worked out well on Better Man because the same thing happens when he gets replaced by a CG monkey. (Image courtesy of Paramount Pictures) Visual effects enabled the atmospherics on Wolfs to be art directed, which is not always possible with practical snow. (Image courtesy of Apple Studios) One of the most complex musical numbers in Better Man is “Rock DJ,” which required LiDAR scans of Regent Street and doing full 3D motion capture with the dancers dancing down the whole length of the street to work out how best to shoot it. (Image courtesy of Paramount Pictures) Cinematographer Dan Mindel favors on-set practical effects because the reactions from the cast come across as being more genuine, which was the case for Twisters. (Image courtesy of Universal Pictures) Storyboards are an essential part of the planning process. “When I finish a screenplay, the first thing I do is to storyboard, not just to define the visual element of the movie, but also to rewrite the movie through images,” Villeneuve explains. 
“Those storyboards inform my crew about the design, costumes, accessories and vehicles, and [they] create a visual inner rhythm of the film. This is the first step towards visual effects where there will be a conversation that will start from the boards. That will be translated into previs to help the animators know where we are going because the movie has to be made in a certain timeframe and needs choreography to make sure everybody is moving in the same direction.” The approach towards filmmaking has not changed over the years. “You have a camera and a couple of actors in front of you, and it’s about finding the right angle; the rest is noise. I try to protect the intimacy around the camera as much as possible and focus on that because if you don’t believe the actor, then you won’t believe anything.” Before transforming singer Robbie Williams into a CG primate, Michael Gracey started as a visual effects artist. “I feel so fortu- nate to have come from a visual effects background early on in my career,” recalls Michael Gracey, director of Better Man. “I would sit down and do all the post myself because I didn’t trust anyone to care as much as I did. Fortunately, over the years I’ve met people who do. It’s a huge part of how I even scrapbook ideas together. Early on, I was constantly throwing stuff up in Flame, doing a video test and asking, ‘Is this going to work?’ Jumping into 3D was something I felt comfortable doing. I’ve been able to plan out or previs ideas. It’s an amazing tool to be armed with if you are a director and have big ideas and you’re trying to convey them to a lot of people.” Previs was pivotal in getting Better Man financed. “Off the page, people were like, ‘Is this monkey even going to work?’ Then they were worried that it wouldn’t work in a musical number. We showed them the previs for Feel, the first musical number, and My Way at the end of the film. 
I would say, ‘If you get any kind of emotion watching these musical numbers, just imagine what it’s going to be like when it’s filmed and is photoreal.” Several shots had to be stitched together to create a ‘oner’ that features numerous costume changes and 500 dancers. “For Rock DJ, we were doing LiDAR scans of Regent Street and full 3D motion capture with the dancers dancing down the whole length of the street to work out all of the transition points and how best to shoot it,” Gracey states. “That process involved Erik Wilson, the Cinematographer; Luke Millar, the Visual Effects Supervisor; Ashley Wallen, the Choreographer; and Patrick Correll, Co-Producer. Patrick would sit on set and, in DaVinci Resolve, take the feed from the camera and check every take against the blueprint that we had already previs.” Motion capture is visually jarring to shoot. “Everything that is in-camera looks perfect, then a guy walks in wearing a mocap suit and your eye zooms onto him. But the truth is, your eye does that the moment you replace him with a monkey as well. It worked out quite well because that idea is true to what it is to be famous. A famous person walks into the room and your eye immediately goes to them.” Digital effects have had a significant impact on a particular area of filmmaking. “Physical effects were a much higher art form than it is now, or it was allowed to be then than it is now,” notes Dan Mindel, Cinematographer on Twisters. “People will decline a real pyrotechnic explosion and do a digital one. But you get a much bigger reaction when there’s actual noise and flash.” It is all about collaboration. Mindel explains, “The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. 
Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys, because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world. When we made Twister, it was an analog movie with digital effects, and it worked great. That’s because everyone on set doing the technical work understood both formats, and we were able to use them well.” Digital filmmaking has caused a generational gap. “The younger directors don’t think holistically,” Mindel notes. “It’s much more post-driven because they want to manipulate on the Avid or whatever platform it is going to be. What has happened is that the overreaching nature of these tools has left very little to the imagination. A movie that is heavy visual effects is mostly conceptualized on paper using computer-generated graphics and color; that insidiously sneaks into the look and feel of the movie before you know it. You see concept art blasted all over production offices. People could get used to looking at those images, and before you know it, that’s how the movie looks. That’s a very dangerous place to be, not to have the imagination to work around an issue that perhaps doesn’t manifest itself until you’re shooting.” There has to be a sense of purpose. Mindel remarks, “The ability to shoot in a way that doesn’t allow any manipulation in post is the only way to guarantee that there’s just one direction the look can go in. But that could be a little dangerous for some people. Generally, the crowd I’m working with is part of a team, and there’s little thought of taking the movie to a different place than what was shot. 
I work in the DI with the visual effects supervisor, and we look at our work together so we’re all in agreement that it fits into the movie.” “All of the advances in technology are a push for greater control,” notes Larkin Seiple, Cinematographer on Everything Everywhere All at Once. “There are still a lot of things that we do with visual effects that we could do practically, but a lot of times it’s more efficient, or we have more attempts at it later in post, than if we had tried to do it practically. I find today, there’s still a debate about what we do on set and what we do later digitally. Many directors have been trying to do more on set, and the best visual effects supervisors I work with push to do everything in-camera as much as possible to make it as realistic as possible.” Storytelling is about figuring out where to invest your time and effort. Seiple states, “I like the adventure of filmmaking. I prefer to go to a mountain top and shoot some of the scenes, get there and be inspired, as opposed to recreate it. Now, if it’s a five-second cutaway, I don’t want production to go to a mountain top and do that. For car work, we’ll shoot the real streets, figure out the time of day and even light the plates for it. Then, I’ll project those on LED walls with actors in a car on a stage. I love doing that because then I get to control how that looks.” Visual effects have freed Fallout Cinematographer Stuart Dryburgh to shoot quicker and in places that in the past would have been deemed imperfect because of power lines, out-of-period buildings or the sky. (Image courtesy of Prime Video) Visual effects assist in achieving the desired atmospherics. Seiple says, “On Wolfs, we tried to bring in our own snow for every scene. We would shoot one take, the snow would blow left, and the next take would blow right. Janek Sirrs is probably the best visual effects supervisor I’ve worked with, and he was like, ‘Please turn off the snow. 
It’ll be a nightmare trying to remove the snow from all these shots then add our own snow back for continuity because you can’t have the snow changing direction every other cut.’ Or we’d have to ‘snow’ a street, which would take ages. Janek would say, ‘Let’s put enough snow on the ground to see the lighting on it and where the actors walk. We’ll do the rest of the street later because we have a perfect reference of what it should look like.’” Certain photographic principles have to be carried over into post-production to make shots believable to the eye. Seiple explains, “When you make all these amazing details that should be out of focus sharper, then the image feels like a visual effect because it doesn’t work the way a lens would work.” Familiarity with the visual effects process is an asset in being able to achieve the best result. “I inadvertently come from a lot of visual effects-heavy shoots and shows, so I’m quick to have an opinion about it,” Seiple says. “Many directors love to reference the way David Fincher uses visual effects because there is such great behind-the-scenes imagery that showcases how they were able to do simple things. Also, I like to shoot tests even on an iPhone to see if this comp will work or if this idea is a good one.” Cinematographer Fabian Wagner and VFX Supervisor John Moffatt spent a lot of time in pre-production for Venom: The Last Dance discussing how to bring out the texture of the symbiote through lighting and camera angles. (Image courtesy of Columbia Pictures) Game of Thrones Director of Photography Fabian Wagner had to make key decisions while prepping and breaking down the script so visual effects had enough time to meet deadlines. (Image courtesy of HBO) Twisters was an analog movie with digital effects that worked well because everyone on set doing the technical work understood both formats. (Image courtesy of Universal Pictures) For Cinematographer Larkin Seiple, storytelling is about figuring out where to invest your time and effort.
Scene from the Netflix series Beef. (Image courtesy of Netflix) Cinematographer Larkin Seiple believes that all of the advances in technology are a push for greater control, as was the case on Everything Everywhere All at Once. (Image courtesy of A24) Nothing beats reality when it comes to realism. “Every project I do, I talk more about the real elements to bring into the shoot than the visual effect element because the more practical stuff that you can do on set, the more it will embed the visual effects into the image, and, therefore, they’re more real,” observes Fabian Wagner, Cinematographer on Venom: The Last Dance. “It also depends on the job you’re doing in terms of how real or unreal you want it to be. Game of Thrones was a good example because it was a visual effects-heavy show, but they were keen on pushing the reality of things as much as possible. We were doing interactive lighting and practical on-set things to embed the visual effects. It was successful.” Television has a significantly compressed schedule compared to feature films. “There are fewer times to iterate,” Wagner says. “You have to be much more precise. On Game of Thrones, we knew that certain decisions had to be made early on while we were still prepping and breaking down the script. To be ready in time for their deadlines, they had to start the visual effects process for certain dragon scenes months before we even started shooting.” “Like everything else, it’s always about communication,” Wagner notes. “I’ve been fortunate to work with extremely talented and collaborative visual effects supervisors, visual effects producers and directors. I have become friends with most of those visual effects departments throughout the shoot, so it’s easy to stay in touch. Even when Venom: The Last Dance was posting, I would be talking to John Moffatt, who was our talented visual effects supervisor.
We would exchange emails, text messages or phone calls once a week, and he would send me updates, which we would talk about. If I gave any notes or thoughts, John would listen, and if it was possible to do anything about them, he would. In the end, it’s about those personal relationships, and if you have those, that can go a long way.” Wagner has had to deal with dragons, superheroes and symbiotes. “They’re all the same to me! For the symbiote, we had two previous films to see what they had done, where they had succeeded and where we could improve it slightly. While prepping, John and I spent a lot of time talking about how to bring out the texture of the symbiote and help it with the lighting and camera angles. One of the earliest tests was to see what would happen if we backlit or side lit it as well as trying different textures for reflections. We came up with something we all were happy with, and that’s what we did on set. It was down to trying to speak the same language and aiming for the same thing, which in this case was, ‘How could we make the symbiote look the coolest?’” Visual effects has become a crucial department throughout the filmmaking process. “The relationship with the visual effects supervisor is new,” states Stuart Dryburgh, Cinematographer on Fallout. “We didn’t really have that. On The Piano, the extent of the visual effects was having somebody scribbling in a lightning strike over a stormy sky and a little flash of an animated puppet. Runaway Bride had a two-camera setup where one of the cameras pushed into the frame, and that was digitally removed, but we weren’t using it the way we’re using it now. For [the 2026 Netflix limited series] East of Eden, we’re recreating 19th and early 20th century Connecticut, Boston and Salinas, California in New Zealand.
While we have some great sets built and historical buildings that we can use, there is a lot of set extension and modification, and some complete bluescreen scenes, which allow us to more realistically portray a historical environment than we could have done back in the day.” The presence of a visual effects supervisor simplified principal photography. Dryburgh adds, “In many ways, using visual effects frees you to shoot quicker and in places that might otherwise be deemed imperfect because of one little thing, whether it’s power lines or out-of-period buildings or sky. All of those can be easily fixed. Most of us have been doing it for long enough that we have a good idea of what can and can’t be done and how it’s done so that the visual effects supervisor isn’t the arbiter.” Lighting cannot be arbitrarily altered in post as it never looks right. “Whether you set the lighting on the set and the background artist has to match that, or you have an existing background and you, as a DP, have to match that – that is the lighting trick to the whole thing,” Dryburgh observes. “Everything has to be the same, a soft or hard light, the direction and color. Those things all need to line up in a composited shot; that is crucial.” Every director has his or her own approach to filmmaking. “Harold Ramis told me, ‘I’ll deal with the acting and the words. You just make it look nice, alright?’ That’s the conversation we had about shots, and it worked out well. [Director] Garth Davis, who I’m working with now, is a terrific photographer in his own right and has a great visual sense, so he’s much more involved in anything visual, whether it be the designs of the sets, creation of the visual effects, my lighting or choice of lenses. It becomes much more collaborative. And that applies to the visual effects department as well.” Recreating vintage lenses digitally is an important part of the visual aesthetic. 
“As digital photography has become crisper, better and sharper, people have chosen to use less-than-perfect optics, such as lenses that are softer on the edges or give a flare characteristic. Before production, we have the camera department shoot all of these lens grids of different packages and ranges, and visual effects takes that information so they can model every lens. If they’re doing a fully CG background, they can apply that lens characteristic,” remarks Dryburgh. Television schedules for productions like House of the Dragon do not allow a lot of time to iterate, so decisions have to be precise. (Image courtesy of HBO) Bluescreen and stunt doubles on Twisters. (Image courtesy of Universal Pictures) “The principle that I work with is that the visual effects department will make us look great, and we have to give them the raw materials in the best possible form so they can work with it instinctually. Sometimes, as a DP, you might want to do something different, but the bottom line is, you’ve got to listen to these guys because they know what they want. It gets a bit dogmatic, but most of the time, my relationship with visual effects is good, and especially the guys who have had a foot in the analog world at one point or another and have transitioned into the digital world.” —Dan Mindel, Cinematographer, Twisters Cinematographers like Greig Fraser have adopted Unreal Engine. “Greig has an incredible curiosity about new technology, and that helped us specifically with Dune: Part Two,” Villeneuve explains. “Greig was using Unreal Engine to capture natural environments. For example, if we decide to shoot in that specific rocky area, we’ll capture the whole area with drones to recreate the terrain in the computer. If I said, ‘I want to shoot in that valley on November 3rd and have the sun behind the actors. At what time is it?
You have to be there at 9:45 am.’ We built the whole schedule like a puzzle to maximize the power of natural light, but that came through those studies, which were made with the software usually used for video games.” Technology is essentially a tool that keeps evolving. Villeneuve adds, “Sometimes, I don’t know if I feel like a dinosaur or if my last movie will be done in this house behind the computer alone. It would be much less tiring to do that, but seriously, the beauty of cinema is the idea of bringing many artists together to create poetry.” -
SHINING A LIGHT ON ESSENTIAL DANISH VFX WITH PETER HJORTH
By OLIVER WEBB
Images courtesy of Peter Hjorth and Zentropa, except where noted.
Peter Hjorth.
When Peter Hjorth first started out, visual effects were virtually non-existent in the Danish film industry. “We had one guy at the lab who did work on the Oxberry, and I worked at a video production company,” Hjorth states. “I trained as a videotape editor, then it went into online. When the first digital tools arrived, I joined one of the hot post places where they got the first digital VTRs. All my first years of experience were with commercial clients and music videos and making the transition from analogue to digital in video post-production. I did a little bit of work for friends of mine where we actually did it at the lab. I’m old enough to have done stuff with the optical printer and waiting for weeks to get it right. There were some very early start-ups in Copenhagen doing files to film, and I started working with them.”
Hjorth’s first feature film came in 1998 with Thomas Vinterberg’s Festen, where he served as camera operator and digital consultant. Festen also marked Hjorth’s first foray into the Dogme 95 movement. “We shot on MiniDV, and I was attached to the whole project. I shot the second camera and then was asked if I could do some advanced work in visual effects for commercials. I was then asked by Lars von Trier to help out on Dancer in the Dark when he was starting.”
Working on Dancer in the Dark marked the beginning of Hjorth’s frequent collaborations with Lars von Trier. “That was sort of a two-fold thing because we had 100 DV cameras that needed some kind of infrastructure to work, and my television background was good for that. We also needed some visual effects work to get rid of some cameras. If you put 100 cameras in the same set, you’re going to get into a visual effects situation. So, I did that and worked on the editing. At that time, people were a little bit afraid of Lars, but I’m up for anything. We had a great time, especially during the editing and post-production.”
Hjorth was pleased with his collaboration with director Tarik Saleh on the U.S. film The Contractor, on which he served as Production Visual Effects Supervisor.
“There’s a special thing about Denmark, which is that we tend to all stick together… It’s not competitive in this way because people will get the jobs they get. Everybody realizes we have to work together, and what really matters is that we put something on screen that gives the audience a good experience.”
—Peter Hjorth, Visual Effects Supervisor
Initially, production experimented with a wall of cameras, where Hjorth did a test compositing them into a single image. Von Trier found it interesting, but felt it wasn’t right for Dancer in the Dark. He later came back to Hjorth with Dogville and explained that he wanted to implement the multi-camera technique for this project. “Lars didn’t want linear perspective; instead, he wanted something more like visual arts, fine arts, a notion of perspective, even cubism maybe,” Hjorth adds. “At that point in between those two projects, I did the first big Vinterberg film, It’s All About Love.” Hjorth worked as Visual Effects Supervisor for the film. “We did lots of precise visual effects, matched lenses, matched camera heights, everything by the book. Then I went into this totally crazy project for Lars and really developed a close understanding of what Lars wanted. We’ve done eight feature films and a TV series together. The last one was the third season of The Kingdom. I also did his last feature film, The House That Jack Built. I was Production Visual Effects Supervisor on all the stuff in-between, such as Antichrist and Melancholia.” Hjorth explains that he was very lucky to be in the right place at the right time. “Working on those projects has given me a network all over Europe with good people. We had some decent budgets, and people were thrilled to work on Lars’ films. I’ve made some excellent friends and good connections. If you wanted VFX for a movie in the early 2000s, you hired someone from a post house for a specific scene. The notion of a production visual effects supervisor was not very common in Denmark, and the role has since developed. I find that my contribution is now mostly in pre-production. With post-production, I usually take a step back and leave it to the vendors to get right, but I’m happy I’ve been able to assist when the need arose.”
Throughout his career, Hjorth has worked across the board as camera operator, colorist and editor. “I did some camera work on the side for music videos, and so on,” he explains. “When I speak to the DP and the gaffers, I know the language. I wouldn’t say I did great work as a cinematographer, but I know the language, the equipment and the limitations. Actually, my first job before even going into post-production was as an electrician. I used to work on really old, heavy movie lights back in the day, so I also know a little bit about departments on set and how it works. That has made it a little bit easier for me to be on set because as a visual effects supervisor, it can be a super scary experience. If you feel like a tourist, it’s just horrible. I, of course, worked on the Dogme 95 films, where we worked closely with the actors, and I’m not afraid to have a conversation with an actor. No matter how good the VFX is, if the actors don’t believe a scene they are in, it doesn’t work. So, I’ve been lucky to do a bit of everything, and I feel blessed that things turned out the way that they did.”
Starting out in Dogme 95 also proved to be a huge learning curve when it came to film language and understanding how to work within a set of specific rules and guidelines. The movement was founded by Lars von Trier and Thomas Vinterberg, who created the Dogme 95 Manifesto. The Manifesto consisted of 10 rules, which included: camera must be handheld, shooting must be done on location and special lighting isn’t allowed. “It’s a good background to have,” Hjorth states. “We’ve had rules for all of the films I’ve made with Lars, even on projects such as Melancholia.”
Setting rules hasn’t been limited to Danish cinema. “We made kind of a set of rules for the films I’ve made with Ali Abbasi, and that’s always made things easier,” Hjorth says. “He first called me when he was in film school. He was doing some early tests and was audacious enough to ask me for a VFX shot. It was hard to understand what he was saying, but then he talked about a scene with a guy coming out of a cake and he kills his brother, slicing his throat with a knife, and he wanted to see that in close-up. I appreciate younger directors calling and asking me to work with them, and it has really paid off.”
Hjorth was the Visual Effects Supervisor for several episodes of the 1994-2022 TV series The Kingdom and The Kingdom: Exodus.
Hjorth has worked on eight feature films and a TV series with director Lars von Trier.
Hjorth with director Lars von Trier, left, on the set of The Kingdom: Exodus.
Hjorth was Visual Effects Supervisor on The House That Jack Built, directed by Lars von Trier.
Peter Hjorth was recognized for his work as European Visual Effects Supervisor for the Swedish-Danish feature and Cannes winner Border, directed by Ali Abbasi.
Hjorth was Production Visual Effects Supervisor on Lamb, directed by Valdimar Jóhannsson.
Hjorth with Simone Grau Roney, Production Designer on The House That Jack Built, directed by Lars von Trier.
Choosing a favorite visual effects shot from his career, however, is a difficult task for Hjorth, though he’s particularly proud of the work achieved on Dogville. “Nobody noticed how messed up it was,” he explains. “Toward the end of the movie, you can see the masks, and you can see that we didn’t bother to match the grain between layers and all that. We did the first test on Flame, and when we went to layer 99, it just stopped working. We ended up doing it with Combustion software, which was crummy, but it worked, and we got the shots done. I think we went to 170 layers on the opening shot. It was a learning experience for everybody involved, and I still work with some of those same people, most recently on the Netflix series I did this spring.”
Hjorth worked as Visual Effects Supervisor on Holy Spider, directed by Ali Abbasi.
Hjorth served as Visual Effects Supervisor on Antichrist, directed by Lars von Trier.
Hjorth believes that there’s been an immense upgrade in professionalism in Denmark over the years he’s worked in the business. “The beginning was much less industrial. The directors that I have worked with tend to work with me multiple times. A lot of the stuff I say in the first meeting is really defining for how that is going to go. I’ve been so lucky to work on films that I actually think made a difference. It has mostly been art house films with limited budgets and resources. When we work together with the same producer or director a few times, sometimes they come back and say, ‘We’d like to have a creature or some special thing.’ It’s an evolving process.”
Hjorth worked as Visual Effects Supervisor on the Lars von Trier-directed Melancholia, and was also credited for his astrophotography of auroras for the film.
Hjorth was Visual Effects Supervisor on Dogville, directed by Lars von Trier.
Hjorth worked with director Lars von Trier to develop the Automavision technique, which was credited with the cinematography for The Boss of It All. A computer algorithm randomly changes the camera’s tilt, pan, focal length and/or positioning as well as the sound recording without being actively operated by the cinematographer.
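The core of that Automavision idea can be sketched in code: before each take, the computer samples every camera and sound parameter from a preset range, so no operator makes the framing choices. The Python fragment below is a purely hypothetical illustration; the parameter names and ranges are invented for demonstration, since the settings of the actual system were never published.

```python
import random

# Hypothetical Automavision-style randomizer. The real system's parameters
# are not public; these names and ranges are invented for illustration.
PARAM_RANGES = {
    "tilt_deg": (-6.0, 6.0),       # camera tilt
    "pan_deg": (-8.0, 8.0),        # camera pan
    "focal_mm": (24.0, 85.0),      # focal length (zoom)
    "offset_cm": (-20.0, 20.0),    # lateral repositioning of the camera head
    "audio_gain_db": (-3.0, 3.0),  # the sound recording was varied as well
}

def automavision_take(rng: random.Random) -> dict:
    """Pick a random framing and sound setting for one take, with no operator input."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAM_RANGES.items()}

if __name__ == "__main__":
    rng = random.Random(7)  # seeding makes a given take's settings reproducible
    for take_number in range(1, 4):
        print(take_number, automavision_take(rng))
```

Seeding the generator, as above, would let a production recall the exact settings of any take, something a reshoot workflow would presumably require.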
Hjorth works closely with stunts, special effects makeup, animal wranglers and other specialists. “I know the craft and what they need from me. They know more about what’s going to be effective on screen, so I just leave them to it and make sure they have what they need. Same thing with animals and visual effects, makeup and stuff like that, physical things. You know, I have a bit of a reputation for trying to get as many pieces of the puzzle as possible with a camera. Some production VFX people get quotes from, say, three different vendors, and then they pick all the cheapest bids for each sequence or shot, and that’s how they get down in budget. I try to avoid that. I’d rather actually sit down with the director and say, for example, ‘We should have some breathing space here.’”
When it comes to the future of visual effects in Denmark, Hjorth takes an optimistic view. “I think this trend that we have more production supervisors is basically going to continue, in the way that even if you have very little work, you hire someone from the get-go and you make sure there’s a balance between ambition and resources. There’s a special thing about Denmark, which is that we tend to all stick together, even people who are not in the same line of work. We have lots of experience sharing. There are no limits to who you can call and ask questions. It’s not competitive in this way because people will get the jobs they get. Everybody realizes we have to work together, and what really matters is that we put something on screen that gives the audience a good experience.”
#shining #light #essential #danish #vfxSHINING A LIGHT ON ESSENTIAL DANISH VFX WITH PETER HJORTHBy OLIVER WEBB Images courtesy of Peter Hjorth and Zentropa, except where noted. Peter Hjorth.When Peter Hjorth first started out, visual effects were virtually non-existent in the Danish film industry. “We had one guy at the lab who did work on the Oxberry, and I worked at a video production company,” Hjorth states. “I trained as a videotape editor, then it went into online. When the first digital tools arrived, I joined one of the hot post places where they got the first digital VTRs. All my first years of experience were with commercial clients and music videos and making the transition from analogue to digital in video post-production. I did a little bit of work for friends of mine where we actually did it at the lab. I’m old enough to have done stuff with the optical printer and waiting for weeks to get it right. There were some very early start-ups in Copenhagen doing files to film, and I started working with them.” Hjorth’s first feature film came in 1998 with Thomas Vinterberg’s Festen, where he served as camera operator and digital consultant. Festen also marked Hjorth’s first foray into the Dogme 95 movement. “We shot on MiniDV, and I was attached to the whole project. I shot the second camera and then was asked if I could do some advanced work in visual effects for commercials. I was then asked by Lars von Trier to help out on Dancer in the Dark when he was starting.” Working on Dancer in the Dark marked the beginning of Hjorth’s frequent collaborations with Lars von Trier. “That was sort of a two-fold thing because we had 100 DV cameras that needed some kind of infrastructure to work, and my television background was good for that. We also needed some visual effects work to get rid of some cameras. If you put 100 cameras in the same set, you’re going to get into a visual effects situation. So, I did that and worked on the editing. 
At that time, people were a little bit afraid of Lars, but I’m up for anything. We had a great time, especially during the editing and post-production.” Hjorth was pleased with his collaboration with director Tarik Saleh on the U.S. film The Contractor, on which he served as Production Visual Effects Supervisor.“There’s a special thing about Denmark, which is that we tend to all stick together… It’s not competitive in this way because people will get the jobs they get. Everybody realizes we have to work together, and what really matters is that we put something on screen that gives the audience a good experience.” —Peter Hjorth, Visual Effects Supervisor Initially, production experimented with a wall of cameras, where Hjorth did a test compositing that into an image. Von Trier found it interesting, but felt it wasn’t right for Dancer in the Dark. He later came back to Hjorth with Dogville and explained that he wanted to implement the multi-camera technique for this project. “Lars didn’t want linear perspective, instead he wanted something more like visual arts, fine arts, a notion of perspective, even cubism maybe,” Hjorth adds. “At that point in between those two projects, I did the first big Vinterberg film, It’s All About Love.” Hjorth worked as Visual Effects Supervisor for the film. “We did lots of precise visual effects, matched lenses, matched camera heights, everything by the book. Then I went into this totally crazy project for Lars and really developed a close understanding of what Lars wanted. We’ve done eight feature films and a TV series together. The last one was the third season of The Kingdom. I also did his last feature film, The House That Jack Built. I was Production Visual Effects Supervisor on all the stuff in-between, such as Antichrist and Melancholia.” Hjorth explains that he was very lucky to be in the right place at the right time. “Working on those projects has given me a network all over Europe with good people. 
We had some decent budgets, and people were thrilled to work on Lars’ films. I’ve made some excellent friends and good connections. If you wanted VFX for a movie in the early 2000s you hired someone from a post house for a specific scene. The notion of a production visual effects supervisor was not very common in Denmark, and the role has since developed. I find that my contribution is now mostly in pre-production. With post-production, I usually take a step back and leave it to the vendors to get right, but I’m happy I’ve been able to assist when the need arose.” Throughout his career, Hjorth has worked across the board as camera operator, colorist and editor. “I did some camera work on the side for music videos, and so on,” he explains. “When I speak to the DP and the gaffers, I know the language. I wouldn’t say I did great work as a cinematographer, but I know the language, the equipment and the limitations. Actually, my first job before even going into post-production was as an electrician. I used to work on really old, heavy movie lights back in the day, so I also know a little bit about departments on set and how it works. That has made it a little bit easier for me to be on set because as a visual effects supervisor, it can be a super scary experience. If you feel like a tourist, it’s just horrible. I, of course, worked on the Dogme 95 films, where we worked closely with the actors, and I’m not afraid to have a conversation with an actor. No matter how good the VFX is, if the actors don’t believe a scene they are in, it doesn’t work. So, I’ve been lucky to do a bit of everything, and I feel blessed that things turned out the way that they did.” Starting out in Dogme 95 also proved to be a huge learning curve when it came to film language and understanding how to work within a set of specific rules and guidelines. The movement was founded by Lars von Trier and Thomas Vinterberg, who created the Dogme 95 Manifesto. 
The Manifesto consisted of 10 rules, which included: camera must be handheld, shooting must be done on location and special lighting isn’t allowed. “It’s a good background to have,” Hjorth states. “We’ve had rules for all of the films I’ve made with Lars, even on projects such as Melancholia.” Setting rules hasn’t been limited to Danish cinema and extends beyond that. “We made kind of a set of rules for the films I’ve made with Ali Abbasi, and that’s always made things easier,” Hjorth says. “He first called me when he was in film school. He was doing some early tests and was audacious enough to ask me for a VFX shot. It was hard to understand what he was saying, but then he talked about a scene with a guy coming out of a cake and he kills his brother, slicing his throat with a knife, and he wanted to see that in close-up. I appreciate younger directors calling and asking me to work with them, and it has really paid off.” Hjorth was the Visual Effects Supervisor for several episodes of the 1994-2022 TV series The Kingdom and The Kingdom: Exodus. Hjorth has worked on eight feature films and a TV series with director Lars von Trier.Hjorth with director Lars von Trier, left, on the set of The Kingdom: Exodus.Hjorth was Visual Effects Supervisor on The House That Jack Built, directed by Lars von Trier.Peter Hjorth was recognized for his work as European Visual Effects Supervisor for the Swedish-Danish feature and Cannes winner Border, directed by Ali Abbasi.Hjorth was Production Visual Effects Supervisor on Lamb, directed by Valdimar Jóhannsson.Hjorth with Simone Grau Roney, Production Designer on The House That Jack Built, directed by Lars von Trier. Choosing a favorite visual effect shot from his career, however, is a difficult task for Hjorth, though he’s particularly proud of the work achieved on Dogville. “Nobody noticed how messed up it was,” he explains. 
“Toward the end of the movie, you can see the masks, and you can see that we didn’t bother to match the grain between layers and all that. We did the first test on Flame, and when we went to layer 99, it just stopped working. We ended up doing it with combustion software, which was crummy, but it worked, and we got the shots done. I think we went to 170 layers on the opening shot. It was a learning experience for everybody involved, and I still work with some of those same people, most recently on the Netflix series I did this spring.” Hjorth worked as Visual Effects Supervisor on Holy Spider, directed by Ali Abbasi. Hjorth served as Visual Effects Supervisor on Antichrist, directed by Lars von Trier. Hjorth believes that there’s been an immense upgrade in professionalism in Denmark in the years since he’s worked in the business. “The beginning was much less industrial. The directors that I have worked with tend to work with me multiple times. A lot of the stuff I say in the first meeting is really defining for how thatis going to go. I’ve been so lucky to work on films that I actually think made a difference. It has mostly been art house films with limited budgets and resources. When we work together with the same producer or director a few times, sometimes they come back and say, ‘We’d like to have a creature or some special thing.’ It’s an evolving process.” Hjorth worked as Visual Effects Supervisor on the Lars von Trier-directed Melancholia, and was also credited for his astrophotography of auroras for the film.Hjorth was Visual Effects Supervisor on Dogville, directed by Lars von Trier. Hjorth worked with director Lars von Trier to develop the Automavision technique, which was credited with the cinematography for The Boss of It All. A computer algorithm randomly changes the camera’s tilt, pan, focal length and/or positioning as well as the sound recording without being actively operated by the cinematographer. 
Hjorth works closely with stunts, special effects makeup, animal wranglers and other specialists. “I know the craft and what they need from me. They know more about what’s going to be effective on screen, so I just leave them to it and make sure they have what they need. Same thing with animals and visual effects, makeup and stuff like that, physical things. You know I have a bit of a reputation for trying to get as many pieces of the puzzle as possible with a camera. Some production VFX people get quotes from, say, three different vendors, and then they pick all the cheapest bids for each sequence or shot, and that’s how they get down in budget. I tried to avoid that. I’d rather actually sit down with the director and say for example, ‘We should have some breathing space here.’” When it comes to the future of visual effects in Denmark, Hjorth takes an optimistic view. “I think this trend that we have more production supervisors is basically going to continue in the way that even if you have very little work, you hire someone from the get-go and you make sure that’s a balance in ambition and resources. There’s a special thing about Denmark, which is that we tend to all stick together, even people who are not in the same line of work. We have lots of experience sharing. There are no limits to who you can call and ask questions. It’s not competitive in this way because people will get the jobs they get. Everybody realizes we have to work together, and what really matters is that we put something on screen that gives the audience a good experience.” #shining #light #essential #danish #vfxWWW.VFXVOICE.COMSHINING A LIGHT ON ESSENTIAL DANISH VFX WITH PETER HJORTHBy OLIVER WEBB Images courtesy of Peter Hjorth and Zentropa, except where noted. Peter Hjorth. (Photo courtesy of Danish Film Institute) When Peter Hjorth first started out, visual effects were virtually non-existent in the Danish film industry. 
“We had one guy at the lab who did work on the Oxberry [rostrum animation camera], and I worked at a video production company,” Hjorth states. “I trained as a videotape editor, then moved into online editing. When the first digital tools arrived, I joined one of the hot post places where they got the first digital VTRs. All my first years of experience were with commercial clients and music videos and making the transition from analogue to digital in video post-production. I did a little bit of work for friends of mine where we actually did it at the lab. I’m old enough to have done stuff with the optical printer and waiting for weeks to get it right. There were some very early start-ups in Copenhagen doing file-to-film transfers, and I started working with them.” Hjorth’s first feature film came in 1998 with Thomas Vinterberg’s Festen, where he served as camera operator and digital consultant. Festen also marked Hjorth’s first foray into the Dogme 95 movement. “We shot on MiniDV, and I was attached to the whole project. I shot the second camera and then was asked if I could do some advanced work in visual effects for commercials. I was then asked by Lars von Trier to help out on Dancer in the Dark when he was starting.” Working on Dancer in the Dark marked the beginning of Hjorth’s frequent collaborations with Lars von Trier. “That was sort of a two-fold thing because we had 100 DV cameras that needed some kind of infrastructure to work, and my television background was good for that. We also needed some visual effects work to get rid of some cameras. If you put 100 cameras in the same set, you’re going to get into a visual effects situation. So, I did that and worked on the editing. At that time, people were a little bit afraid of Lars, but I’m up for anything. We had a great time, especially during the editing and post-production.” Hjorth was pleased with his collaboration with director Tarik Saleh on the U.S.
film The Contractor (2022), on which he served as Production Visual Effects Supervisor. (Image courtesy of Paramount Pictures)
“There’s a special thing about Denmark, which is that we tend to all stick together… It’s not competitive in this way because people will get the jobs they get. Everybody realizes we have to work together, and what really matters is that we put something on screen that gives the audience a good experience.”
—Peter Hjorth, Visual Effects Supervisor
Initially, the production experimented with a wall of cameras, with Hjorth running a test that composited their output into a single image. Von Trier found it interesting but felt it wasn’t right for Dancer in the Dark. He later came back to Hjorth with Dogville and explained that he wanted to implement the multi-camera technique for this project. “Lars didn’t want linear perspective; instead, he wanted something more like visual arts, fine arts, a notion of perspective, even cubism maybe,” Hjorth adds. “At that point in between those two projects, I did the first big Vinterberg film, It’s All About Love.” Hjorth worked as Visual Effects Supervisor for the film. “We did lots of precise visual effects, matched lenses, matched camera heights, everything by the book. Then I went into this totally crazy project for Lars and really developed a close understanding of what Lars wanted. We’ve done eight feature films and a TV series together. The last one was the third season of The Kingdom. I also did his last feature film, The House That Jack Built. I was Production Visual Effects Supervisor on all the stuff in-between, such as Antichrist and Melancholia.” Hjorth explains that he was very lucky to be in the right place at the right time. “Working on those projects has given me a network all over Europe with good people. We had some decent budgets, and people were thrilled to work on Lars’ films. I’ve made some excellent friends and good connections.
If you wanted VFX for a movie in the early 2000s, you hired someone from a post house for a specific scene. The notion of a production visual effects supervisor was not very common in Denmark, and the role has since developed. I find that my contribution is now mostly in pre-production. With post-production, I usually take a step back and leave it to the vendors to get right, but I’m happy I’ve been able to assist when the need arose.” Throughout his career, Hjorth has worked across the board as camera operator, colorist and editor. “I did some camera work on the side for music videos, and so on,” he explains. “When I speak to the DP and the gaffers, I know the language. I wouldn’t say I did great work as a cinematographer, but I know the language, the equipment and the limitations. Actually, my first job before even going into post-production was as an electrician. I used to work on really old, heavy movie lights back in the day, so I also know a little bit about departments on set and how it works. That has made it a little bit easier for me to be on set because, as a visual effects supervisor, it can be a super scary experience. If you feel like a tourist, it’s just horrible. I, of course, worked on the Dogme 95 films, where we worked closely with the actors, and I’m not afraid to have a conversation with an actor. No matter how good the VFX is, if the actors don’t believe a scene they are in, it doesn’t work. So, I’ve been lucky to do a bit of everything, and I feel blessed that things turned out the way that they did.” Starting out in Dogme 95 also proved to be a huge learning curve when it came to film language and understanding how to work within a set of specific rules and guidelines. The movement was founded by Lars von Trier and Thomas Vinterberg, who created the Dogme 95 Manifesto. The Manifesto consisted of 10 rules, which included: the camera must be handheld, shooting must be done on location, and special lighting is not allowed.
“It’s a good background to have,” Hjorth states. “We’ve had rules for all of the films I’ve made with Lars, even on projects such as Melancholia.” Setting rules hasn’t been limited to Danish cinema. “We made kind of a set of rules for the films I’ve made with Ali Abbasi, and that’s always made things easier,” Hjorth says. “He first called me when he was in film school. He was doing some early tests and was audacious enough to ask me for a VFX shot. It was hard to understand what he was saying, but then he talked about a scene with a guy coming out of a cake and he kills his brother, slicing his throat with a knife, and he wanted to see that in close-up. I appreciate younger directors calling and asking me to work with them, and it has really paid off.”
Hjorth was the Visual Effects Supervisor for several episodes of the 1994-2022 TV series The Kingdom and The Kingdom: Exodus. Hjorth has worked on eight feature films and a TV series with director Lars von Trier. (Photo: Peter Hjorth)
Hjorth with director Lars von Trier, left, on the set of The Kingdom: Exodus. (Photo: Peter Hjorth)
Hjorth was Visual Effects Supervisor on The House That Jack Built, directed by Lars von Trier. (Photo: Christian Geisnæs)
Peter Hjorth was recognized for his work as European Visual Effects Supervisor for the Swedish-Danish feature and Cannes winner Border, directed by Ali Abbasi. (Image courtesy of Meta Film Stockholm)
Hjorth was Production Visual Effects Supervisor on Lamb (2021), directed by Valdimar Jóhannsson. (Image courtesy of Go To Sheep and A24)
Hjorth with Simone Grau Roney, Production Designer on The House That Jack Built (2018), directed by Lars von Trier.
Choosing a favorite visual effects shot from his career, however, is a difficult task for Hjorth, though he’s particularly proud of the work achieved on Dogville. “Nobody noticed how messed up it was,” he explains.
“Toward the end of the movie, you can see the masks, and you can see that we didn’t bother to match the grain between layers and all that. We did the first test on Flame, and when we went to layer 99, it just stopped working. We ended up doing it with combustion software, which was crummy, but it worked, and we got the shots done. I think we went to 170 layers on the opening shot. It was a learning experience for everybody involved, and I still work with some of those same people, most recently on the Netflix series I did this spring.”
Hjorth worked as Visual Effects Supervisor on Holy Spider, directed by Ali Abbasi. (Photo: Nadim Carlsen. Image courtesy of Profile Pictures)
Hjorth served as Visual Effects Supervisor on Antichrist (2009), directed by Lars von Trier.
Hjorth believes that there has been an immense upgrade in professionalism in Denmark over the years he has worked in the business. “The beginning was much less industrial. The directors that I have worked with tend to work with me multiple times. A lot of the stuff I say in the first meeting is really defining for how that [job] is going to go. I’ve been so lucky to work on films that I actually think made a difference. It has mostly been art house films with limited budgets and resources. When we work together with the same producer or director a few times, sometimes they come back and say, ‘We’d like to have a creature or some special thing.’ It’s an evolving process.”
Hjorth worked as Visual Effects Supervisor on the Lars von Trier-directed Melancholia (2011), and was also credited for his astrophotography of auroras for the film. (Image courtesy of Magnolia Pictures)
Hjorth was Visual Effects Supervisor on Dogville (2003), directed by Lars von Trier.
Hjorth worked with director Lars von Trier to develop the Automavision technique, which was credited with the cinematography for The Boss of It All (2006).
A computer algorithm randomly changes the camera’s tilt, pan, focal length and/or positioning as well as the sound recording without being actively operated by the cinematographer.
Hjorth works closely with stunts, special effects makeup, animal wranglers and other specialists. “I know the craft and what they need from me. They know more about what’s going to be effective on screen, so I just leave them to it and make sure they have what they need. Same thing with animals and visual effects, makeup and stuff like that, physical things. You know I have a bit of a reputation for trying to get as many pieces of the puzzle as possible with a camera. Some production VFX people get quotes from, say, three different vendors, and then they pick all the cheapest bids for each sequence or shot, and that’s how they get down in budget. I try to avoid that. I’d rather actually sit down with the director and say, for example, ‘We should have some breathing space here.’” When it comes to the future of visual effects in Denmark, Hjorth takes an optimistic view. “I think this trend that we have more production supervisors is basically going to continue in the way that even if you have very little work, you hire someone from the get-go and you make sure there’s a balance in ambition and resources. There’s a special thing about Denmark, which is that we tend to all stick together, even people who are not in the same line of work. We have lots of experience sharing. There are no limits to who you can call and ask questions. It’s not competitive in this way because people will get the jobs they get. Everybody realizes we have to work together, and what really matters is that we put something on screen that gives the audience a good experience.”
-
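The Automavision idea described above — small random perturbations to camera and sound settings between takes, outside the cinematographer's control — can be sketched in a few lines. This is a purely illustrative Python sketch; the parameter names and ranges are invented for the example and are not Zentropa's actual system.

```python
import random

# Hypothetical parameter ranges -- illustrative only, not the real Automavision presets.
TILT_DEG = (-6.0, 6.0)       # random tilt offset, degrees
PAN_DEG = (-10.0, 10.0)      # random pan offset, degrees
FOCAL_MM = [24, 35, 50, 85]  # discrete focal-length choices
GAIN_DB = (-3.0, 3.0)        # audio-gain perturbation, decibels

def automavision_take(seed=None):
    """Return one randomized camera/sound setup for a single take."""
    rng = random.Random(seed)
    return {
        "tilt_deg": round(rng.uniform(*TILT_DEG), 2),
        "pan_deg": round(rng.uniform(*PAN_DEG), 2),
        "focal_mm": rng.choice(FOCAL_MM),
        "gain_db": round(rng.uniform(*GAIN_DB), 2),
    }

if __name__ == "__main__":
    # Each take gets its own randomized setup; seeding makes a take reproducible.
    for take in range(3):
        print(take, automavision_take(seed=take))
```

Seeding per take mirrors the point of the technique: the choices are mechanical and repeatable, not authored shot by shot.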
HOLLYWOOD VFX TOOLS FOR SPACE EXPLORATION
By CHRIS McGOWAN
This image of Jupiter from NASA’s James Webb Space Telescope’s NIRCam (Near-Infrared Camera) shows stunning details of the majestic planet in infrared light. (Image courtesy of NASA, ESA and CSA)
Special effects have been used for decades to depict space exploration, from visits to planets and moons to zero gravity and spaceships – one need only think of the landmark 2001: A Space Odyssey (1968). Since that era, visual effects have increasingly grown in realism and importance. VFX have been used for entertainment and for scientific purposes, outreach to the public and astronaut training in virtual reality. Compelling images and videos can bring data to life. NASA’s Scientific Visualization Studio (SVS) produces visualizations, animations and images to help scientists tell stories of their research and make science more approachable and engaging.
A.J. Christensen is a senior visualization designer for the NASA Scientific Visualization Studio (SVS) at the Goddard Space Flight Center in Greenbelt, Maryland. There, he develops data visualization techniques and designs data-driven imagery for scientific analysis and public outreach using Hollywood visual effects tools, according to NASA. SVS visualizations feature datasets from Earth- and space-based instrumentation, scientific supercomputer models and physical statistical distributions that have been analyzed and processed by computational scientists. Christensen’s specialties include working with 3D volumetric data, using the procedural cinematic software Houdini and science topics in Heliophysics, Geophysics and Astrophysics. He previously worked at the National Center for Supercomputing Applications’ Advanced Visualization Lab, where he worked on more than a dozen science documentary full-dome films as well as the IMAX films Hubble 3D and A Beautiful Planet – and he worked at DNEG on the movie Interstellar, which won the 2015 Best Visual Effects Academy Award.
This global map of CO2 was created by NASA’s Scientific Visualization Studio using a model called GEOS, short for the Goddard Earth Observing System. GEOS is a high-resolution weather reanalysis model, powered by supercomputers, that is used to represent what was happening in the atmosphere. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio)
“The NASA Scientific Visualization Studio operates like a small VFX studio that creates animations of scientific data that has been collected or analyzed at NASA. We are one of several groups at NASA that create imagery for public consumption, but we are also a part of the scientific research process, helping scientists understand and share their data through pictures and video.”
—A.J. Christensen, Senior Visualization Designer, NASA Scientific Visualization Studio
About his work at NASA SVS, Christensen comments, “The NASA Scientific Visualization Studio operates like a small VFX studio that creates animations of scientific data that has been collected or analyzed at NASA. We are one of several groups at NASA that create imagery for public consumption, but we are also a part of the scientific research process, helping scientists understand and share their data through pictures and video. This past year we were part of NASA’s total eclipse outreach efforts, we participated in all the major earth science and astronomy conferences, we launched a public exhibition at the Smithsonian Museum of Natural History called the Earth Information Center, and we posted hundreds of new visualizations to our publicly accessible website: svs.gsfc.nasa.gov.”
This is the ‘beauty shot version’ of Perpetual Ocean 2: Western Boundary Currents. The visualization starts with a rotating globe showing ocean currents. The colors used to color the flow in this version were chosen to provide a pleasing look.
The Gulf Stream and connected currents.
Venus, our nearby “sister” planet, beckons today as a compelling target for exploration that may connect the objects in our own solar system to those discovered around nearby stars.
WORKING WITH DATA
While Christensen is interpreting the data from active spacecraft and making it usable in different forms, such as for science and outreach, he notes, “It’s not just spacecraft that collect data. NASA maintains or monitors instruments on Earth too – on land, in the oceans and in the air. And to be precise, there are robots wandering around Mars that are collecting data, too.”
He continues, “Sometimes the data comes to our team as raw telescope imagery, sometimes we get it as a data product that a scientist has already analyzed and extracted meaning from, and sometimes various sensor data is used to drive computational models and we work with the models’ resulting output.”
Jupiter’s moon Europa may have life in a vast ocean beneath its icy surface.
HOUDINI AND OTHER TOOLS
“Data visualization means a lot of different things to different people, but many people on our team interpret it as a form of filmmaking,” Christensen says. “We are very inspired by the approach to visual storytelling that Hollywood uses, and we use tools that are standard for Hollywood VFX. Many professionals in our area – the visualization of 3D scientific data – were previously using other animation tools but have discovered that Houdini is the most capable of understanding and manipulating unusual data, so there has been major movement toward Houdini over the past decade.”
Satellite imagery from NASA’s Solar Dynamics Observatory shows the Sun in ultraviolet light colorized in light brown. Seen in ultraviolet light, the dark patches on the Sun are known as coronal holes and are regions where fast solar wind gushes out into space.
Christensen explains, “We have always worked with scientific software as well – sometimes there’s only one software tool in existence to interpret a particular kind of scientific data. More often than not, scientific software does not have a GUI, so we’ve had to become proficient at learning new coding environments very quickly. IDL and Python are the generic data manipulation environments we use when something is too complicated or oversized for Houdini, but there are lots of alternatives out there. Typically, we use these tools to get the data into a format that Houdini can interpret, and then we use Houdini to do our shading, lighting and camera design, and seamlessly blend different datasets together.”
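As a rough illustration of that hand-off — and not the SVS's actual pipeline — a Python script might thin a volumetric dataset down to a point cloud and write it as CSV, a format Houdini can ingest with its Table Import SOP. The array, threshold and filename here are invented for the example:

```python
import csv
import numpy as np

def volume_to_csv(volume, threshold, path):
    """Write voxels above `threshold` as x,y,z,value rows that Houdini can map to point attributes."""
    mask = volume > threshold
    idx = np.argwhere(mask)        # (N, 3) voxel coordinates, C-order scan
    values = volume[mask]          # matching scalar values, same C-order scan
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["x", "y", "z", "value"])  # header row becomes attribute names
        for (x, y, z), v in zip(idx, values):
            writer.writerow([int(x), int(y), int(z), float(v)])
    return len(values)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vol = rng.random((16, 16, 16))          # stand-in for a scientific volume
    n = volume_to_csv(vol, 0.99, "points.csv")
    print(f"wrote {n} points")
```

Thresholding before export is the kind of culling decision the interview describes: only the voxels that matter for the shot survive the conversion.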
While cruising around Saturn in early October 2004, Cassini captured a series of images that have been composed into this large global natural color view of Saturn and its rings. This grand mosaic consists of 126 images acquired in a tile-like fashion, covering one end of Saturn’s rings to the other and the entire planet in between.
The black hole Gargantua and the surrounding accretion disc from the 2014 movie Interstellar.
Another visualization of the black hole Gargantua.
INTERSTELLAR & GARGANTUA
Christensen recalls working for DNEG on Interstellar. “When I first started at DNEG, they asked me to work on the giant waves on Miller’s ocean planet. About a week in, my manager took me into the hall and said, ‘I was looking at your reel and saw all this astronomy stuff. We’re working on another sequence with an accretion disk around a black hole that I’m wondering if we should put you on.’ And I said, ‘Oh yeah, I’ve done lots of accretion disks.’ So, for the rest of my time on the show, I was working on the black hole team.”
He adds, “There are a lot of people in my community that would be hesitant to label any big-budget movie sequence as a scientific visualization. The typical assumption is that for a Hollywood movie, no one cares about accuracy as long as it looks good. Guardians of the Galaxy makes it seem like space is positively littered with nebulae, and Star Wars makes it seem like asteroids travel in herds. But the black hole Gargantua in Interstellar is a good case for being called a visualization. The imagery you see in the movie is the direct result of a collaboration with an expert scientist, Dr. Kip Thorne, working with the DNEG research team using the actual Einstein equations that describe the gravity around a black hole.”
Thorne is a Nobel Prize-winning theoretical physicist who taught at Caltech for many years. He has reached wide audiences with his books and presentations on black holes, time travel and wormholes on PBS and BBC shows. Christensen comments, “You can make the argument that some of the complexity around what a black hole actually looks like was discarded for the film, and they admit as much in the research paper that was published after the movie came out. But our team at NASA does that same thing. There is no such thing as an objectively ‘true’ scientific image – you always have to make aesthetic decisions around whether the image tells the science story, and often it makes more sense to omit information to clarify what’s important. Ultimately, Gargantua taught a whole lot of people something new about science, and that’s what a good scientific visualization aims to do.”
The SVS produces an annual visualization of the Moon’s phase and libration comprising 8,760 hourly renderings of its precise size, orientation and illumination.
FURTHER CHALLENGES
The sheer size of the data often encountered by Christensen and his peers is a challenge. “I’m currently working with a dataset that is 400GB per timestep. It’s so big that I don’t even want to move it from one file server to another. So, then I have to make decisions about which data attributes to keep and which to discard, whether there’s a region of the data that I can cull or downsample, and I have to experiment with data compression schemes that might require me to entirely re-design the pipeline I’m using for Houdini. Of course, if I get rid of too much information, it becomes very resource-intensive to recompute everything, but if I don’t get rid of enough, then my design process becomes agonizingly slow.”
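The trade-offs Christensen describes — dropping attributes, culling regions, downsampling, trading precision for size — can be sketched in a few lines of NumPy. The array shape, stride and dtype below are illustrative stand-ins, not his actual 400GB dataset:

```python
import numpy as np

def reduce_timestep(volume, stride=2, dtype=np.float16):
    """Downsample a volumetric timestep by striding each axis and storing at lower precision."""
    small = volume[::stride, ::stride, ::stride].astype(dtype)
    ratio = volume.nbytes / small.nbytes   # how much smaller the reduced copy is
    return small, ratio

if __name__ == "__main__":
    # Stand-in for one huge timestep: 64^3 float64 voxels.
    vol = np.zeros((64, 64, 64), dtype=np.float64)
    small, ratio = reduce_timestep(vol, stride=2)
    print(small.shape, f"{ratio:.0f}x smaller")  # (32, 32, 32) 32x smaller
```

Halving each axis and going from 64-bit to 16-bit floats shrinks the data 32x, which is exactly the kind of lossy bargain the quote describes: discard too much and you must recompute; discard too little and every design iteration crawls.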
SVS also works closely with its NASA partner groups Conceptual Image Lab and Goddard Media Studios to publish a diverse array of content. Conceptual Image Lab focuses more on the artistic side of things – producing high-fidelity renders using film animation and visual design techniques, according to NASA. Where the SVS primarily focuses on making data-based visualizations, CIL puts more emphasis on conceptual visualizations – producing animations featuring NASA spacecraft, planetary observations and simulations, according to NASA. Goddard Media Studios, on the other hand, is more focused on public outreach – producing interviews, TV programs and documentaries. GMS continues to be the main producer behind NASA TV, and as such, much of its content is aimed at the general public.
An impact crater on the moon.
Image of Mars showing a partly shadowed Olympus Mons toward the upper left of the image.
Mars. Hellas Basin can be seen in the lower right portion of the image.
Mars slightly tilted to show the Martian North Pole.
Christensen notes, “One of the more unique challenges in this field is bringing people from very different backgrounds to agree on a common outcome. I work on teams with scientists, communicators and technologists, and we all have different communities we’re trying to satisfy. For instance, communicators are generally trying to simplify animations so their learning goal is clear, but scientists will insist that we add text and annotations on top of the video to eliminate ambiguity and avoid misinterpretations. Often, the technologist will have to say we can’t zoom in or look at the data in a certain way because it will show the data boundaries or data resolution limits. Every shot is a negotiation, but in trying to compromise, we often push the boundaries of what has been done before, which is exciting.”
#hollywood #vfx #tools #space #explorationHOLLYWOOD VFX TOOLS FOR SPACE EXPLORATIONBy CHRIS McGOWAN This image of Jupiter from NASA’s James Webb Space Telescope’s NIRCamshows stunning details of the majestic planet in infrared light.Special effects have been used for decades to depict space exploration, from visits to planets and moons to zero gravity and spaceships – one need only think of the landmark 2001: A Space Odyssey. Since that era, visual effects have increasingly grown in realism and importance. VFX have been used for entertainment and for scientific purposes, outreach to the public and astronaut training in virtual reality. Compelling images and videos can bring data to life. NASA’s Scientific Visualization Studioproduces visualizations, animations and images to help scientists tell stories of their research and make science more approachable and engaging. A.J. Christensen is a senior visualization designer for the NASA Scientific Visualization Studioat the Goddard Space Flight Center in Greenbelt, Maryland. There, he develops data visualization techniques and designs data-driven imagery for scientific analysis and public outreach using Hollywood visual effects tools, according to NASA. SVS visualizations feature datasets from Earth-and space-based instrumentation, scientific supercomputer models and physical statistical distributions that have been analyzed and processed by computational scientists. Christensen’s specialties include working with 3D volumetric data, using the procedural cinematic software Houdini and science topics in Heliophysics, Geophysics and Astrophysics. He previously worked at the National Center for Supercomputing Applications’ Advanced Visualization Lab where he worked on more than a dozen science documentary full-dome films as well as the IMAX films Hubble 3D and A Beautiful Planet – and he worked at DNEG on the movie Interstellar, which won the 2015 Best Visual Effects Academy Award. 
This global map of CO2 was created by NASA’s Scientific Visualization Studio using a model called GEOS, short for the Goddard Earth Observing System. GEOS is a high-resolution weather reanalysis model, powered by supercomputers, that is used to represent what was happening in the atmosphere.“The NASA Scientific Visualization Studio operates like a small VFX studio that creates animations of scientific data that has been collected or analyzed at NASA. We are one of several groups at NASA that create imagery for public consumption, but we are also a part of the scientific research process, helping scientists understand and share their data through pictures and video.” —A.J. Christensen, Senior Visualization Designer, NASA Scientific Visualization StudioAbout his work at NASA SVS, Christensen comments, “The NASA Scientific Visualization Studio operates like a small VFX studio that creates animations of scientific data that has been collected or analyzed at NASA. We are one of several groups at NASA that create imagery for public consumption, but we are also a part of the scientific research process, helping scientists understand and share their data through pictures and video. This past year we were part of NASA’s total eclipse outreach efforts, we participated in all the major earth science and astronomy conferences, we launched a public exhibition at the Smithsonian Museum of Natural History called the Earth Information Center, and we posted hundreds of new visualizations to our publicly accessible website: svs.gsfc.nasa.gov.” This is the ‘beauty shot version’ of Perpetual Ocean 2: Western Boundary Currents. The visualization starts with a rotating globe showing ocean currents. 
The colors used to color the flow in this version were chosen to provide a pleasing look.The Gulf Stream and connected currents.Venus, our nearby “sister” planet, beckons today as a compelling target for exploration that may connect the objects in our own solar system to those discovered around nearby stars.WORKING WITH DATA While Christensen is interpreting the data from active spacecraft and making it usable in different forms, such as for science and outreach, he notes, “It’s not just spacecraft that collect data. NASA maintains or monitors instruments on Earth too – on land, in the oceans and in the air. And to be precise, there are robots wandering around Mars that are collecting data, too.” He continues, “Sometimes the data comes to our team as raw telescope imagery, sometimes we get it as a data product that a scientist has already analyzed and extracted meaning from, and sometimes various sensor data is used to drive computational models and we work with the models’ resulting output.” Jupiter’s moon Europa may have life in a vast ocean beneath its icy surface.HOUDINI AND OTHER TOOLS “Data visualization means a lot of different things to different people, but many people on our team interpret it as a form of filmmaking,” Christensen says. “We are very inspired by the approach to visual storytelling that Hollywood uses, and we use tools that are standard for Hollywood VFX. Many professionals in our area – the visualization of 3D scientific data – were previously using other animation tools but have discovered that Houdini is the most capable of understanding and manipulating unusual data, so there has been major movement toward Houdini over the past decade.” Satellite imagery from NASA’s Solar Dynamics Observatoryshows the Sun in ultraviolet light colorized in light brown. 
Seen in ultraviolet light, the dark patches on the Sun are known as coronal holes and are regions where fast solar wind gushes out into space.Christensen explains, “We have always worked with scientific software as well – sometimes there’s only one software tool in existence to interpret a particular kind of scientific data. More often than not, scientific software does not have a GUI, so we’ve had to become proficient at learning new coding environments very quickly. IDL and Python are the generic data manipulation environments we use when something is too complicated or oversized for Houdini, but there are lots of alternatives out there. Typically, we use these tools to get the data into a format that Houdini can interpret, and then we use Houdini to do our shading, lighting and camera design, and seamlessly blend different datasets together.” While cruising around Saturn in early October 2004, Cassini captured a series of images that have been composed into this large global natural color view of Saturn and its rings. This grand mosaic consists of 126 images acquired in a tile-like fashion, covering one end of Saturn’s rings to the other and the entire planet in between.The black hole Gargantua and the surrounding accretion disc from the 2014 movie Interstellar.Another visualization of the black hole Gargantua.INTERSTELLAR & GARGANTUA Christensen recalls working for DNEG on Interstellar. “When I first started at DNEG, they asked me to work on the giant waves on Miller’s ocean planet. About a week in, my manager took me into the hall and said, ‘I was looking at your reel and saw all this astronomy stuff. 
We’re working on another sequence with an accretion disk around a black hole that I’m wondering if we should put you on.’ And I said, ‘Oh yeah, I’ve done lots of accretion disks.’ So, for the rest of my time on the show, I was working on the black hole team.” He adds, “There are a lot of people in my community that would be hesitant to label any big-budget movie sequence as a scientific visualization. The typical assumption is that for a Hollywood movie, no one cares about accuracy as long as it looks good. Guardians of the Galaxy makes it seem like space is positively littered with nebulae, and Star Wars makes it seem like asteroids travel in herds. But the black hole Gargantua in Interstellar is a good case for being called a visualization. The imagery you see in the movie is the direct result of a collaboration with an expert scientist, Dr. Kip Thorne, working with the DNEG research team using the actual Einstein equations that describe the gravity around a black hole.” Thorne is a Nobel Prize-winning theoretical physicist who taught at Caltech for many years. He has reached wide audiences with his books and presentations on black holes, time travel and wormholes on PBS and BBC shows. Christensen comments, “You can make the argument that some of the complexity around what a black hole actually looks like was discarded for the film, and they admit as much in the research paper that was published after the movie came out. But our team at NASA does that same thing. There is no such thing as an objectively ‘true’ scientific image – you always have to make aesthetic decisions around whether the image tells the science story, and often it makes more sense to omit information to clarify what’s important. 
Ultimately, Gargantua taught a whole lot of people something new about science, and that’s what a good scientific visualization aims to do.” The SVS produces an annual visualization of the Moon’s phase and libration comprising 8,760 hourly renderings of its precise size, orientation and illumination.FURTHER CHALLENGES The sheer size of the data often encountered by Christensen and his peers is a challenge. “I’m currently working with a dataset that is 400GB per timestep. It’s so big that I don’t even want to move it from one file server to another. So, then I have to make decisions about which data attributes to keep and which to discard, whether there’s a region of the data that I can cull or downsample, and I have to experiment with data compression schemes that might require me to entirely re-design the pipeline I’m using for Houdini. Of course, if I get rid of too much information, it becomes very resource-intensive to recompute everything, but if I don’t get rid of enough, then my design process becomes agonizingly slow.” SVS also works closely with its NASA partner groups Conceptual Image Laband Goddard Media Studiosto publish a diverse array of content. Conceptual Image Lab focuses more on the artistic side of things – producing high-fidelity renders using film animation and visual design techniques, according to NASA. Where the SVS primarily focuses on making data-based visualizations, CIL puts more emphasis on conceptual visualizations – producing animations featuring NASA spacecraft, planetary observations and simulations, according to NASA. Goddard Media Studios, on the other hand, is more focused towards public outreach – producing interviews, TV programs and documentaries. GMS continues to be the main producers behind NASA TV, and as such, much of their content is aimed towards the general public. An impact crater on the moon.Image of Mars showing a partly shadowed Olympus Mons toward the upper left of the image.Mars. 
HOLLYWOOD VFX TOOLS FOR SPACE EXPLORATION
By CHRIS McGOWAN
This image of Jupiter from NASA’s James Webb Space Telescope’s NIRCam (Near-Infrared Camera) shows stunning details of the majestic planet in infrared light. (Image courtesy of NASA, ESA and CSA)
Special effects have been used for decades to depict space exploration, from visits to planets and moons to zero gravity and spaceships – one need only think of the landmark 2001: A Space Odyssey (1968). Since that era, visual effects have increasingly grown in realism and importance. VFX have been used for entertainment and for scientific purposes, outreach to the public and astronaut training in virtual reality. Compelling images and videos can bring data to life. NASA’s Scientific Visualization Studio (SVS) produces visualizations, animations and images to help scientists tell stories of their research and make science more approachable and engaging. A.J.
Christensen is a senior visualization designer for the NASA Scientific Visualization Studio (SVS) at the Goddard Space Flight Center in Greenbelt, Maryland. There, he develops data visualization techniques and designs data-driven imagery for scientific analysis and public outreach using Hollywood visual effects tools, according to NASA. SVS visualizations feature datasets from Earth- and space-based instrumentation, scientific supercomputer models and physical statistical distributions that have been analyzed and processed by computational scientists. Christensen’s specialties include working with 3D volumetric data, using the procedural cinematic software Houdini and science topics in Heliophysics, Geophysics and Astrophysics. He previously worked at the National Center for Supercomputing Applications’ Advanced Visualization Lab, where he worked on more than a dozen science documentary full-dome films as well as the IMAX films Hubble 3D and A Beautiful Planet – and he worked at DNEG on the movie Interstellar, which won the 2015 Best Visual Effects Academy Award.
This global map of CO2 was created by NASA’s Scientific Visualization Studio using a model called GEOS, short for the Goddard Earth Observing System. GEOS is a high-resolution weather reanalysis model, powered by supercomputers, that is used to represent what was happening in the atmosphere. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio)
About his work at NASA SVS, Christensen comments, “The NASA Scientific Visualization Studio operates like a small VFX studio that creates animations of scientific data that has been collected or analyzed at NASA. We are one of several groups at NASA that create imagery for public consumption, but we are also a part of the scientific research process, helping scientists understand and share their data through pictures and video. This past year we were part of NASA’s total eclipse outreach efforts, we participated in all the major earth science and astronomy conferences, we launched a public exhibition at the Smithsonian Museum of Natural History called the Earth Information Center, and we posted hundreds of new visualizations to our publicly accessible website: svs.gsfc.nasa.gov.”
This is the ‘beauty shot version’ of Perpetual Ocean 2: Western Boundary Currents. The visualization starts with a rotating globe showing ocean currents. The colors used to color the flow in this version were chosen to provide a pleasing look. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio)
The Gulf Stream and connected currents. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio)
Venus, our nearby “sister” planet, beckons today as a compelling target for exploration that may connect the objects in our own solar system to those discovered around nearby stars. (Image courtesy of NASA’s Goddard Space Flight Center)
WORKING WITH DATA
While Christensen is interpreting the data from active spacecraft and making it usable in different forms, such as for science and outreach, he notes, “It’s not just spacecraft that collect data. NASA maintains or monitors instruments on Earth too – on land, in the oceans and in the air.
And to be precise, there are robots wandering around Mars that are collecting data, too.” He continues, “Sometimes the data comes to our team as raw telescope imagery, sometimes we get it as a data product that a scientist has already analyzed and extracted meaning from, and sometimes various sensor data is used to drive computational models and we work with the models’ resulting output.”
Jupiter’s moon Europa may have life in a vast ocean beneath its icy surface. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio)
HOUDINI AND OTHER TOOLS
“Data visualization means a lot of different things to different people, but many people on our team interpret it as a form of filmmaking,” Christensen says. “We are very inspired by the approach to visual storytelling that Hollywood uses, and we use tools that are standard for Hollywood VFX. Many professionals in our area – the visualization of 3D scientific data – were previously using other animation tools but have discovered that Houdini is the most capable of understanding and manipulating unusual data, so there has been major movement toward Houdini over the past decade.”
Satellite imagery from NASA’s Solar Dynamics Observatory (SDO) shows the Sun in ultraviolet light colorized in light brown. Seen in ultraviolet light, the dark patches on the Sun are known as coronal holes and are regions where fast solar wind gushes out into space. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio)
Christensen explains, “We have always worked with scientific software as well – sometimes there’s only one software tool in existence to interpret a particular kind of scientific data. More often than not, scientific software does not have a GUI, so we’ve had to become proficient at learning new coding environments very quickly.
IDL and Python are the generic data manipulation environments we use when something is too complicated or oversized for Houdini, but there are lots of alternatives out there. Typically, we use these tools to get the data into a format that Houdini can interpret, and then we use Houdini to do our shading, lighting and camera design, and seamlessly blend different datasets together.”
While cruising around Saturn in early October 2004, Cassini captured a series of images that have been composed into this large global natural color view of Saturn and its rings. This grand mosaic consists of 126 images acquired in a tile-like fashion, covering one end of Saturn’s rings to the other and the entire planet in between. (Image courtesy of NASA/JPL/Space Science Institute)
The black hole Gargantua and the surrounding accretion disc from the 2014 movie Interstellar. (Image courtesy of DNEG and Paramount Pictures)
Another visualization of the black hole Gargantua. (Image courtesy of DNEG and Paramount Pictures)
INTERSTELLAR & GARGANTUA
Christensen recalls working for DNEG on Interstellar (2014). “When I first started at DNEG, they asked me to work on the giant waves on Miller’s ocean planet [in the film]. About a week in, my manager took me into the hall and said, ‘I was looking at your reel and saw all this astronomy stuff. We’re working on another sequence with an accretion disk around a black hole that I’m wondering if we should put you on.’ And I said, ‘Oh yeah, I’ve done lots of accretion disks.’ So, for the rest of my time on the show, I was working on the black hole team.” He adds, “There are a lot of people in my community that would be hesitant to label any big-budget movie sequence as a scientific visualization. The typical assumption is that for a Hollywood movie, no one cares about accuracy as long as it looks good. Guardians of the Galaxy makes it seem like space is positively littered with nebulae, and Star Wars makes it seem like asteroids travel in herds.
But the black hole Gargantua in Interstellar is a good case for being called a visualization. The imagery you see in the movie is the direct result of a collaboration with an expert scientist, Dr. Kip Thorne, working with the DNEG research team using the actual Einstein equations that describe the gravity around a black hole.” Thorne is a Nobel Prize-winning theoretical physicist who taught at Caltech for many years. He has reached wide audiences with his books and presentations on black holes, time travel and wormholes on PBS and BBC shows. Christensen comments, “You can make the argument that some of the complexity around what a black hole actually looks like was discarded for the film, and they admit as much in the research paper that was published after the movie came out. But our team at NASA does that same thing. There is no such thing as an objectively ‘true’ scientific image – you always have to make aesthetic decisions around whether the image tells the science story, and often it makes more sense to omit information to clarify what’s important. Ultimately, Gargantua taught a whole lot of people something new about science, and that’s what a good scientific visualization aims to do.”
The SVS produces an annual visualization of the Moon’s phase and libration comprising 8,760 hourly renderings of its precise size, orientation and illumination. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio)
FURTHER CHALLENGES
The sheer size of the data often encountered by Christensen and his peers is a challenge. “I’m currently working with a dataset that is 400GB per timestep. It’s so big that I don’t even want to move it from one file server to another.
So, then I have to make decisions about which data attributes to keep and which to discard, whether there’s a region of the data that I can cull or downsample, and I have to experiment with data compression schemes that might require me to entirely re-design the pipeline I’m using for Houdini. Of course, if I get rid of too much information, it becomes very resource-intensive to recompute everything, but if I don’t get rid of enough, then my design process becomes agonizingly slow.” SVS also works closely with its NASA partner groups Conceptual Image Lab (CIL) and Goddard Media Studios (GMS) to publish a diverse array of content. Conceptual Image Lab focuses more on the artistic side of things – producing high-fidelity renders using film animation and visual design techniques, according to NASA. Where the SVS primarily focuses on making data-based visualizations, CIL puts more emphasis on conceptual visualizations – producing animations featuring NASA spacecraft, planetary observations and simulations, according to NASA. Goddard Media Studios, on the other hand, is more focused towards public outreach – producing interviews, TV programs and documentaries. GMS continues to be the main producers behind NASA TV, and as such, much of their content is aimed towards the general public.
An impact crater on the moon. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio)
Image of Mars showing a partly shadowed Olympus Mons toward the upper left of the image. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio)
Mars. Hellas Basin can be seen in the lower right portion of the image. (Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio)
Mars slightly tilted to show the Martian North Pole.
(Image courtesy of NASA/Goddard Space Flight Center Scientific Visualization Studio)
Christensen notes, “One of the more unique challenges in this field is one of bringing people from very different backgrounds to agree on a common outcome. I work on teams with scientists, communicators and technologists, and we all have different communities we’re trying to satisfy. For instance, communicators are generally trying to simplify animations so their learning goal is clear, but scientists will insist that we add text and annotations on top of the video to eliminate ambiguity and avoid misinterpretations. Often, the technologist will have to say we can’t zoom in or look at the data in a certain way because it will show the data boundaries or data resolution limits. Every shot is a negotiation, but in trying to compromise, we often push the boundaries of what has been done before, which is exciting.”
-
VFX EMMY CONTENDERS: SETTING THE BENCHMARK FOR VISUAL EFFECTS ON TV
By JENNIFER CHAMPAGNE
House of the Dragon expands its dragon-filled world in its second season, offering more large-scale battles and heightened aerial warfare.
The 2025 Emmy race for outstanding visual effects is shaping up to be one of the most competitive in years with major genre heavyweights breaking new ground on what’s possible on television. As prestige fantasy and sci-fi continue to dominate, the battle for the category will likely come down to sheer scale, technical innovation and how seamlessly effects are integrated into storytelling. Returning titans like House of the Dragon and The Lord of the Rings: The Rings of Power have proven their ability to deliver breathtaking visuals. At the same time, Dune: Prophecy enters the conversation as a visually stunning newcomer. The Boys remains the category’s wildcard, bringing its own brand of hyper-realistic, shock-value effects to the race. With its subtle yet immersive world-building, The Penguin stands apart from the spectacle-driven contenders, using “invisible” VFX to transform Gotham into a post-flooded, decaying metropolis. Each series offers a distinct approach to digital effects, making for an intriguing showdown between blockbuster-scale world-building and more nuanced, atmospheric craftsmanship.
Sharing the arena with marquee pacesetters HBO’s The Last of Us, Disney+’s Andor and Netflix’s Squid Game, these series lead the charge in ensuring that the 2025 Emmy race isn’t just about visual spectacle; it’s about which shows will set the next benchmark for visual effects on television. The following insights and highlights from VFX supervisors of likely Emmy contenders illustrate why their award-worthy shows have caught the attention of TV watchers and VFX Emmy voters.
The Penguin, with its subtle yet immersive world-building, stands apart from the spectacle-driven contenders, using “invisible” VFX to transform Gotham into a post-flooded, decaying metropolis. For The Lord of the Rings: The Rings of Power VFX Supervisor Jason Smith, the second season presented some of the Amazon series’ most ambitious visual effects challenges. From the epic Battle of Eregion to the painstaking design of the Entwives, Smith and his team at Wētā FX sought to advance digital world-building while staying true to J.R.R. Tolkien’s vision. “The Battle of Eregion was amazing to work on – and challenging too, because it’s a pivotal moment in Tolkien’s story,” Smith states. Unlike typical large-scale clashes, this battle begins as a siege culminating in an explosive cavalry charge. “We looked for every way we could to heighten the action during the siege by keeping the armies interacting, even at a distance,” Smith explains. His team introduced projectiles and siege weaponry to create dynamic action, ensuring the prolonged standoff felt kinetic. The environment work for Eregion posed another challenge. The city was initially constructed as a massive digital asset in Season 1, showcasing the collaborative brilliance of the Elves and Dwarves. In Season 2, that grandeur had to be systematically razed to the ground. “The progression of destruction had to be planned extremely carefully,” Smith notes. His team devised seven distinct levels of damage, mapping out in granular detail which areas would be smoldering, reduced to rubble or utterly consumed by fire. “Our goal was to have the audience feel the loss that the Elves feel as this beautiful symbol of the height of Elvendom is utterly razed.”
The SSVFX team helped shape a world for Lady in the Lake that felt rich, lived-in and historically precise.
One of the most ambitious effects for Season 4 of The Boys was Splinter, who has the ability to duplicate himself. The sequence required eight hours of rehearsal and six hours of filming for one shot. The final effect was a mix of prosthetic cover-up pieces and VFX face replacement.
The Penguin, HBO Max’s spinoff series of The Batman, centers on Oswald ‘Oz’ Cobb’s ruthless rise to power, and relies on meticulous environmental effects, smoothly integrating CG elements to enhance Gotham’s noir aesthetic without ever calling attention to the work itself. “The most rewarding part of our work was crafting VFX that don’t feel like VFX,” says VFX Supervisor Johnny Han. Across the series’ 3,100 VFX shots, every collapsing freeway, skyline extension and flicker of light from a muzzle flash had to feel utterly real – woven so naturally into the world of Gotham that viewers never stopped to question its authenticity.
Zimia spaceport, an enormous hub of interstellar commerce in Dune: Prophecy. The production team built a vast practical set to provide a strong scale foundation, but its full grandeur came to life in post by extending this environment with CG.
The second season of The Lord of the Rings: The Rings of Power refined its environments, which elevate Middle-earth’s realism.
Some of the series’ most striking visual moments were also its most understated. The shift of Gotham’s seasons – transforming sunlit summer shoots into autumn’s muted chill – helped shape the show’s somber tone, reinforcing the bleak, crime-ridden undercurrent. The city’s bridges and skyscrapers were meticulously augmented, stretching Gotham beyond the limits of practical sets while preserving its grounded, brutalist aesthetic. Even the scars and wounds on Sofia Falcone were enhanced through digital artistry, ensuring that her past traumas remained ever-present, etched into her skin.
The series wasn’t without its large-scale effects – far from it. Han and his team orchestrated massive sequences of urban devastation. “The floodwaters were one of our biggest challenges,” Han notes, referring to the ongoing impact of the catastrophic deluge that left Gotham in ruins. One particularly harrowing sequence required simulating a tsunami tearing through the streets – not as an action set piece, but as a deeply personal moment of loss. “Telling Victor’s story of how he lost his entire family in the bombing and floods of Gotham was heartbreaking,” Han says. “Normally, you create an event like that for excitement, for tension. But for us, it was about capturing emotional devastation.”
Perhaps the most technically intricate sequences were the shootouts, hallmarks of Gotham’s criminal underbelly. “We programmed millisecond-accurate synced flash guns to mimic dramatic gunfire light,” Han explains, ensuring that the interplay of practical and digital elements remained imperceptible. Every muzzle flash, every ricochet was meticulously planned and rendered. The ultimate achievement for Han and his team wasn’t crafting the biggest explosion or the most elaborate digital sequence – it was making Gotham itself feel inescapably real. He says, “Nothing was more important to us than for you to forget that there are 3,100 VFX shots in this series.”
The challenge for The Residence was making one of the most recognizable buildings in the world feel both immersive and narratively engaging.
Bringing the universe of Dune to life on TV for HBO’s Dune: Prophecy requires a delicate balance of realism and imagination, grounded in natural physics, yet awe-inspiring in scale. Dune: Prophecy looks to challenge traditional fantasy dominance with its stunning, desert-bound landscapes and intricate space-faring visuals, uniting the grandeur of Denis Villeneuve’s films with the demands of episodic storytelling. Set thousands of years before the events of the films, the series explores the early days of the Bene Gesserit, a secretive order wielding extraordinary abilities. Translating that power into a visual language required technical innovation. “Kudos to Important Looking Pirates for the space folding and Agony work,” says VFX Supervisor Mike Enriquez. No Dune project would be complete without its most iconic inhabitant, the sandworm. VFX Producer Terron Pratt says, “We’re incredibly proud of what the team at Image Engine created. Precise animation conveyed this creature’s weight and massive scale, while incredibly detailed sand simulations integrated it into the environment.” Every grain of sand had to move believably in response to the worm’s colossal presence to ensure the physics of Arrakis remained authentic.
Floodwaters play a significant part in the destruction of Gotham in The Penguin. One particularly harrowing sequence required simulating a tsunami tearing through the streets.
American Primeval integrated visual effects with practical techniques in creative, unconventional ways. The massacre sequence showcases technical mastery and pulls the audience into the brutal reality of the American frontier.
For the Zimia spaceport, an enormous hub of interstellar commerce, the Dune: Prophecy production team built a vast practical set to provide a strong scale foundation. However, its full grandeur came to life in post. “By extending this environment with CG, we amplified the scope of our world, making it feel expansive and deeply impactful,” Pratt explains. The result was a sprawling, futuristic cityscape that retained a tangible weight with impeccably amalgamated practical and digital elements.
Wētā FX sought to advance digital world-building for Season 2 of The Lord of the Rings: The Rings of Power while staying true to J.R.R. Tolkien’s vision.
Visual effects extended beyond character work for Lady in the Lake, playing a key role in the show’s immersive world-building.
For House of the Dragon VFX Supervisor Daði Einarsson, Season 2 presented some of the HBO show’s most complex and ambitious visual effects work. The Battle at Rook’s Rest in Episode 4 was a milestone for the series, marking the first full-scale dragon-on-dragon aerial battle. “We were tasked with pitting three dragons against each other in an all-out aerial war above a castle siege,” Einarsson says. Capturing the actors’ performances mid-flight required a combination of motion-controlled cameras, preprogrammed motion bases with saddles and LED volume lighting – all mapped directly from fully animated previsualized sequences approved by director Alan Taylor and Showrunner Ryan J. Condal. On the ground, the battlefield required digital crowd replication, extensive environment extensions, and pyrotechnic enhancements to create a war zone that felt both vast and intimately chaotic. “In the air, we created a fully CG version of the environment to have full control over the camera work,” Einarsson explains. Under the supervision of Sven Martin, the Pixomondo team stitched together breathtaking aerial combat, ensuring the dragons moved with the weight and raw power befitting their legendary status.
Blood, weapon effects and period-accurate muzzle flashes heightened the intensity of the brutal fight sequences in American Primeval. The natural elements and violence reflected the harsh realities of the American West in 1857.
The Residence brings a refined, detailed approach to environmental augmentation, using visual effects to take the audience on a journey through the White House in this political murder mystery.
Episode 7 introduced Hugh Hammer’s claim of Vermithor, Westeros’ second-largest dragon. Rather than breaking the sequence into multiple shots, Einarsson and director Loni Peristere saw an opportunity to craft something exceptional: a single, uninterrupted long take reminiscent of Children of Men and Gravity. “It took a lot of planning to design a series of beats that cohesively flowed from one into the next, with Hugh leading the camera by action and reaction,” Einarsson says. The sequence, which involved Hugh dodging Vermithor’s flames and ultimately claiming the beast through sheer bravery, was technically demanding. To achieve this, the team stitched together five separate takes of Hugh’s performance, shot over two separate days weeks apart, due to the set needing to be struck and rebuilt in different configurations. VFX Supervisor Wayne Stables and the team at Wētā ensured the transitions were imperceptible, uniting practical and digital elements into a continuous, immersive moment. “The Dragonmont Cavern environment was a beautiful, raised gantry and cave designed by Jim Clay and expanded by Wētā,” Einarsson says. Then Rowley Imran’s stunt team and Mike Dawson’s SFX team engulfed the set in practical flames so every element, from fire to dust to movement, contributed to the illusion of real-time danger.
For Einarsson, the most significant challenge wasn’t just in making these sequences visually spectacular – it was ensuring they belonged within the same world as the quiet, dialogue-driven moments in King’s Landing. “The aim is for incredibly complex and spectacular visual effects scenes to feel like they belong in the same world as two people talking in a council chamber,” he states. Every dragon, flame and gust of wind had to feel as lived-in as the politics playing out beneath them.
Season 4 of The Boys delivered the fully CG octopus character, Ambrosius. A challenge was crafting a believable yet expressive sea creature and keeping it grounded while still embracing the show’s signature absurdity.
In The Penguin, Gotham isn’t just a city; it’s a living, breathing entity shaped by destruction, decay and the quiet menace lurking beneath its streets.
The Boys continues to defy genre norms, delivering audacious, technically complex effects that lean into its hyperviolent, satirical take on superheroes. For The Boys VFX Supervisor Stephan Fleet, Season 4 delivered some of the Amazon Prime show’s most dramatic effects yet, from the self-replicating Splinter to the fully CG octopus character, Ambrosius. Splinter, who has the ability to duplicate himself, presented a unique challenge. Fleet says, “His introduction on the podium was a complex motion control sequence. Eight hours of rehearsal, six hours of filming – for one shot.” Splinter’s design came with an added layer of difficulty. “We had to figure out how to make a nude male clone,” Fleet says. “Normally, you can hide doubles’ bodies in clothes – not this time!” The final effect required a mix of prosthetic cover-up pieces and VFX face replacement, requiring multiple iterations to make it work. Ambrosius became one of The Boys’ most unexpected breakout characters. “It’s fun making a full-on character in the show that’s an octopus,” Fleet reveals in a nod to the show’s absurd side. “As much as possible, we aim for a grounded approach and try to attain a level of thought and detail you don’t often find on TV.”
While the battle for outstanding visual effects will likely be dominated by large-scale fantasy and sci-fi productions, several standout series are also making waves with their innovative and immersive visual storytelling. Netflix’s The Residence, led by VFX Supervisor Seth Hill, brings a refined, detailed approach to environmental augmentation, enhancing the grandeur of the White House setting in this political murder mystery. “Using visual effects to take the audience on a journey through an iconic location like the White House was really fun,” Hill says. “It’s a cool and unique use of visual effects.” One of the most ambitious sequences involved what the team called the Doll House, a digital rendering of the White House with its south façade removed, exposing the interior like a cross-section of a dollhouse. “Going back and forth from filmed footage to full CGI – that jump from grounded realism to abstract yet still real – was quite tricky,” Hill explains, adding, “VFX is best when it is in service of the storytelling, and The Residence presented a unique opportunity to do just that. It was a big challenge and a tough nut to crack, but those creative and technical hurdles are a good part of what makes it so rewarding.”
“We were tasked with pitting three dragons against each other in an all-out aerial war above a castle siege. In the air, we created a fully CG version of the environment to have full control over the camera work.”—Daði Einarsson, VFX Supervisor, House of the Dragon
The Battle at Rook’s Rest in Episode 4 of House of the Dragon Season 2 was a major milestone for the series, marking the first full-scale dragon-on-dragon aerial battle.
Season 2 of House of the Dragon presented some of the most complex and ambitious visual effects work for the show to date.
For Jay Worth, VFX Supervisor on Apple TV+’s Lady in the Lake, the challenge was two-fold: create seamless effects and preserve the raw emotional truth of a performance. One of the most significant technical achievements was de-aging Natalie Portman. “It seems so easy on paper, but the reality was far more challenging,” Worth admits. Worth had tackled de-aging before, but never with the same level of success. “For me, it is simply because of her performance.” Portman delivered a nuanced, youthful portrayal that felt entirely authentic to the time period. “It made our job both so much easier and set the bar so high for us. Sometimes, you can hide in a scene like this – you pull the camera back, cut away before the most expressive parts of the dialogue, or the illusion breaks,” Worth explains. In Lady in the Lake, there was nowhere to hide. “I think that is what I am most proud of with these shots. It felt like the longer you stayed on them, the more you believed them. That is a real feat with this sort of work.” Skully VFX handled the de-aging. “They nailed the look early on and delivered throughout the project on this difficult task.” Working alongside Production Designer Jc Molina, the VFX team helped shape a world that felt rich, lived-in and historically precise. “We were entrusted with the most important part of this show – do we believe this performance from this character in this part of her journey? – and we feel like we were able to deliver on this challenge.”
On the other end of the spectrum, Netflix’s American Primeval, under the guidance of VFX Supervisor Andrew Ceperley, delivers rugged, visceral realism in its portrayal of the untamed American frontier. With brutal battle sequences, sprawling landscapes and historical re-creations that interweave practical and digital effects, the series stands as a testament to how VFX can enhance grounded, historical storytelling. Ceperley says, “The standout is definitely the nearly three-minute single-shot massacre sequence in the forest episode.” Designed to immerse the audience in the raw, chaotic violence of the frontier, the scene captures every brutal detail with unrelenting intensity. The challenge was crafting invisible visual effects, enhancing practical stunts and destruction without breaking the immersive, handheld camera style. “The sequence was designed to be one shot made up of 10 individual takes, shot over seven days, seamlessly stitched together, all while using a handheld camera on an extremely wide-angle lens.” One of the most complex moments involved a bull smashing through a wagon while the characters hid underneath. Rather than relying on CGI, the team took a practical approach, placing a 360-degree camera under the wagon while the special effects team rigged it to explode in a way that simulated an impact. “A real bull was then guided to run toward the 360 camera and leap over it,” Ceperley says. The footage was blended with live-action shots of the actors with minimal CGI enhancements – just dust and debris – to complete the effect. Adding to the difficulty, the scene was set at sunset, giving the team an extremely limited window to capture each day’s footage. The massacre sequence was a prime example of integrating visual effects with practical techniques in creative, unconventional ways, blending old-school in-camera effects with modern stitching techniques to create a visceral cinematic moment that stayed true to the show’s raw, historical aesthetic. 
“Using old techniques in new, even strange ways and seeing it pay off and deliver on the original vision was the most rewarding part.”
VFX EMMY CONTENDERS: SETTING THE BENCHMARK FOR VISUAL EFFECTS ON TV
By JENNIFER CHAMPAGNE

House of the Dragon expands its dragon-filled world in its second season, offering more large-scale battles and heightened aerial warfare.

The 2025 Emmy race for outstanding visual effects is shaping up to be one of the most competitive in years, with major genre heavyweights breaking new ground on what’s possible on television. As prestige fantasy and sci-fi continue to dominate, the battle for the category will likely come down to sheer scale, technical innovation and how seamlessly effects are integrated into storytelling. Returning titans like House of the Dragon and The Lord of the Rings: The Rings of Power have proven their ability to deliver breathtaking visuals. At the same time, Dune: Prophecy enters the conversation as a visually stunning newcomer. The Boys remains the category’s wildcard, bringing its own brand of hyper-realistic, shock-value effects to the race. With its subtle yet immersive world-building, The Penguin stands apart from the spectacle-driven contenders, using “invisible” VFX to transform Gotham into a post-flooded, decaying metropolis. Each series offers a distinct approach to digital effects, making for an intriguing showdown between blockbuster-scale world-building and more nuanced, atmospheric craftsmanship. Sharing the arena with marquee pacesetters HBO’s The Last of Us, Disney+’s Andor and Netflix’s Squid Game, these series lead the charge in ensuring that the 2025 Emmy race isn’t just about visual spectacle; it’s about which shows will set the next benchmark for visual effects on television. The following insights and highlights from VFX supervisors of likely Emmy contenders illustrate why their award-worthy shows have caught the attention of TV watchers and VFX Emmy voters.
The Penguin, with its subtle yet immersive world-building, stands apart from the spectacle-driven contenders, using “invisible” VFX to transform Gotham into a post-flooded, decaying metropolis. For The Lord of the Rings: The Rings of Power VFX Supervisor Jason Smith, the second season presented some of the Amazon series’ most ambitious visual effects challenges. From the epic Battle of Eregion to the painstaking design of the Entwives, Smith and his team at Wētā FX sought to advance digital world-building while staying true to J.R.R. Tolkien’s vision. “The Battle of Eregion was amazing to work on – and challenging too, because it’s a pivotal moment in Tolkien’s story,” Smith states. Unlike typical large-scale clashes, this battle begins as a siege culminating in an explosive cavalry charge. “We looked for every way we could to heighten the action during the siege by keeping the armies interacting, even at a distance,” Smith explains. His team introduced projectiles and siege weaponry to create dynamic action, ensuring the prolonged standoff felt kinetic. The environment work for Eregion posed another challenge. The city was initially constructed as a massive digital asset in Season 1, showcasing the collaborative brilliance of the Elves and Dwarves. In Season 2, that grandeur had to be systematically razed to the ground. “The progression of destruction had to be planned extremely carefully,” Smith notes. His team devised seven distinct levels of damage, mapping out in granular detail which areas would be smoldering, reduced to rubble or utterly consumed by fire. “Our goal was to have the audience feel the loss that the Elves feel as this beautiful symbol of the height of Elvendom is utterly razed.” The SSVFX team helped shape a world for Lady in the Lake that felt rich, lived-in and historically precise. One of the most ambitious effects for Season 4 of The Boys was Splinter, who has the ability to duplicate himself.
The sequence required eight hours of rehearsal and six hours of filming for one shot. The final effect was a mix of prosthetic cover-up pieces and VFX face replacement. The Penguin, HBO Max’s spinoff series of The Batman, centers on Oswald ‘Oz’ Cobb’s ruthless rise to power, and relies on meticulous environmental effects, smoothly integrating CG elements to enhance Gotham’s noir aesthetic without ever calling attention to the work itself. “The most rewarding part of our work was crafting VFX that don’t feel like VFX,” says VFX Supervisor Johnny Han. Across the series’ 3,100 VFX shots, every collapsing freeway, skyline extension and flicker of light from a muzzle flash had to feel utterly real – woven so naturally into the world of Gotham that viewers never stopped to question its authenticity. Zimia spaceport, an enormous hub of interstellar commerce in Dune: Prophecy. The production team built a vast practical set to provide a strong scale foundation, but its full grandeur came to life in post by extending this environment with CG. The second season of The Lord of the Rings: The Rings of Power refined its environments, which elevate Middle-earth’s realism. Some of the series’ most striking visual moments were also its most understated. The shift of Gotham’s seasons – transforming sunlit summer shoots into autumn’s muted chill – helped shape the show’s somber tone, reinforcing the bleak, crime-ridden undercurrent. The city’s bridges and skyscrapers were meticulously augmented, stretching Gotham beyond the limits of practical sets while preserving its grounded, brutalist aesthetic. Even the scars and wounds on Sofia Falcone were enhanced through digital artistry, ensuring that her past traumas remained ever-present, etched into her skin. The series wasn’t without its large-scale effects – far from it. Han and his team orchestrated massive sequences of urban devastation.
“The floodwaters were one of our biggest challenges,” Han notes, referring to the ongoing impact of the catastrophic deluge that left Gotham in ruins. One particularly harrowing sequence required simulating a tsunami tearing through the streets – not as an action set piece, but as a deeply personal moment of loss. “Telling Victor’s story of how he lost his entire family in the bombing and floods of Gotham was heartbreaking,” Han says. “Normally, you create an event like that for excitement, for tension. But for us, it was about capturing emotional devastation.” Perhaps the most technically intricate sequences were the shootouts, hallmarks of Gotham’s criminal underbelly. “We programmed millisecond-accurate synced flash guns to mimic dramatic gunfire light,” Han explains, ensuring that the interplay of practical and digital elements remained imperceptible. Every muzzle flash, every ricochet was meticulously planned and rendered. The ultimate achievement for Han and his team wasn’t crafting the biggest explosion or the most elaborate digital sequence – it was making Gotham itself feel inescapably real. He says, “Nothing was more important to us than for you to forget that there are 3,100 VFX shots in this series.” The challenge for The Residence was making one of the most recognizable buildings in the world feel both immersive and narratively engaging. Bringing the universe of Dune to life on TV for HBO’s Dune: Prophecy requires a delicate balance of realism and imagination, grounded in natural physics, yet awe-inspiring in scale. Dune: Prophecy looks to challenge traditional fantasy dominance with its stunning, desert-bound landscapes and intricate space-faring visuals, uniting the grandeur of Denis Villeneuve’s films with the demands of episodic storytelling. Set thousands of years before the events of the films, the series explores the early days of the Bene Gesserit, a secretive order wielding extraordinary abilities.
Translating that power into a visual language required technical innovation. “Kudos to Important Looking Pirates for the space folding and [Lila’s] Agony work,” says VFX Supervisor Mike Enriquez. No Dune project would be complete without its most iconic inhabitant, the sandworm. VFX Producer Terron Pratt says, “We’re incredibly proud of what the team at Image Engine created. Precise animation conveyed this creature’s weight and massive scale, while incredibly detailed sand simulations integrated it into the environment.” Every grain of sand had to move believably in response to the worm’s colossal presence to ensure the physics of Arrakis remained authentic. Floodwaters play a significant part in the destruction of Gotham in The Penguin. One particularly harrowing sequence required simulating a tsunami tearing through the streets. American Primeval integrated visual effects with practical techniques in creative, unconventional ways. The massacre sequence showcases technical mastery and pulls the audience into the brutal reality of the American frontier. For the Zimia spaceport, an enormous hub of interstellar commerce, the Dune: Prophecy production team built a vast practical set to provide a strong scale foundation. However, its full grandeur came to life in post. “By extending this environment with CG, we amplified the scope of our world, making it feel expansive and deeply impactful,” Pratt explains. The result was a sprawling, futuristic cityscape that retained a tangible weight with impeccably amalgamated practical and digital elements. Wētā FX sought to advance digital world-building for Season 2 of The Lord of the Rings: The Rings of Power while staying true to J.R.R. Tolkien’s vision. Visual effects extended beyond character work for Lady in the Lake, playing a key role in the show’s immersive world-building. For House of the Dragon VFX Supervisor Daði Einarsson, Season 2 presented some of the HBO show’s most complex and ambitious visual effects work.
The Battle at Rook’s Rest in Episode 4 was a milestone for the series, marking the first full-scale dragon-on-dragon aerial battle. “We were tasked with pitting three dragons against each other in an all-out aerial war above a castle siege,” Einarsson says. Capturing the actors’ performances mid-flight required a combination of motion-controlled cameras, preprogrammed motion bases with saddles and LED volume lighting – all mapped directly from fully animated previsualized sequences approved by director Alan Taylor and Showrunner Ryan J. Condal. On the ground, the battlefield required digital crowd replication, extensive environment extensions, and pyrotechnic enhancements to create a war zone that felt both vast and intimately chaotic. “In the air, we created a fully CG version of the environment to have full control over the camera work,” Einarsson explains. Under the supervision of Sven Martin, the Pixomondo team stitched together breathtaking aerial combat, ensuring the dragons moved with the weight and raw power befitting their legendary status. Blood, weapon effects and period-accurate muzzle flashes heightened the intensity of the brutal fight sequences in American Primeval. The natural elements and violence reflected the harsh realities of the American West in 1857. The Residence brings a refined, detailed approach to environmental augmentation, using visual effects to take the audience on a journey through the White House in this political murder mystery. Episode 7 introduced Hugh Hammer’s claim of Vermithor, Westeros’ second-largest dragon. Rather than breaking the sequence into multiple shots, Einarsson and director Loni Peristere saw an opportunity to craft something exceptional: a single, uninterrupted long take reminiscent of Children of Men and Gravity. “It took a lot of planning to design a series of beats that cohesively flowed from one into the next, with Hugh leading the camera by action and reaction,” Einarsson says.
The sequence, which involved Hugh dodging Vermithor’s flames and ultimately claiming the beast through sheer bravery, was technically demanding. To achieve this, the team stitched together five separate takes of Hugh’s performance, shot over two separate days, weeks apart, due to the set needing to be struck and rebuilt in different configurations. VFX Supervisor Wayne Stables and the team at Wētā ensured the transitions were imperceptible, uniting practical and digital elements into a continuous, immersive moment. “The Dragonmont Cavern environment was a beautiful, raised gantry and cave designed by [Production Designer] Jim Clay and expanded by Wētā,” Einarsson says. Then Rowley Imran’s stunt team and Mike Dawson’s SFX team engulfed the set in practical flames so every element, from fire to dust to movement, contributed to the illusion of real-time danger. For Einarsson, the most significant challenge wasn’t just in making these sequences visually spectacular – it was ensuring they belonged within the same world as the quiet, dialogue-driven moments in King’s Landing. “The aim is for incredibly complex and spectacular visual effects scenes to feel like they belong in the same world as two people talking in a council chamber,” he states. Every dragon, flame and gust of wind had to feel as lived-in as the politics playing out beneath them. Season 4 of The Boys delivered the fully CG octopus character, Ambrosius. A challenge was crafting a believable yet expressive sea creature and keeping it grounded while still embracing the show’s signature absurdity. In The Penguin, Gotham isn’t just a city; it’s a living, breathing entity shaped by destruction, decay and the quiet menace lurking beneath its streets. The Boys continues to defy genre norms, delivering audacious, technically complex effects that lean into its hyperviolent, satirical take on superheroes.
For The Boys VFX Supervisor Stephan Fleet, Season 4 delivered some of the Amazon Prime show’s most dramatic effects yet, from the self-replicating Splinter to the fully CG octopus character, Ambrosius. Splinter, who has the ability to duplicate himself, presented a unique challenge. Fleet says, “His introduction on the podium was a complex motion control sequence. Eight hours of rehearsal, six hours of filming – for one shot.” Splinter’s design came with an added layer of difficulty. “We had to figure out how to make a nude male clone,” Fleet says. “Normally, you can hide doubles’ bodies in clothes – not this time!” The final effect required a mix of prosthetic cover-up pieces and VFX face replacement and took multiple iterations to make it work. Ambrosius became one of The Boys’ most unexpected breakout characters. “It’s fun making a full-on character in the show that’s an octopus,” Fleet reveals in a nod to the show’s absurd side. “As much as possible, we aim for a grounded approach and try to attain a level of thought and detail you don’t often find on TV.” While the battle for outstanding visual effects will likely be dominated by large-scale fantasy and sci-fi productions, several standout series are also making waves with their innovative and immersive visual storytelling. Netflix’s The Residence, led by VFX Supervisor Seth Hill, brings a refined, detailed approach to environmental augmentation, enhancing the grandeur of the White House setting in this political murder mystery. “Using visual effects to take the audience on a journey through an iconic location like the White House was really fun,” Hill says. “It’s a cool and unique use of visual effects.” One of the most ambitious sequences involved what the team called the Doll House, a digital rendering of the White House with its south façade removed, exposing the interior like a cross-section of a dollhouse.
“Going back and forth from filmed footage to full CGI – that jump from grounded realism to abstract yet still real – was quite tricky,” he says, adding, “VFX is best when it is in service of the storytelling, and The Residence presented a unique opportunity to do just that. It was a big challenge and a tough nut to crack, but those creative and technical hurdles are a good part of what makes it so rewarding.” “We were tasked with pitting three dragons against each other in an all-out aerial war above a castle siege. In the air, we created a fully CG version of the environment to have full control over the camera work.”—Daði Einarsson, VFX Supervisor, House of the Dragon The Battle at Rook’s Rest in Episode 4 of House of the Dragon Season 2 was a major milestone for the series, marking the first full-scale dragon-on-dragon aerial battle.Season 2 of House of the Dragon presented some of the most complex and ambitious visual effects work for the show to date.For Jay Worth, VFX Supervisor on Apple TV+’s Lady in the Lake, the challenge was two-fold: create seamless effects and preserve the raw emotional truth of a performance. One of the most significant technical achievements was de-aging Natalie Portman. “It seems so easy on paper, but the reality was far more challenging,” Worth admits. Worth had tackled de-aging before, but never with the same level of success. “For me, it is simply because of her performance.” Portman delivered a nuanced, youthful portrayal that felt entirely authentic to the time period. “It made our job both so much easier and set the bar so high for us. Sometimes, you can hide in a scene like this – you pull the camera back, cut away before the most expressive parts of the dialogue, or the illusion breaks,” Worth explains. In Lady in the Lake, there was nowhere to hide. “I think that is what I am most proud of with these shots. It felt like the longer you stayed on them, the more you believed them. 
That is a real feat with this sort of work.” Skully VFX handled the de-aging. “They nailed the look early on and delivered throughout the project on this difficult task.” Working alongside Production Designer Jc Molina, the VFX team helped shape a world that felt rich, lived-in and historically precise. “We were entrusted with the most important part of this show – do we believe this performance from this character in this part of her journey? – and we feel like we were able to deliver on this challenge.” On the other end of the spectrum, Netflix’s American Primeval, under the guidance of VFX Supervisor Andrew Ceperley, delivers rugged, visceral realism in its portrayal of the untamed American frontier. With brutal battle sequences, sprawling landscapes and historical re-creations that interweave practical and digital effects, the series stands as a testament to how VFX can enhance grounded, historical storytelling. Ceperley says, “The standout is definitely the nearly three-minute single-shot massacre sequence in the forest episode.” Designed to immerse the audience in the raw, chaotic violence of the frontier, the scene captures every brutal detail with unrelenting intensity. The challenge was crafting invisible visual effects, enhancing practical stunts and destruction without breaking the immersive, handheld camera style. “The sequence was designed to be one shot made up of 10 individual takes, shot over seven days, seamlessly stitched together, all while using a handheld camera on an extremely wide-angle lens.” One of the most complex moments involved a bull smashing through a wagon while the characters hid underneath. Rather than relying on CGI, the team took a practical approach, placing a 360-degree camera under the wagon while the special effects team rigged it to explode in a way that simulated an impact. “A real bull was then guided to run toward the 360 camera and leap over it,” Ceperley says. 
The footage was blended with live-action shots of the actors with minimal CGI enhancements – just dust and debris – to complete the effect. Adding to the difficulty, the scene was set at sunset, giving the team an extremely limited window to capture each day’s footage. The massacre sequence was a prime example of integrating visual effects with practical techniques in creative, unconventional ways, blending old-school in-camera effects with modern stitching techniques to create a visceral cinematic moment that stayed true to the show’s raw, historical aesthetic. “Using old techniques in new, even strange ways and seeing it pay off and deliver on the original vision was the most rewarding part.” #vfx #emmy #contenders #setting #benchmarkWWW.VFXVOICE.COMVFX EMMY CONTENDERS: SETTING THE BENCHMARK FOR VISUAL EFFECTS ON TVBy JENNIFER CHAMPAGNE House of the Dragon expands its dragon-filled world in its second season, offering more large-scale battles and heightened aerial warfare. (Image courtesy of HBO) The 2025 Emmy race for outstanding visual effects is shaping up to be one of the most competitive in years with major genre heavyweights breaking new ground on what’s possible on television. As prestige fantasy and sci-fi continue to dominate, the battle for the category will likely come down to sheer scale, technical innovation and how seamlessly effects are integrated into storytelling. Returning titans like House of the Dragon and The Lord of the Rings: The Rings of Power have proven their ability to deliver breathtaking visuals. At the same time, Dune: Prophecy enters the conversation as a visually stunning newcomer. The Boys remains the category’s wildcard, bringing its own brand of hyper-realistic, shock-value effects to the race. With its subtle yet immersive world-building, The Penguin stands apart from the spectacle-driven contenders, using “invisible” VFX to transform Gotham into a post-flooded, decaying metropolis. 
Each series offers a distinct approach to digital effects, making for an intriguing showdown between blockbuster-scale world-building and more nuanced, atmospheric craftsmanship. Sharing the arena with marquee pacesetters HBO’s The Last of Us, Disney+’s Andor and Netflix’s Squid Game, these series lead the charge in ensuring that the 2025 Emmy race isn’t just about visual spectacle; it’s about which shows will set the next benchmark for visual effects on television. The following insights and highlights from VFX supervisors of likely Emmy contenders illustrate why their award-worthy shows have caught the attention of TV watchers and VFX Emmy voters. The Penguin, with its subtle yet immersive world-building, stands apart from the spectacle-driven contenders, using “invisible” VFX to transform Gotham into a post-flooded, decaying metropolis. (Image courtesy of HBO) For The Lord of the Rings: The Rings of Power VFX Supervisor Jason Smith, the second season presented some of the Amazon series’ most ambitious visual effects challenges. From the epic Battle of Eregion to the painstaking design of the Entwives, Smith and his team at Wētā FX sought to advance digital world-building while staying true to J.R.R. Tolkien’s vision. “The Battle of Eregion was amazing to work on – and challenging too, because it’s a pivotal moment in Tolkien’s story,” Smith states. Unlike typical large-scale clashes, this battle begins as a siege culminating in an explosive cavalry charge. “We looked for every way we could to heighten the action during the siege by keeping the armies interacting, even at a distance,” Smith explains. His team introduced projectiles and siege weaponry to create dynamic action, ensuring the prolonged standoff felt kinetic. The environment work for Eregion posed another challenge. The city was initially constructed as a massive digital asset in Season 1, showcasing the collaborative brilliance of the Elves and Dwarves. 
In Season 2, that grandeur had to be systematically razed to the ground. “The progression of destruction had to be planned extremely carefully,” Smith notes. His team devised seven distinct levels of damage, mapping out in granular detail which areas would be smoldering, reduced to rubble or utterly consumed by fire. “Our goal was to have the audience feel the loss that the Elves feel as this beautiful symbol of the height of Elvendom is utterly razed.” The SSVFX team helped shape a world for Lady in the Lake that felt rich, lived-in and historically precise. (Image courtesy of Apple TV+) One of most ambitious effects for Season 4 of The Boys was Splinter, who has the ability to duplicate himself. The sequence required eight hours of rehearsal, six hours of filming, for one shot. The final effect was a mix of prosthetic cover-up pieces and VFX face replacement. (Image courtesy of Prime Video) The Penguin, HBO Max’s spinoff series of The Batman, centers on Oswald ‘Oz’ Cobb’s ruthless rise to power, and relies on meticulous environmental effects, smoothly integrating CG elements to enhance Gotham’s noir aesthetic without ever calling attention to the work itself. “The most rewarding part of our work was crafting VFX that don’t feel like VFX,” says VFX Supervisor Johnny Han. Across the series’ 3,100 VFX shots, every collapsing freeway, skyline extension and flicker of light from a muzzle flash had to feel utterly real – woven so naturally into the world of Gotham that viewers never stopped to question its authenticity. Zimia spaceport, an enormous hub of interstellar commerce in Dune: Prophecy. The production team built a vast practical set to provide a strong scale foundation, but its full grandeur came to life in post by extending this environment with CG.(Images courtesy of HBO) The second season of The Lord of the Rings: The Rings of Power refined its environments, which elevate Middle-earth’s realism. 
(Image courtesy of Prime Video) Some of the series’ most striking visual moments were also its most understated. The shift of Gotham’s seasons – transforming sunlit summer shoots into autumn’s muted chill – helped shape the show’s somber tone, reinforcing the bleak, crime-ridden undercurrent. The city’s bridges and skyscrapers were meticulously augmented, stretching Gotham beyond the limits of practical sets while preserving its grounded, brutalist aesthetic. Even the scars and wounds on Sofia Falcone were enhanced through digital artistry, ensuring that her past traumas remained ever-present, etched into her skin. The series wasn’t without its large-scale effects – far from it. Han and his team orchestrated massive sequences of urban devastation. “The floodwaters were one of our biggest challenges,” Han notes, referring to the ongoing impact of the catastrophic deluge that left Gotham in ruins. One particularly harrowing sequence required simulating a tsunami tearing through the streets – not as an action set piece, but as a deeply personal moment of loss. “Telling Victor’s story of how he lost his entire family in the bombing and floods of Gotham was heartbreaking,” Han says. “Normally, you create an event like that for excitement, for tension. But for us, it was about capturing emotional devastation.” Perhaps the most technically intricate sequences were the shootouts, hallmarks of Gotham’s criminal underbelly. “We programmed millisecond-accurate synced flash guns to mimic dramatic gunfire light,” Han explains, ensuring that the interplay of practical and digital elements remained imperceptible. Every muzzle flash, every ricochet was meticulously planned and rendered. The ultimate achievement for Han and his team wasn’t crafting the biggest explosion or the most elaborate digital sequence – it was making Gotham itself feel inescapably real. 
He says, “Nothing was more important to us than for you to forget that there are 3,100 VFX shots in this series.” The challenge for The Residence was making one of the most recognizable buildings in the world feel both immersive and narratively engaging. (Photo: Erin Simkin. Courtesy of Netflix) Bringing the universe of Dune to life on TV for HBO’s Dune: Prophecy requires a delicate balance of realism and imagination, grounded in natural physics, yet awe-inspiring in scale. Dune: Prophecy looks to challenge traditional fantasy dominance with its stunning, desert-bound landscapes and intricate space-faring visuals, uniting the grandeur of Denis Villeneuve’s films with the demands of episodic storytelling. Set thousands of years before the events of the films, the series explores the early days of the Bene Gesserit, a secretive order wielding extraordinary abilities. Translating that power into a visual language required technical innovation. “Kudos to Important Looking Pirates for the space folding and [Lila’s] Agony work,” says VFX Supervisor Mike Enriquez. No Dune project would be complete without its most iconic inhabitant, the sandworm. VFX Producer Terron Pratt says. “We’re incredibly proud of what the team at Image Engine created. Precise animation conveyed this creature’s weight and massive scale, while incredibly detailed sand simulations integrated it into the environment.” Every grain of sand had to move believably in response to the worm’s colossal presence to ensure the physics of Arrakis remained authentic. Floodwaters play a significant part in the destruction of Gotham in The Penguin. One particularly harrowing sequence required simulating a tsunami tearing through the streets. (Image courtesy of HBO) American Primeval integrated visual effects with practical techniques in creative, unconventional ways. The massacre sequence showcases technical mastery and pulls the audience into the brutal reality of the American frontier. (Photo: Justin Lubin. 
Courtesy of Netflix) For the Zimia spaceport, an enormous hub of interstellar commerce, the Dune: Prophecy production team built a vast practical set to provide a strong scale foundation. However, its full grandeur came to life in post. “By extending this environment with CG, we amplified the scope of our world, making it feel expansive and deeply impactful,” Pratt explains. The result was a sprawling, futuristic cityscape that retained a tangible weight with impeccably amalgamated practical and digital elements. Wētā FX sought to advance digital world-building for Season 2 of The Lord of the Rings: The Rings of Power while staying true to J.R.R. Tolkien’s vision. (Image courtesy of Prime Video) Visual effects extended beyond character work for Lady in the Lake, playing a key role in the show’s immersive world-building. (Image courtesy of Apple TV+) For House of the Dragon VFX Supervisor Daði Einarsson, Season 2 presented some of the HBO show’s most complex and ambitious visual effects work. The Battle at Rook’s Rest in Episode 4 was a milestone for the series, marking the first full-scale dragon-on-dragon aerial battle. “We were tasked with pitting three dragons against each other in an all-out aerial war above a castle siege,” Einarsson says. Capturing the actors’ performances mid-flight required a combination of motion-controlled cameras, preprogrammed motion bases with saddles and LED volume lighting – all mapped directly from fully animated previsualized sequences approved by director Alan Taylor and Showrunner Ryan J. Condal. On the ground, the battlefield required digital crowd replication, extensive environment extensions, and pyrotechnic enhancements to create a war zone that felt both vast and intimately chaotic. “In the air, we created a fully CG version of the environment to have full control over the camera work,” Einarsson explains. 
Under the supervision of Sven Martin, the Pixomondo team stitched together breathtaking aerial combat, ensuring the dragons moved with the weight and raw power befitting their legendary status. Blood, weapon effects and period-accurate muzzle flashes heightened the intensity of the brutal fight sequences in American Primeval. The natural elements and violence reflected the harsh realities of the American west in 1857. (Image courtesy of Netflix) The Residence brings a refined, detailed approach to environmental augmentation, using visual effects to take the audience on a journey through the White House in this political murder mystery. (Photo: Jessica Brooks. Courtesy of Netflix) Episode 7 introduced Hugh Hammer’s claim of Vermithor, Westeros’ second-largest dragon. Rather than breaking the sequence into multiple shots, Einarsson and director Loni Peristere saw an opportunity to craft something exceptional: a single, uninterrupted long take reminiscent of Children of Men and Gravity. “It took a lot of planning to design a series of beats that cohesively flowed from one into the next, with Hugh leading the camera by action and reaction,” Einarsson says. The sequence, which involved Hugh dodging Vermithor’s flames and ultimately claiming the beast through sheer bravery, was technically demanding. To achieve this, the team stitched together five separate takes of Hugh’s performance, shot over two separate days weeks apart, due to the set needing to be struck and rebuilt in different configurations. VFX Supervisor Wayne Stables and the team at Wētā ensured the transitions were imperceptible, uniting practical and digital elements into a continuous, immersive moment. “The Dragonmont Cavern environment was a beautiful, raised gantry and cave designed by [Production Designer] Jim Clay and expanded by Wētā,” Einarsson says. 
Then Rowley Imran’s stunt team and Mike Dawson’s SFX team engulfed the set in practical flames so every element, from fire to dust to movement, contributed to the illusion of real-time danger. For Einarsson, the most significant challenge wasn’t just in making these sequences visually spectacular – it was ensuring they belonged within the same world as the quiet, dialogue-driven moments in King’s Landing. “The aim is for incredibly complex and spectacular visual effects scenes to feel like they belong in the same world as two people talking in a council chamber,” he states. Every dragon, flame and gust of wind had to feel as lived-in as the politics playing out beneath them. Season 4 of The Boys delivered the fully CG octopus character, Ambrosius. A challenge was crafting a believable yet expressive sea creature and keeping it grounded while still embracing the show’s signature absurdity. (Image courtesy of Prime Video) In The Penguin, Gotham isn’t just a city; it’s a living, breathing entity shaped by destruction, decay and the quiet menace lurking beneath its streets. (Images courtesy of HBO) The Boys continues to defy genre norms, delivering audacious, technically complex effects that lean into its hyperviolent, satirical take on superheroes. For The Boys VFX Supervisor Stephan Fleet, Season 4 delivered some of the Amazon Prime show’s most dramatic effects yet, from the self-replicating Splinter to the fully CG octopus character, Ambrosius. Splinter, who has the ability to duplicate himself, presented a unique challenge. Fleet says, “His introduction on the podium was a complex motion control sequence. Eight hours of rehearsal, six hours of filming – for one shot.” Splinter’s design came with an added layer of difficulty. “We had to figure out how to make a nude male clone,” Fleet says. 
“Normally, you can hide doubles’ bodies in clothes – not this time!” The final effect required a mix of prosthetic cover-up pieces and VFX face replacement, and took multiple iterations to make it work. Ambrosius became one of The Boys’ most unexpected breakout characters. “It’s fun making a full-on character in the show that’s an octopus,” Fleet reveals in a nod to the show’s absurd side. “As much as possible, we aim for a grounded approach and try to attain a level of thought and detail you don’t often find on TV.” While the battle for outstanding visual effects will likely be dominated by large-scale fantasy and sci-fi productions, several standout series are also making waves with their innovative and immersive visual storytelling. Netflix’s The Residence, led by VFX Supervisor Seth Hill, brings a refined, detailed approach to environmental augmentation, enhancing the grandeur of the White House setting in this political murder mystery. “Using visual effects to take the audience on a journey through an iconic location like the White House was really fun,” Hill says. “It’s a cool and unique use of visual effects.” One of the most ambitious sequences involved what the team called the Doll House, a digital rendering of the White House with its south façade removed, exposing the interior like a cross-section of a dollhouse. “Going back and forth from filmed footage to full CGI – that jump from grounded realism to abstract yet still real – was quite tricky,” Hill explains, adding, “VFX is best when it is in service of the storytelling, and The Residence presented a unique opportunity to do just that. It was a big challenge and a tough nut to crack, but those creative and technical hurdles are a good part of what makes it so rewarding.” “We were tasked with pitting three dragons against each other in an all-out aerial war above a castle siege. 
In the air, we created a fully CG version of the environment to have full control over the camera work.”—Daði Einarsson, VFX Supervisor, House of the Dragon The Battle at Rook’s Rest in Episode 4 of House of the Dragon Season 2 was a major milestone for the series, marking the first full-scale dragon-on-dragon aerial battle. (Image courtesy of HBO) Season 2 of House of the Dragon presented some of the most complex and ambitious visual effects work for the show to date. (Photo: Theo Whiteman. Courtesy of HBO) For Jay Worth, VFX Supervisor on Apple TV+’s Lady in the Lake, the challenge was two-fold: create seamless effects and preserve the raw emotional truth of a performance. One of the most significant technical achievements was de-aging Natalie Portman. “It seems so easy on paper, but the reality was far more challenging,” Worth admits. Worth had tackled de-aging before, but never with the same level of success. “For me, it is simply because of her performance.” Portman delivered a nuanced, youthful portrayal that felt entirely authentic to the time period. “It made our job both so much easier and set the bar so high for us. Sometimes, you can hide in a scene like this – you pull the camera back, cut away before the most expressive parts of the dialogue, or the illusion breaks,” Worth explains. In Lady in the Lake, there was nowhere to hide. “I think that is what I am most proud of with these shots. It felt like the longer you stayed on them, the more you believed them. That is a real feat with this sort of work.” Skully VFX handled the de-aging. “They nailed the look early on and delivered throughout the project on this difficult task.” Working alongside Production Designer Jc Molina, the VFX team helped shape a world that felt rich, lived-in and historically precise. “We were entrusted with the most important part of this show – do we believe this performance from this character in this part of her journey? 
– and we feel like we were able to deliver on this challenge.” On the other end of the spectrum, Netflix’s American Primeval, under the guidance of VFX Supervisor Andrew Ceperley, delivers rugged, visceral realism in its portrayal of the untamed American frontier. With brutal battle sequences, sprawling landscapes and historical re-creations that interweave practical and digital effects, the series stands as a testament to how VFX can enhance grounded, historical storytelling. Ceperley says, “The standout is definitely the nearly three-minute single-shot massacre sequence in the forest episode.” Designed to immerse the audience in the raw, chaotic violence of the frontier, the scene captures every brutal detail with unrelenting intensity. The challenge was crafting invisible visual effects, enhancing practical stunts and destruction without breaking the immersive, handheld camera style. “The sequence was designed to be one shot made up of 10 individual takes, shot over seven days, seamlessly stitched together, all while using a handheld camera on an extremely wide-angle lens.” One of the most complex moments involved a bull smashing through a wagon while the characters hid underneath. Rather than relying on CGI, the team took a practical approach, placing a 360-degree camera under the wagon while the special effects team rigged it to explode in a way that simulated an impact. “A real bull was then guided to run toward the 360 camera and leap over it,” Ceperley says. The footage was blended with live-action shots of the actors with minimal CGI enhancements – just dust and debris – to complete the effect. Adding to the difficulty, the scene was set at sunset, giving the team an extremely limited window to capture each day’s footage. 
The massacre sequence was a prime example of integrating visual effects with practical techniques in creative, unconventional ways, blending old-school in-camera effects with modern stitching techniques to create a visceral cinematic moment that stayed true to the show’s raw, historical aesthetic. “Using old techniques in new, even strange ways and seeing it pay off and deliver on the original vision was the most rewarding part.” -
CASTING A BLACK MIRROR ON USS CALLISTER: INTO INFINITY
By TREVOR HOGG
Images courtesy of Netflix.
Unlike North America, where episodes tend to be no longer than an hour, it is not uncommon in Britain to have feature-length episodes, which explains why the seasons are shorter. Season 7 of Black Mirror has six episodes, and the first sequel for the Netflix anthology series that explores the dark side of technology runs 90 minutes. “USS Callister: Into Infinity” comes eight years after “USS Callister,” which won four Emmys as part of Season 4, and expands the tale in which illegally constructed digital clones made from human DNA struggle to survive in a multiplayer online video game environment. Returning creative talent includes filmmaker Toby Haynes, writers Charlie Brooker and William Bridges, and cast members Cristin Milioti, Jimmi Simpson, Osy Ikhile, Milanka Brooks, Paul Raymond and Jesse Plemons. Stepping into the Star Trek-meets-The Twilight Zone proceedings for the first time is VFX Supervisor James MacLachlan, who previously handled the digital augmentation for Ted Lasso.
“… We got on a train and went to the middle of Anglesey to a copper mine. The copper mine was absolutely stunning. … You’re a good 50 meters down, and there were little tunnels and caves where over the years things have been mined and stopped. … It was shot there, and we augmented some of it to help sell the fact that it wasn’t Earth. We put in these big beautiful arches of rock, Saturn-like planets up in the sky, a couple of moons, and clean-up of giveaways.”
—James MacLachlan, Visual Effects Supervisor
Taking advantage of the reflective quality of the bridge set was the LED wall utilized for the main viewscreen.
Dealing with a sequel to a critically-acclaimed episode was not a daunting task. “It’s almost like I have a cheat code for what we need to do, which I quite like because there’s a language from the previous show, so we have a certain number of outlines and guidelines,” MacLachlan states. “But because this was set beyond where the previous one was, it’s a different kind of aesthetic. I didn’t feel the pressure.” No assets were reused. “We were lucky that the company that previously did the USS Callister ship packaged it out neatly for us, and we were able to take that model; however, it doesn’t fit in pipelines anymore in the same way with the layering and materials. It was different visual effects vendors as well. Union VFX was smashing out all our new ships, planets and the Heart of Infinity. There was a significant number of resources put into new content.” Old props were helpful. “The Metallica ship that shows up in this episode is actually the Valdack ship turned backwards, upside down, re-textured and re-modeled off a prop I happened to wander past and saw in Charlie Brooker’s and Jessica Rhoades’ office,” MacLachlan notes.
Greenscreens were placed outside of the set windows for the USS Callister.
“USS Callister: Into Infinity” required 669 visual effects shots while the other five episodes totaled 912. “Josie Henwood, the Visual Effects Producer, sat down with a calculator and did an amazing job of making sure that the budget distribution was well-weighted for each of the scripts,” MacLachlan remarks. “We shot this one third and posted it all the way to the end, so it overlapped a lot with some of the others. It was almost an advantage because we could work out where we were at with the major numbers and balance things out. It was a huge benefit that Toby had directed ‘USS Callister’. We had conversations about how we could approach the visual effects and make sure they sat within the budget and timeframe.” Working across the series were Crafty Apes, Jam VFX, Jellyfish Pictures, Magic Lab, One of Us, Stargate Studios, Terraform Studios, Union VFX, and Bigtooth Studios. “We had a spectrum of vendors that were brilliant and weighted so Union VFX took the heavy load on ‘USS Callister: Into Infinity,’ One of Us on ‘Eulogy’ and Jam VFX on ‘Hotel Reverie’ while the other vendors were used for all the shows.”
“We had a matte painter at Territory Studio create some generic space looks like exteriors of planets in pre-production. We gave those to Union VFX who animated them so the stars gently drifted and the planets would slowly rotate. Everything in that set was chrome, so no matter where the camera was pointing, when we went to hyperspace, outside planets or in space, there were all of these beautiful reflections all over the surfaces of the USS Callister. What I did not anticipate is when the actors came onto the set not knowing it was going to be a LED wall. Their reaction was enough to say that we had made the right choice.”
—James MacLachlan, Visual Effects Supervisor
Miranda Jones looked after the production design and constructed a number of practical sets for the different sections of the USS Callister.
A clever visual effect was deployed when a digital clone of Robert Daly is in his garage crafting a new world for Infinity, which transforms from a horizontal landscape into a spherical planetary form. “A lot of it was based off current UI when you use your phone and scroll,” MacLachlan remarks. “It is weighted and slows down through an exponential curve, so we tried to do that with the rotational values. We also looked at people using HoloLenses and Minority Report with those gestural moments. It has a language that a number of people are comfortable with, and we have gotten there with AR.” Union VFX spent a lot of time working on the transition. “They had three levels of detail for each of the moments. We had the mountain range and talked about the Himalayas. Union VFX had these moments where they animated between the different sizes and scales of each of these models. The final one is a wrap and reveal to the sphere, so it’s like you’re scaling down and out of the moment, then it folds out from itself. It was really nice.”
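The scroll-style deceleration MacLachlan describes – rotation slowing along an exponential curve, the way a phone fling settles – can be sketched in a few lines. This is a hypothetical illustration of the general technique, not production code; the function name, friction value and frame rate are assumptions for the example.

```python
# Exponential ease-out of a rotational "fling," mimicking touch-scroll physics:
# angular velocity decays by a constant friction factor each frame, so the
# rotation slows along an exponential curve and settles naturally.
def fling_rotation(initial_velocity_deg_per_s, friction=0.92, fps=60, frames=120):
    angle = 0.0
    velocity = initial_velocity_deg_per_s / fps  # degrees per frame
    angles = []
    for _ in range(frames):
        angle += velocity
        velocity *= friction  # exponential decay -> weighted, slowing motion
        angles.append(angle)
    return angles

# Early frames sweep quickly; later frames barely move, like a scroll coming to rest.
angles = fling_rotation(360.0)
```

The same curve applied to a model’s rotational values gives the “weighted” feel MacLachlan compares to phone UI, since each frame’s step is a fixed fraction of the last.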
For safety reasons, weapons were digitally thrown. “We had a 3D prop printed for the shuriken and were able to get that out in front of the camera onstage,” MacLachlan explains. “Then we decided to have it stand out more, so as it is thrown, it intentionally lights up. On set we couldn’t throw anything at Cristin, so some tracking markers were put on her top where it needed to land. Then we did that in CGI. When she is pulling it off her chest with her hand, the shuriken is all CGI. Because of the shape of the shuriken, we were able to have it poke through the fingers and be visible, so it worked well. Cristin did a convincing job of yanking the shuriken out. We added some blood and increased the size of the wound on her top, which we had to do for a couple of other scenes because blood goes dark when it’s dry, so it needed to be made redder.” Nanette Cole throws a ceremonial knife that hits Robert Daly directly in the head. “That was a crazy one. We had the full prop on the shelf in the beginning that she picks up and throws. The art department made a second one with a cutout section that was mounted to his head. Lucy Cain and I constructed a cage of hair clips and wire to hold it onto his head. Beyond that, we put tracking markers on his forehead, and we were able to add all of the blood. What we didn’t want to do was have too much blood and then have to remove it later. The decision was made to do the blood in post because you don’t want to be redressing it if you’re doing two or three takes; that can take a lot of time out of production.”
“USS Callister: Into Infinity” required 669 visual effects shots.
A digital clone of Robert Daly placed inside the game engine is responsible for creating the vast worlds found inside of Infinity.
“We had a 3D prop printed for the shuriken… Then we decided to have it stand out more, so as it is thrown, it intentionally lights up. On set we couldn’t throw anything at Cristin, so some tracking markers were put on her top where it needed to land. Then we did that in CGI. When she is pulling it off her chest with her hand, the shuriken is all CGI. Because of the shape of the shuriken, we were able to have it poke through the fingers and be visible… Cristin did a convincing job of yanking the shuriken out.”
—James MacLachlan, Visual Effects Supervisor
A cross between 2001: A Space Odyssey and Cast Away is the otherworldly planet where the digital clone of James Walton is found. “We got on a train and went to the middle of Anglesey to a copper mine,” MacLachlan recounts. “The copper mine was absolutely stunning. It’s not as saturated. You’re a good 50 meters down, and there were little tunnels and caves where over the years things have been mined and stopped. We found moments that worked for the different areas. It was shot there, and we augmented some of it to help sell the fact that it wasn’t Earth. We put in these big beautiful arches of rock, Saturn-like planets up in the sky, a couple of moons, and clean-up of giveaways.”
The blue teleportation ring was practically built and digitally enhanced.
Set pieces were LiDAR scanned. “What was interesting about the ice planet is that the art department built these amazing structures in the foreground, and beyond that we had white drapes the whole way around, which fell off into darkness beautifully and naturally because of where the light was pulled by Stephan Pehrsson,” MacLachlan states. “On top of that, there was the special effects department, which was wafting in a lot of atmospherics. Some of the atmospherics were in-camera and others were augmented to even it out and boost it in places to help the situation. We did add foreground snow. There is a big crane shot in the beginning where Unreal Engine assisted in generating some material. Then we did matte painting and set extensions beyond that to create a larger scale and cool rock shapes that were on an angle.” The jungle setting was an actual location. “That’s Black Park, and because of the time of year, there are a lot of protected plants. We had a couple of moments where we weren’t allowed to walk in certain places. There is one big stunt where Nanette steps on a mine, and it explodes her back against a tree. That was a protected tree, so the whole thing was wrapped in this giant stunt mat while the stunt woman got thrown across it. Areas would be filled in with dressed plants to help the foreground, but we got most of the background in-camera. There were bits of clean-up where we spotted crew or trucks.”
Large-scale and distinct rock shapes were placed at an angle to give the ice planet more of an alien quality.
An exterior space shot of the USS Callister that is entirely CG.
Twin versions of Nanette Cole and James Walton appear within the same frame. “Literally, we used every trick in the book the whole way through. Stephan and I went to see a motion control company that had a motion control camera on a TechnoDolly. Stephan could put it on his shoulder and record a move on a 20-foot crane. Once Stephan had done that first take, he would step away, then the motion control guys would do the same move again. You get this handheld feel through motion control rather than plotting two points and having it mechanical. You get a wide of a scene of clone Nanette in a chair and real Nanette standing in white, and you’ll notice the two Waltons in the background interacting with one another. Those shots were done on this motion control rig. We had motion control where we could plot points to make it feel like a tracking dolly. Then we also had our cameraman doing handheld moves pushing in and repeating himself. We had a wonderful double for Cristin who was excellent at mirroring what she was achieving, and they would switch and swap. You would have a shoulder or hair in the foreground in front of you, but then we would also stitch plates together that were handheld.”
The USS Callister approaches the game engine situated at the Heart of Infinity.
A homage to the fighter cockpit shots featured in the Star Wars franchise.
USS Callister flies into the game engine while pursued by other Infinity players.
A major story point is that the game engine is made to look complex but is in fact a façade.
A copper mine served as the location for the planet where the digital clone of James Walton is found.
Principal photography for the jungle planet took place at Black Park in England.
The blue skin of Elena Tulaska was achieved with practical makeup.
Assisting the lighting were some cool tools such as the teleportation ring. “We had this beautiful two-meter blue ring that we were able to put on the ground and light up as people step into it,” MacLachlan remarks. “You get these lovely reflections on their visors, helmets and kits. Then we augmented the blue ring in visual effects where it was replaced with more refined edging and lighting effects that stream up from it, which assisted with the integration with the teleportation effect because of their blue cyan tones.” Virtual production was utilized for the main viewscreen located on the bridge of the USS Callister. “In terms of reflections, the biggest boon for us in visual effects was the LED wall. The last time they did the big screen in the USS Callister was a greenscreen. We got a small version of a LED screen when the set was being built and did some tests. Then we had a matte painter at Territory Studio create some generic space looks like exteriors of planets in pre-production. We gave those to Union VFX who animated them so the stars gently drifted and the planets would slowly rotate. Everything in that set was chrome, so no matter where the camera was pointing, when we went to hyperspace or outside planets or in space, there were all of these beautiful reflections all over the surfaces of the USS Callister. What I did not anticipate is when the actors came onto the set not knowing it was going to be a LED wall. Their reaction was enough to say that we had made the right choice.”
Returning creative talent includes filmmaker Toby Haynes, writers Charlie Brooker and William Bridges, and cast members Cristin Milioti, Jimmi Simpson, Osy Ikhile, Milanka Brooks, Paul Raymond and Jesse Plemons. Stepping into the Star Trek-meets-The Twilight Zone proceedings for the first time is VFX Supervisor James MacLachlan, who previously handled the digital augmentation for Ted Lasso. “[For the planet where the digital clone of James Walton is found]… We got on a train and went to the middle of Anglesey [island in Wales] to a copper mine. The copper mine was absolutely stunning. … You’re a good 50 meters down, and there were little tunnels and caves where over the years things have been mined and stopped. … It was shot there, and we augmented some of it to help sell the fact that it wasn’t Earth. We put in these big beautiful arches of rock, Saturn-like planets up in the sky, a couple of moons, and clean-up of giveaways.” —James MacLachlan, Visual Effects Supervisor Taking advantage of the reflective quality of the bridge set was the LED wall utilized for the main viewscreen. Dealing with a sequel to a critically-acclaimed episode was not a daunting task. “It’s almost like I have a cheat code for what we need to do, which I quite like because there’s a language from the previous show, so we have a certain number of outlines and guidelines,” MacLachlan states. “But because this was set beyond where the previous one was, it’s a different kind of aesthetic. I didn’t feel the pressure.” No assets were reused. “We were lucky that the company that previously did the USS Callister ship packaged it out neatly for us, and we were able to take that model; however, it doesn’t fit in pipelines anymore in the same way with the layering and materials. It was different visual effects vendors as well. Union VFX was smashing out all our new ships, planets and the Heart of Infinity. There was a significant number of resources put into new content.” Old props were helpful.
“The Metallica ship that shows up in this episode is actually the Valdack ship turned backwards, upside down, re-textured and re-modeled off a prop I happened to wander past and saw in Charlie Brooker’s and Jessica Rhoades’ office,” MacLachlan notes. Greenscreens were placed outside of the set windows for the USS Callister. “USS Callister: Into Infinity” required 669 visual effects shots while the other five episodes totaled 912. “Josie Henwood, the Visual Effects Producer, sat down with a calculator and did an amazing job of making sure that the budget distribution was well-weighted for each of the scripts,” MacLachlan remarks. “We shot this one third and posted it all the way to the end, so it overlapped a lot with some of the others. It was almost an advantage because we could work out where we were at with the major numbers and balance things out. It was a huge benefit that Toby had directed ‘USS Callister’. We had conversations about how we could approach the visual effects and make sure they sat within the budget and timeframe.” Working across the series were Crafty Apes, Jam VFX, Jellyfish Pictures, Magic Lab, One of Us, Stargate Studios, Terraform Studios, Union VFX, and Bigtooth Studios. “We had a spectrum of vendors that were brilliant and weighted so Union VFX took the heavy load on ‘USS Callister: Into Infinity,’ One of Us on ‘Eulogy’ and Jam VFX on ‘Hotel Reverie’ while the other vendors were used for all the shows.” “[W]e had a matte painter at Territory Studio create some generic space looks like exteriors of planets in pre-production. We gave those to Union VFX who animated them so the stars gently drifted and the planets would slowly rotate. Everything in that set was chrome, so no matter where the camera was pointing, when we went to hyperspace, outside planets or in space, there were all of these beautiful reflections all over the surfaces of the USS Callister.
What I did not anticipate is when the actors came onto the set not knowing it was going to be a LED wall. Their reaction was enough to say that we had made the right choice.” —James MacLachlan, Visual Effects Supervisor Miranda Jones looked after the production design and constructed a number of practical sets for the different sections of the USS Callister. A clever visual effect was deployed when a digital clone of Robert Daly (Jesse Plemons) is in his garage crafting a new world for Infinity, which transforms from a horizontal landscape into a spherical planetary form. “A lot of it was based off current UI when you use your phone and scroll,” MacLachlan remarks. “It is weighted and slows down through an exponential curve, so we tried to do that with the rotational values. We also looked at people using HoloLenses and Minority Report with those gestural moments. It has a language that a number of people are comfortable with, and we have gotten there with AR.” Union VFX spent a lot of time working on the transition. “They had three levels of detail for each of the moments. We had the mountain range and talked about the Himalayas. Union VFX had these moments where they animated between the different sizes and scales of each of these models. The final one is a wrap and reveal to the sphere, so it’s like you’re scaling down and out of the moment, then it folds out from itself. It was really nice.” For safety reasons, weapons were digitally thrown. “We had a 3D prop printed for the shuriken and were able to get that out in front of the camera onstage,” MacLachlan explains. “Then we decided to have it stand out more, so as [the Infinity Player] throws it, it intentionally lights up. On set we couldn’t throw anything at Cristin, so some tracking markers were put on her top where it needed to land. Then we did that in CGI. When she is pulling it off her chest with her hand, the shuriken is all CGI.
Because of the shape of the shuriken, we were able to have it poke through the fingers and was visible, so it worked well. Cristin did a convincing job of yanking the shuriken out. We added some blood and increased the size of the wound on her top, which we had to do for a couple of other scenes because blood goes dark when it’s dry, so it needed to be made redder.” Nanette Cole (Cristin Milioti) throws a ceremonial knife that hits Robert Daly directly in the head. “That was a crazy one. We had the full prop on the shelf in the beginning that she picks up and throws. The art department made a second one with a cutout section that was mounted to his head. Lucy Cain [Makeup & Hair Designer] and I constructed a cage of hair clips and wire to hold it onto his head. Beyond that, we put tracking markers on his forehead, and we were able to add all of the blood. What we didn’t want to do was have too much blood and then have to remove it later. The decision was made to do the blood in post because you don’t want to be redressing it if you’re doing two or three takes; that can take a lot of time out of production.” “USS Callister: Into Infinity” required 669 visual effects shots. A digital clone of Robert Daly placed inside the game engine is responsible for creating the vast worlds found inside of Infinity. “We had a 3D prop printed for the shuriken [hidden hand weapon]… Then we decided to have it stand out more, so as [the Infinity Player] throws it, it intentionally lights up. On set we couldn’t throw anything at Cristin, so some tracking markers were put on her top where it needed to land. Then we did that in CGI. When she is pulling it off her chest with her hand, the shuriken is all CGI.
Because of the shape of the shuriken, we were able to have it poke through the fingers and was visible… Cristin did a convincing job of yanking the shuriken out.” —James MacLachlan, Visual Effects Supervisor A cross between 2001: A Space Odyssey and Cast Away is the otherworldly planet where the digital clone of James Walton (Jimmi Simpson) is found. “We got on a train and went to the middle of Anglesey [island in Wales] to a copper mine,” MacLachlan recounts. “The copper mine was absolutely stunning. It’s not as saturated. You’re a good 50 meters down, and there were little tunnels and caves where over the years things have been mined and stopped. We found moments that worked for the different areas. It was shot there, and we augmented some of it to help sell the fact that it wasn’t Earth. We put in these big beautiful arches of rock, Saturn-like planets up in the sky, a couple of moons, and clean-up of giveaways.” The blue teleportation ring was practically built and digitally enhanced. Set pieces were LiDAR scanned. “What was interesting about the ice planet [was that] the art department built these amazing structures in the foreground and beyond that we had white drapes the whole way around, which fell off into darkness beautifully and naturally because of where the light was pulled by Stephan Pehrsson [Cinematographer],” MacLachlan states. “On top of that, there was the special effects department, which was wafting in a lot of atmospherics. Some of the atmospherics were in-camera and others were augmented to even it out and boost it in places to help the situation. We did add foreground snow. There is a big crane shot in the beginning where Unreal Engine assisted in generating some material. Then we did matte painting and set extensions beyond that to create a larger scale and cool rock shapes that were on an angle.” The jungle setting was an actual location. “That’s Black Park [in England], and because of the time of year, there are a lot of protected plants. 
We had a couple of moments where we weren’t allowed to walk in certain places. There is one big stunt where Nanette steps on a mine, and it explodes her back against a tree. That was a protected tree, so the whole thing was wrapped in this giant stunt mat while the stunt woman got thrown across it. Areas would be filled in with dressed plants to help the foreground, but we got most of the background in-camera. There were bits of clean-up where we spotted crew or trucks.” Large-scale and distinct rock shapes were placed at an angle to give the ice planet more of an alien quality. An exterior space shot of the USS Callister that is entirely CG. Twin versions of Nanette Cole and James Walton appear within the same frame. “Literally, we used every trick in the book the whole way through. Stephan and I went to see a motion control company that had a motion control camera on a TechnoDolly. Stephan could put it on his shoulder and record a move on a 20-foot crane. Once Stephan had done that first take, he would step away, then the motion control guys would do the same move again. You get this handheld feel through motion control rather than plotting two points and having it mechanical. You get a wide of a scene of clone Nanette in a chair and real Nanette standing in white, and you’ll notice the two Waltons in the background interacting with one another. Those shots were done on this motion control rig. We had motion control where we could plot points to make it feel like a tracking dolly. Then we also had our cameraman doing handheld moves pushing in and repeating himself. We had a wonderful double for Cristin who was excellent at mirroring what she was achieving, and they would switch and swap. You would have a shoulder or hair in the foreground in front of you, but then we would also stitch plates together that were handheld.” The USS Callister approaches the game engine situated at the Heart of Infinity. 
A homage to the fighter cockpit shots featured in the Star Wars franchise. USS Callister flies into the game engine while pursued by other Infinity players. A major story point is that the game engine is made to look complex but is in fact a façade. A copper mine served as the location for the planet where the digital clone of James Walton (Jimmi Simpson) is found. Principal photography for the jungle planet took place at Black Park in England. The blue skin of Elena Tulaska (Milanka Brooks) was achieved with practical makeup. Assisting the lighting were some cool tools such as the teleportation ring. “We had this beautiful two-meter blue ring that we were able to put on the ground and light up as people step into it,” MacLachlan remarks. “You get these lovely reflections on their visors, helmets and kits. Then we augmented the blue ring in visual effects where it was replaced with more refined edging and lighting effects that stream up from it, which assisted with the integration with the teleportation effect because of their blue cyan tones.” Virtual production was utilized for the main viewscreen located on the bridge of the USS Callister. “In terms of reflections, the biggest boon for us in visual effects was the LED wall. The last time they did the big screen in the USS Callister was a greenscreen. We got a small version of a LED screen when the set was being built and did some tests. Then we had a matte painter at Territory Studio create some generic space looks like exteriors of planets in pre-production. We gave those to Union VFX who animated them so the stars gently drifted and the planets would slowly rotate. Everything in that set was chrome, so no matter where the camera was pointing, when we went to hyperspace or outside planets or in space, there were all of these beautiful reflections all over the surfaces of the USS Callister. What I did not anticipate is when the actors came onto the set not knowing it was going to be a LED wall. 
Their reaction was enough to say that we had made the right choice.”
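The phone-scroll behavior MacLachlan cites for the garage world-building interface, weighted motion that slows along an exponential curve, can be sketched in code. This is a minimal illustration of that kind of ease-out applied to a rotational value; the function name and constants are illustrative assumptions, not taken from the production's actual pipeline.

```python
import math

def fling_rotation(v0_deg_per_s: float, decay_per_s: float,
                   duration_s: float, fps: int = 24) -> list[float]:
    """Per-frame rotation angles (degrees) for an exponential ease-out.

    Angular velocity decays as v(t) = v0 * exp(-k * t), so the angle is
    its integral: angle(t) = (v0 / k) * (1 - exp(-k * t)). The rotation
    covers most of its travel early, then settles asymptotically, like a
    phone scroll slowing to rest.
    """
    k = decay_per_s
    n_frames = int(duration_s * fps) + 1
    return [(v0_deg_per_s / k) * (1.0 - math.exp(-k * (f / fps)))
            for f in range(n_frames)]

# A two-second fling starting at 360 deg/s settles toward v0/k = 120 degrees.
angles = fling_rotation(v0_deg_per_s=360.0, decay_per_s=3.0, duration_s=2.0)
```

Each frame advances less than the one before it, which gives the "weighted" feel MacLachlan describes without any hand-keyed slow-down.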
MAXIMIZING THE STRENGTH OF ANIME FOR SOLO LEVELING
By TREVOR HOGG
Images © Solo Leveling Animation Partners.
Like the protagonist Sung Jinwoo, the Solo Leveling franchise has leveled up from being a web novel created by Chugong to a webtoon to an anime series with each medium expanding upon the other in world-building, character development and narrative scope. What remains at the core of the fantasy tale are portals to other dimensions that contain Magic Beasts that threaten to overtake the Earth, with the only defense being humans known as Hunters who have manifested supernatural abilities to varying degrees. At the lowest end of the power spectrum is Sung Jinwoo, but his fortunes dramatically change when a mysterious entity puts him through a series of tests that enable him to improve so much that his achievements become legendary. As part of the Emmy Awards campaign which sees the Aniplex, A-1 Pictures and Crunchyroll partnership being submitted for Outstanding Animated Program, Outstanding Individual Achievement in Animation – Character Animation and Outstanding Individual Achievement in Animation – Character Design, three of the major players recently gathered in Los Angeles along with a translator to discuss the creative challenges in producing two seasons consisting of 25 episodes.
Storyboard and the final animated still featuring renowned S-Level Hunter Cha Hae-In, who becomes the wife of Sung Jinwoo.
“The world of Solo Leveling takes place in some country in present day, so making sure that reality comes across to an audience was important for us. The original work is Korean, so Korea is the foundation. In the Japanese version that was also released as a manga, there were some slight cultural differences and adaptation. When animating this, we wanted to make sure that all of that was embraced and encapsulated. Anything from the exterior textures on buildings down to traffic rules.”
—Sota Furuhashi, Producer, Aniplex
Even though Shunsuke Nakashige is guiding the vision of the adaptation, the nature of anime requires multiple contributors. “Personally, what I think makes anime interesting is this idea that the same person doesn’t illustrate or draw everything,” notes Sota Furuhashi, Producer at Aniplex. “That brings a lot of different textures and layers to how things are expressed. Take even one shot or scene: There will be multiple people animating this, so that gives the visual its own unique taste. This all comes together into a singular experience for audiences to enjoy. In the case of Solo Leveling, the action component is something that needs to be constantly evolving, so there is a lot of different talent that is pooled into these different scenes; that’s what adds value to the Solo Leveling experience.” Despite the fantastical elements, there has to be an element of realism to the world-building. “I want to preface this by saying I am only a proxy for the director, but the world of Solo Leveling takes place in some country in present day, so making sure that reality comes across to an audience was important for us,” Furuhashi explains. “The original work is Korean, so Korea is the foundation. In the Japanese version that was also released as a manga, there were some slight cultural differences and adaptation. When animating this, we wanted to make sure that all of that was embraced and encapsulated. Anything from the exterior textures on buildings down to traffic rules. We wanted to express what happens in people’s daily lives because that creates a lot of reality in terms of world-building.”
Storyboards for the fight between Sung Jinwoo and Blue Venom-Fanged Kasaka which emphasize the size and scale of the Magic Beast.
“When we first saw Solo Leveling, which is quite serious and has a lot of violence, and understood the intent, the decision was made not to run away from that and to make sure it was embraced because it showed how serious the creators were in wanting to push the envelope.”
—Kanako Takahashi, Producer, Crunchyroll
A combination of 2D and 3D animation techniques was utilized. “We wanted to use 2D hand-drawn animation as the foundation, but I’m sure you’re aware it doesn’t scale well in terms of mass-producing groups of enemies,” states Atsushi Kaneko, Producer at A-1 Pictures. “Take any character that has a name – those are all hand-drawn. But other characters we decided to go with 3D because there’s a lot more range of expression that we could do and was more efficient. That was a decision we made with the director and CG director.” Initially, the Shadow Army, which consists of the enemies killed by Sung Jinwoo and subsequently resurrected by him upon gaining the ability of a necromancer, was to be 2D rather than 3D. “The Shadow Army doesn’t have a lot of visual cues and information in terms of the colors and lines,” Kaneko remarks. “That would have been easy to animate in 2D, but in terms of how that would be conveyed on the screen, it would feel a little lacking. Then we pivoted to 3D and ran a lot of tests. When we finished it, got it through the pipeline and composited it together, there was a lot more range in terms of what we could express through animation.”
Storyboard and final close-up shot of Sung Jinwoo as he takes on the Ice Elves.
Fights that are briefly depicted in the source material became major action sequences. “In terms of the choreography of the action scenes themselves, our protagonist, Sung Jinwoo, at the beginning is weak and pathetic, and we wanted to make sure that we captured that inexperience and clumsiness in his actions,” Kaneko states. “We agreed with the animation team that we would go in stages. As he becomes stronger, so would his action. In Episode 104, after he reawakens, we start to move a little away from reality and go into this suspension of disbelief. Jinwoo’s actions and choreography improved significantly, and you could see that over several subsequent episodes. We wanted to make sure that it felt dynamic, alive and cathartic when audiences see it. But it wasn’t just that thought. A lot of the decisions in terms of how the action was going to unfold didn’t always come from the director or action director. Several of our keyframe artists in the storyboarding phase, or while animating, almost ad lib, so there was a lot of creative agency the animators had on the actual ground floor. That’s how all of the action came together under the guidance of the director and action director.”
The Shadow Army features defeated foes such as Igris and Iron who have been resurrected by Sung Jinwoo through his ability as a necromancer.
Blood and gore are not in short supply during the fights. “Early on, the director and I decided not to shy away from the gore, but we did have to have the conversation because in Japan if something is going to be broadcast then it needs to fall within certain limits,” Furuhashi explains. “But we wanted to push the limits of expression as far as we could for something that would go on air. Part of that comes from this idea that the series starts with the main character’s death, and he is beginning from the bottom; that needed to be conveyed to audiences in a visual way. Another reason that we didn’t want to shy away from the gore is even though this is a fantasy world, it’s set in the present day as we know it. In order to make sure that reality was conveyed, if you cut someone’s arm with a sword, they’re going to lose an arm.” Crunchyroll understood that the violence was an integral part of the storytelling. “When we first saw Solo Leveling,” remarks Kanako Takahashi, Producer at Crunchyroll, “which is quite serious and has a lot of violence, and understood the intent, the decision was made not to run away from that and to make sure it was embraced because it showed how serious the creators were in wanting to push the envelope.”
An example of how detailed the storyboards can be, down to the eyes, which is also reflected in the final animation.
“A lot of the decisions in terms of how the action was going to unfold didn’t always come from the director or action director. Several of our keyframe artists in the storyboarding phase, or while animating, almost ad lib, so there was a lot of creative agency the animators had on the actual ground floor. That’s how all of the action came together under the guidance of the director and action director.”
—Atsushi Kaneko, Producer, A-1 Pictures
3D animation was utilized for the Shadow Army in order to make their simplistic shapes and forms more dynamic.
Magic Beasts appear ranging from the Blue Venom-Fanged Kasaka to Ice Elves that have volatile tempers and awe-inspiring agility. “I don’t know if I would call it a monster, but one scene that stands out to me is the Kargalgan gun battle because the action in that scene and the level of quality is second to none,” Takahashi observes. “Seeing how passionate the fans were and their online reaction to that battle in particular left a big impression on me.” The goal was to focus on adapting the source material as opposed to making creative decisions driven by the desire to stand out from other anime productions. “It was a natural result, and we’re grateful for the chance,” Kaneko states. “I didn’t feel like we were doing anything too different or out of the ordinary, but the chain of events that led ultimately to how fans reacted to it is something that differentiates us from other anime. In a way, we are also searching for that answer because we didn’t set out to do something very different or say, ‘We’re going to make this series about this or this is the angle we’re going to come from.’ The differentiation was also a byproduct of taking this IP and animating it.”
The Shadow Army is ordered by Sung Jinwoo to assist the Hunters during the 4th Jeju Island Raid.
A scale and proportion study between Sung Jinwoo and the Ant King along with a final still of the bloodthirsty Magic Beast.
The passionate fan reaction to the Kargalgan gun battle left a lasting impression on Crunchyroll Producer Kanako Takahashi.
As Sung Jinwoo becomes stronger, his actions become more dynamic.
There is no holding back on the blood and gore as the consequences of being struck by weapons are grounded in the real world despite the fantastical settings.
A quiet moment as Cha Hae-In ponders her encounter with Sung Jinwoo.
Watch the dynamic Jinwoo vs. Kargalgan battle in Season 2 of Solo Leveling: Arise from the Shadow courtesy of Crunchyroll and Solo Leveling Animation Partners. And watch a GIF on the design and construction of an action sequence for the anime opening of Solo Leveling by Studio Ppuri’s Inseung Choi and his animation team.
“I didn’t feel like we were doing anything too different or out of the ordinary, but the chain of events that led ultimately to how fans reacted to it is something that differentiates us from other anime. In a way, we are also searching for that answer because we didn’t set out to do something very different or say, ‘We’re going to make this series about this or this is the angle we’re going to come from.’ The differentiation was also a byproduct of taking this IP and animating it.” The Shadow Army is ordered by Sung Jinwoo to assist the Hunters during the 4th Jeju Island Raid.A scale and proportion study between Sung Jinwoo and the Ant King along with a final still of the bloodthirsty Magic Beast.The passionate fan reaction to the Kargalgan gun battle left a lasting impression on Crunchyroll Producer Kanako Takahashi. As Sung Jinwoo becomes stronger, his actions become more dynamic. There is no holding back on the blood and gore as the consequences of being struck by weapons are grounded in the real world despite the fantastical settings. A quiet moment as Cha Hae-In ponders her encounter with Sung Jinwoo.Watch the dynamic Jinwoo vs. Kargalgan battle in Season 2 of Solo Leveling: Arise from the Shadow courtesy of Crunchyroll and Solo Leveling Animation Partners. Click here: . And watch a GIF on the design and construction of an action sequence for the anime opening of Solo Leveling by Studio Ppuri’s Inseung Choi and his animation team. Click here: . #maximizing #strength #anime #solo #levelingWWW.VFXVOICE.COMMAXIMIZING THE STRENGTH OF ANIME FOR SOLO LEVELINGBy TREVOR HOGG Images © Solo Leveling Animation Partners. Like the protagonist Sung Jinwoo, the Solo Leveling franchise has leveled up from being a web novel created by Chugong to a webtoon to anime series with each medium expanding upon the other in world-building, character development and narrative scope. 
What remains at the core of the fantasy tale are portals to other dimensions that contain Magic Beasts that threaten to overtake the Earth, with the only defense being humans known as Hunters who have manifested supernatural abilities to varying degrees. At the lowest end of the power spectrum is Sung Jinwoo, but his fortunes dramatically change when a mysterious entity puts him through a series of tests that enable him to improve so much that his achievements become legendary. As part of the Emmy Awards campaign, which sees the Aniplex, A-1 Pictures and Crunchyroll partnership being submitted for Outstanding Animated Program, Outstanding Individual Achievement in Animation – Character Animation (Yoshihiro Kanno) and Outstanding Individual Achievement in Animation – Character Design (Tomoko Sudo), three of the major players recently gathered in Los Angeles along with a translator to discuss the creative challenges in producing two seasons consisting of 25 episodes.
Storyboard and the final animated still featuring renowned S-Rank Hunter Cha Hae-In, who becomes the wife of Sung Jinwoo. (Image courtesy of Atsushi Kaneko via X)
“[T]he world of Solo Leveling takes place in some country in present day, so making sure that reality comes across to an audience was important for us. The original work is Korean, so Korea is the foundation. In the Japanese version that was also released as a manga, there were some slight cultural differences and adaptation. When animating this, we wanted to make sure that all of that was embraced and encapsulated. Anything from the exterior textures on buildings down to traffic rules.”
—Sota Furuhashi, Producer, Aniplex
Even though Shunsuke Nakashige [director] is guiding the vision of the adaptation, the nature of anime requires multiple contributors. “Personally, what I think makes anime interesting is this idea that the same person doesn’t illustrate or draw everything,” notes Sota Furuhashi, Producer at Aniplex.
“That brings a lot of different textures and layers to how things are expressed. Take even one shot or scene: There will be multiple people animating this, so that gives the visual its own unique taste. This all comes together into a singular experience for audiences to enjoy. In the case of Solo Leveling, the action component is something that needs to be constantly evolving, so there is a lot of different talent that is pooled into these different scenes; that’s what adds value to the Solo Leveling experience.”
Despite the fantastical elements, there has to be an element of realism to the world-building. “I want to preface this by saying I am only a proxy for the director, but the world of Solo Leveling takes place in some country in present day, so making sure that reality comes across to an audience was important for us,” Furuhashi explains. “The original work is Korean, so Korea is the foundation. In the Japanese version that was also released as a manga, there were some slight cultural differences and adaptation. When animating this, we wanted to make sure that all of that was embraced and encapsulated. Anything from the exterior textures on buildings down to traffic rules. We wanted to express what happens in people’s daily lives because that creates a lot of reality in terms of world-building.”
Storyboards for the fight between Sung Jinwoo and Blue Venom-Fanged Kasaka which emphasize the size and scale of the Magic Beast. (Image courtesy of Atsushi Kaneko via X)
“When we first saw Solo Leveling, which is quite serious and has a lot of violence, and understood the intent, the decision was made not to run away from [the violence] and to make sure it was embraced because it showed how serious the creators were in wanting to push the envelope.”
—Kanako Takahashi, Producer, Crunchyroll
A combination of 2D and 3D animation techniques was utilized.
“We wanted to use 2D hand-drawn animation as the foundation, but I’m sure you’re aware it doesn’t scale well in terms of mass-producing groups of enemies,” states Atsushi Kaneko, Producer at A-1 Pictures. “Take any character that has a name – those are all hand-drawn. But other characters we decided to go with 3D because there’s a lot more range of expression that we could do and was more efficient. That was a decision we made with the director and CG director [Toshitaka Morioka].”
Initially, the Shadow Army, which consists of the enemies killed by Sung Jinwoo and subsequently resurrected by him upon gaining the ability of a necromancer, was to be 2D rather than 3D. “The Shadow Army doesn’t have a lot of visual cues and information in terms of the colors and lines,” Kaneko remarks. “That would have been easy to animate in 2D, but in terms of how that would be conveyed on the screen, it would feel a little lacking. Then we pivoted to 3D and ran a lot of tests. When we finished it, got it through the pipeline and composited it together, there was a lot more range in terms of what we could express through animation.”
Storyboard and final close-up shot of Sung Jinwoo as he takes on the Ice Elves. (Image courtesy of Atsushi Kaneko via X)
Fights that are briefly depicted in the source material became major action sequences. “In terms of the choreography of the action scenes themselves, our protagonist, Sung Jinwoo, at the beginning is weak and pathetic, and we wanted to make sure that we captured that inexperience and clumsiness in his actions,” Kaneko states. “We agreed with the animation team that we would go in stages. As he becomes stronger, so would his action. In Episode 104, after he reawakens, we start to move a little away from reality and go into this suspension of disbelief. Jinwoo’s actions and choreography improved significantly, and you could see that over several subsequent episodes.
We wanted to make sure that it felt dynamic, alive and cathartic when audiences see it. But it wasn’t just that thought. A lot of the decisions in terms of how the action was going to unfold didn’t always come from the director or action director [Yoshihiro Kanno]. Several of our keyframe artists in the storyboarding phase, or while animating, almost ad lib, so there was a lot of creative agency the animators had on the actual ground floor. That’s how all of the action came together under the guidance of the director and action director.”
The Shadow Army features defeated foes such as Igris and Iron (former A-Rank Hunter Kim Chul) who have been resurrected by Sung Jinwoo through his ability as a necromancer. (Image courtesy of Atsushi Kaneko via X)
Blood and gore are not in short supply during the fights. “Early on, the director [Shunsuke Nakashige] and I decided not to shy away from the gore, but we did have to have the conversation because in Japan if something is going to be broadcast then it needs to fall within certain limits,” Furuhashi explains. “But we wanted to push the limits of expression as far as we could for something that would go on air. Part of that comes from this idea that the series starts with the main character’s death, and he is beginning from the bottom; that needed to be conveyed to audiences in a visual way. Another reason that we didn’t want to shy away from the gore is even though this is a fantasy world, it’s set in the present day as we know it. In order to make sure that reality was conveyed, if you cut someone’s arm with a sword, they’re going to lose an arm.”
Crunchyroll understood that the violence was an integral part of the storytelling.
“When we first saw Solo Leveling,” remarks Kanako Takahashi, Producer at Crunchyroll, “which is quite serious and has a lot of violence, and understood the intent, the decision was made not to run away from that and to make sure it was embraced because it showed how serious the creators were in wanting to push the envelope.”
An example of how detailed the storyboards can be, down to the eyes, which is also reflected in the final animation. (Image courtesy of Atsushi Kaneko via X)
“A lot of the decisions in terms of how the action was going to unfold didn’t always come from the director or action director [Yoshihiro Kanno]. Several of our keyframe artists in the storyboarding phase, or while animating, almost ad lib, so there was a lot of creative agency the animators had on the actual ground floor. That’s how all of the action came together under the guidance of the director and action director.”
—Atsushi Kaneko, Producer, A-1 Pictures
3D animation was utilized for the Shadow Army in order to make their simplistic shapes and forms more dynamic. (Image courtesy of Atsushi Kaneko via X)
Magic Beasts appear, ranging from the Blue Venom-Fanged Kasaka to Ice Elves that have volatile tempers and awe-inspiring agility. “I don’t know if I would call it a monster, but one scene that stands out to me is the Kargalgan gun battle because the action in that scene and the level of quality is second to none,” Takahashi observes. “Seeing how passionate the fans were and their online reaction to that battle in particular left a big impression on me.”
The goal was to focus on adapting the source material as opposed to making creative decisions driven by the desire to stand out from other anime productions. “It was a natural result, and we’re grateful for the chance,” Kaneko states. “I didn’t feel like we were doing anything too different or out of the ordinary, but the chain of events that led ultimately to how fans reacted to it is something that differentiates us from other anime.
In a way, we are also searching for that answer because we didn’t set out to do something very different or say, ‘We’re going to make this series about this or this is the angle we’re going to come from.’ The differentiation was also a byproduct of taking this IP and animating it.”
The Shadow Army is ordered by Sung Jinwoo to assist the Hunters during the 4th Jeju Island Raid. (Image courtesy of Atsushi Kaneko via X)
A scale and proportion study between Sung Jinwoo and the Ant King along with a final still of the bloodthirsty Magic Beast. (Image courtesy of Atsushi Kaneko via X)
The passionate fan reaction to the Kargalgan gun battle left a lasting impression on Crunchyroll Producer Kanako Takahashi.
As Sung Jinwoo becomes stronger, his actions become more dynamic. There is no holding back on the blood and gore as the consequences of being struck by weapons are grounded in the real world despite the fantastical settings.
A quiet moment as Cha Hae-In ponders her encounter with Sung Jinwoo. (Image courtesy of Atsushi Kaneko via X)
Watch the dynamic Jinwoo vs. Kargalgan battle in Season 2 of Solo Leveling: Arise from the Shadow, courtesy of Crunchyroll and Solo Leveling Animation Partners: https://www.youtube.com/watch?v=dLKYFu_sMTM. And watch a GIF on the design and construction of an action sequence for the anime opening of Solo Leveling by Studio Ppuri’s Inseung Choi and his animation team: https://x.com/i/status/1747212779868901522
AWARD-WINNING VFX TEAMS
by NAOMI GOLDMAN
How can visual effects practitioners best collaborate to create a successful work dynamic and production? How do you embrace new technology to enhance the pipeline and visual storytelling process? And what are lessons learned that other VFX teams can employ?
Last week, VES participated in FMX, the 29th edition of FMX/Film & Media Exchange in Stuttgart, Germany, and hosted a live discussion on Award-Winning VFX Teams. In extending that dynamic conversation, we are proud to showcase another VES panel with three outstanding VES Award-winning Visual Effects Supervisors. They came together to share their insights into the talent, teamwork and technology it takes to create and nurture successful VFX teams.
Lending their voices to this dynamic conversation: moderator Rob Legato, ASC, five-time VES Award-winning Visual Effects Supervisor and Cinematographer and recipient of the VES Award for Creative Excellence; Michael Lasker, Visual Effects Supervisor and Creative Director of CG Features at Sony Pictures Imageworks – whose team won the 2024 VES Award for Outstanding Visual Effects in an Animated Feature for Spider-Man: Across the Spider-Verse; and Alex Wang, Production Visual Effects Supervisor whose team won the 2024 VES Award for Outstanding Visual Effects in a Photoreal Episode for The Last of Us.
“I like to hire people smarter than me and find the personality that seems like they have not been tested or given the opportunity to show the limits of their talent.”
-Rob Legato
Rob Legato, ASC, five-time VES Award-winning Visual Effects Supervisor and Cinematographer and recipient of the VES Award for Creative Excellence.
Rob Legato: What does it take to create a team that allows you to realize your vision and the director’s vision?
Michael Lasker: To build a team for Across the Spider-Verse, we had to enlist a team that would collaborate in this artistically-driven environment with a very graphic aesthetic. We were already looking at artists during our work on The Mitchells vs. the Machines.
A lot of people wanted to be part of the Spider-Verse; the trick was assembling people willing to experiment and find the visual answers to any number of challenges.
The final look was an exploration, as we had six total universes to figure out and 1,000 artists across departments who thrived in bringing this unique film to completion.
Rob Legato: On animation projects, artists often pitch their ideas; did that happen for you?
Michael Lasker, Visual Effects Supervisor and Creative Director of CG Features at Sony Pictures Imageworks – whose team won the 2024 VES Award for Outstanding Visual Effects in an Animated Feature for Spider-Man: Across the Spider-Verse
Michael Lasker: Artists pitched their ideas to directors and producers early in the visual development process.
Once you get to animation, where you are infusing performance into the story, it reveals a lot of new ideas.
So many times, animators will show several versions of shots in our ever-evolving process. One of the biggest strengths of this film is that you can feel the hand of the artists in every frame.
Rob Legato: Alex – Your work was maybe a more traditional approach.
But when you have so many artists and vendors working on a project, do you go in with a set aesthetic vs. always looking for the next great visual?
Alex Wang: It helps to start with an IP that many artists love. On season one (The Last Of Us), we had 18 vendors on the show who worked on more than 3,000 shots…it’s like working on a slew of feature films. On such a big show, it’s important to involve our vendors as early as possible on prep and concepts…through conversations and on-set presence to help establish the look of the world we’re building.
Alex Wang, Production Visual Effects Supervisor whose team won the 2024 VES Award for Outstanding Visual Effects in a Photoreal Episode for The Last Of Us.
Rob Legato: When I work on things, I am continually gauging who is good at what and steering shots to them where they might excel, always moving things around. What’s your approach to allocating and managing the work?
Alex Wang: With many of our vendors, we already had a shorthand and knew where they excelled…environment, creatures, etc. While we are often catering to strengths, we also ask what they like to do as vendors and how they approach a variety of challenges…such as making sure that the infection is grounded in what this situation would really look like.
Rob Legato: In balancing who leans into their comfort zone or embraces something new, I usually get a response that creates some excitement for me to move the work forward. And when I’m assigning something, I usually don’t pick the technique upfront, because new things invariably come up. What’s your approach to the evolving dynamic?
Michael Lasker: We’re continually evolving how we’re doing the work while it is in process. Some tools are written from the ground up and optimized, but we always add tools to the toolset. Early artwork is often not representative of the final, as we come to challenges from different directions and add to our naturalistic work.
Alex Wang: Because this was our first season, we were establishing a foundation of what worked.
From episodes 1-9, we started to develop a rhythm… the abstract nature of the infected and creature work, that got easier in later episodes.
But because this is a journey across America, each episode had a different look…day, night, seasons. So every time we solved a challenge, the next episode posed a new one.
Rob Legato: What are some of the differences in working on a show vs a feature?
Michael Lasker: More stylistic animated features pose different challenges. Cloudy With a Chance of Meatballs and Hotel Transylvania were more traditional; in Spider-Verse, we had to throw out the efficiencies and create new ones.
Coming off the first Spider-Verse, we knew they wanted something bigger and crazier. At the start we didn’t realize every shot would be so different; once we saw the reference artwork, it started to reveal itself.
Rob Legato: Alex, since your show had 18 vendors, did you create any kind of key or bible to show a standardized look?
Alex Wang: A large part of my job was ensuring continuity, and I divvied up the work in a way that would help ensure that. Conceptually, the game was my bible every time I needed a reference point; there is a beautiful book of game artwork. Between reviews, it was inspirational to get me back into that foundational mindset and excitement.
Rob Legato: Since you were working during the pandemic, did that help or hurt your process?
Michael Lasker: I was working on The Mitchells vs. the Machines, and in the course of one week, we moved 600-700 artists home, got their machines up and it barely affected production; surprisingly, production even increased.
We tried to send out QCs every day.
Rob Legato: My work style was always to go to someone’s desk, even before IO and dailies…I like to nip issues in the bud or redirect.
So during COVID, we would do online desktop reviews, which was very similar.
“On such a big show [The Last of Us], it’s important to involve our vendors as early as possible on prep and concept.”
-Alex Wang
Alex Wang: Even with 18 vendors, I needed to make everyone feel they were working under the same umbrella.
Everyone’s work got dumped into ShotGrid, vendors see things right away and I like to give feedback ASAP.
If we had 1,000 finals, imagine the throughput.
We needed a system that worked efficiently.
Rob Legato: Do you prefer working at home or in a studio environment?
Michael Lasker: I’m the last person in LA who loves going to the office every day. Client reviews and getting people in a sweatbox is great, but I like hallway talk and brainstorms.
Rob Legato: I like to hire people smarter than me and find the personality that seems like they have not been tested or given the opportunity to show the limits of their talent.
I don’t have to have all the answers – I just have to get them.
Alex Wang: I invite supervisors to set to see the process and build a relationship; it’s hard to do that over Zoom.
I like to harness people’s potential who have complementary skills to mine.
I always tell supervisors to give me a bold version, don’t be shy.
Michael Lasker: I encourage people to bring their ideas and try to make a creative, empowering environment where people can become stronger supervisors and I let people know that their cool ideas may well end up on the screen.
Go too far, because we’ll learn something from it.
“To build a team for Across the Spider-Verse, we had to enlist a team that would collaborate in this artistically-driven environment with a very graphic aesthetic.”
-Michael Lasker
Rob Legato: You are both creating things we haven’t seen before and the quality of the work is extremely high.
What else is unique about your way of working?
Alex Wang: What’s unique about our show is our showrunner Craig Mazin – an incredible writer and leader.
On a show where the effects should be invisible, it should not take us away from the characters.
He will always be the biggest champion for visual effects and made everyone who touched the show feel special.
Michael Lasker: My main partner is the Head of Character Animation.
On these more stylistic shows, the animation drives a lot of the style, and we needed a way for it to translate downstream. I’m always seeking the motivation for the style.
And the team at the studio just got it – having a studio behind such a huge gamble was a great collaboration.
Rob Legato: Some lessons I learned early on, including a Frank Capra quote which says – at any one moment, whatever is on the screen is the movie star.
So everything you’re creating and cutting is important.
Any final advice?
Michael Lasker: For artists out there, follow your passion and don’t let anyone slow you down.
Every shot is like a painting and we try to infuse love, care and enthusiasm.
Alex Wang: Stay hungry and stay humble.
And don’t forget that the computer is just a tool in service to the story.
Source: https://www.vfxvoice.com/award-winning-vfx-teams/
Source: https://www.vfxvoice.com/award-winning-vfx-teams/
WWW.VFXVOICE.COM
AWARD-WINNING VFX TEAMS
By NAOMI GOLDMAN
How can visual effects practitioners best collaborate to create a successful work dynamic and production? How do you embrace new technology to enhance the pipeline and visual storytelling process? And what are lessons learned that other VFX teams can employ? Last week, VES participated in FMX, the 29th edition of FMX/Film & Media Exchange in Stuttgart, Germany, and hosted a live discussion on Award-Winning VFX Teams. In extending that dynamic conversation, we are proud to showcase another VES panel with three outstanding VES Award-winning Visual Effects Supervisors. They came together to share their insights into the talent, teamwork and technology it takes to create and nurture successful VFX teams. Lending their voices to this dynamic conversation: moderator Rob Legato, ASC, five-time VES Award-winning Visual Effects Supervisor and Cinematographer and recipient of the VES Award for Creative Excellence; Michael Lasker, Visual Effects Supervisor and Creative Director of CG Features at Sony Pictures Imageworks, whose team won the 2024 VES Award for Outstanding Visual Effects in an Animated Feature for Spider-Man: Across the Spider-Verse; and Alex Wang, Production Visual Effects Supervisor, whose team won the 2024 VES Award for Outstanding Visual Effects in a Photoreal Episode for The Last Of Us.
“I like to hire people smarter than me and find the personality that seems like they have not been tested or given the opportunity to show the limits of their talent.”
-Rob Legato
Rob Legato, ASC, five-time VES Award-winning Visual Effects Supervisor and Cinematographer and recipient of the VES Award for Creative Excellence.
Rob Legato: What does it take to create a team that allows you to realize your vision and the director’s vision?
Michael Lasker: To build a team for Across the Spider-Verse, we had to enlist a team that would collaborate in this artistically-driven environment with a very graphic aesthetic. We were already looking at artists during our work on The Mitchells vs. the Machines. A lot of people wanted to be part of the Spider-Verse; the trick was assembling people willing to experiment and find the visual answers to any number of challenges. The final look was an exploration, as we had six total universes to figure out and 1,000 artists across departments who thrived in bringing this unique film to completion.
Rob Legato: On animation projects, artists often pitch their ideas; did that happen for you?
Michael Lasker, Visual Effects Supervisor and Creative Director of CG Features at Sony Pictures Imageworks, whose team won the 2024 VES Award for Outstanding Visual Effects in an Animated Feature for Spider-Man: Across the Spider-Verse.
Michael Lasker: Artists pitched their ideas to directors and producers early in the visual development process. Once you get to animation, where you are infusing performance into the story, it reveals a lot of new ideas. So many times, animators will show several versions of shots in our ever-evolving process. One of the biggest strengths of this film is that you can feel the hand of the artists in every frame.
Rob Legato: Alex – your work was maybe a more traditional approach. But when you have so many artists and vendors working on a project, do you go in with a set aesthetic vs.
always looking for the next great visual?
Alex Wang: It helps to start with an IP that many artists love. On season one (The Last Of Us), we had 18 vendors on the show who worked on more than 3,000 shots…it’s like working on a slew of feature films. On such a big show, it’s important to involve our vendors as early as possible on prep and concepts…through conversations and on-set presence to help establish the look of the world we’re building.
Alex Wang, Production Visual Effects Supervisor whose team won the 2024 VES Award for Outstanding Visual Effects in a Photoreal Episode for The Last Of Us.
Rob Legato: When I work on things, I am continually gauging who is good at what and steering shots to them where they might excel, always moving things around. What’s your approach to allocating and managing the work?
Alex Wang: With many of our vendors, we already had a shorthand and knew where they excelled…environment, creatures, etc. While we are often catering to strengths, we also ask what they like to do as vendors and how they approach a variety of challenges…such as making sure that the infection is grounded in what this situation would really look like.
Rob Legato: In balancing who leans into their comfort zone or embraces something new, I usually get a response that creates some excitement for me to move the work forward. And when I’m assigning something, I usually don’t pick the technique upfront, because new things invariably come up. What’s your approach to the evolving dynamic?
Michael Lasker: We’re continually evolving how we’re doing the work while it is in process. Some tools are written from the ground up and optimized, but we always add tools to the toolset. Early artwork is often not representative of the final, as we come to challenges from different directions and add to our naturalistic work.
Alex Wang: Because this was our first season, we were establishing a foundation of what worked.
From episodes 1-9, we started to develop a rhythm… the abstract nature of the infected and creature work, that got easier in later episodes. But because this is a journey across America, each episode had a different look…day, night, seasons. So every time we solved a challenge, the next episode posed a new one.
Rob Legato: What are some of the differences in working on a show vs. a feature?
Michael Lasker: More stylistic animated features pose different challenges. Cloudy With a Chance of Meatballs and Hotel Transylvania were more traditional; in Spider-Verse, we had to throw out the efficiencies and create new ones. Coming off the first Spider-Verse, we knew they wanted something bigger and crazier. At the start we didn’t realize every shot would be so different; once we saw the reference artwork, it started to reveal itself.
Rob Legato: Alex, since your show had 18 vendors, did you create any kind of key or bible to show a standardized look?
Alex Wang: A large part of my job was ensuring continuity, and I divvied up the work in a way that would help ensure that. Conceptually, the game was my bible every time I needed a reference point; there is a beautiful book of game artwork. Between reviews, it was inspirational to get me back into that foundational mindset and excitement.
Rob Legato: Since you were working during the pandemic, did that help or hurt your process?
Michael Lasker: I was working on The Mitchells vs. The Machines, and in the course of one week, we moved 600-700 artists home, got their machines up and it barely affected production; surprisingly, production even increased. We tried to send out QCs every day.
Rob Legato: My work style was always to go to someone’s desk, even before IO and dailies…I like to nip issues in the bud or redirect. So during COVID, we would do online desktop reviews, which was very similar.
“On such a big show [The Last of Us], it’s important to involve our vendors as early as possible on prep and concept.”
-Alex Wang
Alex Wang: Even with 18 vendors, I needed to make everyone feel they were working under the same umbrella. Everyone’s work got dumped into ShotGrid, vendors see things right away and I like to give feedback ASAP. If we had 1,000 finals, imagine the throughput. We needed a system that worked efficiently.
Rob Legato: Do you prefer working at home or in a studio environment?
Michael Lasker: I’m the last person in LA who loves going to the office every day. Client reviews and getting people in a sweatbox is great, but I like hallway talk and brainstorms.
Rob Legato: I like to hire people smarter than me and find the personality that seems like they have not been tested or given the opportunity to show the limits of their talent. I don’t have to have all the answers – I just have to get them.
Alex Wang: I invite supervisors to set to see the process and build a relationship; it’s hard to do that over Zoom. I like to harness the potential of people who have complementary skills to mine. I always tell supervisors to give me a bold version, don’t be shy.
Michael Lasker: I encourage people to bring their ideas and try to make a creative, empowering environment where people can become stronger supervisors, and I let people know that their cool ideas may well end up on the screen. Go too far, because we’ll learn something from it.
“To build a team for Across the Spider-Verse, we had to enlist a team that would collaborate in this artistically-driven environment with a very graphic aesthetic.”
-Michael Lasker
Rob Legato: You are both creating things we haven’t seen before and the quality of the work is extremely high. What else is unique about your way of working?
Alex Wang: What’s unique about our show is our showrunner Craig Mazin – an incredible writer and leader.
On a show where the effects should be invisible, it should not take us away from the characters. He will always be the biggest champion for visual effects and made everyone who touched the show feel special.
Michael Lasker: My main partner is the Head of Character Animation. On these more stylistic shows, the animation drives a lot of the style, and we needed a way for it to translate downstream. I’m always seeking the motivation for the style. And the team at the studio just got it – having a studio behind such a huge gamble was a great collaboration.
Rob Legato: Some lessons I learned early on, including a Frank Capra quote which says – at any one moment, whatever is on the screen is the movie star. So everything you’re creating and cutting is important. Any final advice?
Michael Lasker: For artists out there, follow your passion and don’t let anyone slow you down. Every shot is like a painting and we try to infuse love, care and enthusiasm.
Alex Wang: Stay hungry and stay humble. And don’t forget that the computer is just a tool in service to the story.
-
WWW.VFXVOICE.COM
EXPANDING THE SCOPE WITHOUT GETTING CAUGHT FOR DOPE THIEF
By TREVOR HOGG
Before stills courtesy of John Heller. Final stills courtesy of Apple TV+.
Two small-time crooks steal from drug dealers by posing as DEA agents, but when a score goes horribly wrong, they become the target of a ruthless biker gang that has no qualms about leaving behind a bloody trail of retribution. This is the premise for the Apple TV+ series Dope Thief, based on the novel by Dennis Tafoya and adapted by series creator Peter Craig. Serving as an executive producer is Ridley Scott, who directed the pilot that set the groundwork for the following seven episodes that star Brian Tyree Henry, Wagner Moura, Marin Ireland, Nesta Cooper, Kate Mulgrew and Ving Rhames.
“There is no way that anything cannot look like it really exists in the frame because there’s nothing about Dope Thief that isn’t real life. This shouldn’t be a visual effects show even though we have nearly 1,600 shots. You never want to see the work that we do unless it’s Game of Thrones or Dark Matter where you want to show that off and everybody knows that stuff doesn’t exist.”
—John Heller, VFX Supervisor
One of the tricky simulations was to make sure that the practical and CG snow blended seamlessly together.
Causing some canine mayhem is the dog owned by Ray’s adoptive mother, which resulted in poop and pee being added in digitally.
While con men Ray Driscoll (Brian Tyree Henry) and Manny Carvalho (Wagner Moura) make a show of their deceit, VFX Supervisor John Heller adopted a stealth approach when overseeing 1,575 visual effects shots created by Barnstorm VFX, Mavericks VFX and FOLKS VFX. “Dope Thief couldn’t be told without visual effects, or if it was, you couldn’t tell it quite the same way with the gunfights, explosions and some of the death scenes,” Heller observes. “There is a lot going on but, hopefully, virtually unnoticed.
That’s the sad thing about doing a show like this, but also the beauty of it.” Principal photography took place in Philadelphia, which is where the story takes place. “We didn’t have to do much to Philadelphia at all,” Heller reveals. “They didn’t focus much on the downtown. It was more about outlying areas, like lower-income neighborhoods and down by the airport.” There was the seasonal matter of the pilot being shot during the winter and the other episodes happening months later. “One of the biggest things was the amount of deforestation that we had to do. There is a park scene where they are having this big birthday party and there are blowing bubbles and stuff everywhere surrounded completely by trees in full foliage. Time after time, we had to completely rid those trees of leaves and make them look like bare winter trees; that was a big undertaking, but the work was excellent.” Snow was a major issue. “We did a lot of snow work, whether it was removing, blending or adding to make shot-to-shot work through the sequence that otherwise wouldn’t have been able to cut together well.” Principal photography occurred in Philadelphia where the story takes place. As the narrative progresses, the body count rises. “There is a subtlety to the blood and gore,” Heller remarks. “The first thing you always need to know is the studio’s appetite. This one they let us go where we wanted to with it. There is one great head shot I love in an episode [where] there is a big gun battle and a lot of gunshots and wounds throughout the show. Entry and exit wounds are standard fare. I always like to make things look as realistic as possible. If it’s a head shot, there should be a little bit of ‘gak’ flying through the air. I don’t like anything too extreme.” A murdered woman is found in an overflowing bathtub. “Although the tub water was practical, we bloodied it up. Then we added some blood dripping down over the edge of the tub. 
It was a little too dead, so we livened up the interactivity of the water and did a little bit of additional make-up work,” Heller says.
“There is a subtlety to the blood and gore. … There is one great head shot I love in an episode [where] there is a big gun battle and a lot of gunshots and wounds throughout the show. Entry and exit wounds are standard fare. I always like to make things look as realistic as possible. If it’s a head shot, there should be a little bit of ‘gak’ flying through the air. I don’t like anything too extreme.”
—John Heller, VFX Supervisor
Wagner Moura and Brian Tyree Henry portray childhood friends and partners-in-crime Manny Carvalho and Ray Driscoll.
Among the dramatic death scenes is a biker getting squished between a truck and dumpster in full view of the camera. “It’s gruesome,” Heller laughs. “This guy is getting crunched. We even had to do a fully CG truck to replace the wall that was used to push him against that dumpster. Then we had to do a lot of blood enhancements even when he is lying on the ground. The way it was shot was one thing, then putting it together we pushed it even further.” A digital double was avoided. “When you’re up in close with that character with the beard, with the grooming and clothing and the emotions on his face, we wanted him to be real. Doing a CG truck, you want to do it perfectly, and it’s not the easiest thing in the world to do, but it’s a rigid body item versus a CG character. We wanted to get the actor’s performance in there, which is why the decision was made.” Muzzle flashes tend to be viewed as nothing special. “It’s about what kind of gun is it?” Heller observes. “What would that gun actually do? How would that gun actually be photographed? I start there. I don’t like to do things that are done for the sake of being big and explosive in this case. For this one, we had so much gunfire and so many different types of firearms that I did a lot of research.
I sent video after video to our vendors, egging and pushing them along so we had a good variety of styles of gunfire so that it didn’t look like they were using the same gunshot element every time.” Bullet hits find their way into wooden doorframes, vehicles and windows. “You want to make it look cinematic, but you’re not going to blow out a brick wall if someone shoots it with a pistol,” Heller remarks. “We had to do a lot of glass blowing out, which involved rolling car windows down or taking the windshields out so we could do a fully CG one with reflections and then blow it out. There is a scene where Manny ducks down as a guy tries to shoot him with a shotgun and blows out the plaster behind him. That’s a cool shot because you see what would have happened to his head.”
A deer that gets hit by the car began as a stuffie but did not have the proper weight needed, so it was replaced with a CG version.
On the spectacular side are the explosions. “The first major explosion was the meth lab blowing up,” Heller states. “That was a significant effects explosion, which was done practically, but once you get it in the cut, everything needs to be bigger. What we ended up doing was blend into the effects explosion that was there and at least triple or quadruple it in size. We also added a bunch of smoke that wrapped around buildings. We removed and replaced trees, power lines and power poles so we could have a concussive effect that affected the trees and power poles. It’s all subtle stuff that sets it up to be realistic and fits in the scene.” A mixture of practical and CG effects elements was utilized. Heller adds, “There are a lot of elements that vendors have since they do quite a bit of this, but I’m usually not too happy with that, and we don’t always have the right angles.
I always push hard to have CG-driven [effects elements] where we need it, and in this one there was quite a lot of that for sparks, fire and smoke.” Getting a lot of attention is the unruly canine owned by Ray’s adoptive mother, Theresa Bowers (Kate Mulgrew) – it might be tiny but it leaves behind a big mess. “When I look back on the dog, it was one of those things where whenever he was around, we had to do something,” Heller notes. “It was usually poop and pee. There’s a scene where the dog is in the car, and it didn’t work out editorially. The dog is jumping around the car and going in between characters, and we had to recreate all of this stuff to get rid of it.” Receiving additional treatment are the flashback scenes that explore Ray’s backstory. “What was cool about the flashbacks in this show is that the idea began that we’ll only go to black and white as Ray is remembering these things with his dad or high school girlfriend. Then later as we got into editorial, Peter wanted to explore these hallucinatory events, especially when Ray got shot and was all drugged up. It evolved into a warping alternative-reality pseudo-color treatment where we let some color creep in and thread through depending on what was in the scene.” The flashbacks that merge into reality were treated as if a cinematographer had placed a Lensbaby on the camera.
“[T]he idea [of flashbacks] began that we’ll only go to black and white as Ray is remembering these things with his dad or high school girlfriend. Then later as we got into editorial, Peter [Craig, series creator] wanted to explore these hallucinatory events, especially when Ray got shot and was all drugged up. It evolved into a warping alternative-reality pseudo-color treatment where we let some color creep in and thread through…”
—John Heller, VFX Supervisor
Ray Driscoll (Brian Tyree Henry) runs toward a CG truck.
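The pseudo-color treatment Heller describes, a black-and-white image with some color allowed to "creep in," amounts at its simplest to a per-pixel blend between a frame and its own luminance. The sketch below illustrates that general idea in NumPy; the function name, the blend parameter and the Rec. 709 luma weights are illustrative assumptions, not the show's actual grade.

```python
import numpy as np

def pseudo_color(frame: np.ndarray, color_amount: float = 0.2) -> np.ndarray:
    """Blend an RGB frame toward its luminance, letting a fraction of
    the original color creep back in.

    frame: float array of shape (H, W, 3) with values in [0, 1].
    color_amount: 0.0 gives full black-and-white, 1.0 returns the frame untouched.
    """
    weights = np.array([0.2126, 0.7152, 0.0722])   # Rec. 709 luma weights
    luma = frame @ weights                          # (H, W) luminance
    gray = np.repeat(luma[..., np.newaxis], 3, axis=-1)
    return (1.0 - color_amount) * gray + color_amount * frame
```

In practice a grade like this would also vary `color_amount` over time or per region of the frame, which is closer to the "thread through depending on what was in the scene" behavior described above.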
Grips compress a flat board against the actor to make it look like he is being pushed against the dumpster by the CG truck.
CG effects were blended into the practical explosion to significantly increase its size.
Because the pilot was shot during the winter and the remaining seven episodes were captured later on, the visual effects team had to use digital deforestation to ensure seasonal continuity.
The explosion that occurs in Episode 108 was the biggest in terms of scope.
Blood and gore were held back only by reality, to avoid wounds such as head shots appearing cartoony.
Achieving realism was the main challenge of Dope Thief. “There is no way that anything cannot look like it really exists in the frame because there’s nothing about Dope Thief that isn’t real life,” Heller remarks. “This shouldn’t be a visual effects show even though we have nearly 1,600 shots. You never want to see the work that we do unless it’s Game of Thrones or Dark Matter where you want to show that off and everybody knows that stuff doesn’t exist.” Favorite moments are the big gun battles and the final scene of the series. “The fire explosion that happens in the end was one of the bigger things scope-wise. It was a lot of fun to do. They knocked it out of the park. One of the most effective things that could have been the most difficult was the tree work we did. When I first saw the passes coming out of FOLKS VFX, I was like, ‘This is going to work. It’s going to be beautiful.’ And it is.”
-
WWW.VFXVOICE.COM
DNEG GETS ACQUAINTED WITH CREEPERS AND NIFLHEIM FOR MICKEY 17
By TREVOR HOGG
Images courtesy of DNEG and Warner Bros. Pictures.
South Korean filmmaker Bong Joon Ho excels at creating dark social satires. Mickey 17 centers around Mickey Barnes, who agrees to be cloned an unlimited number of times to perform lethal tasks, in particular the settlement of the ice planet Niflheim where creatures known as Creepers roam. DNEG was hired by Production VFX Supervisor Dan Glass, who previously collaborated with Bong on Okja, to look after 350 shots dealing with Niflheim, three different types of Creepers and flying transport crafts known as Flitters. “Bong Joon Ho has a clear idea of what he wants and is good at communicating that,” states Chris McLaughlin, VFX Supervisor at DNEG. “The language barrier was never an issue. He did have a translator, but he can speak good English and certainly understood everything we said. Bong can also draw whatever he thinks. It’s a clean illustrative style, which went a long way in giving us an idea of what he wants.”
“Sand and snow are both terrible! Especially with creatures that need weight; you can’t fake it. It’s wrapped in a single simulated entity, which is snow or sand, and if you do it too fast, you blow the effects simulation, which is based on dynamics that can [only] be changed a little.”
—Robyn Luckham, Animation Director, DNEG
A set about the size of a football field was built at Cardington Studios in the U.K. for the exterior shots of Niflheim, which consisted of white walls and Epsom salts for snow.
DNEG went from having to deal with the sand of Arrakis to the snow of Niflheim. “Sand and snow are both terrible!” laughs Robyn Luckham, Animation Director at DNEG. “Especially with creatures that need weight; you can’t fake it.
It’s wrapped in a single simulated entity, which is snow or sand, and if you do it too fast, you blow the effects simulation, which is based on dynamics that can be changed a little.” Several layers had to be simulated for snow. “You’ve got your snow on the ground; interactive snow, which is anything that gets kicked up; depressions like footsteps, tire tracks and Creeper tracks; and falling snow. At a distance, it becomes a volumetric mist,” McLaughlin explains. “There’s falling snow that lands on Mickey or in the hair of the Creepers, a thin dusting of snow on top of the Creepers, and ice on the underside of Mama because she’s constantly in contact with the snow. We simulated a ton of snow.”
A full-size Flitter was built that had the functionality of a car.
A set about the size of a football field was built at Cardington Studios for the exterior shots of Niflheim. “It had wall-to-wall white walls 30 feet high on all four sides,” McLaughlin states. “Then white lights on top to give you that broad, white overcast light. The ground was covered in Epsom salts and from a distance, it looked convincing as snow. You get footprints in it as well and they appeared quite convincing. Up close, not so much. There are some close-ups of Mickey’s feet in the snow that were redone completely as a full CG shot with a simulation. That was probably the most difficult stuff to simulate but also some of the most successful. There was a whole team of guys with rakes, tools and a tractor who would smooth out the snow again at the end of every take.” The vast majority of the falling snow is CG. “They blew snow on the first take of everything,” McLaughlin notes. “We would always get that as a reference for how the snow should move and look, but generally, we would get a take that had no snow in it and completely replace all of that snow. For all of the shots on top of the Flitter, there was a lot of wind and snow blowing directly in their face because you’re so close up.
It would have been difficult and expensive to simulate that stuff in CG. We did shoot plates of falling snow against bluescreen as elements that could be used later.” There is a misconception about animation that it is strictly about movement. “We were very lucky with Bong, as you get to create a creature from scratch with a brand-new language that nobody has ever seen,” Luckham remarks. “The development and prep that we did six to eight months before working with Dan Glass, Bong and Framestore were the big part of it.” Bong consulted his frequent creature designer Hee Chul Jang and produced concept art and 3D models for the Creepers. “When you render that with some realistic lighting, it suddenly starts to look like a CG creature,” McLaughlin states. “You have to take that a step further. There was a small team at DNEG that started to look into the real details, down to the pores of the skin, to make it feel like an actual creature.” For the Creepers, there are the Mama, Junior and Baby versions. “Each Creeper had a different character,” Luckham continues. “We got a lot of storyboards and good concepts. Mama was the queen bee of everything.”
The vast majority of the falling snow is CG.
“If you think about the scale of a big movie like Avatar, our shot count [of 350] seems quite low, but the amount of content in one shot is like a hundred shots. There were so many bits to make one shot realistic. It was incredibly dense.”
—Robyn Luckham, Animation Director, DNEG
When it comes to complexity, the Creepers were one of the hardest creatures to animate. “We had so many legs outside and inside, external and internal mandibles, four sets of eyelids, tentacles, tail and tongue,” Luckham remarks. “Everything would move, and you had to keyframe quite a lot of that. Also, in the evolution of the creature from the page, it looks like a millipede that has lots of legs.
It had to run, so we looked at one of my favorite films, Totoro from Studio Ghibli, and the Catbus for the gallop, and it was something that Bong really liked. The amount of animation inside of each creature was quite staggering. It’s a complete marriage of building and animating those details. We were changing the texture colors for the areas that are rubbing a lot more. A number of these details you would subconsciously absorb, but if they weren’t there, you wouldn’t know why it didn’t look real.”
Robert Pattinson portrays multiple versions of himself, which at times share the screen with each other.
“You’ve got your snow on the ground; interactive snow, which is anything that gets kicked up; depressions like footsteps, tire tracks and Creeper tracks; and falling snow. At a distance, it becomes a volumetric mist. There’s falling snow that lands on Mickey or in the hair of the Creepers, a thin dusting of snow on top of the Creepers, and ice on the underside of Mama because she’s constantly in contact with the snow. We simulated a ton of snow.”
—Chris McLaughlin, VFX Supervisor, DNEG
Getting creatures to emote is something that animators relish. “You have to start from what the creature is and its motivation,” Luckham states. “Director Bong gave us a bible of background history, and we could build up from that. We looked at what age they could be, and he alluded that the Juniors would be like teenage boys. The Juniors had so much charm, and I wondered how they could relate to each other. What if they were best friends and competitive with each other? We found great footage of these jostling bear cubs.” Variation had to be retained within the crowd simulations. “Throughout the course of the film, the crowd of Creepers marching around the spaceship goes from a walk to a trot to a gallop to a super gallop,” McLaughlin notes. “Each of those four variants had to have its own keyframe animation, which comes from Robyn’s team.
Within each of those four, we had to have variants of that so they don’t all look the same. You multiply those four by five or six. Then you need to have lots of little characteristics, like jostles, bumps and jumps. Each of those has to be done at a different speed. This all starts to add up!” A fun part for the animation team was assisting in designing the Creepers from scratch. “In the evolution of the [Creepers] from the page, it looks like a millipede that has lots of legs. It had to run, so we looked at one of my favorite films, Totoro from Studio Ghibli, and the Catbus for the gallop, and it was something that Bong really liked. The amount of animation inside of each creature was quite staggering. It’s a complete marriage of building and animating those details.” —Robyn Luckham, Animation Director, DNEG Interaction was achieved procedurally. “Our effects supervisor set it up so that whenever a new version of animation was published, we could run that, and it would simulate the crowd,” McLaughlin reveals. “You could do hundreds of Creepers trampling through the snow, and you would get the kick-up out of that. In terms of the interaction on set with the cast, for the scenes where they are holding the Baby, there was a stuffie, which looked similar and had a nice weight so you could tell they were holding onto something. It was the exact size and shape, so our CG sat quite nicely over top. For the crowds, we had numerous yoga balls on set that were about the right height and size, and Robert Pattinson would be staggering through and tripping over them.” Puppets were utilized. “We had the company Stitches and Glue mainly for Mama,” Luckham states. “Mama was incredibly interesting. Framestore built the asset, but we had quite a lot of the acting shots. As you see in the story, she is cheeky and secretive. We looked at a cheeky granny who may be old but still has her complete wits about her and knows how to put things together. 
Mama is a big, slow and lumbering creature, and we have to make sure there is weight. Especially in the acting and talking shots, it’s a lot about doing less rather than more. You’re looking at the eyes, the micro movements and believing what she said.” Flitters are bulky industrial machines that happen to fly. The crowd pipeline at DNEG had to be revamped to handle the massive number of Creepers in some shots. An audio language was created for the Creepers, which required answering several questions. “We had to go to the base of everything,” Luckham explains. “What’s its lungs? Why does it want to communicate? Where does it come from? Where does the air come in and out? If the air is passing through the mandibles, does that make a different sound? Certain mandibles moved at the higher range, and at the lower range, other mandibles moved. That’s more related to what you expect humans to do. Different parts of the face move depending on whether it’s a higher or lower pitch. It was like chicken and egg. They were like, ‘Give us some animation and we’ll work out the audio.’ And vice versa. If you got the logic right, that was a great place to start. Then Dan and Bong worked with the sound designer to create a hybrid sound, and then I know what motions can be attached to that audio.” The audio has a noise pattern. “We had to try to break it down and relate that to what those words could mean,” Luckham reveals. “The ‘beep-beep’ means ‘How are you doing?’ and the ‘blop-blop’ stands for ‘I’m good. How are you?’ Then you would have to stitch that audio together and go, ‘There’s a sentence.’ What’s the action going with that noise? It was complicated and thoroughly enjoyable.” The Creepers provide a sense of depth to the bleak, snowy environment. Practical tire tracks were expanded upon and altered in the final shot. Several layers had to be simulated for snow. Resembling a flying garbage truck with a jet propulsion system are the Flitters. 
“Very heavy and cumbersome, which was the intention,” McLaughlin notes. “They were quite industrial. The setup is not meant for a fighter pilot. It’s about getting things done. We had to manipulate it in the sky with less elegance than you normally would. There was one full-sized Flitter that was functional as a car. You could rotate the engines, open the top and sit on top and bottom. Then there was another one that was a buck. It was a top section that was used to film the actors in, and we had full CG ones as well.” Icicles and snow were added to the aircraft. “Fiona Crombie [Production Designer] wanted the Flitters and the massive spaceship that is their base to look like you would never get inside them.” Particular words of advice were given to the animators. “I would say to them, ‘This thing is always falling,’” Luckham recounts. “‘And make sure that you manipulate the air correctly for the weight.’ There are also lots of small details, like a thruster shimmer for heat.” He adds, “The intention of the Flitter was a chunky-looking piece of industrial machinery that happened to fly. That’s our boundaries we can animate in. If you start doing things that are not congruent with the actual design, it won’t look right.” Filmmaker Bong Joon Ho discusses a shot with Cinematographer Darius Khondji during principal photography at Cardington Studios in the U.K. DNEG improved on one area of the pipeline. “We didn’t have to change anything, but I would say we haven’t done crowds of creatures to this extent in a long time,” McLaughlin notes. “Most of our in-house-built crowd tools are for bipeds, while these were eight-legged creatures, so we had to rebuild a lot of that tooling.” Animation did not modify tooling but certainly had to be acutely aware of scale management. “Definitely, the amount we needed to prep was much bigger because we wanted to find the actors,” Luckham states. 
“The amount of variation in animation that we had to do for the crowds was way bigger than you would expect. Because they’re so complicated, you have to keyframe so much more. Our animation team was much bigger in order to get that crowd-feeling dynamic. These Creeper piles, and all of these little things that Bong wanted, were a lot of work to key.” The shot count of 350 is misleading. “If you think about the scale of a big movie like Avatar, our shot count seems quite low, but the amount of content in one shot is like a hundred shots,” Luckham observes. “There were so many bits to make one shot realistic. It was incredibly dense.”
-
WWW.VFXVOICE.COM
RIOT GAMES AND FORTICHE GET REVOLUTIONARY WITH ARCANE SEASON 2
By TREVOR HOGG
Animating video games or music videos is one thing, but independently producing an animated television series that streams on Netflix is an entirely different animal. This is something Riot Games learned after spending $250 million to create two nine-episode seasons of Arcane, a spin-off of their multiplayer online battle arena video game League of Legends. “Christian Linke [Co-Creator/Showrunner/Writer of Arcane for Riot Games] used to say that animation is where we come from because it was so much a part of the game and how we thought about the movement of characters,” states Alex Yee, Co-Creator/Showrunner/Writer of Arcane for Riot Games. “When the Warcraft movie came out, I began to see how the advantage of animation was that our characters could do things that people couldn’t do. They could live up to a type of motion the animators had established in the game. This heightened life.” “Riot Games realized that they needed to create a planet because we are creating a backstory for all of these different champions – some felt inspired by Greek mythology, while others were more modern, like the Sheriff of Piltover. A team at Riot Games has been working for more than a decade on the world-building aspect of this universe.” —Arnaud-Loris Baudry, Production Designer and Senior Concept Artist, Riot Games A music video aesthetic was incorporated into the animation style for the sequence when Caitlyn leads a covert operation into Zaun to capture Jinx. Pushing the boundaries of animation combines 2D and 3D techniques to produce a painterly style. “A lot of this is the magic of Fortiche,” Yee notes. “The first time we worked with Fortiche, Christian was looking all over the place for something that felt fresh and current. 
He had this rebellious attitude that matched the gamers’ spirit and the spirit that went with League of Legends.” Riot Games is in Los Angeles while Fortiche is in Paris. “What was a bit tricky was finding ways to work with Fortiche, which is in a different time zone,” states Arnaud-Loris Baudry, Production Designer and Senior Concept Artist at Riot Games. “My team was in charge of everything concerning the IP and did some early visual development. The goal was to give a lot of ideas to the storyboard team. When they decide to use one of our ideas or concepts, we might work for months to find the final design and work with Fortiche to finalize it.” Viktor achieves a god-like transformation, which causes him to want to do the same thing with humanity. Fortiche has been able to elevate the animation style. “Because we use a mix of 2D and 3D techniques, our workflow is a little unusual,” observes Julien Georgel, Art Director at Fortiche. “It’s important for us to think about how things will be made at the concept stage. We need to have a good idea of how things are going to come together and how it’s going to look in the end.” Adding to the stylization was the decision to treat smoke and fire as 2D effects. “For the basic visual effects, we rely on the proven expertise of our long-standing 2D and 3D FX team,” Georgel states. “This season, my attention was particularly focused on innovative special effects, such as those related to Hexcore and the Anomaly. These central narrative elements significantly impacted the entire production, from the characters to the props, background design and compositing. The aim was to ensure seamless visual and narrative consistency, with every detail adding to the story’s impact. At the same time, the characters’ powers were the subject of close collaboration between Arnaud Baudry’s team at Riot Games, my team at Fortiche and our FX department. 
This teamwork made it possible to create distinctive visual effects for each champion, contributing to their own identity.” The Black Rose practices the Arcane version of Black Magic. Driving the storytelling are the characters. “It’s interesting because I’ve worked on developing characters for the game for a while,” Yee notes. “At first, the characters were like a little sonar ping into some part of the world. They would reveal the environment around them because you create that through their backstory. Then, after a bit, the regions became like these thematic philosophies of the world, so when you would develop a character early on, the first things you had to understand were what is the role that they play in the game and where in the world do they come from. For me, many of the characters are reflections of the pros and cons of these philosophies. Piltover and Zaun were always something that grabbed Christian just for how it was not doing the typical swords and sorcery thing. But Vi and Jinx were close to our hearts, connected to many of the early projects Christian and I worked on.” Motion influenced the character designs. “The character designs from the game are specifically tailored to be seen from the top-down angle, at a distance, and move in a specific way aimed to make that experience the best it can be,” states Jason Chan, Senior Principal Visual Development Artist at Riot Games. “But when we’re working on adapting it to more cinematic and storytelling experiences, we have to find a way to make those designs still feel recognizable so there’s not this long period where people don’t recognize these characters, but they still work well for animation and storytelling. How we designed the actual animation itself is a back-and-forth process. 
We try our best on the design team to keep it in mind so we’re designing the characters in ways that won’t make an impossible task for animators to animate.” Jinx’s relationship with Isha allows her to better understand her big sister, Vi. Storyboards also impacted the character designs and vice versa. “In the fight scenes in the end, there are many technical things that happen based on Vi’s updated gauntlets, which have this rocket power, and Jinx’s new gun, which is a gigantic gun that fires from both ends,” Chan remarks. “Those were designed before the storyboard and ended up influencing the storyboard because it gave the animators new toys to play with in the way that the characters could fight; they ran with it and did some creative stuff. Other stuff that has to happen in the storyboard will also influence the design, like when they were storyboarding the Caitlyn/Vi love scene. They had to come back to me to ensure they could remove their clothes! I had to redesign the clothing so that you could see how it comes apart.” Smeech was hard to animate because of his mechanical parts. A new weapon is introduced in Season 2. “Christian wanted some next evolution for Caitlyn in this battle, so I was going for what would separate this Hextech rifle from what she had before,” Chan explains. “I went over the top and tried to design something a little crazier. It reminds me of Jayce’s hammer when it gets all powered up and shoots. I wanted to do something fun with that. I wished she got to use it longer. Every weapon she gets is destroyed during the show, but they make it feel cool when she uses it. That was fun for me. I built the whole thing out in Blender and animated it, transforming and doing stuff. I sent that over to Fortiche, which ran with it and made it cool.” The costumes were designed to reflect the arc of the characters, with Caitlyn, particularly, going on a journey that has her taking on an authoritarian presence. 
Only three main pieces of artwork existed for Piltover. “Fortiche did a year of visual development of Piltover, particularly the shape language, colors and materials,” Baudry remarks. “We got inspired and enriched the vision of the city. When we started working on Arcane, we already had almost two rounds of visual development.” Guides were provided to Fortiche by the world-building team at Riot Games. “That was base inspiration. But in Arcane, we also wanted to feel that the city was lived in, and we added a history to the architecture. The upper city was more influenced by New York Art Deco, symmetrical and based on geometry; they value something clean and pure. Zaun is more influenced by European Art Deco, which has more flourish and is asymmetrical with some accents.” “Our approach to the color palette and lighting schemes are based on a combination of intuition and visual references carefully selected with the directors right from the mood board phase. The aim is to create a regular and coherent dynamic throughout the episodes and the season. We deliberately alternate darker moments with high-contrast sequences to intensify the emotional impact of key scenes. This strategy also allows for subtle parallels and foreshadowing between episodes, thus reinforcing the overall narrative.” —Julien Georgel, Art Director, Fortiche When Vi is at her darkest emotional state, she dyes her hair black, such as when she is cage-fighting. Viktor had the most model sheets because he slowly transforms through both seasons. His face went through a mock transformation to illustrate how it would change. Caitlyn strikes an action pose with a highly advanced Hextech-powered rifle. Jinx was given a gun that is able to fire from both ends. Ekko is considered to be one of the best-looking characters, which made it difficult to find something for him to grow into for the final episode. 
“Christian Linke [Co-Creator and screenwriter of Arcane for Riot Games] always wanted to have this bridge, which created tension with the IP and world-building team because, before Arcane, the two cities were on top of each other. However, Christian wanted the show to open and end on this bridge. It symbolizes the division and alliance of two sides of a city.” —Arnaud-Loris Baudry, Production Designer and Senior Concept Artist, Riot Games There was room for environmental exploration. “We were in a special situation with League of Legends because the universe we’re depicting in the show doesn’t appear in the video game,” Baudry states. “The video game is more like a battlefield or chessboard where heroes are summoned and come from different universes. After a while, Riot Games realized they needed to create a planet because we were creating a backstory for all these different champions. Still, some felt inspired by Greek mythology, while others were more modern, like the Sheriff of Piltover. A team at Riot Games has been working for more than a decade on the world-building aspect of this universe.” A signature landmark is Progress Bridge, which connects Piltover with Zaun. “Christian Linke always wanted to have this bridge, which created some tension with the IP and world-building team because, before Arcane, the two cities were on top of each other,” Baudry reveals. “However, Christian wanted the show to open and end on this bridge. It symbolizes the division and alliance of two sides of a city.” During Season 2, Mel comes into her own powers, which are tied to a mysterious organization known as the Black Rose. Color and light are important storytelling tools that direct the viewer’s eye and help convey the desired mood for a shot. “Our approach to the color palette and lighting schemes are based on a combination of intuition and visual references carefully selected with the directors right from the mood board phase,” Georgel notes. 
“The aim is to create a regular and coherent dynamic throughout the episodes and the season. We deliberately alternate darker moments with high-contrast sequences to intensify the emotional impact of key scenes. This strategy also allows for subtle parallels and foreshadowing between episodes, thus reinforcing the overall narrative.” There is an angelic yet unsettling quality as Viktor heals Huck. Magic is a prominent aspect of the visual language with Zaun conjuring it through chemicals and Piltover via science. “In some ways, we would argue that this idea of magic bleeding out of everything in the world is something that is the principal feature of the League of Legends universe,” Yee states. “The key is trying to figure out how to give these many different regions and people a relationship with magic that felt unique and grounded enough to make the world something where it wasn’t just a battle among the top sorcerers for control. It’s tricky because you don’t want to make the world too dry. You want to seize the opportunities for magic where you can find it.” The transformation of the commune established by Viktor was particularly difficult to conceptualize. “It was first introduced in Episode 106 as an abandoned underground mining village, a refuge for some of Zaun’s most destitute inhabitants,” Georgel explains. “We presented it as a rubbish heap from Piltover and Zaun, surrounded by tents, a place where Viktor, guided by the voice of Sky, decides to build his church. In Episode 206, this squalid place is transformed into an oasis of peace, bathed in unexpected subterranean sunlight, decorated with yellow flowers, and giving the impression of a blue sky. This transformation is the work of Viktor’s cultists, whose obsession with the fractal motif, the leitmotif of the series, is manifested at all levels.” Among the props that had to be designed were the various vehicles. 
“[T]he characters’ powers were the subject of close collaboration between Arnaud Baudry’s team at Riot Games, my team at Fortiche and our FX department. This teamwork made it possible to create distinctive visual effects for each champion, contributing to their own identity.” —Julien Georgel, Art Director, Fortiche Unique environments to create were the labs and workshops that reflect the work and the personalities of those utilizing them. “We knew from the early stages of outlining the story that we had a lot of scientists – Heimerdinger, Jayce, Viktor, the Mad Scientist Singed and Jinx, who is more of a crafter than an inventor,” Baudry observes. “Our job at Riot Games was to find visual ideas to make them all different. For Jinx, it was a place where no one else would consider of use. It’s really dangerous. We imagined that since Zaun was polluted, maybe it was an old vent system, and you have this big propeller. One of the artists on my team had the idea that maybe it could be an airship stuck inside this vent. For Heimerdinger, since he’s short, I imagined that maybe there were many ways to climb ladders to go and find all of his stuff. Heimerdinger is old, so he probably has a lot of old junk everywhere. Heimerdinger is also careful about the danger of science, so he has a vault where he keeps everything.” A central visual and thematic landmark is Progress Bridge, which connects Piltover with Zaun. Not surprising is the role music plays in the narrative, given that Riot Games and Fortiche started off doing music videos together. “Season 2 is characterized by the presence of musical segments in almost every episode, often conceived from the writing stage,” Georgel remarks. “These narrative sequences allow us to simplify complex information and convey the desired atmosphere without creating too many new assets or going overboard with character animation. Otherwise, with the usual art direction and animation style, they would be way too costly to produce. 
Thanks to elliptical editing and an adapted visual style, we can focus on the essentials and still achieve the desired results. The choice of artistic style is always in harmony with the atmosphere of the sequence and the music: charcoal, with Cait and Vi in color for the funeral [Episode 201], vintage posters for Episode 202, comic-book aesthetics for the task force [Episode 203], and black-and-white press photographs with punk graffiti for the Jinxers’ riot [Episode 204].” The commune established by Viktor has a Fibonacci motif, which visually reflects the corruption caused by him combining Hextech and Chemtech. A long time was spent on Viktor’s minions, trying to find a sweet spot where they were unsettling but also angelic and perfect. Helping audience members understand the geography of Piltover and Zaun are 12 landmarks with one of them being the clock tower. The strong vertical and symmetrical design elements of New York Art Deco inspired the shape language for Piltover and its opera house. A matte painting of a rooftop view of Piltover at magic hour. Various Pre-Zaun street assets that contributed to making the city feel historical and lived in. When designing Heimerdinger’s Lab, the short stature of the academic was kept in mind. A significant prop design was the Hextech rifle, which has several moving parts that widen the canvas of experimentation for animators. Jinx is more of an artisan than a scientist, so her inventions are assembled in a dangerous workshop environment. The Hextech lab is where extremely dangerous experiments take place, so it has a vault-like aesthetic. Atmospherics such as steam were added to the Atlas Gauntlets worn by Vi to emphasize the powerful nature of the weapon. Somewhat battle-scarred, Riot Games has come out wiser and emboldened by the experience of making Arcane. 
“I count myself extremely fortunate to have gotten to see a company that mastered its movement with a bunch of people who didn’t know what they were doing but wanted to do something great,” Yee states. “I’ve now had the opportunity to see that life cycle twice, first at Riot and then again with Arcane at Fortiche. It’s a magical feeling. It’s one of those things you think: how could something like that ever happen again? What inspires me is that initial spirit of Riot – this desire to push forward, innovate, disrupt and break open the boundaries of what is possible. It’s a big thing to stand behind that kind of an effort, and it’s incredible to see what it can create in the world.”
-
HOW VIRTUAL EFFECTS BRING THE SPARK TO LIVE SHOWS AND BROADCASTS
By TREVOR HOGG
There was a time when the audience was left to fill in the visual gaps with their imagination when attending live events, but this is no longer the case as stage productions have become technologically sophisticated and proficient in incorporating visual effects as part of the performance. This also applies to broadcast coverage, which has elevated the use of screen graphics by leveraging the tools of augmented reality. Recognizing the growing demand, companies such as Halon Entertainment, which is normally associated with creating visualization for film and television projects, have expanded into the world of virtual design for the likes of ESPN and Coachella. Arlekin Players Theatre and Zero Gravity Lab experiment with technology but not to the point of merging film and theatre. (Image courtesy of Arlekin Players Theatre) “[Y]ou are starting to see the crossroads of everything coming together. That being visual effects heading towards games, AR, VR and broadcast because of USD and all of the programs finding good ways to work together.” —Jess Marley, Virtual Art Department Supervisor, Halon Entertainment “For the past two, two and a half years, you are starting to see the crossroads of everything coming together,” states Jess Marley, Virtual Art Department Supervisor at Halon Entertainment. “That being visual effects heading towards games, AR, VR and broadcast because of USD and all of the programs finding good ways to work together. [The virtual news desk for] ESPN is a good intersection to that because we’re using Jack Morton’s C Port E-Files to introduce content into Unreal Engine and having to bridge the gap between all of those different departments working together, which is what virtual production does across the board.” The configuration resembles a LEGO set. 
“You are putting the pieces together in Unreal Engine, but you are getting pieces from different kits and packs from the client,” remarks Andrew Ritter, Virtual Art Department Producer at Halon Entertainment. “You have to make sure that they plug into the technology and creative vision sides and go directly into the audience’s eyeballs at the end of it.” Creating something filmic onstage is not an ideal way of using the stage and vice versa, believes Igor Golyak of Arlekin Theatre. (Photo: Andrey Maslov. Image courtesy of Arlekin Players Theatre) Nothing was radically different for ESPN. “ESPN used the same game setup process, meaning you had to look at your assets, plan out UVs and light maps to make sure that things are going to take in light and shadow accordingly and to bake things down,” Marley explains. “Because in order for things to run in real-time and still have dynamic elements as far as media screens and different inputs, the rest of the scene had to be light. We’re also talking about glass and different layers of glass or frosted glass. Different things that are SportsCenter-related that need to be built. It was a careful curation of materials running efficiently but also looking a certain way. It also had inputs that you could control versus things that needed to catch light at a certain distance, so you put specific bevels on things.” The screen graphics were a combination of old and new techniques. “We are emulating the depth monitor for SportsCenter, so we are putting those graphic feeds into a screen that is in the fake world but has to look like a real screen in the real world,” Ritter states. “Initially, we asked how many feeds were needed back there because every feed you add is going to drop our framerate and affect the finished quality. 
Originally, they were like, ‘Four.’ Then, we had a meeting with the director’s team at SportsCenter who said, ‘We need 16.’ It all worked out in the end.” Gigi Watson as She and Garrett Sands as Soldier in Just Tell No One by Arlekin Players Theatre. (Image courtesy of Arlekin Players Theatre) Our Class, produced by Arlekin Players Theatre, taps into the surrealistic quality of a stage show. (Image courtesy of Arlekin Players Theatre) For Igor Golyak of Arlekin Theatre, the aim is to use the contemporary language of expression, which includes special effects, virtual reality and Unreal Engine, to express something that needs to be expressed. (Image courtesy of Arlekin Players Theatre) Stranger Things was spun off into a theatre production that was able to make use of virtual effects. (Photo: Manuel Harlan. Courtesy of Disguise) Shania Twain goes all out incorporating virtual content into her “Queen of Me Tour” in 2023. (Image courtesy of Disguise) “It’s not about comparing. It’s about using the contemporary language of expression that includes special effects, virtual reality and Unreal Engine to express something that needs to be expressed.” —Igor Golyak, Founder and Producing Artistic Director of Arlekin Players Theater and Zero Gravity Virtual Theater Lab Preparation allows for adaptability. “Coachella was a good example of us creating AR content for a live concert that was being streamed live, and all of our stuff went off without a hitch, but they were pushing it,” Ritter remarks. “They were making changes up to the last minute, but we had rehearsals and got to see it on the stage through the cameras before the actual shoot started. We had to test everything to make sure we had plenty of space as far as our framerate. We were communicating with the directors’ team to let them know which angles should be on ahead of time. And we rolled with it.” The process for producing content for a studio or concert setting is similar. 
“The only thing you have to consider is in-screen versus out-of-screen,” Marley observes. “All we’re doing for ESPN content is building in-screen 3D, so we’re moving the content behind the screen for the most part. For the Coachella stuff, it’s out-of-screen and is AR, so it’s separate as far as the use case, but the content still needs to run at the same pace and be just as flexible and optimized because the camera has to be live, and all the content needs to move with whatever tracking it’s going into. But it’s all the same 3D content.” Mixing various content together has become commonplace, which is reflected in the Glastonbury Festival in 2023. (Photo: Tom Marshak. Courtesy of Disguise) Working within the same environment is Disguise, which has collaborated with artists such as Adele and Shania Twain on their concert tours, as well as the stage play spin-off Stranger Things: The First Shadow. “One might be a TV show that has been adapted to the stage at the West End, and another might be a massive concert tour,” notes Emily Malone, Head of Live Events at Disguise. “It’s all about how you can best realize those ambitions, and that comes down to flexibility, stability, reliability and how quickly you can achieve something.” Mixing various content together is commonplace. “You have capturing inputs, pre-rendered content, real-time content increasingly, and being able to lay all of those together is becoming normal. A lot of the work that allows that to happen occurs when people have meticulous pre-production workflows. There’s so much that happens beforehand to make sure once you get into the room, you’re set up for success. Previsualization and content distribution are huge parts. There are more traditional tools from the film post-production world making their way into making content for live events. We’re pulling technologies from all of these different places. A game engine is another tool in the arsenal. 
Even if you’re not doing real-time content, Unreal Engine might be the tool you want to use to make the content and render it out from there because you get a different aesthetic, or your render time is different.” Tools more traditionally from the film post-production world are making their way into generating content for live events, such as Adele performing in Munich. (Image courtesy of Disguise) Coordinating the virtual content for the performance of DJ Snake at Coachella 2024. (Image courtesy of Halon Entertainment) “Coachella was a good example of us creating AR content for a live concert that was being streamed live… but they were making changes up to the last minute. [However], we had rehearsals and got to see it on the stage through the cameras before the actual shoot started. We had to test everything to make sure we had plenty of space as far as our framerate. We were communicating with the directors’ team to let them know which angles should be on ahead of time. And we rolled with it.” —Andrew Ritter, Virtual Art Department Producer, Halon Entertainment. Lighting and video departments are dependent on each other. “It might be for a portion of a show that the light team is handed control of video fixtures so they can do something tightly integrated in this one section. Then, there might be parts where video maps to lighting fixtures,” Malone remarks. “There is a tight collaboration between those two teams. For Stranger Things: The First Shadow, you would be hard-pushed to tell what bits come from what department. It can be incredibly immersive and engaging to the point where you don’t have to do much work at all to suspend the disbelief. It happens to you. There have been technological changes that have helped with that. Things like denser pixel pictures and higher resolution; things are getting bigger, and it is becoming cheaper to do those things. 
Higher color fidelity as well, because there is a lot more color accuracy work going on, and that can be achieved subtly, plus the flexibility you get with that. You’re able to do more compositing and more integration with the things that are happening on stage.”

Virtual reality has been embraced by sports broadcasting and given a serious upgrade with ESPN SportsCenter. (Image courtesy of Halon Entertainment and ESPN)

Making use of a LiDAR scanner to create virtual reality content. (Image courtesy of Halon Entertainment)

Creating an interactive experience that allows artists to expand the scope of their vision is something that the Museum of Modern Art facilitates. “At the core of our department’s mission is a strong desire to support our artists who are deeply informed by histories of experimental cinema, live performance, early video art, artwork responding to the media, and artists seeking to break down boundaries between genres and create something entirely new,” remarks Erica Papernik-Shimizu, Associate Curator, Department of Media and Performance at MoMA. “That’s something that feeds our desire to create the Kravis Studio.”

The Marie-Josée and Henry Kravis Studio has the appearance and capability of a black box theater. “But really that facility is flexible rather than dedicated to frontal-facing, time-based work,” notes Paul DiPietro, Senior Manager, Audio Visual Design and Live Performance at MoMA. “That draws from the traditional theatre, where you have to change things quickly, or you can do something that is time-based and change a look with lighting, sound, projected image and performers. When you start talking about theatre or art or installation, it’s all blurred. It has more to do with the public’s perception of it than the actual creation of the art.”

Within a matter of seconds, the Marie-Josée and Henry Kravis Studio can be turned into a light-locked room, which was required for the “Shana Moulton: Meta/Physical Therapy” exhibition. 
(Image courtesy of Museum of Modern Art)

“Pre-Kravis Studio, we would think about: do we need to bring in a sprung floor, put down Marley and install power and projectors?” states Lizzie Gorfaine, Associate Director and Producer of Performance and Live Programs at MoMA. “It was such a huge production. The thing about the studio is that when you get a proposal, there are things available for you to use much more readily than at any other point in the history of our museum. The fact that there is a tension grid where you can affix projectors, lighting and speakers at any point in the ceiling is unheard of for a regular museum gallery. Our seating situation is much more flexible. There is a blackout shade on the south side where we can cover this huge window that looks out onto 53rd Street, which means in a matter of seconds we have a light-locked room where we can achieve productions of the scale [required for the ‘Shana Moulton: Meta/Physical Therapy’ exhibition]. The thought initially around creating a space like this was that we could even consider these kinds of proposals and be able to achieve that level of production, which would have been much more costly and time- and resource-intensive than before.”

The Marie-Josée and Henry Kravis Studio has the appearance and capability of a black box theater. (Image courtesy of Museum of Modern Art)

Arlekin Players Theater and Zero Gravity Virtual Theater Lab are experimenting with technology, but not to the point of merging film and theatre. “The point is to continue their own path of development,” observes Igor Golyak, Founder and Producing Artistic Director of Arlekin Players Theater and Zero Gravity Virtual Theater Lab. “There are some things that are a crossover, but for me, they are a substitution. Creating something filmic onstage is not an ideal way of using the stage, and vice versa. I was watching Poor Things and how interesting, imaginative and theatrical it is. It’s quite stunning what they achieved. 
Sometimes, things that aren’t overly realistic in theatre are more difficult to do in film; they can make the audience members [react] more [in theatre] than in a realistic filmic way of expression. It’s a different way of affecting the audience, and both work. It’s not about comparing. It’s about using the contemporary language of expression that includes special effects, virtual reality and Unreal Engine to express something that needs to be expressed.”

The Museum of Modern Art facilitates artists creating an interactive experience that allows them to expand the scope of their vision. (Image courtesy of Museum of Modern Art)

The coronavirus pandemic had a significant impact. “Before the pandemic, I had a studio where we fit 55 to 60 people,” Golyak explains. “My virtual production theatrical shows that are live were presented simultaneously in 55 countries. Connecting to a wider set of audiences is huge. It’s a different experience, and unlike a movie, they’re experiencing it live, and their interaction counts. They’re not passive listeners or watchers. They provide input into what is happening onstage or on a virtual stage and make choices. It lets me connect with more audiences and lets me connect the audience members to each other who otherwise wouldn’t have connected. There is a sense in a live theatre room where we are all sitting together, and if it’s inspiring and something incredible, we start to breathe together and our heartbeats align. There is a sense of community that comes from using contemporary tools to express the soul.”
-
WWW.VFXVOICE.COM

VES AWARD WINNERS

KINGDOM OF THE PLANET OF THE APES
The VES Award for Outstanding Visual Effects in a Photoreal Feature went to Kingdom of the Planet of the Apes. (Photos courtesy of Walt Disney Studios)

THE WILD ROBOT
Outstanding Visual Effects in an Animated Feature went to The Wild Robot, which won four VES Awards including Outstanding Animated Character in an Animated Feature (Roz), Outstanding Created Environment in an Animated Feature (The Forest) and Outstanding Effects Simulations in an Animated Feature. (Photos courtesy of DreamWorks Animation and Universal Pictures)

SHŌGUN
Outstanding Visual Effects in a Photoreal Episode went to Shōgun; Anjin, which won three VES Awards including Outstanding Effects Simulations in an Episode, Commercial, Game Cinematic or Real-Time Project (Broken to the Fist; Landslide) and Outstanding Created Environment in an Episode, Commercial or Real-Time Project (Osaka). (Photos courtesy of FX Network)
-
VISUAL EFFECTS ARTISTRY IN THE SPOTLIGHT

Captions list all members of each Award-winning team even if some members of the team were not present or out of frame. For more Show photos and a complete list of nominees and winners of the 23rd Annual VES Awards, visit vesglobal.org. All photos by Moloshok Photography.

A Night to Remember. Nearly 1,200 guests filled The Beverly Hilton on February 11th, coming together to honor the best in VFX across 25 award categories celebrating innovation, artistry and cinematic magic.

A Warm Welcome. Nancy Ward, Executive Director of the Visual Effects Society, took the stage with a heartfelt greeting, celebrating an incredible year for the VES and the groundbreaking achievements of the visual effects community.

Kicking Off the Celebration. Kim Davidson, founder of SideFX and Chair of the Visual Effects Society, took the stage to present the first round of awards, setting the tone for an unforgettable night of VFX excellence.

On February 11, the VES hosted the 23rd Annual VES Awards, the prestigious yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues. Industry guests gathered at The Beverly Hilton hotel to celebrate VFX talent in 25 awards categories and special honorees. Kingdom of the Planet of the Apes received the top photoreal feature award. The Wild Robot was named top animated film, winning four awards. Shōgun; Anjin was named best photoreal episode, winning three awards.

The Sklar Brothers (Randy and Jason) brought their signature wit and energy to their debut as VES Awards hosts.

Comedy duo The Sklar Brothers made their debut as VES Awards hosts. Acclaimed actor Keanu Reeves presented Golden Globe-winning actor-producer Hiroyuki Sanada with the VES Award for Creative Excellence. Chief Research Officer of Eyeline Studios Paul Debevec, VES, presented Virtual Reality/Immersive Technology Pioneer Dr. Jacquelyn Ford Morie with the Georges Méliès Award. Writer-director Michael Dougherty presented Academy Award-winning filmmaker and VFX Supervisor Takashi Yamazaki with the Visionary Award. Award presenters included Kelvin Harrison, Jr., Krys Marshall, Mary Mouser, Russell Hornsby, Tanner Buchanan, Eric Winter and Tia Carrere. Autodesk’s Senior Director of Business Strategy, Rachael Appleton, presented the VES-Autodesk Student Award.

“As we celebrate the 23rd Annual VES Awards, we’re honored to shine a light on outstanding visual effects artistry and innovation,” said VES Chair Kim Davidson. “The honorees and their work represent best-in-class visual effects work that engages audiences and enhances the art of storytelling. The VES Awards is the only venue that showcases and honors these outstanding global artists across a wide range of disciplines, and we are extremely proud of all our winners and nominees.”

Kingdom of the Planet of the Apes won the Award for Outstanding Visual Effects in a Photoreal Feature, led by the team of Erik Winquist, Julia Neighly, Paul Story, Danielle Immerman and Rodney Burke.

The Award for Outstanding Visual Effects in an Animated Feature went to The Wild Robot and the team of Chris Sanders, Jeff Hermann, Jeff Budsberg and Jakob Hjort Jensen.

The Award for Outstanding Visual Effects in a Photoreal Episode went to Shōgun; Anjin and the team of Michael Cliett, Melody Mead, Philip Engström, Ed Bruce and Cameron Waldbauer.

Cobra Kai stars Tanner Buchanan and Mary Mouser brought energy and charm to the stage, acting as an engaging duo of presenters.

The Award for Outstanding Supporting Visual Effects in a Photoreal Episode went to The Penguin; Bliss and the team of Johnny Han, Michelle Rose, Goran Pavles, Ed Bruce and Devin Maggio.

The Award for Outstanding Supporting Visual Effects in a Photoreal Feature went to Civil War and the team of David Simpson, Michelle Rose, Freddy Salazar, Chris Zeh and J.D. Schwalm.

Lisa Cooke, former VES Chair and the first woman ever elected to the role, took the stage to present multiple VES Awards.

The Award for Outstanding Visual Effects in a Commercial went to Coca-Cola; The Heroes and the team of Greg McKneally, Antonia Vlasto, Ryan Knowles and Fabrice Fiteni.

The Award for Outstanding Character in a Photoreal Feature went to Better Man; Robbie Williams and the team of Milton Ramirez, Andrea Merlo, Seoungseok Charlie Kim and Eteuati Tema.

Actress and singer Tia Carrere took the stage to present several awards, bringing charm, energy and star power to the celebration.

The Award for Outstanding Visual Effects in a Special Venue Project went to D23; Real-Time Rocket and the team of Evan Goldberg, Alyssa Finley, Jason Breneman and Alice Taylor.

The Award for Outstanding Character in an Animated Feature went to The Wild Robot; Roz and the team of Fabio Lignini, Yukinori Inagaki, Owen Demers and Hyun Huh.

The Award for Outstanding Character in an Episode, Commercial, Game Cinematic or Real-Time Project went to Ronja the Robber’s Daughter; Vildvittran the Queen Harpy and the team of Nicklas Andersson, David Allan, Gustav Åhren and Niklas Wallén.

The Award for Outstanding Environment in a Photoreal Feature was won by Dune: Part Two; The Arrakeen Basin and the team of Daniel Rhein, Daniel Anton Fernandez, Marc James Austin and Christopher Anciaume.

The Award for Outstanding Environment in an Episode, Commercial, Game Cinematic or Real-Time Project went to Shōgun; Osaka and the team of Manuel Martinez, Phil Hannigan, Keith Malone and Francesco Corvino.

Liv Hewson, star of Yellowjackets, was a captivating presenter in celebrating VFX, as she bestowed multiple awards.

The Award for Outstanding Environment in an Animated Feature was presented to The Wild Robot; The Forest and the team of John Wake, He Jung Park, Woojin Choi and Shane Glading.

The Award for Outstanding Visual Effects in a Real-Time Project went to Star Wars Outlaws and the team of Stephen Hawes, Lionel Le Dain, Benedikt Podlesnigg and Andi-Bogdan Draghici. Presenter Kelvin Harrison, Jr. accepted the Award.

The Award for Outstanding Effects Simulations in a Photoreal Feature went to Dune: Part Two; Atomic Explosions and Wormriding and the team of Nicholas Papworth, Sandy la Tourelle, Lisa Nolan and Christopher Phillips.

Krys Marshall, known for her role in For All Mankind, lit up the stage with her charisma and grace as she presented multiple awards.

The Award for Outstanding CG Cinematography went to Dune: Part Two; Arrakis and the team of Greig Fraser, Xin Steve Guo, Sandra Murta and Ben Wiggs.

The Award for Outstanding Model in a Photoreal or Animated Project went to Alien: Romulus; Renaissance Space Station and the team of Waldemar Bartkowiak, Trevor Wide, Matt Middleton and Ben Shearman.

The Award for Outstanding Effects Simulations in an Animated Feature went to The Wild Robot and the team of Derek Cheung, Michael Losure, David Chow and Nyoung Kim.

The Award for Outstanding Effects Simulations in an Episode, Commercial, Game Cinematic or Real-Time Project was won by Shōgun; Broken to the Fist; Landslide and the team of Dominic Tiedeken, Heinrich Löwe, Charles Guerton and Timmy Lundin.

The Award for Outstanding Compositing & Lighting in a Feature went to Dune: Part Two; Wormriding, Geidi Prime and the Final Battle and the team of Christopher Rickard, Francesco Dell’Anna, Paul Chapman and Ryan Wing.

Actor Russell Hornsby, featured in The Woman in the Yard, showcased his charisma and love for VFX as a standout presenter.

The Award for Outstanding Special (Practical) Effects in a Photoreal Project went to The Penguin; Safe Guns and the team of Devin Maggio, Johnny Han, Cory Candrilli and Alexandre Prodhomme.

The Award for Outstanding Compositing & Lighting in an Episode was presented to The Penguin; After Hours and the team of Jonas Stuckenbrock, Karen Cheng, Eugene Bondar and Miky Girón.

The Award for Outstanding Compositing & Lighting in a Commercial went to Coca-Cola; The Heroes and the team of Ryan Knowles, Alex Gabucci, Jack Powell and Dan Yargici.

The Emerging Technology Award went to Here; Neural Performance Toolset and the team of Jo Plaete, Oriel Frigo, Tomas Koutsky and Matteo Olivieri-Dancey.

Championing the Next Generation. Rachael Appleton, Autodesk’s Senior Director of Business Strategy, took the stage to present the prestigious VES-Autodesk Student Award, honoring emerging talent in the world of VFX.

Keanu Reeves honors his friend, award-winning actor-producer Hiroyuki Sanada, and presents him with the distinguished VES Award for Creative Excellence, a tribute to his extraordinary contributions to film, VFX and storytelling.

Paul Debevec, VES, Chief Research Officer at Eyeline Studios, presents Virtual Reality and Immersive Technology pioneer Dr. Jacquelyn Ford Morie with the VES Georges Méliès Award, recognizing her trailblazing contributions to immersive storytelling.

The Award for Outstanding Visual Effects in a Student Project went to Pittura (entry from ARTFX, The Schools of Digital Arts) and the team of Adam Lauriol, Titouan Lassère, Rémi Vivenza and Hellos Marre.

Golden Globe-winning actor-producer Hiroyuki Sanada, one of Japan’s most celebrated actors, proudly holds his VES Award for Creative Excellence. Known for his unforgettable portrayal of Lord Toranaga in the acclaimed series Shōgun, Sanada continues to leave an indelible mark on cinema and television.

Virtual Reality/Immersive Technology Pioneer Dr. Jacquelyn Ford Morie was honored with the VES’s prestigious Georges Méliès Award.

Writer-director Michael Dougherty presented Academy Award-winning filmmaker and VFX Supervisor Takashi Yamazaki with the VES Visionary Award.

Hosts The Sklar Brothers share a laugh backstage with award presenter Tia Carrere.

Takashi Yamazaki, recipient of the Visionary Award and one of Japan’s leading filmmakers and VFX supervisors, earned the 2024 Academy Award for Best Visual Effects for his groundbreaking work on Godzilla Minus One.

The celebrated group of Keanu Reeves, VES Executive Director Nancy Ward, Takashi Yamazaki, Hiroyuki Sanada and VES Chair Kim Davidson share a star-studded backstage moment, radiating talent and camaraderie.

The dedicated volunteers of the VES Awards Committee (Prashant Agrawal, Stephen Chiu, Olun Riley, Michael Ramirez, Kathryn Brillhart, Dave Gouge, Sarah McGrail, Den Serras (Chair), Lopsie Schwartz (Co-Chair), Scott Kilburn (Co-Chair), Eric Greenlief, Dylen Velasquez, Rob Blau, Pramita Mukherjee, Sarah McGee and Brad Simonsen) turned their hard work into a spectacular evening celebrating the VES and the best in VFX.
-
DR. JACQUELYN FORD MORIE: PIONEERING IMMERSIVE TECHNOLOGY

By NAOMI GOLDMAN

All photos by Moloshok Photography.

Paul Debevec, VES, Chief Research Officer of Eyeline Studios, presents Dr. Jacquelyn Ford Morie with the Georges Méliès Award.

Dr. Jacquelyn Ford Morie is a luminary in the realm of virtual reality and a defining voice of immersive technology. Standing at the intersection of art, science and technology, she has been shaping the future of virtual experiences worldwide since the 1980s, and remains dedicated to mentoring the next generation of VR innovators while forging ahead with new paradigms in immersive media.

For her immense contributions to the computer animation and visual effects industries, by way of artistry, invention and groundbreaking work, the Society honored Dr. Morie with the Georges Méliès Award at the 23rd Annual VES Awards. “Dr. Jacki Morie redefined the possibilities of immersive media, infusing them with profound emotional and educational resonance,” said VES Chair Kim Davidson. “Her visionary leadership has shaped pivotal projects across health, space exploration, art and education, underscoring her as a transformative innovator and thought leader.”

Chief Research Officer of Eyeline Studios Paul Debevec, VES, gave this heartfelt introduction to his longtime friend: “I had the privilege of working with Jacki for over a decade at USC’s Institute for Creative Technologies. We were two of the first research leaders hired in 2000, and I was lucky to have a colleague who helped set the tone that this institute would not just be about developing technology, but also about exploring its creative possibilities. Jacki has developed computer animation training programs that have educated a generation of digital artists. But perhaps her most impactful contributions have been pioneering work in VR experiences, creating new forms of multisensory feedback, like helping NASA astronauts combat isolation on long-duration space missions by keeping them emotionally connected to Earth. It is a theme of Jacki’s work to develop technology and experiences to make life better for others.”

In accepting her award, Morie remarked, “To say I am honored to receive this award is hardly representative of what I am feeling. It is not everyone who gets to be at the birth of a new medium, and yet here tonight we celebrate both film and computer graphics and effects. I have spent my entire career trying to make this new medium of VR more meaningful, more emotional, more purposeful. I’d like to think that my work also honors those who are trying to follow the desire lines of where we want to take this medium. I am glad to have started this journey and look forward to many more artists taking it up and making it truly something spectacular! My sincere thanks to the VES for recognizing me and the birth of this new art form.”

Dr. Morie with the Georges Méliès Award.

Dr. Morie and Rob Smith.

Paul Debevec, VES, Dr. Morie, VES Executive Director Nancy Ward and VES Chair Kim Davidson.
-
TAKASHI YAMAZAKI: REIGNITING THE MOVIE MONSTER GENRE

By NAOMI GOLDMAN

All photos by Moloshok Photography.

Writer-director Michael Dougherty presents Takashi Yamazaki with the VES Visionary Award.

Takashi Yamazaki with the VES Visionary Award.

Yamazaki and Hiroyuki Sanada, recipient of the VES Award for Creative Excellence.

Takashi Yamazaki is a renowned talent in Japanese cinema who accomplished a significant feat when he became only the second director to win an Academy Award for Best Visual Effects, for Godzilla Minus One, and in the process reinvigorated a legendary kaiju franchise. As a filmmaker and visual effects supervisor, he is regarded as one of Japan’s leading film directors. Yamazaki is set to make his Hollywood debut with Grandgear for Bad Robot and Sony Pictures, and recently announced that he is working on the screenplay and storyboards for the much-anticipated next Godzilla movie.

For his consummate artistry, expansive storytelling and profound ability to use visual effects to bring his unique visions to life, the Society honored Takashi Yamazaki with the VES Visionary Award at the 23rd Annual VES Awards. “Takashi has been at the forefront in using visual effects to tell remarkable stories that transfix audiences and create unforgettable cinematic experiences,” said VES Chair Kim Davidson. “As a creative force who has made an indelible mark in the world of filmed entertainment, we are honored to award him with the prestigious VES Visionary Award.”

Michael Dougherty, director of Godzilla: King of the Monsters, gave this tribute in presenting the award to his colleague in the kaiju genre: “Takashi is a filmmaker whose work is both humbling and inspiring. He was so moved by Star Wars and Close Encounters that he started his career building miniatures and working tirelessly as a visual effects supervisor before directing his own films. Takashi’s work pushes the boundaries of visual effects, blending technical innovation and compelling storytelling to create immersive and iconic films. Godzilla Minus One resurrected the king of the monsters. Yes, Godzilla has a soul, and Takashi captured it in such a way that moved the world, earning Japan its first Academy Award for Visual Effects.”

In accepting his award, Yamazaki remarked, “I’m truly surprised and overjoyed to receive such a wonderful award. I started this career with a dream of working with people around the world like a pioneer in visual effects; however, in Japan, there were hardly any opportunities for this type of work. I kept telling myself that as long as I was born in Japan, bringing spaceships, robots and kaiju to the screen was already a dream come true. But then Godzilla brought me to this incredible place. Thank you, Godzilla! And I believe many of you can relate when I say I want to praise my young self for choosing to pursue this career. Thank you for this great honor.”

VES Chair Kim Davidson, VES Executive Director Nancy Ward, Michael Dougherty and Takashi Yamazaki.
-
VFX SCHOOLS ADAPT TO CAPTURE INDUSTRY SHIFT TO VP AND AI

By CHRIS McGOWAN

The Martin Scorsese Virtual Production Center at NYU’s Tisch School of the Arts. (Image courtesy of Tisch School of the Arts)

“As far as VFX education, we are constantly seeing new pieces of software and technology being implemented into the pipeline. That is something we are always grappling with when it comes to learning,” says Professor Flip Phillips of The School of Film and Animation at Rochester Institute of Technology (RIT). Each VFX and animation school explores the implementation of new tech differently, such as virtual production and AI.

“The film and media industry is experiencing a significant shift toward LED stages and virtual production technologies, fundamentally changing how stories are told and content is created,” comments Patrick Alexander, Department Head at Ringling College Film Program. “Real-time visualization capabilities, combined with the seamless integration of physical and digital elements, provide creators with unprecedented creative control and production efficiency. At Ringling College, we’ve recognized this industry transformation by installing LED walls this semester and are actively developing a series of virtual production courses. From our campus in Sarasota, Florida, we can now create entire worlds and environments once thought impossible to achieve, enabling students to realize their creative visions in real-time. This curriculum development paired with our new technology will prepare students for the rapidly evolving production landscape, where virtual production technologies enhance creative possibilities while maintaining the fundamental principles of cinematic craft.”

The Savannah College of Art and Design has two full-size LED volume stages. (Image courtesy of SCAD)

“We have a soundstage in the MAGIC Center that houses a 32 x 16 LED wall, and the center provides technical support for motion capture, camera tracking, virtual art department and real-time in-camera visual effects. Having the opportunity to see and work with a virtual production stage is a great asset for our graduates.”
—Professor Flip Phillips, The School of Film and Animation, Rochester Institute of Technology

“RIT has worked to ensure our students have experience working in virtual production before they leave our campus,” Phillips states. “We have a soundstage in the MAGIC Center that houses a 32 x 16 LED wall, and the center provides technical support for motion capture, camera tracking, virtual art department and real-time in-camera visual effects. Having the opportunity to see and work with a virtual production stage is a great asset for our graduates.”

To meet the growing emphasis on LED stages and virtual production, Vancouver Film School is launching a VP course alongside Pixomondo in 2025 “to meet the industry needs in this area,” says Colin Giles, Head of The School for Animation & VFX at Vancouver Film School. The 12-week certificate program will teach the fundamentals of content creation for virtual production in VFX.

NYU’s Tisch School of the Arts has opened the Martin Scorsese Virtual Production Center, made possible by a major donation from the Hobson/Lucas Family Foundation by Mellody Hobson, Co-CEO of Ariel Investments, and filmmaker George Lucas. The facility features two 3,500-square-foot double-height, column-free virtual production stages, with motion capture and VP technology outfitted by Vicon and Lux Machina, and two 1,800-square-foot television studios and state-of-the-art broadcast and control rooms.

Savannah College of Art and Design has two full-size LED volume stages where students can get their hands on Mandalorian-style production techniques, as well as classes in virtual production, photogrammetry and real-time lighting, according to Gray Marshall, Chair of Visual Effects at SCAD. “We have even launched a multidisciplinary minor in virtual production, bringing visual effects, production design and film students together to create complete and inventive in-camera VFX projects.”

Working at The School of Visual Arts in New York City. (Image courtesy of SVA)

A VR session at The School of Visual Arts in New York City. (Image courtesy of SVA)

The amount of class time devoted to AI is also rapidly growing. “Educational institutions have a unique opportunity to learn from industry standards and histories while pushing the boundaries through emerging technologies,” notes Jimmy Calhoun, Chair of BFA 3D Animation and Visual Effects at The School of Visual Arts in New York City. “Our students understand this responsibility. They’re not only exploring the potential of AI but also reflecting on its impact on their rights as artists, the future of their mentors’ jobs and the environment.”

SCAD’s Marshall comments, “AI is the most exciting trend in VFX since the advent of the computer itself. AI has already found itself ingrained in many aspects of our day-to-day tools and will continue to do so. It also creates new opportunities to rapidly try ideas, modify them and get stimulated in new directions, but it is still all under your control. Yes, there are some challenges to be faced, both regarding IP and resource utilization, but those can be worked out. I am not one of those who feels we’ll lose jobs.”

Marshall continues, “I watched as computers displaced some older-style workers, only for a whole new style of artist to emerge in greater numbers, driving greater demands for their services. Computers have always been good at replacing repetitive jobs, and I don’t think losing that class of jobs will be a loss. Since the basic premise of AI-generated images is to aggregate toward the middle ground, if you’re concerned it will take your creative job, I wouldn’t be. If you are truly creative, then AI isn’t an exit ramp, it’s a launch ramp.”

Virtual production at the Savannah College of Art and Design. (Image courtesy of SCAD)

“Currently, I think there are two groups of educators that I am seeing when it comes to AI,” says RIT’s Phillips. “There is one group that is hesitant to adopt AI into their curriculum due to the lack of knowledge of how it could benefit them and their students, and another group that sees the benefits of using AI to make the VFX pipeline more efficient. I am part of the latter group. I have seen many use cases for AI to allow me and my students to deal with problems that are tedious or inefficient. There will be many more beneficial situations for AI in the VFX field, but we still have to be mindful of the ethical issues that arise.”

“We embrace emerging technologies like AI as valuable tools to be utilized when appropriate,” notes Christian Huthmacher, Professor in The Department of Motion Design, specializing in VFX, at Ringling College of Art and Design. “In our Visual Effects courses, students are introduced to cutting-edge tools such as The Foundry’s CopyCat and Topaz AI. However, our approach goes beyond merely teaching technical proficiency. We engage students in critical discussions about the ethical considerations, potential biases and security implications surrounding AI usage. By addressing these complex topics, we ensure our students are uniquely equipped to navigate the evolving landscape of AI in the industry.”

On an XR Stage at the Savannah College of Art and Design. (Image courtesy of SCAD)

“We are embracing [AI] when it makes sense,” says Ria Ambrose Benard, Director/Senior Educational Administrator at The Rookies Academy (formerly Lost Boys Studios). “Tools are being created to help artists with the mundane portion of the job and offer more time for the more difficult and rewarding shots. Much like spell check, it is a tool people use regularly, and sometimes it is great, but sometimes it is not. AI is a tool that can be utilized correctly. It’s not always the best solution, and often an artist will get better results faster, but it is a tool that can be used in some circumstances to make things easier for the artists.”

“One goal that I always strive for in my classrooms is allowing students to problem-solve using any and all tools available,” comments RIT’s Phillips. “I believe this will allow for the industry to continue to evolve and become a place where creativity, design and innovation will help tell the stories in a more unique and beautiful way. Our field is constantly evolving, and that is what makes it exciting. The unknown can be a scary place for some, but I see it as an opportunity to make great strides in VFX.”
-
WWW.VFXVOICE.COMHONORING THE RETRO-FUTURE OF THE ELECTRIC STATEBy TREVOR HOGGImages courtesy of Netflix.The reliance on visual effects surpassed what Anthony and Joe Russo had previously done in the MCU.Even though the term retro-future is nothing new, Swedish artist and musician Simon Stlenhag has been able to create a unique vision where discarded technology is scattered across vast landscapes situated in an alternative universe. His illustrations and accompanying narratives have captured the attention of filmmakers such as Nathaniel Halpern with Tales from the Loop for Prime Video and siblings Anthony and Joe Russo with TheElectric State for Netflix, which they spent seven years developing. The latter revolves around the aftermath of a robot uprising where an orphaned teenager goes on a cross-country journey to find her lost brother. The human cast consists of Millie Bobby Brown, Chris Pratt, Stanley Tucci, Giancarlo Esposito, Ke Huy Quan and Jason Alexander. At the same time, Woody Harrelson, Anthony Mackie, Brian Cox, Alan Tudyk, Hank Azaria and Colman Domingo voiced the many mechanical co-stars.Simon Stlenhags original artwork electrified us, producer/ director Anthony Russo remarks. Its this strange feeling of familiarity in what hes drawn and also strangeness. Its a historical period that you can recognize whether or not you lived through it, but its not exactly what that period was. There are elements from the 1990s. The story is a parable, producer/director Joe Russo notes. Its less about nostalgia than it is about the idea that technology could have developed faster and maybe deviated humanity from its main path. That was the fun part, thinking through those little elements that allowed us to create a new interpretation of the 1990s. The story taps into the present-day fears of AI usurping its human creators. Part of what we want to do is explore the fact that you can find humanity in technology and inhumanity in humans, states Anthony Russo. 
"We have both of those experiences in our lives and world. Joe and I are technologists. We use technology throughout our lives to tell stories, but at the same time, we all know that technology is powerful and can cause problems, whether the nuclear bomb or social media. It's us recognizing the complex relationship that we all have with technology as human beings."

A complex sequence to execute was 20-foot Herman carrying a Volkswagen campervan containing Michelle, Keats and Cosmo on his shoulder.

Determining how the robots would be created and executed was a major topic of discussion, as they are 80% of the cast. "This is true for all of our projects when you're dealing with a fantasy world that needs to be created from whole cloth, that doesn't exist," states Anthony Russo. "The methodology used to create that is always a question. It is driven by how Joe and I see the movie. What are we trying to achieve? What do we want to do with the scenes? How do we want to stage things? How do we want the actors to interact with other characters in the movie who may not be played on the screen by physical actors? These all become questions in terms of what is the right methodology to use to create the film. We were involved with our Visual Effects Supervisor, Matthew Butler, to determine the proper methodologies. Because there are so many characters in it that don't exist in reality, we had to rely upon visual effects to create a huge portion of the film."

Dennis Gassner and Richard Johnson, who consulted robotics companies, shared production designer duties. "I was in charge of making a real walking and moving Cosmo," states Production Designer Richard Johnson. "I had to go to every robotics company in the world that would listen to me. The immediate refrain was, 'The head is a deal-breaker. It throws him immediately out of balance.' If you look at all of the real robots that are popping up on the Internet today, they all have tiny heads. The other limiting factor was height. They were all in the zone of 5'7" or less. I now know more about real robots than I ever expected to know in my entire life! The robots had to be distinct in their own right. We felt the robots with more screen time needed to be more iconic-looking, so we looked at iconic products or services or things from the last two, three or four decades. Mr. Peanut is a very well-known brand name. We thought, 'He could be a robot. Baseball player. Very iconic. Be a robot.'"

The drones went through a major design change where the heads resemble neurocasters and had a screen that projected the face of the pilot.

Inspiring the imagery was Swedish artist-musician Simon Stålenhag, who has developed a retro-tech and alternative-world visual aesthetic.

Digital Domain was responsible for the shot in which Michelle, Keats, Herman and Cosmo are captured and taken into a mall that has become a communal place for exiled robots.

Wireframe pass of Herman taking Keats (Chris Pratt) for a ride.

Animation pass of Herman taking Keats (Chris Pratt) for a ride.

Final composite pass of Herman taking Keats (Chris Pratt) for a ride.

Approximately 2,000 visual effects shots are in the final film, with Digital Domain and ILM being the main vendors, followed by Storm Studios, One of Us, Lola VFX and an in-house team.

"We're not in a world of magic," observes Visual Effects Supervisor Matthew Butler. "The idea is that these robots were often designed to make us feel comfortable about them serving us. I fought tooth and nail to put in little piston rod push-pulls and things that could justify that Cosmo could actually move. If we designed a particular ball joint or cylindrical actuator or pitch actuator, we made sure that the motion of these robots was restricted to what that could do." Artistic license was taken with the original design by Simon Stålenhag.
"I wanted Cosmo's eyes to have some emotion. Rather than be just a painted pupil as in the book, we made a smoked glass lens for the pupil so you can see behind it that there is a camera. Artistically, we let those eyes have a gratuitous green light to them. Now, you have a twinkle of an eye and can get emotion into that eye. That was another tricky thing. It was about getting enough emotion into them without breaking the silhouette of the design of the robots that we needed to adhere to; that was hard," Butler says.

Keats' (Chris Pratt) robot sidekick, Herman, comes in different sizes. "It was always the Russian doll thing where the one size smaller fits into the one size bigger," Butler remarks. "We did honor the style and personality, but not at the expense of physics. Most of the movie is four-foot Herman with Anthony Mackie's [vocal] and Martin Klebba's [mocap] performances. It's also coming out of the script. He's this sarcastic character. I love his personality, and it came through extremely well. Herman borrowed the power extension cable for his devices and forgot to return it. Meanwhile, all of Keats' food in the fridge has gone bad. Herman has messed up, and he's like a guilty teenager shuffling around on the couch, deliberately avoiding eye contact with Keats because he's this busted little kid. That character is amazing, and it carries through the movie well. Chris Pratt was perfect for this, and it works so well for the two of them. It's most people's favorite relationship in the movie."

Remnants from the robot rebellion are scattered throughout the landscapes.

Following the example of animated features, Lead Storyboard Artist Darrin Denlinger storyboarded and assembled the entire film into an animatic. "I had a version of the movie before we started shooting, or close to when we started shooting, that was a layout," states Jeffrey Ford, Executive Producer and Editor. "It had all the storyboards cut together in sequence with sound, music and subtitles instead of dialogue. I used that as a layout to guide us throughout production so we knew roughly how scenes would play out. Of course, things change on the day; actors re-block the scenes."

The amount of footage captured surpassed Avengers: Infinity War and Avengers: Endgame. Ford explains, "This film had to be made multiple times because when you deal with animated characters carrying this much weight dramatically, those performances are created in passes. You may shoot a proxy pass where Millie interacts with Cosmo, and it's a motion capture actor. Then you may shoot a pass where she is interacting with nothing. We might go back and shoot that same performance on the mocap stage multiple times. We may end up with various iterations of those visual effects as they come in over the months. An enormous number of iterations go on, and when you do that, you generate enormous amounts of footage."

Atlanta doubled as the American Southwest. "Tumbleweeds and sagebrush, the basic things a person sees in the Southwest, do not exist in Atlanta or anywhere near there," Johnson notes. "We had to fill several trucks with all that stuff and bring it into Atlanta. Parts of Atlanta have not changed for years. Through the camera's eye, if you wanted to say that it was the 1970s or 1980s, it was easy because nobody had built modern or contemporary homes in those neighborhoods for whatever reason. The same thing happened when selecting the locations for the battles in the city. If you put in the right period of car, voilà! You're in that era." A question that had to be answered was where the exiled robots responsible for the uprising would live. "The X was in the script and is a large area in the Southwest. Where would these guys go? A country club? A department store? Football stadium? We landed on a shopping mall. It dawned on me one day that the only reason they would go to a place like this is to recharge themselves or maybe for repairs. That's why in the shopping mall, you see a lot of wires, batteries and charging stations."

It was imperative for believability that Herman's movements be grounded in real physics.

Lighting played a key role in seamlessly integrating live-action Michelle with CG Cosmo.

Humans become more robotic because of their addiction to the neurocaster, which can transmit ideas and feelings via a virtual network.

Along with the final battle, which is almost entirely synthetic apart from Keats' interaction with the grass, a virtual meeting occurs between antagonists Ethan Skate (Stanley Tucci) and Colonel Bradbury (Giancarlo Esposito). "It's where Ethan pitches the Colonel to go into the X to get Cosmo back," Butler recalls. "Bradbury puts on the neurocaster and teleports into this virtual environment that takes place in this beautiful mountain lake scene. Skate is standing in the middle of the lake while Bradbury is on terra firma and gently steps out to have a conversation with him. Production didn't go to Norway. It's just a couple of guys and girls from Storm Studios [located in Oslo] with some backpacks hiking out into some beautiful locations in the summer for reference. We had a shallow pool with plexiglass an inch under the surface so we could have Stanley and Giancarlo stand in the water. The only thing that we kept was the little bit of interaction of water locally right at their feet while the rest was digital water. The ripples needed to come out and propagate out into the lake as it would for real. It's absolutely stunning work."

The hardest shot to execute was when Cosmo, Herman, Keats and Michelle (Millie Bobby Brown) have been captured and are escorted into the mall.
"They walk into this forecourt and finally see that the whole mall is filled with robots," Ford recalls. "That shot was incredibly hard and took us months to do. The only things in that shot are Chris Pratt, Millie Bobby Brown and an empty mall. I drew maps of each frame, and we did a whole progression. We talked about where the different robots were, what they were doing, what their day was like, where they were going next and why they were moving in a certain way. We wanted it to feel like a real city. If you were to block out extras, those people would all come up with all of their mini-stories, and they would work it out. But we didn't have that. We had to figure it all out as animators. It was fun but brutal. Digital Domain did an incredible job on it. I hope people will stop and rewind the shot because it's beautifully detailed and feels completely real."