• Retail Reboot: Major Global Brands Transform End-to-End Operations With NVIDIA

    AI is packing and shipping efficiency for the retail and consumer packaged goods (CPG) industries, with a majority of surveyed companies in the space reporting the technology is increasing revenue and reducing operational costs.
    Global brands are reimagining every facet of their businesses with AI, from how products are designed and manufactured to how they’re marketed, shipped and experienced in-store and online.
    At NVIDIA GTC Paris at VivaTech, industry leaders including L’Oréal, LVMH and Nestlé shared how they’re using tools like AI agents and physical AI — powered by NVIDIA AI and simulation technologies — across every step of the product lifecycle to enhance operations and experiences for partners, customers and employees.
    3D Digital Twins and AI Transform Marketing, Advertising and Product Design
    The meeting of generative AI and 3D product digital twins results in unlimited creative potential.
    Nestlé, the world’s largest food and beverage company, today announced a collaboration with NVIDIA and Accenture to launch a new, AI-powered in-house service that will create high-quality product content at scale for e-commerce and digital media channels.
    The new content service, based on digital twins powered by the NVIDIA Omniverse platform, creates exact 3D virtual replicas of physical products. Product packaging can be adjusted or localized digitally, enabling seamless integration into various environments, such as seasonal campaigns or channel-specific formats. This means that new creative content can be generated without having to constantly reshoot from scratch.
    Image courtesy of Nestlé
    The service is developed in partnership with Accenture Song, using Accenture AI Refinery built on NVIDIA Omniverse for advanced digital twin creation. It uses NVIDIA AI Enterprise for generative AI, hosted on Microsoft Azure for robust cloud infrastructure.
    Nestlé already has a baseline of 4,000 3D digital products — mainly for global brands — with the ambition to convert a total of 10,000 products into digital twins in the next two years across global and local brands.
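    The article doesn’t detail Nestlé’s pipeline, but the localization workflow it describes maps naturally onto OpenUSD, the scene description format underlying NVIDIA Omniverse, where one product asset can carry several packaging versions as variants. The sketch below is purely illustrative under that assumption; the stage layout, prim paths and variant names are hypothetical, not Nestlé’s actual setup.

```python
# Illustrative sketch only: modeling channel- or locale-specific packaging
# variants on a product digital twin in OpenUSD (used by NVIDIA Omniverse).
# Paths and variant names are hypothetical placeholders.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("product_twin.usda")
product = UsdGeom.Xform.Define(stage, "/Product").GetPrim()

# A variant set lets a single asset carry several packaging versions.
packaging = product.GetVariantSets().AddVariantSet("packaging")
for name in ["global_default", "fr_seasonal", "ecommerce_square"]:
    packaging.AddVariant(name)

# Author overrides inside a variant without duplicating the asset.
packaging.SetVariantSelection("fr_seasonal")
with packaging.GetVariantEditContext():
    # e.g. reference localized label artwork here; a placeholder scope stands in for it.
    stage.DefinePrim("/Product/LabelArt_FR", "Scope")

# Downstream renders pick a variant per campaign instead of reshooting the product.
packaging.SetVariantSelection("global_default")
stage.GetRootLayer().Save()
```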
    LVMH, the world’s leading luxury goods company, home to 75 distinguished maisons, is bringing 3D digital twins to its content production processes through its wine and spirits division, Moët Hennessy.
    The group partnered with content configuration engine Grip to develop a solution using the NVIDIA Omniverse platform, which enables the creation of 3D digital twins that power content variation production. With Grip’s solution, Moët Hennessy teams can quickly generate digital marketing assets and experiences to promote luxury products at scale.
    The initiative, led by Capucine Lafarge and Chloé Fournier, has been recognized by LVMH as a leading approach to scaling content creation.
    Image courtesy of Grip
    L’Oréal Gives Marketing and Online Shopping an AI Makeover
    Innovation starts at the drawing board. Today, that board is digital — and it’s powered by AI.
    L’Oréal Groupe, the world’s leading beauty player, announced its collaboration with NVIDIA today. Through this collaboration, L’Oréal and its partner ecosystem will leverage the NVIDIA AI Enterprise platform to transform its consumer beauty experiences, marketing and advertising content pipelines.
    “AI doesn’t think with the same constraints as a human being. That opens new avenues for creativity,” said Anne Machet, global head of content and entertainment at L’Oréal. “Generative AI enables our teams and partner agencies to explore creative possibilities.”
    CreAItech, L’Oréal’s generative AI content platform, is augmenting the creativity of marketing and content teams. Combining a modular ecosystem of models, expertise, technologies and partners — including NVIDIA — CreAItech empowers marketers to generate thousands of unique, on-brand images, videos and lines of text for diverse platforms and global audiences.
    The solution empowers L’Oréal’s marketing teams to quickly iterate on campaigns that improve consumer engagement across social media, e-commerce content and influencer marketing — driving higher conversion rates.

    Noli.com, the first AI-powered multi-brand marketplace startup founded and backed by the L’Oréal Groupe, is reinventing how people discover and shop for beauty products.
    Noli’s AI Beauty Matchmaker experience uses L’Oréal Groupe’s century-long expertise in beauty, including its extensive knowledge of beauty science, beauty tech and consumer insights, built from over 1 million skin data points and analysis of thousands of product formulations. It gives users a BeautyDNA profile with expert-level guidance and personalized product recommendations for skincare and haircare.
    “Beauty shoppers are often overwhelmed by choice and struggling to find the products that are right for them,” said Amos Susskind, founder and CEO of Noli. “By applying the latest AI models accelerated by NVIDIA and Accenture to the unparalleled knowledge base and expertise of the L’Oréal Groupe, we can provide hyper-personalized, explainable recommendations to our users.” 

    The Accenture AI Refinery, powered by NVIDIA AI Enterprise, will provide the platform for Noli to experiment and scale. Noli’s new agent models will use NVIDIA NIM and NVIDIA NeMo microservices, including NeMo Retriever, running on Microsoft Azure.
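    For context on what “using NVIDIA NIM microservices” typically looks like in practice: LLM NIM containers expose an OpenAI-compatible REST endpoint. The snippet below is a minimal sketch of calling such an endpoint; the URL, model ID and prompt are placeholder assumptions, not Noli’s actual recommendation pipeline.

```python
# Minimal sketch of querying an LLM NIM microservice via its OpenAI-compatible API.
# Endpoint URL, model id and prompt are placeholders, not Noli's production setup.
import requests

NIM_URL = "http://localhost:8000/v1/chat/completions"  # assumed local NIM deployment

payload = {
    "model": "meta/llama-3.1-8b-instruct",  # example NIM model id
    "messages": [
        {"role": "system", "content": "You are a beauty-product recommendation assistant."},
        {"role": "user", "content": "Suggest a routine for dry, sensitive skin."},
    ],
    "max_tokens": 256,
}

resp = requests.post(NIM_URL, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```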
    Rapid Innovation With the NVIDIA Partner Ecosystem
    NVIDIA’s ecosystem of solution provider partners empowers retail and CPG companies to innovate faster, personalize customer experiences, and optimize operations with NVIDIA accelerated computing and AI.
    Global digital agency Monks is reshaping the landscape of AI-driven marketing, creative production and enterprise transformation. At the heart of their innovation lies the Monks.Flow platform that enhances both the speed and sophistication of creative workflows through NVIDIA Omniverse, NVIDIA NIM microservices and Triton Inference Server for lightning-fast inference.
    AI image solutions provider Bria is helping retail giants like Lidl and L’Oréal to enhance marketing asset creation. Bria AI transforms static product images into compelling, dynamic advertisements that can be quickly scaled for use across any marketing need.
    The company’s generative AI platform uses NVIDIA Triton Inference Server software and the NVIDIA TensorRT software development kit for accelerated inference, as well as NVIDIA NIM and NeMo microservices for quick image generation at scale.
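    As a rough illustration of what serving a model behind Triton Inference Server involves, here is a sketch of a client-side inference request over HTTP. The model name and tensor names are placeholder assumptions; Bria’s deployment details are not described in the article.

```python
# Illustrative sketch of querying a model served by NVIDIA Triton Inference Server
# over HTTP. Model name and tensor names ("INPUT", "OUTPUT") are placeholders.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# A dummy batch of one 512x512 RGB image, values in [0, 1].
image = np.random.rand(1, 3, 512, 512).astype(np.float32)

inputs = [httpclient.InferInput("INPUT", list(image.shape), "FP32")]
inputs[0].set_data_from_numpy(image)
outputs = [httpclient.InferRequestedOutput("OUTPUT")]

result = client.infer(model_name="image_generator", inputs=inputs, outputs=outputs)
print(result.as_numpy("OUTPUT").shape)
```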
    Physical AI Brings Acceleration to Supply Chain and Logistics
    AI’s impact extends far beyond the digital world. Physical AI-powered warehousing robots, for example, are helping maximize efficiency in retail supply chain operations. Four in five retail companies have reported that AI has helped reduce supply chain operational costs, with 25% reporting cost reductions of at least 10%.
    Technology providers Lyric, KoiReader Technologies and Exotec are tackling the challenges of integrating AI into complex warehouse environments.
    Lyric is using the NVIDIA cuOpt GPU-accelerated solver for warehouse network planning and route optimization, and is collaborating with NVIDIA to apply the technology to broader supply chain decision-making problems. KoiReader Technologies is tapping the NVIDIA Metropolis stack for its computer vision solutions within logistics, supply chain and manufacturing environments using the KoiVision Platform. And Exotec is using NVIDIA CUDA libraries and the NVIDIA JetPack software development kit for embedded robotic systems in warehouse and distribution centers.
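    To make the routing problem concrete without guessing at cuOpt’s API, the toy heuristic below greedily builds a single-vehicle route over a travel-cost matrix. It is explicitly not cuOpt: it only illustrates the class of vehicle-routing problem that cuOpt solves at far larger scale (fleets, capacities, time windows) with GPU acceleration.

```python
# Not the cuOpt API: a toy nearest-neighbor heuristic over a travel-cost matrix,
# illustrating the kind of routing problem NVIDIA cuOpt solves at scale on GPUs.
import numpy as np

def nearest_neighbor_route(cost: np.ndarray, depot: int = 0) -> list[int]:
    """Greedily visit the cheapest unvisited stop next, starting and ending at the depot."""
    n = cost.shape[0]
    unvisited = set(range(n)) - {depot}
    route, current = [depot], depot
    while unvisited:
        nxt = min(unvisited, key=lambda j: cost[current, j])
        route.append(nxt)
        unvisited.remove(nxt)
        current = nxt
    route.append(depot)
    return route

# Symmetric random costs between a depot and four stops (placeholder data).
rng = np.random.default_rng(0)
c = rng.integers(1, 20, size=(5, 5)).astype(float)
c = (c + c.T) / 2
np.fill_diagonal(c, 0)

route = nearest_neighbor_route(c)
print("route:", route, "cost:", sum(c[a, b] for a, b in zip(route, route[1:])))
```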
    From real-time robotics orchestration to predictive maintenance, these solutions are delivering impact on uptime, throughput and cost savings for supply chain operations.
    Learn more by joining a follow-up discussion on digital twins and AI-powered creativity with Microsoft, Nestlé, Accenture and NVIDIA at Cannes Lions on Monday, June 16.
    Watch the NVIDIA GTC Paris keynote from NVIDIA founder and CEO Jensen Huang at VivaTech, and explore GTC Paris sessions.
  • Games Inbox: Would Xbox ever shut down Game Pass?

    Game Pass – will it continue forever? (Microsoft)
    The Monday letters page struggles to predict what’s going to happen with the PlayStation 6, as one reader sees their opinion of the Switch 2 change over time.
    To join in with the discussions yourself email gamecentral@metro.co.uk
    Final Pass
    I agree with a lot of what was said about the current state of Xbox in the Reader’s Feature this weekend and how the more Microsoft spends, and the more companies they own, the less they seem to be in control. Which is very strange really.
    The biggest recent failure has got to be Game Pass, which has not had the impact they expected and yet they don’t seem ready to acknowledge that. If they’re thinking of increasing the price again, like those rumours say, then I think that will be the point at which you can draw a line under the whole idea and admit it’s never going to catch on.
    But would Microsoft ever shut down Game Pass completely? I feel that would almost be more humiliating than stopping making consoles, so I can’t really imagine it. Instead, they’ll make it more and more expensive and put more and more restrictions on day one games until it’s no longer recognisable.
    Grackle
    Panic button
    Strange to see Sony talking relatively openly about Nintendo and Microsoft as competition. I can’t remember the last time they mentioned either of them, even if they obviously would prefer not to have, if they hadn’t been asked by investors.
    At no point did they acknowledge that the Switch has completely outsold both their last two consoles, so I’m not sure where their confidence comes from. I guess it’s from the fact that they know they’ve done nothing this gen and still come out on top, so from their perspective they’ve got plenty in reserve.

    Having your panic button being ‘do anything at all’ must be pretty reassuring really. Nintendo has had to work to get where they are with the Switch but Sony is just coasting it.
    Lupus
    James’ Ladder
    Jacob’s Ladder is a film I’ve been meaning to watch for a while, and I guessed the ending quite early on, but it feels like a Silent Hill film. I don’t know if you guys have seen it but it’s an excellent film and the hospital scene near the end, and the cages blocking off the underground early on, just remind me of the game.
    A depressing film overall but worth a watch.
    Simon
    GC: Jacob’s Ladder was a major influence on Silent Hill 2 in particular, even the jacket James is wearing is the same.
    Email your comments to: gamecentral@metro.co.uk
    Seeing the future
    I know everyone likes to think of themselves as Nostradamus, but I have to admit I have absolutely no clue what Sony is planning for the PlayStation 6. A new console that is just the usual update, that sits under your TV, is easy enough to imagine but surely they’re not going to do that again?
    But the idea of having new home and portable machines that come out at the same time seems so unlikely to me. Surely the portable wouldn’t be a separate format, but I can’t see it being any kind of portable that runs its own games because it’d never be as powerful as the home machine. So, it’s really just a PlayStation Portal 2?
    Like I said, I don’t know, but for some reason I have a bad feeling about the next gen and whatever Sony does end up unveiling. I suspect that whatever they and Microsoft do it’s going to end up making the Switch 2 (and PC) seem even more appealing by comparison.
    Gonch
    Hidden insight
    I’m not going to say that Welcome Tour is a good game but what I will say is that I found it very interesting at times and I’m actually kind of surprised that Nintendo revealed some of the information that they did. Most of it could probably be found out by reverse engineering it and just taking it apart but I’m still surprised it went into as much detail as it did.
    You’re right that it’s all presented in a very dull way but personally I found the ‘Insights’ to be the best part of the game. The minigames really are not very good and I was always glad when they were over. So, while I would not necessarily recommend the game (it’s not really a game) I would say that it can be of interest to people who have an interest in how consoles work and how Nintendo think.
    Mogwai
    Purchase privilege
    I’ve recently had the privilege of buying Clair Obscur: Expedition 33 from the website CDKeys, using a 10% discount code. I was lucky enough to only spend a total of £25.99; much cheaper than purchasing the title for console. If only Ubisoft had the foresight to see what they allowed to slip through their fingers. I’d also like to mention that from what I’ve read quite recently, and a couple of mixed views, I don’t see myself cancelling my Switch 2. On the contrary, it’s just coming across as a disappointment.
    From the battery life to the lack of launch titles, an empty open world is never a smart choice to make; not even Mario is safe from that. That leaves the upcoming ROG Xbox Ally that’s recently been showcased and is set for an October launch.
    I won’t lie, it does look in the same vein as the Switch 2, far too similar to the ROG Ally X model. Just with grips and a dedicated Xbox button. The Z2 Extreme chip has me intrigued, however. How much of a transcendental shift it makes is another question. I’ll have to wait to receive official confirmation for a price and release date. But there’s also a Lenovo Legion Go 2 waiting in the wings. I hope we hear more information soon. Preferably before my 28th in August.
    Shahzaib Sadiq
    Tip of the iceberg
    Interesting to hear about Cyberpunk 2077 running well on the Switch 2. I think if they’re getting that kind of performance at launch, from a third party not used to working with Nintendo hardware, that bodes very well for the future.
    I think we’re probably underestimating the Switch 2 a lot at the moment and stuff we’ll be seeing in two or three years is going to be amazing, I predict. What I can’t predict is when we’ll hear about any of this. I really hope there’s a Nintendo Direct this week.
    Dano
    Changing opinions
    So, just a little over a week with the Switch 2 and, after initially feeling incredibly meh about the new console and Mario Kart, a little more playtime has left me more optimistic about the console and much more positive about Mario Kart World.
    It did feel odd having a new console from Nintendo that didn’t inspire that childlike excitement. An iterative upgrade isn’t very exciting and as I own a Steam Deck the advancements in processing weren’t all that exciting either. I can imagine someone who only bought an OG Switch back in 2017 really noticing the improvements but if you bought an OLED it’s basically a Switch Pro (minus the OLED).
    The criminally low level of software support doesn’t help. I double dipped Street Fighter 6 only to discover I can’t transfer progress or DLC across from my Xbox, which sort of means if I want both profiles to have parity I have to buy everything twice! I also treated myself to a new Pro Controller and find using it for Street Fighter almost unplayable as the L and ZL buttons are far too easy to accidentally press when playing.
    Mario Kart initially felt like more of the same and it was only after I made an effort to explore the world map, unlock characters and karts, and try the new grinding/ollie mechanic that it clicked. I am now really enjoying it, especially the remixed soundtracks.
    I do however want more Switch 2 exclusive experiences – going back through my back catalogue for improved frame rates doesn’t cut it Nintendo! As someone with a large digital library the system transfer was very frustrating and the new virtual cartridges are just awful – does a Switch 2 need to be online all the time now? Not the best idea for a portable system.
    So, the start of a new console lifecycle and hopefully lots of new IP – I suspect Nintendo will try and get us to revisit our back catalogues first though.
    BristolPete
    Inbox also-rans
    Just thought I would mention that if anyone’s interested in purchasing the Mortal Kombat 1 Definitive Edition, which includes all DLC, that it’s currently an absolute steal on the Xbox store at £21.99.
    Nick The Greek
    I’ve just won my first Knockout Tour online race on Mario Kart World! I’ve got to say, the feeling is magnificent.
    Rable

    Email your comments to: gamecentral@metro.co.uk
    The small print
    New Inbox updates appear every weekday morning, with special Hot Topic Inboxes at the weekend. Readers’ letters are used on merit and may be edited for length and content.
    You can also submit your own 500 to 600-word Reader’s Feature at any time via email or our Submit Stuff page, which if used will be shown in the next available weekend slot.
    You can also leave your comments below and don’t forget to follow us on Twitter.
  • Urban Adaptations – Devonport Tomorrow exhibition coming up at Depot Artspace

    This collaborative project shares creative propositions for the future development of Devonport village on Auckland’s North Shore, from an overall masterplan and individual sites worked up in models and visualisations.
    Led by Devonport locals Julie Stout (Te Kāhui Whaihanga New Zealand Institute of Architects gold medal recipient) and architect Ken Davis, this exhibition features the work of 18 Architecture Masters students from the University of Auckland School of Architecture and Planning.
    Urban Adaptations – Devonport Tomorrow
    Wednesday 16 July – Sunday 27 July 2025
    Exhibition opening: Wednesday 16 July at 3 Victoria Road, 6pm to 8pm
    Venue: Depot Artspace, 3 Victoria Road, Devonport
    Urban Adaptations – Devonport Tomorrow dovetails with the exhibition/installation Building (Under the Volcano) at the Whare Toi. This project is a collaboration between artist Richard Reddaway (Massey University College of Creative Arts), designer and architectural historian Kate Linzey (The Architectural Centre), and architect Matt Liggins and architecture students from the University of Auckland’s Bachelor of Architectural Studies. It explores suburban built environments and the genealogy of forms that constitute Te Hau Kapua Devonport to ponder relationships to the whenua, how we choose to create our homes and how different cultural understandings and expressions of home shape our suburban environment.
    Building (Under the Volcano)
    Monday 14 July – Saturday 19 July 2025
    The Depot’s Whare Toi, Kerr Street, Devonport
    Public Programmes
    Architecture and urban development panel discussion, lectures and films at The Vic are planned over the duration of the exhibition (to be advised).
    #urban #adaptations #devonport #tomorrow #exhibition
    Urban Adaptations – Devonport Tomorrow exhibition coming up at Depot Artspace
    This collaborative project shares creative propositions for the future development of Devonport village on Auckland’s North Shore, from an overall masterplan and individual sites worked up in models and visualisations. Led by Devonport locals Julie Stoutand architect Ken Davis, this exhibition features the work of 18 Architecture Masters students from the University of Auckland School of Architecture and Planning. Urban Adaptations – Devonport TomorrowWednesday 16 July – Sunday 27 July 2025 Exhibition opening: Wednesday 16 July at 3 Victoria Road, 6pm to 8pmVenue: Depot Artspace, 3 Victoria Road, DevonportUrban Adaptations – Devonport Tomorrow dovetails with the exhibition/installation Buildingat the Whare Toi. This project is a collaboration between artist Richard Reddaway, designer and architectural historian Kate Linzey, and architect Matt Liggins and architecture students from the University of Auckland’s Bachelor of Architectural Studies. It explores suburban built environments and the genealogy of forms that constitute Te Hau Kapua Devonport to ponder relationships to the whenua, how we choose to create our homes and how different cultural understandings and expressions of home shape our suburban environment. BuildingMonday 14 July – Saturday 19 July 2025The Depot’s Whare Toi, Kerr Street, Devonport   Public Programmes Architecture and urban development panel discussion, lectures and films at The Vic are planned over the duration of the exhibition. #urban #adaptations #devonport #tomorrow #exhibition
    ARCHITECTURENOW.CO.NZ
    Urban Adaptations – Devonport Tomorrow exhibition coming up at Depot Artspace
    This collaborative project shares creative propositions for the future development of Devonport village on Auckland’s North Shore, from an overall masterplan and individual sites worked up in models and visualisations. Led by Devonport locals Julie Stout (Te Kāhui Whaihanga New Zealand Institute of Architects gold medal recipient) and architect Ken Davis, this exhibition features the work of 18 Architecture Masters students from the University of Auckland School of Architecture and Planning. Urban Adaptations – Devonport TomorrowWednesday 16 July – Sunday 27 July 2025 Exhibition opening: Wednesday 16 July at 3 Victoria Road, 6pm to 8pmVenue: Depot Artspace, 3 Victoria Road, DevonportUrban Adaptations – Devonport Tomorrow dovetails with the exhibition/installation Building (Under the Volcano) at the Whare Toi. This project is a collaboration between artist Richard Reddaway (Massey University College of Creative Arts), designer and architectural historian Kate Linzey (The Architectural Centre), and architect Matt Liggins and architecture students from the University of Auckland’s Bachelor of Architectural Studies. It explores suburban built environments and the genealogy of forms that constitute Te Hau Kapua Devonport to ponder relationships to the whenua, how we choose to create our homes and how different cultural understandings and expressions of home shape our suburban environment. Building (Under the Volcano)Monday 14 July – Saturday 19 July 2025The Depot’s Whare Toi, Kerr Street, Devonport   Public Programmes Architecture and urban development panel discussion, lectures and films at The Vic are planned over the duration of the exhibition (to be advised).
  • Inside Mark Zuckerberg’s AI hiring spree

    WWW.THEVERGE.COM
    Inside Mark Zuckerberg’s AI hiring spree
AI researchers have recently been asking themselves a version of the question, “Is that really Zuck?” As first reported by Bloomberg, the Meta CEO has been personally asking top AI talent to join his new “superintelligence” AI lab and reboot Llama. His recruiting process typically goes like this: a cold outreach via email or WhatsApp that cites the recruit’s work history and requests a 15-minute chat. Dozens of researchers have gotten these kinds of messages at Google alone. For those who do agree to hear his pitch (amazingly, not all of them do), Zuckerberg highlights the latitude they’ll have to make risky bets, the scale of Meta’s products, and the money he’s prepared to invest in the infrastructure to support them. He makes clear that this new team will be empowered and sit with him at Meta’s headquarters, where I’m told the desks have already been rearranged for the incoming team.

Most of the headlines so far have focused on the eye-popping compensation packages Zuckerberg is offering, some of which are well into the eight-figure range. As I’ve covered before, hiring the best AI researcher is like hiring a star basketball player: there are very few of them, and you have to pay up. Case in point: Zuckerberg basically just paid 14 Instagrams to hire away Scale AI CEO Alexandr Wang. It’s easily the most expensive hire of all time, dwarfing the billions that Google spent to rehire Noam Shazeer and his core team from Character.AI (a deal Zuckerberg passed on). “Opportunities of this magnitude often come at a cost,” Wang wrote in his note to employees this week. “In this instance, that cost is my departure.”

Zuckerberg’s recruiting spree is already starting to rattle his competitors. The day before his offer deadline for some senior OpenAI employees, Sam Altman dropped an essay proclaiming that “before anything else, we are a superintelligence research company.” And after Zuckerberg tried to hire DeepMind CTO Koray Kavukcuoglu, he was given a larger SVP title and now reports directly to Google CEO Sundar Pichai. I expect Wang to have the title of “chief AI officer” at Meta when the new lab is announced. Jack Rae, a principal researcher from DeepMind who has signed on, will lead pre-training.

Meta certainly needs a reset. According to my sources, Llama has fallen so far behind that Meta’s product teams have recently discussed using AI models from other companies (although that is highly unlikely to happen). Meta’s internal coding tool for engineers, however, is already using Claude. While Meta’s existing AI researchers have good reason to be looking over their shoulders, Zuckerberg’s $14.3 billion investment in Scale is making many longtime employees, or Scaliens, quite wealthy. They were popping champagne in the office this morning. Then, Wang held his last all-hands meeting to say goodbye and cried. He didn’t mention what he would be doing at Meta. I expect his new team will be unveiled within the next few weeks after Zuckerberg gets a critical number of members to officially sign on.

Tim Cook. Getty Images / The Verge

Apple’s AI problem

Apple is accustomed to being on top of the tech industry, and for good reason: the company has enjoyed a nearly unrivaled run of dominance. After spending time at Apple HQ this week for WWDC, I’m not sure that its leaders appreciate the meteorite that is heading their way. The hubris they display suggests they don’t understand how AI is fundamentally changing how people use and build software.

Heading into the keynote on Monday, everyone knew not to expect the revamped Siri that had been promised the previous year. Apple, to its credit, acknowledged that it dropped the ball there, and it sounds like a large language model rebuild of Siri is very much underway and coming in 2026.

The AI industry moves much faster than Apple’s release schedule, though. By the time Siri is perhaps good enough to keep pace, it will have to contend with the lock-in that OpenAI and others are building through their memory features. Apple and OpenAI are currently partners, but both companies want to ultimately control the interface for interacting with AI, which puts them on a collision course.

Apple’s decision to let developers use its own, on-device foundational models for free in their apps sounds strategically smart, but unfortunately, the models look far from leading. Apple ran its own benchmarks, which aren’t impressive, and has confirmed a measly context window of 4,096 tokens. It’s also saying that the models will be updated alongside its operating systems — a snail’s pace compared to how quickly AI companies move. I’d be surprised if any serious developers use these Apple models, although I can see them being helpful to indie devs who are just getting started and don’t want to spend on the leading cloud models. I don’t think most people care about the privacy angle that Apple is claiming as a differentiator; they are already sharing their darkest secrets with ChatGPT and other assistants.

Some of the new Apple Intelligence features I demoed this week were impressive, such as live language translation for calls. Mostly, I came away with the impression that the company is heavily leaning on its ChatGPT partnership as a stopgap until Apple Intelligence and Siri are both where they need to be.

AI probably isn’t a near-term risk to Apple’s business. No one has shipped anything close to the contextually aware Siri that was demoed at last year’s WWDC. People will continue to buy Apple hardware for a long time, even after Sam Altman and Jony Ive announce their first AI device for ChatGPT next year. AR glasses aren’t going mainstream anytime soon either, although we can expect to see more eyewear from Meta, Google, and Snap over the coming year. In aggregate, these AI-powered devices could begin to siphon away engagement from the iPhone, but I don’t see people fully replacing their smartphones for a long time.

The bigger question after this week is whether Apple has what it takes to rise to the occasion and culturally reset itself for the AI era. I would have loved to hear Tim Cook address this issue directly, but the only interview he did for WWDC was a cover story in Variety about the company’s new F1 movie.

Elsewhere

AI agents are coming. I recently caught up with Databricks CEO Ali Ghodsi ahead of his company’s annual developer conference this week in San Francisco. Given Databricks’ position, he has a unique, bird’s-eye view of where things are headed for AI. He doesn’t envision a near-term future where AI agents completely automate real-world tasks, but he does predict a wave of startups over the next year that will come close to completing actions in areas such as travel booking. He thinks humans will need (and want) to approve what an agent does before it goes off and completes a task. “We have most of the airplanes flying automated, and we still want pilots in there.”

Buyouts are the new normal at Google. That much is clear after this week’s rollout of the “voluntary exit program” in core engineering, the Search organization, and some other divisions. In his internal memo, Search SVP Nick Fox was clear that management thinks buyouts have been successful in other parts of the company that have tried them. In a separate memo I saw, engineering exec Jen Fitzpatrick called the buyouts an “opportunity to create internal mobility and fresh growth opportunities.” Google appears to be attempting a cultural reset, which will be a challenging task for a company of its size. We’ll see if it can pull it off.

Evan Spiegel wants help with AR glasses. I doubt that his announcement that consumer glasses are coming next year was solely aimed at AR developers. Telegraphing the plan and announcing that Snap has spent $3 billion on hardware to date feels more aimed at potential partners that want to make a bigger glasses play, such as Google. A strategic investment could help insulate Snap from the pain of the stock market. A full acquisition may not be off the table, either. When he was recently asked if he’d be open to a sale, Spiegel didn’t shut it down like he always has, but instead said he’d “consider anything” that helps the company “create the next computing platform.”

Link list

More to click on:

If you haven’t already, don’t forget to subscribe to The Verge, which includes unlimited access to Command Line and all of our reporting.

As always, I welcome your feedback, especially if you’re an AI researcher fielding a juicy job offer. You can respond here or ping me securely on Signal.

Thanks for subscribing.
  • Making a killing: The playful 2D terror of Psycasso®

    UNITY.COM
    Making a killing: The playful 2D terror of Psycasso®
A serial killer is stalking the streets, and his murders are a work of art. That’s more or less the premise behind Psycasso®, a tongue-in-cheek 2D pixel art game from Omni Digital Technologies that’s debuting a demo at Steam Next Fest this week, with plans to head into Early Access later this year. Playing as the killer, you get a job and build a life by day, then hunt the streets by night to find and torture victims, paint masterpieces with their blood, then sell them to fund operations.

I sat down with lead developer Benjamin Lavender and designer and producer Omni to talk about this playfully gory game that gives a classic retro style a fresh (if gruesome) twist.

Let’s start with a bit of background about the game.

Omni: We wanted to make something that stands out. We know a lot of indie studios are releasing games and the market is ever growing, so we wanted to make something that’s not just fun to play, but catches people’s attention when others tell them about it. We’ve created an open-world pixel art game about an artist who spends his day getting a job, trying to fit into society. Then at nighttime, things take a more sinister turn and he goes around and makes artwork out of his victims’ blood. We didn’t want to make it creepy and gory. We kind of wanted it to be cutesy and fun, just to make it ironic. Making it was a big challenge. We basically had to create an entire city with functioning shops and NPCs who have their own lives, their own hobbies. It was a huge challenge.

So what does the actual gameplay look like?

Omni: There’s a day cycle and a night cycle that breaks up the gameplay. During the day, you can get a job, level up skills, buy properties and furniture upgrades. At nighttime, the lighting completely changes, the vibe completely changes, there’s police on the street and the flow of the game shifts. The idea is that you can kidnap NPCs using a whole bunch of different weapons – guns, throwable grenades, little traps and cool stuff that you can capture people with. Once captured on the street, you can either harvest their blood and body parts there, or buy a specialist room to keep them in a cage and put them in various equipment like hanging chains or torture chairs. The player gets better rewards for harvesting blood and body parts this way. On the flip side, there’s a whole other element to the game where the player is given missions each week from galleries around the city. They come up on your phone menu, and you can accept them and do either portrait or landscape paintings, with all of the painting being done using only shades of red. We’ve got some nice drip effects and splat sounds to make it feel like you’re painting with blood. Then you can give your creation a name, submit it to a gallery, then it goes into a fake auction, people will bid on the artwork and you get paid a large amount of in-game money so you can then buy upgrades for the home, upgrade painting tools like bigger paint brushes, more selection tools, stuff like that.

Ben: There’s definitely nothing like it. And that was the aim: when you are telling people about it, they’re like, “Oh, okay. Right. We’re not going to forget about this.”

Let’s dig into the 2D tools you used to create this world.

Ben: It’s using the 2D Renderer. The Happy Harvest 2D sample project that you guys made was kind of a big starting point, from a lighting perspective, and doing the normal maps of the 2D and getting the lighting to look nice. Our night system is a very stripped-down, then added-on version of the thing that you guys made. I was particularly interested by its shadows. The building’s shadows aren’t actually shadows – it’s a black light. We tried to recreate that with all of our buildings in the entire open world – so it does look beautiful for a 2D game, if I do say so myself.

Can you say a bit about how you’re using AI or procedural generation in NPCs?

Ben: I don’t know how many actually made it into the demo, to be fair, number-wise. Every single NPC has a unique identity, as in they all have a place of work that they go to on a regular schedule. They have hobbies, they have spots where they prefer to loiter, a park bench or whatever. So you can get to know everyone’s individual lifestyle. So, the old man that lives in the same building as me might love to go to the casino at nighttime, or go consistently on a Monday and a Friday, that kind of vibe. It uses the A* Pathfinding Project, because we knew we wanted to have a lot of AIs. We’ve locked off most of the city for the demo, but the actual size of the city is huge. The police mechanics are currently turned off, but 80% of the police mechanics are in there as well. If you punch someone or hurt someone, that’s a crime, and if anyone sees it, they can go and report to the police and then things happen. That’s a feature that’s there but not demo-ready yet.

How close would you say you are to a full release?

Omni: We should be scheduled for October for early access. By that point we’ll have the stealth mechanics and the policing systems polished and in, and get some of the other upcoming features buttoned up. We’re fairly close.

Ben: Lots of it’s already done, it’s just turned off for the demo. We don’t want to overwhelm people because there’s just so much for the player to do.

Tell me a bit about the paint mechanics – how did you build that?

Ben: It is custom. We built it ourselves completely from scratch. But I can’t take responsibility for that one – someone else did the whole thing – that was their baby. It is really, really cool though.

Omni: It’s got a variety of masking tools, the ability to change opacity and spacing, you can undo, redo. It’s a really fantastic feature that gives people the opportunity to express themselves and make some great art.

Ben: And it’s gamified, so it doesn’t feel like you’ve just opened up Paint in Windows.

Omni: Best of all is when you make a painting, it gets turned into an inventory item so you physically carry it around with you and can sell it or treasure it.

What’s the most exciting part of Psycasso for you?

Omni: Stunning graphics. I think graphically, it looks really pretty.

Ben: Visually, you could look at it and go, “Oh, that’s Psycasso.”

Omni: What we’ve done is taken a cozy retro-style game, and we’ve brought modern design, logic, and technology into it. So you’re playing what feels like a nostalgic game, but you’re getting the experience of a much newer project.

Check out the Psycasso demo on Steam, and stay tuned for more Next Fest coverage.
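Ben’s description of the NPC routines (each character keeping a weekly timetable of a workplace, hobby spots and loitering points, with the A* Pathfinding Project only asked for a route to the current destination) can be pictured with a small sketch. This is a hypothetical outline in Python, not the studio’s actual Unity code; the class names, the stubbed find_path helper and the schedule slots are all assumptions for illustration.

```python
import random
from dataclasses import dataclass, field

# Hypothetical sketch of a weekly NPC routine: each NPC owns a timetable of
# (day, hour) -> destination entries; the pathfinder (A* in the real game)
# is only asked for a route when the current slot's destination changes.

@dataclass
class NPC:
    name: str
    home: str
    workplace: str
    hobby_spots: list[str]
    timetable: dict[tuple[str, int], str] = field(default_factory=dict)

    def build_week(self, days=("Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun")):
        for day in days:
            self.timetable[(day, 9)] = self.workplace          # daytime job
            evening = random.choice(self.hobby_spots + [self.home])
            self.timetable[(day, 20)] = evening                 # nighttime habit

    def destination(self, day: str, hour: int) -> str:
        # Use the most recent slot for today; fall back to home otherwise.
        slots = [(d, h) for (d, h) in self.timetable if d == day and h <= hour]
        if not slots:
            return self.home
        latest = max(slots, key=lambda s: s[1])
        return self.timetable[latest]

def find_path(start: str, goal: str) -> list[str]:
    # Stand-in for the A* Pathfinding Project call used in the actual game.
    return [start, goal]

if __name__ == "__main__":
    old_man = NPC("old man", home="apartment 3", workplace="corner shop",
                  hobby_spots=["casino", "park bench"])
    old_man.build_week()
    goal = old_man.destination("Mon", 21)
    print(goal, find_path("apartment 3", goal))
```

The design point the sketch illustrates is that routing cost stays bounded with many NPCs: a path is only requested when a schedule slot changes a character’s destination, not every frame.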
  • Fox News AI Newsletter: Hollywood studios sue 'bottomless pit of plagiarism'

    WWW.FOXNEWS.COM
    Fox News AI Newsletter: Hollywood studios sue 'bottomless pit of plagiarism'
The Minions pose during the world premiere of the film "Despicable Me 4" in New York City, June 9, 2024. (REUTERS/Kena Betancur)

Welcome to Fox News’ Artificial Intelligence newsletter with the latest AI technology advancements.

IN TODAY’S NEWSLETTER:
- Major Hollywood studios sue AI company over copyright infringement in landmark move
- Meta's Zuckerberg aiming to dominate AI race with recruiting push for new ‘superintelligence’ team: report
- OpenAI says this state will play central role in artificial intelligence development

The website of Midjourney, an artificial intelligence (AI) capable of creating AI art, is seen on a smartphone on April 3, 2023, in Berlin, Germany. (Thomas Trutschel/Photothek via Getty Images)

'PIRACY IS PIRACY': Two major Hollywood studios are suing Midjourney, a popular AI image generator, over its use and distribution of intellectual property.

AI RACE: Meta CEO Mark Zuckerberg is reportedly building a team of experts to develop artificial general intelligence (AGI) that can meet or exceed human capabilities.

TECH HUB: New York is poised to play a central role in the development of artificial intelligence (AI), OpenAI executives told key business and civic leaders on Tuesday.

Attendees watch a presentation during an event on the Apple campus in Cupertino, Calif., Monday, June 9, 2025. (AP Photo/Jeff Chiu)

APPLE FALLING BEHIND: Apple’s annual Worldwide Developers Conference (WWDC) kicked off on Monday and runs through Friday. But the Cupertino-based company is not making us wait until the end. The major announcements have already been made, and there are quite a few. The headliners are new software versions for Macs, iPhones, iPads and Vision.

FROM COAL TO CODE: This week, Amazon announced a $20 billion investment in artificial intelligence infrastructure in the form of new data centers, the largest in the commonwealth's history, according to the eCommerce giant.

DIGITAL DEFENSE: A growing number of fire departments across the country are turning to artificial intelligence to help detect and respond to wildfires more quickly.

Rep. Darin LaHood, R-Ill., leaves the House Republican Conference meeting at the Capitol Hill Club in Washington on Tuesday, May 17, 2022. (Bill Clark/CQ-Roll Call, Inc via Getty Images)

SHIELD FROM BEIJING: Rep. Darin LaHood, R-Ill., is introducing a new bill Thursday imploring the National Security Administration (NSA) to develop an "AI security playbook" to stay ahead of threats from China and other foreign adversaries.

ROBOT RALLY PARTNER: Finding a reliable tennis partner who matches your energy and skill level can be a challenge. Now, with Tenniix, an artificial intelligence-powered tennis robot from T-Apex, players of all abilities have a new way to practice and improve.

DIGITAL DANGER ZONE: Scam ads on Facebook have evolved beyond the days of misspelled headlines and sketchy product photos. Today, many are powered by artificial intelligence, fueled by deepfake technology and distributed at scale through Facebook’s own ad system.

Fairfield, Ohio, USA - February 25, 2011: Chipotle Mexican Grill logo on a brick building. Chipotle is a chain of fast casual restaurants in the United States and Canada that specialize in burritos and tacos. (iStock)

'EXPONENTIAL RATE': Artificial intelligence is helping Chipotle rapidly grow its footprint, according to CEO Scott Boatwright.

AI TAKEOVER THREAT: The hottest topic nowadays revolves around Artificial Intelligence (AI) and its potential to rapidly and imminently transform the world we live in — economically, socially, politically and even defensively. Regardless of whether you believe that the technology will be able to develop superintelligence and lead a metamorphosis of everything, the possibility that may come to fruition is a catalyst for more far-leftist control.

This article was written by Fox News staff.
    0 التعليقات 0 المشاركات
  • Could Iran Have Been Close to Making a Nuclear Weapon? Uranium Enrichment Explained

    WWW.SCIENTIFICAMERICAN.COM
    Could Iran Have Been Close to Making a Nuclear Weapon? Uranium Enrichment Explained
    June 13, 2025 | 3 min read
    By Deni Ellis Béchard, edited by Dean Visser
    When Israeli aircraft recently struck a uranium-enrichment complex in the nation, Iran could have been days away from achieving “breakout,” the ability to quickly turn “yellowcake” uranium into bomb-grade fuel, with its new high-speed centrifuges.
    Men work inside of a uranium conversion facility just outside the city of Isfahan, Iran, on March 30, 2005. The facility in Isfahan made hexafluoride gas, which was then enriched by feeding it into centrifuges at a facility in Natanz, Iran. (Getty Images)
    In the predawn darkness on Friday local time, Israeli military aircraft struck one of Iran’s uranium-enrichment complexes near the city of Natanz. The warheads aimed to do more than shatter concrete; they were meant to buy time, according to news reports. For months, Iran had seemed to be edging ever closer to “breakout,” the point at which its growing stockpile of partially enriched uranium could be converted into fuel for a nuclear bomb. (Iran has denied that it has been pursuing nuclear weapons development.)
    But why did the strike occur now? One consideration could involve the way enrichment complexes work. Natural uranium is composed almost entirely of uranium 238, or U-238, an isotope that is relatively “heavy” (meaning it has more neutrons in its nucleus). Only about 0.7 percent is uranium 235 (U-235), a lighter isotope that is capable of sustaining a nuclear chain reaction. That means that in natural uranium, only seven atoms in 1,000 are the lighter, fission-ready U-235; “enrichment” simply means raising the percentage of U-235.
    U-235 can be used in warheads because its nucleus can easily be split. The International Atomic Energy Agency uses 25 kilograms of contained U-235 as the benchmark amount deemed sufficient for a first-generation implosion bomb. In such a weapon, the U-235 is surrounded by conventional explosives that, when detonated, compress the isotope. A separate device releases a neutron stream. (Neutrons are the neutral subatomic particles in an atom’s nucleus that add to its mass.) Each time a neutron strikes a U-235 atom, the atom fissions; it divides and spits out, on average, two or three fresh neutrons—plus a burst of energy in the form of heat and gamma radiation. And the emitted neutrons in turn strike other U-235 nuclei, creating a self-sustaining chain reaction among the U-235 atoms that have been packed together into a critical mass. The result is a nuclear explosion. By contrast, the more common isotope, U-238, usually absorbs slow neutrons without splitting and cannot drive such a devastating chain reaction.
    To enrich uranium so that it contains enough U-235, the “yellowcake” uranium powder that comes out of a mine must go through a lengthy process of conversions to transform it from a solid into the gas uranium hexafluoride. First, a series of chemical processes refine the uranium, and then, at high temperatures, each uranium atom is bound to six fluorine atoms. The result, uranium hexafluoride, is unusual: below 56 degrees Celsius (132.8 degrees Fahrenheit) it is a white, waxy solid, but just above that temperature, it sublimates into a dense, invisible gas.
    During enrichment, this uranium hexafluoride is loaded into a centrifuge: a metal cylinder that spins at tens of thousands of revolutions per minute—faster than the blades of a jet engine. As the heavier U-238 molecules drift toward the cylinder wall, the lighter U-235 molecules remain closer to the center and are siphoned off. This new, slightly U-235-richer gas is then put into the next centrifuge. The process is repeated 10 to 20 times as ever more enriched gas is sent through a series of centrifuges.
    Enrichment is a slow process, but the Iranian government has been working on this for years and already holds roughly 400 kilograms of uranium enriched to 60 percent U-235. This falls short of the 90 percent required for nuclear weapons. But whereas Iran’s first-generation IR-1 centrifuges whirl at about 63,000 revolutions per minute and do relatively modest work, its newer IR-6 models, built from high-strength carbon fiber, spin faster and produce enriched uranium far more quickly.
    Iran has been installing thousands of these units, especially at Fordow, an underground enrichment facility built beneath 80 to 90 meters of rock. According to a report released on Monday by the Institute for Science and International Security, the new centrifuges could produce enough 90 percent U-235 uranium for a warhead “in as little as two to three days” and enough for nine nuclear weapons in three weeks—or 19 by the end of the third month.
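    The bookkeeping behind “raising the percentage of U-235” is the standard two-component material balance found in any separation textbook. Written generically (no program- or facility-specific numbers), for feed, product and tails streams of mass F, P and T with U-235 fractions x_F, x_P and x_T:

        % Generic isotope material balance for an enrichment step
        % F, P, T = masses of the feed, product (enriched) and tails (depleted) streams
        % x_F, x_P, x_T = U-235 mass fractions of those streams
        \begin{align}
          F &= P + T, \\
          F\,x_F &= P\,x_P + T\,x_T, \\
          \text{so}\quad P &= F\,\frac{x_F - x_T}{x_P - x_T}.
        \end{align}

    The relation only expresses conservation of total mass and of U-235 mass; as the article notes, the practical difficulty lies in the centrifuge cascades themselves, not in this arithmetic.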
  • Tutorial: Practical Lighting for Production

    Saturday, June 14th, 2025
    Posted by Jim Thacker
    Tutorial: Practical Lighting for Production

    The Gnomon Workshop has released Practical Lighting for Production, a guide to VFX and cinematics workflows recorded by former Blizzard lighting lead Graham Cunningham.
    The intermediate-level workshop provides four hours of training in Maya, Arnold and Nuke.
    Discover professional workflows for lighting a CG shot to match a movie reference
    In the workshop, Cunningham sets out the complete process of lighting and compositing a shot to match a movie reference, using industry-standard software.
    He begins by setting up a basic look development light rig in Maya, importing a 3D character, assigning materials and shading components, and creating a turntable setup.
    Next, he creates a shot camera and set dresses the environment using kitbash assets.
    Cunningham also discusses strategies for lighting a character, including how to use dome lights and area lights to provide key, fill and rim lighting, and how to use HDRI maps.
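    To make that rig concrete, here is a minimal scripted sketch of the same kind of setup (an HDRI dome plus key, fill and rim area lights), written for Maya with the Arnold (MtoA) plug-in loaded. It is not taken from the workshop; the light names, placements, intensities and the HDRI path are illustrative placeholders.

        # Rough key/fill/rim + HDRI dome rig for Maya with Arnold (MtoA) loaded.
        # Not the workshop's rig: names, positions, intensities and the HDRI path
        # are placeholders for illustration only.
        import maya.cmds as cmds

        def make_area_light(name, intensity, translate, rotate):
            """Create an Arnold area light, set its (arbitrary) energy and place its transform."""
            shape = cmds.shadingNode('aiAreaLight', asLight=True, name=name + 'Shape')
            xform = cmds.listRelatives(shape, parent=True)[0]
            cmds.setAttr(shape + '.intensity', intensity)
            cmds.xform(xform, translation=translate, rotation=rotate)
            return xform

        # Key, fill and rim lights roughly bracketing the character.
        key  = make_area_light('key',  1200.0, (4, 5, 6),  (-25, 35, 0))
        fill = make_area_light('fill',  300.0, (-5, 3, 4), (-10, -45, 0))
        rim  = make_area_light('rim',   800.0, (0, 6, -6), (-160, 0, 0))

        # Environment light: an Arnold skydome driven by an HDRI file texture.
        dome = cmds.shadingNode('aiSkyDomeLight', asLight=True, name='envDomeShape')
        hdri = cmds.shadingNode('file', asTexture=True, name='envHDRI')
        cmds.setAttr(hdri + '.fileTextureName', '/path/to/reference_env.hdr', type='string')
        cmds.connectAttr(hdri + '.outColor', dome + '.color', force=True)

    The same rig can then be reused for both the turntable and the shot camera described above, with only the light transforms re-posed per setup.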
    From there, he moves to rendering using Arnold, discussing render settings, depth of field, and how to create render passes.
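    The render-side controls he covers (sampling, depth of field, pass output) map onto standard MtoA attributes that can also be driven from script. The values below are illustrative rather than the workshop's settings, the camera name is a placeholder, and exact attribute names can vary between MtoA versions, so treat this as a sketch:

        # Illustrative Arnold (MtoA) render settings; values are examples only.
        import maya.cmds as cmds

        # Sampling quality: camera (AA) samples plus diffuse/specular GI samples.
        cmds.setAttr('defaultArnoldRenderOptions.AASamples', 5)
        cmds.setAttr('defaultArnoldRenderOptions.GIDiffuseSamples', 3)
        cmds.setAttr('defaultArnoldRenderOptions.GISpecularSamples', 2)

        # Depth of field lives on the camera shape (MtoA adds the ai* attributes).
        cam_shape = 'shotCamShape'  # placeholder camera shape name
        cmds.setAttr(cam_shape + '.aiEnableDOF', 1)
        cmds.setAttr(cam_shape + '.aiApertureSize', 0.15)

        # Write the AOVs (render passes) into a single multichannel EXR for compositing.
        cmds.setAttr('defaultArnoldDriver.ai_translator', 'exr', type='string')
        cmds.setAttr('defaultArnoldDriver.mergeAOVs', 1)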
    Cunningham then assembles the render passes in Nuke, splits out the light AOVs, and sets out how to adjust light colors and intensities.
    He also reveals how to add atmosphere, how to use cryptomattes to fine tune the results, how to add post effects, and how to apply a final color grade to match a chosen movie reference.
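    On the Nuke side, the core of that light-AOV workflow is conceptually just shuffle, grade and plus-merge: pull each per-light pass out of the multichannel EXR, adjust it, and add the lights back together to rebuild the beauty. A minimal Python sketch follows; the layer names and file path are placeholders, not the AOV names used in the workshop.

        # Minimal Nuke Python sketch: rebuild a beauty from per-light AOVs so each
        # light can be re-colored or re-balanced. Layer names and the path are
        # placeholders, not the workshop's actual AOVs.
        import nuke

        read = nuke.nodes.Read(file='/path/to/shot_beauty.%04d.exr', first=1001, last=1100)

        light_layers = ['RGBA_key', 'RGBA_fill', 'RGBA_rim']  # hypothetical per-light AOVs

        shuffles = []
        for layer in light_layers:
            # Copy the chosen light layer into this branch's rgba channels.
            sh = nuke.nodes.Shuffle(inputs=[read], label=layer)
            sh['in'].setValue(layer)
            shuffles.append(sh)

        # Example tweak: warm up and brighten the key light only.
        key_grade = nuke.nodes.Grade(inputs=[shuffles[0]])
        key_grade['multiply'].setValue([1.2, 1.05, 0.9, 1.0])

        # Additive merges sum the graded lights back into the rebuilt beauty.
        beauty = nuke.nodes.Merge2(inputs=[key_grade, shuffles[1]], operation='plus')
        beauty = nuke.nodes.Merge2(inputs=[beauty, shuffles[2]], operation='plus')

    Cryptomattes, atmosphere, post effects and the final grade would then sit downstream of this rebuilt beauty.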
    As well as the tutorial videos, viewers of the workshop can download one of Cunningham’s Maya files.
    The workshop uses 3D Scan Store’s commercial Female Explorer Game Character, and KitBash3D’s Wreckage Kit, plus assets from KitBash3D’s Cargo.
    About the artist
    Graham Cunningham is a Senior Lighting, Compositing and Lookdev Artist who began his career as a generalist working in VFX for film and TV before moving to Blizzard Entertainment.
    At Blizzard, he contributed to cinematics for Diablo IV, Diablo Immortal, Starcraft II, Heroes of the Storm, World of Warcraft, Overwatch, and Overwatch 2, many of them as a lead lighting artist.
    Pricing and availability
    Practical Lighting for Production is available via a subscription to The Gnomon Workshop, which provides access to over 300 tutorials.
    Subscriptions cost $57/month or $519/year. Free trials are available.
    Read more about Practical Lighting for Production on The Gnomon Workshop’s website

    Full disclosure: CG Channel is owned by Gnomon.

  • DISCOVERING ELIO

    By TREVOR HOGG

    Images courtesy of Pixar.

    The character design of Glordon is based on a tardigrade, which is a microscopic water bear.

    Rather than look at the unknown as something to be feared, Pixar has decided to do some wish fulfillment with Elio, where a lonely adolescent astrophile gets abducted by aliens and is mistaken for the leader of Earth. Originally conceived and directed by Adrian Molina, the coming-of-age science fiction adventure was shepherded by Domee Shi and Madeline Sharafian, who had previously worked together on Turning Red.
    “Space is often seen as dark, mysterious and scary, but there is also so much hope, wonder and curiosity,” notes Shi, director of Elio. “It’s like anything is ‘out there.’ Elio captures how a lot of us feel at different points of our lives, when we were kids like him, or even now wanting to be off of this current planet because it’s just too much. For Elio, it’s a rescue. I feel that there’s something so universal about that feeling of wanting to be taken away and taken care of. To know that you’re not alone and somebody chose you and picked you up.”

    The character design of Glordon is based on a tardigrade, which is a microscopic water bear.

    There is a stark contrast between how Earth and the alien world, known as the Communiverse, are portrayed. “The more we worked with the animators on Glordon and Helix, they began to realize that Domee and I respond positively when those [alien] characters are exaggerated, made cute, round and chubby,” states Sharafian, director of Elio. “That automatically started to differentiate the way the Earth and space feel.” A certain question had to be answered when designing the United Nations-inspired Communiverse. “It was coming from a place of this lonely kid who feels like no one wants him on Earth,” Shi explains. “What would be heaven and paradise for him? The Communiverse was built around that idea.” A sense of belonging is an important theme. “It’s also inspired by Adrian Molina’s backstory, and our backstories too, of going to animation college,” Sharafian remarks. “For the first time, we said, ‘This is where everybody like me is!’”

    Green is the thematic color for Elio.

    Visual effects are an important storytelling tool. “Especially, for our movie, which is about this boy going to this crazy incredible world of the Communiverse,” Shi observes. “It has to be dazzling and look spectacular on the big screen and feel like paradise. Elio is such a visual feast, and you do feel like, ‘I want to stay here no matter what. I can’t believe that this place even exists.’ Visual effects are a powerful tool to help you feel what the characters are feeling.” A wishlist became a reality for the directors. “Claudia Chung Sanii [Visual Effects Supervisor] gave Domee and me carte blanche for wish fulfillment for ourselves,” Sharafian remarks. “What do you want Elio’s outfit in space to look like? It was a difficult costume, but now when we watch the movie, we’re all so proud of it. Elio looks fabulous, and he’s so happy to be wearing that outfit. Who would want to take that off?”

    The Communiverse was meant to feel like a place that a child would love to visit and explore.

    Methodology rather than technology went through the biggest change for the production. “The Communiverse is super complex and has lots of moving pieces. But there’s not much CG can’t do anymore,” notes Claudia Chung Sanii. “Elemental did effects characters. We did long curly hair, dresses, capes, water and fire. What we hadn’t done before was be a part of that design process. How do we get lighting into layout? How do we see the shaders in animation in layout? The tools department was working on a software called Luna which does that. I went to the tools department and asked, ‘Can I play around with it?’ They were like, ‘Okay. But it’s not ready yet.’ Tools will basically be bringing RenderMan and an interactive lighting workflow to the pipeline across all of these DCCs. Because we light in Katana, you can’t get back upstream. The conceit that we were dipping our toe in on Elio was, ‘Whatever you do in lighting, anyone on the pipeline can see it.’”

    The influence of microscopic forms and macro photography grounded the Communiverse in natural phenomena.

    The variety in the Communiverse is a contrast to the regimented world on the military base.

    There were no departmental borders, in particular with cinematography. “We had our layout and lighting DPs start on the same day. Derek Williams wouldn’t shoot anything without Jordan Rempel, our lighting DP, seeing it,” Sanii states. “Jordan would drop in lighting and start doing key lighting as Derek’s team was laying out. It wasn’t like you had to hit the render button, wait for the render to come up and go, ‘Oh, my god, it’s dark! I didn’t know that it was nighttime.’” A new term was adopted. “Meredith Hom [Production Manager] and I pulled the entire crew and leadership into this mental concept that we called the ‘college project.’ For some of us, college was a time when we didn’t have titles and crafts. You begged, borrowed and stole to hit that deadline. So much of our world has become linear in our process that I wanted to break that down to, ‘No. We’re all working together. The scope of this film is too large for us to wait for each other to finish our piece. If this person is slammed, fine. Figure out a different idea to do it with what tools you have.’”

    Directors Domee Shi and Madeline Sharafian are drawn to chubby, exaggerated and cute characters.

    Forgoing the word ‘no’ led to the technology breaking down. “I remember times when crowds [department] is dressing all of the aliens and because of forgetting to constrain it to the Communiverse, they all show up at the origin, and you’re going, ‘Why is there a whole party going on over there?’” Sanii laughs. “On Elio, it was always forward. There were no rules about locking things down or not installing over the weekend. It was always like, ‘Put it all in, and we’ll deal with it on Monday.’ There would be some funny stuff. We never QC’d something before walking it into the room. Everyone saw how the sausage was made. It was fun and not fun for Harley Jessup [Production Designer] because sometimes there would be a big thing in the middle screen, and he would say, ‘Is that finished?’ There was no way we could get through this film if we kept trying to fix the thing that broke.”

    An aerial image of Elio as he attempts to get abducted by aliens.

    Part of the design of the Communiverse was inspired by Chinese puzzle balls.

    A former visual effects art director at ILM, Harley Jessup found his previous experiences on projects like Innerspace to be helpful on Elio. “I liked that the directors wanted to build on the effects films from the 1980s and early 1990s,” reflects Jessup. “I was there and part of that. It was fun to look back. At the time, the techniques were all practical, matte paintings and miniatures, which are fun to work with, but without the safety net of CG. One thing Dennis Muren [VES] was keen on was how people see things like the natural phenomenon you might see in a microscopic or macro photography form. We were using that. I was looking at the mothership of Close Encounters of the Third Kind, which Dennis shot when he was a young artist. It was nice to be able to bring all of that history to this film.”
    Earth was impacted by a comment made by Pete Docter (CCO, Pixar). “He said, ‘The military base should feel like a parking lot,’” Jessup reveals. “You should know why Elio wants to be anywhere else. And the Communiverse needs to be inviting. We built a lot of contrast into those two worlds. The brutalist architecture on the military base, with its hard edges and heavy horizontal forms close to the earth, needed to be harsh but beautiful in its own way, so we tried for that. The Communiverse would be in contrast and be all curves, translucent surfaces and stained-glass backlit effects. Things were wide open about what it could be because each of the aliens are from a different climate and gravity. There are some buildings that are actually upside down on it, and the whole thing is rotating inside like clockwork. It is hopefully an appealing, fun world. It’s not a dystopian outer space.”

    Exploring various facial expressions for Elio.

    A tough character to get right was Aunt Olga, who struggles to be the guardian of her nephew.

    Character designs of Elio and Glordon, which show them interacting with each other.

    Architecture was devised to reflect the desired tone for scenes. “In the Grand Assembly Hall where each alien has a desk and booth, the booth is shaped like an eyelid that can close or open,” Jessup explains. “It increases the feeling that they’re evaluating and observing Elio and each of the candidates that have come to join the Communiverse.” A couple of iconic cinematic franchises were avoided for aesthetic reasons. “As much as I love Star Wars and Star Trek, we wanted to be different from those kinds of aliens that are often more humanoid.” Ooooo was the first alien to be designed. “We did Ooooo in collaboration with the effects team, which was small at that time. She was described as a liquid supercomputer. We actually used the wireframe that was turning up and asked, what if it ended up being this network of little lights that are moving around and can express how much she was thinking? Ooooo is Elio’s guide to the Communiverse; her body would deform, so she could become a big screen or reach out and pluck things. Ooooo has an ability like an amoeba to stretch.”
    Flexibility is important when figuring out shot design. “On Elio, we provided the layout department with a rudimentary version of our environments,” states David Luoh, Sets Supervisor. “It might be simple geometry. We’re not worried necessarily about shading, color and material yet. Things are roughly in place but also built in a way that is flexible. As they’re sorting out the camera and testing out staging, they can move elements of the set around. Maybe this architectural piece needs to be shifted or larger or smaller. There was a variation on what was typically expected of set deliveries of environments to our layout department. That bar was lowered to give the layout department something to work with sooner and also with more flexibility. From their work we get context as to how we partner with our art and design department to build and finalize those environments.”

    Regional biomes known as disks are part of the Communiverse. “There are aquatic, lush forest, snow and ice, and hot lava disks,” Luoh remarks. “The hot disk is grounded in the desert, volcanic rock and lava, while for the lush disk we looked at interesting plant life found in the world around us.” The Communiverse is a complex geometric form. “We wanted these natural arrangements of alien districts, and that was all happening on this twisting and curving terrain in a way that made traditional dressing approaches clunky. Oftentimes, you’re putting something on the ground or mounted, and the ground is always facing upward. But if you have to dress the wall or ceiling, it becomes a lot more difficult to manipulate and place on something with that dynamic and shape. You have stuff that casts light, is see-through and shifting over time. Ooooo is a living character that looks like electronic circuitry that is constantly moving, and we also have that element in the walls, floors and bubble transport that carry the characters around.”
    Sets were adjusted throughout the production. “We try to anticipate situations that might come up,” Luoh states. “What if we have a series of shots where you’re getting closer and closer to the Communiverse and you have to bridge the distance between your hero and set extension background? There is a partnership with story, but certainly with our layout camera staging department. As we see shots come out of their work, we know where we need to spend the time to figure out, are we going to see the distant hills in this way? We’re not going to build it until we know because it can be labor-intensive. There is a responsiveness to what we are starting to see as shots get made.” Combining the familiar into something unfamiliar was a process. “There was this curation of being inspired by existing alien sci-fi depictions, but also reaching back into biological phenomena or interesting material because we wanted to ground a lot of those visual elements and ideas in something that people could intuitively grasp on to, even if they were combined or arranged in a way that is surprising, strange and delightful.”
    WWW.VFXVOICE.COM
    DISCOVERING ELIO
    By TREVOR HOGG Images courtesy of Pixar. The character design of Glordon is based on a tardigrade, which is a microscopic water bear. Rather than look at the unknown as something to be feared, Pixar has decided to do some wish fulfillment with Elio, where a lonely adolescent astrophile gets abducted by aliens and is mistaken as the leader of Earth. Originally conceived and directed by Adrian Molina, the coming-of-age science fiction adventure was shepherded by Domee Shi and Madeline Sharafian, who had previously worked together on Turning Red. “Space is often seen as dark, mysterious and scary, but there is also so much hope, wonder and curiosity,” notes Shi, director of Elio. “It’s like anything is ‘out there.’ Elio captures how a lot of us feel at different points of our lives, when we were kids like him, or even now wanting to be off of this current planet because it’s just too much. For Elio, it’s a rescue. I feel that there’s something so universal about that feeling of wanting to be taken away and taken care of. To know that you’re not alone and somebody chose you and picked you up.” The character design of Glordon is based on a tardigrade, which is a microscopic water bear. There is a stark contrast between how Earth and the alien world, known as the Communiverse, are portrayed. “The more we worked with the animators on Glordon and Helix, they began to realize that Domee and I respond positively when those [alien] characters are exaggerated, made cute, round and chubby,” states Sharafian, director of Elio. “That automatically started to differentiate the way the Earth and space feel.” A certain question had to be answered when designing the United Nations-inspired Communiverse. “It was coming from a place of this lonely kid who feels like no one wants him on Earth,” Shi explains. “What would be heaven and paradise for him? The Communiverse was built around that idea.” A sense of belonging is an important theme. “It’s also inspired by Adrian Molina’s backstory, and our backstories too, of going to animation college,” Sharafian remarks. “For the first time, we said, ‘This is where everybody like me is!’” Green is the thematic color for Elio. Visual effects are an important storytelling tool. “Especially, for our movie, which is about this boy going to this crazy incredible world of the Communiverse,” Shi observes. “It has to be dazzling and look spectacular on the big screen and feel like paradise. Elio is such a visual feast, and you do feel like, ‘I want to stay here no matter what. I can’t believe that this place even exists.’ Visual effects are a powerful tool to help you feel what the characters are feeling.” A wishlist became a reality for the directors. “Claudia Chung Sanii [Visual Effects Supervisor] gave Domee and me carte blanche for wish fulfillment for ourselves,” Sharafian remarks. “What do you want Elio’s outfit in space to look like? It was a difficult costume, but now when we watch the movie, we’re all so proud of it. Elio looks fabulous, and he’s so happy to be wearing that outfit. Who would want to take that off?” The Communiverse was meant to feel like a place that a child would love to visit and explore. Methodology rather than technology went through the biggest change for the production. “The Communiverse is super complex and has lots of moving pieces. But there’s not much CG can’t do anymore,” notes Claudia Chung Sanii. “Elemental did effects characters. We did long curly hair, dresses, capes, water and fire. 
    The influence of microscopic forms and macro photography grounded the Communiverse in natural phenomena.
    The variety in the Communiverse is a contrast to the regimented world on the military base.
    There were no departmental borders, in particular with cinematography. “We had our layout and lighting DPs start on the same day. Derek Williams wouldn’t shoot anything without Jordan Rempel, our lighting DP, seeing it,” Sanii states. “Jordan would drop in lighting and start doing key lighting as Derek’s team was laying out. It wasn’t like you had to hit the render button, wait for the render to come up and go, ‘Oh, my god, it’s dark! I didn’t know that it was nighttime.’”
    A new term was adopted. “Meredith Hom [Production Manager] and I pulled the entire crew and leadership into this mental concept that we called the ‘college project.’ For some of us, college was a time when we didn’t have titles and crafts. You begged, borrowed and stole to hit that deadline. So much of our world has become linear in our process that I wanted to break that down to, ‘No. We’re all working together. The scope of this film is too large for us to wait for each other to finish our piece. If this person is slammed, fine. Figure out a different idea to do it with what tools you have.’”
    Directors Domee Shi and Madeline Sharafian are drawn to chubby, exaggerated and cute characters.
    Forgoing the word ‘no’ sometimes led to the technology breaking down. “I remember times when crowds [department] is dressing all of the aliens and, because of forgetting to constrain it to the Communiverse, they all show up at the origin, and you’re going, ‘Why is there a whole party going on over there?’” Sanii laughs. “On Elio, it was always forward. There were no rules about locking things down or not installing over the weekend. It was always like, ‘Put it all in, and we’ll deal with it on Monday.’ There would be some funny stuff. We never QC’d something before walking it into the room. Everyone saw how the sausage was made. It was fun and not fun for Harley Jessup [Production Designer] because sometimes there would be a big thing in the middle of the screen, and he would say, ‘Is that finished?’ There was no way we could get through this film if we kept trying to fix the thing that broke.”
    An aerial image of Elio as he attempts to get abducted by aliens.
    Part of the design of the Communiverse was inspired by Chinese puzzle balls.
    A former visual effects art director at ILM, Harley Jessup found his previous experiences on projects like Innerspace to be helpful on Elio. “I liked that the directors wanted to build on the effects films from the 1980s and early 1990s,” reflects Jessup. “I was there and part of that. It was fun to look back. At the time, the techniques were all practical, matte paintings and miniatures, which are fun to work with, but without the safety net of CG. One thing Dennis Muren [VES] was keen on was how people see things like the natural phenomena you might see in microscopic or macro photography form. We were using that. I was looking at the mothership of Close Encounters of the Third Kind, which Dennis shot when he was a young artist. It was nice to be able to bring all of that history to this film.”
    Earth was impacted by a comment made by Pete Docter (CCO, Pixar). “He said, ‘The military base should feel like a parking lot,’” Jessup reveals. “You should know why Elio wants to be anywhere else. And the Communiverse needs to be inviting. We built a lot of contrast into those two worlds. The brutalist architecture on the military base, with its hard edges and heavy horizontal forms close to the earth, needed to be harsh but beautiful in its own way, so we tried for that. The Communiverse would be in contrast and be all curves, translucent surfaces and stained-glass backlit effects. Things were wide open about what it could be because each of the aliens is from a different climate and gravity. There are some buildings that are actually upside down on it, and the whole thing is rotating inside like clockwork. It is hopefully an appealing, fun world. It’s not a dystopian outer space.”
    Exploring various facial expressions for Elio.
    A tough character to get right was Aunt Olga, who struggles to be the guardian of her nephew.
    Character designs of Elio and Glordon, showing them interacting with each other.
    Architecture was devised to reflect the desired tone for scenes. “In the Grand Assembly Hall where each alien has a desk and booth, the booth is shaped like an eyelid that can close or open,” Jessup explains. “It increases the feeling that they’re evaluating and observing Elio and each of the candidates that have come to join the Communiverse.” A couple of iconic cinematic franchises were avoided for aesthetic reasons. “As much as I love Star Wars and Star Trek, we wanted to be different from those kinds of aliens that are often more humanoid.”
    Ooooo was the first alien to be designed. “We did Ooooo in collaboration with the effects team, which was small at that time. She was described as a liquid supercomputer. We actually used the wireframe that was turning up and asked, what if it ended up being this network of little lights that are moving around and can express how much she was thinking? Ooooo is Elio’s guide to the Communiverse; her body would deform, so she could become a big screen or reach out and pluck things. Ooooo has an ability like an amoeba to stretch.”
    Flexibility is important when figuring out shot design. “On Elio, we provided the layout department with a rudimentary version of our environments,” states David Luoh, Sets Supervisor. “It might be simple geometry. We’re not worried necessarily about shading, color and material yet. Things are roughly in place but also built in a way that is flexible. As they’re sorting out the camera and testing out staging, they can move elements of the set around. Maybe this architectural piece needs to be shifted or larger or smaller. There was a variation on what was typically expected of set deliveries of environments to our layout department. That bar was lowered to give the layout department something to work with sooner and also with more flexibility. From their work we get context as to how we partner with our art and design department to build and finalize those environments.”
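    One generic way to deliver the kind of rudimentary, flexible stand-ins Luoh describes is OpenUSD's purpose mechanism, where lightweight proxy geometry and final render geometry live side by side in the same asset and a viewport displays whichever purpose it is asked for. The sketch below is illustrative only; the asset name and prim paths are hypothetical, and it does not claim to reflect Pixar's actual set-delivery setup.

```python
# Generic sketch of shipping a lightweight stand-in environment alongside the
# eventual hero geometry, using OpenUSD purpose tokens. Names are hypothetical.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("communiverse_disk.usda")
UsdGeom.Xform.Define(stage, "/LushDisk")

# Simple stand-in geometry, tagged as proxy: cheap to load and easy for layout
# to push around while camera and staging are still being explored.
proxy = UsdGeom.Cube.Define(stage, "/LushDisk/Proxy")
proxy.CreatePurposeAttr(UsdGeom.Tokens.proxy)

# Placeholder for the final hero geometry, tagged as render; a renderer or
# viewport chooses which purposes to draw, so both can coexist in one asset.
render = UsdGeom.Mesh.Define(stage, "/LushDisk/Render")
render.CreatePurposeAttr(UsdGeom.Tokens.render)

stage.GetRootLayer().Save()
```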
    Regional biomes known as disks are part of the Communiverse.
    “There are aquatic, lush forest, snow and ice, and hot lava disks,” Luoh remarks. “The hot disk is grounded in the desert, volcanic rock and lava, while for the lush disk we looked at interesting plant life found in the world around us.” The Communiverse is a complex geometric form. “We wanted these natural arrangements of alien districts, and that was all happening on this twisting and curving terrain in a way that made traditional dressing approaches clunky. Oftentimes, you’re putting something on the ground or mounted, and the ground is always facing upward. But if you have to dress the wall or ceiling, it becomes a lot more difficult to manipulate and place on something with that dynamic and shape. You have stuff that casts light, is see-through and shifting over time. Ooooo is a living character that looks like electronic circuitry that is constantly moving, and we also have that element in the walls, floors and bubble transport that carry the characters around.”
    Sets were adjusted throughout the production. “We try to anticipate situations that might come up,” Luoh states. “What if we have a series of shots where you’re getting closer and closer to the Communiverse and you have to bridge the distance between your hero and set extension background? There is a partnership with story, but certainly with our layout camera staging department. As we see shots come out of their work, we know where we need to spend the time to figure out, are we going to see the distant hills in this way? We’re not going to build it until we know because it can be labor-intensive. There is a responsiveness to what we are starting to see as shots get made.”
    Combining the familiar into something unfamiliar was a process. “There was this curation of being inspired by existing alien sci-fi depictions, but also reaching back into biological phenomena or interesting material because we wanted to ground a lot of those visual elements and ideas in something that people could intuitively grasp on to, even if they were combined or arranged in a way that is surprising, strange and delightful.”
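    The wall-and-ceiling problem Luoh raises comes down to a small piece of math that generic set-dressing scripts deal with constantly: a prop modeled with a fixed up axis has to be re-oriented to the local surface normal wherever it lands, instead of assuming the ground faces world-up. The sketch below is a plain, illustrative implementation of that re-orientation (Rodrigues' rotation formula) and is not taken from Pixar's tooling.

```python
# Illustrative only: re-orient a +Y-up prop to an arbitrary surface normal,
# which is trivial on a flat floor but needed everywhere on curved terrain.
import numpy as np

def align_up_to_normal(up, normal):
    """Rotation matrix taking the prop's up axis onto the surface normal (Rodrigues' formula)."""
    up = up / np.linalg.norm(up)
    normal = normal / np.linalg.norm(normal)
    axis = np.cross(up, normal)
    s = np.linalg.norm(axis)        # sine of the angle between the vectors
    c = float(np.dot(up, normal))   # cosine of the angle
    if s < 1e-8:                    # parallel or exactly opposite
        # Identity, or an arbitrary 180-degree flip (here about X) for the opposite case.
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    k = axis / s
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + s * K + (1.0 - c) * (K @ K)

# Floor, wall and ceiling placements of the same +Y-up prop:
for n in ([0.0, 1.0, 0.0], [1.0, 0.0, 0.0], [0.0, -1.0, 0.0]):
    R = align_up_to_normal(np.array([0.0, 1.0, 0.0]), np.array(n))
    print(n, "->", np.round(R @ np.array([0.0, 1.0, 0.0]), 3))
```

    On a flat set the rotation is the identity; on a twisting, rotating structure like the Communiverse, every placement needs its own.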