• WWW.VG247.COM
    Helldivers 2's latest mission could spell the end for an old Super Earth foe, as the Illuminate carry on chilling in a galactic corner
    This week may have mainly been about chatter in the Helldivers 2 community after game director Johan Pilestedt headed off on sabbatical, but there's still been plenty happening in the game itself. The latest development seemingly offers the chance to permanently see off a pesky foe from last year.
  • WWW.NINTENDOLIFE.COM
    Summer Game Fest Returns In June With A 2-Hour Showcase
    Image: Summer Game Fest
    It's been confirmed that Summer Game Fest will return this year with a 2-hour showcase taking place on 6th June at the YouTube Theatre in Los Angeles. Following this, a 'Play Days' hands-on event will take place from 7th-9th June, in which media and influencers can experience new and upcoming games from over 40 attending publishers.
    A brand-new addition for 2025 is what's being called a 'thought leader event', curated by Christopher Dring (previously GamesIndustry.biz) and Geoff Keighley. This will feature key leaders from the games industry and beyond who will come together to "delve into some of the key changes, challenges and opportunities facing the global video game industry, as well as celebrate the cultural impact and importance of video games as the most powerful form of entertainment in the world".
    Public tickets for the showcase will be available for purchase in Spring, but those hoping for a strong presence from Nintendo may need to temper their expectations; the publisher isn't known for collaborating much with Summer Game Fest, and with the Switch 2 launching later this year, it will likely be controlling its own message with the upcoming April Direct.
    For now, you can sign up for updates over at summergamefest.com. Are you bothered about following this year's Summer Game Fest event, or do you think Nintendo will just stick to its own guns as usual? Let us know with a comment down below.
  • WWW.NINTENDOLIFE.COM
    Sorry Romantics, "Bonkers" Dating Sim 'Date Everything' Has Been Delayed To June
    Image: Team17
    We've been keeping an eye on Date Everything, the romance sim where you can, fittingly, date everything, since it was revealed last summer. A last-minute delay pushed the game to a Valentine's Day 2025 release, but developer Sassy Chap Games has since decided that it still needs a little more time in the oven. Put Cupid on hold, because Date Everything is now expected to launch in June.
    In a statement from Sassy Chap Games' Lead Designer Ray Chase, the dev explained that while the game is complete, there's still a whole bunch of testing that needs to be done. Releasing the game in its current state, Chase explains, would be a "disservice", so the team has decided to push things back by a few months to make sure that everything is in tip-top shape. You can find Chase's full statement below:
    To our fellow Dateviators,
    Since we last updated you on development, we have been extremely hard at work finishing work on Date Everything! And at this point I can confidently say that we have reached that point where the game is complete to a standard that we feel reached our goals with no compromise in our bonkers artistic vision.
    However... I was too confident that we could properly test ALL the wild amount of content and pathing that exists in this massive game, and unfortunately we ran out of time on our current (and yet so appropriate) release date of February 14th, 2025.
    Our bug list is finally starting to dwindle down as QA gets through the labyrinthine story pathing, but to submit our game in the state with so many outstanding glitches would be doing you a disservice. We have our final release date set for June 2025. And while it isn't quite as sexy a date as Valentine's Day, we hope we can bring new sexiness to June evermore. And yes, you actually can date the glitches. Their name is Daemon and I am currently in a Love/Hate relationship with them.
    It's always a shame to hear of a project getting delayed (particularly when it happens twice), but in a game where there are so many romantic outcomes, we can only imagine that the list of potential glitches quickly piles up, and we'd always rather have a bug-free experience, if possible. We'll be keeping an eye out for more information from Team17 and Sassy Chap Games over the coming months for a more secure release date, ready to mark it with a heart on the calendar, naturally.
    What do you make of this delay? Are you still keen to play Date Everything later this year? Let us know in the comments.
    [source x.com]
  • TECHCRUNCH.COM
    Microsoft signs massive carbon credit deal with reforestation startup Chestnut Carbon
    Microsoft announced Thursday that it's buying over 7 million tons of carbon credits from Chestnut Carbon. The 25-year deal would enable Chestnut Carbon to reforest 60,000 acres of land across Arkansas, Louisiana, and Texas, Axios reported. Recently, the tech company has struggled to rein in its carbon emissions as AI has driven a surge in data center construction and use.
    Microsoft reported last year that its emissions rose 29% since 2020 as a result of the boom in AI and cloud computing, imperiling its 2030 goal to sequester more carbon than it produces. In 2023, the company reported generating 17.1 million tons of greenhouse gas emissions before offsets.
    Carbon credits come in a range of flavors. Chestnut Carbon focuses on reforestation, in which the company facilitates tree planting and then monitors the new forests to ensure they grow as planned and aren't cut down. The company currently has eight projects in the Southeast U.S. on land that was previously worked as farms or pastures.
    Trees naturally sequester carbon as they grow, though not all forest-related carbon credits are created equal. Credits from projects that plant non-native, fast-growing trees are generally seen as lower quality and sell for less, since they don't tend to support as much biodiversity and the trees don't tend to live as long. Projects that support diverse, native plantings typically sell at a premium, since the resulting ecosystems tend to be more resilient over time.
    Even premium carbon credits from afforestation, reforestation, and avoided deforestation are a relative deal compared with some alternatives. Chestnut Carbon sold credits last year for $34 per ton, whereas direct air capture, which uses fans and chemical sorbents to draw CO2 out of the atmosphere, costs around $600 to $1,000 per ton today. Despite the cost differential, Microsoft has also bought carbon credits from direct air capture startups.
    For all their strengths, nature-based carbon credits aren't always perfect. Verra, which has the largest nature-based carbon credit portfolio, was the subject of an extensive investigation in 2023 which reported that the organization overstated the climate benefit of its projects. The scandal led to the CEO's ouster and made the industry reassess the standards it uses. Chestnut Carbon, which previously used Verra to certify its carbon credits, today uses Gold Standard.
  • TECHCRUNCH.COM
    Waymo employees can hail fully autonomous rides in Atlanta now
    Waymo said it is launching fully driverless robotaxi rides for employees in Atlanta, an important step before the company opens the service up to members of the public later in 2025.
    This is the latest signal of Waymo's push into new markets, and it comes two months after the company closed a $5.6 billion Series C round at a $45 billion valuation. The round was led by heavy hitters including Alphabet, Andreessen Horowitz, Fidelity, Tiger Global, and others. The company earlier this week announced plans to test in 10 new cities this year, starting with San Diego and Las Vegas.
    When Waymo officially launches its commercial robotaxi service in Atlanta, it will be exclusively via the Uber app. Waymo and Uber also plan to launch together in Austin this year. The Alphabet-owned self-driving company opened up robotaxi rides to certain members of the public in Austin in October, after first offering rides to employees seven months earlier.
    Waymo's Atlanta milestone comes a day after Elon Musk said Tesla would launch a robotaxi service in Austin in June. Tesla has yet to bring a fully autonomous vehicle to public roads that doesn't require a human driver behind the wheel ready to take over. Waymo runs its own autonomous ride-hail service, Waymo One, in San Francisco, Phoenix, and Los Angeles.
  • WWW.ARTOFVFX.COM
    Dune Prophecy: Martyn Culpitt (VFX Supervisor) and the Image Engine team
    By Vincent Frei - 30/01/2025
    In 2018, Martyn Culpitt delved into Image Engine's visual effects work for Fantastic Beasts: The Crimes of Grindelwald. Since then, he has contributed to a variety of projects, including The Mandalorian, Fantastic Beasts: The Secrets of Dumbledore, Willow, and Kraven the Hunter.
    Jeremy Mesana brings more than 25 years of visual effects experience to the table, having collaborated with leading studios like MPC, Digital Domain, and Image Engine. His credits include work on Logan, Mulan, Snowpiercer, and Halo.
    With more than 10 years of experience in visual effects, Adrien Vallecilla built his career at studios like MPC and Digital Domain before joining Image Engine in 2022. His credits include Terminator: Dark Fate, She-Hulk: Attorney at Law, Ahsoka, and Halo.
    With more than 15 years in the VFX industry, Xander Kennedy has an extensive background that includes time at Luma Pictures, MPC, and ReDefine. Since joining Image Engine in 2021, he has contributed to various shows such as The Book of Boba Fett, Obi-Wan Kenobi, Foundation, and Leave the World Behind.
    Geoff Pedder brings 25 years of experience in the visual effects industry, having honed his craft at studios like MPC, Cinesite, ILM, and Image Engine. His portfolio includes work on Hawkeye, Fast X, Avatar: The Last Airbender, and Kraven the Hunter.
    Clement Feuillet joined Image Engine in 2023, following his work with studios such as Mikros Animation, Scanline VFX, Framestore, and Animal Logic on various shows like Godzilla vs. Kong, 1899, Peter Pan & Wendy, and Leo.
    EungHo Lo launched his VFX career at Weta FX in 2009 before joining Image Engine in 2015.
    Over the years, he has contributed to various projects, including Prometheus, Game of Thrones, Hawkeye, and Snowpiercer.
    How did it feel to enter the Dune universe?
    Martyn Culpitt (VFX Supervisor): When I learned that my next project would be Dune: Prophecy, I was beyond excited to contribute to this incredible universe. The opportunity to bring our own creative ideas and visions to such an iconic series was truly an honor. Having been in the industry since I was eighteen, I've had the privilege of working on only a handful of projects with worlds as unique and imaginative as this one.
    Denis Villeneuve's first Dune film was breathtaking. I was utterly immersed, transported to its rich, otherworldly setting. The VFX work by DNEG was incredible and, for me, again pushed the idea of what's possible. Working on Dune: Prophecy was creatively one of the coolest projects. We really tried to keep to the aesthetics of the first movie but also add our own touch to it.
    How was the collaboration with the showrunner and with VFX Supervisor Michael Enriquez?
    Martyn Culpitt (VFX Supervisor): We had a great collaboration with Mike and Alison Schapker, the showrunner. We were given a lot of creative freedom to come up with exciting ideas that were true to the Dune world but also unique to Prophecy. Concepts played a huge part in this for us, as they were a quick way to explore ideas and get feedback from Mike and Alison. There were a lot of creative challenges on the show, and Mike was great at helping us understand Alison's vision. It was an awesome show and a pleasure working with both of them. Hopefully we get to do it again.
    How did you organize the work with your VFX Producer?
    Martyn Culpitt (VFX Supervisor): Viktoria Rucker, my VFX Producer, and I had an excellent partnership on this project. Vik is incredibly organized and played a key role in ensuring turnovers happened as quickly as possible, allowing us to dive into the complex work for the show.
    She was fantastic with the client, maintaining a great rapport, and collaborated closely with Terron Pratt to ensure we had everything we needed. The show presented significant challenges due to the sheer number of plates and creative complexities, but Vik and Terron were outstanding in maintaining constant communication and keeping us on track to hit our targets. While the complexity of the shots and ongoing updates to the cut pushed some deadlines, Vik excelled at reshuffling priorities to ensure everything was delivered on time. She worked closely with Kitty Widjaja, our Production Manager, to make it all happen.
    Vik's organization and proactive approach were invaluable. I genuinely couldn't have done it without her and Kitty's support. Their ability to stay on top of tight deadlines made a huge difference and alleviated a lot of potential concerns.
    What are the sequences made by Image Engine?
    Martyn Culpitt (VFX Supervisor): The show presented us with a variety of unique and challenging sequences, each creatively exciting due to their distinct nature. One of the key sequences involved the sandworm shots, where the worm engulfs the Sisterhood complex in a fully CG-rendered Arrakis environment. These shots also included haunting nightmare scenes set in the Sisterhood's realm, featuring falling sand, thumpers in the Arrakis desert, and many other unique shots.
    Another standout was the ice lake sequence, where we crafted a dynamic snowstorm that reflected Valya's emotional state, blending white and black snow in a visually striking way. This required a fully CG environment, complete with detailed ice cracks and FX-driven snow effects.
    The holographic table was particularly memorable, depicting the Arrakis desert and the sandworm consuming Desmond.
    This scene was one of the most technically demanding, but the final result was stunning, and we're immensely proud of it. We also created a full CG mechanical robot battle sequence featuring MEK robots in a dynamic, fully digital junkyard environment. Additionally, we designed a CG bull that needed to feel grounded in reality while hinting at evolutionary traits from thousands of years in the future. This balance between realism and futurism made it an intriguing challenge.
    Another exciting addition was the CG lizard, created as the Prince's toy. This was a fun creature to design, and the animators and riggers enjoyed figuring out its unique movements for the sequence. Lastly, the Anirul world posed a fascinating challenge, requiring the development of a fully CG environment filled with intricate graphical elements. Collaborating with Territory Studios, we designed a detailed library of information stored within the Sisterhood, adding depth to this captivating world.
    The opening sequence with the giant robots is visually stunning. Can you walk us through the creative process behind designing and animating these massive machines?
    Martyn Culpitt (VFX Supervisor): We wanted the giant robots to feel both immense and imposing, yet grounded in reality. It was equally important that they exude intelligence, reflecting the advanced technical knowledge of their era. The concept department played a crucial role in this process, drawing inspiration from large, heavy machinery and industrial vehicles to give these robots a tangible sense of scale and presence. Once the concepts were approved, we transitioned to building detailed 3D models. The animation team collaborated closely with the modeling team to bring these machines to life, focusing on every aspect of their movement.
    We emphasized scale and weight, ensuring that each motion felt deliberate, heavy, and consistent with their massive size and mechanical nature.
    Jeremy Mesana (Animation Supervisor): As with most characters, we always start with a motion study, aiming to get locomotion and scale nailed down before we start shot production. Finding that early nuance in a character's movement usually saves time once shot production begins. For the robots, we focused on how the legs moved early on, while still in the concept stage, so that we could give mobility feedback to assets while still in the modeling stage. For the blaster fire, we again dove in early to figure out the mechanics of where the blaster originates, as well as how the laser tracking and blaster firing would work. Making solid headway on these things early helped the shots move much faster when it came to shot production.
    How did you approach the challenge of integrating the robots seamlessly into the war zone environment? What techniques were used to ensure their scale felt imposing?
    Martyn Culpitt (VFX Supervisor): When we received the RAW plates, they featured soldiers running through the frame, with detailed ground debris and practical special effects explosions. These elements provided a strong foundation for integrating the CG robots seamlessly into the scene. Building on this, we added even more debris and significantly extended the environment to enhance the sense of scale and depth. To fully immerse the robots in the action, we layered in additional FX elements, such as explosions, flying debris, and atmospheric effects, ensuring the CG seamlessly blended with the live-action footage and amplified the intensity of the scene. We focused on animating the robots to convey their massive scale, ensuring every movement felt deliberate and heavy.
    Their positioning was carefully crafted to emphasize their imposing and menacing presence, amplifying their sense of threat.
    Jeremy Mesana (Animation Supervisor): Having enough reference elements in the environment, like bodies and debris, gave the robots a good sense of scale. The ground interaction also helped convey scale, with the laser explosions and collision effects.
    Can you tell us more about the interplay between practical effects and CGI in this sequence? Were there any physical elements used as a base for the robots?
    Martyn Culpitt (VFX Supervisor): There were a few practical elements integrated into the scene. For example, a blue laser mounted on a structure was used to simulate where the robot might scan its surroundings. While it provided a useful reference, the timing didn't align with the robot's animation, so we painted it out and replaced it with a CG version that synced perfectly with the robot's movements. Similarly, practical explosions were set off to mimic the robot firing its weapon. These worked well but were enhanced with additional CG explosions and FX simulations to sell the scale of the impact. While the scene already featured some atmospheric elements and practical explosions, we layered in new FX versions to ensure they aligned seamlessly with the CG robots' depth in the scene and animation, enhancing the overall integration and intensity of the shot.
    Adrien Vallecilla (CG Supervisor): HBO constructed a simple mechanical structure to represent the height of the main robot, with an animated laser mounted on it and aimed at the running soldiers. We replaced the mechanical structure and laser with our CGI robot, extended the set, added traveling blasters, smoke, and explosions over the plate, and enhanced the set explosions with additional CG elements. The plates provided a solid foundation, but replacing the practical laser was challenging.
    This required us to completely paint out the set laser and incorporate CG interactive lighting from the robot's laser and blasters into the plate elements.
    What were the key technical hurdles in creating such a complex opening scene, and how did the team at Image Engine overcome them?
    Martyn Culpitt (VFX Supervisor): One of the biggest challenges we faced was the existing practical elements in the plates, which included dynamic atmospherics, lasers, and explosions. This required us to be extremely careful in how we integrated our FX work to ensure everything blended seamlessly. Fortunately, the client provided lidar scans, which were invaluable for aligning the robots and the CG environment extension with the correct depth and blending it all together. It was a creative challenge for sure.
    Adrien Vallecilla (CG Supervisor): One of the biggest concerns was the amount of FX fires, smoke, and explosions needed to make the battlefield feel realistic. Early in FX, we created a library of explosions, fires, smoke plumes, and ambient smoke elements that animation placed in the scenes. We created cross-software attributes for animation to offset, scale, and duplicate those FX elements, and once approved, these were published to lighting and rendered. Because we render our animation dailies with Arnold too, this approach allowed us to creatively approve the battlefield at the animation stage, saving time downstream and avoiding unnecessary caching in FX.
    The sandworms are iconic to the Dune universe. How did you ensure your version of these creatures remained faithful to their legacy while introducing new visual elements?
    Martyn Culpitt (VFX Supervisor): We really studied the Dune movies to thoroughly understand the movements of the sandworms and their sense of scale. The worms' larger movements are smooth and subtle, which contributes to their impressive scale.
    However, it's the moments when the worms pause and interact with the actors that reveal the depth and detail of their animation, truly bringing them to life. The movement of the worms' teeth, lips, and mouth conveys an incredible sense of scale and believability. To achieve this, we developed a sophisticated rig for the teeth, allowing them to move in response to the creature's vibrations as it traverses its environment and interacts with the sand. We also ensured that the individual hairs could be animated as needed.
    What was the biggest challenge in creating the sandworms' interactions with the environment, especially with the intricate movement of sand?
    Martyn Culpitt (VFX Supervisor): We carefully analyzed numerous shots from the films to understand how the worms move through the sand, focusing on the flow of their movement and the way the sand flows and interacts with them. Particular attention was given to how the sand behaves in each shot and the individual elements that contribute to the worms' immense sense of scale. When you truly study the films, you realize the sand isn't just a simple element; it's a complex interplay of layers, movements, and volumes. There are countless FX passes working together to create the illusion of dynamic sand that enhances the scale and realism of the shots. This detailed study of sand movement was essential to integrating our creature seamlessly into its environment and ensuring it remained faithful to the films' aesthetic and vision.
    Adrien Vallecilla (CG Supervisor): The main challenge ended up being the distance from the ground. Because the sand was made of particles, we needed to simulate millions of particles to make the simulations look realistic at that scale.
    The simulation and rendering time slowed down the creative process, but we managed to optimize and speed things up by splitting the elements into sections and reviewing particles and volumes separately. One of the creative challenges we faced was balancing the volumes with the sand particles. There isn't much reference for such a massive amount of moving sand, so it took us some time to find the right balance. Another challenge was that the sequence took place in a dream, and for creative reasons, the worm had a different scale in each shot. This posed a challenge for FX and required them to build flexible templates to accommodate scale adjustments for the worm.
    Could you share insights into the textures and detailing of the worm's skin? How did the team achieve such a realistic yet otherworldly appearance?
    Martyn Culpitt (VFX Supervisor): Having the Dune movies as a reference was hugely instrumental in shaping the look of our creature. We dedicated significant time to studying the details and nuances, ensuring we accurately replicated the textural qualities and intricate model details of the worms.
    To enhance the realism, we added fine surface hairs and introduced additional textural breakup, bringing even more depth and life to the creature in our shots.
    Geoff Pedder (Asset Supervisor): We studied the worms in the movies quite closely and used a mixture of ZBrush sculpting for larger-scale features, render-time displacement, dust maps, and groom to achieve the fine details and light response we were looking for.
    Clement Feuillet (Texture Artist): Modeling did a phenomenal job detailing the hi-resolution worm, so I was able to bake the very high-poly details of the worm to use as a base for texturing. We then tried to find good references for the worm's scales and skin textures by looking at the movies, internet references, and different types of stones. We blended different scan data of rock and sand textures, starting with the high-resolution baked data as our foundation. I created the scale mask by comparing the low-poly and high-poly meshes, and then we worked our way from large details down to smaller ones, utilizing all the data we had generated before: occlusion to add intricate sand details between scales, and gradient and curvature masks to create a more weathered appearance on the tips of the scales that come into direct contact with the sand.
    EungHo Lo (Organic Modeller): The specific appearance of the worm was already depicted in the movie, so it was a great starting point for the design process. The level of detail varies greatly depending on how much the worm appears in the movie and how close it gets to the camera. Since most of its appearances focus on the head and open mouth, we dedicated significant effort to that part.
    Additionally, because it always appears with sand, creating realistic sand greatly enhanced the realism of the worm. Since the worm lives in the sand, remains constantly dry, and has a very hard shell, we tried to capture that texture and skin detail by combining the hardness of rocks with the dry, rough texture of an elephant's skin.
    The lizard-like robot adds a unique touch to the narrative. How did you conceptualize its design and movement to balance its mechanical and organic features?
    Martyn Culpitt (VFX Supervisor): The original design of the lizard was provided as a concept model by the client, which we further refined and enhanced to add greater detail and bring more life to its overall appearance. Our aim was to create a creature that seamlessly blended the organic qualities of a real lizard with the precision of a mechanical machine. To achieve this, we conducted in-depth movement studies, focusing on key details to realize this unique vision. Animating the lizard was particularly challenging, as we needed to maintain a lifelike quality while incorporating a mechanical edge. The team did an incredible job striking that balance. The difficulty increased once the lizard was stabbed by Desmond's knife: we had to carefully adjust its movements to convey damage, giving it a staccato-like, jerky quality. In the end, the final animation really captured the feel Alison, the showrunner, was after, and she absolutely loved it.
    Jeremy Mesana (Animation Supervisor): In its undamaged state, we played the lizard as more organic in its movements, referencing a real-life lizard for motion.
    It's when it got damaged that we leaned more into a mechanical, staccato-like robot motion to further sell that it was one of the thinking machines.
    What was the most intricate part of animating the robot lizard, particularly when showcasing its interactions with the characters and environment?
    Martyn Culpitt (VFX Supervisor): The transformation from the ball to the lizard unfurling on Pruwet's hand was one of the most complex challenges we faced. The lizard's design featured countless scales that needed to be individually animated while also blending seamlessly as the ball unfurled. In its spherical form, the lizard has an abundance of scales that make up the sphere, but as it transitions into its lizard form, nearly half of those scales disappear. We had to find a precise and intricate way to blend and hide the scales during the transformation. It was almost like a Transformer-style transition, which made the process both fascinating and challenging for the team. Creating this effect required a highly complex rig and a lot of time and effort to get everything just right, but it was incredibly rewarding to see it come together.
    Jeremy Mesana (Animation Supervisor): The transformation was the most intricate of the tasks for the lizard: unfurling itself from its ball state, springing to life, and making a believable transition from one state to the other.
    Were there any specific inspirations or references the team used to develop its behavior and personality?
    Jeremy Mesana (Animation Supervisor): We gathered a slew of real-life lizard footage as reference for the motion.
    The holograms in Dune: Prophecy are incredibly intricate. What were the creative inspirations behind their design and color palette?
    Martyn Culpitt (VFX Supervisor): The holographic table sequence was one of the most challenging yet creative moments in the series. Holograms have been portrayed in countless ways before, and we were determined not to mimic what had already been done.
    To create something unique, we collaborated closely with Mike, the visual effects supervisor, and Alison, the showrunner, presenting numerous concepts for how the holographic table would look and function. Our goal was to make it feel tangible, as though the table was truly projecting light that interacted with surfaces to create the images and worlds we see. One of the biggest hurdles was reconciling the differences between the on-set cameras and the projectors within the scene. This required extensive problem-solving to ensure the visuals came together seamlessly.
    We also drew inspiration from the first film, particularly the scene where Paul reviews footage in his bedroom of the Fremen walking across the sands of Arrakis. Incorporating elements from that moment helped add continuity and authenticity to our designs. The footage in that scene has a distinct degradation, an aged, film-like quality reminiscent of a real projector, and we wanted to ensure our holographic table carried a similar aesthetic. Alison, the showrunner, was clear that the holographic table should not feel like sleek, futuristic technology. Instead, she wanted it grounded in the real world, with a tactile, analog quality rather than a polished, digitally enhanced look.
    Xander Kennedy (CG Supervisor): One of our main inspirations for the hologram was the Dune universe itself. There are a lot of conceptual elements that could make up a hologram, but grounding ourselves within the lore and creative elements of our universe was crucial. It became clear that the style in which the Dune holograms spoke was quite an old-school-projector look and feel. The fluttering of a film roll in front of a bulb creates unique artifacts, jitters, and light rays that help produce a believable image. The color palette has a similar story: all the elements were built up around true photography, as if the hologram had been filmed and recorded in a real environment on Arrakis.
Paying careful attention to our key and fill ratios allowed us to build up the holographic treatment correctly. The warmth of Arrakis played in our favor, and using more saturated mids and shadows really gave the hologram a sense of depth, while the highlights kept a sense of connection to the projectors.

How did the team handle the technical challenges of rendering holographic elements that interacted dynamically with characters and settings?

Xander Kennedy (CG Supervisor): The team did really well collaborating with each other; some of our render layers had to be rendered out of lighting and passed back downstream to FX in order to generate the correct projected light rays to connect to the highlights of the final image. It was a bit of a cyclical process, but utilizing some of our pipeline templates and key leads and artists, we were able to manage a staggered process that ended up being pretty hands-free. Each shot had dozens of AOVs, with multiple artists and FX elements split for the purpose of recombining in a way that was sympathetic to layering. In order to squeeze the most height and depth out of the hologram, the comp team also spent a lot of time meticulously placing additional elements that gave a lot of life to the final image.

Were there any unexpected technical or creative challenges encountered during the production?

Martyn Culpitt (VFX Supervisor): The project presented numerous creative and technical challenges, but the crew excelled at prioritizing issues and devising solutions within an incredibly tight deadline. Each sequence was unique in its demands, requiring significant creative effort to solve, often accompanied by unforeseen technical hurdles. One particularly challenging sequence was the ice lake environment. Its black-and-white snow presented a complex problem, as we needed to fine-tune the values and variations of snow on a per-shot basis.
This required developing new tools and workflows across multiple departments to achieve the desired results. The team's collaborative spirit was truly remarkable, and their unwavering support for one another made all the difference. It's a testament to their dedication and hard work that we were able to deliver a project of such high creative quality.

Adrien Vallecilla (CG Supervisor): One of the challenges was the Anirul genetic library. It is always hard to work with holographic elements, but the challenging aspect of this work is that the environment and holograms are characters in the episode. On top of carrying many different communication tools, Anirul itself needed to look beautiful and contribute to the artistic universe of Dune. Each asset in this environment has different attributes built in for animation, so the artists were able to move, pulse, and fade the holograms in and out. We needed to create publishable, attribute-driven assets to accomplish this in our 3D scenes: the animators animated attributes in Maya that allowed the holograms to behave how we wanted, and these were then published to lighting for proper rendering. One of the goals of having the holograms in 3D space was to achieve realistic reflections and lighting that would interact with the environment as Anirul communicates. Organizing all these elements to create an Anirul environment that elegantly tells the story was one of the most challenging aspects of this work.

Xander Kennedy (CG Supervisor): I think something we didn't anticipate was the level of technical and creative direction we needed to give the physical projector lightbulbs of the holographic table itself. We ended up making a small rig/animation that was instanced onto each of the light placements, using a variety of attributes; just for one shot, we had to reveal how the projector turned on.
This sent us down a small rabbit hole where we had to think: okay, how would this thing reveal a bulb from underneath the table surface and turn on a light in an aesthetically pleasing way? This had input from almost every department, from asset, animation, FX, lighting and comp, to compose the right effect, and was quite extensive. I don't know if anyone other than the artists working on it would ever fully grasp the full extent of what we achieved.

I think the biggest technical hurdle for me was designing a system to art-direct black and white snow within each shot, while still having the shots depend on and influence each other cohesively. The snowstorm was 60-some shots, all of varying density, storm intensity, volumetrics, particles and black/white levels, which meant we had to carefully design a system that could output control of quite a variety of creative attributes while still being able to hit creative notes on each shot and pivot if need be. Eventually, we came up with a system that minimized FX simulations, maximized attribute control and provided flexibility for other departments to use. It meant FX, lighting and comp had to work closely and share setups and knowledge extensively throughout the process.

Looking back on the project, what aspects of the visual effects are you most proud of?

Martyn Culpitt (VFX Supervisor): It's incredibly rewarding to look back at the end of a project and reflect on the challenges we faced and overcame, emerging with an exceptional body of work we can all be proud of. Every project is a team effort, but on this one, the crew truly went above and beyond to problem-solve, support one another, and uphold the creative and visual quality of the show to an extraordinary standard. The work is genuinely outstanding, and the team's passion was evident from the very beginning.
Everyone was deeply committed to making this project the best it could be, and their excitement to contribute was inspiring. There are so many sequences I'm particularly proud of. The Arrakis world with the worm and the FX simulations looks fantastic. The holographic table sequence stands out with its distinctive and complex creative build; it's a testament to the immense effort that went into it. One of my favorite scenes is the Ice Lake: the amount of environmental FX simulation work, all driven by Valya's emotions, was pretty incredible. There are so many layers, and FX and comp really took this and made it special. The Anirul world we created is so unique, with the trees and complexity; I've never seen anything quite like it. It was a huge collaborative effort with the team from Territory Studios to bring their graphics to life in our full CG world. This was truly an unforgettable project to be part of, and I feel incredibly fortunate to have worked on it.

How long have you worked on this show?

Martyn Culpitt (VFX Supervisor): Start date: Jan 8, 2024. Finished: Nov 15, 2024.

What's the VFX shot count?

Martyn Culpitt (VFX Supervisor): 208.

What is your next project?

Martyn Culpitt (VFX Supervisor): I can't disclose that yet, but I can tell you it's going to be a pretty cool project.

A big thanks for your time.

WANT TO KNOW MORE?
Image Engine: Dedicated page about Dune: Prophecy on the Image Engine website.
Mike Enriquez & Terron Pratt: Here's my interview with Production VFX Supervisor Mike Enriquez and VFX Producer Terron Pratt.

© Vincent Frei – The Art of VFX – 2025
  • WWW.ARCHPAPER.COM
    Announcing the 2025 Georgia Titan 100 recipients
    Titan CEO is pleased to announce Holly Gotfredson of American Metalcraft and Finishing Dynamics as a 2025 Georgia Titan 100. The Titan 100 program recognizes Georgia's top 100 CEOs and C-level executives, the area's most accomplished business leaders in their industries, selected on criteria that include demonstrating exceptional leadership, vision, and passion. Collectively, the 2025 Georgia Titan 100 and their companies employ over 136,000 individuals and generate over $61 billion in annual revenues. This year's honorees will be published in a limited-edition Titan 100 book and profiled exclusively online. They will be honored at the annual awards ceremony on May 8, 2025, and will be given the opportunity to interact and connect multiple times throughout the year with their fellow Titans. This year's Titan 100 embody the true diversity of Georgia's business landscape, representing construction, marketing and advertising, financial services, food and beverages, information technology and services, and nonprofit sectors, among others.

Georgia's Titan 100 are redefining business with vision and purpose, setting new standards for growth, innovation, and impact. These trailblazing leaders inspire transformation across industries, uplift communities, and drive meaningful change. We proudly celebrate their legacy of excellence and unwavering commitment to shaping a brighter future for all, said Jaime Zawmon, president of Titan CEO.

Holly Gotfredson is the president of American Metalcraft and Finishing Dynamics. (Courtesy American Metalcraft)

Holly Gotfredson, the president of American Metalcraft and Finishing Dynamics, stands at the helm of two WBE-certified architectural metal product companies that have been prominent forces in the industry for decades.
Under her visionary leadership, American Metalcraft remains a trusted resource, providing an extensive array of high-quality solid plate aluminum, zinc, and stainless products, including rainscreens, wet seal systems, perforated panels, ornamental metal, custom sunshades, column covers, brake metal and decorative exterior elements. Finishing Dynamics, her in-house metal coating company, provides extrusion coatings not only for American Metalcraft products, but for other fabricators throughout the nation.

Holly Gotfredson will be honored at the annual Titan 100 awards celebration on May 8, 2025, held at the Delta Flight Museum. The 68,000-square-foot Delta Flight Museum has allowed visitors from around the world to explore aviation history, celebrate the story and people of Delta, and discover the future of flight. This elegant, cocktail-style awards event will unite 100 Titans of Industry for an unforgettable evening of celebration, camaraderie, and networking, an evening unlike anything that exists in the Georgia business community.
  • WWW.ARCHPAPER.COM
    Luis Barragán's La Cuadra San Cristóbal to be adapted into new, public cultural venue by Fernando Romero
    A compound in Mexico City designed in the 1960s by Luis Barragán and his protégé, Andrés Casillas, as an equestrian ranch and property for recreational amusement is slated to be transformed into a public cultural destination later this year. La Cuadra San Cristóbal, a 6.5-acre site with signature bright pink walls, has become a pilgrimage destination of sorts for traveling architects seeking an Instagram mecca. In 2016, Louis Vuitton staged a popular editorial campaign there, where model Léa Seydoux strutted before the equestrian estate's polychromatic walls and horses.

Fundación Fernando Romero, a nonprofit established by architect and philanthropist Fernando Romero, announced plans this week to transform La Cuadra San Cristóbal into a new, public cultural destination. The vision includes the addition of new pavilions and other temporary interventions. Romero said he hopes that, looking ahead, the compound will serve the local context in Mexico City, but also the global architecture community at large.

La Cuadra San Cristóbal was designed in the 1960s as an equestrian estate. (Yannick Wegner/Courtesy Fundación Fernando Romero)

Our work at the Fundación is driven by the belief that architectural innovation and artistic production can help foster a more just and culturally vibrant world, Romero shared in a statement. It is a great honor to begin this work by envisioning La Cuadra as a dynamic cultural hub that encourages new possibilities at the intersection of art and architecture, Romero added. Through a range of programming, we aim to catalyze the power of architecture for the visiting public and celebrate the enduring cultural influence of Luis Barragán.

The vision will transform the property into a hub for the arts and artists. (Yannick Wegner/Courtesy Fundación Fernando Romero)

Romero purchased La Cuadra San Cristóbal in 2017, one year after the Louis Vuitton photoshoot.
Now, he hopes to use the compound for displaying works by artists, architects, and designers. Over the next ten years, a multi-phase plan for the campus by Romero will add new pavilions adjacent to the Barragán compound, one of which will be a timber structure by Kengo Kuma. Marina Abramović has been invited to create a temporary intervention in the campus's courtyard, where a shallow water feature and Barragán's heavy use of color are on full view. Aside from the new pavilion by Kengo Kuma, other new features will include a permanent exhibition that tells the story of Barragán's life and work; an artist residency program; and additional galleries, including one that displays the Archivo Collection of design objects. In addition to galleries and artist workshops, the transformed La Cuadra San Cristóbal will house a podcast production studio, a library, a multi-purpose event space, and, of course, a gift shop and cafe. Future temporary, site-specific installations are also planned for the property.

Among the notable elements of the property is a water feature. (Yannick Wegner/Courtesy Fundación Fernando Romero)

The exhibition indebted to Barragán will be curated by Jorge Covarrubias, the same architect who restored Barragán's Casa Prieto López and Fuente del Bebedero. The exhibition will focus on a few key projects by the late designer, including Casa Prieto López, Casa Gálvez, Casa Gilardi, Casa Estudio Barragán, Convento de las Capuchinas, Torres de Satélite, Fuente del Bebedero, and, of course, La Cuadra San Cristóbal.

La Cuadra San Cristóbal will open to the public in October 2025.
  • WWW.COMPUTERWEEKLY.COM
    ServiceNow vaunts agentic AI and announces 22% annual revenue growth
    ServiceNow has reported full-year revenue of $10.98bn, representing 22% year-on-year growth. Fourth-quarter revenue was close to $3bn, growing by 21% year-on-year. The supplier is making its artificial intelligence (AI) efforts front and centre of its messaging.

Bill McDermott, chairman and CEO of ServiceNow, said: AI is fuelling a top-to-bottom reordering of the enterprise technology landscape. Leaders are embracing the ServiceNow platform as their AI agent control tower to unlock exponential productivity and seamlessly orchestrate end-to-end business transformation. We are still in the early days of a massive opportunity. ServiceNow's innovation, growth and profitability put us in a class of one.

In its results statement, the company said it has 2,109 customers with more than $1m in annual contract value (ACV), representing 12% year-on-year growth in such customers, and nearly 500 customers with more than $5m in ACV, representing 21% year-on-year growth. It added that the number of customers buying two or more of ServiceNow's Pro Plus AI capabilities doubled quarter-on-quarter, and said it has nearly 1,000 agentic AI customers.

Gina Mastantuono, ServiceNow president and chief financial officer, said: Our GenAI net new ACV stepped up meaningfully in Q4, as the number of Now Assist service desk deals grew over 150% quarter-over-quarter. We're just scratching the surface of what's possible. The moves we're making in 2025 aren't just about maintaining our lead; they're about expanding it.
We are setting ourselves up to define the future of agent-powered automation, solidify ServiceNow as the AI platform for business transformation, and deliver strong growth year after year.

At the same time, the supplier announced some product updates, including AI Agent Orchestrator, which connects teams of AI agents working across tasks, systems and departments to drive workflows; AI Agent Studio, a low-code/no-code tool allowing customers to build customised AI agents; and thousands of pre-built, ready-to-deploy AI agents designed for workflows across IT, customer service and HR.

Raj Sharma, global managing partner at EY, said in support of ServiceNow's agentic AI technology: AI agents are critical to empower teams with intelligent capabilities working in collaboration between humans and AI. This is why we are working with ServiceNow and our ecosystem partners to harness the full potential of agentic AI across our AI platforms at enterprise scale, enabling us to integrate and contextualise data across our entire organisation in real time, with the high levels of trust and transparency we need built in.

And another customer, Rachel Cameron, head of transformational programmes at Rolls-Royce, added: Rolls-Royce has always been at the forefront of engineering excellence and innovation, continuously finding ways to improve efficiency, resilience and employee experience. By integrating ServiceNow AI agents, we are streamlining operations, reducing manual effort and enabling faster, data-driven decision-making.
AI-powered automation is helping us deflect service desk tickets, optimise workflows and provide intelligent insights, allowing our teams to focus on high-value activities while ensuring our operations remain efficient, secure and future-ready.

ServiceNow ecosystem

ServiceNow announced, with its results statement, some partnership developments, including:

An expanded partnership with Google Cloud: the ServiceNow Platform will be available on Google Cloud Marketplace and Google Distributed Cloud, and the companies will integrate ServiceNow Workflow Data Fabric and cross-enterprise workflows with Google Cloud's AI infrastructure, development platforms and productivity tools.

Oracle integration: expansion of ServiceNow's Workflow Data Fabric capabilities through an integration with Oracle data sources, turning insights into action for enhanced decision-making and agility.

ServiceNow's results were announced in the same week as those of its chairman and CEO Bill McDermott's former company, SAP. SAP announced full-year revenue of €34.2bn, representing 10% year-on-year growth, and fourth-quarter revenue of €9.4bn, up 11% year-on-year. The German-headquartered supplier also highlighted its AI story. CEO Christian Klein said: Q4 was a strong finish to the year, with half of our cloud order entry including AI. Looking at the full year, we exceeded our cloud goals, accelerating cloud revenue and current cloud backlog growth against a much larger base. Total cloud backlog now stands at €63bn, up 40%. Revenue growth has returned to double digits. Looking ahead, our strong position in data and business AI gives us additional confidence that we will accelerate revenue growth through 2027.

In its results statement, SAP highlighted its October 2024 announcement of powerful new capabilities that complement and extend Joule, including collaborative AI agents imbued with custom skills to complete complex cross-disciplinary tasks.
Joule is the supplier's generative AI assistant for its cloud portfolio. SAP also highlighted fourth-quarter customer sign-ups for its flagship Rise with SAP cloud migration programme. These included BASF, BP International, EY, Ford Motor Company and Hannover Medical School. It stated that, among others, North Yorkshire Council and Warrington Borough Council chose Grow with SAP, the supplier's mid-market programme aimed at increasing sales of the cloud version of its S/4 Hana enterprise resource planning (ERP) system, in the supplier's words: an offering helping customers adopt cloud ERP with speed, predictability and continuous innovation.

Read more about ServiceNow:
ServiceNow introduces AI agent studios and orchestrator.
ServiceNow ramps up partner incentives around AI.
Gartner Symposium: Why ServiceNow wants to be seen as the AI platform for business transformation.
  • WWW.COMPUTERWEEKLY.COM
    First international AI safety report published
    The first International AI Safety Report will be used to inform upcoming diplomatic discussions around how to mitigate a variety of dangers associated with artificial intelligence (AI), but it highlights that there is still a high degree of uncertainty around the exact nature of many threats and how best to deal with them.

Commissioned after the inaugural AI Safety Summit hosted by the UK government at Bletchley Park in November 2023, and headed by AI academic Yoshua Bengio, the report covers a wide range of threats posed by the technology, including its impact on jobs and the environment, its potential to proliferate cyber attacks and deepfakes, and how it can amplify social biases. It also examines the risks associated with market concentration over AI and the growing AI R&D (research and development) divide, but is limited to looking at all of these risks in the context of systems that can perform a wide variety of tasks, otherwise known as general-purpose AI.

For each of the many risks assessed, the report refrained from drawing definitive conclusions, highlighting the high degree of uncertainty around how the fast-moving technology will develop, and called for further monitoring and evidence gathering in each area. Current evidence points to two central challenges in general-purpose AI risk management, it said. First, it is difficult to prioritise risks due to uncertainty about their severity and likelihood of occurrence.
Second, it can be complex to determine appropriate roles and responsibilities across the AI value chain, and to incentivise effective action.

However, the report is clear in its conclusion that all of the potential future impacts of AI it outlines are primarily a political question, which will be determined by the choices of societies and governments today. How general-purpose AI is developed and by whom, which problems it is designed to solve, whether we will be able to reap its full economic potential, who benefits from it, and the types of risks we expose ourselves to: the answers to these and many other questions depend on the choices that societies and governments make today and in the future to shape the development of general-purpose AI, it said, adding that there is an urgent need for international collaboration and agreement on these issues. Constructive scientific and public discussion will be essential for societies and policymakers to make the right choices.

The findings of the report, which build on an interim AI safety report released in May 2024 that showed a lack of expert agreement over the biggest risks, are intended to inform discussion at the upcoming AI Action Summit in France, slated for early February 2025, which follows on from the two previous summits at Bletchley and in Seoul, South Korea.

Artificial intelligence is a central topic of our time, and its safety is a crucial foundation for building trust and fostering adoption. Scientific research must remain the fundamental pillar guiding these efforts, said Clara Chappaz, the French minister delegate for AI and digital technologies. This first comprehensive scientific assessment provides the evidence base needed for societies and governments to shape AI's future direction responsibly.
These insights will inform crucial discussions at the upcoming AI Action Summit in Paris.

In examining the broader societal risks of AI deployment, beyond the capabilities of any individual model, the report said the impact on labour markets in particular is likely to be profound. It noted that while there is considerable uncertainty over exactly how AI will affect labour markets, the productivity gains made by the technology are likely to have mixed effects on wages across different sectors, increasing wages for some workers while decreasing wages for others, with the most significant near-term impact being on jobs that mainly consist of cognitive tasks. Improved general-purpose AI capabilities are also likely to increase current risks to worker autonomy and well-being, it said, highlighting the harmful effects that continuous monitoring and AI-driven workload decisions can have, particularly for logistics workers.

In line with a January 2024 assessment of AI's impacts on inequality by the International Monetary Fund (IMF), which found AI is likely to worsen inequality without political intervention, the report said: AI-driven labour automation is likely to exacerbate inequality by reducing the share of all income that goes to workers relative to capital owners.

Inequality could be further deepened as a result of what the report terms the AI R&D divide, in which development of the technology is highly concentrated in the hands of large companies located in countries with strong digital infrastructure. For example, in 2023, the majority of notable general-purpose AI models (56%) were developed in the US.
This disparity exposes many LMICs (low- and middle-income countries) to risks of dependency and could exacerbate existing inequalities, it said, adding that development costs are only set to rise, deepening this divide further.

The report also highlighted the rising trend of ghost work, the mostly hidden labour performed by workers, often in precarious conditions in low-income countries, to support the development of AI models. It added that while this work can provide people with economic opportunities, the contract-style nature of the work often provides few benefits and worker protections and less job stability, as platforms rotate between markets to find cheaper labour. Related to all of this is the high degree of market concentration around AI, which allows a small handful of powerful companies to dominate decision-making around the development and use of the technology.

On the environmental impact, the report noted that while datacentre operators are increasingly turning to renewable energy sources, a significant portion of AI training globally still relies on high-carbon energy sources such as coal or natural gas, and uses significant amounts of water. It added that efficiency improvements in AI-related hardware alone have not negated the overall growth in AI's energy use, and may even accelerate it because of rebound effects, and that current figures largely rely on estimates, which become even more variable and unreliable when extrapolated into the future due to the rapid pace of development in the field.

Highlighting the concrete harms AI can cause as a result of its potential to amplify existing political and social biases, the report said it could lead to discriminatory outcomes, including unequal resource allocation, reinforcement of stereotypes, and systematic neglect of certain groups or viewpoints. It specifically noted that most AI systems are trained on language and image datasets that disproportionately represent English-speaking and
western cultures, that many design choices align with particular worldviews at the expense of others, and that current bias mitigation techniques are unreliable. A holistic and participatory approach that includes a variety of perspectives and stakeholders is essential to mitigating bias, it said.

Echoing the findings of the interim report around human loss of control of AI systems, which some worry could cause an extinction-level event, the report acknowledged such fears but noted that opinion varies greatly. Some consider it implausible, some consider it likely to occur, and some see it as a modest-likelihood risk that warrants attention due to its high severity, it said. More foundationally, competitive pressures may partly determine the risk of loss of control [because] competition between companies or between countries can lead them to accept larger risks to stay ahead.

In terms of malicious AI use, the report highlighted issues around cyber security, deepfakes and the technology's use in the development of biological or chemical weapons. On deepfakes, it noted the particularly harmful effects on children and women, who face distinct threats of sexual abuse and violence. Current detection methods and watermarking techniques, while progressing, show mixed results and face persistent technical challenges, it said. This means there is currently no single robust solution for detecting and reducing the spread of harmful AI-generated content.
Finally, the rapid advancement of AI technology often outpaces detection methods, highlighting the potential limitations of relying solely on technical and reactive interventions.

On cyber security, the report noted that while AI systems have shown significant progress in autonomously identifying and exploiting cyber vulnerabilities, these risks are, in principle, manageable, as AI can also be used defensively. Rapid advancements in capabilities make it difficult to rule out large-scale risks in the near term, thus highlighting the need for evaluating and monitoring these risks, it said. Better metrics are needed to understand real-world attack scenarios, particularly when humans and AIs work together. A critical challenge is mitigating offensive capabilities without compromising defensive applications.

It added that while new AI models can create step-by-step guides for creating pathogens and toxins that surpass PhD-level expertise, potentially lowering the barriers to developing biological or chemical weapons, this remains a technically complex process, meaning the practical utility for novices remains uncertain.

Read more about artificial intelligence technology:
Digital Ethics Summit 2024, recognising AI's socio-technical nature: At trade association TechUK's eighth annual Digital Ethics Summit, public officials, industry figures and civil society groups met to discuss the ethical challenges associated with the proliferation of artificial intelligence tools globally, and the direction of travel set for 2025.
Barings Law plans to sue Microsoft and Google over AI training data: Microsoft and Google are using people's personal data without proper consent to train artificial intelligence models, alleges Barings Law, as it prepares to launch a legal challenge against the tech giants.
AI interview, Thomas Dekeyser, researcher and film director: On the politics of techno-refusal, and the lessons that can be learned from a clandestine group of French IT workers who spent the early 1980s sabotaging technological infrastructure.