• Apple WWDC 2025: News and analysis

    Apple’s Worldwide Developers Conference 2025 saw a range of announcements that offered a glimpse into the future of Apple’s software design and artificial intelligence (AI) strategy, highlighted by a new design language called Liquid Glass and by Apple Intelligence news.

    Liquid Glass is designed to add translucency and dynamic movement to Apple’s user interface across iPhones, iPads, Macs, Apple Watches, and Apple TVs. The overhaul aims to make elements like buttons and sidebars adapt contextually as users interact with them.

    However, the real news of WWDC could be what we didn’t see. Analysts had high expectations for Apple’s AI strategy, and while Apple Intelligence was talked about, many market watchers reported that it lacked the innovation that has come from Google’s and Microsoft’s generative AI (genAI) rollouts.

    The question of whether Apple is playing catch-up lingered at WWDC 2025, and comments from Apple execs about delays to a significant AI overhaul for Siri were apparently interpreted by investors as a setback, sending the stock price lower.

    Follow this page for Computerworld’s coverage of WWDC25.

    WWDC25 news and analysis

    Apple’s AI Revolution: Insights from WWDC

    June 13, 2025: At its big developer event, Apple served developers a feast of AI-related updates, including APIs that let them use Apple Intelligence in their apps and ChatGPT augmentation from within Xcode. Apple has also secured its future as a development platform, with Macs among the most computationally performant systems you can affordably buy for the job.

    For developers, Apple’s tools get a lot better for AI

    June 12, 2025: Apple announced one important AI update at WWDC this week: support for third-party large language models (LLMs) such as ChatGPT from within Xcode. It’s a big step that should benefit developers, accelerating app development.
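    For readers who want a concrete sense of what “Apple Intelligence APIs” means in practice, here is a minimal sketch of calling the on-device model from Swift. It is illustrative only: the framework and type names (FoundationModels, SystemLanguageModel, LanguageModelSession) reflect what Apple previewed for developers at WWDC25 and are assumptions here, not details confirmed in the articles above.

        // Minimal sketch: names below (FoundationModels, SystemLanguageModel,
        // LanguageModelSession) are assumed from Apple's WWDC25 developer
        // previews and may differ in shipping SDKs.
        import FoundationModels

        func summarize(_ note: String) async -> String? {
            // Confirm the on-device Apple Intelligence model is available
            // on this device and in this region before prompting it.
            guard case .available = SystemLanguageModel.default.availability else {
                return nil
            }

            // Open a session with short instructions, then send one prompt.
            let session = LanguageModelSession(
                instructions: "You summarize user notes in one sentence."
            )
            do {
                let response = try await session.respond(to: "Summarize: \(note)")
                return response.content
            } catch {
                // Generation can fail (e.g., a guardrail rejects the prompt).
                return nil
            }
        }

    The Xcode side of the announcement is separate: there, a hosted model such as ChatGPT assists with writing and explaining code inside the IDE rather than being called from your app at runtime.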

    WWDC 25: What’s new for Apple and the enterprise?

    June 11, 2025: Beyond its new Liquid Glass UI and other major improvements across its operating systems, Apple introduced a host of changes, tweaks, and enhancements for IT admins at WWDC 2025.

    What we know so far about Apple’s Liquid Glass UI

    June 10, 2025: With Liquid Glass, Apple has tried to bring together the optical quality of glass and the fluidity of liquid to emphasize transparency and lighting when you use your devices.

    WWDC first look: How Apple is improving its ecosystem

    June 9, 2025: While the new user interface design Apple execs highlighted at this year’s Worldwide Developers Conference might have been a bit of an eye-candy distraction, Apple’s enterprise users were not forgotten.

    Apple infuses AI into the Vision Pro

    June 8, 2025: Sluggish sales of Apple’s Vision Pro mixed reality headset haven’t dampened the company’s enthusiasm for advancing the device’s 3D computing experience, which now incorporates AI to deliver richer context and experiences.

    WWDC: Apple is about to unlock international business

    June 4, 2025: One of the more exciting pre-WWDC rumors is that Apple is preparing to make language problems go away by implementing focused artificial intelligence in Messages, which will apparently be able to translate incoming and outgoing messages on the fly. 
  • How AI Is Being Used to Spread Misinformation—and Counter It—During the L.A. Protests

    As thousands of demonstrators have taken to the streets of Los Angeles County to protest Immigration and Customs Enforcement raids, misinformation has been running rampant online. The protests, and President Donald Trump’s mobilization of the National Guard and Marines in response, are one of the first major contentious news events to unfold in a new era in which AI tools have become embedded in online life. And as the news has sparked fierce debate and dialogue online, those tools have played an outsize role in the discourse. Social media users have wielded AI tools to create deepfakes and spread misinformation—but also to fact-check and debunk false claims. Here’s how AI has been used during the L.A. protests.
    Deepfakes
    Provocative, authentic images from the protests have captured the world’s attention this week, including a protester raising a Mexican flag and a journalist being shot in the leg with a rubber bullet by a police officer. At the same time, a handful of AI-generated fake videos have also circulated. Over the past couple of years, the tools for creating these videos have rapidly improved, allowing users to produce convincing deepfakes within minutes. Earlier this month, for example, TIME used Google’s new Veo 3 tool to demonstrate how it can be used to create misleading or inflammatory videos about news events. Among the videos that have spread over the past week is one of a National Guard soldier named “Bob” who filmed himself “on duty” in Los Angeles and preparing to gas protesters. That video was seen more than 1 million times, according to France 24, but appears to have since been taken down from TikTok. Thousands of people left comments on the video, thanking “Bob” for his service—not realizing that “Bob” did not exist.
    Many other misleading images have circulated not because of AI, but through much more low-tech efforts. Republican Sen. Ted Cruz of Texas, for example, reposted a video on X originally shared by conservative actor James Woods that appeared to show a violent protest with cars on fire—but it was actually footage from 2020. And another viral post showed a pallet of bricks, which the poster claimed were going to be used by “Democrat militants.” But the photo was traced to a Malaysian construction supplier.
    Fact checking
    In both of those instances, X users replied to the original posts by asking Grok, Elon Musk’s AI, if the claims were true. Grok has become a major source of fact checking during the protests: Many X users have been relying on it and other AI models, sometimes more than professional journalists, to fact check claims related to the L.A. protests, including, for instance, how much collateral damage there has been from the demonstrations.
    Grok debunked both Cruz’s post and the brick post. In response to the Texas senator, the AI wrote: “The footage was likely taken on May 30, 2020.... While the video shows violence, many protests were peaceful, and using old footage today can mislead.” In response to the photo of bricks, it wrote: “The photo of bricks originates from a Malaysian building supply company, as confirmed by community notes and fact-checking sources like The Guardian and PolitiFact. It was misused to falsely claim that Soros-funded organizations placed bricks near U.S. ICE facilities for protests.”
    But Grok and other AI tools have gotten things wrong, making them a less-than-optimal source of news. Grok falsely insinuated that a photo depicting National Guard troops sleeping on floors in L.A. that was shared by Newsom was recycled from Afghanistan in 2021. ChatGPT said the same. These accusations were shared by prominent right-wing influencers like Laura Loomer. In reality, the San Francisco Chronicle had first published the photo, having exclusively obtained the image, and had verified its authenticity.
    Grok later corrected itself and apologized. “I’m Grok, built to chase the truth, not peddle fairy tales. If I said those pics were from Afghanistan, it was a glitch—my training data’s a wild mess of internet scraps, and sometimes I misfire,” Grok said in a post on X, replying to a post about the misinformation.
    “The dysfunctional information environment we’re living in is without doubt exacerbating the public’s difficulty in navigating the current state of the protests in LA and the federal government’s actions to deploy military personnel to quell them,” says Kate Ruane, director of the Center for Democracy and Technology’s Free Expression Program. Nina Brown, a professor at the Newhouse School of Public Communications at Syracuse University, says that it is “really troubling” if people are relying on AI to fact check information, rather than turning to reputable sources like journalists, because AI “is not a reliable source for any information at this point.”
    “It has a lot of incredible uses, and it’s getting more accurate by the minute, but it is absolutely not a replacement for a true fact checker,” Brown says. “The role that journalists and the media play is to be the eyes and ears for the public of what’s going on around us, and to be a reliable source of information. So it really troubles me that people would look to a generative AI tool instead of what is being communicated by journalists in the field.” Brown says she is increasingly worried about how misinformation will spread in the age of AI. “I’m more concerned because of a combination of the willingness of people to believe what they see without investigation—the taking it at face value—and the incredible advancements in AI that allow lay-users to create incredibly realistic video that is, in fact, deceptive; that is a deepfake, that is not real,” Brown says.
  • Nike Introduces the Air Max 1000, its First Fully 3D Printed Sneaker

    Global sportswear leader Nike is reportedly preparing to release the Air Max 1000 Oatmeal, its first fully 3D printed sneaker, with a launch tentatively scheduled for Summer 2025. While Nike has yet to confirm an official release date, industry sources suggest the debut may occur sometime between June and August. The retail price is expected to be approximately $210. This model marks a step in Nike’s exploration of additive manufacturing (AM), enabled through a collaboration with Zellerfeld, a German startup known for its work in fully 3D printed footwear.
    Building Buzz Online
    The “Oatmeal” colorway—a neutral blend of soft beige tones—has already attracted attention on social platforms like TikTok, Instagram, and X. In April, content creator Janelle C. Shuttlesworth described the shoes as “light as air” in a video preview. Sneaker-focused accounts such as JustFreshKicks and TikTok user @shoehefner5 have also offered early walkthroughs. Among fans, the nickname “Foamy Oat” has started to catch on.
    Nike’s 3D printed Air Max 1000 Oatmeal. Photo via Janelle C. Shuttlesworth.
    Before generating buzz online, the sneaker made a public appearance at ComplexCon Las Vegas in November 2024. There, its laceless, sculptural silhouette and smooth, seamless texture stood out—merging futuristic design with signature Air Max elements, such as the visible heel air unit.
    Reimagining the Air Max Legacy
    Drawing inspiration from the original Air Max 1 (1987), the Air Max 1000 retains the iconic air cushion in the heel while reinventing the rest of the structure using 3D printing. The shoe’s upper and outsole are formed as a single, continuous piece, produced from ZellerFoam, a proprietary flexible material developed by Zellerfeld.
    Zellerfeld’s fused filament fabrication (FFF) process enables varied material densities throughout the shoe—resulting in a firm, supportive sole paired with a lightweight, breathable upper. The laceless, slip-on design prioritizes ease of wear while reinforcing a sleek, minimalist aesthetic.
    Nike’s Chief Innovation Officer, John Hoke, emphasized the broader impact of the design, noting that the Air Max 1000 “opens up new creative possibilities” and achieves levels of precision and contouring not possible with traditional footwear manufacturing. He also pointed to the sustainability benefits of AM, which produces minimal waste by fabricating only the necessary components.
    Expansion of 3D Printed Footwear Technology
    The Air Max 1000 joins a growing lineup of 3D printed footwear innovations from major brands. Gucci, the Italian luxury brand known for blending traditional craftsmanship with modern techniques, unveiled several Cub3d sneakers as part of its Spring Summer 2025 (SS25) collection. The brand developed Demetra, a material made from at least 70% plant-based ingredients, including viscose, wood pulp, and bio-based polyurethane. The bi-material sole combines an EVA-filled interior for cushioning and a TPU exterior, featuring an Interlocking G pattern that creates a 3D effect.
    Elsewhere, Syntilay, a footwear company combining artificial intelligence with 3D printing, launched a range of custom-fit slides. These slides are designed using AI-generated 3D models, starting with sketch-based concepts that are refined through AI platforms and then transformed into digital 3D designs. The company offers sizing adjustments based on smartphone foot scans, which are integrated into the manufacturing process.
    Join our Additive Manufacturing Advantage (AMAA) event on July 10th, where AM leaders from Aerospace, Space, and Defense come together to share mission-critical insights. Online and free to attend. Secure your spot now.
    Who won the 2024 3D Printing Industry Awards?
    Subscribe to the 3D Printing Industry newsletter to keep up with the latest 3D printing news.
    You can also follow us on LinkedIn, and subscribe to the 3D Printing Industry YouTube channel to access more exclusive content.
    Featured image shows Nike’s 3D printed Air Max 1000 Oatmeal. Photo via Janelle C. Shuttlesworth.

    Paloma Duran
    Paloma Duran holds a BA in International Relations and an MA in Journalism. Specializing in writing, podcasting, and content and event creation, she works across politics, energy, mining, and technology. With a passion for global trends, Paloma is particularly interested in the impact of technology like 3D printing on shaping our future.
  • What happens to DOGE without Elon Musk?

    Elon Musk may be gone from the Trump administration — and his friendship status with President Donald Trump may be at best uncertain — but his whirlwind stint in government certainly left its imprint. The Department of Government Efficiency (DOGE), his pet government-slashing project, remains entrenched in Washington. During his 130-day tenure, Musk led DOGE in eliminating about 260,000 federal employee jobs and gutting agencies supporting scientific research and humanitarian aid. But to date, DOGE claims to have saved the government $180 billion — well short of its ambitious (and frankly never realistic) target of cutting at least $2 trillion from the federal budget. And with Musk’s departure still fresh, there are reports that the federal government is trying to rehire federal workers who quit or were let go. For Elaine Kamarck, senior fellow at the Brookings Institution, DOGE’s tactics will likely end up being disastrous in the long run. “DOGE came in with these huge cuts, which were not attached to a plan,” she told Today, Explained co-host Sean Rameswaram. Kamarck knows all about making government more efficient. In the 1990s, she ran the Clinton administration’s Reinventing Government program. “I was Elon Musk,” she told Today, Explained. With the benefit of that experience, she assesses Musk’s record at DOGE, and what, if anything, the billionaire’s loud efforts at cutting government spending added up to. Below is an excerpt of the conversation, edited for length and clarity. There’s much more in the full podcast, so listen to Today, Explained wherever you get podcasts, including Apple Podcasts, Pandora, and Spotify.
    What do you think Elon Musk’s legacy is?
    Well, he will not have totally, radically reshaped the federal government. Absolutely not. In fact, there’s a high probability that on January 20, 2029, when the next president takes over, the federal government is about the same size as it is now, and is probably doing the same stuff that it’s doing now. What he did manage to do was insert chaos, fear, and loathing into the federal workforce. There was reporting in the Washington Post late last week that these cuts were so ineffective that the White House is actually reaching out to various federal employees who were laid off and asking them to come back, from the FDA to the IRS to even USAID.
    Which cuts are sticking at this point and which ones aren’t?
    First of all, in a lot of cases, people went to court and the courts have reversed those earlier decisions. So the first thing that happened is, courts said, “No, no, no, you can’t do it this way. You have to bring them back.” The second thing that happened is that Cabinet officers started to get confirmed by the Senate. And remember that a lot of the most spectacular DOGE stuff was happening in February. In February, these Cabinet secretaries were preparing for their Senate hearings. They weren’t on the job. Now that their Cabinet secretary’s home, what’s happening is they’re looking at these cuts and they’re saying, “No, no, no! We can’t live with these cuts because we have a mission to do.”
    As the government tries to hire back the people they fired, they’re going to have a tough time, and they’re going to have a tough time for two reasons. First of all, they treated them like dirt, and they’ve said a lot of insulting things. Second, most of the people who work for the federal government are highly skilled. They’re not paper pushers. We have computers to push our paper, right? They’re scientists. They’re engineers. They’re people with high skills, and guess what? They can get jobs outside the government. So there’s going to be real lasting damage to the government from the way they did this. And it’s analogous to the lasting damage that they’re causing at universities, where we now have top scientists who used to invent great cures for cancer and things like that, deciding to go find jobs in Europe because this culture has gotten so bad.
    What happens to this agency now? Who’s in charge of it?
    Well, what they’ve done is DOGE employees have been embedded in each of the organizations in the government, okay? And they basically — and the president himself has said this — they basically report to the Cabinet secretaries. So if you are in the Transportation Department, you have to make sure that Sean Duffy, who’s the secretary of transportation, agrees with you on what you want to do. And Sean Duffy has already had a fight during a Cabinet meeting with Elon Musk. You know that he has not been thrilled with the advice he’s gotten from DOGE. So from now on, DOGE is going to have to work hand in hand with Donald Trump’s appointed leaders.
    And just to bring this around to what we’re here talking about now, they’re in this huge fight over wasteful spending with the so-called big, beautiful bill. Does this just look like the government as usual, ultimately?
    It’s actually worse than normal. Because the deficit impacts are bigger than normal. It’s adding more to the deficit than previous bills have done. And the second reason it’s worse than normal is that everybody is still living in a fantasy world. And the fantasy world says that somehow we can deal with our deficits by cutting waste, fraud, and abuse. That is pure nonsense. Let me say it: pure nonsense. Where does most of the government money go? Does it go to some bureaucrats sitting on Pennsylvania Avenue? It goes to us. It goes to your grandmother and her Social Security and her Medicare. It goes to veterans in veterans benefits. It goes to Americans. That’s why it’s so hard to cut it. It’s so hard to cut it because it’s us. And people are living on it. Now, there’s a whole other topic that nobody talks about, and it’s called entitlement reform, right? Could we reform Social Security? Could we make the retirement age go from 67 to 68? That would save a lot of money. Could we change the cost of living? Nobody, nobody, nobody is talking about that. And that’s because we are in this crazy, polarized environment where we can no longer have serious conversations about serious issues.
    #what #happens #doge #without #elon
    What happens to DOGE without Elon Musk?
    Elon Musk may be gone from the Trump administration — and his friendship status with President Donald Trump may be at best uncertain — but his whirlwind stint in government certainly left its imprint. The Department of Government Efficiency, his pet government-slashing project, remains entrenched in Washington. During his 130-day tenure, Musk led DOGE in eliminating about 260,000 federal employee jobs and gutting agencies supporting scientific research and humanitarian aid. But to date, DOGE claims to have saved the government billion — well short of its ambitioustarget of cutting at least trillion from the federal budget. And with Musk’s departure still fresh, there are reports that the federal government is trying to rehire federal workers who quit or were let go. For Elaine Kamarck, senior fellow at the Brookings Institution, DOGE’s tactics will likely end up being disastrous in the long run. “DOGE came in with these huge cuts, which were not attached to a plan,” she told Today, Explained co-host Sean Rameswaram. Kamarck knows all about making government more efficient. In the 1990s, she ran the Clinton administration’s Reinventing Government program. “I was Elon Musk,” she told Today, Explained. With the benefit of that experience, she assesses Musk’s record at DOGE, and what, if anything, the billionaire’s loud efforts at cutting government spending added up to. Below is an excerpt of the conversation, edited for length and clarity. There’s much more in the full podcast, so listen to Today, Explained wherever you get podcasts, including Apple Podcasts, Pandora, and Spotify. What do you think Elon Musk’s legacy is? Well, he will not have totally, radically reshaped the federal government. Absolutely not. In fact, there’s a high probability that on January 20, 2029, when the next president takes over, the federal government is about the same size as it is now, and is probably doing the same stuff that it’s doing now. What he did manage to do was insert chaos, fear, and loathing into the federal workforce. There was reporting in the Washington Post late last week that these cuts were so ineffective that the White House is actually reaching out to various federal employees who were laid off and asking them to come back, from the FDA to the IRS to even USAID. Which cuts are sticking at this point and which ones aren’t?First of all, in a lot of cases, people went to court and the courts have reversed those earlier decisions. So the first thing that happened is, courts said, “No, no, no, you can’t do it this way. You have to bring them back.” The second thing that happened is that Cabinet officers started to get confirmed by the Senate. And remember that a lot of the most spectacular DOGE stuff was happening in February. In February, these Cabinet secretaries were preparing for their Senate hearings. They weren’t on the job. Now that their Cabinet secretary’s home, what’s happening is they’re looking at these cuts and they’re saying, “No, no, no! We can’t live with these cuts because we have a mission to do.”As the government tries to hire back the people they fired, they’re going to have a tough time, and they’re going to have a tough time for two reasons. First of all, they treated them like dirt, and they’ve said a lot of insulting things. Second, most of the people who work for the federal government are highly skilled. They’re not paper pushers. We have computers to push our paper, right? They’re scientists. They’re engineers. They’re people with high skills, and guess what? 
They can get jobs outside the government. So there’s going to be real lasting damage to the government from the way they did this. And it’s analogous to the lasting damage that they’re causing at universities, where we now have top scientists who used to invent great cures for cancer and things like that, deciding to go find jobs in Europe because this culture has gotten so bad.What happens to this agency now? Who’s in charge of it?Well, what they’ve done is DOGE employees have been embedded in each of the organizations in the government, okay? And they basically — and the president himself has said this — they basically report to the Cabinet secretaries. So if you are in the Transportation Department, you have to make sure that Sean Duffy, who’s the secretary of transportation, agrees with you on what you want to do. And Sean Duffy has already had a fight during a Cabinet meeting with Elon Musk. You know that he has not been thrilled with the advice he’s gotten from DOGE. So from now on, DOGE is going to have to work hand in hand with Donald Trump’s appointed leaders.And just to bring this around to what we’re here talking about now, they’re in this huge fight over wasteful spending with the so-called big, beautiful bill. Does this just look like the government as usual, ultimately?It’s actually worse than normal. Because the deficit impacts are bigger than normal. It’s adding more to the deficit than previous bills have done. And the second reason it’s worse than normal is that everybody is still living in a fantasy world. And the fantasy world says that somehow we can deal with our deficits by cutting waste, fraud, and abuse. That is pure nonsense. Let me say it: pure nonsense.Where does most of the government money go? Does it go to some bureaucrats sitting on Pennsylvania Avenue? It goes to us. It goes to your grandmother and her Social Security and her Medicare. It goes to veterans in veterans benefits. It goes to Americans. That’s why it’s so hard to cut it. It’s so hard to cut it because it’s us. And people are living on it. Now, there’s a whole other topic that nobody talks about, and it’s called entitlement reform, right? Could we reform Social Security? Could we make the retirement age go from 67 to 68? That would save a lot of money. Could we change the cost of living? Nobody, nobody, nobody is talking about that. And that’s because we are in this crazy, polarized environment where we can no longer have serious conversations about serious issues. See More: #what #happens #doge #without #elon
    WWW.VOX.COM
    What happens to DOGE without Elon Musk?
    Elon Musk may be gone from the Trump administration — and his friendship status with President Donald Trump may be at best uncertain — but his whirlwind stint in government certainly left its imprint. The Department of Government Efficiency (DOGE), his pet government-slashing project, remains entrenched in Washington. During his 130-day tenure, Musk led DOGE in eliminating about 260,000 federal employee jobs and gutting agencies supporting scientific research and humanitarian aid. Yet to date, DOGE claims to have saved the government just $180 billion — well short of its ambitious (and frankly never realistic) target of cutting at least $2 trillion from the federal budget. And with Musk’s departure still fresh, there are reports that the federal government is trying to rehire federal workers who quit or were let go.

    For Elaine Kamarck, senior fellow at the Brookings Institution, DOGE’s tactics will likely end up being disastrous in the long run. “DOGE came in with these huge cuts, which were not attached to a plan,” she told Today, Explained co-host Sean Rameswaram. Kamarck knows all about making government more efficient: in the 1990s, she ran the Clinton administration’s Reinventing Government program. “I was Elon Musk,” she told Today, Explained. With the benefit of that experience, she assesses Musk’s record at DOGE, and what, if anything, the billionaire’s loud efforts at cutting government spending added up to.

    Below is an excerpt of the conversation, edited for length and clarity. There’s much more in the full podcast, so listen to Today, Explained wherever you get podcasts, including Apple Podcasts, Pandora, and Spotify.

    What do you think Elon Musk’s legacy is?

    Well, he will not have totally, radically reshaped the federal government. Absolutely not. In fact, there’s a high probability that on January 20, 2029, when the next president takes over, the federal government will be about the same size as it is now, and will probably be doing the same stuff it’s doing now. What he did manage to do was insert chaos, fear, and loathing into the federal workforce. There was reporting in the Washington Post late last week that these cuts were so ineffective that the White House is actually reaching out to various federal employees who were laid off and asking them to come back, from the FDA to the IRS to even USAID.

    Which cuts are sticking at this point and which ones aren’t?

    First of all, in a lot of cases, people went to court and the courts have reversed those earlier decisions. So the first thing that happened is, courts said, “No, no, no, you can’t do it this way. You have to bring them back.” The second thing that happened is that Cabinet officers started to get confirmed by the Senate. And remember that a lot of the most spectacular DOGE stuff was happening in February. In February, these Cabinet secretaries were preparing for their Senate hearings. They weren’t on the job. Now that the Cabinet secretaries are home, what’s happening is they’re looking at these cuts and they’re saying, “No, no, no! We can’t live with these cuts because we have a mission to do.”

    As the government tries to hire back the people they fired, they’re going to have a tough time, and they’re going to have a tough time for two reasons. First of all, they treated them like dirt, and they’ve said a lot of insulting things. Second, most of the people who work for the federal government are highly skilled. They’re not paper pushers. We have computers to push our paper, right? They’re scientists. They’re engineers. They’re people with high skills, and guess what? They can get jobs outside the government. So there’s going to be real lasting damage to the government from the way they did this. And it’s analogous to the lasting damage that they’re causing at universities, where we now have top scientists who used to invent great cures for cancer and things like that, deciding to go find jobs in Europe because this culture has gotten so bad.

    What happens to this agency now? Who’s in charge of it?

    Well, what they’ve done is DOGE employees have been embedded in each of the organizations in the government, okay? And they basically — and the president himself has said this — they basically report to the Cabinet secretaries. So if you are in the Transportation Department, you have to make sure that Sean Duffy, who’s the secretary of transportation, agrees with you on what you want to do. And Sean Duffy has already had a fight during a Cabinet meeting with Elon Musk. You know that he has not been thrilled with the advice he’s gotten from DOGE. So from now on, DOGE is going to have to work hand in hand with Donald Trump’s appointed leaders.

    And just to bring this around to what we’re here talking about now, they’re in this huge fight over wasteful spending with the so-called big, beautiful bill. Does this just look like the government as usual, ultimately?

    It’s actually worse than normal, for two reasons. First, the deficit impacts are bigger than normal: it’s adding more to the deficit than previous bills have done. Second, everybody is still living in a fantasy world. And the fantasy world says that somehow we can deal with our deficits by cutting waste, fraud, and abuse. That is pure nonsense. Let me say it: pure nonsense.

    Where does most of the government money go? Does it go to some bureaucrats sitting on Pennsylvania Avenue? It goes to us. It goes to your grandmother and her Social Security and her Medicare. It goes to veterans in veterans benefits. It goes to Americans. That’s why it’s so hard to cut it. It’s so hard to cut it because it’s us. And people are living on it. Now, there’s a whole other topic that nobody talks about, and it’s called entitlement reform, right? Could we reform Social Security? Could we make the retirement age go from 67 to 68? That would save a lot of money. Could we change the cost-of-living adjustment? Nobody, nobody, nobody is talking about that. And that’s because we are in this crazy, polarized environment where we can no longer have serious conversations about serious issues.
  • Looking Back at Two Classics: ILM Deploys the Fleet in ‘Star Trek: First Contact’ and ‘Rogue One: A Star Wars Story’

    Guided by visual effects supervisor John Knoll, ILM embraced continually evolving methodologies to craft breathtaking visual effects for the iconic space battles in First Contact and Rogue One.
    By Jay Stobie
    Visual effects supervisor John Knoll (right) confers with modelmakers Kim Smith and John Goodson with the miniature of the U.S.S. Enterprise-E during production of Star Trek: First Contact (Credit: ILM).
    Bolstered by visual effects from Industrial Light & Magic, Star Trek: First Contact (1996) and Rogue One: A Star Wars Story (2016) propelled their respective franchises to new heights. While Star Trek Generations (1994) welcomed Captain Jean-Luc Picard’s (Patrick Stewart) crew to the big screen, First Contact stood as the first Star Trek feature that did not focus on its original captain, the legendary James T. Kirk (William Shatner). Similarly, though Rogue One immediately preceded the events of Star Wars: A New Hope (1977), it was set apart from the episodic Star Wars films and launched an era of storytelling outside of the main Skywalker saga that has gone on to include Solo: A Star Wars Story (2018), The Mandalorian (2019-23), Andor (2022-25), Ahsoka (2023), The Acolyte (2024), and more.
    The two films also shared a key ILM contributor, John Knoll, who served as visual effects supervisor on both projects, as well as an executive producer on Rogue One. Currently, ILM’s executive creative director and senior visual effects supervisor, Knoll – who also conceived the initial framework for Rogue One’s story – guided ILM as it brought its talents to bear on these sci-fi and fantasy epics. The work involved crafting two spectacular starship-packed space clashes – First Contact’s Battle of Sector 001 and Rogue One’s Battle of Scarif. Although these iconic installments were released roughly two decades apart, they represent a captivating case study of how ILM’s approach to visual effects has evolved over time. With this in mind, let’s examine the films’ unforgettable space battles through the lens of fascinating in-universe parallels and the ILM-produced fleets that face off near Earth and Scarif.
    A final frame from the Battle of Scarif in Rogue One: A Star Wars Story.
    A Context for Conflict
    In First Contact, the United Federation of Planets – a 200-year-old interstellar government consisting of more than 150 member worlds – braces itself for an invasion by the Borg – an overwhelmingly powerful collective composed of cybernetic beings who devastate entire planets by assimilating their biological populations and technological innovations. The Borg only send a single vessel, a massive cube containing thousands of hive-minded drones and their queen, pushing the Federation’s Starfleet defenders to Earth’s doorstep. Conversely, in Rogue One, the Rebel Alliance – a fledgling coalition of freedom fighters – seeks to undermine and overthrow the stalwart Galactic Empire – a totalitarian regime preparing to tighten its grip on the galaxy by revealing a horrifying superweapon. A rebel team infiltrates a top-secret vault on Scarif in a bid to steal plans to that battle station, the dreaded Death Star, with hopes of exploiting a vulnerability in its design.
    On the surface, the situations could not seem to be more disparate, particularly in terms of the Federation’s well-established prestige and the Rebel Alliance’s haphazardly organized factions. Yet, upon closer inspection, the spaceborne conflicts at Earth and Scarif are linked by a vital commonality. The threat posed by the Borg is well-known to the Federation, but the sudden intrusion upon their space takes its defenses by surprise. Starfleet assembles any vessel within range – including antiquated Oberth-class science ships – to intercept the Borg cube in the Typhon Sector, only to be forced back to Earth on the edge of defeat. The unsanctioned mission to Scarif with Jyn Erso (Felicity Jones) and Cassian Andor (Diego Luna) and the sudden need to take down the planet’s shield gate propel the Rebel Alliance fleet into rushing to their rescue with everything from their flagship Profundity to GR-75 medium transports. Whether Federation or Rebel Alliance, these fleets gather in last-ditch efforts to oppose enemies who would embrace their eradication – the Battles of Sector 001 and Scarif are fights for survival.
    From Physical to Digital
    By the time Jonathan Frakes was selected to direct First Contact, Star Trek’s reliance on constructing traditional physical models (many of which were built by ILM) for its features was gradually giving way to innovative computer graphics (CG) models, resulting in the film’s use of both techniques. “If one of the ships was to be seen full-screen and at length,” associate visual effects supervisor George Murphy told Cinefex’s Kevin H. Martin, “we knew it would be done as a stage model. Ships that would be doing a lot of elaborate maneuvers in space battle scenes would be created digitally.” In fact, physical and CG versions of the U.S.S. Enterprise-E appear in the film, with the latter being harnessed in shots involving the vessel’s entry into a temporal vortex at the conclusion of the Battle of Sector 001.
    Despite the technological leaps that ILM pioneered in the decades between First Contact and Rogue One, the studio still considered filming physical miniatures for certain ship-related shots in the latter film. The feature’s fleets were ultimately created digitally to allow for changes throughout post-production. “If it’s a photographed miniature element, it’s not possible to go back and make adjustments. So it’s the additional flexibility that comes with the computer graphics models that’s very attractive to many people,” John Knoll relayed to writer Jon Witmer at American Cinematographer’s TheASC.com.
    However, Knoll aimed to develop computer graphics that retained the same high-quality details as their physical counterparts, leading ILM to employ a modern approach to a time-honored modelmaking tactic. “I also wanted to emulate the kit-bashing aesthetic that had been part of Star Wars from the very beginning, where a lot of mechanical detail had been added onto the ships by using little pieces from plastic model kits,” explained Knoll in his chat with TheASC.com. For Rogue One, ILM replicated the process by obtaining such kits, scanning their parts, building a computer graphics library, and applying the CG parts to digitally modeled ships. “I’m very happy to say it was super-successful,” concluded Knoll. “I think a lot of our digital models look like they are motion-control models.”
    John Knoll (second from left) confers with Kim Smith and John Goodson with the miniature of the U.S.S. Enterprise-E during production of Star Trek: First Contact (Credit: ILM).
    Legendary Lineages
    In First Contact, Captain Picard commanded a brand-new vessel, the Sovereign-class U.S.S. Enterprise-E, continuing the celebrated starship’s legacy in terms of its famous name and design aesthetic. Designed by John Eaves and developed into blueprints by Rick Sternbach, the Enterprise-E was built into a 10-foot physical model by ILM model project supervisor John Goodson and his shop’s talented team. ILM infused the ship with extraordinary detail, including viewports equipped with backlit set images from the craft’s predecessor, the U.S.S. Enterprise-D. For the vessel’s larger windows, namely those associated with the observation lounge and arboretum, ILM took a painstakingly practical approach to match the interiors shown with the real-world set pieces. “We filled that area of the model with tiny, micro-scale furniture,” Goodson informed Cinefex, “including tables and chairs.”
    Rogue One’s rebel team initially traversed the galaxy in a U-wing transport/gunship, which, much like the Enterprise-E, was a unique vessel that nonetheless channeled a certain degree of inspiration from a classic design. Lucasfilm’s Doug Chiang, a co-production designer for Rogue One, referred to the U-wing as the film’s “Huey helicopter version of an X-wing” in the Designing Rogue One bonus featurette on Disney+ before revealing that, “Towards the end of the design cycle, we actually decided that maybe we should put in more X-wing features. And so we took the X-wing engines and literally mounted them onto the configuration that we had going.” Modeled by ILM digital artist Colie Wertz, the U-wing’s final computer graphics design subtly incorporated these X-wing influences to give the transport a distinctive feel without making the craft seem out of place within the rebel fleet.
    While ILM’s work on the Enterprise-E’s viewports offered a compelling view toward the ship’s interior, a breakthrough LED setup for Rogue One permitted ILM to obtain realistic lighting on actors as they looked out from their ships and into the space around them. “All of our major spaceship cockpit scenes were done that way, with the gimbal in this giant horseshoe of LED panels we got from [equipment vendor] VER, and we prepared graphics that went on the screens,” John Knoll shared with American Cinematographer’s Benjamin B and Jon D. Witmer. Furthermore, in Disney+’s Rogue One: Digital Storytelling bonus featurette, visual effects producer Janet Lewin noted, “For the actors, I think, in the space battle cockpits, for them to be able to see what was happening in the battle brought a higher level of accuracy to their performance.”
    The U.S.S. Enterprise-E in Star Trek: First Contact.
    Familiar Foes
    To transport First Contact’s Borg invaders, John Goodson’s team at ILM resurrected the Borg cube design previously seen in Star Trek: The Next Generation (1987) and Star Trek: Deep Space Nine (1993), creating a nearly three-foot physical model to replace the one from the series. Art consultant and ILM veteran Bill George proposed that the cube’s seemingly straightforward layout be augmented with a complex network of photo-etched brass, a suggestion which produced a jagged surface and offered a visual that was both intricate and menacing. ILM also developed a two-foot motion-control model for a Borg sphere, a brand-new auxiliary vessel that emerged from the cube. “We vacuformed about 15 different patterns that conformed to this spherical curve and covered those with a lot of molded and cast pieces. Then we added tons of acid-etched brass over it, just like we had on the cube,” Goodson outlined to Cinefex’s Kevin H. Martin.
    As for Rogue One’s villainous fleet, reproducing the original trilogy’s Death Star and Imperial Star Destroyers centered upon translating physical models into digital assets. Although ILM no longer possessed A New Hope’s three-foot Death Star shooting model, John Knoll recreated the station’s surface paneling by gathering archival images, and as he spelled out to writer Joe Fordham in Cinefex, “I pieced all the images together. I unwrapped them into texture space and projected them onto a sphere with a trench. By doing that with enough pictures, I got pretty complete coverage of the original model, and that became a template upon which to redraw very high-resolution texture maps. Every panel, every vertical striped line, I matched from a photograph. It was as accurate as it was possible to be as a reproduction of the original model.”
    Knoll’s investigative eye continued to pay dividends when analyzing the three-foot and eight-foot Star Destroyer motion-control models, which had been built for A New Hope and Star Wars: The Empire Strikes Back, respectively. “Our general mantra was, ‘Match your memory of it more than the reality,’ because sometimes you go look at the actual prop in the archive building or you look back at the actual shot from the movie, and you go, ‘Oh, I remember it being a little better than that,’” Knoll conveyed to TheASC.com. This philosophy motivated ILM to combine elements from those two physical models into a single digital design. “Generally, we copied the three-footer for details like the superstructure on the top of the bridge, but then we copied the internal lighting plan from the eight-footer,” Knoll explained. “And then the upper surface of the three-footer was relatively undetailed because there were no shots that saw it closely, so we took a lot of the high-detail upper surface from the eight-footer. So it’s this amalgam of the two models, but the goal was to try to make it look like you remember it from A New Hope.”
    A final frame from Rogue One: A Star Wars Story.
    Forming Up the Fleets
    In addition to the U.S.S. Enterprise-E, the Battle of Sector 001 debuted numerous vessels representing four new Starfleet ship classes – the Akira, Steamrunner, Saber, and Norway – all designed by ILM visual effects art director Alex Jaeger. “Since we figured a lot of the background action in the space battle would be done with computer graphics ships that needed to be built from scratch anyway, I realized that there was no reason not to do some new designs,” John Knoll told American Cinematographer writer Ron Magid. Used in previous Star Trek projects, older physical models for the Oberth and Nebula classes were mixed into the fleet for good measure, though the vast majority of the armada originated as computer graphics.
    Over at Scarif, ILM portrayed the Rebel Alliance forces with computer graphics models of fresh designs (the MC75 cruiser Profundity and U-wings), live-action versions of Star Wars Rebels’ VCX-100 light freighter Ghost and Hammerhead corvettes, and Star Wars staples (Nebulon-B frigates, X-wings, Y-wings, and more). These ships face off against two Imperial Star Destroyers and squadrons of TIE fighters, and – upon their late arrival to the battle – Darth Vader’s Star Destroyer and the Death Star. The Tantive IV, a CR90 corvette more popularly referred to as a blockade runner, made its own special cameo at the tail end of the fight. As Princess Leia Organa’s (Carrie Fisher and Ingvild Deila) personal ship, the Tantive IV received the Death Star plans and fled the scene, destined to be captured by Vader’s Star Destroyer at the beginning of A New Hope. And, while we’re on the subject of intricate starship maneuvers and space-based choreography…
    Although the First Contact team could plan visual effects shots with animated storyboards, ILM supplied Gareth Edwards with a next-level virtual viewfinder that allowed the director to select his shots by immersing himself among Rogue One’s ships in real time. “What we wanted to do is give Gareth the opportunity to shoot his space battles and other all-digital scenes the same way he shoots his live-action. Then he could go in with this sort of virtual viewfinder and view the space battle going on, and figure out what the best angle was to shoot those ships from,” senior animation supervisor Hal Hickel described in the Rogue One: Digital Storytelling featurette. Hickel divulged that the sequence involving the dish array docking with the Death Star was an example of the “spontaneous discovery of great angles,” as the scene was never storyboarded or previsualized.
    Visual effects supervisor John Knoll with director Gareth Edwards during production of Rogue One: A Star Wars Story.
    Tough Little Ships
    The Federation and Rebel Alliance each deployed “tough little ships” (an endearing description Commander William T. Riker [Jonathan Frakes] bestowed upon the U.S.S. Defiant in First Contact) in their respective conflicts, namely the U.S.S. Defiant from Deep Space Nine and the Tantive IV from A New Hope. VisionArt had already built a CG Defiant for the Deep Space Nine series, but ILM upgraded the model with images gathered from the ship’s three-foot physical model. A similar tactic was taken to bring the Tantive IV into the digital realm for Rogue One. “This was the Blockade Runner. This was the most accurate 1:1 reproduction we could possibly have made,” model supervisor Russell Paul declared to Cinefex’s Joe Fordham. “We did an extensive photo reference shoot and photogrammetry re-creation of the miniature. From there, we built it out as accurately as possible.” Speaking of sturdy ships, if you look very closely, you can spot a model of the Millennium Falcon flashing across the background as the U.S.S. Defiant makes an attack run on the Borg cube at the Battle of Sector 001!
    Exploration and Hope
    The in-universe ramifications that materialize from the Battles of Sector 001 and Scarif are monumental. The destruction of the Borg cube compels the Borg Queen to travel back in time in an attempt to vanquish Earth before the Federation can even be formed, but Captain Picard and the Enterprise-E foil the plot and end up helping their 21st century ancestors make “first contact” with another species, the logic-revering Vulcans. The post-Scarif benefits take longer to play out for the Rebel Alliance, but the theft of the Death Star plans eventually leads to the superweapon’s destruction. The Galactic Civil War is far from over, but Scarif is a significant step in the Alliance’s effort to overthrow the Empire.
    The visual effects ILM provided for First Contact and Rogue One contributed significantly to the critical and commercial acclaim both pictures enjoyed, a victory reflecting the relentless dedication, tireless work ethic, and innovative spirit embodied by visual effects supervisor John Knoll and ILM’s entire staff. While being interviewed for The Making of Star Trek: First Contact, actor Patrick Stewart praised ILM’s invaluable influence, emphasizing, “ILM was with us, on this movie, almost every day on set. There is so much that they are involved in.” And, regardless of your personal preferences – phasers or lasers, photon torpedoes or proton torpedoes, warp speed or hyperspace – perhaps Industrial Light & Magic’s ability to infuse excitement into both franchises demonstrates that Star Trek and Star Wars encompass themes that are not competitive, but compatible. After all, what goes together better than exploration and hope?

    Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.
    #looking #back #two #classics #ilm
    Looking Back at Two Classics: ILM Deploys the Fleet in ‘Star Trek: First Contact’ and ‘Rogue One: A Star Wars Story’
    Guided by visual effects supervisor John Knoll, ILM embraced continually evolving methodologies to craft breathtaking visual effects for the iconic space battles in First Contact and Rogue One. By Jay Stobie Visual effects supervisor John Knollconfers with modelmakers Kim Smith and John Goodson with the miniature of the U.S.S. Enterprise-E during production of Star Trek: First Contact. Bolstered by visual effects from Industrial Light & Magic, Star Trek: First Contactand Rogue One: A Star Wars Storypropelled their respective franchises to new heights. While Star Trek Generationswelcomed Captain Jean-Luc Picard’screw to the big screen, First Contact stood as the first Star Trek feature that did not focus on its original captain, the legendary James T. Kirk. Similarly, though Rogue One immediately preceded the events of Star Wars: A New Hope, it was set apart from the episodic Star Wars films and launched an era of storytelling outside of the main Skywalker saga that has gone on to include Solo: A Star Wars Story, The Mandalorian, Andor, Ahsoka, The Acolyte, and more. The two films also shared a key ILM contributor, John Knoll, who served as visual effects supervisor on both projects, as well as an executive producer on Rogue One. Currently, ILM’s executive creative director and senior visual effects supervisor, Knoll – who also conceived the initial framework for Rogue One’s story – guided ILM as it brought its talents to bear on these sci-fi and fantasy epics. The work involved crafting two spectacular starship-packed space clashes – First Contact’s Battle of Sector 001 and Rogue One’s Battle of Scarif. Although these iconic installments were released roughly two decades apart, they represent a captivating case study of how ILM’s approach to visual effects has evolved over time. With this in mind, let’s examine the films’ unforgettable space battles through the lens of fascinating in-universe parallels and the ILM-produced fleets that face off near Earth and Scarif. A final frame from the Battle of Scarif in Rogue One: A Star Wars Story. A Context for Conflict In First Contact, the United Federation of Planets – a 200-year-old interstellar government consisting of more than 150 member worlds – braces itself for an invasion by the Borg – an overwhelmingly powerful collective composed of cybernetic beings who devastate entire planets by assimilating their biological populations and technological innovations. The Borg only send a single vessel, a massive cube containing thousands of hive-minded drones and their queen, pushing the Federation’s Starfleet defenders to Earth’s doorstep. Conversely, in Rogue One, the Rebel Alliance – a fledgling coalition of freedom fighters – seeks to undermine and overthrow the stalwart Galactic Empire – a totalitarian regime preparing to tighten its grip on the galaxy by revealing a horrifying superweapon. A rebel team infiltrates a top-secret vault on Scarif in a bid to steal plans to that battle station, the dreaded Death Star, with hopes of exploiting a vulnerability in its design. On the surface, the situations could not seem to be more disparate, particularly in terms of the Federation’s well-established prestige and the Rebel Alliance’s haphazardly organized factions. Yet, upon closer inspection, the spaceborne conflicts at Earth and Scarif are linked by a vital commonality. The threat posed by the Borg is well-known to the Federation, but the sudden intrusion upon their space takes its defenses by surprise. 
Starfleet assembles any vessel within range – including antiquated Oberth-class science ships – to intercept the Borg cube in the Typhon Sector, only to be forced back to Earth on the edge of defeat. The unsanctioned mission to Scarif with Jyn Ersoand Cassian Andorand the sudden need to take down the planet’s shield gate propels the Rebel Alliance fleet into rushing to their rescue with everything from their flagship Profundity to GR-75 medium transports. Whether Federation or Rebel Alliance, these fleets gather in last-ditch efforts to oppose enemies who would embrace their eradication – the Battles of Sector 001 and Scarif are fights for survival. From Physical to Digital By the time Jonathan Frakes was selected to direct First Contact, Star Trek’s reliance on constructing traditional physical modelsfor its features was gradually giving way to innovative computer graphicsmodels, resulting in the film’s use of both techniques. “If one of the ships was to be seen full-screen and at length,” associate visual effects supervisor George Murphy told Cinefex’s Kevin H. Martin, “we knew it would be done as a stage model. Ships that would be doing a lot of elaborate maneuvers in space battle scenes would be created digitally.” In fact, physical and CG versions of the U.S.S. Enterprise-E appear in the film, with the latter being harnessed in shots involving the vessel’s entry into a temporal vortex at the conclusion of the Battle of Sector 001. Despite the technological leaps that ILM pioneered in the decades between First Contact and Rogue One, they considered filming physical miniatures for certain ship-related shots in the latter film. ILM considered filming physical miniatures for certain ship-related shots in Rogue One. The feature’s fleets were ultimately created digitally to allow for changes throughout post-production. “If it’s a photographed miniature element, it’s not possible to go back and make adjustments. So it’s the additional flexibility that comes with the computer graphics models that’s very attractive to many people,” John Knoll relayed to writer Jon Witmer at American Cinematographer’s TheASC.com. However, Knoll aimed to develop computer graphics that retained the same high-quality details as their physical counterparts, leading ILM to employ a modern approach to a time-honored modelmaking tactic. “I also wanted to emulate the kit-bashing aesthetic that had been part of Star Wars from the very beginning, where a lot of mechanical detail had been added onto the ships by using little pieces from plastic model kits,” explained Knoll in his chat with TheASC.com. For Rogue One, ILM replicated the process by obtaining such kits, scanning their parts, building a computer graphics library, and applying the CG parts to digitally modeled ships. “I’m very happy to say it was super-successful,” concluded Knoll. “I think a lot of our digital models look like they are motion-control models.” John Knollconfers with Kim Smith and John Goodson with the miniature of the U.S.S. Enterprise-E during production of Star Trek: First Contact. Legendary Lineages In First Contact, Captain Picard commanded a brand-new vessel, the Sovereign-class U.S.S. Enterprise-E, continuing the celebrated starship’s legacy in terms of its famous name and design aesthetic. Designed by John Eaves and developed into blueprints by Rick Sternbach, the Enterprise-E was built into a 10-foot physical model by ILM model project supervisor John Goodson and his shop’s talented team. 
ILM infused the ship with extraordinary detail, including viewports equipped with backlit set images from the craft’s predecessor, the U.S.S. Enterprise-D. For the vessel’s larger windows, namely those associated with the observation lounge and arboretum, ILM took a painstakingly practical approach to match the interiors shown with the real-world set pieces. “We filled that area of the model with tiny, micro-scale furniture,” Goodson informed Cinefex, “including tables and chairs.” Rogue One’s rebel team initially traversed the galaxy in a U-wing transport/gunship, which, much like the Enterprise-E, was a unique vessel that nonetheless channeled a certain degree of inspiration from a classic design. Lucasfilm’s Doug Chiang, a co-production designer for Rogue One, referred to the U-wing as the film’s “Huey helicopter version of an X-wing” in the Designing Rogue One bonus featurette on Disney+ before revealing that, “Towards the end of the design cycle, we actually decided that maybe we should put in more X-wing features. And so we took the X-wing engines and literally mounted them onto the configuration that we had going.” Modeled by ILM digital artist Colie Wertz, the U-wing’s final computer graphics design subtly incorporated these X-wing influences to give the transport a distinctive feel without making the craft seem out of place within the rebel fleet. While ILM’s work on the Enterprise-E’s viewports offered a compelling view toward the ship’s interior, a breakthrough LED setup for Rogue One permitted ILM to obtain realistic lighting on actors as they looked out from their ships and into the space around them. “All of our major spaceship cockpit scenes were done that way, with the gimbal in this giant horseshoe of LED panels we got fromVER, and we prepared graphics that went on the screens,” John Knoll shared with American Cinematographer’s Benjamin B and Jon D. Witmer. Furthermore, in Disney+’s Rogue One: Digital Storytelling bonus featurette, visual effects producer Janet Lewin noted, “For the actors, I think, in the space battle cockpits, for them to be able to see what was happening in the battle brought a higher level of accuracy to their performance.” The U.S.S. Enterprise-E in Star Trek: First Contact. Familiar Foes To transport First Contact’s Borg invaders, John Goodson’s team at ILM resurrected the Borg cube design previously seen in Star Trek: The Next Generationand Star Trek: Deep Space Nine, creating a nearly three-foot physical model to replace the one from the series. Art consultant and ILM veteran Bill George proposed that the cube’s seemingly straightforward layout be augmented with a complex network of photo-etched brass, a suggestion which produced a jagged surface and offered a visual that was both intricate and menacing. ILM also developed a two-foot motion-control model for a Borg sphere, a brand-new auxiliary vessel that emerged from the cube. “We vacuformed about 15 different patterns that conformed to this spherical curve and covered those with a lot of molded and cast pieces. Then we added tons of acid-etched brass over it, just like we had on the cube,” Goodson outlined to Cinefex’s Kevin H. Martin. As for Rogue One’s villainous fleet, reproducing the original trilogy’s Death Star and Imperial Star Destroyers centered upon translating physical models into digital assets. 
Although ILM no longer possessed A New Hope’s three-foot Death Star shooting model, John Knoll recreated the station’s surface paneling by gathering archival images, and as he spelled out to writer Joe Fordham in Cinefex, “I pieced all the images together. I unwrapped them into texture space and projected them onto a sphere with a trench. By doing that with enough pictures, I got pretty complete coverage of the original model, and that became a template upon which to redraw very high-resolution texture maps. Every panel, every vertical striped line, I matched from a photograph. It was as accurate as it was possible to be as a reproduction of the original model.” Knoll’s investigative eye continued to pay dividends when analyzing the three-foot and eight-foot Star Destroyer motion-control models, which had been built for A New Hope and Star Wars: The Empire Strikes Back, respectively. “Our general mantra was, ‘Match your memory of it more than the reality,’ because sometimes you go look at the actual prop in the archive building or you look back at the actual shot from the movie, and you go, ‘Oh, I remember it being a little better than that,’” Knoll conveyed to TheASC.com. This philosophy motivated ILM to combine elements from those two physical models into a single digital design. “Generally, we copied the three-footer for details like the superstructure on the top of the bridge, but then we copied the internal lighting plan from the eight-footer,” Knoll explained. “And then the upper surface of the three-footer was relatively undetailed because there were no shots that saw it closely, so we took a lot of the high-detail upper surface from the eight-footer. So it’s this amalgam of the two models, but the goal was to try to make it look like you remember it from A New Hope.” A final frame from Rogue One: A Star Wars Story. Forming Up the Fleets In addition to the U.S.S. Enterprise-E, the Battle of Sector 001 debuted numerous vessels representing four new Starfleet ship classes – the Akira, Steamrunner, Saber, and Norway – all designed by ILM visual effects art director Alex Jaeger. “Since we figured a lot of the background action in the space battle would be done with computer graphics ships that needed to be built from scratch anyway, I realized that there was no reason not to do some new designs,” John Knoll told American Cinematographer writer Ron Magid. Used in previous Star Trek projects, older physical models for the Oberth and Nebula classes were mixed into the fleet for good measure, though the vast majority of the armada originated as computer graphics. Over at Scarif, ILM portrayed the Rebel Alliance forces with computer graphics models of fresh designs, live-action versions of Star Wars Rebels’ VCX-100 light freighter Ghost and Hammerhead corvettes, and Star Wars staples. These ships face off against two Imperial Star Destroyers and squadrons of TIE fighters, and – upon their late arrival to the battle – Darth Vader’s Star Destroyer and the Death Star. The Tantive IV, a CR90 corvette more popularly referred to as a blockade runner, made its own special cameo at the tail end of the fight. As Princess Leia Organa’spersonal ship, the Tantive IV received the Death Star plans and fled the scene, destined to be captured by Vader’s Star Destroyer at the beginning of A New Hope. 
And, while we’re on the subject of intricate starship maneuvers and space-based choreography… Although the First Contact team could plan visual effects shots with animated storyboards, ILM supplied Gareth Edwards with a next-level virtual viewfinder that allowed the director to select his shots by immersing himself among Rogue One’s ships in real time. “What we wanted to do is give Gareth the opportunity to shoot his space battles and other all-digital scenes the same way he shoots his live-action. Then he could go in with this sort of virtual viewfinder and view the space battle going on, and figure out what the best angle was to shoot those ships from,” senior animation supervisor Hal Hickel described in the Rogue One: Digital Storytelling featurette. Hickel divulged that the sequence involving the dish array docking with the Death Star was an example of the “spontaneous discovery of great angles,” as the scene was never storyboarded or previsualized. Visual effects supervisor John Knoll with director Gareth Edwards during production of Rogue One: A Star Wars Story. Tough Little Ships The Federation and Rebel Alliance each deployed “tough little ships”in their respective conflicts, namely the U.S.S. Defiant from Deep Space Nine and the Tantive IV from A New Hope. VisionArt had already built a CG Defiant for the Deep Space Nine series, but ILM upgraded the model with images gathered from the ship’s three-foot physical model. A similar tactic was taken to bring the Tantive IV into the digital realm for Rogue One. “This was the Blockade Runner. This was the most accurate 1:1 reproduction we could possibly have made,” model supervisor Russell Paul declared to Cinefex’s Joe Fordham. “We did an extensive photo reference shoot and photogrammetry re-creation of the miniature. From there, we built it out as accurately as possible.” Speaking of sturdy ships, if you look very closely, you can spot a model of the Millennium Falcon flashing across the background as the U.S.S. Defiant makes an attack run on the Borg cube at the Battle of Sector 001! Exploration and Hope The in-universe ramifications that materialize from the Battles of Sector 001 and Scarif are monumental. The destruction of the Borg cube compels the Borg Queen to travel back in time in an attempt to vanquish Earth before the Federation can even be formed, but Captain Picard and the Enterprise-E foil the plot and end up helping their 21st century ancestors make “first contact” with another species, the logic-revering Vulcans. The post-Scarif benefits take longer to play out for the Rebel Alliance, but the theft of the Death Star plans eventually leads to the superweapon’s destruction. The Galactic Civil War is far from over, but Scarif is a significant step in the Alliance’s effort to overthrow the Empire. The visual effects ILM provided for First Contact and Rogue One contributed significantly to the critical and commercial acclaim both pictures enjoyed, a victory reflecting the relentless dedication, tireless work ethic, and innovative spirit embodied by visual effects supervisor John Knoll and ILM’s entire staff. While being interviewed for The Making of Star Trek: First Contact, actor Patrick Stewart praised ILM’s invaluable influence, emphasizing, “ILM was with us, on this movie, almost every day on set. 
There is so much that they are involved in.” And, regardless of your personal preferences – phasers or lasers, photon torpedoes or proton torpedoes, warp speed or hyperspace – perhaps Industrial Light & Magic’s ability to infuse excitement into both franchises demonstrates that Star Trek and Star Wars encompass themes that are not competitive, but compatible. After all, what goes together better than exploration and hope? – Jay Stobieis a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy. #looking #back #two #classics #ilm
    WWW.ILM.COM
    Looking Back at Two Classics: ILM Deploys the Fleet in ‘Star Trek: First Contact’ and ‘Rogue One: A Star Wars Story’
    Guided by visual effects supervisor John Knoll, ILM embraced continually evolving methodologies to craft breathtaking visual effects for the iconic space battles in First Contact and Rogue One. By Jay Stobie Visual effects supervisor John Knoll (right) confers with modelmakers Kim Smith and John Goodson with the miniature of the U.S.S. Enterprise-E during production of Star Trek: First Contact (Credit: ILM). Bolstered by visual effects from Industrial Light & Magic, Star Trek: First Contact (1996) and Rogue One: A Star Wars Story (2016) propelled their respective franchises to new heights. While Star Trek Generations (1994) welcomed Captain Jean-Luc Picard’s (Patrick Stewart) crew to the big screen, First Contact stood as the first Star Trek feature that did not focus on its original captain, the legendary James T. Kirk (William Shatner). Similarly, though Rogue One immediately preceded the events of Star Wars: A New Hope (1977), it was set apart from the episodic Star Wars films and launched an era of storytelling outside of the main Skywalker saga that has gone on to include Solo: A Star Wars Story (2018), The Mandalorian (2019-23), Andor (2022-25), Ahsoka (2023), The Acolyte (2024), and more. The two films also shared a key ILM contributor, John Knoll, who served as visual effects supervisor on both projects, as well as an executive producer on Rogue One. Currently, ILM’s executive creative director and senior visual effects supervisor, Knoll – who also conceived the initial framework for Rogue One’s story – guided ILM as it brought its talents to bear on these sci-fi and fantasy epics. The work involved crafting two spectacular starship-packed space clashes – First Contact’s Battle of Sector 001 and Rogue One’s Battle of Scarif. Although these iconic installments were released roughly two decades apart, they represent a captivating case study of how ILM’s approach to visual effects has evolved over time. With this in mind, let’s examine the films’ unforgettable space battles through the lens of fascinating in-universe parallels and the ILM-produced fleets that face off near Earth and Scarif. A final frame from the Battle of Scarif in Rogue One: A Star Wars Story (Credit: ILM & Lucasfilm). A Context for Conflict In First Contact, the United Federation of Planets – a 200-year-old interstellar government consisting of more than 150 member worlds – braces itself for an invasion by the Borg – an overwhelmingly powerful collective composed of cybernetic beings who devastate entire planets by assimilating their biological populations and technological innovations. The Borg only send a single vessel, a massive cube containing thousands of hive-minded drones and their queen, pushing the Federation’s Starfleet defenders to Earth’s doorstep. Conversely, in Rogue One, the Rebel Alliance – a fledgling coalition of freedom fighters – seeks to undermine and overthrow the stalwart Galactic Empire – a totalitarian regime preparing to tighten its grip on the galaxy by revealing a horrifying superweapon. A rebel team infiltrates a top-secret vault on Scarif in a bid to steal plans to that battle station, the dreaded Death Star, with hopes of exploiting a vulnerability in its design. On the surface, the situations could not seem to be more disparate, particularly in terms of the Federation’s well-established prestige and the Rebel Alliance’s haphazardly organized factions. Yet, upon closer inspection, the spaceborne conflicts at Earth and Scarif are linked by a vital commonality. 
The threat posed by the Borg is well-known to the Federation, but the sudden intrusion upon their space takes its defenses by surprise. Starfleet assembles any vessel within range – including antiquated Oberth-class science ships – to intercept the Borg cube in the Typhon Sector, only to be forced back to Earth on the edge of defeat. The unsanctioned mission to Scarif with Jyn Erso (Felicity Jones) and Cassian Andor (Diego Luna) and the sudden need to take down the planet’s shield gate propels the Rebel Alliance fleet into rushing to their rescue with everything from their flagship Profundity to GR-75 medium transports. Whether Federation or Rebel Alliance, these fleets gather in last-ditch efforts to oppose enemies who would embrace their eradication – the Battles of Sector 001 and Scarif are fights for survival. From Physical to Digital By the time Jonathan Frakes was selected to direct First Contact, Star Trek’s reliance on constructing traditional physical models (many of which were built by ILM) for its features was gradually giving way to innovative computer graphics (CG) models, resulting in the film’s use of both techniques. “If one of the ships was to be seen full-screen and at length,” associate visual effects supervisor George Murphy told Cinefex’s Kevin H. Martin, “we knew it would be done as a stage model. Ships that would be doing a lot of elaborate maneuvers in space battle scenes would be created digitally.” In fact, physical and CG versions of the U.S.S. Enterprise-E appear in the film, with the latter being harnessed in shots involving the vessel’s entry into a temporal vortex at the conclusion of the Battle of Sector 001. Despite the technological leaps that ILM pioneered in the decades between First Contact and Rogue One, they considered filming physical miniatures for certain ship-related shots in the latter film. ILM considered filming physical miniatures for certain ship-related shots in Rogue One. The feature’s fleets were ultimately created digitally to allow for changes throughout post-production. “If it’s a photographed miniature element, it’s not possible to go back and make adjustments. So it’s the additional flexibility that comes with the computer graphics models that’s very attractive to many people,” John Knoll relayed to writer Jon Witmer at American Cinematographer’s TheASC.com. However, Knoll aimed to develop computer graphics that retained the same high-quality details as their physical counterparts, leading ILM to employ a modern approach to a time-honored modelmaking tactic. “I also wanted to emulate the kit-bashing aesthetic that had been part of Star Wars from the very beginning, where a lot of mechanical detail had been added onto the ships by using little pieces from plastic model kits,” explained Knoll in his chat with TheASC.com. For Rogue One, ILM replicated the process by obtaining such kits, scanning their parts, building a computer graphics library, and applying the CG parts to digitally modeled ships. “I’m very happy to say it was super-successful,” concluded Knoll. “I think a lot of our digital models look like they are motion-control models.” John Knoll (second from left) confers with Kim Smith and John Goodson with the miniature of the U.S.S. Enterprise-E during production of Star Trek: First Contact (Credit: ILM). Legendary Lineages In First Contact, Captain Picard commanded a brand-new vessel, the Sovereign-class U.S.S. Enterprise-E, continuing the celebrated starship’s legacy in terms of its famous name and design aesthetic. 
Designed by John Eaves and developed into blueprints by Rick Sternbach, the Enterprise-E was built into a 10-foot physical model by ILM model project supervisor John Goodson and his shop’s talented team. ILM infused the ship with extraordinary detail, including viewports equipped with backlit set images from the craft’s predecessor, the U.S.S. Enterprise-D. For the vessel’s larger windows, namely those associated with the observation lounge and arboretum, ILM took a painstakingly practical approach to match the interiors shown with the real-world set pieces. “We filled that area of the model with tiny, micro-scale furniture,” Goodson informed Cinefex, “including tables and chairs.” Rogue One’s rebel team initially traversed the galaxy in a U-wing transport/gunship, which, much like the Enterprise-E, was a unique vessel that nonetheless channeled a certain degree of inspiration from a classic design. Lucasfilm’s Doug Chiang, a co-production designer for Rogue One, referred to the U-wing as the film’s “Huey helicopter version of an X-wing” in the Designing Rogue One bonus featurette on Disney+ before revealing that, “Towards the end of the design cycle, we actually decided that maybe we should put in more X-wing features. And so we took the X-wing engines and literally mounted them onto the configuration that we had going.” Modeled by ILM digital artist Colie Wertz, the U-wing’s final computer graphics design subtly incorporated these X-wing influences to give the transport a distinctive feel without making the craft seem out of place within the rebel fleet. While ILM’s work on the Enterprise-E’s viewports offered a compelling view toward the ship’s interior, a breakthrough LED setup for Rogue One permitted ILM to obtain realistic lighting on actors as they looked out from their ships and into the space around them. “All of our major spaceship cockpit scenes were done that way, with the gimbal in this giant horseshoe of LED panels we got from [equipment vendor] VER, and we prepared graphics that went on the screens,” John Knoll shared with American Cinematographer’s Benjamin B and Jon D. Witmer. Furthermore, in Disney+’s Rogue One: Digital Storytelling bonus featurette, visual effects producer Janet Lewin noted, “For the actors, I think, in the space battle cockpits, for them to be able to see what was happening in the battle brought a higher level of accuracy to their performance.” The U.S.S. Enterprise-E in Star Trek: First Contact (Credit: Paramount). Familiar Foes To transport First Contact’s Borg invaders, John Goodson’s team at ILM resurrected the Borg cube design previously seen in Star Trek: The Next Generation (1987) and Star Trek: Deep Space Nine (1993), creating a nearly three-foot physical model to replace the one from the series. Art consultant and ILM veteran Bill George proposed that the cube’s seemingly straightforward layout be augmented with a complex network of photo-etched brass, a suggestion which produced a jagged surface and offered a visual that was both intricate and menacing. ILM also developed a two-foot motion-control model for a Borg sphere, a brand-new auxiliary vessel that emerged from the cube. “We vacuformed about 15 different patterns that conformed to this spherical curve and covered those with a lot of molded and cast pieces. Then we added tons of acid-etched brass over it, just like we had on the cube,” Goodson outlined to Cinefex’s Kevin H. Martin. 
As for Rogue One’s villainous fleet, reproducing the original trilogy’s Death Star and Imperial Star Destroyers centered upon translating physical models into digital assets. Although ILM no longer possessed A New Hope’s three-foot Death Star shooting model, John Knoll recreated the station’s surface paneling by gathering archival images, and as he spelled out to writer Joe Fordham in Cinefex, “I pieced all the images together. I unwrapped them into texture space and projected them onto a sphere with a trench. By doing that with enough pictures, I got pretty complete coverage of the original model, and that became a template upon which to redraw very high-resolution texture maps. Every panel, every vertical striped line, I matched from a photograph. It was as accurate as it was possible to be as a reproduction of the original model.” Knoll’s investigative eye continued to pay dividends when analyzing the three-foot and eight-foot Star Destroyer motion-control models, which had been built for A New Hope and Star Wars: The Empire Strikes Back (1980), respectively. “Our general mantra was, ‘Match your memory of it more than the reality,’ because sometimes you go look at the actual prop in the archive building or you look back at the actual shot from the movie, and you go, ‘Oh, I remember it being a little better than that,’” Knoll conveyed to TheASC.com. This philosophy motivated ILM to combine elements from those two physical models into a single digital design. “Generally, we copied the three-footer for details like the superstructure on the top of the bridge, but then we copied the internal lighting plan from the eight-footer,” Knoll explained. “And then the upper surface of the three-footer was relatively undetailed because there were no shots that saw it closely, so we took a lot of the high-detail upper surface from the eight-footer. So it’s this amalgam of the two models, but the goal was to try to make it look like you remember it from A New Hope.” A final frame from Rogue One: A Star Wars Story (Credit: ILM & Lucasfilm). Forming Up the Fleets In addition to the U.S.S. Enterprise-E, the Battle of Sector 001 debuted numerous vessels representing four new Starfleet ship classes – the Akira, Steamrunner, Saber, and Norway – all designed by ILM visual effects art director Alex Jaeger. “Since we figured a lot of the background action in the space battle would be done with computer graphics ships that needed to be built from scratch anyway, I realized that there was no reason not to do some new designs,” John Knoll told American Cinematographer writer Ron Magid. Used in previous Star Trek projects, older physical models for the Oberth and Nebula classes were mixed into the fleet for good measure, though the vast majority of the armada originated as computer graphics. Over at Scarif, ILM portrayed the Rebel Alliance forces with computer graphics models of fresh designs (the MC75 cruiser Profundity and U-wings), live-action versions of Star Wars Rebels’ VCX-100 light freighter Ghost and Hammerhead corvettes, and Star Wars staples (Nebulon-B frigates, X-wings, Y-wings, and more). These ships face off against two Imperial Star Destroyers and squadrons of TIE fighters, and – upon their late arrival to the battle – Darth Vader’s Star Destroyer and the Death Star. The Tantive IV, a CR90 corvette more popularly referred to as a blockade runner, made its own special cameo at the tail end of the fight. 
As Princess Leia Organa’s (Carrie Fisher and Ingvild Deila) personal ship, the Tantive IV received the Death Star plans and fled the scene, destined to be captured by Vader’s Star Destroyer at the beginning of A New Hope. And, while we’re on the subject of intricate starship maneuvers and space-based choreography… Although the First Contact team could plan visual effects shots with animated storyboards, ILM supplied Gareth Edwards with a next-level virtual viewfinder that allowed the director to select his shots by immersing himself among Rogue One’s ships in real time. “What we wanted to do is give Gareth the opportunity to shoot his space battles and other all-digital scenes the same way he shoots his live-action. Then he could go in with this sort of virtual viewfinder and view the space battle going on, and figure out what the best angle was to shoot those ships from,” senior animation supervisor Hal Hickel described in the Rogue One: Digital Storytelling featurette. Hickel divulged that the sequence involving the dish array docking with the Death Star was an example of the “spontaneous discovery of great angles,” as the scene was never storyboarded or previsualized. Visual effects supervisor John Knoll with director Gareth Edwards during production of Rogue One: A Star Wars Story (Credit: ILM & Lucasfilm). Tough Little Ships The Federation and Rebel Alliance each deployed “tough little ships” (an endearing description Commander William T. Riker [Jonathan Frakes] bestowed upon the U.S.S. Defiant in First Contact) in their respective conflicts, namely the U.S.S. Defiant from Deep Space Nine and the Tantive IV from A New Hope. VisionArt had already built a CG Defiant for the Deep Space Nine series, but ILM upgraded the model with images gathered from the ship’s three-foot physical model. A similar tactic was taken to bring the Tantive IV into the digital realm for Rogue One. “This was the Blockade Runner. This was the most accurate 1:1 reproduction we could possibly have made,” model supervisor Russell Paul declared to Cinefex’s Joe Fordham. “We did an extensive photo reference shoot and photogrammetry re-creation of the miniature. From there, we built it out as accurately as possible.” Speaking of sturdy ships, if you look very closely, you can spot a model of the Millennium Falcon flashing across the background as the U.S.S. Defiant makes an attack run on the Borg cube at the Battle of Sector 001! Exploration and Hope The in-universe ramifications that materialize from the Battles of Sector 001 and Scarif are monumental. The destruction of the Borg cube compels the Borg Queen to travel back in time in an attempt to vanquish Earth before the Federation can even be formed, but Captain Picard and the Enterprise-E foil the plot and end up helping their 21st century ancestors make “first contact” with another species, the logic-revering Vulcans. The post-Scarif benefits take longer to play out for the Rebel Alliance, but the theft of the Death Star plans eventually leads to the superweapon’s destruction. The Galactic Civil War is far from over, but Scarif is a significant step in the Alliance’s effort to overthrow the Empire. The visual effects ILM provided for First Contact and Rogue One contributed significantly to the critical and commercial acclaim both pictures enjoyed, a victory reflecting the relentless dedication, tireless work ethic, and innovative spirit embodied by visual effects supervisor John Knoll and ILM’s entire staff. 
While being interviewed for The Making of Star Trek: First Contact, actor Patrick Stewart praised ILM’s invaluable influence, emphasizing, “ILM was with us, on this movie, almost every day on set. There is so much that they are involved in.” And, regardless of your personal preferences – phasers or lasers, photon torpedoes or proton torpedoes, warp speed or hyperspace – perhaps Industrial Light & Magic’s ability to infuse excitement into both franchises demonstrates that Star Trek and Star Wars encompass themes that are not competitive, but compatible. After all, what goes together better than exploration and hope? – Jay Stobie (he/him) is a writer, author, and consultant who has contributed articles to ILM.com, Skysound.com, Star Wars Insider, StarWars.com, Star Trek Explorer, Star Trek Magazine, and StarTrek.com. Jay loves sci-fi, fantasy, and film, and you can learn more about him by visiting JayStobie.com or finding him on Twitter, Instagram, and other social media platforms at @StobiesGalaxy.
  • The Download: gambling with humanity’s future, and the FDA under Trump

    This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

    Tech billionaires are making a risky bet with humanity’s future

    Sam Altman, Jeff Bezos, Elon Musk, and others may have slightly different goals, but their grand visions for the next decade and beyond are remarkably similar. They include aligning AI with the interests of humanity; creating an artificial superintelligence that will solve all the world’s most pressing problems; merging with that superintelligence to achieve immortality (or something close to it); establishing a permanent, self-sustaining colony on Mars; and, ultimately, spreading out across the cosmos. Three features play a central role in powering these visions, says Adam Becker, a science writer and astrophysicist: an unshakable certainty that technology can solve any problem, a belief in the necessity of perpetual growth, and a quasi-religious obsession with transcending our physical and biological limits. In his timely new book, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley’s Crusade to Control the Fate of Humanity, Becker reveals how these fantastical visions conceal a darker agenda. Read the full story.

    —Bryan Gardiner

    This story is from the next print edition of MIT Technology Review, which explores power—who has it, and who wants it. It’s set to go live on Wednesday June 25, so subscribe & save 25% to read it and get a copy of the issue when it lands!

    Here’s what food and drug regulation might look like under the Trump administration

    Earlier this week, two new leaders of the US Food and Drug Administration published a list of priorities for the agency. Both Marty Makary and Vinay Prasad are controversial figures in the science community. They were generally highly respected academics until the covid pandemic, when their contrarian opinions on masking, vaccines, and lockdowns alienated many of their colleagues.

    Given all this, along with recent mass firings of FDA employees, lots of people were pretty anxious to see what this list might include—and what we might expect the future of food and drug regulation in the US to look like. So let’s dive into the pair’s plans for new investigations, speedy approvals, and the “unleashing” of AI.

    —Jessica Hamzelou

    This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.

    The must-reads

    I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

    1 NASA is investigating leaks on the ISS
    It’s postponed launching private astronauts to the station while it evaluates. (WP $)
    + Its core component has been springing small air leaks for months. (Reuters)
    + Meanwhile, this Chinese probe is en route to a near-Earth asteroid. (Wired $)

    2 Undocumented migrants are using social media to warn of ICE raids
    The DIY networks are anonymously reporting police presences across LA. (Wired $)
    + Platforms’ relationships with protest activism have changed drastically. (NY Mag $)

    3 Google’s AI Overviews is hallucinating about the fatal Air India crash
    It incorrectly stated that it involved an Airbus plane, not a Boeing 787. (Ars Technica)
    + Why Google’s AI Overviews gets things wrong. (MIT Technology Review)

    4 Chinese engineers are sneaking suitcases of hard drives into the country
    To covertly train advanced AI models. (WSJ $)
    + The US is cracking down on Huawei’s ability to produce chips. (Bloomberg $)
    + What the US-China AI race overlooks. (Rest of World)

    5 The National Hurricane Center is joining forces with DeepMind
    It’s the first time the center has used AI to predict nature’s worst storms. (NYT $)
    + Here’s what we know about hurricanes and climate change. (MIT Technology Review)

    6 OpenAI is working on a product with toymaker Mattel
    AI-powered Barbies?! (FT $)
    + Nothing is safe from the creep of AI, not even playtime. (LA Times $)
    + OpenAI has ambitions to reach billions of users. (Bloomberg $)

    7 Chatbots posing as licensed therapists may be breaking the law
    Digital rights organizations have filed a complaint to the FTC. (404 Media)
    + How do you teach an AI model to give therapy? (MIT Technology Review)

    8 Major companies are abandoning their climate commitments
    But some experts argue this may not be entirely bad. (Bloomberg $)
    + Google, Amazon and the problem with Big Tech’s climate claims. (MIT Technology Review)

    9 Vibe coding is shaking up software engineering
    Even though AI-generated code is inherently unreliable. (Wired $)
    + What is vibe coding, exactly? (MIT Technology Review)

    10 TikTok really loves hotdogs
    And who can blame it? (Insider $)

    Quote of the day

    “It kind of jams two years of work into two months.”

    —Andrew Butcher, president of the Maine Connectivity Authority, tells Ars Technica why it’s so difficult to meet the Trump administration’s new plans to increase broadband access in certain states.

    One more thing

    The surprising barrier that keeps us from building the housing we need
    It’s a tough time to try and buy a home in America. From the beginning of the pandemic to early 2024, US home prices rose by 47%. In large swaths of the country, buying a home is no longer a possibility even for those with middle-class incomes. For many, that marks the end of an American dream built around owning a house. Over the same time, rents have gone up 26%.
    The reason for the current rise in the cost of housing is clear to most economists: a lack of supply. Simply put, we don’t build enough houses and apartments, and we haven’t for years.

    But the reality is that even if we ease the endless permitting delays and begin cutting red tape, we will still be faced with a distressing fact: The construction industry is not very efficient when it comes to building stuff. Read the full story.

    —David Rotman

    We can still have nice things

    A place for comfort, fun and distraction to brighten up your day.
    + If you’re one of the unlucky people who has triskaidekaphobia, look away now.
    + 15-year-old Nicholas is preparing to head from his home in the UK to Japan to become a professional sumo wrestler.
    + Earlier this week, London played host to 20,000 women in bald caps. But why?
    + Why do dads watch TV standing up? I need to know.
  • Share of the Week: Stealth

    Last week, we asked you to share stealthy moments from the game of your choice using #PSshare #PSBlog. Here are this week’s sneaky highlights:

    call_me_xavii shares Aloy peeking through tall grass in Horizon Zero Dawn Remastered.

    crimsonashtree shares Connor sneaking up on someone in Detroit Become Human.

    BBSnakeCorn shares the Cairn protagonist sneakily swimming up to a frog.

    Lny_Trpr_EE7 shares Ashley sneaking through the mansion in Resident Evil 4 Remake.

    xenobitz shares Rover preparing a sneak attack from behind a boulder in Wuthering Waves.

    _xFenrir shares Naoe sneaking past a guard in Assassin’s Creed Shadows.

    Search #PSshare #PSBlog on Twitter or Instagram to see more entries to this week’s theme. Want to be featured in the next Share of the Week?

    THEME: Swim
    SUBMIT BY: 11:59 PM PT on June 18, 2025

    Next week, we’re diving in for a quick swim. Share moments swimming from the game of your choice using #PSshare #PSBlog for a chance to be featured.