• Calling on LLMs: New NVIDIA AI Blueprint Helps Automate Telco Network Configuration

    Telecom companies last year spent nearly $295 billion in capital expenditures and over $1 trillion in operating expenditures.
    These large expenses are due in part to laborious manual processes that telcos face when operating networks that require continuous optimizations.
    For example, telcos must constantly tune network parameters for tasks — such as transferring calls from one network to another or distributing network traffic across multiple servers — based on the time of day, user behavior, mobility and traffic type.
    These factors directly affect network performance, user experience and energy consumption.
    To automate these optimization processes and save costs for telcos across the globe, NVIDIA today unveiled at GTC Paris its first AI Blueprint for telco network configuration.
    At the blueprint’s core are customized large language models trained specifically on telco network data — as well as the full technical and operational architecture for turning the LLMs into an autonomous, goal-driven AI agent for telcos.
    Automate Network Configuration With the AI Blueprint
    NVIDIA AI Blueprints — available on build.nvidia.com — are customizable AI workflow examples. They include reference code, documentation and deployment tools that show enterprise developers how to deliver business value with NVIDIA NIM microservices.
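    As a rough illustration of what working against a NIM microservice looks like, the sketch below queries a NIM-hosted LLM through its OpenAI-compatible API from Python. The endpoint URL, model name and prompt are illustrative assumptions rather than details taken from the blueprint itself.

    # Hypothetical example: querying a NIM-hosted LLM via its OpenAI-compatible API.
    # The base_url, model name and prompt are assumptions for illustration only.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",  # assumed local NIM deployment
        api_key="not-used-for-local-deployments",
    )

    response = client.chat.completions.create(
        model="meta/llama-3.1-70b-instruct",  # example NIM model identifier
        messages=[
            {"role": "system", "content": "You are a telco network configuration assistant."},
            {"role": "user", "content": "Suggest a handover hysteresis setting for a congested urban cell."},
        ],
        temperature=0.2,
    )
    print(response.choices[0].message.content)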
    The AI Blueprint for telco network configuration — built with BubbleRAN 5G solutions and datasets — enables developers, network engineers and telecom providers to automatically optimize the configuration of network parameters using agentic AI.
    This can streamline operations, reduce costs and significantly improve service quality by embedding continuous learning and adaptability directly into network infrastructures.
    Traditionally, network configurations required manual intervention or followed rigid rules to adapt to dynamic network conditions. These approaches limited adaptability and increased operational complexities, costs and inefficiencies.
    The new blueprint helps shift telco operations from relying on static, rules-based systems to operations based on dynamic, AI-driven automation. It enables developers to build advanced, telco-specific AI agents that make real-time, intelligent decisions and autonomously balance trade-offs — such as network speed versus interference, or energy savings versus utilization — without human input.
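    In the abstract, such an agent can be pictured as a closed loop: read the current network KPIs, ask the LLM for a parameter recommendation against a stated goal, apply the change and re-evaluate. The sketch below is purely illustrative and does not reflect the blueprint's actual interfaces; get_kpis and apply_config are hypothetical stand-ins for whatever OSS or RAN APIs a deployment would use.

    # Illustrative closed-loop agent for goal-driven parameter tuning.
    # get_kpis() and apply_config() are hypothetical stand-ins for real OSS/RAN APIs.
    import json
    import time

    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")  # assumed NIM endpoint
    GOAL = "Keep downlink bitrate above 80 Mbps while holding SINR above 15 dB."

    def tuning_loop(get_kpis, apply_config, interval_s=300):
        while True:
            kpis = get_kpis()  # e.g. {"bitrate_mbps": 62.0, "sinr_db": 17.5, "prb_utilization": 0.91}
            prompt = (
                f"Goal: {GOAL}\nCurrent KPIs: {json.dumps(kpis)}\n"
                'Reply with JSON only: {"parameter": ..., "new_value": ..., "reason": ...}'
            )
            reply = client.chat.completions.create(
                model="meta/llama-3.1-70b-instruct",  # example model identifier
                messages=[{"role": "user", "content": prompt}],
                temperature=0.1,
            )
            recommendation = json.loads(reply.choices[0].message.content)
            apply_config(recommendation["parameter"], recommendation["new_value"])
            time.sleep(interval_s)  # wait before the next evaluation cycle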
    Powered and Deployed by Industry Leaders
    Trained on 5G data generated by BubbleRAN and deployed on the BubbleRAN 5G O-RAN platform, the blueprint provides telcos with insight into how to set various parameters to reach performance goals, like achieving a certain bitrate while choosing an acceptable signal-to-noise ratio — a measure that impacts voice quality and thus user experience.
    With the new AI Blueprint, network engineers can confidently set initial parameter values and update them as demanded by continuous network changes.
    Norway-based Telenor Group, which serves over 200 million customers globally, is the first telco to integrate the AI Blueprint for telco network configuration as part of its initiative to deploy intelligent, autonomous networks that meet the performance and agility demands of 5G and beyond.
    “The blueprint is helping us address configuration challenges and enhance quality of service during network installation,” said Knut Fjellheim, chief technology innovation officer at Telenor Maritime. “Implementing it is part of our push toward network automation and follows the successful deployment of agentic AI for real-time network slicing in a private 5G maritime use case.”
    Industry Partners Deploy Other NVIDIA-Powered Autonomous Network Technologies
    The AI Blueprint for telco network configuration is just one of many announcements at NVIDIA GTC Paris showcasing how the telecom industry is using agentic AI to make autonomous networks a reality.
    Beyond the blueprint, leading telecom companies and solutions providers are tapping into NVIDIA accelerated computing, software and microservices to provide breakthrough innovations poised to vastly improve networks and communications services — accelerating the progress to autonomous networks and improving customer experiences.
    NTT DATA is powering its agentic platform for telcos with NVIDIA accelerated compute and the NVIDIA AI Enterprise software platform. Its first agentic use case is focused on network alarm management, where NVIDIA NIM microservices help automate and power observability, troubleshooting, anomaly detection and resolution with closed-loop ticketing.
    Tata Consultancy Services is delivering agentic AI solutions for telcos built on NVIDIA DGX Cloud and using NVIDIA AI Enterprise to develop, fine-tune and integrate large telco models into AI agent workflows. These range from billing and revenue assurance and autonomous network management to hybrid edge-cloud distributed inference.
    For example, the company’s anomaly management agentic AI model includes real-time detection and resolution of network anomalies and service performance optimization. This increases business agility and improves operational efficiencies by up to 40% by eliminating human-intensive toil, overhead and cross-departmental silos.
    Prodapt has introduced an autonomous operations workflow for networks, powered by NVIDIA AI Enterprise, that offers agentic AI capabilities to support autonomous telecom networks. AI agents can autonomously monitor networks, detect anomalies in real time, initiate diagnostics, analyze root causes of issues using historical data and correlation techniques, automatically execute corrective actions, and generate, enrich and assign incident tickets through integrated ticketing systems.
    Accenture announced its new portfolio of agentic AI solutions for telecommunications through its AI Refinery platform, built on NVIDIA AI Enterprise software and accelerated computing.
    The first available solution, the NOC Agentic App, boosts network operations center tasks by using a generative AI-driven, nonlinear agentic framework to automate processes such as incident and fault management, root cause analysis and configuration planning. Using the Llama 3.1 70B NVIDIA NIM microservice and the AI Refinery Distiller Framework, the NOC Agentic App orchestrates networks of intelligent agents for faster, more efficient decision-making.
    Infosys is announcing its agentic autonomous operations platform, called Infosys Smart Network Assurance (ISNA), designed to accelerate telecom operators’ journeys toward fully autonomous network operations.
    ISNA helps address long-standing operational challenges for telcos — such as limited automation and high average time to repair — with an integrated, AI-driven platform that reduces operational costs by up to 40% and shortens fault resolution times by up to 30%. NVIDIA NIM and NeMo microservices enhance the platform’s reasoning and hallucination-detection capabilities, reduce latency and increase accuracy.
    Get started with the new blueprint today.
    Learn more about the latest AI advancements for telecom and other industries at NVIDIA GTC Paris, running through Thursday, June 12, at VivaTech, including a keynote from NVIDIA founder and CEO Jensen Huang and a special address from Ronnie Vasishta, senior vice president of telecom at NVIDIA. Plus, hear from industry leaders in a panel session with Orange, Swisscom, Telenor and NVIDIA.
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than a world constructed around photorealistic pixels, the video game created by Markus Persson took the boxier 3D voxel route, which became its signature aesthetic and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought on board to help filmmaker Jared Hess create the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise, under the direction of Production VFX Supervisor Dan Lemmon.

    “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black).

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”
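    As a small, hypothetical example of the kind of cleanup pass Finlayson describes, the Blender Python (bpy) snippet below merges stray vertices, reduces polygon count with a decimate modifier and exports the result as FBX for Unreal Engine; the merge threshold, decimate ratio and file path are arbitrary placeholder values.

    # Hypothetical Blender (bpy) cleanup pass before handing a mesh to Unreal Engine.
    # Merge threshold, decimate ratio and export path are placeholder values.
    import bpy

    obj = bpy.context.active_object  # assumes a mesh object is selected

    # Merge coincident vertices left over from the ingested model.
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.mesh.remove_doubles(threshold=0.001)
    bpy.ops.object.mode_set(mode='OBJECT')

    # Reduce geometry density with a decimate modifier, then apply it.
    decimate = obj.modifiers.new(name="Decimate", type='DECIMATE')
    decimate.ratio = 0.5  # keep roughly half the faces
    bpy.ops.object.modifier_apply(modifier=decimate.name)

    # Export the cleaned mesh for import into Unreal Engine.
    bpy.ops.export_scene.fbx(filepath="/tmp/cleaned_asset.fbx", use_selection=True)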

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”
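    For readers unfamiliar with the modifier Bell mentions, the short bpy sketch below applies Blender's Remesh modifier in 'Blocks' mode, which rebuilds a surface out of axis-aligned cubes, a natural fit for voxel-style geometry; the octree depth is an arbitrary illustrative value.

    # Minimal sketch: apply Blender's Remesh modifier in 'BLOCKS' mode,
    # which reconstructs the surface from axis-aligned cubes.
    import bpy

    obj = bpy.context.active_object  # assumes a mesh object is selected

    remesh = obj.modifiers.new(name="Remesh", type='REMESH')
    remesh.mode = 'BLOCKS'                   # cube-based reconstruction
    remesh.octree_depth = 6                  # illustrative resolution; higher means finer blocks
    remesh.use_remove_disconnected = False   # keep floating pieces of the model

    bpy.ops.object.modifier_apply(modifier=remesh.name)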

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”

    Piglots cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
    #how #disguise #built #out #virtual
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG Images courtesy of Warner Bros. Pictures. Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon. “s the Senior Unreal Artist within the Virtual Art Departmenton Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” —Talia Finlayson, Creative Technologist, Disguise Interior and exterior environments had to be created, such as the shop owned by Steve. “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Departmenton Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.” A virtual exploration of Steve’s shop in Midport Village. Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. 
“Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.” “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” —Laura Bell, Creative Technologist, Disguise Among the buildings that had to be created for Midport Village was Steve’sLava Chicken Shack. Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.” Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younisadapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. 
On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!” A virtual study and final still of the cast members standing outside of the Lava Chicken Shack. “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.” —Talia Finlayson, Creative Technologist, Disguise The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.” Virtually conceptualizing the layout of Midport Village. Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay Georgeand I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.” An example of the virtual and final version of the Woodland Mansion. “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.” —Laura Bell, Creative Technologist, Disguise Extensive detail was given to the center of the sets where the main action unfolds. 
“For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.” Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment. Doing a virtual scale study of the Mountainside. Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.” Piglots cause mayhem during the Wingsuit Chase. Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods. “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols, Pat Younis, Jake Tuckand Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.” #how #disguise #built #out #virtual
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG Images courtesy of Warner Bros. Pictures. Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon. “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” —Talia Finlayson, Creative Technologist, Disguise Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black). “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.” A virtual exploration of Steve’s shop in Midport Village. Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. 
    "Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience's eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story."

    Among the buildings that had to be created for Midport Village was Steve's (Jack Black) Lava Chicken Shack.

    Concept art served as a visual touchstone. "We received concept art provided by the amazing team of concept artists," Finlayson states. "Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding."

    At times, the video game assets came in handy. "Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world," Finlayson explains. "In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon."

    Flexibility was critical. "A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration," Finlayson remarks. "Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process."

    Production schedules influence workflows, pipelines and techniques. "No two projects will ever feel exactly the same," Bell notes. "For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I'll run into again anytime soon!"

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    The design and composition of virtual environments tended to remain consistent throughout principal photography. "The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve's lava chicken shack," Finlayson remarks.

    "I would agree that Midport Village likely went through the most iterations," Bell responds. "The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film's characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set."

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. "What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story," Finlayson reveals. "The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions."

    Bell is in agreement with her colleague. "The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot."

    An example of the virtual and final version of the Woodland Mansion.
    Extensive detail was given to the center of the sets where the main action unfolds. "For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds," Finlayson explains. "These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes."

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. "Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world," Bell states. "Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline."

    Piglins cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    "One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update," Finlayson notes. "Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth."

    There was another challenge that had more to do with familiarity. "Having a VAD on a film is still a relatively new process in production," Bell states. "There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting."
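    The geometry-cleanup step described above (ingesting boxy set models into Blender, leaning on the Remesh modifier to keep the voxel look, then handing the result to Unreal Engine) can be sketched with Blender's Python API. This is only an illustration of that kind of pass; the object naming convention, modifier settings and export path below are assumptions, not details from the production pipeline.

```python
# Hypothetical sketch of a blocky remesh-and-export cleanup pass in Blender,
# loosely following the workflow described in the article. Names and values
# are illustrative only.
import bpy

def remesh_for_voxel_look(obj_name: str, octree_depth: int = 6) -> None:
    """Add a block-mode Remesh modifier to one mesh object and apply it."""
    obj = bpy.data.objects[obj_name]
    mod = obj.modifiers.new(name="VoxelBlocks", type='REMESH')
    mod.mode = 'BLOCKS'               # keep the cube/voxel aesthetic
    mod.octree_depth = octree_depth   # resolution of the blocky re-mesh
    mod.use_remove_disconnected = False
    bpy.context.view_layer.objects.active = obj
    bpy.ops.object.modifier_apply(modifier=mod.name)

# Example: clean up every mesh whose (hypothetical) name starts with "SET_",
# then export the scene for Unreal Engine.
for o in list(bpy.data.objects):
    if o.type == 'MESH' and o.name.startswith("SET_"):
        remesh_for_voxel_look(o.name)

bpy.ops.export_scene.fbx(filepath="/tmp/midport_village.fbx", use_selection=False)
```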
  • The protests in Los Angeles have brought a lot of attention, but honestly, it’s just the same old story. Chatbot disinformation is like that annoying fly that keeps buzzing around, never really going away. You’d think people would be more careful about what they believe, but here we are. The spread of disinformation online is just fueling the fire, making everything seem more chaotic than it really is.

    It’s kind of exhausting to see the same patterns repeat. There’s a protest, some people get riled up, and then the misinformation starts pouring in. It’s like a never-ending cycle. Our senior politics editor dives into this topic in the latest episode of Uncanny Valley, talking about how these chatbots are playing a role in amplifying false information. Not that many people seem to care, though.

    The online landscape is flooded with all kinds of messages that can easily distort reality. It’s almost as if people are too tired to fact-check anymore. Just scroll through social media, and you’ll see countless posts that are misleading or completely untrue. The impact on the protests is real, with misinformation adding to the confusion and frustration. One could argue that it’s a bit depressing, really.

    As the protests continue, it’s hard to see a clear path forward. Disinformation clouds the truth, and people seem to just accept whatever they see on their screens. It’s all so monotonous. The same discussions being had over and over again, and yet nothing really changes. The chatbots keep generating content, and the cycle goes on.

    Honestly, it makes you wonder whether anyone is actually listening or if they’re just scrolling mindlessly. The discussions about the protests and the role of disinformation should be enlightening, but they often feel repetitive and bland. It’s hard to muster any excitement when the conversations feel so stale.

    In the end, it’s just more noise in a world that’s already too loud. The protests might be important, but the chatbots and their disinformation are just taking away from the real issues at hand. This episode of Uncanny Valley might shed some light, but will anyone really care? Who knows.

    #LosAngelesProtests
    #Disinformation
    #Chatbots
    #UncannyValley
    #Misinformation
    The Chatbot Disinfo Inflaming the LA Protests
    On this episode of Uncanny Valley, our senior politics editor discusses the spread of disinformation online following the onset of the Los Angeles protests.
  • Asus ROG Xbox Ally, ROG Xbox Ally X to Start Pre-Orders in August, Launch in October – Rumour

    A new report indicates that the ROG Xbox Ally will be priced at around €599, while the more powerful ROG Xbox Ally X will cost €899.

    Posted By Joelle Daniels | On 16th, Jun. 2025

    While Microsoft and Asus have unveiled the ROG Xbox Ally and ROG Xbox Ally X handheld gaming systems, the companies have yet to confirm the prices or release dates for the two systems. While the announcement mentioned that they will launch later this year, a new report, courtesy of leaker Extas1s, indicates that pre-orders for both devices will kick off in August, with the launch then happening in October.

    As noted by Extas1s, the lower-powered ROG Xbox Ally is expected to be priced around €599. The leaker claims to have corroborated the pricing details for the handheld with two different Europe-based retailers. The more powerful ROG Xbox Ally X, on the other hand, is expected to be priced at €899. This would put its pricing in line with Asus’s own ROG Ally X.

    Previously, Asus senior manager of marketing content for gaming, Whitson Gordon, had revealed that pricing and power use were the two biggest reasons why both the ROG Xbox Ally and the ROG Xbox Ally X didn’t feature OLED displays. Rather, both systems will come equipped with 7-inch 1080p 120 Hz LCD displays with variable refresh rate capabilities.

    “We did some R&D and prototyping with OLED, but it’s still not where we want it to be when you factor VRR into the mix and we aren’t willing to give up VRR,” said Gordon. “I’ll draw that line in the sand right now. I am of the opinion that if a display doesn’t have variable refresh rate, it’s not a gaming display in the year 2025 as far as I’m concerned, right? That’s a must-have feature, and OLED with VRR right now draws significantly more power than the LCD that we’re currently using on the Ally and it costs more.”

    Explaining further that the decision ultimately also came down to keeping the pricing for both systems at reasonable levels, since buyers often tend to get handheld gaming systems as their secondary machines, Gordon noted that both handhelds would have much higher price tags if OLED displays were used.

    “That’s all I’ll say about price,” said Gordon. “You have to align your expectations with the market and what we’re doing here. Adding 32GB, OLED, Z2 Extreme, and all of those extra bells and whistles would cost a lot more than the price bracket you guys are used to on the Ally, and the vast majority of users are not willing to pay that kind of price.”

    Shortly after its announcement, Microsoft and Asus released a video where the two companies spoke about the various features of the ROG Xbox Ally and ROG Xbox Ally X. In the video, we also get to see an early hardware prototype of the handheld gaming system built inside a cardboard box.

    The ROG Xbox Ally runs on an AMD Ryzen Z2A chip, and has 16 GB of LPDDR5X-6400 RAM and 512 GB of storage. The ROG Xbox Ally X, on the other hand, runs on an AMD Ryzen Z2 Extreme chip, and has 24 GB of LPDDR5X-8000 RAM and 1 TB of storage. Both systems run on Windows.

    GAMINGBOLT.COM
    Asus ROG Xbox Ally, ROG Xbox Ally X to Start Pre-Orders in August, Launch in October – Rumour
  • The AI execution gap: Why 80% of projects don’t reach production

    Enterprise artificial intelligence investment is unprecedented, with IDC projecting global spending on AI and GenAI to double to $631 billion by 2028. Yet beneath the impressive budget allocations and boardroom enthusiasm lies a troubling reality: most organisations struggle to translate their AI ambitions into operational success.

    The sobering statistics behind AI’s promise
    ModelOp’s 2025 AI Governance Benchmark Report, based on input from 100 senior AI and data leaders at Fortune 500 enterprises, reveals a disconnect between aspiration and execution. While more than 80% of enterprises have 51 or more generative AI projects in proposal phases, only 18% have successfully deployed more than 20 models into production.
    The execution gap represents one of the most significant challenges facing enterprise AI today. Most generative AI projects still require 6 to 18 months to go live, if they reach production at all. The result is delayed returns on investment, frustrated stakeholders, and diminished confidence in AI initiatives across the enterprise.

    The cause: Structural, not technical barriers
    The biggest obstacles preventing AI scalability aren’t technical limitations; they’re structural inefficiencies plaguing enterprise operations. The ModelOp benchmark report identifies several problems that create what experts call a “time-to-market quagmire.”
    Fragmented systems plague implementation. 58% of organisations cite fragmented systems as the top obstacle to adopting governance platforms. Fragmentation creates silos where different departments use incompatible tools and processes, making it nearly impossible to maintain consistent oversight of AI initiatives.
    Manual processes dominate despite digital transformation. 55% of enterprises still rely on manual processes, including spreadsheets and email, to manage AI use case intake. Reliance on antiquated methods creates bottlenecks, increases the likelihood of errors, and makes it difficult to scale AI operations.
    Lack of standardisation hampers progress. Only 23% of organisations implement standardised intake, development, and model management processes. Without these, each AI project becomes a unique challenge requiring custom solutions and extensive coordination across multiple teams.
    Enterprise-level oversight remains rare. Just 14% of companies perform AI assurance at the enterprise level, increasing the risk of duplicated efforts and inconsistent oversight. The lack of centralised governance means organisations often discover they’re solving the same problems multiple times in different departments.

    The governance revolution: From obstacle to accelerator
    A change is taking place in how enterprises view AI governance. Rather than seeing it as a compliance burden that slows innovation, forward-thinking organisations recognise governance as an important enabler of scale and speed.
    Leadership alignment signals a strategic shift. The ModelOp benchmark data reveals a change in organisational structure: 46% of companies now assign accountability for AI governance to a Chief Innovation Officer, more than four times the number who place accountability under Legal or Compliance. This repositioning reflects a new understanding that governance isn’t solely about risk management but can enable innovation.
    Investment follows strategic priority. A financial commitment to AI governance underscores its importance. According to the report, 36% of enterprises have budgeted at least $1 million annually for AI governance software, while 54% have allocated resources specifically for AI Portfolio Intelligence to track value and ROI.

    What high-performing organisations do differently
    The enterprises that successfully bridge the execution gap share several characteristics in their approach to AI implementation:
    Standardised processes from day one. Leading organisations implement standardised intake, development, and model review processes across AI initiatives. Consistency eliminates the need to reinvent workflows for each project and ensures that all stakeholders understand their responsibilities.
    Centralised documentation and inventory. Rather than allowing AI assets to proliferate in disconnected systems, successful enterprises maintain centralised inventories that provide visibility into every model’s status, performance, and compliance posture.
    Automated governance checkpoints. High-performing organisations embed automated governance checkpoints throughout the AI lifecycle, helping ensure compliance requirements and risk assessments are addressed systematically rather than as afterthoughts.
    End-to-end traceability. Leading enterprises maintain complete traceability of their AI models, including data sources, training methods, validation results, and performance metrics.

    Measurable impact of structured governance
    The benefits of implementing comprehensive AI governance extend beyond compliance. Organisations that adopt lifecycle automation platforms reportedly see dramatic improvements in operational efficiency and business outcomes. A financial services firm profiled in the ModelOp report halved its time to production and cut issue resolution time by 80% after implementing automated governance processes. Such improvements translate directly into faster time-to-value and increased confidence among business stakeholders.
    Enterprises with robust governance frameworks report being able to manage many times more models simultaneously while maintaining oversight and control. This scalability lets organisations pursue AI initiatives across multiple business units without overwhelming their operational capabilities.

    The path forward: From stuck to scaled
    The message from industry leaders is that the gap between AI ambition and execution is solvable, but closing it requires a shift in approach. Rather than treating governance as a necessary evil, enterprises should recognise it as what enables AI innovation at scale.

    Immediate action items for AI leaders
    Organisations looking to escape the time-to-market quagmire should prioritise the following:
    Audit current state: Conduct an assessment of existing AI initiatives, identifying fragmented processes and manual bottlenecks.
    Standardise workflows: Implement consistent processes for AI use case intake, development, and deployment across all business units.
    Invest in integration: Deploy platforms that unify disparate tools and systems under a single governance framework.
    Establish enterprise oversight: Create centralised visibility into all AI initiatives with real-time monitoring and reporting capabilities.

    The competitive advantage of getting it right
    Organisations that solve the execution challenge will be able to bring AI solutions to market faster, scale more efficiently, and maintain the trust of stakeholders and regulators. Enterprises that continue with fragmented processes and manual workflows will find themselves at a disadvantage compared to their more organised competitors. Operational excellence isn’t just about efficiency; it’s about survival.
    The data shows enterprise AI investment will continue to grow. The question isn’t whether organisations will invest in AI, but whether they’ll develop the operational abilities necessary to realise a return on that investment. The opportunity to lead in the AI-driven economy has never been greater for those willing to embrace governance as an enabler rather than an obstacle.
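    As a rough illustration of the “centralised inventory” and “automated governance checkpoint” patterns the report describes, the sketch below models a minimal in-house setup. The record fields, check rules and example models are hypothetical assumptions for illustration and do not come from the ModelOp report or any specific governance product.

```python
# Hypothetical sketch: a centralised model inventory plus an automated governance
# checkpoint that must pass before a model is cleared for deployment.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class ModelRecord:
    """One entry in a centralised AI model inventory."""
    name: str
    owner: str
    data_sources: List[str] = field(default_factory=list)  # supports end-to-end traceability
    validated_on: Optional[date] = None                    # date of last recorded validation
    risk_review_passed: bool = False                        # risk assessment sign-off

def governance_checkpoint(model: ModelRecord) -> List[str]:
    """Return blocking issues; an empty list means the model may proceed to deployment."""
    issues = []
    if not model.data_sources:
        issues.append("no documented data sources")
    if model.validated_on is None:
        issues.append("no recorded validation results")
    if not model.risk_review_passed:
        issues.append("risk assessment not signed off")
    return issues

# Example run over a two-model inventory (both records are made up).
inventory = [
    ModelRecord("churn-scorer", "marketing-analytics", ["crm_events"], date(2025, 5, 2), True),
    ModelRecord("genai-support-bot", "customer-care"),
]
for m in inventory:
    blockers = governance_checkpoint(m)
    print(f"{m.name}: {'ready to deploy' if not blockers else 'blocked: ' + '; '.join(blockers)}")
```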
    WWW.ARTIFICIALINTELLIGENCE-NEWS.COM
    The AI execution gap: Why 80% of projects don’t reach production
    Enterprise artificial intelligence investment is unprecedented, with IDC projecting global spending on AI and GenAI to double to $631 billion by 2028. Yet beneath the impressive budget allocations and boardroom enthusiasm lies a troubling reality: most organisations struggle to translate their AI ambitions into operational success.The sobering statistics behind AI’s promiseModelOp’s 2025 AI Governance Benchmark Report, based on input from 100 senior AI and data leaders at Fortune 500 enterprises, reveals a disconnect between aspiration and execution.While more than 80% of enterprises have 51 or more generative AI projects in proposal phases, only 18% have successfully deployed more than 20 models into production.The execution gap represents one of the most significant challenges facing enterprise AI today. Most generative AI projects still require 6 to 18 months to go live – if they reach production at all.The result is delayed returns on investment, frustrated stakeholders, and diminished confidence in AI initiatives in the enterprise.The cause: Structural, not technical barriersThe biggest obstacles preventing AI scalability aren’t technical limitations – they’re structural inefficiencies plaguing enterprise operations. The ModelOp benchmark report identifies several problems that create what experts call a “time-to-market quagmire.”Fragmented systems plague implementation. 58% of organisations cite fragmented systems as the top obstacle to adopting governance platforms. Fragmentation creates silos where different departments use incompatible tools and processes, making it nearly impossible to maintain consistent oversight in AI initiatives.Manual processes dominate despite digital transformation. 55% of enterprises still rely on manual processes – including spreadsheets and email – to manage AI use case intake. The reliance on antiquated methods creates bottlenecks, increases the likelihood of errors, and makes it difficult to scale AI operations.Lack of standardisation hampers progress. Only 23% of organisations implement standardised intake, development, and model management processes. Without these elements, each AI project becomes a unique challenge requiring custom solutions and extensive coordination by multiple teams.Enterprise-level oversight remains rare Just 14% of companies perform AI assurance at the enterprise level, increasing the risk of duplicated efforts and inconsistent oversight. The lack of centralised governance means organisations often discover they’re solving the same problems multiple times in different departments.The governance revolution: From obstacle to acceleratorA change is taking place in how enterprises view AI governance. Rather than seeing it as a compliance burden that slows innovation, forward-thinking organisations recognise governance as an important enabler of scale and speed.Leadership alignment signals strategic shift. The ModelOp benchmark data reveals a change in organisational structure: 46% of companies now assign accountability for AI governance to a Chief Innovation Officer – more than four times the number who place accountability under Legal or Compliance. This strategic repositioning reflects a new understanding that governance isn’t solely about risk management, but can enable innovation.Investment follows strategic priority. A financial commitment to AI governance underscores its importance. 
According to the report, 36% of enterprises have budgeted at least $1 million annually for AI governance software, while 54% have allocated resources specifically for AI Portfolio Intelligence to track value and ROI.What high-performing organisations do differentlyThe enterprises that successfully bridge the ‘execution gap’ share several characteristics in their approach to AI implementation:Standardised processes from day one. Leading organisations implement standardised intake, development, and model review processes in AI initiatives. Consistency eliminates the need to reinvent workflows for each project and ensures that all stakeholders understand their responsibilities.Centralised documentation and inventory. Rather than allowing AI assets to proliferate in disconnected systems, successful enterprises maintain centralised inventories that provide visibility into every model’s status, performance, and compliance posture.Automated governance checkpoints. High-performing organisations embed automated governance checkpoints throughout the AI lifecycle, helping ensure compliance requirements and risk assessments are addressed systematically rather than as afterthoughts.End-to-end traceability. Leading enterprises maintain complete traceability of their AI models, including data sources, training methods, validation results, and performance metrics.Measurable impact of structured governanceThe benefits of implementing comprehensive AI governance extend beyond compliance. Organisations that adopt lifecycle automation platforms reportedly see dramatic improvements in operational efficiency and business outcomes.A financial services firm profiled in the ModelOp report experienced a halving of time to production and an 80% reduction in issue resolution time after implementing automated governance processes. Such improvements translate directly into faster time-to-value and increased confidence among business stakeholders.Enterprises with robust governance frameworks report the ability to many times more models simultaneously while maintaining oversight and control. This scalability lets organisations pursue AI initiatives in multiple business units without overwhelming their operational capabilities.The path forward: From stuck to scaledThe message from industry leaders that the gap between AI ambition and execution is solvable, but it requires a shift in approach. Rather than treating governance as a necessary evil, enterprises should realise it enables AI innovation at scale.Immediate action items for AI leadersOrganisations looking to escape the ‘time-to-market quagmire’ should prioritise the following:Audit current state: Conduct an assessment of existing AI initiatives, identifying fragmented processes and manual bottlenecksStandardise workflows: Implement consistent processes for AI use case intake, development, and deployment in all business unitsInvest in integration: Deploy platforms to unify disparate tools and systems under a single governance frameworkEstablish enterprise oversight: Create centralised visibility into all AI initiatives with real-time monitoring and reporting abilitiesThe competitive advantage of getting it rightOrganisations that can solve the execution challenge will be able to bring AI solutions to market faster, scale more efficiently, and maintain the trust of stakeholders and regulators.Enterprises that continue with fragmented processes and manual workflows will find themselves disadvantaged compared to their more organised competitors. 
Operational excellence isn’t about efficiency but survival.The data shows enterprise AI investment will continue to grow. Therefore, the question isn’t whether organisations will invest in AI, but whether they’ll develop the operational abilities necessary to realise return on investment. The opportunity to lead in the AI-driven economy has never been greater for those willing to embrace governance as an enabler not an obstacle.(Image source: Unsplash)
  • Delightfully irreverent Underdogs isn’t your parents’ nature docuseries

    show some love for the losers


    Ryan Reynolds narrates NatGeo's new series highlighting nature's much less cool and majestic creatures

    Jennifer Ouellette



    Jun 15, 2025 3:11 pm


    The indestructible honey badger is just one of nature's "benchwarmers" featured in Underdogs. Credit: National Geographic/Doug Parker


    Narrator Ryan Reynolds celebrates nature's outcasts in the new NatGeo docuseries Underdogs.

    Most of us have seen a nature documentary or two at some point in our lives, so it's a familiar format: sweeping majestic footage of impressively regal animals accompanied by reverently high-toned narration. Underdogs, a new docuseries from National Geographic, takes a decidedly different and unconventional approach. Narrated with hilarious irreverence by Ryan Reynolds, the five-part series highlights nature's less cool and majestic creatures: the outcasts and benchwarmers, more noteworthy for their "unconventional hygiene choices" and "unsavory courtship rituals." It's like The Suicide Squad or Thunderbolts*, except these creatures actually exist.
    Per the official premise, "Underdogs features a range of never-before-filmed scenes, including the first time a film crew has ever entered a special cave in New Zealand—a huge cavern that glows brighter than a bachelor pad under a black light thanks to the glowing butts of millions of mucus-coated grubs. All over the world, overlooked superstars like this are out there 24/7, giving it maximum effort and keeping the natural world in working order for all those showboating polar bears, sharks and gorillas." It's rated PG-13 thanks to the odd bit of scatological humor and shots of Nature Sexy Time.
    Each of the five episodes is built around a specific genre. "Superheroes" highlights the surprising superpowers of the honey badger, pistol shrimp, and the invisible glass frog, among others, augmented with comic book graphics; "Sexy Beasts" focuses on bizarre mating habits and follows the format of a romantic advice column; "Terrible Parents" highlights nature's worst practices, following the outline of a parenting guide; "Total Grossout" is exactly what it sounds like; and "The Unusual Suspects" is a heist tale, documenting the supposed efforts of a macaque to put together the ultimate team of masters of deception and disguise. Green Day even wrote and recorded a special theme song for the opening credits.
    Co-creators Mark Linfield and Vanessa Berlowitz of Wildstar Films are longtime producers of award-winning wildlife films, most notably Frozen Planet, Planet Earth and David Attenborough's Life of Mammals—you know, the kind of prestige nature documentaries that have become a mainstay for National Geographic and the BBC, among others. They're justly proud of that work, but this time around the duo wanted to try something different.

    Madagascar's aye-aye: "as if fear and panic had a baby and rolled it in dog hair." Credit: National Geographic/Eleanor Paish
    An emerald jewel wasp emerges from a cockroach. Credit: National Geographic/Simon De Glanville
    A pack of African hunting dogs is no match for the honey badger's thick hide. Credit: National Geographic/Tom Walker
    A fireworm is hit by a cavitation bubble shot from the claw of a pistol shrimp defending its home. Credit: National Geographic/Hugh Miller
    As it grows and molts, the mad hatterpillar stacks old head casings on top of its head. Scientists think the stack serves as a decoy against would-be predators and parasites, and when needed, it can also be used as a weapon. Credit: National Geographic/Katherine Hannaford
    Worst parents ever? A young barnacle goose chick prepares to make the 800-foot jump from its nest to the ground. Credit: National Geographic
    An adult pearlfish reverses into a sea cucumber's butt to hide. Credit: National Geographic
    A vulture sticks its head inside an elephant carcass to eat. Credit: National Geographic
    A manatee releases flatulence while swimming to shed the buoyancy built up from gas in its stomach and descend down the water column. Credit: National Geographic/Karl Davies

    "There is a sense after awhile that you're playing the same animals to the same people, and the shows are starting to look the same and so is your audience," Linfield told Ars. "We thought, okay, how can we do something absolutely the opposite? We've gone through our careers collecting stories of these weird and crazy creatures that don't end up in the script because they're not big or sexy and they live under a rock. But they often have the best life histories and the craziest superpowers."
    Case in point: the velvet worm featured in the "Superheroes" episode, which creeps up on unsuspecting prey before squirting disgusting slime all over their food.Once Linfield and Berlowitz decided to focus on nature's underdogs and to take a more humorous approach, Ryan Reynolds became their top choice for a narrator—the anti-Richard Attenborough. As luck would have it, the pair shared an agent with the mega-star. So even though they thought there was no way Reynolds would agree to the project, they put together a sizzle reel, complete with a "fake Canadian Ryan Reynolds sound-alike" doing the narration. Reynolds was on set when he received the reel, and loved it so much he recoded his own narration for the footage and sent it back.
    "From that moment he was in," said Linfield, and Wildstar Films worked closely with Reynolds and his company to develop the final series. "We've never worked that way on a series before, a joint collaboration from day one," Berlowitz admitted. But it worked: the end result strikes the perfect balance between scientific revelation and accurate natural history, and an edgy comic tone.
    That tone is quintessential Reynolds, and while he did mostly follow the script, Linfield and Berlowitz admit there was also a fair amount of improvisation—not all of it PG-13. "What we hadn't appreciated is that he's an incredible improv performer," said Berlowitz. "He can't help himself. He gets into character and starts riffing off. There are some takes that we definitely couldn't use, that potentially would fit a slightly more Hulu audience." Some of the ad-libs made it into the final episodes, however—like Reynolds describing an aye-aye as "if fear and panic had a baby and rolled it in dog hair"—even though it meant going back and doing a bit of recutting to get the new lines to fit.

    Cinematographer Tom Beldam films a long-tailed macaque who stole his smart phone minutes later.

    National Geographic/Laura Pennafort

    The macaque agrees to trade the stolen phone for a piece of food.

    National Geographic

    A family of tortoise beetles defends itself from a carnivorous ant by wafting baby poop in its direction.

    National Geographic

    A male hippo sprays his feces at another male who is threatening to take over his patch.

    National Geographic

    A male proboscis monkey flaunts his large nose. The noses of these males are used to amplify their calls in the vast forest.

    National Geographic

    Dream girl: A blood-soaked female hyena looks across the African savanna.

    National Geographic

    A male bowerbird presents one of the finest items in his collection to a female in his bower.

    National Geographic

    The male nursery web spider presents his nuptial gift to the female.

    National Geographic

    Cue the Barry White mood music: Two leopard slugs suspend themselves on a rope of mucus as they entwine their bodies to mate with one another.

    National Geographic

    Despite their years of collective experience, Linfield and Berlowitz were initially skeptical when the crew told them about the pearl fish, which hides from predators in a sea cucumber's butt (along with many other species). "It had never been filmed so we said, 'You're going to have to prove it to us,'" said Berlowitz. "They came back with this fantastic, hilarious sequence of a pearl fish reverse parking [in a sea cucumber's anus]."
    The film crew experienced a few heart-pounding moments, most notably while filming the cliffside nests of barnacle geese for the "Terrible Parents" episode. A melting glacier caused a watery avalanche while the crew was filming the geese, and they had to quickly grab a few shots and run to safety. Less dramatic: cinematographer Tom Beldam had his smartphone stolen by a long-tailed macaque mere minutes after he finished capturing the animal on film.
    If all goes well and Underdogs finds its target audience, we may even get a follow-up. "We are slightly plowing new territory but the science is as true as it's ever been and the stories are good. That aspect of the natural history is still there," said Linfield. "I think what we really hope for is that people who don't normally watch natural history will watch it. If people have as much fun watching it as we had making it, then the metrics should be good enough for another season."
    Verdict: Underdogs is positively addictive; I binged all five episodes in a single day. (For his part, Reynolds said in a statement that he was thrilled to "finally watch a project of ours with my children. Technically they saw Deadpool and Wolverine but I don't think they absorbed much while covering their eyes and ears and screaming for two hours.")
    Underdogs premieres June 15, 2025, at 9 PM/8 PM Central on National Geographic (simulcast on ABC) and will be available for streaming on Disney+ and Hulu the following day. You should watch it, if only to get that second season.

    Jennifer Ouellette
    Senior Writer

    Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

  • 8 Best Sateen Sheets for a Polished Bedscape, Tested by AD (2025)

    All products featured on Architectural Digest are independently selected by our editors. However, we may receive compensation from retailers and/or from purchases of products through these links.

    A close cousin to percale and silk, the best sateen sheets offer a happy medium of refinement and softness in one durable, easy-to-clean fabric. Sateen is known for its polished appearance thanks to its lustrous sheen and wrinkle-resistant material. This comes from a tight satin weave that leaves a shiny look without compromising a smooth hand feel. While you can find this bedding in elevated spaces like this vibrant West Village town house thanks to embroidered touches and traditional prints, these sheets are surprisingly versatile and come in many forms. Here, our editors dive into their favorites for their bedrooms. And since many come in higher thread counts, they are durable enough for any room in the house, as seen in this family-friendly getaway.

    Best Overall Sateen Sheets (1/8): Boll & Branch Signature Hemmed Sheet Set
    Boll & Branch caught commerce director Rachel Fletcher's attention when she was browsing for new sheets for a few reasons. One: The brand makes organic and fair trade sheets. Two: She loves a sateen weave, and the retailer mentioned that this set was its bestseller, so she wanted to see what the hype was about. "Boll & Branch claims that these cotton sateen sheets are buttery soft, and I definitely agree," Fletcher says. "That extra-soft feel paired with the lovely, cooling properties make them feel like the luxury sheets that they are." Along with an earthy color palette (Fletcher has her set in mineral) and thoughtful hem detailing, this set stood out as our top pick. These do have a higher price point, but as some of the plushest sheets she's slept on, Fletcher thinks they're worth it.
    Specs: Material: 100% organic cotton | Thread count: N/A | Sizes: Twin, Twin XL, Full, Queen, King, King With Std. Cases, California King, Split King | Colors: 18 colors; 8 prints
    Upsides: Sustainable material; breathable; organic colorways. Downsides: Expensive.

    For a Romantic Drape (2/8): Ettitude CleanBamboo Sheet Set (Photo: Yelena Moroz Alpert)
    "These sheets are buttery—pun intended," says senior commerce editor Nashia Baker, who has the set in the butter yellow hue and loves the fabric's delicate yet durable feel. Contributor Yelena Moroz Alpert also has this set and says that the cooling lyocell fabric set takes the bamboo sheets category up a notch. "Somehow they feel substantial but incredibly light and smooth," she says of this splurge-worthy set. "The site says that the silky-soft sateen weave is comparable to 1,000 thread count cotton—and I believe it. I've never touched a baby alpaca, but I imagine that it's as soft as these sheets."
    Specs: Material: 100% CleanBamboo lyocell | Thread count: 1,000 | Sizes: Twin, Twin XL, Full, Queen, King, California King | Colors: 8
    Upsides: Pearly appearance; lightweight; ultra soft. Downsides: Pricey.

    The Affordable Pick (3/8): Good Sleep Bedding Egyptian Cotton Sateen Sheets
    Don't overlook the best Amazon sheets for high-end sateen bedding. Contributor Erika Owen says these are a great option: "After a single night, they became my favorite set, and a few more nights and a wash only locked in this opinion." She says they're sumptuous, cool, and durable—and their quality (think texture, weight, and comfort) hasn't changed after many rounds through the washer and dryer. "I would buy these as a gift for my best friend, if that tells you anything about how much I recommend these," says Owen. "There's nothing better than feeling really good as you hit the hay—who doesn't want a luxury bed situation—and I felt that way every time I dug into these silky sheets. Let it also be known that I'm no stranger to night sweats and these kept me cool every single night." The finishing touches are the deep pockets and sturdy elastic on the fitted sheet to fit a grand mattress.
    Specs: Material: 100% Egyptian cotton | Thread count: 1,000 | Sizes: Twin, Twin XL, Full, Queen, King, California King, Split King | Colors: 13
    Upsides: Higher thread count; cooling; sturdy after several washes. Downsides: Some shoppers found the fabric weighty.

    A Vibrant Print (4/8): Rifle Paper Co. Peacock Sateen Bed Sheet Set
    These are some of the softest bed sheets out there; just take it from Alpert. Not only are they comfortable to sink into night after night thanks to the plush 300 thread count, but they also veer away from traditional patterns and solid colorways. "I was originally drawn to the peacock print because it is just so whimsical and livens up my guest bedroom," Alpert says. "But these are also buttery soft. Maybe too soft—my guests never want to leave." If it weren't for the true-to-Rifle print, she would mistake these for hotel sheets because of their supple feel.
    Specs: Material: 100% combed cotton sateen | Thread count: 300 | Sizes: Twin, Full, Queen, King | Colors: 3
    Upsides: Unique patterns; supple; airy material. Downsides: Not as ideal for minimalists.

    Classic Core Set (5/8): Brooklinen Luxe Sateen Core Sheet Set
    If you want sheets with unparalleled quality, durability, and softness that gets better with every wash, multiple AD staff members say you can't go wrong with these Brooklinen sheets. Fletcher shares that this sateen set is "super classic, smooth, and has a crisp feel." Sleepers with sensitive skin will also be happy to know that they're "not at all scratchy or harsh on my skin, like some of the less expensive options I've tried in the past," Fletcher adds.
    Specs: Material: 100% long-staple cotton | Thread count: 480 | Sizes: Twin, Twin XL, Full, Queen, King, California King | Colors: 22
    Upsides: Structured fabric like a pressed shirt; wrinkle-free design; affordable. Downsides: Limited-edition colors sell out fast.

    More AD-Approved Sateen Sheets (6/8): Hill House Home Fitted Sheet
    "For a $100 top sheet and $125 fitted sheet, I truly didn't know what to expect from a brand as new to the decor game as Hill House Home, but was delightfully surprised at the quality and attention to detail that was put into making these products," contributor Katarina Kovac says of these Hill House Home sheets. "I wanted something that was crisp yet elevated, and the colored trim in the Savile Sheets was my answer." Since she's had her fair share of sheets with a sandpaper-like texture, she paid close attention to how these felt after the first wash. To her delight, they "felt soft, velvety, and breathable against my skin, leaving me truly struggling to get out of bed in the morning."
    Specs: Material: 100% brushed cotton sateen | Thread count: N/A | Sizes: Twin, Full, Queen, King, California King | Colors: 6
    Upsides: Traditional prints; lush; smooth feel; thoughtful trim. Downsides: Flat sheet, fitted sheet, and pillowcases are sold separately.

    (7/8) Homebird Sateen Fitted Sheets (Set of 3)
    Fletcher loves an ethically made, slippery sateen weave, and it took just one night of sleep to be sold on this Homebird set. "They're very high quality and everything you want in a sateen sheet: incredibly soft to the touch and slightly silky, with a sturdiness to them that you can tell is the result of a high thread count," she says. "They fit my bed perfectly and also have the most useful feature that, in my opinion, every set of sheets ever made should have: a long-side and short-side label."
    Specs: Material: 100% GOTS-certified, long-staple organic cotton | Thread count: 300 | Sizes: Full, Queen, King | Colors: 7
    Upsides: Silky smooth; helpful labels to make the bed; deep pockets. Downsides: Only available in muted tones.
  • Stanford Doctors Invent Device That Appears to Be Able to Save Tons of Stroke Patients Before They Die

    Image by Andrew Brodhead

    Researchers have developed a novel device that literally spins away the clots that block blood flow to the brain and cause strokes. As Stanford explains in a blurb, the novel milli-spinner device may be able to save the lives of patients who experience "ischemic stroke" from brain stem clotting.

    Traditional clot removal, a process known as thrombectomy, generally uses a catheter that either vacuums up the blood blockage or uses a wire mesh to ensnare it — a procedure that's as rough and imprecise as it sounds. Conventional thrombectomy has a very low efficacy rate because of this imprecision, and the procedure can result in pieces of the clot breaking off and moving to more difficult-to-reach regions. Thrombectomy via milli-spinner also enters the brain with a catheter, but instead of using a normal vacuum device, it employs a spinning tube outfitted with fins and slits that can suck up the clot much more meticulously.

    Stanford neuroimaging expert Jeremy Heit, who also coauthored a new paper about the device in the journal Nature, explained in the school's press release that the efficacy of the milli-spinner is "unbelievable." "For most cases, we're more than doubling the efficacy of current technology, and for the toughest clots — which we're only removing about 11 percent of the time with current devices — we're getting the artery open on the first try 90 percent of the time," Heit said. "This is a sea-change technology that will drastically improve our ability to help people."

    Renee Zhao, the senior author of the Nature paper, who teaches mechanical engineering at Stanford and creates what she calls "millirobots," said that conventional thrombectomies just aren't cutting it. "With existing technology, there's no way to reduce the size of the clot," Zhao said. "They rely on deforming and rupturing the clot to remove it." "What's unique about the milli-spinner is that it applies compression and shear forces to shrink the entire clot," she continued, "dramatically reducing the volume without causing rupture." Indeed, as the team discovered, the device can cut and vacuum the clot down to as little as five percent of its original size.

    "It works so well, for a wide range of clot compositions and sizes," Zhao said. "Even for tough... clots, which are impossible to treat with current technologies, our milli-spinner can treat them using this simple yet powerful mechanics concept to densify the fibrin network and shrink the clot."

    Though its main experimental use case is brain clot removal, Zhao is excited about its other uses, too. "We're exploring other biomedical applications for the milli-spinner design, and even possibilities beyond medicine," the engineer said. "There are some very exciting opportunities ahead."

    More on brains: The Microplastics in Your Brain May Be Causing Mental Health Issues
  • OAQ Awards of Excellence winners announced

    Montreal City Hall – Beaupré Michaud and Associates, Architects in collaboration with MU Architecture, Montreal. Photo credit: Raphaël Thibodeau
    The Ordre des architectes du Québec (OAQ) has revealed the winners of its 2025 Awards of Excellence in Architecture.
    A total of eleven projects were recognized at a gala hosted by Jean-René Dufort at Espace St-Denis in Montreal.
    The Grand Prix d’excellence en architecture was awarded to the restoration of Montreal City Hall, a major project led by Beaupré Michaud et Associés, architects, and MU Architecture. This complex project successfully preserves the building’s historical qualities while transforming it into an exemplary place in terms of energy and ecology. Guided by plans from the 1920s, the architects revived this building by equipping it with contemporary, efficient, more open, and more accessible features for residents. In addition to the heritage restoration, the team also reconciled old and contemporary technologies, energy efficiency, universal accessibility, and the reappropriation of spaces that had become dilapidated.
    The People’s Choice Award was presented to the Coop Milieu de l’île, designed by Pivot: Coopérative d’architecture. Located in Outremont, this 91-unit intergenerational housing cooperative was born from the initiative of a group of committed citizens looking to address the housing crisis by creating affordable, off-market housing. The jury emphasized that this project, which also received an Award of Excellence and was designed by and for its residents, acts as a “breath of fresh air in Outremont.”
    Coop Milieu de l’île. Pivot: Architecture Cooperative, Montreal. Photo credit: Annie Fafard
    “The projects we evaluated this year were truly remarkable in their richness and diversity. The jury found in them everything that makes Quebec architecture so strong and unique: rigor, attention to detail, and respect for the context and built heritage. We saw emblematic projects, but also discreet gestures, almost invisible in the landscape. Some projects rehabilitated forgotten places, transformed historic buildings, or even imagined new spaces for collective living. All, in their own way, highlighted the powerful impact of built quality on our living environments,” said Gabrielle Nadeau, chair of the OAQ Awards of Excellence Jury.
    The jury for the 2025 Awards of Excellence in Architecture was chaired by Gabrielle Nadeau, principal design architect, COBE in Copenhagen. It also included architects Marianne Charbonneau of Agence Spatiale, Maxime-Alexis Frappier of ACDF, and Guillaume Martel-Trudel of Provencher-Roy. Élène Levasseur, director of research and education at Architecture sans frontières Québec, acted as the public representative.
    Through the Awards of Excellence in Architecture, presented annually, the Order aims to raise awareness among Quebecers of the multiple dimensions of architectural quality, in addition to promoting the role of architects in the design of inspiring, sustainable and thoughtful living environments.
    The full list of winners includes the following.

    Habitat Sélénite by _naturehumaine
    Habitat Sélénite – _naturehumaine, Eastman (Estrie). Photo: Raphaël Thibodeau

    École secondaire du Bosquet by ABCP | Menkès Shooner Dagenais LeTourneux | Bilodeau Baril Leeming Architectes
    École secondaire du Bosquet – ABCP | Menkès Shooner Dagenais LeTourneux | Bilodeau Baril Leeming Architectes, Drummondville (Centre-du-Québec). Photo: Stéphane Brügger

    Bibliothèque Gabrielle-Roy by Saucier + Perrotte Architectes et GLCRM Architectes
    Bibliothèque Gabrielle-Roy – Saucier + Perrotte Architectes et GLCRM Architectes, Québec (Capitale-Nationale). Photo: Olivier Blouin

    Maison A by Atelier Pierre Thibault
    Maison A – Atelier Pierre Thibault, Saint-Nicolas (Chaudière-Appalaches). Photo: Maxime Brouillet

    Nouvel Hôtel de Ville de La Pêche by BGLA Architecture et Design Urbain
    Nouvel Hôtel de Ville de La Pêche – BGLA Architecture et Design Urbain, La Pêche (Outaouais). Photo: Stéphane Brügger / Dominique Laroche

    École du Zénith by Pelletier de Fontenay + Leclerc
    École du Zénith – Pelletier de Fontenay + Leclerc, Shefford (Estrie). Photo: James Brittain / David Boyer

    Le Paquebot by _naturehumaine
    Le Paquebot – _naturehumaine, Montréal (Montréal). Photo: Ronan Mézière

    Coopérative funéraire la Seigneurie by ultralocal architectes
    Coopérative funéraire la Seigneurie – ultralocal architectes, Québec (Capitale-Nationale). Photo: Paul Dussault

    Site d’observation des bélugas Putep’t-awt by atelier5 + mainstudio
    Site d’observation des bélugas Putep’t-awt – atelier5 + mainstudio, Cacouna (Bas-Saint-Laurent). Photo: Stéphane Groleau

    The post OAQ Awards of Excellence winners announced appeared first on Canadian Architect.
  • How To Create & Animate Breakdance-Inspired Streetwear

    Introduction
    Hi, my name is Pankaj Kholiya, and I am a Senior 3D Character Artist. I've been working in the game industry for the past 8 years. I worked on titles like Call of Duty: Black Ops 6, That Christmas, Ghost of Tsushima Director's Cut, Star Wars: Outlaws, Alan Wake 2, Street Fighter 6, and many more. Currently, I'm working as a freelancer for the gaming and cinematics industry.
    Since my last interview, I made a few personal works, was part of a Netflix movie, That Christmas, and worked with Platige on cinematics for Star Wars: Outlaws and Call of Duty: Black Ops 6.
    The Breakdancing Clothing Project
    It all started when I witnessed a dance battle that a friend organized. It was like watching Step Up live. That's where I got the inspiration to create a break dancer. I started by gathering different references from the internet, found one particular image on Pinterest, and decided to recreate it in 3D.
    At first, the idea was to create the outfit in one pose, but along the way, I also decided to create a dancing version of the character and explore Unreal Engine. Here is the ref I used for the dancing version:
    Getting Started
    For the upcoming talents, I'll try to describe my process in a few points. Even before starting Marvelous Designer, I made sure to have my base character ready for animation and simulation. This time, I decided to use MetaHuman Creator for the base because of its high-quality textures and materials. My primary focus was on the clothing, so using MetaHuman saved a lot of time.
    After I was satisfied with how my MetaHuman looked, I took it to Mixamo to get some animations. I was really impressed by how well the animations worked on the MetaHuman. Once I had the animations, I brought them into Marvelous Designer and simulated the clothes.
    For the posed character, I adjusted the rig to match the pose in the reference and used the same method as in this tutorial to pose the character:
    Clothing
    For this particular project, I didn't focus on the topology, as it was just for a single render. I packed the UVs in Marvelous Designer, exported the quad mesh, subdivided it a few times, and started working on the detailing in ZBrush.
    For the textures, I used the low-division mesh from the ZBrush file, as I already had the UVs on it. I then baked the normal and other maps onto it and took it to Substance 3D Painter.
    Animation
    There are multiple ways to animate a MetaHuman character. For this project, I used Mixamo: I imported my character into Mixamo, selected the animation I liked, and exported it. After that, I imported it into Marvelous Designer and hit the simulation button. You can check my previous breakdown for the Mixamo pipeline.
    Once happy with the result, I exported the simulated cloth as an Alembic to Unreal Engine (a scripted sketch of this import step follows the interview). Tutorial for importing clothes into Unreal Engine:
    Lighting & Rendering
    The main target was to match the lighting closely to the reference. This was my first project in Unreal Engine, so I wanted to explore the lighting and see how far I could go with it. Being new to Unreal Engine, I went through a lot of tutorials. Here are the lights I've used for the posed version:
    For the dancing version, I created a stage like the ref from the Step Up movie. Some tips I found useful for rendering are in the video below:
    Conclusion
    At first, I had a clear direction for this project and was confident in my skills to tackle the art aspect of it. But things changed when I dived into Unreal Engine for my presentation. More than half the time on this project went into learning and getting used to Unreal Engine. I don't regret a single second I invested in Unreal, as it was a new experience. It took around 15 days to wrap this one up.
    The lesson I learned is that upgrading your knowledge and learning new things will help you grow as an artist in the long run. The way we make artwork has changed a lot since I started in 3D, and adapting to the changing art environment is a good thing. Here are some recommendations if you are interested in learning Unreal Engine.
    Pankaj Kholiya, Senior 3D Character Artist
    Interview conducted by Amber Rutherford
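    The Alembic hand-off from Marvelous Designer to Unreal Engine described in the Animation section can also be scripted. Below is a minimal sketch, assuming the Unreal editor's Python scripting plugin is enabled; the file path, content-browser path, and function name are placeholder assumptions for illustration, not details from Pankaj's actual pipeline.

    # Minimal sketch: importing a simulated-cloth Alembic cache into Unreal Engine
    # through the editor's Python API. All paths below are hypothetical placeholders.
    import unreal

    def import_cloth_alembic(abc_path="C:/sim/breakdance_cloth.abc",
                             destination="/Game/Breakdancer/Cloth"):
        # Bring the cache in as a Geometry Cache, a common choice for
        # Marvelous Designer cloth simulations.
        options = unreal.AbcImportSettings()
        options.import_type = unreal.AlembicImportType.GEOMETRY_CACHE

        task = unreal.AssetImportTask()
        task.filename = abc_path
        task.destination_path = destination
        task.options = options
        task.automated = True   # skip the interactive import dialog
        task.save = True        # save the imported asset right away

        unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
        return list(task.imported_object_paths)

    # Run from the Unreal editor's Python console or as a startup script:
    # print(import_cloth_alembic())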