• Hexagon Taps NVIDIA Robotics and AI Software to Build and Deploy AEON, a New Humanoid

    As a global labor shortage leaves 50 million positions unfilled across industries like manufacturing and logistics, Hexagon — a global leader in measurement technologies — is developing humanoid robots that can lend a helping hand.
    Industrial sectors depend on skilled workers to perform a variety of error-prone tasks, including operating high-precision scanners for reality capture — the process of capturing digital data to replicate the real world in simulation.
    At the Hexagon LIVE Global conference, Hexagon’s robotics division today unveiled AEON — a new humanoid robot built in collaboration with NVIDIA that’s engineered to perform a wide range of industrial applications, from manipulation and asset inspection to reality capture and operator support. Hexagon plans to deploy AEON across automotive, transportation, aerospace, manufacturing, warehousing and logistics.
    Future use cases for AEON include:

    Reality capture, which involves automatic planning and then scanning of assets, industrial spaces and environments to generate 3D models. The captured data is then used for advanced visualization and collaboration in the Hexagon Digital Reality (HxDR) platform powering Hexagon Reality Cloud Studio (RCS).
    Manipulation tasks, such as sorting and moving parts in various industrial and manufacturing settings.
    Part inspection, which includes checking parts for defects or ensuring adherence to specifications.
    Industrial operations, including highly dexterous technical tasks like machinery operations, teleoperation and scanning parts using high-end scanners.

    “The age of general-purpose robotics has arrived, due to technological advances in simulation and physical AI,” said Deepu Talla, vice president of robotics and edge AI at NVIDIA. “Hexagon’s new AEON humanoid embodies the integration of NVIDIA’s three-computer robotics platform and is making a significant leap forward in addressing industry-critical challenges.”

    Using NVIDIA’s Three Computers to Develop AEON 
    To build AEON, Hexagon used NVIDIA’s three computers for developing and deploying physical AI systems. They include AI supercomputers to train and fine-tune powerful foundation models; the NVIDIA Omniverse platform, running on NVIDIA OVX servers, for testing and optimizing these models in simulation environments using real and physically based synthetic data; and NVIDIA IGX Thor robotic computers to run the models.
    Hexagon is exploring using NVIDIA accelerated computing to post-train the NVIDIA Isaac GR00T N1.5 open foundation model to improve robot reasoning and policies, and tapping Isaac GR00T-Mimic to generate vast amounts of synthetic motion data from a few human demonstrations.
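    To illustrate the idea behind synthetic motion generation, here is a minimal Python sketch that expands a few recorded demonstrations into a much larger training set by adding smooth perturbations to each copy. It is a conceptual example only, not the Isaac GR00T-Mimic API; the function name, array shapes and noise model are assumptions.

        import numpy as np

        def augment_demonstrations(demos, copies_per_demo=100, noise_scale=0.01, seed=0):
            """Expand a few recorded joint-trajectory demos into a larger synthetic set
            by adding smooth, low-frequency perturbations to each copy.

            demos: list of arrays shaped (timesteps, num_joints) of joint positions.
            """
            rng = np.random.default_rng(seed)
            synthetic = []
            for demo in demos:
                steps, joints = demo.shape
                t = np.linspace(0.0, 1.0, steps)
                key_t = np.linspace(0.0, 1.0, 8)
                for _ in range(copies_per_demo):
                    # Perturb a handful of keyframes, then interpolate so the
                    # variation stays smooth over time rather than jittery.
                    key_noise = rng.normal(0.0, noise_scale, size=(8, joints))
                    noise = np.stack(
                        [np.interp(t, key_t, key_noise[:, j]) for j in range(joints)],
                        axis=1,
                    )
                    synthetic.append(demo + noise)
            return synthetic

        # Example: two 200-step demos of a 7-joint arm become 200 synthetic trajectories.
        demos = [np.zeros((200, 7)), np.full((200, 7), 0.1)]
        print(len(augment_demonstrations(demos)))  # 200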
    AEON learns many of its skills through simulations powered by the NVIDIA Isaac platform. Hexagon uses NVIDIA Isaac Sim, a reference robotic simulation application built on Omniverse, to simulate complex robot actions like navigation, locomotion and manipulation. These skills are then refined using reinforcement learning in NVIDIA Isaac Lab, an open-source framework for robot learning.
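    The refinement step works roughly like this: a policy is rolled out in simulation, rewarded for stable behavior, and nudged toward higher-reward actions. The sketch below shows that loop with a plain REINFORCE update on a standard Gymnasium environment as a stand-in; it is not Isaac Lab code, and the environment, linear policy and hyperparameters are illustrative assumptions.

        import gymnasium as gym
        import numpy as np

        # Stand-in task; in Isaac Lab the equivalent would be a massively parallel
        # simulated humanoid locomotion environment.
        env = gym.make("CartPole-v1")
        obs_dim = env.observation_space.shape[0]
        n_actions = env.action_space.n

        rng = np.random.default_rng(0)
        W = np.zeros((obs_dim, n_actions))   # linear softmax policy parameters
        lr, gamma = 0.01, 0.99

        def policy(obs):
            logits = obs @ W
            p = np.exp(logits - logits.max())
            return p / p.sum()

        for episode in range(300):
            obs, _ = env.reset(seed=episode)
            trajectory, done = [], False
            while not done:
                action = int(rng.choice(n_actions, p=policy(obs)))
                next_obs, reward, terminated, truncated, _ = env.step(action)
                trajectory.append((obs, action, reward))
                obs, done = next_obs, terminated or truncated

            # REINFORCE update: increase the log-probability of actions in
            # proportion to the discounted return that followed them.
            G = 0.0
            for obs_t, a_t, r_t in reversed(trajectory):
                G = r_t + gamma * G
                grad = -np.outer(obs_t, policy(obs_t))
                grad[:, a_t] += obs_t
                W += lr * G * grad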


    This simulation-first approach enabled Hexagon to fast-track its robotic development, allowing AEON to master core locomotion skills in just 2-3 weeks — rather than 5-6 months — before real-world deployment.
    In addition, AEON taps into NVIDIA Jetson Orin onboard computers to autonomously move, navigate and perform its tasks in real time, enhancing its speed and accuracy while operating in complex and dynamic environments. Hexagon is also planning to upgrade AEON with NVIDIA IGX Thor to enable functional safety for collaborative operation.
    “Our goal with AEON was to design an intelligent, autonomous humanoid that addresses the real-world challenges industrial leaders have shared with us over the past months,” said Arnaud Robert, president of Hexagon’s robotics division. “By leveraging NVIDIA’s full-stack robotics and simulation platforms, we were able to deliver a best-in-class humanoid that combines advanced mechatronics, multimodal sensor fusion and real-time AI.”
    Data Comes to Life Through Reality Capture and Omniverse Integration 
    AEON will be piloted in factories and warehouses to scan everything from small precision parts and automotive components to large assembly lines and storage areas.

    Captured data comes to life in RCS, a platform that allows users to collaborate, visualize and share reality-capture data by tapping into HxDR and NVIDIA Omniverse running in the cloud. This removes the constraint of local infrastructure.
    “Digital twins offer clear advantages, but adoption has been challenging in several industries,” said Lucas Heinzle, vice president of research and development at Hexagon’s robotics division. “AEON’s sophisticated sensor suite enables the integration of reality data capture with NVIDIA Omniverse, streamlining workflows for our customers and moving us closer to making digital twins a mainstream tool for collaboration and innovation.”
    AEON’s Next Steps
    By adopting the OpenUSD framework and developing on Omniverse, Hexagon can generate high-fidelity digital twins from scanned data — establishing a data flywheel to continuously train AEON.
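    As a rough illustration of the OpenUSD side of that data flywheel, the sketch below uses the open-source pxr Python bindings to assemble reconstructed scan meshes into a single USD stage. The file name, prim paths and mesh input format are hypothetical, and this is not Hexagon's actual pipeline.

        from pxr import Usd, UsdGeom, Gf

        def build_twin_stage(output_path, scanned_meshes):
            """Assemble reconstructed scan meshes into one OpenUSD stage.

            scanned_meshes maps a prim name to (points, face_counts, face_indices),
            i.e. the triangle soup a reconstruction step might produce.
            """
            stage = Usd.Stage.CreateNew(output_path)
            UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)
            world = UsdGeom.Xform.Define(stage, "/World")
            stage.SetDefaultPrim(world.GetPrim())

            for name, (points, counts, indices) in scanned_meshes.items():
                mesh = UsdGeom.Mesh.Define(stage, f"/World/{name}")
                mesh.CreatePointsAttr([Gf.Vec3f(*p) for p in points])
                mesh.CreateFaceVertexCountsAttr(counts)
                mesh.CreateFaceVertexIndicesAttr(indices)

            stage.GetRootLayer().Save()
            return stage

        # Example: a single triangle standing in for one reconstructed part.
        build_twin_stage(
            "factory_twin.usda",
            {"scanned_part": ([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [3], [0, 1, 2])},
        )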
    This latest work with Hexagon is helping shape the future of physical AI — delivering scalable, efficient solutions to address the challenges faced by industries that depend on capturing real-world data.
    Watch the Hexagon LIVE keynote, explore presentations and read more about AEON.
    All imagery courtesy of Hexagon.
  • The Unwritten Rules of Death Stranding 2 Explained

    Death Stranding 2: On the Beach is finally here, and making some massive waves at that. Much of this is due to how much it has committed to improving on the formula of the original, although this has involved streamlining a lot of its systems and all but giving the people what they want. Nevertheless, Death Stranding 2 is already proving to be a fulfilling continuation of the first game's legacy.
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon.

    “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black).

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”
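    A minimal bpy sketch of the kind of cleanup-and-export pass Finlayson describes is shown below: merge stray vertices left over from the CAD handoff, decimate for real-time use, and write an FBX that Unreal Engine can ingest. The thresholds, file path and selection-based workflow are assumptions, not the production scripts.

        # Runs inside Blender's bundled Python (bpy is only available there).
        import bpy

        def clean_and_export(export_path="cleaned_set.fbx", decimate_ratio=0.5):
            """Optimize every selected mesh, then export an FBX for Unreal Engine."""
            for obj in bpy.context.selected_objects:
                if obj.type != 'MESH':
                    continue
                bpy.context.view_layer.objects.active = obj

                # Merge coincident vertices left over from the CAD/Rhino handoff.
                bpy.ops.object.mode_set(mode='EDIT')
                bpy.ops.mesh.select_all(action='SELECT')
                bpy.ops.mesh.remove_doubles(threshold=0.0001)
                bpy.ops.object.mode_set(mode='OBJECT')

                # Reduce polygon count for real-time playback.
                mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
                mod.ratio = decimate_ratio
                bpy.ops.object.modifier_apply(modifier=mod.name)

            bpy.ops.export_scene.fbx(filepath=export_path, use_selection=True)

        clean_and_export()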

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”
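    For a sense of what that remesh-heavy step can look like, here is a small bpy helper that applies Blender's Remesh modifier in Blocks mode so geometry snaps to a cube grid. The settings are illustrative and this is not the actual production setup.

        # Runs inside Blender; a helper in the spirit of the remesh-heavy workflow
        # described above (settings are illustrative).
        import bpy

        def blockify(obj, octree_depth=6):
            """Add a Remesh modifier in Blocks mode so the surface snaps to a cube grid."""
            mod = obj.modifiers.new(name="Blockify", type='REMESH')
            mod.mode = 'BLOCKS'              # cube-shaped output faces
            mod.octree_depth = octree_depth  # higher depth = smaller cubes
            mod.use_remove_disconnected = False
            return mod

        for obj in bpy.context.selected_objects:
            if obj.type == 'MESH':
                blockify(obj)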

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”
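    As a hypothetical illustration of that camera-data handoff, the sketch below serializes per-frame camera transforms to JSON for downstream visual effects work. The schema, field names and frame rate are assumptions; Simulcam's real output format is not documented here.

        import json

        def export_camera_track(samples, path="camera_track.json", fps=24):
            """Write per-frame camera transforms to JSON for a downstream VFX handoff.

            samples: dicts with 'frame', 'position' (x, y, z) and 'rotation'
            (pitch, yaw, roll in degrees). The schema is hypothetical.
            """
            track = {
                "fps": fps,
                "frames": [
                    {
                        "frame": s["frame"],
                        "position": list(s["position"]),
                        "rotation": list(s["rotation"]),
                    }
                    for s in samples
                ],
            }
            with open(path, "w") as f:
                json.dump(track, f, indent=2)

        # Example: a short three-frame dolly move.
        export_camera_track([
            {"frame": i, "position": (0.0, -0.1 * i, 1.7), "rotation": (0.0, 0.0, 0.0)}
            for i in range(3)
        ])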

    Piglots cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
    #how #disguise #built #out #virtual
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG Images courtesy of Warner Bros. Pictures. Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon. “s the Senior Unreal Artist within the Virtual Art Departmenton Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” —Talia Finlayson, Creative Technologist, Disguise Interior and exterior environments had to be created, such as the shop owned by Steve. “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Departmenton Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.” A virtual exploration of Steve’s shop in Midport Village. Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. 
“Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.” “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” —Laura Bell, Creative Technologist, Disguise Among the buildings that had to be created for Midport Village was Steve’sLava Chicken Shack. Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.” Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younisadapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. 
On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!” A virtual study and final still of the cast members standing outside of the Lava Chicken Shack. “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.” —Talia Finlayson, Creative Technologist, Disguise The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.” Virtually conceptualizing the layout of Midport Village. Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay Georgeand I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.” An example of the virtual and final version of the Woodland Mansion. “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.” —Laura Bell, Creative Technologist, Disguise Extensive detail was given to the center of the sets where the main action unfolds. 
“For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.” Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment. Doing a virtual scale study of the Mountainside. Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.” Piglots cause mayhem during the Wingsuit Chase. Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods. “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols, Pat Younis, Jake Tuckand Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.” #how #disguise #built #out #virtual
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG Images courtesy of Warner Bros. Pictures. Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon. “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” —Talia Finlayson, Creative Technologist, Disguise Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black). “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.” A virtual exploration of Steve’s shop in Midport Village. Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. 
“Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.” “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” —Laura Bell, Creative Technologist, Disguise Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack. Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.” Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. 
On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”
A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.
“We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.” —Talia Finlayson, Creative Technologist, Disguise
The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks.
“I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”
Virtually conceptualizing the layout of Midport Village.
Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.”
Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”
An example of the virtual and final version of the Woodland Mansion.
“Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.” —Laura Bell, Creative Technologist, Disguise
Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”
Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.
Doing a virtual scale study of the Mountainside.
Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real time, giving the director, DP and other department heads a fully realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”
Piglots cause mayhem during the Wingsuit Chase.
Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.
“One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.”
There was another challenge that had more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
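Bell’s description of the Simulcam handoff — recorded camera movement paths passed to post-production so the visual effects team gets the exact tracks it needs — can be pictured with a small, purely illustrative sketch: per-frame camera transforms written out to a CSV file that a VFX package could ingest. The TrackedFrame fields, units and file layout below are assumptions made for illustration, not Disguise’s actual data format or pipeline.

    # Purely illustrative: dump per-frame tracked-camera data to CSV for a VFX handoff.
    # The field names, units and file layout are assumptions, not Disguise's pipeline.
    import csv
    from dataclasses import dataclass, fields

    @dataclass
    class TrackedFrame:
        frame: int
        pos_x: float           # camera position, centimetres (assumed units)
        pos_y: float
        pos_z: float
        rot_pitch: float       # camera rotation, degrees (assumed convention)
        rot_yaw: float
        rot_roll: float
        focal_length_mm: float

    def export_camera_track(frames: list[TrackedFrame], path: str) -> None:
        """Write the recorded camera path so post-production can rebuild the move."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([field.name for field in fields(TrackedFrame)])
            for fr in frames:
                writer.writerow([getattr(fr, field.name) for field in fields(TrackedFrame)])

    if __name__ == "__main__":
        # A fake two-second dolly move at 24 fps, just to show the output format.
        demo = [TrackedFrame(i, 0.0, 120.0, 160.0 + 0.5 * i, 0.0, 90.0, 0.0, 35.0) for i in range(48)]
        export_camera_track(demo, "shot_042_camera_track.csv")

In practice this kind of data often travels as FBX or USD with lens metadata attached rather than CSV, but the principle is the same: every recorded take carries its own machine-readable camera move, which is what streamlines the match-move work downstream.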
  • So, there’s this thing about how Discord was ported to Windows 95 and NT 3.1. Honestly, it’s kind of interesting, but also a bit dull. Like, who even thinks about running Discord on those old systems? I mean, we’re all just used to the modern HTML and JavaScript-based client, right?

    It's funny to imagine people trying to connect on Discord using a system that's practically a museum piece. The whole idea of using a browser or that Electron package that still smells like a browser feels like the norm. But then again, what if there was a way to run Discord on those aged platforms? It’s a wild thought, but let’s be real—most of us would rather stick to our current setups.

    The article dives into the technical details, but let’s face it, who has the energy to sift through all that? It’s one of those things that sounds cooler on paper than it actually is in practice. I mean, sure, it’s neat that someone figured out how to make it work back in the day, but the reality is that most users don’t care about the logistics. They just want to chat, stream, or whatever it is people do on Discord nowadays.

    And it’s not like anyone is lining up to use Discord on Windows 95 or NT 3.1. I can’t even imagine the lag. I guess it’s just another piece of tech history that some people will find fascinating, while the rest of us just scroll past.

    So, yeah, that’s pretty much it. Discord on ancient systems is a thing. It happened. People did it. But let’s not pretend that it’s something we’re all eager to dive into. Honestly, I’d rather just scroll through memes or something.

    #Discord #Windows95 #TechHistory #OldSchool #Boredom
  • Switch 2 gamers can now get top protection to end the dreaded console drop-and-break

    Accessories firm PowerA have released a series of peripherals and items designed to look after your beloved new Switch 2 console and avoid a broken, smashed machine while taking it on the go.
    Tech | 14:16, 15 Jun 2025
    The PowerA Slim case for Switch 2
    Gamers who have just snapped up their fancy new Switch 2 console need some protection for their latest purchase, because this fine piece of tech can easily be dropped onto a hard floor while gaming.
    Thankfully, a host of peripherals and accessories are already hitting stores for the Nintendo machine just days after its summertime launch.
    And it means you’ve now got options to protect your pricey device from a nasty fall or screen smash early in its gaming life.
    The bods at PowerA have dropped a series of items worth considering for your Switch 2. Our go-to here is the new Slim Case, which is a bargain at just £14.99.
    Officially licensed by Nintendo, it has a moulded interior with soft fabric lining that perfectly cups your console, keeping it tightly nested from movement when zipped in.
    The case has a clean, rugged design
    It looks the part too, with a grey tough fabric feel and that all-important Switch 2 logo on the front, bottom right, so you can show off to your pals.
    Inside you can even tuck in 10 game cards for your favourite titles thanks to a dedicated rack area.
    The case also has an integrated play stand for on-the-go gamers who want to pop out the magnetic Joy-Cons and have the display stand up in the case at a nice viewable angle, where it remains protected while you game outdoors with pals.
    The play stand doubles as a padded screen protector when the system is inside the case, which is ideal.
    We’ve tried this out and it feels well made and well padded to protect your console.
    You can also get a screen protector from the firm to cover your precious 7.9-inch 1080p LCD screen from a break during a fall.
    There are two in a pack for £9 and, just like mobile phone screen protectors, they’ll give you an extra layer of cover while not affecting the touch screen mechanisms.
    The pack includes a microfibre cleaning cloth, placement guides, dust removal stickers and an applicator.
    The Mario Time Advantage controller for Switch 2
    Finally, if you want to avoid the Joy-Cons altogether, there are new controllers for the Switch 2 to consider.
    The best looking one is arguably the Advantage wired controller dubbed ‘Mario Time’, which costs £29 and boasts hall-effect magnetic sensor thumb sticks for fluid gameplay, onboard audio controls for your gaming headsets and a cool Super Mario themed look.
    #switch #gamers #can #now #get
  • F5: Leta Sobierajski Talks Giant Pandas, Sculptural Clothing + More

    When Leta Sobierajski enrolled in college, she already knew what she was meant to do, and she didn’t settle for anything less. “When I went to school for graphic design, I really didn’t have a backup plan – it was this, or nothing,” she says. “My work is a constantly evolving practice, and from the beginning, I have always convinced myself that if I put in the time and experimentation, I would grow and evolve.”
    After graduation, Sobierajski took on a range of projects, which included animation, print, and branding elements. She collaborated with corporate clients, but realized that she wouldn’t feel comfortable following anyone else’s rules in a 9-to-5 environment.
    Leta Sobierajski (standing) and Wade Jeffree (on ladder) \\\ Photo: Matt Dutile
    Sobierajski eventually decided to team up with fellow artist and kindred spirit Wade Jeffree. In 2016 they launched their Brooklyn-based studio, Wade and Leta. The duo, who share a taste for quirky aesthetics, produces sculpture, installations, or anything else they can dream up. Never static in thinking or method, they are constantly searching for another medium to try that will complement their shared vision of the moment.
    The pair is currently interested in permanency, and they want to utilize more metal, a strong material that will stand the test of time. Small architectural pieces are also on tap, and on a grander scale, they’d like to focus on a park or communal area that everyone can enjoy.
    With so many ideas swirling around, Sobierajski will record a concept in at least three different ways so that she’s sure to unearth it at a later date. “In some ways, I like to think I’m impeccably organized, as I have countless spreadsheets tracking our work, our lives, and our well-being,” she explains. “The reality is that I am great at over-complicating situations with my intensified list-making and note-taking. The only thing to do is to trust the process.”
    Today, Leta Sobierajski joins us for Friday Five!
    Photo: Melitta Baumeister and Michał Plata
    1. Melitta Baumeister and Michał Plata
    The work of Melitta Baumeister and Michał Plata has been a constant inspiration to me for their innovative, artful, and architectural silhouettes. By a practice of draping and arduous pattern-making, the garments that they develop season after season feel like they could be designed for existence in another universe. I’m a person who likes to dress up for anything when I’m not in the studio, and every time I opt to wear one of their looks, I feel like I can take on the world. The best part about their pieces is that they’re extremely functional, so whether I need to hop on a bicycle or show up at an opening, I’m still able to make a statement – these garments even have the ability to strike up conversations on their own.
    Photo: Wade and Leta
    2. Pandas!
    I was recently in Chengdu to launch a new project and we took half the day to visit the Chengdu Research Base of Giant Pandas and I am a new panda convert. Yes, they’re docile and cute, but their lifestyles are utterly chill and deeply enviable for us adults with responsibilities. Giant pandas primarily eat bamboo and can consume 20-40 kilograms per day. When they’re not doing that, they’re sleeping. When we visited, many could be seen reclining on their backs, feasting on some of the finest bamboo they could select within arm’s reach. While not necessarily playful in appearance, they do seem quite cheeky in their agendas and will do as little as they can to make the most of their meals. It felt like I was watching a mirrored image of myself on a Sunday afternoon while trying to make the most of my last hours of the weekend.
    Photo: Courtesy of Aoiro
    3. Aoiro
    I’m not really a candle person (I forget to light it, and then I forget it’s lit, and then I panic when it’s been lit for too long) but I love the luxurious subtlety of a fragrant space. It’s an intangible feeling that really can only be experienced in the present. Some of the best people to create these fragrances, in my opinion, are Shizuko and Manuel, the masterminds behind Aoiro, a Japanese and Austrian duo who have developed a keen sense for embodying the fragrances of some of the most intriguing and captivating olfactory atmospheres – earthy forest floors with crackling pine needles, blue cypress tickling the moon in an indigo sky, and rainfall on a spirited Japanese island. Despite living in an urban city, Aoiro’s olfactory design is capable of transporting me to the deepest forests of misty Yakushima island.
    Photo: Wade and Leta
    4. Takuro Kuwata
    A few months ago, I saw the work of Japanese ceramicist Takuro Kuwata at an exhibition at Salon94 and have been having trouble getting it out of my head. Kuwata’s work exemplifies someone who has worked with a medium so much to completely use the medium as a medium – if that makes sense. His ability to manipulate clay and glaze and use it to create gravity-defying effects within the kiln are exceptionally mysterious to me and feel like they could only be accomplished with years and years of experimentation with the material. I’m equally impressed seeing how he’s grown his work with scale, juxtaposing it with familiar iconography like the fuzzy peach, but sculpting it from materials like bronze.
    Photo: Wade and Leta
    5. The Site of Reversible Destiny, a park built by artists Arakawa and Gins, in Yoro, Japan
    The park is a testament to their career as writers, architects, and their idea of reversible destiny, which in its most extreme form, eliminates death. For all that are willing to listen, Arakawa and Gins’ Reversible Destiny mentality aims to make our lives a little more youthful by encouraging us to reevaluate our relationship with architecture and our surroundings. The intention of “reversible destiny” is not to prolong death, postpone it, grow older alongside it, but to entirely not acknowledge and surpass it. Wade (my partner) and I have spent the last ten years traveling to as many of their remaining sites as possible to further understand this notion of creating spaces to extend our lives and question how conventional living spaces can become detrimental to our longevity.
     
    Works by Wade and Leta:
    Photo: Wade and Leta and Matt Alexander
    Now You See Me is a large-scale installation in the heart of Shoreditch, London, that explores the relationship between positive and negative space through bold color, geometry, and light. Simple, familiar shapes are embedded within monolithic forms, creating a layered visual experience that shifts throughout the day. As sunlight passes through the structures, shadows and silhouettes stretch and connect, forming dynamic compositions on the surrounding concrete.
    Photo: Wade and Leta and John Wylie
    Paint Your Own Path is a series of five towering sculptures, ranging from 10 to 15 feet tall, that invites viewers to explore balance, tension, and perspective through bold color and form. Inspired by the delicate, often precarious act of stacking objects, the sculptures appear as if they might topple – yet each one holds steady, challenging perceptions of stability. Created in partnership with the Corolla Cross, the installation transforms its environment into a pop-colored landscape.
    Photo: Millenia Walk and Outer Edit, Eurthe Studio
    Monument to Movement is a 14-meter-tall kinetic sculpture that celebrates the spirit of the holiday season through rhythm, motion, and color. Rising skyward in layered compositions, the work symbolizes collective joy, renewal, and the shared energy of celebrations that span cultures and traditions. Powered by motors and constructed from metal beams and cardboard forms, the sculpture continuously shifts, inviting viewers to reflect on the passage of time and the cycles that connect us all.
    Photo: Wade and Leta and Erika Hara, Piotr Maslanka, and Jeremy Renault
    Falling Into Place is a vibrant rooftop installation at Ginza Six that explores themes of alignment, adaptability, and perspective. Six colorful structures – each with a void like a missing puzzle piece – serve as spaces for reflection, inviting visitors to consider their place within a greater whole. Rather than focusing on absence, the design transforms emptiness into opportunity, encouraging people to embrace spontaneity and the unfolding nature of life. Playful yet contemplative, the work emphasizes that only through connection and participation can the full picture come into view.
    Photo: Wade and Leta and Erika Hara, Piotr Maslanka, and Jeremy Renault
    Photo: Wade and Leta
    Stop, Listen, Look is a 7-meter-tall interactive artwork atop IFS Chengdu that captures the vibrant rhythm of the city through movement, sound, and form. Blending motorized and wind-powered elements with seesaws and sound modulation, it invites people of all ages to engage, play, and reflect. Inspired by Chengdu’s balance of tradition and modernity, the piece incorporates circular motifs from local symbolism alongside bold, geometric forms to create a dialogue between past and present. With light, motion, and community at its core, the work invites visitors to connect with the city – and each other – through shared interaction.

The Cloud is a permanent sculptural kiosk in Burlington, Vermont’s historic City Hall Park, created in collaboration with Brooklyn-based Studio RENZ+OEI. Designed to reinterpret the ephemeral nature of clouds through architecture, it blends art, air, and imagination into a light, fluid structure that defies traditional rigidity. Originally born from a creative exchange between longtime friends and collaborators, the design challenges expectations of permanence by embodying movement and openness. Now home to a local food vendor, The Cloud brings a playful, uplifting presence to the park, inviting reflection and interaction rain or shine.
    #leta #sobierajski #talks #giant #pandas
  • From Networks to Business Models, AI Is Rewiring Telecom

    Artificial intelligence is already rewriting the rules of wireless and telecom — powering predictive maintenance, streamlining network operations, and enabling more innovative services.
    As AI scales, the disruption will be faster, deeper, and harder to reverse than any prior shift in the industry.
    Compared to the sweeping changes AI is set to unleash, past telecom innovations look incremental.
    AI is redefining how networks operate, services are delivered, and data is secured — across every device and digital touchpoint.
    AI Is Reshaping Wireless Networks Already
    Artificial intelligence is already transforming wireless through smarter private networks, fixed wireless access, and intelligent automation across the stack.
    AI detects and resolves network issues before they impact service, improving uptime and customer satisfaction. It’s also opening the door to entirely new revenue streams and business models.
    Each wireless generation brought new capabilities. AI, however, marks a more profound shift — networks that think, respond, and evolve in real time.
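    As a concrete picture of the predictive-maintenance claim above, here is a minimal sketch of the kind of logic involved: a rolling z-score detector that flags a KPI sample deviating sharply from its recent history. The metric (packet loss), window size and threshold are illustrative assumptions, not any operator’s production system.

        # Minimal sketch: flag a degrading network KPI before it becomes an outage.
        # Metric, window size and threshold are illustrative assumptions only.
        from collections import deque
        import statistics

        class KpiAnomalyDetector:
            """Rolling z-score detector for one KPI stream (e.g. cell-site packet loss %)."""

            def __init__(self, window: int = 60, z_threshold: float = 3.0):
                self.samples = deque(maxlen=window)   # recent history of the KPI
                self.z_threshold = z_threshold

            def observe(self, value: float) -> bool:
                """Return True if the new sample deviates sharply from recent history."""
                is_anomaly = False
                if len(self.samples) >= 10:           # wait for enough history first
                    mean = statistics.fmean(self.samples)
                    stdev = statistics.pstdev(self.samples) or 1e-9
                    is_anomaly = abs(value - mean) / stdev > self.z_threshold
                self.samples.append(value)
                return is_anomaly

        if __name__ == "__main__":
            detector = KpiAnomalyDetector()
            # Steady traffic, then a sudden packet-loss spike the detector should flag.
            stream = [0.5 + 0.05 * (minute % 7) for minute in range(100)] + [4.2]
            for minute, loss_pct in enumerate(stream):
                if detector.observe(loss_pct):
                    print(f"minute {minute}: packet loss {loss_pct:.1f}% flagged for proactive maintenance")

    Real deployments feed thousands of KPIs per site into far richer models, but the loop is the same: observe, score against recent behaviour, and flag for proactive action before subscribers notice.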
    AI Acceleration Will Outpace Past Tech Shifts
    Many may underestimate the speed and magnitude of AI-driven change.
    The shift from traditional voice and data systems to AI-driven network intelligence is already underway.
    Although predictions abound, the true scope remains unclear.
    It’s tempting to assume we understand AI’s trajectory, but history suggests otherwise.

    Today, AI is already automating maintenance and optimizing performance without user disruption. The technologies we’ll rely on in the near future may still be on the drawing board.
    Few predicted that smartphones would emerge from analog beginnings—a reminder of how quickly foundational technologies can be reimagined.
    History shows that disruptive technologies rarely follow predictable paths — and AI is no exception. It’s already upending business models across industries.
    Technological shifts bring both new opportunities and complex trade-offs.
    AI Disruption Will Move Faster Than Ever
    The same cycle of reinvention is happening now — but with AI, it’s moving at unprecedented speed.
    Despite all the discussion, many still treat AI as a future concern — yet the shift is already well underway.
    As with every major technological leap, there will be gains and losses. The AI transition brings clear trade-offs: efficiency and innovation on one side, job displacement and privacy erosion on the other.
    Unlike past tech waves that unfolded over decades, the AI shift will reshape industries in just a few years — and that wave of change will only continue to move forward.
    AI Will Reshape All Sectors and Companies
    This shift will unfold faster than most organizations or individuals are prepared to handle.
    Today’s industries will likely look very different tomorrow. Entirely new sectors will emerge as legacy models become obsolete — redefining market leadership across industries.
    Telecom’s past holds a clear warning: market dominance can vanish quickly when companies ignore disruption.
    The story is worth recalling: after the 1984 breakup of the Bell System, AT&T kept the long-distance business while the newly created regional Baby Bells handled local access. Eventually, the Baby Bells moved into long-distance service, while AT&T remained barred from selling local access — undermining its advantage.
    As the market shifted and competitors gained ground, AT&T lost its dominance and became vulnerable enough that SBC, a former regional Bell, acquired it and took on its name.

    It’s a case study of how incumbents fall when they fail to adapt — precisely the kind of pressure AI is now exerting across industries.
    SBC’s acquisition of AT&T flipped the power dynamic — proof that size doesn’t protect against disruption.
    The once-crowded telecom field has consolidated into just a few dominant players — each facing new threats from AI-native challengers.
    Legacy telecom models are being steadily displaced by faster, more flexible wireless, broadband, and streaming alternatives.
    No Industry Is Immune From AI Disruption
    AI will accelerate the next wave of industrial evolution — bringing innovations and consequences we’re only beginning to grasp.
    New winners will emerge as past leaders struggle to hang on — a shift that will also reshape the investment landscape. Startups leveraging AI will likely redefine leadership in sectors where incumbents have grown complacent.
    Nvidia’s rise is part of a broader trend: the next market leaders will emerge wherever AI creates a clear competitive advantage — whether in chips, code, or entirely new markets.
    The AI-driven future is arriving faster than most organizations are ready for. Adapting to this accelerating wave of change is no longer optional — it’s essential. Companies that act decisively today will define the winners of tomorrow.
    #networks #business #models #rewiring #telecom
  • Tanks, guns and face-painting

    Of all the jarring things I’ve witnessed on the National Mall, nothing will beat the image of the first thing I saw after I cleared security at the Army festival: a child, sitting at the controls of an M119A3 Howitzer, being instructed by a soldier on how to aim it, as his red-hatted parents took a photo with the Washington Monument in the background.
    The primary stated reason for the Grand Military Parade is to celebrate the US Army’s 250th birthday. The second stated reason is to use the event for recruiting purposes. Like other military branches, the Army has struggled to meet its enlistment quotas over the past decade. And according to very defensive Army spokespeople trying to convince skeptics that the parade was not for Donald Trump’s birthday, there had always been a festival planned on the National Mall that day, it had been in the works for over two years, and the parade, tacked on just two months ago, was purely incidental. Assuming that their statement was true, I wasn’t quite sure if they had anticipated so many people in blatant MAGA swag in attendance — or how eager they were to bring their children and hand them assault rifles.
    WASHINGTON, DC - JUNE 14: An Army festival attendee holds an M3 Carl Gustav Recoilless Rifle on June 14, 2025 in Washington, DC. Photo by Anna Moneymaker / Getty Images
    There had been kid-friendly events planned: an NFL Kids Zone with a photo op with the Washington Commanders’ mascot, a few face-painting booths, several rock-climbing walls. But they were dwarfed, literally, by dozens of war machines parked along the jogging paths: massive tanks, trucks with gun-mounted turrets, assault helicopters, many of them currently used in combat, all with helpful signs explaining the history of each vehicle, as well as the guns and ammo it could carry.
    And the families — wearing everything from J6 shirts to Vineyard Vines — were drawn more to the military vehicles, all too ready to place their kids in the cockpit of an AH-1F Cobra 998 helicopter as they pretended to aim the nose-mounted 3-barrelled Gatling cannon. Parents told their children to smile as they poked their little heads out of the hatch of an M1135 Stryker armored vehicle; reminded them to be patient as they waited in line to sit inside an M109A7 self-propelled Howitzer with a 155mm rifled cannon.
    Attendees look at a military vehicle on display. Bloomberg via Getty Images
    But seeing a kid’s happiness at being inside a big thing that goes boom was nothing compared to the grownups’ faces when they got the chance to hold genuine military assault rifles — especially the grownups who had made sure to wear Trump merch during the Army’s birthday party. It seemed that not even a free Army-branded Bluetooth speaker could compare to how fucking sick the modded AR-15 was. Attendees were in raptures over the Boston Dynamics robot dog gun, the quadcopter drone gun, or really any of the other guns available.
    However many protesters made it out to DC, they were dwarfed by thousands of people winding down Constitution Avenue to enter the parade viewing grounds: lots of MAGA heads, lots of foreign tourists, all people who really just like to see big, big tanks. “Angry LOSERS!” they jeered at the protesters. After walking past them, crossing the bridge, winding through hundreds of yards of metal fencing, funneling through security, and crossing a choked pedestrian bridge over Constitution Ave, I was finally dumped onto the parade viewing section: slightly muggy and surprisingly navigable.
    But whatever sluggishness the crowd was feeling, it would immediately dissipate the moment a tank turned the corner — and the music started blasting. Americans have a critical weakness for 70s and 80s rock, and this crowd seemed more than willing to look past the questionable origins of the parade so long as the soundtrack had a sick guitar solo. An M1 Abrams tank driving past you while “Barracuda” blasts on a tower of speakers? Badass. Black Hawk helicopters circling the Washington Monument and disappearing behind the African-American history museum, thrashing your head to “Separate Ways” by Journey? Fucking badass. ANOTHER M1 ABRAMS TANK?!?!! AND TO FORTUNATE SON??!?!? “They got me fucking hooked,” a young redheaded man said behind me as the crowd screamed for the waving drivers.
    Members of the U.S. Army drive Bradley Fighting Vehicles in the 250th birthday parade on June 14, 2025 in Washington, DC. Getty Images
    When you listen to the hardest fucking rock soundtrack long enough, and learn more about how fucking sick the Bradley Fighting Vehicles streaming by you are, an animalistic hype takes over you — enough to drown out all the nationwide anger about the parade, the enormity of Trump’s power grab, the fact that two Minnesota Democratic lawmakers were shot in their homes just that morning, the riot police roving the streets of LA.
    It helped that it didn’t rain. It helped that the only people at the parade were the diehards who didn’t care if they were rained out. And by the end of the parade, they didn’t even bother to stay for Trump’s speech, beelining back to the bridge at the first drop of rain.
    The only thing that mattered to this crowd inside the security perimeter — more than the Army’s honor and history, and barely more than Trump himself — was firepower, strength, hard rock, and America’s unparalleled, world-class ability to kill.
  • How jam jars explain Apple’s success

    We are told to customize, expand, and provide more options, but that might be a silent killer for our conversion rate. Using behavioral psychology and modern product design, this piece explains why brands like Apple use fewer, smarter choices to convert better.
    Jam-packed decisions
    Imagine standing in a supermarket aisle in front of the jam section. How do you decide which jam to buy? You could go for your usual jam, or maybe this is your first time buying jam. Either way, a choice has to be made. Or does it? You may have seen the vast number of choices, gotten overwhelmed, and walked away. That scenario is reflected in the findings of a 2000 study by Iyengar and Lepper that explored how the number of options can affect decision-making.
    Iyengar and Lepper set up two scenarios: in the first, customers in a supermarket were offered 24 jams for a free tasting; in the second, they were offered only 6. One would expect the first scenario to produce more sales. After all, more variety means a happier customer. However, while 60% of customers stopped by the larger display for a tasting, only 3% of them ended up making a purchase. When faced with 6 options, 40% of customers stopped by, but 30% of that number made a purchase. In effect, the large display converted roughly 1.8% of passers-by (60% × 3%), while the small one converted about 12% (40% × 30%). The implication was evident: while one may think more choices are better, when actually faced with them, decision-makers prefer fewer.
    This phenomenon is known as the Paradox of Choice: more choice leads to less satisfaction, because one gets overwhelmed. The resulting analysis paralysis follows from humans being cognitive misers; decisions that require deeper thinking feel exhausting and come at a cognitive cost. In such scenarios, we tend not to make a choice at all, or to fall back on a default option. Even after a decision has been made, regret, or the nagging question of whether you made the ‘right’ choice, can linger.
    A sticky situation
    However, a 2010 meta-analysis by Benjamin Scheibehenne was unable to replicate the findings. Scheibehenne questioned whether the real issue was choice overload or information overload. Other researchers have argued that it is the lack of meaningful choice that affects satisfaction. Barry Schwartz, the psychologist who wrote ‘The Paradox of Choice: Why More Is Less,’ also later suggested that the paradox diminishes when people know the options well and when the choices are presented well.
    Does that mean the paradox of choice was an overhyped notion? I conducted a mini-study to test this hypothesis.
    From shelves to spreadsheets: testing the jam jar theory
    I created a simple scatterplot in R using a publicly available dataset from Olist, Brazil’s largest department store on marketplaces. After delivery, Olist customers are asked to fill out a satisfaction survey with a rating or comment option. I analysed the relationship between the number of distinct products in a category (choices) and the average customer review (satisfaction).
    [Scatterplot generated in R using the Olist dataset]
    The almost horizontal regression line on the plot suggests that more choice does not lead to more satisfaction. Categories with fewer than 200 products tend to have average review scores between 4.0 and 4.3, whereas categories with more than 1,000 products do not have a higher average satisfaction score, with some even falling below 4.0. This suggests that more choices do not equal more satisfaction, and may even reduce it.
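    For anyone who wants to poke at the same question themselves, here is a minimal sketch of the analysis, written in Python with pandas and matplotlib rather than the author’s R. It assumes the publicly available Olist CSVs from Kaggle; the file names, column names, and joins below are assumptions about that dataset and may need adjusting, so treat this as a starting point rather than the author’s exact method.

        # Category size (choices) vs. average review score (satisfaction) on the Olist data.
        # Assumes the Kaggle "Brazilian E-Commerce Public Dataset by Olist" CSVs are in the
        # working directory; adjust file/column names if your copy differs.
        import numpy as np
        import pandas as pd
        import matplotlib.pyplot as plt

        products = pd.read_csv("olist_products_dataset.csv")      # product_id, product_category_name
        items = pd.read_csv("olist_order_items_dataset.csv")      # order_id, product_id
        reviews = pd.read_csv("olist_order_reviews_dataset.csv")  # order_id, review_score

        # Attach a category to each ordered item, then a review score to each order.
        df = (items.merge(products[["product_id", "product_category_name"]], on="product_id")
                   .merge(reviews[["order_id", "review_score"]], on="order_id"))

        # One row per category: distinct products offered vs. mean review score.
        per_cat = df.groupby("product_category_name").agg(
            n_products=("product_id", "nunique"),
            avg_review=("review_score", "mean"),
        ).reset_index()

        # Scatterplot with a simple least-squares trend line.
        slope, intercept = np.polyfit(per_cat["n_products"], per_cat["avg_review"], 1)
        xs = np.linspace(per_cat["n_products"].min(), per_cat["n_products"].max(), 100)
        plt.scatter(per_cat["n_products"], per_cat["avg_review"], alpha=0.6)
        plt.plot(xs, slope * xs + intercept, color="red")
        plt.xlabel("Distinct products in category (choices)")
        plt.ylabel("Average review score (satisfaction)")
        plt.title("Olist: category size vs. customer satisfaction")
        plt.show()

    A near-zero slope on the fitted line would correspond to the flat regression line described above: adding more products to a category does not, by itself, move average satisfaction.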
    These findings are consistent with the Paradox of Choice, and the dataset helps bring the theory into real-world commerce: a curated set of fewer, well-presented, and differentiated options could lead to more customer satisfaction. The plot also supports a more nuanced perspective. People want choices, because choice gives them autonomy; but beyond a certain point, excessive choice overwhelms rather than empowers, leaving people dissatisfied. Many product strategies reflect this insight: the goal is to inspire confident decision-making rather than to limit freedom. A powerful example of this shift in thinking comes from Apple’s history.
    Simple tastes, sweeter decisions
    [Image source: Apple Insider]
    It was 1997, and Steve Jobs had just returned to Apple. The company offered 40 different products at the time, yet its sales were declining. Jobs posed one question that became the company’s mantra: “What are the four products we should be building?” The following year, Apple returned to profitability after introducing the iMac G3. While that success can be attributed to a new product line and increased efficiency, one cannot deny that shrinking the lineup simplified the decision-making process for consumers. To this day, Apple continues to implement this strategy, with a small number of SKUs and confident defaults. Apple does not just sell premium products; it sells a premium decision-making experience by reducing friction for the consumer.
    Furthermore, a 2015 study that analyzed scenarios in which fewer choice options led to increased sales found the following mitigating factors in buying choices:
    Time pressure: easier and quicker choices led to more sales.
    Complexity of options: the easier it was to understand what a product was, the better the outcome.
    Clarity of preference: how easy it was to compare alternatives, and how clear one’s own preferences were.
    Motivation to optimize: whether the consumer wanted to put in the effort to find the ‘best’ option.
    Picking the right spread
    While the extent of the Paradox of Choice’s validity is up for debate, its impact cannot be denied. It is still a helpful model for driving sales and boosting customer satisfaction. So, how can you use it as part of your business’s strategy? Remember, what people want isn’t 50 good choices; they want one confident, easy-to-understand decision that they think they will not regret. Here are some common mistakes that confuse consumers, and how you can apply the jam jar strategy to curate choices instead:
    1. Too many choices lead to decision fatigue. Offering many SKU options usually causes customers to get overwhelmed. Instead, try curating two or three strong options that cover the majority of their needs.
    2. Depending on users to work through filters and specifications. When users have to compare specifications themselves, they usually end up doing nothing. Instead, replace filters with clear labels like “Best for beginners” or “Best for oily skin.”
    3. Leaving users to make comparisons by themselves. Too many options can overwhelm users. Instead, offer default options that show what you recommend; this instills confidence when they make the final decision.
    4. Assuming more transparency always means more trust. Information overload rarely leads to conversions. Instead, create a thoughtful flow that guides users to the right choices.
    5. Assuming users aim to optimize. Expecting users to weigh every detail before making a decision is not realistic; in most cases, they will go with their gut. Instead, highlight emotional outcomes, benefits, and uses rather than raw numbers.
    6. Not onboarding users. Hoping that users will easily navigate a sea of products without guidance is unrealistic. Instead, use onboarding tools like starter kits, quizzes, or bundles that act as starting points.
    7. Variety for the sake of variety. Users crave clarity more than they crave variety. Instead, focus on simplicity when it comes to differentiation.
    And lastly, remember that while the paradox of choice is a helpful tool in your business strategy arsenal, more choice is not inherently bad. It is the lack of structure in the decision-making process that is the problem. Clear framing will always make decision-making a seamless experience for both your consumers and your business.
    How jam jars explain Apple’s success was originally published in UX Collective on Medium, where people are continuing the conversation by highlighting and responding to this story.
  • Ninja Gaiden: Ragebound Getting Physical PS5 And Switch Releases

    While Ninja Gaiden 4 might be the most anticipated ninja title of 2025, we’re also getting a retro-inspired side-scroller, Ninja Gaiden: Ragebound, a few months earlier: it launches digitally on PS5, Nintendo Switch, Xbox, and PC on July 31. However, the Switch and PS5 versions are also getting physical editions packed with collectibles, launching on September 12, including a Special Edition that comes with a cloth map, a double-sided medallion, pins, and more. Preorders for the physical Ninja Gaiden: Ragebound editions are available now. Copies are currently exclusive to the game's publisher, Silver Lining Direct, but other retailers like Amazon, Walmart, and Target are expected to carry the game soon.
    Ninja Gaiden: Ragebound Standard Edition ($40)
    The standard edition of Ragebound gets you a copy of the game, a digital copy of the official soundtrack, and a retro booklet. That’s a pretty nice bundle for just $40. It’s available on PS5 and Switch, and can be preordered at Silver Lining Direct.
    Ninja Gaiden: Ragebound Special Edition ($70)
    Spring for the $70 Special Edition, available for preorder at Silver Lining Direct, and you’ll get a bunch of collectibles. Along with everything from the standard edition, it includes the following:
    Cloth map
    4 metal pins
    Double-sided hero medallion
    Pixel stage standee
    Double-sided poster
    Premium collector’s box
    Ninja Gaiden: Ragebound is a 2D side-scroller that takes place while Ryu Hayabusa is traveling to America. A demonic army invades Hayabusa Village while he’s gone, and it’ll be up to the young Kenji Mozu, his apprentice, to fend them off. But to do that, he’ll need help from the infamous Black Spider Clan.
    Continue Reading at GameSpot