• NVIDIA Brings Physical AI to European Cities With New Blueprint for Smart City AI

    Urban populations are expected to double by 2050, which means around 2.5 billion people could be added to urban areas by the middle of the century, driving the need for more sustainable urban planning and public services. Cities across the globe are turning to digital twins and AI agents for urban planning scenario analysis and data-driven operational decisions.
    Building a digital twin of a city and testing smart city AI agents within it, however, is a complex and resource-intensive endeavor, fraught with technical and operational challenges.
    To address those challenges, NVIDIA today announced the NVIDIA Omniverse Blueprint for smart city AI, a reference framework that combines the NVIDIA Omniverse, Cosmos, NeMo and Metropolis platforms to bring the benefits of physical AI to entire cities and their critical infrastructure.
    Using the blueprint, developers can create simulation-ready, or SimReady, photorealistic digital twins of cities in which to build and test AI agents that help monitor and optimize city operations.
    Leading companies including XXII, AVES Reality, Akila, Blyncsy, Bentley, Cesium, K2K, Linker Vision, Milestone Systems, Nebius, SNCF Gares&Connexions, Trimble and Younite AI are among the first to use the new blueprint.

    NVIDIA Omniverse Blueprint for Smart City AI 
    The NVIDIA Omniverse Blueprint for smart city AI provides the complete software stack needed to accelerate the development and testing of AI agents in physically accurate digital twins of cities. It includes:

    NVIDIA Omniverse to build physically accurate digital twins and run simulations at city scale.
    NVIDIA Cosmos to generate synthetic data at scale for post-training AI models.
    NVIDIA NeMo to curate high-quality data and use that data to train and fine-tune vision language models (VLMs) and large language models.
    NVIDIA Metropolis to build and deploy video analytics AI agents based on the NVIDIA AI Blueprint for video search and summarization (VSS), helping process vast amounts of video data and provide critical insights to optimize business processes.

    The blueprint workflow comprises three key steps. First, developers create a SimReady digital twin of locations and facilities using aerial, satellite or map data with Omniverse and Cosmos. Second, they can train and fine-tune AI models, like computer vision models and VLMs, using NVIDIA TAO and NeMo Curator to improve accuracy for vision AI use cases. Finally, real-time AI agents powered by these customized models are deployed to alert, summarize and query camera and sensor data using the Metropolis VSS blueprint.
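    To make the three-step flow concrete, here is a purely illustrative Python sketch of how such a pipeline might be orchestrated. Every function, class and file name below is a hypothetical placeholder standing in for the Omniverse/Cosmos twin-building stage, the TAO/NeMo Curator fine-tuning stage and the Metropolis VSS agent stage; none of it is an actual NVIDIA API.

```python
# Hypothetical orchestration sketch of the three-step blueprint workflow
# described above. All names are placeholders, not real NVIDIA APIs.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class DigitalTwin:
    name: str
    sources: list[str]        # aerial, satellite or map inputs


@dataclass
class VisionModel:
    base: str
    training_clips: list[str]


def build_simready_twin(name: str, sources: list[str]) -> DigitalTwin:
    """Step 1: assemble a SimReady, physically accurate twin of the location."""
    return DigitalTwin(name=name, sources=sources)


def fine_tune_models(twin: DigitalTwin, base_model: str) -> VisionModel:
    """Step 2: generate synthetic scenarios from the twin, curate them and
    post-train a vision model or VLM for the target use case."""
    clips = [f"{twin.name}_scenario_{i}.mp4" for i in range(3)]
    return VisionModel(base=base_model, training_clips=clips)


def deploy_video_agent(model: VisionModel, streams: list[str]) -> dict:
    """Step 3: put the customized model behind a video analytics agent that
    alerts on, summarizes and answers queries over live camera streams."""
    return {"model": model.base, "monitored_streams": streams}


if __name__ == "__main__":
    twin = build_simready_twin("station_twin", ["aerial.laz", "city_map.osm"])
    model = fine_tune_models(twin, base_model="vlm-base")
    print(deploy_video_agent(model, streams=["cam_01", "cam_02"]))
```
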
    NVIDIA Partner Ecosystem Powers Smart Cities Worldwide
    The blueprint for smart city AI enables a large ecosystem of partners to use a single workflow to build and activate digital twins for smart city use cases, tapping into a combination of NVIDIA’s technologies and their own.
    SNCF Gares&Connexions, which operates a network of 3,000 train stations across France and Monaco, has deployed a digital twin and AI agents to enable real-time operational monitoring, emergency response simulations and infrastructure upgrade planning.
    This helps each station analyze operational data such as energy and water use, and enables predictive maintenance capabilities, automated reporting and GDPR-compliant video analytics for incident detection and crowd management.
    Powered by Omniverse, Metropolis and solutions from ecosystem partners Akila and XXII, SNCF Gares&Connexions’ physical AI deployment at the Monaco-Monte-Carlo and Marseille stations has helped the operator achieve a 100% on-time preventive maintenance completion rate, a 50% reduction in downtime and issue response time, and a 20% reduction in energy consumption.

    The city of Palermo in Sicily is using AI agents and digital twins from its partner K2K to improve public health and safety by helping city operators process and analyze footage from over 1,000 public video streams at a rate of nearly 50 billion pixels per second.
    Tapped by Sicily, K2K’s AI agents — built with the NVIDIA AI Blueprint for VSS and cloud solutions from Nebius — can interpret and act on video data to provide real-time alerts on public events.
    To accurately predict and resolve traffic incidents, K2K is generating synthetic data with Cosmos world foundation models to simulate different driving conditions. Then, K2K uses the data to fine-tune the VLMs powering the AI agents with NeMo Curator. These simulations enable K2K’s AI agents to create over 100,000 predictions per second.

    Milestone Systems — in collaboration with NVIDIA and European cities — has launched Project Hafnia, an initiative to build an anonymized, ethically sourced video data platform for cities to develop and train AI models and applications while maintaining regulatory compliance.
    Using a combination of Cosmos and NeMo Curator on NVIDIA DGX Cloud and Nebius’ sovereign European cloud infrastructure, Project Hafnia scales up and enables European-compliant training and fine-tuning of video-centric AI models, including VLMs, for a variety of smart city use cases.
    The project’s initial rollout, taking place in Genoa, Italy, features one of the world’s first VLMs for intelligent transportation systems.

    Linker Vision was among the first to partner with NVIDIA to deploy smart city digital twins and AI agents for Kaohsiung City, Taiwan — powered by Omniverse, Cosmos and Metropolis. Linker Vision worked with AVES Reality, a digital twin company, to bring aerial imagery of cities and infrastructure into 3D geometry and ultimately into SimReady Omniverse digital twins.
    Linker Vision’s AI-powered application then built, trained and tested visual AI agents in a digital twin before deployment in the physical city. Now, it’s scaling to analyze 50,000 video streams in real time with generative AI to understand and narrate complex urban events like floods and traffic accidents. Linker Vision delivers timely insights to a dozen city departments through a single integrated AI-powered platform, breaking silos and reducing incident response times by up to 80%.

    Bentley Systems is joining the effort to bring physical AI to cities with the NVIDIA blueprint. Cesium, the open 3D geospatial platform, provides the foundation for visualizing, analyzing and managing infrastructure projects, and ports digital twins to Omniverse. Bentley’s AI platform Blyncsy uses synthetic data generation and Metropolis to analyze road conditions and improve maintenance.
    Trimble, a global technology company that enables essential industries including construction, geospatial and transportation, is exploring ways to integrate components of the Omniverse blueprint into its reality capture workflows and Trimble Connect digital twin platform for surveying and mapping applications for smart cities.
    Younite AI, a developer of AI and 3D digital twin solutions, is adopting the blueprint to accelerate its development pipeline, enabling the company to quickly move from operational digital twins to large-scale urban simulations, improve synthetic data generation, integrate real-time IoT sensor data and deploy AI agents.
    Learn more about the NVIDIA Omniverse Blueprint for smart city AI by attending this GTC Paris session or watching the on-demand video after the event. Sign up to be notified when the blueprint is available.
    Watch the NVIDIA GTC Paris keynote from NVIDIA founder and CEO Jensen Huang at VivaTech, and explore GTC Paris sessions.
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess create the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise, under the direction of Production VFX Supervisor Dan Lemmon.

    “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black).

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”
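
    As one concrete illustration of the ingest-and-optimize pass Finlayson describes, the Blender Python sketch below imports a delivered mesh, reduces its polygon count non-destructively and exports an FBX that a game engine such as Unreal can ingest. The file paths and decimation ratio are illustrative assumptions, not values from the production.

```python
# Minimal Blender (bpy) sketch of a geometry cleanup/optimization pass:
# import a delivered FBX, decimate the meshes, export for engine ingestion.
import bpy

# Bring in the art department's model (Blender ships with an FBX importer).
bpy.ops.import_scene.fbx(filepath="/tmp/set_piece.fbx")

for obj in bpy.context.selected_objects:
    if obj.type != 'MESH':
        continue
    # A Decimate modifier keeps the silhouette while cutting the face count.
    mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
    mod.ratio = 0.5  # keep roughly half the faces; tuned per asset in practice

# Export the optimized selection for the Unreal Engine scene build.
bpy.ops.export_scene.fbx(filepath="/tmp/set_piece_optimized.fbx", use_selection=True)
```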

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”
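
    For readers curious about the re-mesh step Bell mentions, this is a minimal Blender Python sketch that applies a Remesh modifier in 'BLOCKS' mode to give an arbitrary mesh the cube-based, voxel-style look. The object name and octree depth are hypothetical placeholders.

```python
# Tiny Blender (bpy) sketch: convert a mesh into blocky geometry with a
# Remesh modifier, matching the Minecraft voxel aesthetic.
import bpy

obj = bpy.data.objects["SetPiece"]      # hypothetical imported object name
mod = obj.modifiers.new(name="Blockify", type='REMESH')
mod.mode = 'BLOCKS'                      # cube-based output
mod.octree_depth = 6                     # higher depth = smaller blocks

# Make the object active so the modifier can be applied before export.
bpy.context.view_layer.objects.active = obj
bpy.ops.object.modifier_apply(modifier=mod.name)
```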

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”

    Piglins cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge, one that had more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
    #how #disguise #built #out #virtual
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG Images courtesy of Warner Bros. Pictures. Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon. “s the Senior Unreal Artist within the Virtual Art Departmenton Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” —Talia Finlayson, Creative Technologist, Disguise Interior and exterior environments had to be created, such as the shop owned by Steve. “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Departmenton Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.” A virtual exploration of Steve’s shop in Midport Village. Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. 
“Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.” “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” —Laura Bell, Creative Technologist, Disguise Among the buildings that had to be created for Midport Village was Steve’sLava Chicken Shack. Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.” Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younisadapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. 
On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!” A virtual study and final still of the cast members standing outside of the Lava Chicken Shack. “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.” —Talia Finlayson, Creative Technologist, Disguise The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.” Virtually conceptualizing the layout of Midport Village. Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay Georgeand I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.” An example of the virtual and final version of the Woodland Mansion. “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.” —Laura Bell, Creative Technologist, Disguise Extensive detail was given to the center of the sets where the main action unfolds. 
“For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.” Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment. Doing a virtual scale study of the Mountainside. Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.” Piglots cause mayhem during the Wingsuit Chase. Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods. “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols, Pat Younis, Jake Tuckand Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.” #how #disguise #built #out #virtual
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG Images courtesy of Warner Bros. Pictures. Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon. “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” —Talia Finlayson, Creative Technologist, Disguise Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black). “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.” A virtual exploration of Steve’s shop in Midport Village. Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. 
“Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.” “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” —Laura Bell, Creative Technologist, Disguise Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack. Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.” Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. 
    On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks.

    “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.”

    Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.
    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”

    Piglots cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.”

    There was another challenge that had more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour.
Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
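    Bell’s point about leaning on Blender’s Remesh modifier for cube-and-voxel geometry can be illustrated with a short, hypothetical bpy sketch. The object selection, octree depth and decimation ratio below are assumptions for illustration, not values from the production.

        import bpy

        # Illustrative only (not the production setup): give the selected meshes a
        # blocky, voxel-style look with a Remesh modifier in Blocks mode, then add
        # a Decimate modifier to keep the geometry light for real-time review.
        for obj in bpy.context.selected_objects:
            if obj.type != 'MESH':
                continue
            remesh = obj.modifiers.new(name="Blockify", type='REMESH')
            remesh.mode = 'BLOCKS'              # cube-shaped cells for the Minecraft look
            remesh.octree_depth = 6             # assumed resolution; higher means smaller blocks
            remesh.use_remove_disconnected = False
            decimate = obj.modifiers.new(name="Lighten", type='DECIMATE')
            decimate.ratio = 0.5                # assumed polygon reduction before export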
  • Ankur Kothari Q&A: Customer Engagement Book Interview

    Reading Time: 9 minutes
    In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns.
    But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question, we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic.
    This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die” explores how businesses can translate raw data into actionable insights that drive real results.
    Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.

     
    Ankur Kothari Q&A Interview
    1. What types of customer engagement data are most valuable for making strategic business decisions?
    Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interaction, purchase history, and other app usage patterns.
    Second would be demographic information: age, location, income, and other relevant personal characteristics.
    Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews.
    Fourth would be the customer journey data.

    We track touchpoints across customers’ various channels to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data.
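    As a rough illustration of how those four buckets could sit side by side in one record, here is a minimal Python sketch; the field names are assumptions, not a real CDP schema.

        from dataclasses import dataclass, field

        # Hypothetical shape for one customer's engagement data, grouped into the
        # four buckets described above. Field names are illustrative only.
        @dataclass
        class EngagementRecord:
            # 1. Behavioral data
            pages_viewed: list[str] = field(default_factory=list)
            purchases: list[str] = field(default_factory=list)
            app_sessions: int = 0
            # 2. Demographic information
            age: int | None = None
            location: str | None = None
            income_band: str | None = None
            # 3. Sentiment signals
            review_sentiment: float | None = None   # e.g. -1.0 to 1.0 from a sentiment model
            # 4. Customer journey
            touchpoints: list[str] = field(default_factory=list)  # ordered channels up to conversion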

    2. How do you distinguish between data that is actionable versus data that is just noise?
    First is keeping it relevant to your business objectives: actionable data directly relates to your specific goals or KPIs. Then we take help from statistical significance.
    Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in.

    You also want to make sure that there is consistency across sources.
    Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory.
    Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy.

    By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions.
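    The three filters described here, relevance to KPIs, statistical significance, and cross-source consistency, could be sketched roughly as follows; the file, column names and thresholds are assumptions for illustration.

        import pandas as pd
        from scipy import stats

        # Hypothetical input: one row per metric observation, with assumed columns
        # metric, channel, kpi_linked and lift_vs_baseline.
        df = pd.read_csv("engagement_metrics.csv")

        def is_actionable(group: pd.DataFrame, alpha: float = 0.05, min_channels: int = 2) -> bool:
            """Apply the three filters: KPI relevance, significance, consistency."""
            # 1. Relevance: the metric must be mapped to a business KPI.
            if not group["kpi_linked"].any():
                return False
            # 2. Significance: the observed lift should not look like random fluctuation.
            _, p_value = stats.ttest_1samp(group["lift_vs_baseline"], 0.0)
            if p_value > alpha:
                return False
            # 3. Consistency: the same direction of movement in several channels.
            directions = group.groupby("channel")["lift_vs_baseline"].mean() > 0
            return directions.sum() >= min_channels or (~directions).sum() >= min_channels

        actionable = df.groupby("metric").filter(is_actionable)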

    3. How can customer engagement data be used to identify and prioritize new business opportunities?
    First, it helps us to uncover unmet needs.

    By analyzing the customer feedback, touch points, support interactions, or usage patterns, we can identify the gaps in our current offerings or areas where customers are experiencing pain points.

    Second would be identifying emerging needs.
    Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing my company to adapt their products or services accordingly.
    Third would be segmentation analysis.
    Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies.
    Last is to build competitive differentiation.

    Engagement data can highlight where our companies outperform competitors, helping us to prioritize opportunities that leverage existing strengths and unique selling propositions.

    4. Can you share an example of where data insights directly influenced a critical decision?
    I will share an example from my previous organization, a financial services company where we were very data-driven, and where data made a major impact on a critical decision regarding our credit card offerings.
    We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms.
    That insight led us to develop and launch our first digital credit card product with enhanced mobile features and rewards tailored to the millennial spending habits. Since we had access to a lot of transactional data as well, we were able to build a financial product which met that specific segment’s needs.

    That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial.

    5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time?
    When it comes to using engagement data in real time, we do quite a few things. In the past two or three years, we have been using it for dynamic content personalization, adjusting website content, email messaging, or ad creative based on real-time user behavior and preferences.
    We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments.
    Then we also build responsive social media engagement platforms like monitoring social media sentiments and trending topics to quickly adapt the messaging and create timely and relevant content.

    With one-on-one personalization, we do a lot of A/B testing as part of the overall rapid testing of marketing elements like subject lines and CTAs, and we build on the most successful variants of the campaigns.
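    As a rough illustration of the automated budget reallocation mentioned above (not a description of any specific tool), spend could be shifted toward channels with the best recent cost per conversion; the channel names and figures are made up.

        # Hypothetical reallocation: channels with lower cost per conversion
        # receive a proportionally larger share of the budget.
        def reallocate(budget_total: float, cost_per_conversion: dict[str, float]) -> dict[str, float]:
            weights = {ch: 1.0 / cpc for ch, cpc in cost_per_conversion.items() if cpc > 0}
            total_weight = sum(weights.values())
            return {ch: budget_total * w / total_weight for ch, w in weights.items()}

        print(reallocate(10_000, {"search": 12.0, "social": 20.0, "email": 5.0}))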

    6. How are you doing the 1:1 personalization?
    We have advanced CDP systems, and we are tracking each customer’s behavior in real-time. So the moment they move to different channels, we know what the context is, what the relevance is, and the recent interaction points, so we can cater the right offer.
    So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer.
    That gives our customer or potential customer more one-to-one personalization instead of just segment-based or bulk interaction kind of experience.

    We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers.
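    A heavily simplified sketch of the cross-channel context carry-over described here; the profile shape and offer logic below are assumptions, not an actual CDP API.

        from dataclasses import dataclass, field
        from datetime import datetime

        @dataclass
        class Interaction:
            channel: str                  # e.g. "web", "email", "branch"
            offer_viewed: str | None
            timestamp: datetime

        @dataclass
        class CustomerProfile:
            customer_id: str
            interactions: list[Interaction] = field(default_factory=list)

        def next_best_offer(profile: CustomerProfile) -> str:
            """Carry the most recent offer context into whichever channel comes next."""
            for interaction in sorted(profile.interactions, key=lambda i: i.timestamp, reverse=True):
                if interaction.offer_viewed:
                    return interaction.offer_viewed
            return "default_segment_offer"    # fall back to segment-level messaging

        profile = CustomerProfile("cust-123", [
            Interaction("web", "travel_rewards_card", datetime(2025, 1, 14, 9, 30)),
            Interaction("branch", None, datetime(2025, 1, 15, 11, 0)),
        ])
        print(next_best_offer(profile))       # the in-person agent sees yesterday's web offer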

    7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service?
    Primarily with product development — we have different products, not just the financial products or whatever products an organization sells, but also products like the mobile apps or websites customers use for transactions. So that kind of product development gets improved.
    The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments.

    Customer service also benefits: engagement data helps anticipate common issues, personalize support interactions over phone, email or chat, and proactively address potential problems, leading to improved customer satisfaction and retention.

    So in general, the cross-functional application of engagement data improves the customer-centric approach throughout the organization.

    8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights?
    I think the huge amount of data we are dealing with. As we are getting more digitally savvy and most of the customers are moving to digital channels, we are getting a lot of data, and that sheer volume of data can be overwhelming, making it very difficult to identify truly meaningful patterns and insights.

    Because of the huge data overload, we create data silos in this process, so information often exists in separate systems across different departments. We are not able to build a holistic view of customer engagement.

    Because of data silos and overload of data, data quality issues appear. There is inconsistency, and inaccurate data can lead to incorrect insights or poor decision-making. Quality issues could also be due to the wrong format of the data, or the data is stale and no longer relevant.
    As we are growing and adding more people to help us understand customer engagement, I’ve also noticed that technical folks, especially data scientists and data analysts, lack skills to properly interpret the data or apply data insights effectively.
    So there’s a lack of understanding of marketing and sales as domains.
    It’s a huge effort and can take a lot of investment.

    Not being able to calculate the ROI of your overall investment is a big challenge that many organizations are facing.

    9. Why do you think the analysts don’t have the business acumen to properly do more than analyze the data?
    If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. If you cannot stop that from step one—not bringing noise into the data system—that cannot be done by just technical folks or people who do not have business knowledge.
    Business people do not know everything about what data is being collected from which source and what data they need. It’s a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don’t have a lot of exposure to that side.

    Similarly, marketing business people do not have much exposure to the technical side — what’s possible to do with data, how much effort it takes, what’s relevant versus not relevant, and how to prioritize which data sources will be most important.

    10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before?
    First, cross-functional training: training different roles to help them understand why we’re doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do.
    And giving business folks exposure to the technology side through training on different tools, strategies, and the roadmap of data integrations.
    The second is helping teams work more collaboratively. So it’s not like the technology team works in a silo and comes back when their work is done, and then marketing and sales teams act upon it.

    Now we’re making it more like one team. You work together so that you can complement each other, and we have a better strategy from day one.

    11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations?
    We present clear business cases where we demonstrate how data-driven recommendations can directly align with business objectives and potential ROI.
    We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and the implications for business goals.

    We also do a lot of POCs and pilot projects with small-scale implementations to showcase tangible results and build confidence in the data-driven approach throughout the organization.

    12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data?
    I’ve found that Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touch points.
    Having advanced analytics platforms — tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights — is a great value to us.
    We always use, or many organizations use, marketing automation systems to improve marketing team productivity, helping us track and analyze customer interactions across multiple channels.
    Another thing is social media listening tools, wherever your brand is mentioned or you want to measure customer sentiment over social media, or track the engagement of your campaigns across social media platforms.

    Last is web analytics tools, which provide detailed insights into your website visitors’ behaviors and engagement metrics across browsers, various devices, and mobile apps.

    13. How do you ensure data quality and consistency across multiple channels to make these informed decisions?
    We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. Then we use data integration platforms — tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies.
    While we collect data from different sources, we clean the data so it becomes cleaner with every stage of processing.
    We also conduct regular data audits — performing periodic checks to identify and rectify data quality issues, ensuring accuracy and reliability of information. We also deploy standardized data formats.

    On top of that, we have various automated data cleansing tools, specific software to detect and correct data errors, redundancies, duplicates, and inconsistencies in data sets automatically.
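    A minimal pandas sketch of the kind of standardisation, deduplication and audit pass described in this answer; the column names and sample rows are assumptions.

        import pandas as pd

        def cleanse(events: pd.DataFrame) -> pd.DataFrame:
            """Illustrative cleansing pass: standardise formats, drop duplicates, audit required fields."""
            events = events.copy()
            # Standardised formats: one timestamp format, lower-cased channel labels.
            events["timestamp"] = pd.to_datetime(events["timestamp"], errors="coerce", utc=True)
            events["channel"] = events["channel"].str.strip().str.lower()
            # Deduplicate identical interactions reported by more than one source system.
            events = events.drop_duplicates(subset=["customer_id", "channel", "timestamp", "event_type"])
            # Simple audit: drop rows whose required fields failed validation.
            return events.dropna(subset=["customer_id", "timestamp", "event_type"])

        events = pd.DataFrame({
            "customer_id": ["c1", "c1", None],
            "channel": ["Email ", "Email ", "web"],
            "timestamp": ["2025-01-10T08:00:00Z", "2025-01-10T08:00:00Z", "not a date"],
            "event_type": ["open", "open", "click"],
        })
        print(cleanse(events))   # one clean row remains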

    14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years?
    The first thing that’s been the biggest trend from the past two years is AI-driven decision making, which I think will become more prevalent, with advanced algorithms processing vast amounts of engagement data in real-time to inform strategic choices.
    Somewhat related to this is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with more accuracy and better predictive capabilities.
    We also touched upon hyper-personalization. We are all striving toward hyper-personalization at scale, which is more one-on-one personalization, as we capture more engagement data and build the bigger systems and infrastructure needed to process those large volumes of data and support those use cases.
    As the world is collecting more data, privacy concerns and regulations come into play.
    I believe in the next few years there will be more innovation toward how businesses can collect data ethically and what the usage practices are, leading to more transparent and consent-based engagement data strategies.
    And lastly, I think about the integration of engagement data, which is always a big challenge. I believe as we’re solving those integration challenges, we are adding more and more complex data sources to the picture.

    So I think there will need to be more innovation or sophistication brought into data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.

     
    This interview Q&A was hosted with Ankur Kothari, a former Martech executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die.
    Download the PDF or request a physical copy of the book here.
    The post Ankur Kothari Q&A: Customer Engagement Book Interview appeared first on MoEngage.
  • The AI execution gap: Why 80% of projects don’t reach production

    Enterprise artificial intelligence investment is unprecedented, with IDC projecting global spending on AI and GenAI to double to billion by 2028. Yet beneath the impressive budget allocations and boardroom enthusiasm lies a troubling reality: most organisations struggle to translate their AI ambitions into operational success.

    The sobering statistics behind AI’s promise
    ModelOp’s 2025 AI Governance Benchmark Report, based on input from 100 senior AI and data leaders at Fortune 500 enterprises, reveals a disconnect between aspiration and execution. While more than 80% of enterprises have 51 or more generative AI projects in proposal phases, only 18% have successfully deployed more than 20 models into production.
    The execution gap represents one of the most significant challenges facing enterprise AI today. Most generative AI projects still require 6 to 18 months to go live – if they reach production at all. The result is delayed returns on investment, frustrated stakeholders, and diminished confidence in AI initiatives in the enterprise.

    The cause: Structural, not technical barriers
    The biggest obstacles preventing AI scalability aren’t technical limitations – they’re structural inefficiencies plaguing enterprise operations. The ModelOp benchmark report identifies several problems that create what experts call a “time-to-market quagmire.”
    Fragmented systems plague implementation. 58% of organisations cite fragmented systems as the top obstacle to adopting governance platforms. Fragmentation creates silos where different departments use incompatible tools and processes, making it nearly impossible to maintain consistent oversight of AI initiatives.
    Manual processes dominate despite digital transformation. 55% of enterprises still rely on manual processes – including spreadsheets and email – to manage AI use case intake. The reliance on antiquated methods creates bottlenecks, increases the likelihood of errors, and makes it difficult to scale AI operations.
    Lack of standardisation hampers progress. Only 23% of organisations implement standardised intake, development, and model management processes. Without these elements, each AI project becomes a unique challenge requiring custom solutions and extensive coordination across multiple teams.
    Enterprise-level oversight remains rare. Just 14% of companies perform AI assurance at the enterprise level, increasing the risk of duplicated efforts and inconsistent oversight. The lack of centralised governance means organisations often discover they’re solving the same problems multiple times in different departments.

    The governance revolution: From obstacle to accelerator
    A change is taking place in how enterprises view AI governance. Rather than seeing it as a compliance burden that slows innovation, forward-thinking organisations recognise governance as an important enabler of scale and speed.
    Leadership alignment signals a strategic shift. The ModelOp benchmark data reveals a change in organisational structure: 46% of companies now assign accountability for AI governance to a Chief Innovation Officer – more than four times the number who place accountability under Legal or Compliance. This repositioning reflects a new understanding that governance isn’t solely about risk management, but can enable innovation.
    Investment follows strategic priority. A financial commitment to AI governance underscores its importance.
According to the report, 36% of enterprises have budgeted at least million annually for AI governance software, while 54% have allocated resources specifically for AI Portfolio Intelligence to track value and ROI.What high-performing organisations do differentlyThe enterprises that successfully bridge the ‘execution gap’ share several characteristics in their approach to AI implementation:Standardised processes from day one. Leading organisations implement standardised intake, development, and model review processes in AI initiatives. Consistency eliminates the need to reinvent workflows for each project and ensures that all stakeholders understand their responsibilities.Centralised documentation and inventory. Rather than allowing AI assets to proliferate in disconnected systems, successful enterprises maintain centralised inventories that provide visibility into every model’s status, performance, and compliance posture.Automated governance checkpoints. High-performing organisations embed automated governance checkpoints throughout the AI lifecycle, helping ensure compliance requirements and risk assessments are addressed systematically rather than as afterthoughts.End-to-end traceability. Leading enterprises maintain complete traceability of their AI models, including data sources, training methods, validation results, and performance metrics.Measurable impact of structured governanceThe benefits of implementing comprehensive AI governance extend beyond compliance. Organisations that adopt lifecycle automation platforms reportedly see dramatic improvements in operational efficiency and business outcomes.A financial services firm profiled in the ModelOp report experienced a halving of time to production and an 80% reduction in issue resolution time after implementing automated governance processes. Such improvements translate directly into faster time-to-value and increased confidence among business stakeholders.Enterprises with robust governance frameworks report the ability to many times more models simultaneously while maintaining oversight and control. This scalability lets organisations pursue AI initiatives in multiple business units without overwhelming their operational capabilities.The path forward: From stuck to scaledThe message from industry leaders that the gap between AI ambition and execution is solvable, but it requires a shift in approach. Rather than treating governance as a necessary evil, enterprises should realise it enables AI innovation at scale.Immediate action items for AI leadersOrganisations looking to escape the ‘time-to-market quagmire’ should prioritise the following:Audit current state: Conduct an assessment of existing AI initiatives, identifying fragmented processes and manual bottlenecksStandardise workflows: Implement consistent processes for AI use case intake, development, and deployment in all business unitsInvest in integration: Deploy platforms to unify disparate tools and systems under a single governance frameworkEstablish enterprise oversight: Create centralised visibility into all AI initiatives with real-time monitoring and reporting abilitiesThe competitive advantage of getting it rightOrganisations that can solve the execution challenge will be able to bring AI solutions to market faster, scale more efficiently, and maintain the trust of stakeholders and regulators.Enterprises that continue with fragmented processes and manual workflows will find themselves disadvantaged compared to their more organised competitors. 
Operational excellence isn’t about efficiency but survival.The data shows enterprise AI investment will continue to grow. Therefore, the question isn’t whether organisations will invest in AI, but whether they’ll develop the operational abilities necessary to realise return on investment. The opportunity to lead in the AI-driven economy has never been greater for those willing to embrace governance as an enabler not an obstacle.
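    The “centralised inventory plus automated checkpoints” pattern described above can be made concrete with a small amount of code. The following is a minimal Python sketch only: the record fields, checkpoint rules, and thresholds are assumptions invented for illustration, not ModelOp’s schema or any vendor’s API.

```python
# Toy sketch of a centralised model inventory with an automated governance
# checkpoint. All field names and rules are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import date, timedelta


@dataclass
class ModelRecord:
    name: str
    owner: str
    business_unit: str
    stage: str  # e.g. "proposal", "development", "production"
    data_sources: list[str] = field(default_factory=list)
    validation_results: dict[str, float] = field(default_factory=dict)
    last_risk_review: date | None = None


def governance_checkpoint(record: ModelRecord, max_review_age_days: int = 90) -> list[str]:
    """Return the list of issues that would block promotion to production."""
    issues = []
    if not record.data_sources:
        issues.append("no documented data sources (traceability gap)")
    if not record.validation_results:
        issues.append("no recorded validation results")
    if record.last_risk_review is None:
        issues.append("risk review never performed")
    elif date.today() - record.last_risk_review > timedelta(days=max_review_age_days):
        issues.append("risk review is stale")
    return issues


# One shared inventory makes the same gate repeatable for every model.
inventory = [
    ModelRecord("claims-summariser", "a.ngu", "insurance", "development",
                data_sources=["claims_db"], last_risk_review=date(2025, 5, 1)),
    ModelRecord("support-chatbot", "j.ortiz", "retail", "proposal"),
]

for model in inventory:
    blockers = governance_checkpoint(model)
    print(f"{model.name}: {'ready for review' if not blockers else '; '.join(blockers)}")
```

    In a real deployment the inventory would live in a governance platform and the checks would run automatically at each lifecycle stage, but the underlying idea is the same: one shared record format and one repeatable set of gates for every model.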
  • Waymo limits service ahead of today’s ‘No Kings’ protests

    In Brief

    Posted: 10:54 AM PDT · June 14, 2025

    Alphabet-owned robotaxi company Waymo is limiting service due to Saturday’s scheduled nationwide “No Kings” protests against President Donald Trump and his policies.
    A Waymo spokesperson confirmed the changes to Wired on Friday. Service is reportedly affected in San Francisco, Austin, Atlanta, and Phoenix, and is entirely suspended in Los Angeles. It’s not clear how long the limited service will last.
    As part of protests last weekend in Los Angeles against the Trump administration’s immigration crackdown, five Waymo vehicles were set on fire and spray painted with anti-Immigration and Customs Enforcement (ICE) messages. In response, Waymo suspended service in downtown LA.
    While it’s not entirely clear why protestors targeted the vehicles, they may be seen as a surveillance tool, as police departments have requested robotaxi footage for their investigations in the past. (Waymo says it challenges requests that it sees as overly broad or lacking a legal basis.) According to the San Francisco Chronicle, the city’s fire chief told officials Wednesday that “in a period of civil unrest, we will not try to extinguish those fires unless they are up against a building.”

  • MedTech AI, hardware, and clinical application programmes

    Modern healthcare innovations span AI, devices, software, imaging, and regulatory frameworks, all requiring stringent coordination. Generative AI arguably has the strongest transformative potential in healthcare technology programmes, and it is already being applied across domains such as R&D, commercial operations, and supply chain management.
    Traditional models for medical appointments, like face-to-face consultations and paper-based processes, may not be sufficient for the fast-paced, data-driven medical landscape of today. Healthcare professionals and patients are therefore seeking more convenient and efficient ways to access and share information while meeting the complex standards of modern medical science.
    According to McKinsey, Medtech companies are at the forefront of healthcare innovation: the firm estimates they could capture between $14 billion and $55 billion annually in productivity gains, plus an additional $50 billion or more in revenue from product and service innovations enabled by GenAI adoption. A 2024 McKinsey survey revealed that around two-thirds of Medtech executives have already implemented Gen AI, with approximately 20% scaling their solutions up and reporting substantial productivity benefits.
    While advanced technology implementation is growing across the medical industry, challenges persist. Organisations face hurdles like data integration issues, decentralised strategies, and skill gaps. Together, these highlight the need for a more streamlined approach to Gen AI deployment.
    Of all the Medtech domains, R&D is leading the way in Gen AI adoption. Being the most comfortable with new technologies, R&D departments use Gen AI tools to streamline work processes, such as summarising research papers or scientific articles, highlighting a grassroots adoption trend: individual researchers are using AI to enhance productivity even when no formal company-wide strategy is in place. While AI tools automate and accelerate R&D tasks, human review is still required to ensure final submissions are correct and satisfactory. Gen AI is proving to reduce the time teams spend on administrative tasks and to improve research accuracy and depth, with some companies experiencing 20% to 30% gains in research productivity.
    KPIs for success in healthcare product programmes
    Measuring business performance is essential in the healthcare sector. The number one goal is, of course, to deliver high-quality care while maintaining efficient operations. By measuring and analysing KPIs, healthcare providers are better positioned to improve patient outcomes through data-driven decisions. KPIs can also improve resource allocation and encourage continuous improvement in all areas of care.
    Healthcare product programmes are structured initiatives that prioritise the development, delivery, and continual optimisation of medical products. To succeed, they require cross-functional coordination of clinical, technical, regulatory, and business teams. Time to market is critical, ensuring a product moves from the concept stage to launch as quickly as possible. Particular emphasis needs to be placed on labelling and documentation: McKinsey notes that AI-assisted labelling has resulted in a 20%-30% improvement in operational efficiency. Resource utilisation rates are also important, showing how efficiently time, budget, and headcount are used during product development.
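    To show how two of these programme KPIs might be tracked in practice – time to market and resource utilisation rate – here is a minimal Python sketch. The programme name, dates, and hour figures are hypothetical values invented for the example, not benchmarks from McKinsey or any regulator.

```python
# Toy sketch of two operational KPIs for a hypothetical healthcare product
# programme: time to market and resource utilisation rate. All names, dates,
# and figures are made up for the example.
from dataclasses import dataclass
from datetime import date


@dataclass
class ProductProgramme:
    name: str
    concept_date: date            # when the concept was approved
    launch_date: date | None      # None until the product ships
    budgeted_hours: float         # planned clinical/engineering/regulatory hours
    spent_hours: float            # hours actually logged so far


def time_to_market_days(p: ProductProgramme) -> int | None:
    """Days from concept approval to launch; None if not yet launched."""
    return None if p.launch_date is None else (p.launch_date - p.concept_date).days


def resource_utilisation_rate(p: ProductProgramme) -> float:
    """Share of the planned effort consumed so far (1.0 = exactly on budget)."""
    return p.spent_hours / p.budgeted_hours if p.budgeted_hours else 0.0


programme = ProductProgramme(
    name="wearable-cardiac-monitor",
    concept_date=date(2024, 1, 15),
    launch_date=date(2025, 4, 30),
    budgeted_hours=12_000,
    spent_hours=10_800,
)

print("time to market (days):", time_to_market_days(programme))                 # 471
print("resource utilisation:", f"{resource_utilisation_rate(programme):.0%}")   # 90%
```

    Clinical-quality and patient-experience KPIs would sit alongside these in the same dashboard, but they typically come from outcomes data and patient surveys rather than project tracking systems.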
    In the healthcare sector, KPIs ought to cover several factors, including operational efficiency, patient outcomes, the financial health of the business, and patient satisfaction. To achieve a comprehensive view of performance, they can be categorised into financial, operational, clinical quality, and patient experience metrics.
    Bridging user experience with technical precision – design awards
    Innovation is no longer judged solely by technical performance; user experience (UX) is equally important. Some of the latest innovations in healthcare are recognised at the UX Design Awards – products that exemplify the best in user experience as well as technical precision. Top products prioritise the needs and experiences of both patients and healthcare professionals while ensuring each product meets the rigorous clinical and regulatory standards of the sector.
    One example is the CIARTIC Move by Siemens Healthineers, a self-driving 3D C-arm imaging system that lets surgeons control the device wirelessly from within the sterile field while they operate. Computer hardware company ASUS has also received accolades for its HealthConnect App and VivoWatch Series, showcasing the fusion of AIoT-driven smart healthcare solutions with user-friendly interfaces – sometimes in what are essentially consumer devices. This demonstrates how technical innovation is being made accessible and increasingly intuitive as patients gain technical fluency.
    Navigating regulatory and product development pathways simultaneously
    Establishing clinical and regulatory pathways early is important, as it enables healthcare teams to feed a twin stream of findings back into development. Gen AI adoption has become a transformative approach, automating the production and refinement of complex documents, mixed data sets, and structured and unstructured data. By integrating regulatory considerations early and adopting technologies like Gen AI as part of agile practices, healthcare product programmes help teams navigate a regulatory landscape that can often shift. Baking a regulatory mindset into a team early helps ensure compliance and continued innovation.
    Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including Intelligent Automation Conference, BlockX, Digital Transformation Week, and Cyber Security & Cloud Expo. Explore other upcoming enterprise technology events and webinars powered by TechForge here.
  • Fox News AI Newsletter: Hollywood studios sue 'bottomless pit of plagiarism'

    The Minions pose during the world premiere of the film "Despicable Me 4" in New York City, June 9, 2024. (REUTERS/Kena Betancur)
    Welcome to Fox News’ Artificial Intelligence newsletter with the latest AI technology advancements.
    IN TODAY’S NEWSLETTER:
    - Major Hollywood studios sue AI company over copyright infringement in landmark move
    - Meta's Zuckerberg aiming to dominate AI race with recruiting push for new ‘superintelligence’ team: report
    - OpenAI says this state will play central role in artificial intelligence development
    The website of Midjourney, an artificial intelligence (AI) capable of creating AI art, is seen on a smartphone on April 3, 2023, in Berlin, Germany.
    'PIRACY IS PIRACY': Two major Hollywood studios are suing Midjourney, a popular AI image generator, over its use and distribution of intellectual property.
    AI RACE: Meta CEO Mark Zuckerberg is reportedly building a team of experts to develop artificial general intelligence (AGI) that can meet or exceed human capabilities.
    TECH HUB: New York is poised to play a central role in the development of artificial intelligence (AI), OpenAI executives told key business and civic leaders on Tuesday.
    Attendees watch a presentation during an event on the Apple campus in Cupertino, Calif., Monday, June 9, 2025.
    APPLE FALLING BEHIND: Apple’s annual Worldwide Developers Conference (WWDC) kicked off on Monday and runs through Friday. But the Cupertino-based company is not making us wait until the end. The major announcements have already been made, and there are quite a few. The headliners are new software versions for Macs, iPhones, iPads and Vision.
    FROM COAL TO CODE: This week, Amazon announced a $20 billion investment in artificial intelligence infrastructure in the form of new data centers, the largest in the commonwealth's history, according to the eCommerce giant.
    DIGITAL DEFENSE: A growing number of fire departments across the country are turning to artificial intelligence to help detect and respond to wildfires more quickly.
    Rep. Darin LaHood, R-Ill., leaves the House Republican Conference meeting at the Capitol Hill Club in Washington on Tuesday, May 17, 2022.
    SHIELD FROM BEIJING: Rep. Darin LaHood, R-Ill., is introducing a new bill Thursday imploring the National Security Administration (NSA) to develop an "AI security playbook" to stay ahead of threats from China and other foreign adversaries.
    ROBOT RALLY PARTNER: Finding a reliable tennis partner who matches your energy and skill level can be a challenge. Now, with Tenniix, an artificial intelligence-powered tennis robot from T-Apex, players of all abilities have a new way to practice and improve.
    DIGITAL DANGER ZONE: Scam ads on Facebook have evolved beyond the days of misspelled headlines and sketchy product photos. Today, many are powered by artificial intelligence, fueled by deepfake technology and distributed at scale through Facebook’s own ad system.
    Fairfield, Ohio, USA – February 25, 2011: Chipotle Mexican Grill logo on a brick building. Chipotle is a chain of fast casual restaurants in the United States and Canada that specialize in burritos and tacos.
    'EXPONENTIAL RATE': Artificial intelligence is helping Chipotle rapidly grow its footprint, according to CEO Scott Boatwright.
    AI TAKEOVER THREAT: The hottest topic nowadays revolves around Artificial Intelligence (AI) and its potential to rapidly and imminently transform the world we live in — economically, socially, politically and even defensively. Regardless of whether you believe that the technology will be able to develop superintelligence and lead a metamorphosis of everything, the possibility that it may come to fruition is a catalyst for more far-leftist control.
    Stay up to date on the latest AI technology advancements and learn about the challenges and opportunities AI presents now and for the future with Fox News here. This article was written by Fox News staff.
  • How a US agriculture agency became key in the fight against bird flu

    A dangerous strain of bird flu is spreading in US livestock (Image: Alamy)
    Since Donald Trump assumed office in January, the leading US public health agency has pulled back preparations for a potential bird flu pandemic. But as it steps back, another government agency is stepping up.

    While the US Department of Health and Human Services (HHS) previously held regular briefings on its efforts to prevent a wider outbreak of a deadly bird flu virus called H5N1 in people, it largely stopped once Trump took office. It has also cancelled funding for a vaccine that would have targeted the virus. In contrast, the US Department of Agriculture (USDA) has escalated its fight against H5N1’s spread in poultry flocks and dairy herds, including by funding the development of livestock vaccines.
    This particular virus – a strain of avian influenza called H5N1 – poses a significant threat to humans, having killed about half of the roughly 1,000 people worldwide who tested positive for it since 2003. While the pathogen spreads rapidly in birds, it is poorly adapted to infecting humans and isn’t known to transmit between people. But that could change if it acquires mutations that allow it to spread more easily among mammals – a risk that increases with each mammalian infection.
    The possibility of H5N1 evolving to become more dangerous to people has grown significantly since March 2024, when the virus jumped from migratory birds to dairy cows in Texas. More than 1,070 herds across 17 states have been affected since then.
    H5N1 also infects poultry, placing the virus in closer proximity to people. Since 2022, nearly 175 million domestic birds have been culled in the US due to H5N1, and almost all of the 71 people who have tested positive for it had direct contact with livestock.

    “We need to take this seriously because when [the virus] constantly is spreading, it’s constantly spilling over into humans,” says Seema Lakdawala at Emory University in Georgia. The virus has already killed a person in the US and a child in Mexico this year.
    Still, cases have declined under Trump. The last recorded human case was in February, and the number of affected poultry flocks fell 95 per cent between then and June. Outbreaks in dairy herds have also stabilised.
    It isn’t clear what is behind the decline. Lakdawala believes it is partly due to a lull in bird migration, which reduces opportunities for the virus to spread from wild birds to livestock. It may also reflect efforts by the USDA to contain outbreaks on farms. In February, the USDA unveiled a $1 billion plan for tackling H5N1, including strengthening farmers’ defences against the virus, such as through free biosecurity assessments. Of the 150 facilities that have undergone assessment, only one has experienced an H5N1 outbreak.
    Under Trump, the USDA also continued its National Milk Testing Strategy, which mandates farms provide raw milk samples for influenza testing. If a farm is positive for H5N1, it must allow the USDA to monitor livestock and implement measures to contain the virus. The USDA launched the programme in December and has since ramped up participation to 45 states.
    “The National Milk Testing Strategy is a fantastic system,” says Erin Sorrell at Johns Hopkins University in Maryland. Along with the USDA’s efforts to improve biosecurity measures on farms, milk testing is crucial for containing the outbreak, says Sorrell.

    But while the USDA has bolstered its efforts against H5N1, the HHS doesn’t appear to have followed suit. In fact, the recent drop in human cases may reflect decreased surveillance due to workforce cuts, says Sorrell. In April, the HHS laid off about 10,000 employees, including 90 per cent of staff at the National Institute for Occupational Safety and Health, an office that helps investigate H5N1 outbreaks in farm workers.
    “There is an old saying that if you don’t test for something, you can’t find it,” says Sorrell. Yet a spokesperson for the US Centers for Disease Control and Prevention (CDC) says its guidance and surveillance efforts have not changed. “State and local health departments continue to monitor for illness in persons exposed to sick animals,” they told New Scientist. “CDC remains committed to rapidly communicating information as needed about H5N1.”
    The USDA and HHS also diverge on vaccination. While the USDA has allocated $100 million toward developing vaccines and other solutions for preventing H5N1’s spread in livestock, the HHS cancelled $776 million in contracts for influenza vaccine development. The contracts – terminated on 28 May – were with the pharmaceutical company Moderna to develop vaccines targeting flu subtypes, including H5N1, that could cause future pandemics. The news came the same day Moderna reported nearly 98 per cent of the roughly 300 participants who received two doses of the H5 vaccine in a clinical trial had antibody levels believed to be protective against the virus.
    The US has about five million H5N1 vaccine doses stockpiled, but these are made using eggs and cultured cells, which take longer to produce than mRNA-based vaccines like Moderna’s. The Moderna vaccine would have modernised the stockpile and enabled the government to rapidly produce vaccines in the event of a pandemic, says Sorrell. “It seems like a very effective platform and would have positioned the US and others to be on good footing if and when we needed a vaccine for our general public,” she says.

    The HHS cancelled the contracts due to concerns about mRNA vaccines, which Robert F Kennedy Jr – the country’s highest-ranking public health official – has previously cast doubt on. “The reality is that mRNA technology remains under-tested, and we are not going to spend taxpayer dollars repeating the mistakes of the last administration,” said HHS communications director Andrew Nixon in a statement to New Scientist.
    However, mRNA technology isn’t new. It has been in development for more than half a century and numerous clinical trials have shown mRNA vaccines are safe. While they do carry the risk of side effects – the majority of which are mild – this is true of almost every medical treatment. In a press release, Moderna said it would explore alternative funding paths for the programme.
    “My stance is that we should not be looking to take anything off the table, and that includes any type of vaccine regimen,” says Lakdawala.
    “Vaccines are the most effective way to counter an infectious disease,” says Sorrell. “And so having that in your arsenal and ready to go just give you more options.”
  • Premier Truck Rental: Inside Sales Representative - Remote Salt Lake Area

    Are you in search of a company that resonates with your proactive spirit and entrepreneurial mindset? Your search ends here with Premier Truck Rental!

    Company Overview
    At Premier Truck Rental (PTR), we provide customized commercial fleet rentals nationwide, helping businesses get the right trucks and equipment to get the job done. Headquartered in Fort Wayne, Indiana, PTR is a family-owned company built on a foundation of integrity, innovation, and exceptional service. We serve a wide range of industries, including construction, utilities, and infrastructure, by delivering high-quality, ready-to-work trucks and trailers tailored to each customer's needs. At PTR, we don't just rent trucks; we partner with our customers to drive efficiency and success on every job site.

    Please keep reading. Not sure if you meet every requirement? That's okay! We encourage you to apply if you're passionate, hardworking, and eager to contribute. We know that diverse perspectives and experiences make us stronger, and we want you to be part of our journey.

    The Inside Sales Representative (ISR) at PTR is a friendly, people-oriented, and persuasive steward of the sales process. This role supports our Territory Managers with their sales pipeline while also prospecting and cross-selling PTR products. This support includes driving results by enrolling the commitment and buy-in of other internal departments to achieve sales initiatives. The Inside Sales Representative will also represent PTR's commitment to being our customers' easy button by serving as their main point of contact, assisting them in making informed decisions, providing guidance on our rentals, and resolving any issues they might face. We are seeking someone eager to develop their sales skills and grow within our organization. This role is designed as a stepping stone to a Territory Sales Manager (TSM) position, providing hands-on experience with customer interactions, lead qualification, and sales process execution. Ideal candidates will demonstrate a strong drive for results, the ability to build relationships, and a proactive approach to learning and development. High-performing ISRs will have the opportunity to be mentored, trained, and considered for promotion into a TSM role as part of their career path at PTR.

    COMPENSATION
    This position offers a competitive compensation package of base salary ($50,000/yr) plus uncapped commissions, with on-target earnings (OTE) of $85,000 annually.

    RESPONSIBILITIES
    - Offer top-notch customer service and respond with a sense of urgency for goal achievement in a fast-paced sales environment.
    - Build a strong pipeline of customers by qualifying potential leads in your territory, including strategic prospecting and sourcing.
    - Develop creative ways to engage and build rapport with prospective customers by pitching the Premier Truck Rental value proposition.
    - Partner with assigned Territory Managers by assisting with scheduling customer visits, trade shows, new customer hand-offs, and any other travel requested.
    - Facilitate in-person meetings and set appointments with prospective customers.
    - Qualify and quote inquiries for your prospective territories, both online and from the Territory Manager. Input data into the system with accuracy and follow up in a timely fashion.
    - Facilitate the onboarding of new customers through the credit process.
    - Drive collaboration between customers, Territory Managers, Logistics, and internal teams to coordinate On-Rent and Off-Rent notices with excellent attention to detail.
    - Identify and arrange the swap of equipment from customers meeting the PTR de-fleeting criteria.
    - Manage the sales tools to organize, compile, and analyze data with accuracy for a variety of activities and multiple projects occurring simultaneously.
    - Build and develop a new 3-4 state territory!

    REQUIREMENTS
    Must have:
    - 2+ years of strategic prospecting or account manager/sales experience, or an advanced degree or equivalent experience converting prospects into closed sales.
    - Tech-forward approach to sales strategy.
    - Excellent prospecting, follow-up, and follow-through skills; committed to seeing deals through to completion.
    - Accountability and ownership of the sales process and a strong commitment to results.
    - Comfortable with a job that has a variety of tasks and is dynamic and changing.
    - Proactive prospecting skills and the ability to overcome objections; driven to establish relationships with new customers.
    - Ability to communicate in a clear, logical manner in formal and informal situations.
    - Proficiency in CRMs and sales tracking systems.
    - Hunter's mindset: someone who thrives on pursuing new business, driving outbound sales, and generating qualified opportunities.
    - Prospecting: going on LinkedIn, looking at competitor data, and grabbing contacts for the Territory Manager; may use technology like Apollo and LinkedIn Sales Navigator.
    - Partner closely with the Territory Manager to ensure a unified approach in managing customer relationships, pipeline development, and revenue growth. Maintain clear and consistent communication to align on sales strategies, customer needs, and market opportunities, fostering a seamless and collaborative partnership.
    - Consistently meet and exceed key performance indicators (KPIs), including rental revenue, upfit revenue, and conversion rates, by actively managing customer accounts and identifying growth opportunities.
    - Support the saturation and maturation of the customer base through strategic outreach, relationship management, and alignment with the Territory Manager to drive long-term success.
    - Remote in the United States, with some travel to trade shows, quarterly travel up to a week at a time, and sales meetings.

    Nice to have:
    - Rental and/or sales experience in the industry.
    - Proficiency in Apollo.io, LinkedIn Sales Navigator, Power BI, MS Dynamics, ChatGPT.
    - Established relationships within the marketplace or territory.
    - Motivation to grow into an outside territory management position, with relocation.

    On Target Earnings: $85,000

    EMPLOYEE BENEFITS
    Wellness & Fitness: Take advantage of our on-site CrossFit-style gym, featuring a full-time personal trainer dedicated to helping you reach your fitness goals. Whether you're into group classes, virtual personal training, personalized workout plans, or nutrition coaching, we've got you covered!
    Exclusive Employee Perks: PTR swag and a uniform/boot allowance, on-site micro-markets stocked with snacks and essentials, discounts on phone plans, supplier vehicles, mobile detailing, tools, and equipment, and much more!
    Profit Sharing, Your Success Rewarded: At PTR, we believe in sharing success.
    Comprehensive Benefits Starting Day One: Premium healthcare coverage (medical, dental, vision, mental health and virtual healthcare), 401(k) matching and long-term financial planning, paid time off that lets you recharge, life, accidental death, and disability coverage, and ongoing learning and development opportunities.
    Training, Growth & Recognition: We partner with Predictive Index to better understand your strengths, ensuring tailored coaching, structured training, and career development. Performance and attitude evaluations every 6 months keep you on track for growth.
    Culture & Connection, More Than Just a Job: At PTR, we don't just build relationships with our customers; we build them with each other. Our tech-forward, highly collaborative culture is rooted in our core values. Connect and engage through PTR Field Days and team events, The Extra Mile recognition program, and PTR text alerts and open communication.

    Premier Truck Rental Is an Equal Opportunity Employer
    We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law. If you need support or accommodation due to a disability, please contact us.
  • fxpodcast: Landman’s special effects and explosions with Garry Elmendorf

    Garry Elmendorf isn’t just a special effects supervisor; he’s a master of controlled chaos. With over 50 years in the business, from Logan’s Run in the ’70s to the high-octane worlds of Yellowstone, 1883, 1923, and Landman, Elmendorf has shaped the visual DNA of Taylor Sheridan’s TV empire with a mix of old-school craft and jaw-dropping spectacle. In the latest fxpodcast, Garry joins us to break down the physical effects work behind some of the most explosive moments in Landman.
    As regular listeners know, we occasionally conduct interviews with individuals working in SFX rather than VFX. Garry’s work is not the kind that’s built in post; his approach is grounded in real-world physics, practical fabrication, and deeply collaborative on-set discipline. Take the aircraft crash in Landman’s premiere: there was no CGI other than comp cleanup. It was shot practically, with a Frankenstein plane built from scrap, rigged with trip triggers and detonated in real time.
    Or the massive oil rig explosion, which involved custom pump jacks, 2,000 gallons of burning diesel and gasoline, propane cannons, and tightly timed pyro rigs. The scale is cinematic. Safety, Garry insists, is always his first concern, but what keeps him up at night is timing. One mistimed trigger, one failed ignition, and the shot is ruined.

    In our conversation, Garry shares incredible behind-the-scenes insights into how these sequences are devised, tested, and executed, whether it’s launching a van skyward via an air cannon or walking Billy Bob Thornton within 40 feet of a roaring fireball. There’s a tactile intensity to his work, and a trust among his crew that only comes from decades of working under pressure. From assembling a crashable aircraft out of mismatched parts to rigging oil rig explosions with precise control over flame size, duration, and safety, his work is rooted in mechanical problem-solving and coordination across departments.

    In Landman, whether coordinating multiple fuel types to achieve specific smoke density or calculating safe clearances for actors and crew around high-temperature pyrotechnics, Elmendorf’s contribution reflects a commitment to realism and repeatability on set. The result is a series where the physicality of explosions, crashes, and fire-driven action carries weight, both in terms of production logistics and visual impact.

    Listen to the full interview on the fxpodcast.