• AI visibility is how often your brand is mentioned by tools like ChatGPT, Gemini, and Perplexity. If you want to track or grow your brand's presence in LLMs, this expert-backed guide covers how to do it.

    #AIVisibility
    #BrandPresence
    #LLMs
    #DigitalMarketing
    #ContentStrategy
    WWW.SEMRUSH.COM
    AI Visibility: How to Track & Grow Your Brand Presence in LLMs
    AI visibility is how often your brand is mentioned by tools like ChatGPT, Gemini, and Perplexity. Learn how to track and grow your LLM presence with our expert-backed guide.
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than a world constructed around photorealistic pixels, the video game created by Markus Persson took the boxier 3D voxel route, which became its signature aesthetic and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought on board to help filmmaker Jared Hess create the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise, under the direction of Production VFX Supervisor Dan Lemmon.

    “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black).

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”
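    To make the geometry-cleanup step concrete, here is a minimal Blender Python sketch of the kind of pass Finlayson describes: merging stray vertices from a CAD export and decimating heavy meshes before they move on to Unreal Engine. This is an illustration under assumed values, not Disguise's actual pipeline; the file paths, merge threshold and decimation ratio are invented for the example.

        import bpy

        # Assumed example asset; in practice this would be a set model exported from Rhinoceros 3D.
        bpy.ops.wm.obj_import(filepath="/tmp/midport_shop.obj")  # Blender 4.x OBJ importer

        for obj in bpy.context.selected_objects:
            if obj.type != 'MESH':
                continue
            bpy.context.view_layer.objects.active = obj

            # Merge duplicate vertices left over from the CAD export.
            bpy.ops.object.mode_set(mode='EDIT')
            bpy.ops.mesh.select_all(action='SELECT')
            bpy.ops.mesh.remove_doubles(threshold=0.001)
            bpy.ops.object.mode_set(mode='OBJECT')

            # Reduce polygon count so the scene stays responsive for reviews.
            decimate = obj.modifiers.new(name="Decimate", type='DECIMATE')
            decimate.ratio = 0.5  # assumed ratio; tuned per asset in a real workflow
            bpy.ops.object.modifier_apply(modifier=decimate.name)

        # Export the cleaned geometry as FBX for ingest into Unreal Engine.
        bpy.ops.export_scene.fbx(filepath="/tmp/midport_shop.fbx", use_selection=True)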

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”
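    Bell's point about leaning unusually hard on Blender's re-mesh modifier can be sketched in a few lines of Blender Python. This is a hypothetical illustration rather than the production setup; the object name and octree depth are assumptions.

        import bpy

        # Assumed object name, used here only for illustration.
        obj = bpy.data.objects["canyon_blockout"]
        obj.select_set(True)
        bpy.context.view_layer.objects.active = obj

        # Blocks-mode remesh converts arbitrary geometry into cube-aligned blocks,
        # which suits the voxel look described above.
        remesh = obj.modifiers.new(name="Remesh", type='REMESH')
        remesh.mode = 'BLOCKS'
        remesh.octree_depth = 6  # higher depth means smaller blocks
        remesh.use_remove_disconnected = False

        bpy.ops.object.modifier_apply(modifier=remesh.name)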

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”
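    As a rough illustration of the camera-path handoff Bell mentions, the short Python sketch below writes per-frame camera samples to a JSON file that a visual effects vendor could ingest. The shot name, field names, units and values are hypothetical and do not represent Disguise's actual data format.

        import json

        # Hypothetical per-frame camera samples of the kind a Simulcam-style system records on set
        # (location in centimeters, rotation in degrees, focal length in millimeters).
        camera_path = [
            {"frame": 1001, "location": [0.0, -350.0, 160.0], "rotation": [0.0, 0.0, 90.0], "focal_length": 35.0},
            {"frame": 1002, "location": [2.5, -348.0, 160.0], "rotation": [0.0, 0.0, 89.5], "focal_length": 35.0},
        ]

        # Handoff file giving post-production the exact track recorded during the take.
        with open("shot_0420_camera_path.json", "w") as f:
            json.dump({"shot": "shot_0420", "fps": 24, "samples": camera_path}, f, indent=2)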

    Piglots cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that had more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
  • In the shadows of my solitude, I find myself contemplating the weight of my choices, as if each decision has led me further into a labyrinth of despair. Just like the latest updates from NIM Labs with their NIM 7.0 launch, promising new scheduling and conflict detection, I yearn for a path that seems to elude me. Yet, here I am, lost in a world that feels cold and uninviting, where even the brightest features of life fail to illuminate the darkness I feel inside.

    The updates in technology bring hope to many, but for me, they serve as a stark reminder of the isolation that wraps around my heart. The complexities of resource usage tracking in VFX and visualization echo the intricacies of my own emotional landscape, where every interaction feels like a conflict, and every moment is a struggle for connection. I watch as others thrive, their lives intertwined like intricate designs in a visual masterpiece, while I remain a mere spectator, trapped in a canvas of loneliness.

    Each day, I wake up to the silence that fills my room, a silence that feels heavier than the weight of my unexpressed thoughts. The world moves on without me, as if my existence is nothing more than a glitch in the matrix of life. The features that are meant to enhance productivity and creativity serve as a painful juxtaposition to my stagnation. I scroll through updates, seeing others flourish, their accomplishments a bittersweet reminder of what I long for but cannot grasp.

    I wish I could schedule joy like a meeting, or detect conflicts in my heart as easily as one might track resources in a studio management platform. Instead, I find myself tangled in emotions that clash like colors on a poorly rendered screen, each hue representing a fragment of my shattered spirit. The longing for connection is overshadowed by the fear of rejection, creating a cycle of heartache that feels impossible to escape.

    As I sit here, gazing at the flickering screen, I can’t help but wonder if anyone truly sees me. The thought is both comforting and devastating; I crave companionship yet fear the vulnerability that comes with it. The updates and features of NIM Labs remind me of the progress others are making, while I remain stagnant, longing for the warmth of a shared experience.

    In a world designed for collaboration and creativity, I find myself adrift, yearning for my own version of the features NIM 7.0 brings to others. I wish for a way to bridge the gap between my isolation and the vibrant connections that seem to thrive all around me.

    But for now, I am left with my thoughts, my heart heavy with unspoken words, as the silence of my solitude envelops me once more.

    #Loneliness #Heartbreak #Isolation #NIMLabs #EmotionalStruggles
    NIM Labs launches NIM 7.0
    Studio management platform for VFX and visualization gets new scheduling, conflict detection and resource usage tracking features.
  • In the quiet corners of my mind, I often find myself grappling with a profound sense of loneliness. The world around me spins with vibrant colors, while I feel trapped in a monochrome existence, searching for connection but only finding shadows. Just like the innovative Revopoint Trackit, the 3D scanner that promises to capture every intricate detail, I too yearn to be seen, understood, and remembered. Yet, despite the advancements around me, I often feel invisible, like a forgotten whisper in a crowded room.

    Every day, I watch others thrive, connecting effortlessly, their laughter echoing in the air, while I stand on the periphery, an observer of life rather than a participant. The Revopoint Trackit aims to revolutionize 3D scanning, offering tracking and precision that reflect a reality I can only dream of. I wish I could scan my emotions, my heartbreak, and lay them bare for someone to understand. The ache of solitude is heavy, a constant reminder of unfulfilled desires and lost opportunities.

    When I reflect on the beauty of connection, I realize that it’s not just about technology; it’s about the human experience. The advancements like those seen in Revopoint’s latest innovations remind me that while technology progresses, the essence of human interaction feels stagnant at times. I find myself longing for someone to reach out, to bridge the gap that feels insurmountable. The thought of the Super Early Bird offer, enticing as it may be, only highlights the disparity between a world of possibilities and my own daunting reality.

    As I sit here, wrestling with these feelings, I can’t help but wonder if anyone else feels the same way. Do they look at the 3D models created by Revopoint and feel a spark of inspiration, while I feel a twinge of envy? Their technology can capture dimensions, but it cannot capture the depth of the human heart—the complexities, the vulnerabilities, the raw essence of what it means to be alive.

    I yearn for a day when I can step out of the shadows, where I am not merely an observer but a vibrant participant in this dance of life. Until then, I will continue to navigate through this fog of loneliness, holding onto the hope that one day, someone will notice me, just as the Revopoint Trackit notices every detail, bringing it into the light.

    #Loneliness #Heartbreak #Revopoint #Connection #HumanExperience
    Revopoint Trackit, the 3D scanner with tracking, coming soon to Kickstarter!
    In partnership with Revopoint. Sign up now to take advantage of the Super Early Bird offer with 35% off. Revopoint, a world leader in professional 3D scanning solutions, announces the launch of the 3D scanner with…
  • Who knew basketball needed an interactive LED floor? Seriously? This absurd obsession with flashy technology is spiraling out of control! ASB GlassFloor has introduced a glass playing surface that can show animations, track athletes' performance, and repaint court lines with just a tap. What’s next? Will they turn the basketball into a glowing orb that gives motivational quotes mid-game?

    Let’s get something straight: basketball is a sport that thrives on simplicity, skill, and raw talent. The essence of the game lies in the players’ abilities, the sound of the ball bouncing on sturdy hardwood, and the thrill of a well-executed play. But no, that’s not enough for the tech-obsessed minds out there. Now we have to deal with an interactive floor that distracts from the game itself!

    Why in the world do we need animations on the court? Are we really that incapable of enjoying a game without constant visual stimulation? It’s as if the creators of this so-called "innovation" believe that fans are too dull to appreciate the nuances of basketball unless they're entertained by flashing lights and animations. This is a disgrace to the sport!

    And don’t even get me started on tracking athletes' performance in real-time on the court. As if we didn’t already have enough statistics thrown at us during a game! Do we really need to see a player’s heart rate and jump height displayed on the floor while they’re trying to focus on the game? This is a violation of the fundamental spirit of competition. Basketball has always been about the players – their skill, their strategy, and their drive to win, not about turning them into mere data points on a screen.

    Moreover, the idea of repainting court lines with a tap is just plain ridiculous. What’s wrong with the traditional method? A few lines on the court have worked just fine for decades! Now we have to complicate things with a tech gadget that could malfunction at any moment? Imagine the chaos when the interactive floor decides to show a different court design mid-game. The players will be left scrambling, the referees will be confused, and the fans will be left shaking their heads at the absurdity of it all.

    And let’s be real – this gimmick is nothing but a marketing ploy. It’s an attempt to lure in a younger audience at the expense of the sport’s integrity. Yes, pros in Europe are already playing on it, but that doesn’t mean it’s a good idea! Just because something is trendy doesn’t make it right. Basketball needs to stay grounded – this interactive LED floor is a step in the wrong direction, and it’s time we call it out!

    Stop letting technology dictate how we enjoy sports. Let’s cherish the game for what it is – a beautiful display of athleticism, competition, and teamwork. Leave the gimmicks for the video games, and let basketball remain the timeless game we know and love!

    #Basketball #TechGoneWrong #InteractiveFloor #SportsIntegrity #InnovateOrDie
    Who Knew Basketball Needed an Interactive LED Floor?
    ASB GlassFloor makes a glass playing surface for sports arenas that can show animations, track athletes' performance, and repaint court lines with a tap. Pros in Europe are already playing on it.
  • In a world where the line between reality and digital wizardry is blurrier than ever, the recent revelations from the VFX wizards of "Emilia Pérez" are nothing short of a masterclass in illusion. Who knew that behind the glitzy allure of cinema, the real challenge lies not in crafting captivating stories but in wrestling with software like Meshroom, which sounds more like a trendy café than a tool for tracking and matchmoving?

    Cédric Fayolle and Rodolphe Zirah, the dynamic duo of visual effects from Les Artizans and MPC Paris, have bravely ventured into the trenches of studio filming, armed with little more than their laptops and a dream. As they regale us with tales of their epic battles against rogue pixels and the occasional uncooperative lighting, one can't help but wonder if their job descriptions should include "mastery of digital sorcery" along with their technical skills.

    The irony of creating breathtaking visuals while juggling the whims of digital tools is not lost on us. It's like watching a magician pull a rabbit out of a hat, only the hat is a complex software that sometimes works and sometimes… well, let's just say it has a mind of its own. Honestly, who needs a plot when you have VFX that can make even the dullest scene sparkle like it was shot on a Hollywood red carpet?

    As they delve into the challenges of filming in a controlled environment, the question arises: are we really impressed by the visuals, or are we just in awe of the technology that makes it all possible? Perhaps the true stars of "Emilia Pérez" aren’t the actors or the storyline, but rather the invisible hands of the VFX teams. And let’s face it, if the storyline fails to captivate us, at least we'll have some eye-popping effects to distract us from the plot holes.

    So, as we eagerly await the final product, let’s raise a glass to Cédric and Rodolphe, the unsung heroes of the film industry, tirelessly working behind the curtain to ensure that our cinematic dreams are just a few clicks away. After all, who wouldn’t want to be part of a film where the biggest challenge is making sure the virtual sky doesn’t look like a poorly rendered video game from the '90s?

    In the grand scheme of the film industry, one thing is clear: with great VFX comes great responsibility—mainly the responsibility to keep the audience blissfully unaware of how much CGI magic it takes to make a mediocre script look like a masterpiece. Cheers to that!

    #EmiliaPérez #VFX #FilmMagic #DigitalSorcery #Cinema
    Emilia Pérez: Les Artizans and MPC reveal the secrets of the VFX!
    We present a video look back at the visual effects of Jacques Audiard's film Emilia Pérez, with Cédric Fayolle (Overall VFX Supervisor, Les Artizans) and Rodolphe Zirah (VFX Supervisor, MPC Paris). The duo revisits the challenges of a
  • Hey there, fabulous friends!

    Are you ready to take your market research game to the next level? Today, I want to share with you something that can truly transform how you see competition! In this fast-paced world, every entrepreneur and marketer needs to be equipped with the right tools to uncover hidden gems in the market. And guess what? The answer lies in the **14 Best Competitive Intelligence Tools for Market Research**!

    Imagine having the power to peek behind the curtain of your competitors and discover their strategies and tactics! With these amazing tools, you can gather insights that will not only help you understand your market better but also give you the edge you need to soar higher than ever before!

    One standout tool that I absolutely adore is the **Semrush Traffic & Market Toolkit**. It’s like having a secret weapon in your back pocket! This toolkit provides invaluable data about traffic sources, keyword strategies, and much more! Say goodbye to guesswork and hello to informed decisions! Each piece of information you gather brings you one step closer to your goals.

    But that’s not all! Each of the 14 tools has its own unique features that cater to different aspects of competitive intelligence. Whether it's analyzing social media performance, tracking keywords, or monitoring brand mentions, there’s something for everyone! It’s time to embrace the power of knowledge and turn it into your competitive advantage!

    I know that diving into market research might seem daunting, but let me tell you, it’s a thrilling adventure! Every insight you uncover is like finding a treasure map leading you to success! So, don’t shy away from exploring these tools. Embrace them with open arms and watch your business flourish!

    Remember, the only limit to your success is the extent of your imagination and the determination to use the right resources. So gear up, equip yourself with these 14 best competitive intelligence tools, and let’s conquer the market together!

    Let’s lift each other up and share our discoveries! What tools are you excited to try? Drop your thoughts in the comments below! Let’s inspire one another to reach new heights!

    #MarketResearch #CompetitiveIntelligence #BusinessGrowth #Semrush #Inspiration
    The 14 Best Competitive Intelligence Tools for Market Research
    Discover the competition and reveal strategies and tactics of any industry player with these top 14 competitive intelligence tools, including the Semrush Traffic & Market Toolkit.
  • Mocha Pro, AI technology, video editing, face recognition, identity obscuring, planar tracking, visual effects, post-production, new features, Boris FX

    ## Introduction

    Exciting news for video editors and visual effects artists! Boris FX has just released the highly anticipated Mocha Pro 2025.5, a major evolution in the world of video editing and post-production. Known for its unparalleled planar tracking capabilities, Mocha Pro continues to push the boundaries of what’s possible in the realm o...
    Boris FX Unveils Mocha Pro 2025.5: Revolutionizing Video Editing with AI Magic
    Exciting news for video editors and visual effects artists! Boris FX has just released the highly anticipated Mocha Pro 2025.5, a major evolution in the world of video editing and post-production. Known for its unparalleled...
  • Ankur Kothari Q&A: Customer Engagement Book Interview

    Reading Time: 9 minutes
    In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns.
    But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question, we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic.
    This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die” explores how businesses can translate raw data into actionable insights that drive real results.
    Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.

     
    Ankur Kothari Q&A Interview
    1. What types of customer engagement data are most valuable for making strategic business decisions?
    Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interaction, purchase history, and other app usage patterns.
    Second would be demographic information: age, location, income, and other relevant personal characteristics.
    Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews.
    Fourth would be the customer journey data.

    We track touchpoints across various channels of the customers to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data.

    2. How do you distinguish between data that is actionable versus data that is just noise?
    First is keeping the data relevant to your business objectives: actionable data directly relates to your specific goals or KPIs. Then we take help from statistical significance.
    Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in.

    You also want to make sure that there is consistency across sources.
    Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory.
    Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy.

    By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions.
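To make the signal-versus-noise test concrete, here is a minimal Python sketch of the statistical-significance filter described above; the groups and conversion counts are hypothetical.

```python
# Minimal signal-vs-noise check: is the lift in a variant statistically
# significant, or plausibly a random fluctuation? Counts are hypothetical.
from scipy.stats import chi2_contingency

# 2x2 contingency table: [conversions, non-conversions] per group
observed = [
    [120, 4880],  # control: 120 conversions out of 5,000 sessions
    [165, 4835],  # variant: 165 conversions out of 5,000 sessions
]

chi2, p_value, dof, expected = chi2_contingency(observed)

if p_value < 0.05:
    print(f"Likely signal: difference is statistically significant (p={p_value:.4f})")
else:
    print(f"Likely noise: difference could be random fluctuation (p={p_value:.4f})")
```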

    3. How can customer engagement data be used to identify and prioritize new business opportunities?
    First, it helps us to uncover unmet needs.

    By analyzing the customer feedback, touch points, support interactions, or usage patterns, we can identify the gaps in our current offerings or areas where customers are experiencing pain points.

    Second would be identifying emerging needs.
    Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing my company to adapt their products or services accordingly.
    Third would be segmentation analysis.
    Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies.
    Last is to build competitive differentiation.

    Engagement data can highlight where our companies outperform competitors, helping us to prioritize opportunities that leverage existing strengths and unique selling propositions.
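As a rough illustration of the segmentation analysis mentioned above, the sketch below clusters customers on a few engagement features to surface a group with distinct behavior; the feature names and values are invented.

```python
# Toy segmentation sketch: cluster customers on engagement features
# to surface under-served groups. Feature names and values are hypothetical.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# rows: customers; columns: [monthly_sessions, avg_order_value, support_tickets]
X = np.array([
    [2, 35.0, 0],
    [18, 22.0, 1],
    [1, 210.0, 4],
    [25, 19.0, 0],
    [3, 180.0, 5],
    [22, 27.0, 1],
])

X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=42).fit_predict(X_scaled)

for cluster in np.unique(labels):
    members = X[labels == cluster]
    print(f"Segment {cluster}: {len(members)} customers, "
          f"mean support tickets = {members[:, 2].mean():.1f}")
```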

    4. Can you share an example of where data insights directly influenced a critical decision?
    I will share an example from my previous organization at one of the financial services where we were very data-driven, which made a major impact on our critical decision regarding our credit card offerings.
    We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms.
    That insight led us to develop and launch our first digital credit card product with enhanced mobile features and rewards tailored to the millennial spending habits. Since we had access to a lot of transactional data as well, we were able to build a financial product which met that specific segment’s needs.

    That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial.

    5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time?
    When it comes to using the engagement data in real time, we do quite a few things. Over the past two or three years, we have been using it for dynamic content personalization, adjusting the website content, email messaging, or ad creative based on real-time user behavior and preferences.
    We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments.
    Then we also build responsive social media engagement platforms like monitoring social media sentiments and trending topics to quickly adapt the messaging and create timely and relevant content.

    With one-on-one personalization, we do a lot of A/B testing as part of the overall rapid testing and market elements like subject lines, CTAs, and building various successful variants of the campaigns.
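A bare-bones sketch of the automated budget reallocation described above might look like the following; the channels, spend, and conversion figures are made up, and a production system would weigh many more signals than conversions per dollar.

```python
# Bare-bones sketch: reallocate budget toward top-performing channels.
# Channel names, spend, and conversion numbers are hypothetical.
channels = {
    "paid_search": {"spend": 10_000, "conversions": 420},
    "social": {"spend": 10_000, "conversions": 310},
    "display": {"spend": 10_000, "conversions": 95},
}

total_budget = 30_000

# Score each channel by conversions per dollar, then split the budget proportionally.
scores = {name: c["conversions"] / c["spend"] for name, c in channels.items()}
total_score = sum(scores.values())

next_budget = {name: round(total_budget * score / total_score, 2)
               for name, score in scores.items()}

print(next_budget)  # e.g. {'paid_search': 15272.73, 'social': 11272.73, 'display': 3454.55}
```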

    6. How are you doing the 1:1 personalization?
    We have advanced CDP systems, and we are tracking each customer’s behavior in real-time. So the moment they move to different channels, we know what the context is, what the relevance is, and the recent interaction points, so we can cater the right offer.
    So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer.
    That gives our customer or potential customer more one-to-one personalization instead of just segment-based or bulk interaction kind of experience.

    We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers.
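The cross-channel hand-off described here can be sketched very roughly as a unified profile keyed by customer ID; the event fields and offer rule below are invented for illustration and do not reflect any particular CDP product.

```python
# Simplified sketch of cross-channel context: the last interaction recorded
# for a customer informs the next-best offer in any other channel.
# Event fields and offer rules are invented for illustration.
from datetime import datetime, timezone

profiles: dict[str, dict] = {}  # customer_id -> unified profile

def record_event(customer_id: str, channel: str, offer_viewed: str) -> None:
    """Merge an interaction event into the customer's unified profile."""
    profiles.setdefault(customer_id, {})["last_interaction"] = {
        "channel": channel,
        "offer_viewed": offer_viewed,
        "at": datetime.now(timezone.utc).isoformat(),
    }

def next_best_offer(customer_id: str) -> str:
    """Pick an offer using the most recent context, whatever the channel."""
    last = profiles.get(customer_id, {}).get("last_interaction")
    if last:
        return f"Follow up on '{last['offer_viewed']}' first seen via {last['channel']}"
    return "Show default welcome offer"

# Website visit yesterday, in-branch conversation today:
record_event("cust-123", channel="web", offer_viewed="cashback-card")
print(next_best_offer("cust-123"))
```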

    7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service?
    Primarily with product development — we have different products, not just the financial products or products whichever organizations sell, but also various products like mobile apps or websites they use for transactions. So that kind of product development gets improved.
    The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments.

    Customer service also gets helped by anticipating common issues, personalizing support interactions over the phone or email or chat, and proactively addressing potential problems, leading to improved customer satisfaction and retention.

    So in general, cross-functional application of engagement improves the customer-centric approach throughout the organization.

    8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights?
    I think the huge amount of data we are dealing with. As we are getting more digitally savvy and most of the customers are moving to digital channels, we are getting a lot of data, and that sheer volume of data can be overwhelming, making it very difficult to identify truly meaningful patterns and insights.

    Because of the huge data overload, we create data silos in this process, so information often exists in separate systems across different departments. We are not able to build a holistic view of customer engagement.

    Because of data silos and overload of data, data quality issues appear. There is inconsistency, and inaccurate data can lead to incorrect insights or poor decision-making. Quality issues could also be due to the wrong format of the data, or the data is stale and no longer relevant.
    As we are growing and adding more people to help us understand customer engagement, I’ve also noticed that technical folks, especially data scientists and data analysts, lack skills to properly interpret the data or apply data insights effectively.
    So there’s a lack of understanding of marketing and sales as domains.
    It’s a huge effort and can take a lot of investment.

    Not being able to calculate the ROI of your overall investment is a big challenge that many organizations are facing.

    9. Why do you think the analysts don’t have the business acumen to properly do more than analyze the data?
    If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. If you cannot stop that from step one—not bringing noise into the data system—that cannot be done by just technical folks or people who do not have business knowledge.
    Business people do not know everything about what data is being collected from which source and what data they need. It’s a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don’t have a lot of exposure to that side.

    Similarly, marketing business people do not have much exposure to the technical side — what’s possible to do with data, how much effort it takes, what’s relevant versus not relevant, and how to prioritize which data sources will be most important.

    10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before?
    First, cross-functional training: training different roles to help them understand why we’re doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do.
    And giving business folks exposure to the technology side through training on different tools, strategies, and the roadmap of data integrations.
    The second is helping teams work more collaboratively. So it’s not like the technology team works in a silo and comes back when their work is done, and then marketing and sales teams act upon it.

    Now we’re making it more like one team. You work together so that you can complement each other, and we have a better strategy from day one.

    11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations?
    We present clear business cases where we demonstrate how data-driven recommendations can directly align with business objectives and potential ROI.
    We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and the implications for business goals.

    We also do a lot of POCs and pilot projects with small-scale implementations to showcase tangible results and build confidence in the data-driven approach throughout the organization.

    12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data?
    I’ve found that Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touch points.
    Having advanced analytics platforms — tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights — is a great value to us.
    We always use, or many organizations use, marketing automation systems to improve marketing team productivity, helping us track and analyze customer interactions across multiple channels.
    Another thing is social media listening tools, wherever your brand is mentioned or you want to measure customer sentiment over social media, or track the engagement of your campaigns across social media platforms.

    Last is web analytics tools, which provide detailed insights into your website visitors’ behaviors and engagement metrics across browsers, various devices, and mobile apps.

    13. How do you ensure data quality and consistency across multiple channels to make these informed decisions?
    We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. Then we use data integration platforms — tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies.
    While we collect data from different sources, we clean the data so it becomes cleaner with every stage of processing.
    We also conduct regular data audits — performing periodic checks to identify and rectify data quality issues, ensuring accuracy and reliability of information. We also deploy standardized data formats.

    On top of that, we have various automated data cleansing tools, specific software to detect and correct data errors, redundancies, duplicates, and inconsistencies in data sets automatically.
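To ground the cleansing steps, here is a small pandas sketch covering format standardization, deduplication, and a basic audit count; the columns and records are hypothetical.

```python
# Small data-cleansing sketch: standardize formats, drop duplicates,
# and run a basic quality audit. Columns and records are hypothetical.
import pandas as pd

raw = pd.DataFrame({
    "email": [" Ana@Example.com", "ana@example.com", "bo@example.com", None],
    "signup_date": ["2024-01-05", "2024-01-06", "not a date", "2024-02-10"],
    "channel": ["Web", "web", "EMAIL", "email"],
})

clean = raw.copy()
clean["email"] = clean["email"].str.strip().str.lower()
clean["channel"] = clean["channel"].str.lower()
clean["signup_date"] = pd.to_datetime(clean["signup_date"], errors="coerce")
clean = clean.drop_duplicates(subset=["email"])

# Basic audit: how many records still have missing or unparseable values?
print(clean.isna().sum())
```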

    14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years?
    The first thing that’s been the biggest trend from the past two years is AI-driven decision making, which I think will become more prevalent, with advanced algorithms processing vast amounts of engagement data in real-time to inform strategic choices.
    Somewhat related to this is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with more accuracy and better predictive capabilities.
    We also touched upon hyper-personalization. We are all trying to strive toward more hyper-personalization at scale, which is more one-on-one personalization, as we are increasingly capturing more engagement data and have bigger systems and infrastructure to support processing those large volumes of data so we can achieve those hyper-personalization use cases.
    As the world is collecting more data, privacy concerns and regulations come into play.
    I believe in the next few years there will be more innovation toward how businesses can collect data ethically and what the usage practices are, leading to more transparent and consent-based engagement data strategies.
    And lastly, I think about the integration of engagement data, which is always a big challenge. I believe as we’re solving those integration challenges, we are adding more and more complex data sources to the picture.

    So I think there will need to be more innovation or sophistication brought into data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.

     
    This interview Q&A was hosted with Ankur Kothari, a previous Martech Executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die.
    Download the PDF or request a physical copy of the book here.
    #ankur #kothari #qanda #customer #engagement
    WWW.MOENGAGE.COM
    Ankur Kothari Q&A: Customer Engagement Book Interview
    Reading Time: 9 minutes In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns. But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question (and many others), we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic. This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die” explores how businesses can translate raw data into actionable insights that drive real results. Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.   Ankur Kothari Q&A Interview 1. What types of customer engagement data are most valuable for making strategic business decisions? Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interaction, purchase history, and other app usage patterns. Second would be demographic information: age, location, income, and other relevant personal characteristics. Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews. Fourth would be the customer journey data. We track touchpoints across various channels of the customers to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data. 2. How do you distinguish between data that is actionable versus data that is just noise? First is keeping relevant to your business objectives, making actionable data that directly relates to your specific goals or KPIs, and then taking help from statistical significance. Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in. You also want to make sure that there is consistency across sources. Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory. Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy. By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions. 3. How can customer engagement data be used to identify and prioritize new business opportunities? First, it helps us to uncover unmet needs. By analyzing the customer feedback, touch points, support interactions, or usage patterns, we can identify the gaps in our current offerings or areas where customers are experiencing pain points. Second would be identifying emerging needs. Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing my company to adapt their products or services accordingly. Third would be segmentation analysis. Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies. Last is to build competitive differentiation. 
Engagement data can highlight where our companies outperform competitors, helping us to prioritize opportunities that leverage existing strengths and unique selling propositions. 4. Can you share an example of where data insights directly influenced a critical decision? I will share an example from my previous organization at one of the financial services where we were very data-driven, which made a major impact on our critical decision regarding our credit card offerings. We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms. That insight led us to develop and launch our first digital credit card product with enhanced mobile features and rewards tailored to the millennial spending habits. Since we had access to a lot of transactional data as well, we were able to build a financial product which met that specific segment’s needs. That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial. 5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time? When it comes to using the engagement data in real-time, we do quite a few things. In the recent past two, three years, we are using that for dynamic content personalization, adjusting the website content, email messaging, or ad creative based on real-time user behavior and preferences. We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments. Then we also build responsive social media engagement platforms like monitoring social media sentiments and trending topics to quickly adapt the messaging and create timely and relevant content. With one-on-one personalization, we do a lot of A/B testing as part of the overall rapid testing and market elements like subject lines, CTAs, and building various successful variants of the campaigns. 6. How are you doing the 1:1 personalization? We have advanced CDP systems, and we are tracking each customer’s behavior in real-time. So the moment they move to different channels, we know what the context is, what the relevance is, and the recent interaction points, so we can cater the right offer. So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer. That gives our customer or potential customer more one-to-one personalization instead of just segment-based or bulk interaction kind of experience. We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers. 7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service? Primarily with product development — we have different products, not just the financial products or products whichever organizations sell, but also various products like mobile apps or websites they use for transactions. 
So that kind of product development gets improved. The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments. Customer service also benefits by anticipating common issues, personalizing support interactions over the phone, email, or chat, and proactively addressing potential problems, which leads to improved customer satisfaction and retention. In general, the cross-functional application of engagement data strengthens the customer-centric approach throughout the organization.

8. What do you think are some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights?

I think the biggest one is the sheer amount of data we are dealing with. As we become more digitally savvy and most customers move to digital channels, we collect a lot of data, and that volume can be overwhelming, making it very difficult to identify truly meaningful patterns and insights. Because of that overload, we create data silos, so information often exists in separate systems across different departments and we are not able to build a holistic view of customer engagement.

Data silos and data overload in turn create data quality issues. Inconsistent or inaccurate data can lead to incorrect insights or poor decision-making. Quality issues can also come from data being in the wrong format, or from data that is stale and no longer relevant. As we grow and add more people to help us understand customer engagement, I have also noticed that technical folks, especially data scientists and analysts, often lack the skills to properly interpret the data or apply its insights effectively; there is a gap in their understanding of marketing and sales as domains. All of this is a huge effort and can take a lot of investment, and not being able to calculate the ROI of that investment is a big challenge many organizations face.

9. Why do you think analysts lack the business acumen to do more than analyze the data?

If people do not have a clear idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. Stopping that at step one, by not bringing noise into the data system in the first place, cannot be done by technical folks alone or by people who do not have business knowledge. At the same time, business people do not know everything about what data is being collected from which source and what data they actually need. It is a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who do not have much exposure to that side. Similarly, marketing and business people do not have much exposure to the technical side: what is possible to do with the data, how much effort it takes, what is relevant versus not relevant, and how to prioritize which data sources will be most important.

10. Do you have any suggestions for how this can be overcome, or have you seen it solved in action before?

First, cross-functional training: training different roles to help them understand why we are doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do, and giving business folks exposure to the technology side through training on the tools, strategies, and roadmap of data integrations. Second, helping teams work more collaboratively. It is no longer the technology team working in a silo, coming back when their work is done, and then the marketing and sales teams acting on it. We are making it more like one team that works together, so the groups complement each other and we have a better strategy from day one.

11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations?

We present clear business cases that demonstrate how data-driven recommendations directly align with business objectives and potential ROI. We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and their implications for business goals. We also run a lot of proofs of concept and pilot projects, small-scale implementations that showcase tangible results and build confidence in the data-driven approach throughout the organization.

12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data?

Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touchpoints. Advanced analytics platforms, tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns, are of great value to us. We, like many organizations, use marketing automation systems to improve marketing team productivity and to track and analyze customer interactions across multiple channels. Social media listening tools help us see wherever the brand is mentioned, measure customer sentiment, and track the engagement of campaigns across social platforms. Last are web analytics tools, which provide detailed insight into website visitors' behavior and engagement metrics across browsers, devices, and mobile apps.

13. How do you ensure data quality and consistency across multiple channels to make these informed decisions?

We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. We use data integration platforms, tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies. As we collect data from different sources, we clean it so it becomes cleaner with every stage of processing. We also conduct regular data audits, periodic checks to identify and rectify data quality issues and ensure the accuracy and reliability of the information, and we deploy standardized data formats. On top of that, we use automated data cleansing tools, software that detects and corrects errors, redundancies, duplicates, and inconsistencies in data sets automatically (a minimal sketch of this kind of cleansing appears after this interview).

14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years?

The biggest trend of the past two years is AI-driven decision-making, which I think will become more prevalent, with advanced algorithms processing vast amounts of engagement data in real time to inform strategic choices. Closely related is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with greater accuracy. We also touched on hyper-personalization: we are all striving toward hyper-personalization at scale, true one-to-one personalization, as we capture more engagement data and build bigger systems and infrastructure to process those volumes.

As the world collects more data, privacy concerns and regulations come into play. I believe the next few years will bring more innovation in how businesses can collect data ethically and how they use it, leading to more transparent, consent-based engagement data strategies. And lastly, there is the integration of engagement data, which is always a big challenge. Even as we solve those integration challenges, we keep adding more complex data sources to the picture, so data integration strategies will need more innovation and sophistication, which will help us take a truly customer-centric approach to strategy formulation.

This interview Q&A was hosted with Ankur Kothari, a former Martech executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die. Download the PDF or request a physical copy of the book here. The post Ankur Kothari Q&A: Customer Engagement Book Interview appeared first on MoEngage.
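As an illustrative aside on the automated cleansing and deduplication described in question 13: the short Python sketch below shows, using hypothetical field names (email, signup_date, channel) and toy data, how duplicate and inconsistently formatted engagement records might be normalized before analysis. It is not tied to MoEngage or to any specific CDP or cleansing tool.

```python
from datetime import datetime

# Toy engagement records with the kinds of issues described above:
# inconsistent casing, stray whitespace, mixed date formats, duplicates.
# All field names here are hypothetical, for illustration only.
raw_records = [
    {"email": "Jane.Doe@Example.com ", "signup_date": "2024-03-01", "channel": "email"},
    {"email": "jane.doe@example.com", "signup_date": "01/03/2024", "channel": "Email"},
    {"email": "sam@example.com", "signup_date": "2024-02-15", "channel": "push"},
]

def parse_date(value: str) -> str:
    """Normalize the two date formats seen in this toy data to ISO 8601."""
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {value!r}")

def cleanse(records):
    """Standardize formats and drop duplicates, keeping the first record per email."""
    seen = set()
    cleaned = []
    for rec in records:
        email = rec["email"].strip().lower()
        if email in seen:
            continue  # duplicate contact coming from another channel or system
        seen.add(email)
        cleaned.append({
            "email": email,
            "signup_date": parse_date(rec["signup_date"]),
            "channel": rec["channel"].strip().lower(),
        })
    return cleaned

if __name__ == "__main__":
    for row in cleanse(raw_records):
        print(row)
```

In practice this logic usually lives inside the integration platform or cleansing tool itself rather than in hand-rolled scripts, but the steps are the same: standardize formats first, then deduplicate on a stable key.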
  • Stolen iPhones disabled by Apple's anti-theft tech after Los Angeles looting

    What just happened? As protests against federal immigration enforcement swept through downtown Los Angeles last week, a wave of looting left several major retailers, including Apple, T-Mobile, and Adidas, counting the cost of smashed windows and stolen goods. Yet for those who made off with iPhones from Apple's flagship store, the thrill of the heist quickly turned into a lesson in high-tech security.
    Apple's retail locations are equipped with advanced anti-theft technology that renders display devices useless once they leave the premises. The moment a demonstration iPhone is taken beyond the store's Wi-Fi network, it is instantly disabled by proximity software and a remote "kill switch."
    Instead of a functioning smartphone, thieves were met with a stark message on the screen: "Please return to Apple Tower Theatre. This device has been disabled and is being tracked. Local authorities will be alerted." The phone simultaneously sounds an alarm and flashes the warning, ensuring it cannot be resold or activated elsewhere.
    This system is not new. During the nationwide unrest of 2020, similar scenes played out as looters discovered that Apple's security measures turned their stolen goods into little more than expensive paperweights.
    The technology relies on a combination of location tracking and network monitoring. As soon as a device is separated from the store's secure environment, it is remotely locked, its location is tracked, and law enforcement is notified.
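    The article does not describe Apple's internal implementation, but the behavior it reports (a demo unit that locks itself once it can no longer see the store's network) can be sketched conceptually. The Python snippet below is a hypothetical simulation only; current_ssid() and current_location() are stand-ins for whatever platform calls a real device-management agent would use, and none of this reflects Apple's actual software.

```python
import time
from typing import Optional, Tuple

# Hypothetical simulation of a proximity "kill switch" for a store demo device.
# This is NOT Apple's implementation; it only illustrates the behavior described
# in the article: once the device no longer sees the store's network, it locks,
# sounds an alarm, and reports its location.

STORE_NETWORKS = {"AppleStore-Demo"}  # assumed in-store Wi-Fi SSID, for illustration

class DemoDevice:
    def __init__(self, store_name: str):
        self.store_name = store_name
        self.locked = False

    def lock_and_alert(self, location: Tuple[float, float]) -> None:
        """Disable the device, show the return message, and report its position."""
        self.locked = True
        print(f"Please return to {self.store_name}. "
              "This device has been disabled and is being tracked.")
        print(f"ALARM sounding; reporting location {location} to the retailer.")

def current_ssid() -> Optional[str]:
    """Stand-in for a platform call that reports the connected Wi-Fi network."""
    return None  # pretend the device has left the store's Wi-Fi

def current_location() -> Tuple[float, float]:
    """Stand-in for a location-services lookup."""
    return (34.0505, -118.2551)  # roughly downtown Los Angeles

def monitor(device: DemoDevice, poll_seconds: float = 5.0) -> None:
    """Poll connectivity and trip the kill switch when the device goes off-network."""
    while not device.locked:
        if current_ssid() not in STORE_NETWORKS:
            device.lock_and_alert(current_location())
        else:
            time.sleep(poll_seconds)

if __name__ == "__main__":
    monitor(DemoDevice("Apple Tower Theatre"))
```

    Note that the article describes a remote kill switch, meaning the lock is pushed from the retailer's systems rather than decided on-device, so the polling loop above should be read purely as a conceptual model of the observable behavior.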
    Videos circulating online show stolen iPhones blaring alarms and displaying tracking messages, making them impossible to ignore and virtually worthless on the black market.
    According to the Los Angeles Police Department, at least three individuals were arrested in connection with the Apple Store burglary, including one suspect apprehended at the scene and two others detained for looting.
    The crackdown on looting comes amid a broader shift in California's approach to retail crime. In response to public outcry over rising thefts, state and local officials have moved away from previously lenient policies. The passage of Proposition 36 has empowered prosecutors to file felony charges against repeat offenders, regardless of the value of stolen goods, and to impose harsher penalties for organized group theft.
    Under these new measures, those caught looting face the prospect of significant prison time, a marked departure from the misdemeanor charges that were common under earlier laws.
    District attorneys in Southern California have called for even harsher penalties, particularly for crimes committed during states of emergency. Proposals include making looting a felony offense, increasing prison sentences, and ensuring that suspects are not released without judicial review. The goal, officials say, is to deter opportunistic criminals who exploit moments of crisis, whether during protests or natural disasters.