• European Broadcasting Union and NVIDIA Partner on Sovereign AI to Support Public Broadcasters

    In a new effort to advance sovereign AI for European public service media, NVIDIA and the European Broadcasting Union (EBU) are working together to give the media industry access to high-quality and trusted cloud and AI technologies.
    Announced at NVIDIA GTC Paris at VivaTech, NVIDIA’s collaboration with the EBU — the world’s leading alliance of public service media with more than 110 member organizations in 50+ countries, reaching an audience of over 1 billion — focuses on helping build sovereign AI and cloud frameworks, driving workforce development and cultivating an AI ecosystem to create a more equitable, accessible and resilient European media landscape.
    The work will create better foundations for public service media to benefit from European cloud infrastructure and AI services that are exclusively governed by European policy, comply with European data protection and privacy rules, and embody European values.
    Sovereign AI ensures nations can develop and deploy artificial intelligence using local infrastructure, datasets and expertise. By investing in it, European countries can preserve their cultural identity, enhance public trust and support innovation specific to their needs.
    “We are proud to collaborate with NVIDIA to drive the development of sovereign AI and cloud services,” said Michael Eberhard, chief technology officer of public broadcaster ARD/SWR, and chair of the EBU Technical Committee. “By advancing these capabilities together, we’re helping ensure that powerful, compliant and accessible media services are made available to all EBU members — powering innovation, resilience and strategic autonomy across the board.”

    Empowering Media Innovation in Europe
    To support the development of sovereign AI technologies, NVIDIA and the EBU will establish frameworks that prioritize independence and public trust, helping ensure that AI serves the interests of Europeans while preserving the autonomy of media organizations.
    Through this collaboration, NVIDIA and the EBU will develop hybrid cloud architectures designed to meet the highest standards of European public service media. The EBU will contribute its Dynamic Media Facility and Media eXchange Layer architecture, aiming to enable interoperability and scalability for workflows, as well as cost- and energy-efficient AI training and inference. Following open-source principles, this work aims to create an accessible, dynamic technology ecosystem.
    The collaboration will also provide public service media companies with the tools to deliver personalized, contextually relevant services and content recommendation systems, with a focus on transparency, accountability and cultural identity. This will be realized through investment in sovereign cloud and AI infrastructure and software platforms such as NVIDIA AI Enterprise, custom foundation models, large language models trained with local data, and retrieval-augmented generation technologies.
    As part of the collaboration, NVIDIA is also making available resources from its Deep Learning Institute, offering European media organizations comprehensive training programs to create an AI-ready workforce. This will support the EBU’s efforts to help ensure news integrity in the age of AI.
    In addition, the EBU and its partners are investing in local data centers and cloud platforms that support sovereign technologies, such as the NVIDIA GB200 Grace Blackwell Superchip, NVIDIA RTX PRO Servers, NVIDIA DGX Cloud and NVIDIA Holoscan for Media — helping members of the union achieve secure and cost- and energy-efficient AI training, while promoting AI research and development.
    Partnering With Public Service Media for Sovereign Cloud and AI
    Collaboration within the media sector is essential for the development and application of comprehensive standards and best practices that ensure the creation and deployment of sovereign European cloud and AI.
    By engaging with independent software vendors, data center providers, cloud service providers and original equipment manufacturers, NVIDIA and the EBU aim to create a unified approach to sovereign cloud and AI.
    This work will also facilitate discussions between the cloud and AI industry and European regulators, helping ensure the development of practical solutions that benefit both the general public and media organizations.
    “Building sovereign cloud and AI capabilities based on EBU’s Dynamic Media Facility and Media eXchange Layer architecture requires strong cross-industry collaboration,” said Antonio Arcidiacono, chief technology and innovation officer at the EBU. “By collaborating with NVIDIA, as well as a broad ecosystem of media technology partners, we are fostering a shared foundation for trust, innovation and resilience that supports the growth of European media.”
    Learn more about the EBU.
    Watch the NVIDIA GTC Paris keynote from NVIDIA founder and CEO Jensen Huang at VivaTech, and explore GTC Paris sessions. 
  • Riot Will Allow Sports-Betting Sponsorships For League Of Legends Esports Teams

    Riot Games has announced that it will begin officially sanctioning sports-betting sponsorships for esports teams in its Tier 1 League of Legends and Valorant leagues. While the company states that it still won't allow advertisements in its official broadcasts, teams themselves will be able to take money from sports-betting companies for advertising through their own channels.

    In a blog post, President of Publishing and Esports John Needham writes that the move is designed to take advantage of the rapidly growing sports-betting industry and to make esports-related betting more regulated. Seemingly to address concerns and head off potential criticism, Needham explains that the company is authorizing sports-betting sponsorships under a "guardrails first" strategy.

    These "guardrails," Needham states, are essentially the rules by which any sponsorship must be executed. First, sports-betting companies need to be vetted and approved by Riot itself, although the company has not shared the criteria on which this vetting is done. Second, to ensure that sports-betting companies are on a level playing field, Riot is mandating that official partners all use GRID, the officially sanctioned data platform for League of Legends and Valorant. Third, esports teams must launch and maintain internal integrity programs to protect against violations of league rules due to the influence of sports betting. Fourth and last, Riot will use some of the revenue from these sponsorships to support its Tier 2 (lower division) esports leagues.

    Continue Reading at GameSpot
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than building its world from photorealistic pixels, the video game created by Markus Persson took the boxier 3D voxel route, which became its signature aesthetic and sparked an international phenomenon that is finally adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess create the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise, under the direction of Production VFX Supervisor Dan Lemmon.

    “As the Senior Unreal Artist within the Virtual Art Department on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve.

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”

    Piglots cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis [VAD Art Director], Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” Another challenge had more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG Images courtesy of Warner Bros. Pictures. Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. …
  • Exciting news from the University of Bristol! They are pioneering the use of 3D concrete printing for seismic safety! This innovative technology is not only revolutionizing the construction industry by enabling faster and more cost-effective building processes, but it also ensures our structures can withstand the forces of nature.

    Imagine a future where our homes and buildings are not just strong, but also built with cutting-edge technology! The possibilities are endless, and it’s inspiring to see how creativity meets safety! Let's embrace this amazing journey towards a more resilient world!

    #3DPrinting #SeismicSafety #BristolUniversity #ConstructionInnovation #FutureBuilding
    The University of Bristol tests 3D concrete printing for seismic safety
    In recent years, 3D concrete printing has been establishing itself as a legitimate technology within the construction industry. This technique makes it possible to produce buildings faster and more cost-effectively, which is why experts…
  • Agentic AI: the executing AI that doesn’t wait for orders! Is this the future we want? A world where artificial intelligence operates without oversight, making decisions without human intervention? It's a reckless gamble that could lead to catastrophic consequences! The idea of such an autonomous system is terrifying and irresponsible. We are opening the door to chaos, where machines dictate the course of human lives. Do we really trust technology to act in our best interests? This is not just a technical flaw; it’s a fundamental ethical crisis! We must demand accountability and ensure that AI remains under human control before it spirals out of hand!

    #AgenticAI #ArtificialIntelligence #TechEthics #AIControl #FutureOfTechnology
    ARABHARDWARE.NET
    Agentic AI: The Executing AI That Doesn't Wait for Orders!
    The post Agentic AI: The Executing AI That Doesn't Wait for Orders! appeared first on عرب هاردوير.
  • In a world where vintage charm meets pixelated perfection, Leica has decided to bless us with a 64-megapixel upgrade for their digitally-converted cameras. Because who doesn't want their classic film nostalgia captured with the same clarity as a high-definition soap opera? Forget the subtleties of light and shadow; it's all about that mega resolution now! Michael Suguitan must be rolling in his grave—oh wait, he’s probably busy converting more classics to ensure the true essence of “overkill” is preserved.

    Embrace the irony, where the past is resurrected with the most modern of tech, all while collectors nod in approval, blissfully unaware that their beloved relics are now just glorified screensavers.

    #Leica
    HACKADAY.COM
    Digitally-Converted Leica Gets A 64-Megapixel Upgrade
    Leica’s film cameras were hugely popular in the 20th century, and remain so with collectors to this day. [Michael Suguitan] has previously had great success converting his classic Leica into …read more
  • In a world where smartphones have become extensions of our very beings, it seems only fitting that the latest buzz is about none other than the Trump Mobile and its dazzling Gold T1 smartphone. Yes, you heard that right – a phone that’s as golden as its namesake’s aspirations and, arguably, just as inflated!

    Let’s dive into the nine *urgent* questions we all have about this technological marvel. First on the list: Is it true that the Trump Mobile can only connect to social media platforms that feature a certain orange-tinted filter? Because if it doesn’t, what’s the point, really? We all know that a phone’s worth is measured by its ability to curate the perfect image, preferably one that makes the user look like a billion bucks—just like the former president himself.

    And while we’re on the topic of money, can we talk about the Gold T1’s price tag? Rumor has it that it’s priced like a luxury yacht, but comes with the battery life of a damp sponge. A perfect combo for those who wish to flaunt their wealth while simultaneously being unable to scroll through their Twitter feed without a panic attack when the battery drops to 1%.

    Now, let’s not forget about the *data plan*. Is it true that the plan includes unlimited access to news outlets that only cover “the best” headlines? Because if I can’t get my daily dose of “Trump is the best” articles, then what’s the point of having a phone that’s practically a golden trophy? I can just see the commercials now: “Get your Trump Mobile and never miss an opportunity to revel in your own glory!”

    Furthermore, what about the customer service? One can only imagine calling for assistance and getting a voicemail that says, “We’re busy making America great again, please leave a message after the beep.” If you’re lucky, you might get a callback… in a week, or perhaps never. After all, who needs help when you have a phone that’s practically an icon of success?

    Let’s also discuss the design. Is it true that the Gold T1 comes with a built-in mirror so you can admire yourself while pretending to check your messages? Because nothing screams “I’m important” like a smartphone that encourages narcissism at every glance.

    And what about the camera? Will it have a special feature that automatically enhances your selfies to ensure you look as good as the carefully curated versions of yourself? I mean, we can’t have anything less than perfection when it comes to our online personas, can we?

    In conclusion, while the Trump Mobile and Gold T1 smartphone might promise a new era of connectivity and self-admiration, one can only wonder if it’s all a glittery façade hiding a less-than-stellar user experience. But hey, for those who’ve always dreamt of owning a piece of tech that’s as bold and brash as its namesake, this might just be the device for you!

    #TrumpMobile #GoldT1 #SmartphoneHumor #TechSatire #DigitalNarcissism
    9 Urgent Questions About Trump Mobile and the Gold T1 Smartphone
    We don’t know much about the new Trump Mobile phone or the company’s data plan, but we sure do have a lot of questions.
  • Hey everyone!

    Today, I want to talk about something that’s making waves in the gaming community: the launch of the fast-paced online soccer game, Rematch! ⚽️ While many of us were super excited to jump into the action, we heard some news that might have dampened our spirits a bit — the game is launching without crossplay.

    But hold on! Before we let that news take the wind out of our sails, let’s take a moment to reflect on the bigger picture! The developers at Sloclap have made it clear that adding crossplay is a top priority for them. This means they’re listening to us, the players! They want to ensure that our experience is as enjoyable as possible, and they’re committed to making it happen. How awesome is that?

    Sure, it’s disappointing to not have crossplay right at launch, especially when we were all looking forward to uniting friends across different platforms for some thrilling matches. However, let’s remember that every great game has its journey, and sometimes, it takes a little time to get everything just right.

    We have the opportunity to show our support for the developers and the community by remaining optimistic! Imagine the epic matches we’ll have once crossplay is implemented! The idea of teaming up with friends on different consoles or PCs to score those last-minute goals is exhilarating!

    So, instead of focusing on the disappointment, let’s channel our energy into celebrating the launch of Rematch! Let’s dive into the gameplay, explore all the features, and share our experiences with one another! We can build an amazing community that encourages one another and fosters a love for the game.

    Remember, every setback is a setup for a comeback! Let’s keep our spirits high and look forward to all the updates and improvements that Sloclap has in store for us. The future of Rematch is bright, and I can’t wait to see where it takes us!

    Let’s keep playing, keep having fun, and keep believing in the magic of gaming! Who’s ready to hit the pitch? ⚽️

    #RematchGame #GamingCommunity #KeepPlaying #StayPositive #SoccerFun
    Rematch Launching Without Crossplay, Disappointing Many Players
    Fast-paced online soccer game Rematch is launching without crossplay. This was confirmed online just a few hours before the sports game launched on consoles and PC. Developers Sloclap say adding crossplay is a top priority, but many players are still
  • In a world where the line between reality and digital wizardry is blurrier than ever, the recent revelations from the VFX wizards of "Emilia Pérez" are nothing short of a masterclass in illusion. Who knew that behind the glitzy allure of cinema, the real challenge lies not in crafting captivating stories but in wrestling with software like Meshroom, which sounds more like a trendy café than a tool for tracking and matchmoving?

    Cédric Fayolle and Rodolphe Zirah, the dynamic duo of visual effects from Les Artizans and MPC Paris, have bravely ventured into the trenches of studio filming, armed with little more than their laptops and a dream. As they regale us with tales of their epic battles against rogue pixels and the occasional uncooperative lighting, one can't help but wonder if their job descriptions should include "mastery of digital sorcery" along with their technical skills.

    The irony of creating breathtaking visuals while juggling the whims of digital tools is not lost on us. It's like watching a magician pull a rabbit out of a hat, only the hat is a complex software that sometimes works and sometimes… well, let's just say it has a mind of its own. Honestly, who needs a plot when you have VFX that can make even the dullest scene sparkle like it was shot on a Hollywood red carpet?

    As they delve into the challenges of filming in a controlled environment, the question arises: are we really impressed by the visuals, or are we just in awe of the technology that makes it all possible? Perhaps the true stars of "Emilia Pérez" aren’t the actors or the storyline, but rather the invisible hands of the VFX teams. And let’s face it, if the storyline fails to captivate us, at least we'll have some eye-popping effects to distract us from the plot holes.

    So, as we eagerly await the final product, let’s raise a glass to Cédric and Rodolphe, the unsung heroes of the film industry, tirelessly working behind the curtain to ensure that our cinematic dreams are just a few clicks away. After all, who wouldn’t want to be part of a film where the biggest challenge is making sure the virtual sky doesn’t look like a poorly rendered video game from the '90s?

    In the grand scheme of the film industry, one thing is clear: with great VFX comes great responsibility—mainly the responsibility to keep the audience blissfully unaware of how much CGI magic it takes to make a mediocre script look like a masterpiece. Cheers to that!

    #EmiliaPérez #VFX #FilmMagic #DigitalSorcery #Cinema
    Emilia Pérez: Les Artizans and MPC Reveal the Secrets of the VFX!
    We take a video look back at the visual effects of Jacques Audiard's film Emilia Pérez, with Cédric Fayolle (Overall VFX Supervisor, Les Artizans) and Rodolphe Zirah (VFX Supervisor, MPC Paris). The duo revisits the challenges of a
  • Hey everyone!

    Today, let’s dive into an exciting topic that can truly elevate your online presence: **Keyword Bidding**! Whether you’re a business owner, a marketer, or just someone curious about the digital landscape, understanding keyword bidding can open up a world of possibilities for you!

    So, what exactly is keyword bidding? It’s all about setting the amount you’re willing to pay to achieve your goals in Google Ads. Think of it as placing a bet on your future success! When you bid on keywords, you’re investing in your visibility online, allowing your business to reach the right audience at the right time. Isn’t that empowering?

    Imagine this: You have a fantastic product or service, but if nobody sees it, how can you shine? This is where keyword bidding comes into play! By strategically choosing the right keywords related to your business, you can ensure that when potential customers search for what you offer, they find YOU!

    Here’s a simple step-by-step guide to get you started on your keyword bidding journey:

    1. **Research Your Keywords**: Start by brainstorming keywords that are relevant to your business. Use tools like Google Keyword Planner to discover popular search terms. The more specific, the better!

    2. **Set Your Budget**: Determine how much you’re willing to spend. Remember, this is an investment in your growth! Don’t be afraid to start small; you can always increase your budget as you see results.

    3. **Choose Your Bids**: Decide how much you want to bid for each keyword. This can vary based on competition and your goals. Don’t forget to keep an eye on your competitors!

    4. **Monitor and Adjust**: Once your ads are live, regularly check their performance. Are certain keywords performing better than others? Adjust your bids accordingly to maximize your return on investment.

    5. **Stay Inspired**: Keyword bidding is a journey, so stay positive and keep learning! Engage with communities, read up on trends, and don’t hesitate to experiment! Your enthusiasm will fuel your success!

    Remember, every great achievement starts with a single step! Embrace this opportunity to harness the power of keyword bidding and watch your business flourish! You’ve got this! Let’s make those dreams a reality, one bid at a time!
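
    To make step 4 ("Monitor and Adjust") concrete, here is a minimal sketch of an automatic bid-adjustment rule. This is illustrative only: it does not use the real Google Ads API, and the keyword names, target CPA, and 10% adjustment step are all hypothetical assumptions, not values from the guide above.

    ```python
    # Toy bid-adjustment rule: nudge a keyword's bid based on its observed
    # cost per acquisition (CPA). Purely illustrative; not the Google Ads API.

    def adjust_bid(current_bid, cost, conversions, target_cpa, step=0.10):
        """Return an updated bid for one keyword based on its CPA."""
        if conversions == 0:
            # No conversions yet: ease the bid down to limit spend.
            return round(current_bid * (1 - step), 2)
        cpa = cost / conversions
        if cpa > target_cpa:
            return round(current_bid * (1 - step), 2)  # too expensive: bid down
        if cpa < target_cpa * 0.8:
            return round(current_bid * (1 + step), 2)  # cheap conversions: bid up
        return current_bid  # within range: leave the bid alone

    # Hypothetical performance data for two keywords.
    keywords = [
        {"term": "custom sneakers", "bid": 1.50, "cost": 45.0, "conversions": 10},
        {"term": "running shoes",   "bid": 2.00, "cost": 80.0, "conversions": 5},
    ]
    for kw in keywords:
        kw["bid"] = adjust_bid(kw["bid"], kw["cost"], kw["conversions"], target_cpa=6.0)
        print(kw["term"], kw["bid"])
    ```

    The thresholds (and the dead band between 80% and 100% of target CPA, which prevents the bid from oscillating every cycle) are design choices you would tune to your own account, not fixed rules.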

    #KeywordBidding #GoogleAds #DigitalMarketing #OnlineSuccess #Inspiration
    What Is Keyword Bidding? A Beginner’s Step-by-Step Guide
    Keyword bidding involves setting how much you’re willing to pay to reach a certain goal in Google Ads.