• Elden Ring Nightreign's New Boss Is Actually Pretty Easy Thanks In Part To This Completely Busted Relic

    My friends and I spent 10 hours dying to last week’s Everdark Sovereign overhaul of the Gaping Jaw before finally killing the lightning bird late one night, long past the point when we should have all gone to bed. Now Elden Ring Nightreign is back with a new boss for players to take on, but thankfully this one is a…
  • Plug and Play: Build a G-Assist Plug-In Today

    Project G-Assist — available through the NVIDIA App — is an experimental AI assistant that helps tune, control and optimize NVIDIA GeForce RTX systems.
    NVIDIA’s Plug and Play: Project G-Assist Plug-In Hackathon — running virtually through Wednesday, July 16 — invites the community to explore AI and build custom G-Assist plug-ins for a chance to win prizes and be featured on NVIDIA social media channels.

    G-Assist allows users to control their RTX GPU and other system settings using natural language, thanks to a small language model that runs on device. It can be used from the NVIDIA Overlay in the NVIDIA App without needing to tab out or switch programs. Users can expand its capabilities via plug-ins and even connect it to agentic frameworks such as Langflow.
    Below, find popular G-Assist plug-ins, hackathon details and tips to get started.
    Plug-In and Win
    Join the hackathon by registering and checking out the curated technical resources.
    G-Assist plug-ins can be built in several ways, including with Python for rapid development, with C++ for performance-critical apps and with custom system interactions for hardware and operating system automation.
    For those who prefer vibe coding, the G-Assist Plug-In Builder — a ChatGPT-based app that allows no-code or low-code development with natural language commands — makes it easy for enthusiasts to start creating plug-ins.
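The plug-in interface itself can be sketched in a few lines. The following is a minimal, hypothetical Python skeleton, not the official G-Assist API: the JSON fields (`func`, `params`, `success`, `message`) and the newline-delimited stdin/stdout transport are illustrative assumptions standing in for the real plug-in protocol documented in NVIDIA's samples.

```python
import json
import sys


def handle_command(cmd: dict) -> dict:
    """Dispatch one decoded JSON command to a handler.

    The schema (``func``/``params`` in, ``success``/``message`` out) is an
    illustrative assumption, not the official G-Assist protocol.
    """
    if cmd.get("func") == "get_stream_status":
        streamer = cmd.get("params", {}).get("streamer", "unknown")
        # A real plug-in would query the Twitch API here; this stub just echoes.
        return {"success": True, "message": f"{streamer} is offline."}
    return {"success": False, "message": f"Unknown function: {cmd.get('func')}"}


def main() -> None:
    # G-Assist launches the plug-in and exchanges JSON messages with it; as a
    # stand-in, this loop reads newline-delimited JSON from stdin and writes
    # one JSON response per line to stdout.
    for line in sys.stdin:
        line = line.strip()
        if not line:
            continue
        print(json.dumps(handle_command(json.loads(line))), flush=True)


if __name__ == "__main__":
    main()
```

Run from a shell, a command such as `echo '{"func": "get_stream_status", "params": {"streamer": "ninja"}}' | python plugin.py` exercises the loop end to end.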
    To submit an entry, participants must provide a GitHub repository including the source code file (plugin.py), requirements.txt, manifest.json, config.json (if applicable), a plug-in executable file and a README.
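For orientation, the manifest in that repository might look something like the sketch below. Every field name here is illustrative — assumed for this example rather than taken from the official schema — so consult NVIDIA's sample plug-ins on GitHub for the real format.

```json
{
  "name": "twitch-status",
  "description": "Checks whether a Twitch streamer is currently live.",
  "executable": "plugin.py",
  "functions": [
    {
      "name": "get_stream_status",
      "description": "Returns live status for a given streamer.",
      "parameters": { "streamer": "string" }
    }
  ]
}
```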
    Then, submit a video — between 30 seconds and two minutes — showcasing the plug-in in action.
    Finally, hackathoners must promote their plug-in using #AIonRTXHackathon on a social media channel: Instagram, TikTok or X. Submit projects via this form by Wednesday, July 16.
    Judges will assess plug-ins based on three main criteria: 1) innovation and creativity, 2) technical execution and integration, reviewing technical depth, G-Assist integration and scalability, and 3) usability and community impact, aka how easy it is to use the plug-in.
    Winners will be selected on Wednesday, Aug. 20. First place will receive a GeForce RTX 5090 laptop, second place a GeForce RTX 5080 GPU and third a GeForce RTX 5070 GPU. These top three will also be featured on NVIDIA’s social media channels, get the opportunity to meet the NVIDIA G-Assist team and earn an NVIDIA Deep Learning Institute self-paced course credit.
    Project G-Assist requires a GeForce RTX 50, 40 or 30 Series desktop GPU with at least 12GB of VRAM, a Windows 11 or Windows 10 operating system, a compatible CPU (Intel Pentium G Series, Core i3, i5, i7 or higher; AMD FX, Ryzen 3, 5, 7, 9, Threadripper or higher), sufficient disk space and a recent GeForce Game Ready Driver or NVIDIA Studio Driver.
    Plug-In(spiration)
    Explore open-source plug-in samples available on GitHub, which showcase the diverse ways on-device AI can enhance PC and gaming workflows.

    Popular plug-ins include:

    Google Gemini: Enables search-based queries using Google Search integration and large language model-based queries using Gemini capabilities, in real time and from the convenience of the NVIDIA App Overlay, without needing to switch programs.
    Discord: Enables users to easily share game highlights or messages directly to Discord servers without disrupting gameplay.
    IFTTT: Lets users create automations across hundreds of compatible endpoints to trigger IoT routines — such as adjusting room lights and smart shades, or pushing the latest gaming news to a mobile device.
    Spotify: Lets users control Spotify using simple voice commands or the G-Assist interface to play favorite tracks and manage playlists.
    Twitch: Checks if any Twitch streamer is currently live and can access detailed stream information such as titles, games, view counts and more.

    Get G-Assist(ance)
    Join the NVIDIA Developer Discord channel to collaborate, share creations and gain support from fellow AI enthusiasts and NVIDIA staff.
    Save the date for NVIDIA’s How to Build a G-Assist Plug-In webinar on Wednesday, July 9, from 10-11 a.m. PT, to learn more about Project G-Assist capabilities, discover the fundamentals of building, testing and deploying Project G-Assist plug-ins, and participate in a live Q&A session.
    Explore NVIDIA’s GitHub repository, which provides everything needed to get started developing with G-Assist, including sample plug-ins, step-by-step instructions and documentation for building custom functionalities.
    Learn more about the ChatGPT Plug-In Builder to transform ideas into functional G-Assist plug-ins with minimal coding. The tool uses OpenAI’s custom GPT builder to generate plug-in code and streamline the development process.
    NVIDIA’s technical blog walks through the architecture of a G-Assist plug-in, using a Twitch integration as an example. Discover how plug-ins work, how they communicate with G-Assist and how to build them from scratch.
    Each week, the RTX AI Garage blog series features community-driven AI innovations and content for those looking to learn more about NVIDIA NIM microservices and AI Blueprints, as well as building AI agents, creative workflows, digital humans, productivity apps and more on AI PCs and workstations. 
    Plug in to NVIDIA AI PC on Facebook, Instagram, TikTok and X — and stay informed by subscribing to the RTX AI PC newsletter.
    Follow NVIDIA Workstation on LinkedIn and X. 
    See notice regarding software product information.
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than building a world from photorealistic pixels, the video game created by Markus Persson took the boxier 3D voxel route, which became its signature aesthetic and sparked an international phenomenon that is finally adapted as a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess create the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise, under the direction of Production VFX Supervisor Dan Lemmon.

    “As the Senior Unreal Artist within the Virtual Art Department on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve.

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”

    Piglins cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols, Pat Younis, Jake Tuck and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
“A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younisadapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!” A virtual study and final still of the cast members standing outside of the Lava Chicken Shack. “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. 
Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.” —Talia Finlayson, Creative Technologist, Disguise The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.” Virtually conceptualizing the layout of Midport Village. Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. 
There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay Georgeand I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.” An example of the virtual and final version of the Woodland Mansion. “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.” —Laura Bell, Creative Technologist, Disguise Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.” Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment. Doing a virtual scale study of the Mountainside. 
Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.” Piglots cause mayhem during the Wingsuit Chase. Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods. “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols, Pat Younis, Jake Tuckand Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. 
I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.” #how #disguise #built #out #virtual
    WWW.VFXVOICE.COM
    HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE
    By TREVOR HOGG Images courtesy of Warner Bros. Pictures. Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon. “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” —Talia Finlayson, Creative Technologist, Disguise Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black). “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. 
“I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.” A virtual exploration of Steve’s shop in Midport Village. Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. 
Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.” “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.” —Laura Bell, Creative Technologist, Disguise Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack. Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.” Flexibility was critical. 
“A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!” A virtual study and final still of the cast members standing outside of the Lava Chicken Shack. “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. 
Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.” —Talia Finlayson, Creative Technologist, Disguise The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.” Virtually conceptualizing the layout of Midport Village. Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. 
There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.” An example of the virtual and final version of the Woodland Mansion. “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.” —Laura Bell, Creative Technologist, Disguise Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.” Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment. Doing a virtual scale study of the Mountainside. 
Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.” Piglots cause mayhem during the Wingsuit Chase. Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods. “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. 
I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
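Bell's note about leaning heavily on Blender's re-mesh modifier hints at the core trick behind the film's look: in Blender, the Remesh modifier in Blocks mode rebuilds arbitrary geometry out of axis-aligned cubes. As a rough, hypothetical sketch of the underlying idea (not production code from the show), snapping vertices to a cubic grid looks like this:

```python
import math

# Illustrative only: Blender's Remesh modifier (Blocks mode) does the real
# work in the workflow Bell describes. This standalone helper just shows the
# core idea of quantizing geometry to a cubic grid, Minecraft-style.
def snap_to_grid(point, cube_size=1.0):
    """Quantize a 3D point to the corner of the grid cube that contains it."""
    return tuple(math.floor(c / cube_size) * cube_size for c in point)

# Snapping a loose cloud of vertices collapses nearby points into the same
# cube, producing the blocky silhouette the voxel aesthetic relies on.
vertices = [(1.7, 2.2, -0.3), (1.1, 2.9, -0.8)]
blocky = {snap_to_grid(v) for v in vertices}  # both land in the same cube
```

Blender's actual modifier operates on the whole mesh via an octree (its depth setting controls effective cube size), but the grid-snapping intuition is the same.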
  • Game On With GeForce NOW, the Membership That Keeps on Delivering

    This GFN Thursday rolls out a new reward and games for GeForce NOW members. Whether hunting for hot new releases or rediscovering timeless classics, members can always find more ways to play, games to stream and perks to enjoy.
    Gamers can score major discounts on the titles they’ve been eyeing — perfect for streaming in the cloud — during the Steam Summer Sale, running until Thursday, July 10, at 10 a.m. PT.
    This week also brings unforgettable adventures to the cloud: We Happy Few and Broken Age are part of the five additions to the GeForce NOW library this week.
    The fun doesn’t stop there. A new in-game reward for The Elder Scrolls Online is now available for members to claim.
    And SteelSeries has launched a new mobile controller that transforms phones into cloud gaming devices with GeForce NOW. Add it to the roster of on-the-go gaming devices — including the recently launched GeForce NOW app on Steam Deck for seamless 4K streaming.
    Scroll Into Power
    GeForce NOW Premium members receive exclusive 24-hour early access to a new mythical reward in The Elder Scrolls Online — Bethesda’s award-winning role-playing game — before it opens to all members. Sharpen the sword, ready the staff and chase glory across the vast, immersive world of Tamriel.
    Fortune favors the bold.
    Claim the mythical Grand Gold Coast Experience Scrolls reward, a rare item that grants a bonus of 150% Experience Points from all sources for one hour. The scroll’s effect pauses while players are offline and resumes upon return, ensuring every minute counts. Whether tackling dungeon runs, completing epic quests or leveling a new character, the scrolls provide a powerful edge. Claim the reward, harness its power and scroll into the next adventure.
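The scroll's math is worth spelling out: a +150% bonus from all sources means every XP award is multiplied by 2.5 while the scroll's one-hour online timer is running. A back-of-envelope sketch (not an official game formula, and the function name is illustrative):

```python
# Hypothetical sketch of the Grand Gold Coast Experience Scroll's effect:
# +150% XP from all sources means each award pays 2.5x the base amount.
SCROLL_BONUS = 1.50          # +150% Experience Points
SCROLL_DURATION_S = 60 * 60  # one hour of *online* play; timer pauses offline

def boosted_xp(base_xp, seconds_online_used):
    """XP paid for an award, depending on whether scroll time remains."""
    if seconds_online_used < SCROLL_DURATION_S:
        return base_xp * (1 + SCROLL_BONUS)
    return base_xp

# A 1,000 XP quest turn-in pays 2,500 while the scroll is active.
print(boosted_xp(1000, 0))
```

Because the timer pauses offline, the full hour of boosted play is usable across multiple sessions.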
    Members who’ve opted into the GeForce NOW Rewards program can check their emails for redemption instructions. The offer runs through Saturday, July 26, while supplies last. Don’t miss this opportunity to become a legend in Tamriel.
    Steam Up Summer
    The Steam Summer Sale is in full swing. Snag games at discounted prices and stream them instantly from the cloud — no downloads, no waiting, just pure gaming bliss.
    Treat yourself.
    Check out the “Steam Summer Sale” row in the GeForce NOW app to find deals on the next adventure. With GeForce NOW, gaming favorites are always just a click away.
    While picking up discounted games, don’t miss the chance to get a GeForce NOW six-month Performance membership at 40% off. It’s also the last opportunity to take advantage of the Performance Day Pass sale — which lets gamers access cloud gaming for 24 hours — before it ends on Friday, June 27, and then dive into the six-month Performance membership.
    Find Adventure
    Two distinct worlds — where secrets simmer and imagination runs wild — are streaming onto the cloud this week.
    Keep calm and blend in.
    Step into the surreal, retro-futuristic streets of We Happy Few, where a society obsessed with happiness hides its secrets behind a mask of forced cheer and a haze of “Joy.” This darkly whimsical adventure invites players to blend in, break out and uncover the truth lurking beneath the surface of Wellington Wells.
    Two worlds, one wild destiny.
    Broken Age spins a charming, hand-painted tale of two teenagers leading parallel lives in worlds at once strange and familiar. One of the teens yearns to escape a stifling spaceship, and the other is destined to challenge ancient traditions. With witty dialogue and heartfelt moments, Broken Age is a storybook come to life, brimming with quirky characters and clever puzzles.
    Each of these unforgettable adventures brings its own flavor — be it dark satire, whimsical wonder or pulse-pounding suspense — offering a taste of gaming at its imaginative peaks. Stream these captivating worlds straight from the cloud and enjoy seamless gameplay, no downloads or high-end hardware required.
    An Ultimate Controller
    Elevated gaming.
    Get ready for the SteelSeries Nimbus Cloud, a new dual-mode cloud controller. When paired with GeForce NOW, this new controller reaches new heights.
    Designed for versatility and comfort, and crafted specifically for cloud gaming, the SteelSeries Nimbus Cloud effortlessly shifts from a mobile device controller to a full-sized wireless controller, delivering top-notch performance and broad compatibility across devices.
    The Nimbus Cloud enables gamers to play wherever they are, as it easily adapts to fit iPhones and Android phones. Or collapse and connect the controller via Bluetooth to a gaming rig or smart TV. Transform any space into a personal gaming station with GeForce NOW and the Nimbus Cloud, part of the list of recommended products for an elevated cloud gaming experience.
    Gaming Never Sleeps
    “System Shock 2” — now with 100% more existential dread.
    System Shock 2: 25th Anniversary Remaster is an overhaul of the acclaimed sci-fi horror classic, rebuilt by Nightdive Studios with enhanced visuals, refined gameplay and features such as cross-play co-op multiplayer. Face the sinister AI SHODAN and her mutant army aboard the starship Von Braun as a cybernetically enhanced soldier with upgradable skills, powerful weapons and psionic abilities. Stream the title from the cloud with GeForce NOW for ultimate flexibility and performance.
    Look for the following games available to stream in the cloud this week:

    System Shock 2: 25th Anniversary Remaster
    Broken Age
    Easy Red 2
    Sandwich Simulator
    We Happy Few

    What are you planning to play this weekend? Let us know on X or in the comments below.

    The official GFN summer bucket list:
    Play anywhere
    Stream on every screen you own
    Finally crush that backlog
    Skip every single download bar
    Drop the emoji for the one you’re tackling right now
    — NVIDIA GeForce NOW, June 25, 2025
    Game On With GeForce NOW, the Membership That Keeps on Delivering
    This GFN Thursday rolls out a new reward and games for GeForce NOW members. Whether hunting for hot new releases or rediscovering timeless classics, members can always find more ways to play, games to stream and perks to enjoy. Gamers can score major discounts on the titles they’ve been eyeing — perfect for streaming in the cloud — during the Steam Summer Sale, running until Thursday, July 10, at 10 a.m. PT. This week also brings unforgettable adventures to the cloud: We Happy Few and Broken Age are part of the five additions to the GeForce NOW library this week. The fun doesn’t stop there. A new in-game reward for The Elder Scrolls Online is now available for members to claim. And SteelSeries has launched a new mobile controller that transforms phones into cloud gaming devices with GeForce NOW. Add it to the roster of on-the-go gaming devices — including the recently launched GeForce NOW app on Steam Deck for seamless 4K streaming.

Scroll Into Power

GeForce NOW Premium members receive exclusive 24-hour early access to a new mythical reward in The Elder Scrolls Online — Bethesda’s award-winning role-playing game — before it opens to all members. Sharpen the sword, ready the staff and chase glory across the vast, immersive world of Tamriel.

Fortune favors the bold.

Claim the mythical Grand Gold Coast Experience Scrolls reward, a rare item that grants a bonus of 150% Experience Points from all sources for one hour. The scroll’s effect pauses while players are offline and resumes upon return, ensuring every minute counts. Whether tackling dungeon runs, completing epic quests or leveling a new character, the scrolls provide a powerful edge. Claim the reward, harness its power and scroll into the next adventure. Members who’ve opted into the GeForce NOW Rewards program can check their emails for redemption instructions. The offer runs through Saturday, July 26, while supplies last. Don’t miss this opportunity to become a legend in Tamriel.

Steam Up Summer

The Steam Summer Sale is in full swing. Snag games at discounted prices and stream them instantly from the cloud — no downloads, no waiting, just pure gaming bliss. Treat yourself. Check out the “Steam Summer Sale” row in the GeForce NOW app to find deals on the next adventure. With GeForce NOW, gaming favorites are always just a click away. While picking up discounted games, don’t miss the chance to get a GeForce NOW six-month Performance membership at 40% off. This is also the last opportunity to take advantage of the Performance Day Pass sale, ending Friday, June 27 — which lets gamers access cloud gaming for 24 hours — before diving into the six-month Performance membership.

Find Adventure

Two distinct worlds — where secrets simmer and imagination runs wild — are streaming onto the cloud this week.

Keep calm and blend in.

Step into the surreal, retro-futuristic streets of We Happy Few, where a society obsessed with happiness hides its secrets behind a mask of forced cheer and a haze of “Joy.” This darkly whimsical adventure invites players to blend in, break out and uncover the truth lurking beneath the surface of Wellington Wells.

Two worlds, one wild destiny.

Broken Age spins a charming, hand-painted tale of two teenagers leading parallel lives in worlds at once strange and familiar. One of the teens yearns to escape a stifling spaceship, and the other is destined to challenge ancient traditions. With witty dialogue and heartfelt moments, Broken Age is a storybook come to life, brimming with quirky characters and clever puzzles. Each of these unforgettable adventures brings its own flavor — be it dark satire, whimsical wonder or pulse-pounding suspense — offering a taste of gaming at its imaginative peaks. Stream these captivating worlds straight from the cloud and enjoy seamless gameplay, no downloads or high-end hardware required.

An Ultimate Controller

Elevated gaming.

Get ready for the SteelSeries Nimbus Cloud, a new dual-mode cloud controller. When paired with GeForce NOW, this new controller reaches new heights. Designed for versatility and comfort, and crafted specifically for cloud gaming, the SteelSeries Nimbus Cloud effortlessly shifts from a mobile device controller to a full-sized wireless controller, delivering top-notch performance and broad compatibility across devices. The Nimbus Cloud enables gamers to play wherever they are, as it easily adapts to fit iPhones and Android phones. Or collapse and connect the controller via Bluetooth to a gaming rig or smart TV. Transform any space into a personal gaming station with GeForce NOW and the Nimbus Cloud, part of the list of recommended products for an elevated cloud gaming experience.

Gaming Never Sleeps

“System Shock 2” — now with 100% more existential dread.

System Shock 2: 25th Anniversary Remaster is an overhaul of the acclaimed sci-fi horror classic, rebuilt by Nightdive Studios with enhanced visuals, refined gameplay and features such as cross-play co-op multiplayer. Face the sinister AI SHODAN and her mutant army aboard the starship Von Braun as a cybernetically enhanced soldier with upgradable skills, powerful weapons and psionic abilities. Stream the title from the cloud with GeForce NOW for ultimate flexibility and performance.

Look for the following games available to stream in the cloud this week:

System Shock 2: 25th Anniversary Remaster (New release on Steam, June 26)
Broken Age (Steam)
Easy Red 2 (Steam)
Sandwich Simulator (Steam)
We Happy Few (Steam)

What are you planning to play this weekend? Let us know on X or in the comments below.

The official GFN summer bucket list
Play anywhere
Stream on every screen you own
Finally crush that backlog
Skip every single download bar
Drop the emoji for the one you’re tackling right now
— NVIDIA GeForce NOW (@NVIDIAGFN) June 25, 2025
    BLOGS.NVIDIA.COM
  • Ah, the glorious return of the zine! Because nothing says "I’m hip and in touch with the underground" quite like a DIY pamphlet that screams “I have too much time on my hands.” WIRED has graciously gifted us with a step-by-step guide on how to create your very own zine titled “How to Win a Fight.”

    Print. Fold. Share. Download. Sounds easy, right? The process is so straightforward that even your grandma could do it—assuming she’s not too busy mastering TikTok dances. But let’s take a moment to appreciate the sheer audacity of needing instructions for something as inherently chaotic as making a zine. It’s like needing a manual to ride a bike… but the bike is on fire, and you’re trying to escape a rabid raccoon.

    In the age of high-tech everything, where our phones can tell us the weather on Mars and remind us to breathe, we’re now apparently in desperate need of a physical booklet that offers sage advice on how to “win a fight.” Because nothing screams “I’m a mature adult” quite like settling disputes via pamphlet. Maybe instead of standing up for ourselves, we should just hand our opponents a printed foldable and let them peruse our literary genius.

    And let’s not forget the nostalgia factor here! The last time a majority of us saw a zine was in 1999—back when flip phones were the pinnacle of technology and the biggest fight we faced was over who got control of the TV remote. Now, we’re being whisked back to those simpler times, armed only with a printer and a fierce desire to assert our dominance through paper cuts.

    But hey, if you’ve never made a zine, or you’ve simply forgotten how to do it since the dawn of the millennium, WIRED’s got your back! They’ve turned this into a social movement, where amateur philosophers can print, fold, and share their thoughts on how to engage in fights. Because why have a conversation when you can battle with paper instead?

    Let’s be honest: this is all about making “fighting” a trendy topic again. Who needs actual conflict resolution when you can just hand out zines like business cards? Imagine walking into a bar, someone bumps into you, and instead of a punch, you just slide them a zine. “Here’s how to win a fight, buddy. Chapter One: Don’t.”

    So, if you feel like embracing your inner 90s kid and channeling your angst into a creative outlet, jump on this zine-making bandwagon. Who knows? You might just win a fight—against boredom, at least.

    #ZineCulture #HowToWinAFight #DIYProject #NostalgiaTrip #WIRED
    Print. Fold. Share. Download WIRED's How to Win a Fight Zine Here
    Never made a zine? Haven’t made one since 1999? We made one, and so can you.
  • A review of Seduced.ai: can you really customize your fantasies with AI? June 2025. Honestly, it sounds like just another tech gimmick. Seduced.ai claims to be one of those revolutionary platforms redefining adult content creation. But does anyone even care?

    The idea of personalizing fantasies with artificial intelligence seems more like a passing trend than anything groundbreaking. Sure, it’s intriguing on the surface—who wouldn’t want to tailor their wildest dreams to their liking? But then again, does it really make a difference?

    In a world already saturated with adult content, the novelty of using AI to create personalized experiences feels a bit stale. I mean, at the end of the day, it’s still just content. The article discusses how Seduced.ai aims to engage users by offering customizable options. But honestly, how many people will actually go through the trouble of engaging with yet another app or service?

    Let’s be real. Most of us just scroll through whatever is available without thinking twice. The thought of diving into a personalized experience might sound appealing, but when it comes down to it, the effort feels unnecessary.

    Sure, technology is evolving, and Seduced.ai is trying to ride that wave. But for the average user, the excitement seems to fade quickly. The article on REALITE-VIRTUELLE.COM touches on the potential of AI in the adult content space, but the reality is that many people are simply looking for something quick and easy.

    Do we really need to complicate things with AI? Or can we just stick to the basics? Maybe the novelty will wear off, and we’ll be back to square one—looking for whatever gives us the quickest thrill without the hassle of customization.

    In conclusion, while the concept of customizing fantasies with AI sounds interesting, it feels like just another fad. The effort to engage might not be worth it for most of us. After all, who has the energy for all that?

    #SeducedAI #AdultContent #AIFantasy #ContentCreation #TechTrends
    A review of Seduced.ai: can you really customize your fantasies with AI? - June 2025
    Seduced.ai is among the revolutionary platforms redefining adult content creation […] This article, "A review of Seduced.ai: can you really customize your fantasies with AI? - June 2025," was published on REA
  • Ah, the enchanting world of "Beautiful Accessibility"—where design meets a sweet sprinkle of dignity and a dollop of empathy. Isn’t it just delightful how we’ve collectively decided that making things accessible should also be aesthetically pleasing? Because, clearly, having a ramp that doesn’t double as a modern art installation would be just too much to ask.

    Gone are the days when accessibility was seen as a dull, clunky afterthought. Now, we’re on a quest to make sure that every wheelchair ramp looks like it was sculpted by Michelangelo himself. Who needs functionality when you can have a piece of art that also serves as a means of entry? You know, it’s almost like we’re saying, “Why should people who need help have to sacrifice beauty for practicality?”

    Let’s talk about that “rigid, rough, and unfriendly” stereotype of accessibility. Sure, it’s easy to dismiss these concerns. Just slap a coat of trendy paint on a handrail and voilà! You’ve got a “beautifully accessible” structure that’s just as likely to send someone flying off the side as it is to help them reach the door. But hey, at least it’s pretty to look at as they tumble—right?

    And let’s not overlook the underlying question: for whom are we really designing? Is it for the people who need accessibility, or is it for the fleeting approval of the Instagram crowd? If it’s the latter, then congratulations! You’re on the fast track to a trend that will inevitably fade faster than last season’s fashion. Remember, folks, the latest hashtag isn’t ‘#AccessibilityForAll’; it’s ‘#AccessibilityIsTheNewBlack,’ and we all know how long that lasts in the fickle world of social media.

    Now, let’s sprinkle in some empathy, shall we? Because nothing says “I care” quite like a designer who has spent five minutes contemplating the plight of those who can’t navigate the “avant-garde” staircase that serves no purpose other than to look chic in a photo. Empathy is key, but please, let’s not take it too far. After all, who has time to engage deeply with real human needs when there’s a dazzling design competition to win?

    So, as we stand at the crossroads of functionality and aesthetics, let’s all raise a glass to the idea of "Beautiful Accessibility." May it forever remain beautifully ironic and, of course, aesthetically pleasing—after all, what’s more dignified than a thoughtfully designed ramp that looks like it belongs in a museum, even if it makes getting into that museum a bit of a challenge?

    #BeautifulAccessibility #DesignWithEmpathy #AccessibilityMatters #DignityInDesign #IronyInAccessibility
    Beautiful accessibility: designing for dignity and building with empathy
    More than a technique or a guide of best practices, beautiful accessibility is an attitude. It means reflecting on and questioning why, how and for whom we design. Accessibility is often perceived as something rigid, rough and unfriendly, aestheticall
  • Why does the world of animation, particularly at events like the SIGGRAPH Electronic Theater, continue to suffer from mediocrity? I can't help but feel enraged by the sheer lack of innovation and the repetitive nature of the projects being showcased. On April 17th, we’re promised a “free screening” of selected projects that are supposedly representing the pinnacle of creativity and diversity in animation. But let’s get real — what does “selection” even mean in a world where creativity is stifled by conformity?

    Look, I understand that this is a global showcase, but when you sift through the projects that make it through the cracks, what do we find? Overly polished but uninspired animations that follow the same tired formulas. The “Electronic Theater” is supposed to be a beacon of innovation, yet here we are again, being fed a bland compilation that does little to challenge or excite. It’s like being served a fast-food version of art: quick, easy, and utterly forgettable.

    The call for diversity is also a double-edged sword. Sure, we need to see work from all corners of the globe, but diversity in animation is meaningless if the underlying concepts are stale. It’s not enough to tick boxes and say, “Look how diverse we are!” when the actual content fails to push boundaries. Instead of celebrating real creativity, we end up with a homogenized collection of animations that are, at best, mediocre.

    And let’s talk about the timing of this event. April 17th? Are we really thinking this through? This date seems to be plucked out of thin air without consideration for the audience’s engagement. Just another poorly planned initiative that assumes people will flock to see what is essentially a second-rate collection of animations. Is this really the best you can do, Montpellier ACM SIGGRAPH? Where is the excitement? Where is the passion?

    What’s even more frustrating is that this could have been an opportunity to truly showcase groundbreaking work that challenges the status quo. Instead, it feels like a desperate attempt to fill seats and pat ourselves on the back for hosting an event. Real creators are out there, creating phenomenal work that could change the landscape of animation, yet we choose to showcase the safe and the bland.

    It’s time to demand more from events like SIGGRAPH. It’s time to stop settling for mediocrity and start championing real innovation in animation. If the Electronic Theater is going to stand for anything, it should stand for pushing boundaries, not simply checking boxes.

    Let’s not allow ourselves to be content with what we’re served. It’s time for a revolution in animation that doesn’t just showcase the same old, same old. We deserve better, and the art community deserves better.

    #AnimationRevolution
    #SIGGRAPH2024
    #CreativityMatters
    #DiversityInAnimation
    #ChallengeTheNorm
    Free screening: SIGGRAPH's Electronic Theater, April 17!
    Weren't at SIGGRAPH last summer? Montpellier ACM SIGGRAPH has thought of you, and is organizing a free screening this Thursday, April 17, of the projects selected for the Electronic Theater 2024, the animation festival of SI
  • In the stillness of the night, I often find myself reflecting on the weight of solitude that has become my constant companion. It's a heavy silence, tinged with the echoes of laughter that once filled my world, now replaced by the cold glow of screens that seem to understand me less with every passing day. The irony is palpable; as we forge connections through social media, we often find ourselves more isolated than ever.

    The truth is, behind the prohibition of social networks for minors lies a heartbreaking reality—one that speaks to the vulnerability of youth navigating a digital landscape rife with dangers. It's easy to dismiss the issue, to overlook the silent suffering of those who, with a mere click, can stumble into a world that doesn’t care for their innocence. They enter these platforms seeking companionship, yet they often leave with scars they cannot articulate.

    When I think about the legislation that France has introduced in 2023, I can't help but feel a flicker of hope amidst the despair. Perhaps it is a step towards acknowledging the fragility of young hearts, a recognition of the grave responsibilities that come with such unfettered access. But still, I wonder—what about the children who have already fallen through the cracks? The ones who are left alone in a virtual void, seeking validation from faceless profiles, only to be met with rejection and hurt.

    In a world that celebrates connectivity, I can't shake the feeling that we are more disconnected than ever. Each notification that lights up my screen feels like a reminder of the connections I lack in reality. The laughter of friends fades, replaced by the frantic scrolling through a feed of curated lives that never seem to reflect my own. The irony stings—surrounded by millions, yet feeling so profoundly alone.

    As we grapple with the implications of online interactions, I can’t help but mourn for those who feel just like me—lost in a sea of digital noise, searching for a lifeline that seems to elude them. The question remains: what is the cost of this digital freedom? Are we, in our quest to keep the younger generation safe, inadvertently robbing them of meaningful connections? Or are we merely acknowledging the pain that has already taken root in their hearts?

    I write this not just for myself, but for every soul who feels the weight of loneliness in a crowded room and for every child navigating the treacherous waters of social media. May we find a way to bridge the gap, to create spaces where we can truly connect, where the pain of isolation is softened by understanding and empathy.

    #Loneliness #SocialMedia #YouthProtection #DigitalIsolation #Heartbreak
    What’s behind banning social networks for minors?
    For years, social networks have treated the question of a user’s age with a lightness bordering on farce. A single click was enough to get in. Many minors entered, without difficulty, platforms designed for adults that didn’t even consider s
  • Why is it so hard for people to grasp the absolute necessity of setting up 301 redirects in an .htaccess file? Honestly, it’s infuriating! We’re in a digital age where every click counts, and yet, so many website owners continue to neglect this vital aspect of web management. Why? Because they’re either too lazy to learn or they just don’t care about preserving their ranking authority!

    Let’s get one thing straight: if you think you can just change URLs and your content magically stays relevant, you’re living in a fantasy world! When you fail to implement 301 redirects properly, you’re not just risking your SEO; you’re throwing away all the hard work you’ve put into building your online presence. It’s like setting fire to a pile of money because you couldn’t be bothered to use a fire extinguisher. Ridiculous!

    The process of adding 301 redirects in .htaccess files is straightforward. It’s not rocket science, people! You have two methods at your disposal, and yet countless websites are still losing traffic and authority daily because their owners can’t figure it out. You would think that in a realm where every detail matters, folks would prioritize understanding how to maintain their site’s integrity. But no! Instead, they leave vulnerable sites, confused visitors, and plunging search rankings in their wake.

    If you’re still scratching your head over how to set up 301 redirects in an .htaccess file, wake up! The first method is simply to use the `RedirectPermanent` directive. It’s right there for you, and it’s as easy as pie. You just need to specify the old URL and the new URL, and boom! You’re done. Or, if you’re feeling fancy, the second method involves using the `RewriteRule` directive. Again, it’s not complicated! Just a few lines of code, and you’re on your way to preserving that precious ranking authority.
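    To make the two methods concrete, here is a minimal .htaccess sketch. The paths and the example.com domain are placeholders, not anything from the original post; only the `RedirectPermanent` (mod_alias) and `RewriteRule` (mod_rewrite) directives themselves are real Apache features:

```apache
# Method 1: mod_alias. RedirectPermanent is shorthand for "Redirect permanent"
# and maps a single old URL-path to its new absolute URL with a 301 status.
RedirectPermanent /old-page.html https://www.example.com/new-page.html

# Method 2: mod_rewrite. Pattern-based redirects for whole URL families.
RewriteEngine On
# In per-directory (.htaccess) context the leading slash is stripped,
# so the pattern starts at "blog/". $1 carries the captured slug over.
RewriteRule ^blog/(.*)$ /articles/$1 [R=301,L]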

    What’s more infuriating is when people rush into updating their websites without even considering the fallout of their actions. Do you think Google is going to give you a free pass for being reckless? No! It will punish you for not taking the necessary precautions. Imagine losing all that traffic you worked so hard to get, just because you couldn’t be bothered to set up a simple redirect. Pathetic!

    Let’s not even begin to talk about the customer experience. When users click on a link and end up on a 404 error page because you didn’t implement a 301 redirect, that’s a surefire way to lose their trust and business. Do you really want to be known as the website that provides a dead-end for visitors? Absolutely not! So, for the love of all that is holy in the digital world, get your act together and learn how to set up those redirects!

    In conclusion, if you’re still ignoring the importance of 301 redirects in your .htaccess file, you’re not just being negligent; you’re actively sabotaging your own success. Stop making excuses, roll up your sleeves, and do what needs to be done. Your website deserves better!

    #301Redirects #SEO #WebManagement #DigitalMarketing #htaccess
    How to Set Up 301 Redirects in an .htaccess File
    Adding 301 redirects in .htaccess files is useful to preserve ranking authority. Here are two methods.