• Plug and Play: Build a G-Assist Plug-In Today

    Project G-Assist — available through the NVIDIA App — is an experimental AI assistant that helps tune, control and optimize NVIDIA GeForce RTX systems.
    NVIDIA’s Plug and Play: Project G-Assist Plug-In Hackathon — running virtually through Wednesday, July 16 — invites the community to explore AI and build custom G-Assist plug-ins for a chance to win prizes and be featured on NVIDIA social media channels.

    G-Assist allows users to control their RTX GPU and other system settings using natural language, thanks to a small language model that runs on device. It can be used from the NVIDIA Overlay in the NVIDIA App without needing to tab out or switch programs. Users can expand its capabilities via plug-ins and even connect it to agentic frameworks such as Langflow.
    Below, find popular G-Assist plug-ins, hackathon details and tips to get started.
    Plug-In and Win
    Join the hackathon by registering and checking out the curated technical resources.
    G-Assist plug-ins can be built in several ways, including with Python for rapid development, with C++ for performance-critical apps and with custom system interactions for hardware and operating system automation.
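    For a sense of what the Python path involves, below is a minimal, hypothetical sketch of a plug-in entry point that reads JSON commands and returns JSON responses. The function name, message fields and command loop are illustrative assumptions only; the real plug-in interface, file layout (manifest.json, config.json, plug-in executable) and communication protocol are defined by the samples and documentation in NVIDIA's GitHub repository.

    # Hypothetical plug-in entry point (illustrative only; not NVIDIA's actual G-Assist API).
    # A real plug-in ships with manifest.json (declaring its functions), config.json and an executable.
    import json
    import sys

    def handle_command(function: str, params: dict) -> dict:
        # Dispatch a function call; "get_fan_speed" is a made-up example name.
        if function == "get_fan_speed":
            return {"success": True, "message": "Fan speed: 1,450 RPM"}  # placeholder value
        return {"success": False, "message": f"Unknown function: {function}"}

    def main() -> None:
        # Read one JSON command per line from stdin, write one JSON response per line to stdout.
        for line in sys.stdin:
            line = line.strip()
            if not line:
                continue
            request = json.loads(line)
            reply = handle_command(request.get("function", ""), request.get("params", {}))
            sys.stdout.write(json.dumps(reply) + "\n")
            sys.stdout.flush()

    if __name__ == "__main__":
        main()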
    For those that prefer vibe coding, the G-Assist Plug-In Builder — a ChatGPT-based app that allows no-code or low-code development with natural language commands — makes it easy for enthusiasts to start creating plug-ins.
    To submit an entry, participants must provide a GitHub repository including a source code file (plugin.py), requirements.txt, manifest.json, config.json (if applicable), a plug-in executable file and a README.
    Then, submit a video — between 30 seconds and two minutes — showcasing the plug-in in action.
    Finally, hackathoners must promote their plug-in using #AIonRTXHackathon on a social media channel: Instagram, TikTok or X. Submit projects via this form by Wednesday, July 16.
    Judges will assess plug-ins based on three main criteria: 1) innovation and creativity; 2) technical execution and integration, covering technical depth, G-Assist integration and scalability; and 3) usability and community impact, that is, how easy the plug-in is to use.
    Winners will be selected on Wednesday, Aug. 20. First place will receive a GeForce RTX 5090 laptop, second place a GeForce RTX 5080 GPU and third a GeForce RTX 5070 GPU. These top three will also be featured on NVIDIA’s social media channels, get the opportunity to meet the NVIDIA G-Assist team and earn an NVIDIA Deep Learning Institute self-paced course credit.
    Project G-Assist requires a GeForce RTX 50, 40 or 30 Series desktop GPU with at least 12GB of VRAM, Windows 11 or 10, a compatible CPU (Intel Pentium G Series, Core i3, i5, i7 or higher; AMD FX, Ryzen 3, 5, 7, 9, Threadripper or higher), adequate disk space and a recent GeForce Game Ready Driver or NVIDIA Studio Driver.
    Plug-In(spiration)
    Explore open-source plug-in samples available on GitHub, which showcase the diverse ways on-device AI can enhance PC and gaming workflows.

    Popular plug-ins include:

    Google Gemini: Enables real-time search-based queries using Google Search integration and large language model-based queries using Gemini capabilities, all from the convenience of the NVIDIA App Overlay without needing to switch programs.
    Discord: Enables users to easily share game highlights or messages directly to Discord servers without disrupting gameplay.
    IFTTT: Lets users create automations across hundreds of compatible endpoints to trigger IoT routines — such as adjusting room lights and smart shades, or pushing the latest gaming news to a mobile device.
    Spotify: Lets users control Spotify using simple voice commands or the G-Assist interface to play favorite tracks and manage playlists.
    Twitch: Checks whether a Twitch streamer is currently live and retrieves detailed stream information such as titles, games, view counts and more (see the sketch below).
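    As a standalone illustration of what such a live check involves (not the plug-in's actual implementation), here is a short Python sketch against Twitch's public Helix API; the client ID and app access token are placeholders you would obtain from a Twitch developer application.

    import requests  # third-party HTTP library

    def check_stream(login: str, client_id: str, app_token: str) -> None:
        # Query the Helix streams endpoint; an empty "data" list means the channel is offline.
        resp = requests.get(
            "https://api.twitch.tv/helix/streams",
            params={"user_login": login},
            headers={"Client-ID": client_id, "Authorization": f"Bearer {app_token}"},
            timeout=10,
        )
        resp.raise_for_status()
        streams = resp.json().get("data", [])
        if streams:
            s = streams[0]
            print(f"{login} is LIVE: {s['title']} | {s['game_name']} | {s['viewer_count']} viewers")
        else:
            print(f"{login} is offline.")

    # Example usage with placeholder credentials:
    # check_stream("some_streamer", "YOUR_CLIENT_ID", "YOUR_APP_ACCESS_TOKEN")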

    Get G-Assist(ance)
    Join the NVIDIA Developer Discord channel to collaborate, share creations and gain support from fellow AI enthusiasts and NVIDIA staff.
    Save the date for NVIDIA’s How to Build a G-Assist Plug-In webinar on Wednesday, July 9, from 10-11 a.m. PT, to learn more about Project G-Assist capabilities, discover the fundamentals of building, testing and deploying Project G-Assist plug-ins, and participate in a live Q&A session.
    Explore NVIDIA’s GitHub repository, which provides everything needed to get started developing with G-Assist, including sample plug-ins, step-by-step instructions and documentation for building custom functionalities.
    Learn more about the ChatGPT Plug-In Builder to transform ideas into functional G-Assist plug-ins with minimal coding. The tool uses OpenAI’s custom GPT builder to generate plug-in code and streamline the development process.
    NVIDIA’s technical blog walks through the architecture of a G-Assist plug-in, using a Twitch integration as an example. Discover how plug-ins work, how they communicate with G-Assist and how to build them from scratch.
    Each week, the RTX AI Garage blog series features community-driven AI innovations and content for those looking to learn more about NVIDIA NIM microservices and AI Blueprints, as well as building AI agents, creative workflows, digital humans, productivity apps and more on AI PCs and workstations. 
    Plug in to NVIDIA AI PC on Facebook, Instagram, TikTok and X — and stay informed by subscribing to the RTX AI PC newsletter.
    Follow NVIDIA Workstation on LinkedIn and X. 
    See notice regarding software product information.
  • One of the Most Iconic Shonen Jump Series Is Finally Getting a New Anime Adaptation

    Fist of the North Star, also known as Hokuto no Ken, one of the most iconic and classic Shonen Jump series, is returning next year with a new anime. The social media account for the series has just confirmed that the new anime is coming sooner than fans expected.
  • The Unwritten Rules of Death Stranding 2 Explained

    Death Stranding 2: On the Beach is finally here, and making some massive waves at that. Much of this is due to how much it has committed to improving on the formula of the original, although this has involved streamlining a lot of its systems and all but giving the people what they want. Nevertheless, Death Stranding 2 is already proving to be a fulfilling continuation of the first game's legacy.
  • Startup Uses NVIDIA RTX-Powered Generative AI to Make Coolers, Cooler

    Mark Theriault founded the startup FITY envisioning a line of clever cooling products: cold drink holders that come with freezable pucks to keep beverages cold for longer without the mess of ice. The entrepreneur started with 3D prints of products in his basement, building one unit at a time, before eventually scaling to mass production.
    Founding a consumer product company from scratch was a tall order for a single person. Going from preliminary sketches to production-ready designs was a major challenge. To bring his creative vision to life, Theriault relied on AI and his NVIDIA GeForce RTX-equipped system. For him, AI isn’t just a tool — it’s an entire pipeline to help him accomplish his goals. Read more about his workflow below.
    Plus, GeForce RTX 5050 laptops start arriving today at retailers worldwide, from $999. GeForce RTX 5050 Laptop GPUs feature 2,560 NVIDIA Blackwell CUDA cores, fifth-generation AI Tensor Cores, fourth-generation RT Cores, a ninth-generation NVENC encoder and a sixth-generation NVDEC decoder.
    In addition, NVIDIA’s Plug and Play: Project G-Assist Plug-In Hackathon — running virtually through Wednesday, July 16 — invites developers to explore AI and build custom G-Assist plug-ins for a chance to win prizes. Save the date for the G-Assist Plug-In webinar on Wednesday, July 9, from 10-11 a.m. PT, to learn more about Project G-Assist capabilities and fundamentals, and to participate in a live Q&A session.
    From Concept to Completion
    To create his standout products, Theriault tinkers with potential FITY Flex cooler designs with traditional methods, from sketch to computer-aided design to rapid prototyping, until he finds the right vision. A unique aspect of the FITY Flex design is that it can be customized with fun, popular shoe charms.
    For packaging design inspiration, Theriault uses his preferred text-to-image generative AI model for prototyping, Stable Diffusion XL — which runs 60% faster with the NVIDIA TensorRT software development kit — using the modular, node-based interface ComfyUI.
    ComfyUI gives users granular control over every step of the generation process — prompting, sampling, model loading, image conditioning and post-processing. It’s ideal for advanced users like Theriault who want to customize how images are generated.
    Theriault’s uses of AI result in a complete computer graphics-based ad campaign. Image courtesy of FITY.
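    For readers who want to try this kind of rapid text-to-image prototyping outside ComfyUI, a minimal sketch using the Hugging Face diffusers library with Stable Diffusion XL is shown below; it assumes a CUDA-capable RTX GPU, uses an example prompt, and omits the TensorRT acceleration and node graph Theriault works with.

    import torch
    from diffusers import StableDiffusionXLPipeline  # pip install diffusers transformers accelerate

    # Load SDXL in half precision onto the GPU.
    pipe = StableDiffusionXLPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
    ).to("cuda")

    # Generate a packaging-concept image from a text prompt (prompt text is a made-up example).
    image = pipe(
        prompt="retail packaging concept for a sleek drink cooler, studio lighting, product photo",
        num_inference_steps=30,
        guidance_scale=7.0,
    ).images[0]
    image.save("packaging_concept.png")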
    NVIDIA and GeForce RTX GPUs based on the NVIDIA Blackwell architecture include fifth-generation Tensor Cores designed to accelerate AI and deep learning workloads. These GPUs work with CUDA optimizations in PyTorch to seamlessly accelerate ComfyUI, reducing generation time on FLUX.1-dev, an image generation model from Black Forest Labs, from two minutes per image on the Mac M3 Ultra to about four seconds on the GeForce RTX 5090 desktop GPU.
    ComfyUI can also add ControlNets — AI models that help control image generation — that Theriault uses for tasks like guiding human poses, setting compositions via depth mapping and converting scribbles to images.
    Theriault even creates his own fine-tuned models to keep his style consistent. He used low-rank adaptation (LoRA) models — small, efficient adapters inserted into specific layers of the network — enabling hyper-customized generation with minimal compute cost.
    LoRA models allow Theriault to ideate on visuals quickly. Image courtesy of FITY.
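    Conceptually, a LoRA leaves the original weight matrix frozen and learns a small low-rank update alongside it. The PyTorch sketch below is a generic illustration of that idea, not the training setup Theriault used; the rank and scaling values are arbitrary examples.

    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        # Wraps a frozen linear layer with a trainable low-rank update: y = W x + (alpha / r) * B (A x).
        def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
            super().__init__()
            self.base = base
            for p in self.base.parameters():
                p.requires_grad = False                      # original weights stay frozen
            self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
            self.B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: no change at start
            self.scale = alpha / r

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.base(x) + self.scale * (x @ self.A.T) @ self.B.T

    # Only A and B are trained, so the adapter adds a tiny fraction of the model's parameters.
    layer = LoRALinear(nn.Linear(768, 768))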
    “Over the last few months, I’ve been shifting from AI-assisted computer graphics renders to fully AI-generated product imagery using a custom Flux LoRA I trained in house. My RTX 4080 SUPER GPU has been essential for getting the performance I need to train and iterate quickly.” – Mark Theriault, founder of FITY 

    Theriault also taps into generative AI to create marketing assets like FITY Flex product packaging. He uses FLUX.1, which excels at generating legible text within images, addressing a common challenge in text-to-image models.
    Though FLUX.1 models can typically consume over 23GB of VRAM, NVIDIA has collaborated with Black Forest Labs to help reduce the size of these models using quantization — a technique that reduces model size while maintaining quality. The models were then accelerated with TensorRT, which provides an up to 2x speedup over PyTorch.
    To simplify using these models in ComfyUI, NVIDIA created the FLUX.1 NIM microservice, a containerized version of FLUX.1 that can be loaded in ComfyUI and enables FP4 quantization and TensorRT support. Combined, the models come down to just over 11GB of VRAM, and performance improves by 2.5x.
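    For rough intuition, assuming the FLUX.1-dev transformer holds roughly 12 billion parameters, storing them at 16 bits per weight takes about 24GB, while FP4 storage takes about 6GB; the text encoders and VAE account for much of the remaining footprint in the figures above.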
    Theriault uses Blender’s Cycles renderer to render out final files. For 3D workflows, NVIDIA offers the AI Blueprint for 3D-guided generative AI to ease the positioning and composition of 3D images, so anyone interested in this method can quickly get started.
    Photorealistic renders. Image courtesy of FITY.
    Finally, Theriault uses large language models to generate marketing copy — tailored for search engine optimization, tone and storytelling — as well as to complete his patent and provisional applications, work that usually costs thousands of dollars in legal fees and considerable time.
    Generative AI helps Theriault create promotional materials like the above. Image courtesy of FITY.
    “As a one-man band with a ton of content to generate, having on-the-fly generation capabilities for my product designs really helps speed things up.” – Mark Theriault, founder of FITY

    Every texture, every word, every photo, every accessory was a micro-decision, Theriault said. AI helped him survive the “death by a thousand cuts” that can stall solo startup founders, he added.
    Each week, the RTX AI Garage blog series features community-driven AI innovations and content for those looking to learn more about NVIDIA NIM microservices and AI Blueprints, as well as building AI agents, creative workflows, digital humans, productivity apps and more on AI PCs and workstations. 
    Plug in to NVIDIA AI PC on Facebook, Instagram, TikTok and X — and stay informed by subscribing to the RTX AI PC newsletter.
    Follow NVIDIA Workstation on LinkedIn and X. 
    See notice regarding software product information.
  • Monster Hunter Wilds’ second free title update brings fierce new monsters and more June 30

    New monsters, features, and more arrive in the Forbidden Lands with Free Title Update 2, dropping in Monster Hunter Wilds on June 30! Watch the latest trailer for a look at what awaits you.

    Monster Hunter Wilds – Free Title Update 2

    In addition to what’s featured in the trailer, Free Title Update 2 will also feature improvements and adjustments to various aspects of the game. Make sure to check the official Monster Hunter Wilds website for a new Director’s Letter from Game Director Yuya Tokuda coming soon, for a deeper dive into what’s coming in addition to the core new monsters and features.

    ● The Leviathan, Lagiacrus, emerges at last

    The long-awaited Leviathan, Lagiacrus, has finally appeared in Monster Hunter Wilds! Floating at the top of the aquatic food chain, Lagiacrus is a master of the sea, boiling the surrounding water by emitting powerful currents of electricity. New missions to hunt Lagiacrus will become available for hunters at Hunter Rank 31 or above, and after clearing the “A World Turned Upside Down” main mission, and the “Forest Doshaguma” side mission.

    While you’ll fight Lagiacrus primarily on land, your hunt against this formidable foe can also take you deep underwater for a special encounter, where it feels most at home. During the underwater portion of the hunt, hunters won’t be able to use their weapons freely, but there are still ways to fight back and turn the tide of battle. Stay alert for your opportunities!

    Hunt Lagiacrus to obtain materials for new hunter and Palico armor! As usual, these sets can be used as layered armor as well.

    ● The Flying Wyvern, Seregios, strikes

    Shining golden bright, the flying wyvern, Seregios, swoops into the Forbidden Lands with Free Title Update 2! Seregios is a highly mobile aerial monster that fires sharp bladescales, inflicting bleeding status on hunters. Keep an eye on your health and bring along rations and well-done steak when hunting this monster. Missions to hunt Seregios are available for hunters at HR 31 or above who have cleared the “A World Turned Upside Down” main mission.

    New hunter and Palico armor forged from Seregios materials awaits you!

    For hunters looking for a greater challenge, 8★ Tempered Lagiacrus and Seregios will begin appearing for hunters at HR 41 or higher, after completing their initial missions. Best of luck against these powerful monsters!

    Hunt in style with layered weapons

    With Free Title Update 2, hunters will be able to use Layered Weapons, which lets you use the look of any weapon, while keeping the stats and abilities of another.

    To unlock a weapon design as a Layered Weapon option, you’ll need to craft the final weapon in that weapon’s upgrade tree. Artian Weapons can be used as layered weapons by fully reinforcing a Rarity 8 Artian weapon.

    For weapons that change in appearance when upgraded, you’ll also have the option to use their pre-upgrade designs as well! You can also craft layered Palico weapons by forging their high-rank weapons. We hope this feature encourages you to delve deeper into crafting the powerful Artian Weapon you’ve been looking for, all while keeping the appearance of your favorite weapon.

    New optional features

    Change your choice of handler accompanying you in the field to Eric after completing the Lagiacrus mission in Free Title Update 2! You can always switch back to Alma too, but it doesn’t hurt to give our trusty handler a break from time to time.

    A new Support Hunter joins the fray

    Mina, a support hunter who wields a Sword & Shield, joins the hunt. With Free Title Update 2, you’ll be able to choose which support hunters can join you on quests.

    Photo Mode Improvements

    Snap even more creative photos of your hunts with some new options, including an Effects tab to adjust brightness and filter effects, and a Character Display tab to toggle off your Handler, Palico, Seikret, and more.

    Celebrate summer with the Festival of Accord: Flamefete seasonal event

    The next seasonal event in Monster Hunter Wilds, the Festival of Accord: Flamefete, will take place in the Grand Hub from July 23 to August 6! Cool off with this summer-themed celebration, where you can obtain new armor, gestures, and pop-up camp decorations for a limited time. You’ll also be able to eat special seasonal event meals and enjoy the fun of summer as the Grand Hub and all its members will be dressed to mark the occasion.

    Arch-Tempered Uth Duna slams down starting July 30

    Take on an even more powerful version of Uth Duna when Arch-Tempered Uth Duna arrives as an Event Quest and Free Challenge Quest from July 30 to August 20! Face and defeat the challenging apex of the Scarlet Forest to obtain materials for crafting the new Uth Duna γ hunter armor set and the Felyne Uth Duna γ Palico armor set. Be sure you’re HR 50 or above before taking on this quest.

    We’ve also got plenty of new Event Quests on the way in the weeks ahead, including some where you can earn new special equipment, quests to obtain more armor spheres, and challenge quests against Mizutsune. Be sure to keep checking back each week to see what’s new!

    A special collaboration with Fender

    Monster Hunter Wilds is collaborating with world-renowned guitar brand Fender®! From August 27 to September 24, a special Event Quest will be available to earn a collaboration gesture that lets you rock out with the Monster Hunter Rathalos Telecaster®.

    In celebration of Monster Hunter’s 20th anniversary, the globally released Monster Hunter Rathalos Telecaster® collaboration guitar is making its way into the game! Be sure to experience it both in-game and in real life!

    A new round of cosmetic DLC arrives

    Express your style with additional DLC, including four free dance gestures. Paid cosmetic DLC, such as gestures, stickers, pendants, and more will also be available. If you’ve purchased the Premium Deluxe Edition of Monster Hunter Wilds or the Cosmetic DLC Pass, Cosmetic DLC Pack 2 and other additional items will be available to download when Free Title Update 2 releases. 

    Free Title Update roadmap

    We hope you’re excited to dive into all the content coming with Free Title Update 2! We’ll continue to release updates, with Free Title Update 3 coming at the end of September. Stay tuned for more details to come.

    A Monster Hunter Wilds background is added to the PS5 Welcome hub

    Alongside Free Title Update 2 on June 30, an animated background featuring the hunters facing Arkveld during the Inclemency will be added to the Welcome hub. Customize your PS5 Welcome hub with Monster Hunter Wilds to get you in the hunting mood.

    How to change the background: Welcome hub -> Change background -> Games

    Try out Monster Hunter Wilds on PS5 with a PlayStation Plus Premium Game Trial starting on June 30

    With the Game Trial, you can try out the full version of the game for 2 hours. If you decide to purchase the full version after the trial, your save data will carry over, allowing you to continue playing seamlessly right where you left off. If you haven’t played Monster Hunter Wilds yet, this is a great way to give it a try.

    Happy Hunting!
    #monster #hunter #wilds #second #free
    Monster Hunter Wilds’ second free title update brings fierce new monsters and more June 30
    New monsters, features, and more arrive in the Forbidden Lands with Free Title Update 2, dropping in Monster Hunter Wilds on June 30! Watch the latest trailer for a look at what awaits you. Play Video Monster Hunter Wilds – Free Title Update 2 In addition to what’s featured in the trailer, Free Title Update 2 will also feature improvements and adjustments to various aspects of the game. Make sure to check the official Monster Hunter Wilds website for a new Director’s Letter from Game Director Yuya Tokuda coming soon, for a deeper dive into what’s coming in addition to the core new monsters and features. ● The Leviathan, Lagiacrus, emerges at last View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image The long-awaited Leviathan, Lagiacrus, has finally appeared in Monster Hunter Wilds! Floating at the top of the aquatic food chain, Lagiacrus is a master of the sea, boiling the surrounding water by emitting powerful currents of electricity. New missions to hunt Lagiacrus will become available for hunters at Hunter Rank 31 or above, and after clearing the “A World Turned Upside Down” main mission, and the “Forest Doshaguma” side mission. While you’ll fight Lagiacrus primarily on land, your hunt against this formidable foe can also take you deep underwater for a special encounter, where it feels most at home. During the underwater portion of the hunt, hunters won’t be able to use their weapons freely, but there are still ways to fight back and turn the tide of battle. Stay alert for your opportunities! Hunt Lagiacrus to obtain materials for new hunter and Palico armor! As usual, these sets can be used as layered armor as well. ● The Flying Wyvern, Seregios, strikes View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image Shining golden bright, the flying wyvern, Seregios, swoops into the Forbidden Lands with Free Title Update 2! Seregios is a highly mobile aerial monster that fires sharp bladescales, inflicting bleeding status on hunters. Keep an eye on your health and bring along rations and well-done steak when hunting this monster. Missions to hunt Seregios are available for hunters at HR 31 or above that have cleared the “A World Turned Upside Down” main mission. New hunter and Palico armor forged from Seregios materials awaits you! For hunters looking for a greater challenge, 8★ Tempered Lagiacrus and Seregios will begin appearing for hunters at HR 41 or higher, after completing their initial missions. Best of luck against these powerful monsters! Hunt in style with layered weapons With Free Title Update 2, hunters will be able to use Layered Weapons, which lets you use the look of any weapon, while keeping the stats and abilities of another. To unlock a weapon design as a Layered Weapon option, you’ll need to craft the final weapon in that weapon’s upgrade tree. Artian Weapons can be used as layered weapons by fully reinforcing a Rarity 8 Artian weapon. For weapons that change in appearance when upgraded, you’ll also have the option to use their pre-upgrade designs as well! You can also craft layered Palico weapons by forging their high-rank weapons. 
We hope this feature encourages you to delve deeper into crafting the powerful Artian Weapon you’ve been looking for, all while keeping the appearance of your favorite weapon. New optional features Change your choice of handler accompanying you in the field to Eric after completing the Lagiacrus mission in Free Title Update 2! You can always switch back to Alma too, but it doesn’t hurt to give our trusty handler a break from time to time. A new Support Hunter joins the fray Mina, a support hunter who wields a Sword & Shield, joins the hunt. With Free Title Update 2, you’ll be able to choose which support hunters can join you on quests. Photo Mode Improvements Snap even more creative photos of your hunts with some new options, including an Effects tab to adjust brightness and filter effects, and a Character Display tab to toggle off your Handler, Palico, Seikret, and more. Celebrate summer with the Festival of Accord: Flamefete seasonal event View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image The next seasonal event in Monster Hunter Wilds, the Festival of Accord: Flamefete, will take place in the Grand Hub from July 23 to August 6! Cool off with this summer themed celebration, where you can obtain new armor, gestures, and pop-up camp decorations for a limited time. You’ll also be able to eat special seasonal event meals and enjoy the fun of summer as the Grand Hub and all it’s members will be dressed to mark the occasion. Arch-Tempered Uth Duna slams down starting July 30 Take on an even more powerful version of Uth Duna when Arch-Tempered Uth Duna arrives as an Event Quest and Free Challenge Quest from July 30 to August 20! Take on and defeat the challenging apex of the Scarlet Forest to obtain materials for crafting the new Uth Duna γ hunter armor set and the Felyne Uth Duna γ Palico armor set. Be sure you’re at least HR 50 or above to take on this quest. We’ve also got plenty of new Event Quests on the way in the weeks ahead, including some where you can earn new special equipment, quests to obtain more armor spheres, and challenge quests against Mizutsune. Be sure to keep checking back each week to see what’s new! A special collaboration with Fender Monster Hunter Wilds is collaborating with world-renowned guitar brand Fender®! From August 27 to September 24, a special Event Quest will be available to earn a collaboration gesture that lets you rock out with the Monster Hunter Rathalos Telecaster®. In celebration of Monster Hunter’s 20th anniversary, the globally released Monster Hunter Rathalos Telecaster® collaboration guitar is making its way into the game! Be sure to experience it both in-game and in real life! A new round of cosmetic DLC arrives View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image Express your style with additional DLC, including four free dance gestures. 
Paid cosmetic DLC, such as gestures, stickers, pendants, and more will also be available. If you’ve purchased the Premium Deluxe Edition of Monster Hunter Wilds or the Cosmetic DLC Pass, Cosmetic DLC Pack 2 and other additional items will be available to download when Free Title Update 2 releases.  Free Title Update roadmap We hope you’re excited to dive into all the content coming with Free Title Update 2! We’ll continue to release updates, with Free Title Update 3 coming at the end of September. Stay tuned for more details to come. A Monster Hunter Wilds background is added to the PS5 Welcome hub Alongside Free Title Update 2 on June 30, an animated background featuring the hunters facing Arkveld during the Inclemency will be added to the Welcome hub. Customize your PS5 Welcome hub with Monster Hunter Wilds to get you in the hunting mood. View and download image Download the image close Close Download this image How to change the backgroundWelcome hub -> Change background -> Games Try out Monster Hunter Wilds on PS5 with a PlayStation Plus Premium Game Trial starting on June 30 View and download image Download the image close Close Download this image With the Game Trial, you can try out the full version of the game for 2 hours. If you decide to purchase the full version after the trial, your save data will carry over, allowing you to continue playing seamlessly right where you left off. If you haven’t played Monster Hunter Wilds yet, this is a great way to give it a try. Happy Hunting! #monster #hunter #wilds #second #free
    BLOG.PLAYSTATION.COM
    Monster Hunter Wilds’ second free title update brings fierce new monsters and more June 30
    New monsters, features, and more arrive in the Forbidden Lands with Free Title Update 2, dropping in Monster Hunter Wilds on June 30! Watch the latest trailer for a look at what awaits you. Play Video Monster Hunter Wilds – Free Title Update 2 In addition to what’s featured in the trailer, Free Title Update 2 will also feature improvements and adjustments to various aspects of the game. Make sure to check the official Monster Hunter Wilds website for a new Director’s Letter from Game Director Yuya Tokuda coming soon, for a deeper dive into what’s coming in addition to the core new monsters and features. ● The Leviathan, Lagiacrus, emerges at last View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image The long-awaited Leviathan, Lagiacrus, has finally appeared in Monster Hunter Wilds! Floating at the top of the aquatic food chain, Lagiacrus is a master of the sea, boiling the surrounding water by emitting powerful currents of electricity. New missions to hunt Lagiacrus will become available for hunters at Hunter Rank 31 or above, and after clearing the “A World Turned Upside Down” main mission, and the “Forest Doshaguma” side mission. While you’ll fight Lagiacrus primarily on land, your hunt against this formidable foe can also take you deep underwater for a special encounter, where it feels most at home. During the underwater portion of the hunt, hunters won’t be able to use their weapons freely, but there are still ways to fight back and turn the tide of battle. Stay alert for your opportunities! Hunt Lagiacrus to obtain materials for new hunter and Palico armor! As usual, these sets can be used as layered armor as well. ● The Flying Wyvern, Seregios, strikes View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image View and download image Download the image close Close Download this image Shining golden bright, the flying wyvern, Seregios, swoops into the Forbidden Lands with Free Title Update 2! Seregios is a highly mobile aerial monster that fires sharp bladescales, inflicting bleeding status on hunters. Keep an eye on your health and bring along rations and well-done steak when hunting this monster. Missions to hunt Seregios are available for hunters at HR 31 or above that have cleared the “A World Turned Upside Down” main mission. New hunter and Palico armor forged from Seregios materials awaits you! For hunters looking for a greater challenge, 8★ Tempered Lagiacrus and Seregios will begin appearing for hunters at HR 41 or higher, after completing their initial missions. Best of luck against these powerful monsters! Hunt in style with layered weapons With Free Title Update 2, hunters will be able to use Layered Weapons, which lets you use the look of any weapon, while keeping the stats and abilities of another. To unlock a weapon design as a Layered Weapon option, you’ll need to craft the final weapon in that weapon’s upgrade tree. Artian Weapons can be used as layered weapons by fully reinforcing a Rarity 8 Artian weapon. For weapons that change in appearance when upgraded, you’ll also have the option to use their pre-upgrade designs as well! You can also craft layered Palico weapons by forging their high-rank weapons. 
    We hope this feature encourages you to delve deeper into crafting the powerful Artian Weapon you’ve been looking for, all while keeping the appearance of your favorite weapon.
    New optional features
    Change your choice of handler accompanying you in the field to Eric after completing the Lagiacrus mission in Free Title Update 2! You can always switch back to Alma too, but it doesn’t hurt to give our trusty handler a break from time to time.
    A new Support Hunter joins the fray
    Mina, a support hunter who wields a Sword & Shield, joins the hunt. With Free Title Update 2, you’ll be able to choose which support hunters can join you on quests.
    Photo Mode Improvements
    Snap even more creative photos of your hunts with some new options, including an Effects tab to adjust brightness and filter effects, and a Character Display tab to toggle off your Handler, Palico, Seikret, and more.
    Celebrate summer with the Festival of Accord: Flamefete seasonal event
    The next seasonal event in Monster Hunter Wilds, the Festival of Accord: Flamefete, will take place in the Grand Hub from July 23 to August 6! Cool off with this summer-themed celebration, where you can obtain new armor, gestures, and pop-up camp decorations for a limited time. You’ll also be able to eat special seasonal event meals and enjoy the fun of summer as the Grand Hub and all its members dress up to mark the occasion.
    Arch-Tempered Uth Duna slams down starting July 30
    Take on an even more powerful version of Uth Duna when Arch-Tempered Uth Duna arrives as an Event Quest and Free Challenge Quest from July 30 to August 20! Defeat the challenging apex of the Scarlet Forest to obtain materials for crafting the new Uth Duna γ hunter armor set and the Felyne Uth Duna γ Palico armor set. Be sure you’re at least HR 50 before taking on this quest.
    We’ve also got plenty of new Event Quests on the way in the weeks ahead, including some where you can earn new special equipment, quests to obtain more armor spheres, and challenge quests against Mizutsune. Be sure to keep checking back each week to see what’s new!
    A special collaboration with Fender
    Monster Hunter Wilds is collaborating with world-renowned guitar brand Fender®! From August 27 to September 24, a special Event Quest will be available to earn a collaboration gesture that lets you rock out with the Monster Hunter Rathalos Telecaster®. In celebration of Monster Hunter’s 20th anniversary, the globally released Monster Hunter Rathalos Telecaster® collaboration guitar is making its way into the game! Be sure to experience it both in-game and in real life!
    A new round of cosmetic DLC arrives
    Express your style with additional DLC, including four free dance gestures.
    Paid cosmetic DLC, such as gestures, stickers, pendants, and more, will also be available. If you’ve purchased the Premium Deluxe Edition of Monster Hunter Wilds or the Cosmetic DLC Pass, Cosmetic DLC Pack 2 and other additional items will be available to download when Free Title Update 2 releases.
    Free Title Update roadmap
    We hope you’re excited to dive into all the content coming with Free Title Update 2! We’ll continue to release updates, with Free Title Update 3 coming at the end of September. Stay tuned for more details.
    A Monster Hunter Wilds background is added to the PS5 Welcome hub
    Alongside Free Title Update 2 on June 30, an animated background featuring the hunters facing Arkveld during the Inclemency will be added to the Welcome hub. Customize your PS5 Welcome hub with Monster Hunter Wilds to get you in the hunting mood.
    How to change the background: Welcome hub -> Change background -> Games
    Try out Monster Hunter Wilds on PS5 with a PlayStation Plus Premium Game Trial starting on June 30
    With the Game Trial, you can try out the full version of the game for 2 hours. If you decide to purchase the full version after the trial, your save data will carry over, allowing you to continue playing seamlessly right where you left off. If you haven’t played Monster Hunter Wilds yet, this is a great way to give it a try.
    Happy Hunting!
  • HOW DISGUISE BUILT OUT THE VIRTUAL ENVIRONMENTS FOR A MINECRAFT MOVIE

    By TREVOR HOGG

    Images courtesy of Warner Bros. Pictures.

    Rather than a world constructed around photorealistic pixels, a video game created by Markus Persson has taken the boxier 3D voxel route, which has become its signature aesthetic, and sparked an international phenomenon that finally gets adapted into a feature with the release of A Minecraft Movie. Brought onboard to help filmmaker Jared Hess in creating the environments that the cast of Jason Momoa, Jack Black, Sebastian Hansen, Emma Myers and Danielle Brooks find themselves inhabiting was Disguise under the direction of Production VFX Supervisor Dan Lemmon.

    “[A]s the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.”
    —Talia Finlayson, Creative Technologist, Disguise

    Interior and exterior environments had to be created, such as the shop owned by Steve (Jack Black).

    “Prior to working on A Minecraft Movie, I held more technical roles, like serving as the Virtual Production LED Volume Operator on a project for Apple TV+ and Paramount Pictures,” notes Talia Finlayson, Creative Technologist for Disguise. “But as the Senior Unreal Artist within the Virtual Art Department (VAD) on Minecraft, I experienced the full creative workflow. What stood out most was how deeply the VAD was embedded across every stage of production. We weren’t working in isolation. From the production designer and director to the VFX supervisor and DP, the VAD became a hub for collaboration.” The project provided new opportunities. “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance,” notes Laura Bell, Creative Technologist for Disguise. “But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”

    Set designs originally created by the art department in Rhinoceros 3D were transformed into fully navigable 3D environments within Unreal Engine. “These scenes were far more than visualizations,” Finlayson remarks. “They were interactive tools used throughout the production pipeline. We would ingest 3D models and concept art, clean and optimize geometry using tools like Blender, Cinema 4D or Maya, then build out the world in Unreal Engine. This included applying materials, lighting and extending environments. These Unreal scenes we created were vital tools across the production and were used for a variety of purposes such as enabling the director to explore shot compositions, block scenes and experiment with camera movement in a virtual space, as well as passing along Unreal Engine scenes to the visual effects vendors so they could align their digital environments and set extensions with the approved production layouts.”
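
    The ingest-and-cleanup step Finlayson describes is routinely scripted rather than done by hand. As a minimal sketch (assuming a Blender Python environment, and not the production’s actual pipeline), the following bpy snippet merges duplicate vertices and decimates dense geometry before export to Unreal Engine; the collection name and thresholds are hypothetical.

        # Hedged sketch of a pre-Unreal geometry cleanup pass in Blender (bpy).
        # The collection name and numeric thresholds are assumptions for illustration.
        import bpy

        def clean_for_unreal(obj, merge_distance=0.001, decimate_ratio=0.5):
            """Merge coincident vertices and thin out dense meshes before export."""
            bpy.context.view_layer.objects.active = obj
            bpy.ops.object.mode_set(mode='EDIT')
            bpy.ops.mesh.select_all(action='SELECT')
            bpy.ops.mesh.remove_doubles(threshold=merge_distance)  # "Merge by Distance"
            bpy.ops.object.mode_set(mode='OBJECT')
            dec = obj.modifiers.new(name="Decimate", type='DECIMATE')  # added, not applied here
            dec.ratio = decimate_ratio  # keep roughly half of the polygons

        for obj in bpy.data.collections["MidportVillage"].objects:  # hypothetical collection
            if obj.type == 'MESH':
                clean_for_unreal(obj)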

    A virtual exploration of Steve’s shop in Midport Village.

    Certain elements have to be kept in mind when constructing virtual environments. “When building virtual environments, you need to consider what can actually be built, how actors and cameras will move through the space, and what’s safe and practical on set,” Bell observes. “Outside the areas where strict accuracy is required, you want the environments to blend naturally with the original designs from the art department and support the story, creating a space that feels right for the scene, guides the audience’s eye and sets the right tone. Things like composition, lighting and small environmental details can be really fun to work on, but also serve as beautiful additions to help enrich a story.”

    “I’ve always loved the physicality of working with an LED volume, both for the immersion it provides and the way that seeing the environment helps shape an actor’s performance. But for A Minecraft Movie, we used Simulcam instead, and it was an incredible experience to live-composite an entire Minecraft world in real-time, especially with nothing on set but blue curtains.”
    —Laura Bell, Creative Technologist, Disguise

    Among the buildings that had to be created for Midport Village was Steve’s (Jack Black) Lava Chicken Shack.

    Concept art was provided that served as visual touchstones. “We received concept art provided by the amazing team of concept artists,” Finlayson states. “Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging. Sometimes we would also help the storyboard artists by sending through images of the Unreal Engine worlds to help them geographically position themselves in the worlds and aid in their storyboarding.” At times, the video game assets came in handy. “Exteriors often involved large-scale landscapes and stylized architectural elements, which had to feel true to the Minecraft world,” Finlayson explains. “In some cases, we brought in geometry from the game itself to help quickly block out areas. For example, we did this for the Elytra Flight Chase sequence, which takes place through a large canyon.”

    Flexibility was critical. “A key technical challenge we faced was ensuring that the Unreal levels were built in a way that allowed for fast and flexible iteration,” Finlayson remarks. “Since our environments were constantly being reviewed by the director, production designer, DP and VFX supervisor, we needed to be able to respond quickly to feedback, sometimes live during a review session. To support this, we had to keep our scenes modular and well-organized; that meant breaking environments down into manageable components and maintaining clean naming conventions. By setting up the levels this way, we could make layout changes, swap assets or adjust lighting on the fly without breaking the scene or slowing down the process.” Production schedules influence the workflows, pipelines and techniques. “No two projects will ever feel exactly the same,” Bell notes. “For example, Pat Younis [VAD Art Director] adapted his typical VR setup to allow scene reviews using a PS5 controller, which made it much more comfortable and accessible for the director. On a more technical side, because everything was cubes and voxels, my Blender workflow ended up being way heavier on the re-mesh modifier than usual, definitely not something I’ll run into again anytime soon!”
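
    Bell’s note about leaning on the re-mesh modifier corresponds to a simple Blender operation: forcing imported geometry back into a blocky, voxel-like silhouette. Below is a minimal bpy sketch of that idea using the Remesh modifier’s 'BLOCKS' mode; the object name and octree depth are hypothetical, not values from the show.

        # Hedged sketch: voxel-style re-meshing in Blender (bpy) via the Remesh modifier.
        # Object name and settings are illustrative only.
        import bpy

        obj = bpy.data.objects["SteveShop_Blockout"]    # hypothetical object name
        remesh = obj.modifiers.new(name="Remesh", type='REMESH')
        remesh.mode = 'BLOCKS'                  # cube-based remeshing, close to the Minecraft look
        remesh.octree_depth = 6                 # higher depth = smaller cubes, more detail
        remesh.use_remove_disconnected = False  # keep floating pieces such as signage
        bpy.context.view_layer.objects.active = obj
        bpy.ops.object.modifier_apply(modifier=remesh.name)   # bake the blocky result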

    A virtual study and final still of the cast members standing outside of the Lava Chicken Shack.

    “We received concept art provided by the amazing team of concept artists. Not only did they send us 2D artwork, but they often shared the 3D models they used to create those visuals. These models were incredibly helpful as starting points when building out the virtual environments in Unreal Engine; they gave us a clear sense of composition and design intent. Storyboards were also a key part of the process and were constantly being updated as the project evolved. Having access to the latest versions allowed us to tailor the virtual environments to match camera angles, story beats and staging.”
    —Talia Finlayson, Creative Technologist, Disguise

    The design and composition of virtual environments tended to remain consistent throughout principal photography. “The only major design change I can recall was the removal of a second story from a building in Midport Village to allow the camera crane to get a clear shot of the chicken perched above Steve’s lava chicken shack,” Finlayson remarks. “I would agree that Midport Village likely went through the most iterations,” Bell responds. “The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled. I remember rebuilding the stairs leading up to the rampart five or six times, using different configurations based on the physically constructed stairs. This was because there were storyboarded sequences of the film’s characters, Henry, Steve and Garrett, being chased by piglins, and the action needed to match what could be achieved practically on set.”

    Virtually conceptualizing the layout of Midport Village.

    Complex virtual environments were constructed for the final battle and the various forest scenes throughout the movie. “What made these particularly challenging was the way physical set pieces were repurposed and repositioned to serve multiple scenes and locations within the story,” Finlayson reveals. “The same built elements had to appear in different parts of the world, so we had to carefully adjust the virtual environments to accommodate those different positions.” Bell is in agreement with her colleague. “The forest scenes were some of the more complex environments to manage. It could get tricky, particularly when the filming schedule shifted. There was one day on set where the order of shots changed unexpectedly, and because the physical sets looked so similar, I initially loaded a different perspective than planned. Fortunately, thanks to our workflow, Lindsay George [VP Tech] and I were able to quickly open the recorded sequence in Unreal Engine and swap out the correct virtual environment for the live composite without any disruption to the shoot.”

    An example of the virtual and final version of the Woodland Mansion.

    “Midport Village likely went through the most iterations. The archway, in particular, became a visual anchor across different levels. We often placed it off in the distance to help orient both ourselves and the audience and show how far the characters had traveled.”
    —Laura Bell, Creative Technologist, Disguise

    Extensive detail was given to the center of the sets where the main action unfolds. “For these areas, we received prop layouts from the prop department to ensure accurate placement and alignment with the physical builds,” Finlayson explains. “These central environments were used heavily for storyboarding, blocking and department reviews, so precision was essential. As we moved further out from the practical set, the environments became more about blocking and spatial context rather than fine detail. We worked closely with Production Designer Grant Major to get approval on these extended environments, making sure they aligned with the overall visual direction. We also used creatures and crowd stand-ins provided by the visual effects team. These gave a great sense of scale and placement during early planning stages and allowed other departments to better understand how these elements would be integrated into the scenes.”

    Cast members Sebastian Hansen, Danielle Brooks and Emma Myers stand in front of the Earth Portal Plateau environment.

    Doing a virtual scale study of the Mountainside.

    Practical requirements like camera moves, stunt choreography and crane setups had an impact on the creation of virtual environments. “Sometimes we would adjust layouts slightly to open up areas for tracking shots or rework spaces to accommodate key action beats, all while keeping the environment feeling cohesive and true to the Minecraft world,” Bell states. “Simulcam bridged the physical and virtual worlds on set, overlaying Unreal Engine environments onto live-action scenes in real-time, giving the director, DP and other department heads a fully-realized preview of shots and enabling precise, informed decisions during production. It also recorded critical production data like camera movement paths, which was handed over to the post-production team to give them the exact tracks they needed, streamlining the visual effects pipeline.”
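
    The camera-data hand-off Bell mentions can be pictured as a simple serialization problem: per-frame camera transforms recorded on set, written out in a form the visual effects vendors can replay. The sketch below is generic and hypothetical (the field names, units and JSON format are assumptions, not Simulcam’s actual output).

        # Hypothetical sketch of exporting a recorded camera path for post-production.
        import json
        from dataclasses import dataclass, asdict

        @dataclass
        class CameraSample:
            frame: int
            position: tuple        # (x, y, z), e.g. in centimeters
            rotation: tuple        # (pitch, yaw, roll) in degrees
            focal_length_mm: float

        def export_camera_track(samples, path):
            """Write per-frame camera data to JSON for downstream vendors."""
            with open(path, "w") as f:
                json.dump([asdict(s) for s in samples], f, indent=2)

        # Example: two frames of a slow push-in on a locked-off framing.
        track = [
            CameraSample(1001, (0.0, 0.0, 170.0), (0.0, 90.0, 0.0), 35.0),
            CameraSample(1002, (0.0, 5.0, 170.0), (0.0, 90.0, 0.0), 35.0),
        ]
        export_camera_track(track, "shot_camera_track.json")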

    Piglots cause mayhem during the Wingsuit Chase.

    Virtual versions of the exterior and interior of the Safe House located in the Enchanted Woods.

    “One of the biggest challenges for me was managing constant iteration while keeping our environments clean, organized and easy to update,” Finlayson notes. “Because the virtual sets were reviewed regularly by the director and other heads of departments, feedback was often implemented live in the room. This meant the environments had to be flexible. But overall, this was an amazing project to work on, and I am so grateful for the incredible VAD team I was a part of – Heide Nichols [VAD Supervisor], Pat Younis, Jake Tuck [Unreal Artist] and Laura. Everyone on this team worked so collaboratively, seamlessly and in such a supportive way that I never felt like I was out of my depth.” There was another challenge that is more to do with familiarity. “Having a VAD on a film is still a relatively new process in production,” Bell states. “There were moments where other departments were still learning what we did and how to best work with us. That said, the response was overwhelmingly positive. I remember being on set at the Simulcam station and seeing how excited people were to look at the virtual environments as they walked by, often stopping for a chat and a virtual tour. Instead of seeing just a huge blue curtain, they were stoked to see something Minecraft and could get a better sense of what they were actually shooting.”
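
    Finlayson’s emphasis on keeping levels modular, clean and consistently named is the kind of housekeeping a VAD can partly automate inside the editor. As a purely illustrative sketch (the prefix map and content path are assumptions, not the production’s conventions), Unreal’s bundled Python API can audit asset names like this:

        # Hedged sketch of a naming-convention audit with the Unreal Editor's
        # Python API ("unreal" module). Paths and prefixes are hypothetical.
        import unreal

        PREFIXES = {
            unreal.StaticMesh: "SM_",
            unreal.Material: "M_",
            unreal.Texture2D: "T_",
        }
        CONTENT_PATH = "/Game/VAD/MidportVillage"   # hypothetical content folder

        for asset_path in unreal.EditorAssetLibrary.list_assets(CONTENT_PATH, recursive=True):
            asset = unreal.EditorAssetLibrary.load_asset(asset_path)
            for cls, prefix in PREFIXES.items():
                if isinstance(asset, cls) and not asset.get_name().startswith(prefix):
                    unreal.log_warning(f"{asset_path} does not follow the '{prefix}' prefix convention")
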
    WWW.VFXVOICE.COM
  • Game On With GeForce NOW, the Membership That Keeps on Delivering

    This GFN Thursday rolls out a new reward and games for GeForce NOW members. Whether hunting for hot new releases or rediscovering timeless classics, members can always find more ways to play, games to stream and perks to enjoy.
    Gamers can score major discounts on the titles they’ve been eyeing — perfect for streaming in the cloud — during the Steam Summer Sale, running until Thursday, July 10, at 10 a.m. PT.
    This week also brings unforgettable adventures to the cloud: We Happy Few and Broken Age are among the five additions to the GeForce NOW library this week.
    The fun doesn’t stop there. A new in-game reward for Elder Scrolls Online is now available for members to claim.
    And SteelSeries has launched a new mobile controller that transforms phones into cloud gaming devices with GeForce NOW. Add it to the roster of on-the-go gaming devices — including the recently launched GeForce NOW app on Steam Deck for seamless 4K streaming.
    Scroll Into Power
    GeForce NOW Premium members receive exclusive 24-hour early access to a new mythical reward in The Elder Scrolls Online — Bethesda’s award-winning role-playing game — before it opens to all members. Sharpen the sword, ready the staff and chase glory across the vast, immersive world of Tamriel.
    Fortune favors the bold.
    Claim the mythical Grand Gold Coast Experience Scrolls reward, a rare item that grants a bonus of 150% Experience Points from all sources for one hour. The scroll’s effect pauses while players are offline and resumes upon return, ensuring every minute counts. Whether tackling dungeon runs, completing epic quests or leveling a new character, the scrolls provide a powerful edge. Claim the reward, harness its power and scroll into the next adventure.
    Members who’ve opted into the GeForce NOW Rewards program can check their emails for redemption instructions. The offer runs through Saturday, July 26, while supplies last. Don’t miss this opportunity to become a legend in Tamriel.
    Steam Up Summer
    The Steam Summer Sale is in full swing. Snag games at discounted prices and stream them instantly from the cloud — no downloads, no waiting, just pure gaming bliss.
    Treat yourself.
    Check out the “Steam Summer Sale” row in the GeForce NOW app to find deals on the next adventure. With GeForce NOW, gaming favorites are always just a click away.
    While picking up discounted games, don’t miss the chance to get a GeForce NOW six-month Performance membership at 40% off. This is also the last opportunity to take advantage of the Performance Day Pass sale, ending Friday, June 27, which lets gamers try 24 hours of cloud gaming before diving into the six-month Performance membership.
    Find Adventure
    Two distinct worlds — where secrets simmer and imagination runs wild — are streaming onto the cloud this week.
    Keep calm and blend in.
    Step into the surreal, retro-futuristic streets of We Happy Few, where a society obsessed with happiness hides its secrets behind a mask of forced cheer and a haze of “Joy.” This darkly whimsical adventure invites players to blend in, break out and uncover the truth lurking beneath the surface of Wellington Wells.
    Two worlds, one wild destiny.
    Broken Age spins a charming, hand-painted tale of two teenagers leading parallel lives in worlds at once strange and familiar. One of the teens yearns to escape a stifling spaceship, and the other is destined to challenge ancient traditions. With witty dialogue and heartfelt moments, Broken Age is a storybook come to life, brimming with quirky characters and clever puzzles.
    Each of these unforgettable adventures brings its own flavor — be it dark satire, whimsical wonder or pulse-pounding suspense — offering a taste of gaming at its imaginative peaks. Stream these captivating worlds straight from the cloud and enjoy seamless gameplay, no downloads or high-end hardware required.
    An Ultimate Controller
    Elevated gaming.
    Get ready for the SteelSeries Nimbus Cloud, a new dual-mode cloud controller that reaches new heights when paired with GeForce NOW.
    Designed for versatility and comfort, and crafted specifically for cloud gaming, the SteelSeries Nimbus Cloud effortlessly shifts from a mobile device controller to a full-sized wireless controller, delivering top-notch performance and broad compatibility across devices.
    The Nimbus Cloud enables gamers to play wherever they are, as it easily adapts to fit iPhones and Android phones. Or collapse and connect the controller via Bluetooth to a gaming rig or smart TV. Transform any space into a personal gaming station with GeForce NOW and the Nimbus Cloud, part of the list of recommended products for an elevated cloud gaming experience.
    Gaming Never Sleeps
    “System Shock 2” — now with 100% more existential dread.
    System Shock 2: 25th Anniversary Remaster is an overhaul of the acclaimed sci-fi horror classic, rebuilt by Nightdive Studios with enhanced visuals, refined gameplay and features such as cross-play co-op multiplayer. Face the sinister AI SHODAN and her mutant army aboard the starship Von Braun as a cybernetically enhanced soldier with upgradable skills, powerful weapons and psionic abilities. Stream the title from the cloud with GeForce NOW for ultimate flexibility and performance.
    Look for the following games available to stream in the cloud this week:

    System Shock 2: 25th Anniversary Remaster
    Broken Age
    Easy Red 2
    Sandwich Simulator
    We Happy Few
    What are you planning to play this weekend? Let us know on X or in the comments below.

    The official GFN summer bucket list
    Play anywhere
    Stream on every screen you own
    Finally crush that backlog
    Skip every single download bar
    Drop the emoji for the one you’re tackling right now
    — NVIDIA GeForce NOW, June 25, 2025
    BLOGS.NVIDIA.COM
    Game On With GeForce NOW, the Membership That Keeps on Delivering
  • So, Strasbourg has officially joined the world of Hollywood with the grand opening of Ex Persona, a motion capture studio that’s just a hop, skip, and a jump from the city center. Because, of course, what every aspiring actor needs is a high-tech studio where they can perfectly simulate the art of standing still while looking vaguely excited.

    The studio comes equipped with fancy Vicon cameras, so I can only imagine the thrill of seeing your every awkward movement captured in stunning detail. Finally, you can bring your most cringe-worthy dance moves to life—because who wouldn't want their most embarrassing moments immortalized in 3D?

    Let’s just hope the talent they attract has more personality than their studio name suggests!

    #MotionCapture #ExPersona #Strasbourg
    Motion Capture: the Ex Persona studio opens its doors
    A few days ago, a new motion capture studio opened its doors in Strasbourg: Ex Persona. Based 10 minutes from the city center (and therefore from the train station) to make the stage easy for talent to reach, Ex Persona has a […]
  • Finally, the moment we've all been waiting for—Alan Wake 2 is now "affordable" thanks to the summer sales of 2025! Who knew that two years of waiting and price inflation would lead us to this groundbreaking revelation? It’s almost like they were waiting for the perfect time to remind us that our wallets have feelings too. So, grab your copy before the next price hike! After all, nothing says “I’m a savvy gamer” like buying a game that's still too expensive... just less so. Cheers to summer sales and to Alan Wake 2—because why not splurge on a game that took two years to drop in price?

    #SummerSales #AlanWake2 #GamingDeals #AffordableGaming #PriceDrop
    WWW.ACTUGAMING.NET
    Alan Wake 2 is finally at an affordable price thanks to the start of the 2025 summer sales
    ActuGaming.net: It's time for the summer sales and for stocking up on good deals in the […]
  • Ah, the wonders of modern gaming! Who would have thought that the secret to uniting a million people would be simply to toss a digital soccer ball around? Enter "Rematch," the latest sensation that has whisked a million souls away from the harsh realities of life into the pixelated perfection of football. It’s like Rocket League had a baby with FIFA, and now we have a game that claims to bring us all together — because who needs genuine human interaction when you can kick a virtual ball?

    Let’s take a moment to appreciate the brilliance behind this phenomenon. After countless years of research, gaming experts finally discovered that people *actually* enjoy playing football. Shocking, right? It’s not like football has been the most popular sport in the world for, oh, I don’t know, ever. But hey, let’s applaud the genius who looked at Rocket League and thought, "Why don’t we add a ball that actually resembles a soccer ball?"

    With Rematch, we’ve moved past the days of traditional socializing. Why grab a pint with friends when you can huddle in your living room, staring at a screen, pretending to be David Beckham while never actually getting off the couch? The thrill of the game has never been so… sedentary. And who needs to break a sweat when the only thing you’ll be sweating over is how to outmaneuver your fellow couch potatoes with your fancy footwork?

    Now, let’s talk about the social implications. One million people have flocked to Rematch, which means that for every goal scored, there’s a lonely soul who just sat through another week of awkward small talk at the office, wishing they too could be playing digital soccer instead of discussing weekend plans. Talk about a win-win! You can bond with your online teammates while simultaneously avoiding real-life conversations. It’s like the ultimate social life hack!

    But wait, there’s more! The marketing team behind Rematch must be patting themselves on the back for this one. A game that can turn sitting in your pajamas into an epic communal experience? Bravo! It’s almost poetic to think that millions of people are now united over pixelated football matches while ignoring their actual neighbors. Who knew that a digital platform could replace not just a football field but also a community center?

    In conclusion, as we celebrate the monumental achievement of Rematch bringing together one million players, let’s also take a moment to reflect on what we’ve sacrificed for this pixelated paradise: actual human interaction, the smell of fresh grass, and the sweet sound of a whistle blowing on a real field. But hey, at least we’re saving the planet one digital kick at a time, right?

    #Rematch #DigitalSoccer #GamingCommunity #PixelatedFootball #SoccerRevolution
    Already 1 million people on Rematch: the football game is bringing a big crowd together
    ActuGaming.net: Rematch starts from an idea so good, and yet so obvious after the success of Rocket […]