NAB 25: a leaner show but with an AI innovation focus
The NAB Show in Las Vegas has long been the epicentre of breakthroughs in broadcast, media, and production technology. Once sprawling across the entire Las Vegas Convention Center and hosting more than 100,000 attendees, the show now presents a more focused and intense experience: this year the conference is expecting just over 60,000 visitors from 160 countries, filling only part of the South Hall. But while the scale may have shifted, the ambition certainly hasn't. Known for historic firsts like the debut of HDTV and the original unveiling of the RED Digital Cinema Camera, NAB remains the place "Where Content Comes to Life." In 2025, the spotlight turns squarely toward innovation in AI, automation, and next-generation production tools. From real-time visual effects to AI-driven editorial workflows, the show reflects an industry rapidly evolving under the weight and potential of machine learning and intelligent automation. Here are some of the most interesting new technologies and tools that are reshaping post-production.

Blackmagic Design

In Las Vegas, Blackmagic Design has unveiled DaVinci Resolve 20, a significant update packed with over 100 new features, many of which are powered by AI. While the NAB show may be smaller this year, Blackmagic's booth is showcasing some key transformative tools for post, and AI is clearly taking the lead. This release brings a host of intelligent features designed to streamline workflows and enhance creative control.

DaVinci Resolve 20

Among the headline tools is AI IntelliScript, which can automatically generate timelines from a text script, effectively turning written dialogue into structured edits. AI Animated Subtitles adds a dynamic layer to captioning by animating words in sync with spoken audio, while AI Multicam SmartSwitch uses speaker detection to automate camera switching in multicam edits. Audio post also gets an upgrade with the AI Audio Assistant, which analyzes entire timelines and builds a professional mix without manual intervention.

All of these features represent a deepening integration of AI into the DaVinci editing and finishing process as a practical assistant across real-world workflows. Beyond AI, DaVinci Resolve 20 introduces a new keyframe editor, voiceover recording palettes, expanded compositing capabilities in Fusion, and enhanced grading tools like Chroma Warp.

DaVinci Resolve 20 is available now as a public beta from the Blackmagic Design website. We expect a lot of buzz around the booth this week, especially from those watching how AI is integrating into the editorial and finishing landscape.

To pair with the Blackmagic URSA Cine Immersive, the world's first cinema camera designed to shoot Apple Immersive Video for Apple Vision Pro (AVP), DaVinci Resolve Studio 20 also introduces powerful new features for Apple Immersive Video. Filmmakers can now edit, colour grade, mix Spatial Audio, and deliver Apple Immersive Video captured using the new camera.
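Resolve has exposed a Python scripting API for years, and AI IntelliScript effectively automates the kind of programmatic timeline assembly that API enables. As a rough illustration only, using Resolve's documented scripting calls (this is not IntelliScript's implementation), assembling clips into a timeline looks roughly like this:

```python
# Minimal sketch using DaVinci Resolve's built-in Python scripting API.
# This is NOT AI IntelliScript: it only shows the kind of timeline
# assembly that IntelliScript automates by matching dialogue to a script.
import DaVinciResolveScript as dvr  # ships with DaVinci Resolve

resolve = dvr.scriptapp("Resolve")  # attach to a running Resolve instance
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()

# Collect the clips in the media pool's root bin. IntelliScript would
# instead pick and order clips by matching their dialogue to the script.
clips = media_pool.GetRootFolder().GetClipList()

timeline = media_pool.CreateEmptyTimeline("Assembly from script")
media_pool.AppendToTimeline(clips)  # append in bin order
```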
Adobe

At NAB 2025, Adobe has unveiled powerful new updates to Premiere Pro and After Effects, with a strong emphasis on AI-driven features that enhance creative workflows and make editing smarter and faster.

In Premiere Pro 25.2, Adobe introduces Firefly-powered Generative Extend, a tool that allows editors to seamlessly add extra 4K frames to a clip, which is perfect for fixing cuts that start too early or end too soon. Generative Extend works in the background as you continue editing, and it is powered by Adobe's Firefly AI. Importantly, it also includes Content Credentials, providing transparency about when and where AI was used.

https://www.fxguide.com/wp-content/uploads/2025/04/Generative-Extend-Sizzle-VideoS.mp4

Also debuting is Media Intelligence, an AI-based feature that simplifies finding the right shot. By automatically recognizing content, including objects, environments, camera angles, and dialogue, it lets editors search hours of footage using natural language queries. Editors can search for anything from close-ups of hands working in a kitchen to mentions of herbs, and Premiere will surface relevant clips, transcripts, and metadata. As this AI analysis happens locally, there's no need for an internet connection, and Adobe emphasizes that no user content is ever used to train its models. Other workflow upgrades in Premiere include dynamic waveforms, improved colour management with drag-and-drop ease, coloured sequence labels, and expanded GPU acceleration: solid quality-of-life features requested by working professionals.

After Effects 25.2 brings a new high-performance playback engine for faster previews, fresh 3D motion design tools, and support for HDR monitoring, pushing the boundaries of motion graphics and VFX workflows. Together, these updates mark a push from Adobe toward AI-assisted editing that doesn't replace the artist but rather assists them: automating the repetitive, surfacing what's useful, and, according to Adobe, letting creators focus more on storytelling.

Foundry: Nuke Stage Virtual Production Product

Foundry has announced Nuke Stage, a new standalone application purpose-built for virtual production and in-camera VFX (ICVFX). Designed to streamline workflows from pre-production through final pixels, Nuke Stage gives VFX artists end-to-end control over imagery and colour in a unified pipeline. The tool enables real-time playback of photoreal environments on LED walls, live compositing, and layout, using industry standards like OpenUSD, OpenEXR, and OpenColorIO.

Nuke Stage

Nuke Stage has a familiar node-based interface consistent with Nuke, and the new tool is built to deliver efficiency and flexibility for productions of all sizes. Developed in collaboration with the VFX and virtual production community, Nuke Stage fills a gap in on-set workflows by integrating VFX compositing tools into real-time production. "Our hope is that Nuke Stage brings the expertise of VFX artists even closer to creative decision-making," said Christy Anzelmo, Chief Product Officer at Foundry. Industry professional Dan Hall of 80six called it "a handshake between VFX and virtual production," while Framestore's Connor Ling noted that familiarity with Nuke will build confidence in what's seen on the LED wall. Garden Studios' Sam Kemp praised the ability to tweak 2D assets live, calling it "something that's been missing from virtual production for a long time."

Key features of Nuke Stage include real-time photoreal playback, live compositing, comprehensive colour support, and hardware-agnostic operation; with no specialized media servers required, Nuke Stage runs on standard hardware. The tool supports 2D, 2.5D, and 3D content playback and allows teams to work in a linear, HDR-ready colour space using a consistent toolset from prep to post. As part of the Nuke ecosystem, Nuke Stage is designed to offer a seamless, efficient pipeline from the first frame captured on set through to final delivery.
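That linear, HDR-ready colour workflow rests on OpenColorIO, one of the standards Nuke Stage is built on. As a generic OCIO v2 sketch (not Nuke Stage code; the config path and colour-space names below are assumptions that depend entirely on the config in use), a scene-linear-to-display conversion looks like this:

```python
# Generic OpenColorIO (v2) sketch: convert scene-linear pixel values to
# a display colour space. Nuke Stage uses OCIO as its colour standard,
# but this is NOT Nuke Stage code. The config path and colour-space
# names are assumptions; they depend on the config you load.
import PyOpenColorIO as ocio

config = ocio.Config.CreateFromFile("aces_config.ocio")  # hypothetical path
processor = config.getProcessor("ACEScg", "sRGB")        # names vary by config
cpu = processor.getDefaultCPUProcessor()

# An 18% grey scene-linear value, transformed for display.
print(cpu.applyRGB([0.18, 0.18, 0.18]))
```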
NVIDIA

NVIDIA is showcasing real-time AI and intelligent media workflows at NAB, with a focus on the Blackwell platform, which serves as the foundation of NVIDIA Media2. Media2 is a collection of NVIDIA technologies, including NVIDIA NIM microservices and NVIDIA AI Blueprints for live video analysis, accelerated computing platforms, and generative AI software. NVIDIA Holoscan for Media is also being shown: an advanced real-time AI platform designed for live media workflows and applications, alongside the NVIDIA AI Blueprint for video search and summarization. These tools make it easy to build and customize video-analytics AI agents.

Cinnafilm Upconversion tool

Additionally, Cinnafilm and NVIDIA have partnered on a new AI-powered HD-to-UHD upconversion tool, launching at the 2025 NAB Show this week and commercially available later this year inside Cinnafilm's flagship conversion platform, PixelStrings. Combining Cinnafilm's GPU-accelerated engine with a custom version of NVIDIA RTX AI Image Processing, the system delivers ultra-high-definition results in a single pass, with what Cinnafilm says is 25-30% greater detail than its current tools. Designed for speed, scalability, and top-tier visual quality, the new solution sets a fresh benchmark for media upscaling.
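Cinnafilm hasn't published how the system works internally, but the general contrast between conventional resampling and learned super-resolution is easy to illustrate. Here is a minimal, generic OpenCV sketch (this is not PixelStrings; the frame and model paths are assumptions):

```python
# Generic HD-to-UHD upscaling sketch with OpenCV. This is NOT how
# Cinnafilm's PixelStrings works; it only contrasts conventional
# resampling with a learned super-resolution model. Requires
# opencv-contrib-python; the EDSR model file is downloaded separately.
import cv2

frame = cv2.imread("hd_frame.png")  # hypothetical 1920x1080 source frame

# Conventional approach: Lanczos resampling up to 3840x2160.
lanczos = cv2.resize(frame, (3840, 2160), interpolation=cv2.INTER_LANCZOS4)

# Learned approach: a pretrained super-resolution network (EDSR, 2x).
sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("EDSR_x2.pb")     # hypothetical path to the pretrained model
sr.setModel("edsr", 2)
upscaled = sr.upsample(frame)  # one pass: 1920x1080 -> 3840x2160

cv2.imwrite("uhd_lanczos.png", lanczos)
cv2.imwrite("uhd_edsr.png", upscaled)
```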
Deepdub

Deepdub has unveiled Deepdub Live, a real-time multilingual dubbing solution designed for live sports, esports, and breaking news coverage. Deepdub Live is audio-only, but a significant step forward for broadcasters. Powered by the company's proprietary Emotive Text-to-Speech (eTTS) engine, Deepdub Live delivers expressive, emotionally nuanced voiceovers that the company claims are just like a native-language production. The eTTS system dynamically adjusts vocal tone, intensity, and energy to match the emotional cadence of live events, whether it's the urgency of breaking news or the excitement of a sports final. Broadcasters can choose to use AI-cloned voices of original speakers or select from Deepdub's licensed voice bank, all cleared for broadcast and streaming. Built for enterprise deployment, the API-driven platform supports more than 100 languages and dialects, with ultra-low latency and frame-accurate synchronisation to ensure seamless, high-quality multilingual experiences in real time.
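As a purely structural sketch of what such a pipeline involves (every function here is a hypothetical stand-in, not Deepdub's API or method), the key idea is carrying source timestamps through recognition, translation, and synthesis so the dubbed audio can be aligned frame-accurately:

```python
# Structural sketch of a live dubbing chain: speech recognition ->
# translation -> expressive TTS, with source timestamps carried through
# so dubbed audio can be aligned frame-accurately. Every function here
# is a hypothetical stand-in; this is NOT Deepdub's API or method.
from dataclasses import dataclass

@dataclass
class Segment:
    start: float  # seconds from stream start
    end: float
    text: str

def transcribe(chunk: bytes) -> list[Segment]:
    # Stand-in ASR: a real system would emit timed segments from audio.
    return [Segment(0.0, 1.5, "goal for the home side")]

def translate(seg: Segment, lang: str) -> Segment:
    # Stand-in MT: preserves timing while swapping the text's language.
    return Segment(seg.start, seg.end, f"[{lang}] {seg.text}")

def synthesize(seg: Segment, voice: str) -> bytes:
    # Stand-in TTS: a real eTTS engine would also match tone and energy.
    return f"{voice}:{seg.text}".encode()

def dub_chunk(chunk: bytes, lang: str, voice: str) -> list[tuple[float, bytes]]:
    """Return (timestamp, audio) pairs aligned to the source segments."""
    return [(seg.start, synthesize(translate(seg, lang), voice))
            for seg in transcribe(chunk)]

print(dub_chunk(b"\x00" * 48000, "es", "cloned-commentator"))
```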