If you check social media right now, you'll find a lot of discussion surrounding Apple Intelligence. At the same time, Apple has launched a new lineup of iPads, MacBook Air models, and the latest Mac Studios. While many YouTubers and influencers have expressed disappointment with Apple's approach to AI, particularly the implementation of its Apple Intelligence strategy, the company has simultaneously released one of the most powerful machines ever: the M3 Ultra Mac Studio, which is brilliant for AI.

We've been using the new M3 Ultra since its release, and for artists and TDs working with AI in M&E, this machine is exceptionally attractive. Our M3 Ultra is far from inexpensive, but with 512GB of RAM and 12TB of storage, it is, without question, a beast when it comes to localised AI, machine learning, LLM applications, and image processing. We've specifically been putting it through its paces with Topaz Video AI 6, which runs natively on the M3 Ultra Mac Studio.

360p to 4K in Topaz Video AI 6. (click to enlarge)

The question that naturally follows is: why do you need this much power on a desktop? Topaz Video AI 6 provides a perfect answer. Unlike many AI applications, this software is not cloud-based, though Topaz does have an early cloud prototype called Project Starlight. For some people in production, cloud computing is a viable option, but there are compelling reasons to keep processing local. Three primary factors make local AI processing beneficial: security, speed of upload, and speed of processing. While cloud computing is an alternative for those without high-performance machines, our intense work over the past week left no doubt that in a professional environment, the M3 Ultra is an outstanding option.

Security. Security concerns around AI in VFX are significant, as many clients fear that AI or machine learning tools could absorb proprietary data into cloud-based models, thereby compromising intellectual property. While this concern is often unfounded (most AI models, including ChatGPT, do not learn from user inputs once they are built), the client perception remains. As some great colleagues of mine recently posted, AI systems like ChatGPT are pre-trained, meaning they don't continuously learn from user interactions; instead, they operate within the confines of their initial training data. Nonetheless, some clients are adamant about avoiding AI tools altogether. With local processing, security is far easier to guarantee, giving professionals a level of control that cloud solutions struggle to match.

Speed of upload. While many users have access to fast internet, uploading large files, particularly in a remote workflow, comes at a cost. This challenge is especially pronounced for VFX professionals working with massive DPX or EXR files. In our situation, where we are focused on upscaling footage, the issue was not a case of uploading a few live action plate elements and processing them over hours, but rather dealing with vast amounts of material being repurposed. This significantly exacerbates the upload problem, reinforcing the advantages of local processing.

Speed of processing. This is where the M3 Ultra truly excels. With its killer M3 Ultra chip, fast RAM and high-speed storage, it processes massive datasets at an incredible rate. Rather than relying solely on benchmarks, we focus on real-world scenarios. We worked on a range of production assets for this story, some decades old and stored only in 360p resolution, others more recent, at 2K resolution.
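To make the upload and processing argument concrete, a quick back-of-envelope calculation helps. The frame size below follows standard 10-bit DPX packing (three 10-bit components per 32-bit word); the clip length, frame rate, uplink speed and local throughput are illustrative assumptions of ours, not figures published by Topaz or Apple.

```python
# Back-of-envelope: shipping a UHD DPX sequence to the cloud vs. processing it locally.
# Clip length, frame rate, uplink and local throughput are illustrative assumptions only.

FRAME_W, FRAME_H = 3840, 2160      # UHD frame
BYTES_PER_PIXEL = 4                # 10-bit DPX packs R, G and B into one 32-bit word
CLIP_SECONDS = 10
FPS = 24                           # assumed source frame rate
UPLINK_MBPS = 100                  # assumed remote-workflow uplink
LOCAL_FPS = 10                     # roughly what we saw from Topaz Video AI 6 on the M3 Ultra

frames = CLIP_SECONDS * FPS
frame_mb = FRAME_W * FRAME_H * BYTES_PER_PIXEL / 1e6      # ~33 MB per frame (ignoring headers)
clip_gb = frames * frame_mb / 1e3                         # ~8 GB for a single 10-second clip
upload_minutes = clip_gb * 8e3 / UPLINK_MBPS / 60         # GB -> megabits -> minutes on the link
local_minutes = frames / LOCAL_FPS / 60

print(f"{frames} frames, {frame_mb:.1f} MB/frame, {clip_gb:.1f} GB per clip")
print(f"Upload alone: ~{upload_minutes:.0f} min; local processing: ~{local_minutes:.1f} min")
```

Multiply that by the hundreds of shots in a typical repurposing job and the upload penalty, before any processing has even started, quickly becomes the dominant cost.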
Our primary test involved upscaling to 4K, as it provided a meaningful balance between quality and computational demands (Topaz can scale to 8K). One specific scenario involved integrating old green screen footage with newly shot 4K material. Additionally, we experimented with extreme upscaling, pushing very low-resolution footage (360p) to 4K, to evaluate failure points, and we also compared the fully released Topaz Video AI 6 with Topaz's prototype cloud-based Project Starlight AI. Rather than discuss the specifications and data rates of the M3 Ultra Mac Studio, we explored what we could do with it; after all, it was Apple who sold the original iPod as "1,000 songs in your pocket", not as an "iPod with a 5GB hard drive".

Standalone upres-ing

Before diving into the green screen examples below, we first examined some traditional upscaling work performed using Topaz Video AI 6. In each case, the upscaling was applied to a completed shot, with clear before-and-after comparisons to showcase the improvements. Regardless of the hardware running Topaz Video AI 6, the results are impressive, demonstrating just how far machine learning has advanced as a powerful post-production tool.

The combination of Topaz Video AI 6 and the M3 Ultra particularly excels when handling multiple shots. While cloud-based solutions may be viable for a single, isolated shot, the speed and efficiency of a high-performance local Mac solution are as impressive as the visual enhancements themselves.

Topaz Video AI 6 is far from a one-click tool; it provides the flexibility to experiment with various AI models and fine-tune parameters such as motion blur, color space, and focus. The M3 Ultra tackles these complex AI-driven tasks with exceptional speed, enabling users to iterate and refine their settings efficiently. The ability to quickly generate and manipulate 5-second previews is crucial for making real-world adjustments, and in our experience, the M3 Ultra is the fastest system we have used for this insanely computationally intensive process.

Still from a clip: 1280×720 (left) to 3840×2160 / 4K (right). Click for larger version. High frequency details in the background and depth of field defocus are both handled extremely well.

Clearly there are important issues such as provenance, copyright and artist rights in any machine learning pipeline. However, upres-ing such as this is not about replacing jobs, nor dramatically removing the creative craft of VFX. It is, however, all about providing incredible results that just a few years ago would have been impossible. Some of these processed shots are simply unbelievably good.

The software also does SDR to HDR inverse tone mapping. It converts Rec 709 / BT.1886 (the standard display gamma response) to BT.2020, the wide color gamut standard designed for ultra-high-definition (UHD) 4K and 8K video.

There are some artefacts, especially on very large resolution changes, but in addition to upres-ing, the same Topaz software can be used for slow motion, noise reduction and grain removal. We bought our copy for only $299, and you can sometimes get discounts, such as during Black Friday/Cyber Monday.

(L) 360p upres to 4K (R). Originally shot high frame rate on a Red Epic. Click to expand

Original shot slow motion: (L) 960×540 to 3840×2160 / 4K (R). Click to expand
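The SDR-to-HDR option mentioned above is, at its heart, a transfer function and gamut remap on top of the tone expansion the model performs. As a point of reference only (this is not Topaz's internal code), here is a minimal sketch of the linear-light Rec.709 to BT.2020 primaries conversion, using the rounded matrix published in ITU-R BT.2087 and a simple gamma-2.4 approximation of the BT.1886 EOTF; the inverse tone mapping step itself is not shown.

```python
import numpy as np

# Rounded linear-light RGB conversion matrix, Rec.709 primaries -> BT.2020 primaries
# (values as published in ITU-R BT.2087).
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def bt709_to_bt2020(rgb_709, display_gamma=2.4):
    """Convert BT.1886-encoded Rec.709 RGB (0..1) to linear-light BT.2020 RGB.

    A pure-gamma approximation of the BT.1886 EOTF is used here for clarity; a
    production pipeline would also handle black level and HDR re-encoding (PQ/HLG).
    """
    linear_709 = np.clip(rgb_709, 0.0, 1.0) ** display_gamma   # de-gamma to linear light
    linear_2020 = linear_709 @ M_709_TO_2020.T                  # re-express in BT.2020 primaries
    return linear_2020

# A pure Rec.709 red needs contributions from all three BT.2020 primaries to reproduce.
print(bt709_to_bt2020(np.array([1.0, 0.0, 0.0])))   # approx [0.6274, 0.0691, 0.0164]
```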
Nuke: GreenScreen

For all the greenscreen comparisons we used a standard Nuke comp setup to pull the key and examine the matte channels. Naturally, a skilled Nuke compositor could improve on the key or build on what we have done here, but the matte images below are presented as indicative rather than best-in-class compositing. On the whole, they are sensible key/matte outputs (more on the Nuke setup below).

Left: compressed H264 low res footage vs. right: 4K upres. Click to expand.

Greenscreen: 360p to 4K
A Nearly Impossible Upscale

We selected two sources, one being an archival MP4 clip in 360p, originally shot years ago in Las Vegas for NAB. Longtime fxguide readers may remember our infamous fxguide hangover video (proof that what happens in Vegas doesn't always stay in Vegas). The footage had every issue imaginable: heavy compression, low resolution, and only 8-bit depth. There is no way that any reasonable imagery should be possible from such a huge leap in resolution. However, in real-world productions, sometimes archival footage is all that's available.

We ran the clip through both Topaz Video AI 6 and the Starlight cloud tool. The processing times varied drastically between the two pipelines. The new diffusion-model Starlight solution, though remarkable with its new AI-based approach, required over 50 minutes (50m 45s) to upscale just 10 seconds of footage to 2K. Meanwhile, on the Mac Studio, the local 4K upscale with Topaz Video AI 6 (despite using a different algorithm) ran vastly faster, at around 10 frames per second: under 30 seconds for the same clip, and at the higher 4K resolution, not 2K.

In viewing the mattes below, the results are remarkably sharp and, most importantly, temporally stable. The stability of the algorithm's outputs across successive frames is vital to ensuring that edges do not exhibit unwanted fluctuations or boiling in a composite. While the inferred hair detail is impressive, certain elements, such as the comb in the hairstylist's hands, can sometimes be lost as the program attempts to resolve motion-blurred objects.

Source: Original material comped in Nuke, matte below is provided as reference. Note the source was heavily compressed 960×540 H264. Special thanks to fxguidetv's Angie and Chris, who appear in this clip.
https://www.fxguide.com/wp-content/uploads/2025/03/Angie_960x540_comp.mp4
https://www.fxguide.com/wp-content/uploads/2025/03/Angie-960x540-matte.mp4

Starlight (cloud based): Original upres-ed to HD and comped in Nuke, matte below is provided as reference.
https://www.fxguide.com/wp-content/uploads/2025/03/Angie_1920x1080_comp.mp4
https://www.fxguide.com/wp-content/uploads/2025/03/Angie-1920x1080-matte.mp4

4K Topaz Video AI 6: Original upres-ed to 4K, again comped in Nuke.
https://www.fxguide.com/wp-content/uploads/2025/03/Angie_3840x2160_comp.mp4
https://www.fxguide.com/wp-content/uploads/2025/03/Angie-3840x2160-matte.mp4

Project Starlight leverages an AI diffusion model for video to recreate expected, missing details during upscaling, resulting in high-resolution videos that are rich with detail. The feature also operates without artifacts such as incorrect object motion, a temporal error that can occur when AI doesn't create realistic output.

Interestingly, in our work, the visual results favored Starlight in most cases. Since this process relies on AI-driven image generation, some hallucinations occurred, including an instance where Topaz Video AI 6 mistakenly rendered part of a person's hand as an additional face. On the other hand, Topaz Video AI 6 was stronger in handling scene transitions: our green screen footage, extracted from an edit, showed some flickering artifacts in the Starlight version, particularly at the beginning and end of the clip.

1:1 scale of the low res, compressed source against the up-res 4K, which is an incredible jump. Note the trouble resolving the logo and the face hallucination in the hand. Click for large version.
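To put the throughput difference between the two pipelines in perspective, a quick check of the numbers above helps. The source frame rate is not something we have quoted, so the 24 fps used here is an assumption for illustration; the timings are from our runs.

```python
# Rough throughput comparison for the 10-second 360p test clip.
# The 24 fps source frame rate is assumed for illustration only.

CLIP_SECONDS = 10
FPS = 24                                  # assumed source frame rate
frames = CLIP_SECONDS * FPS               # 240 frames

starlight_seconds = 50 * 60 + 45          # Project Starlight (cloud), 2K output: 50m 45s
local_seconds = frames / 10               # Topaz Video AI 6 on the M3 Ultra at ~10 fps, 4K output

print(f"Starlight (cloud, 2K):  {starlight_seconds / 60:.1f} min")
print(f"Topaz Video AI 6 (local, 4K): {local_seconds:.0f} s")
print(f"Local is roughly {starlight_seconds / local_seconds:.0f}x faster on this clip")
```

Starlight is doing far heavier generative work, so this is not an apples-to-apples algorithm comparison, but it does illustrate why an interactive, local workflow feels so different in production.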
Greenscreen: Production 2K to 4K

For a more controlled and sensible production example, we used professionally shot green screen footage originally captured in RAW .R3D on an EPIC camera. However, we only had access to the exported, transcoded 2K files rather than the original RAW .R3D data. This scenario mirrors real-world challenges where VFX artists must integrate older footage with modern 4K backgrounds. Given the limitations, upscaling the foreground is the only viable solution.

Topaz Video AI 6 supports a range of file formats, including .png, .tif, .jpg, .dpx, and .exr. However, it processes EXR files by converting them to temporary 16-bit .tiff images, meaning that 32-bit EXRs do not retain full color information. For our test, we loaded the green screen shot as a sequence of DPX files and upscaled it to 4K.

As seen below, the program excels at processing faces and people in general. Hair appears sharper and more consistent, and the results avoid traditional upscaling artifacts such as ringing and excessive edge sharpening, which are common in basic kernel-based sharpening algorithms.

One of the challenges of upscaling is preserving skin texture, which can often become flattened or blurred. However, in this case, pore details are enhanced in a natural and believable way, making the results highly usable. While Nuke offers internal scaling options for any footage, Topaz Video AI 6 appears to deliver a superior result.

Source (2K):
https://www.fxguide.com/wp-content/uploads/2025/03/Original2KComp.mp4
https://www.fxguide.com/wp-content/uploads/2025/03/Original2KMatte.mp4

4K comp:
https://www.fxguide.com/wp-content/uploads/2025/03/UpRes4KComp.mp4
https://www.fxguide.com/wp-content/uploads/2025/03/UpRes4KMatte.mp4

Nuke (using Cattery AI)

The key for this main greenscreen shot used MODNet, a Cattery node, to roughly segment the character out of the green screen. By shrinking and expanding that matte we produced an inner core matte overlaid on top of the outer soft edge (hair) matte, which was fed into ViMatte, another Cattery node, to create the overall matte. The matte was then normalised to values between 0 and 1 and copied onto the despilled plate.

The background is a simple CG element over a BG plate. The CG element has a depth map generated by another Cattery node, DepthAnything, and the depth map was used to defocus the CG element. Grain was also added to the CG element.

The FG was then graded, multiplied by the matte, and comped over the BG. The BG was also graded through the FG matte to bring some more hair detail back. Both the 2K comp and the 4K comp used the same setup, with some small tweaks.

Our Nuke comp setup
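For readers who want to reproduce the broad shape of that graph, here is a minimal Nuke Python sketch of the matte chain described above. It is an illustrative outline only: the Cattery node class names ("MODNet", "ViMatte"), their input order and knobs vary with how the Cattery plug-ins are installed, so treat those, the file path and the erode sizes as assumptions rather than our exact production script.

```python
# Illustrative Nuke Python outline of the matte chain described above.
# Cattery node names, input order, file paths and sizes are assumptions, not our exact script.
import nuke

plate = nuke.nodes.Read(file="greenscreen_4K.####.dpx")        # the upscaled FG plate

seg = nuke.createNode("MODNet")                                 # rough person segmentation (Cattery)
seg.setInput(0, plate)

core = nuke.nodes.FilterErode(size=5)                           # shrink -> hard inner core matte
core.setInput(0, seg)
soft = nuke.nodes.FilterErode(size=-5)                          # expand -> outer soft/hair region
soft.setInput(0, seg)

trimap = nuke.nodes.Merge2(operation="over")                    # core laid over the soft outer matte
trimap.setInput(0, soft)                                        # B input
trimap.setInput(1, core)                                        # A input

matte = nuke.createNode("ViMatte")                              # refined overall matte (Cattery)
matte.setInput(0, trimap)                                       # input order is an assumption
matte.setInput(1, plate)

norm = nuke.nodes.Grade(black_clamp=True, white_clamp=True)     # keep the matte in the 0-1 range
norm.setInput(0, matte)

copy = nuke.nodes.Copy(from0="rgba.alpha", to0="rgba.alpha")    # matte copied onto the FG plate
copy.setInput(0, plate)                                         # in practice, the despilled plate
copy.setInput(1, norm)
```

The despill itself, the DepthAnything-driven defocus of the CG element and the grain pass described above would hang off this skeleton in the usual way.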
M3 Ultra

The M3 Ultra is built for extreme performance, handling the most demanding workflows that leverage high CPU and GPU core counts alongside unified memory. If you are just surfing the web, this is not for you. The M3 Ultra is built by Apple for video editors, 3D & VFX artists, and AI researchers.

We have a Mac Studio with an M3 Ultra: a 32-core CPU, an 80-core GPU, 512GB of unified memory with up to 819GB/s of bandwidth, and 12TB of ultra-fast internal storage, enough for over 12 hours of 8K ProRes footage. On paper, the M3 Ultra delivers 50% more performance than any previous Ultra chip, with its 8 efficiency cores helping push CPU speeds 80% faster than the M1 Ultra, which we've used extensively on our prior four AI-driven film projects.

Ultimately, the M3 Ultra Mac Studio is an undeniable powerhouse for professionals working with AI, VFX, and machine learning applications. While cloud-based AI services like Project Starlight have their place, the performance, security, and reliability of local processing make a compelling argument for investing in high-end hardware. While unconfirmed, it is hoped that Starlight will one day also be available as a downloadable implementation able to run locally on high-end hardware. It is currently free to test on the cloud.
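As a closing footnote, the headline storage figure is easy to sanity check. Apple does not specify which ProRes flavour or frame rate sits behind the "over 12 hours of 8K ProRes" claim, so this is only an order-of-magnitude check with our own assumptions.

```python
# Quick consistency check on the headline storage figure:
# "12TB holds over 12 hours of 8K ProRes" implies an average data rate of roughly...

STORAGE_TB = 12
HOURS = 12

bytes_total = STORAGE_TB * 1e12
seconds = HOURS * 3600
mb_per_s = bytes_total / seconds / 1e6          # ~278 MB/s sustained
gbit_per_s = mb_per_s * 8 / 1000                # ~2.2 Gb/s

print(f"~{mb_per_s:.0f} MB/s, or ~{gbit_per_s:.1f} Gb/s sustained")
# ...which is in the right ballpark for high-bitrate 8K ProRes material, and a tiny
# fraction of the quoted 819 GB/s of unified-memory bandwidth.
```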