• Is it just me, or does the phrase "Quick Tip: A Better Wet Shader" sound like the latest buzz from a trendy café, where the barista is more interested in his art than the coffee? I mean, who would have thought that the secret to stunning visuals in Blender would come down to a clearcoat? It’s almost as if John Mervin, that brave pioneer of pixelated perfection, stumbled upon the holy grail of rendering while driving—because, you know, multitasking is all the rage these days!

    Let's take a moment to appreciate the genius of recording tutorials while navigating rush hour traffic. Who needs a calm, focused environment when you could be dodging potholes and merging lanes? I can just picture it: "Okay, folks, today we're going to add a clearcoat to our wet shader... but first, let’s avoid this pedestrian!" Truly inspiring.

    But back to the world of wet shaders. Apparently, the key to mastering the art of sheen is just slapping on a clearcoat and calling it a day. Why bother with the complexities of light diffusion, texture mapping, or even the nuances of realism when you can just... coat it? It's like serving a gourmet meal and then drowning it in ketchup—truly a culinary masterpiece!

    And let’s not forget the vast potential here. If adding a clearcoat is revolutionary, imagine the untapped possibilities! Why not just throw in a sprinkle of fairy dust and call it a magical shader? Or better yet, how about a “drive-by” tutorial series that teaches us how to animate while on a rollercoaster? The future of Blender tutorials is bright—especially if you’re driving towards it at 80 mph!

    After all, who needs to focus on the intricacies of shader creation when we can all just slap on a clearcoat and hope for the best? The art of 3D rendering has clearly reached a new zenith. So, to all the aspiring Blender wizards out there, remember: clearcoat is your best friend, and traffic lights are merely suggestions.

    In conclusion, if you ever find yourself needing a quick fix in Blender, just remember—there’s nothing a good clearcoat can’t solve. Just don’t forget to keep your eyes on the road; after all, we wouldn’t want you to miss a tutorial while mastering the art of shaders on the go!

    #WetShader #BlenderTutorial #Clearcoat #3DRendering #DigitalArt
    Quick Tip: A Better Wet Shader
    John Mervin probably made the shortest Blender tutorial ever ;-) You could just add a clearcoat... But why stop there? P.S. Please do not record tutorials while driving. Source
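For anyone curious about why the joke works at all: a "wet" look really is mostly a glossy clearcoat layered over the base material, weighted by a Fresnel term that strengthens at grazing angles. A minimal sketch of that layering math in Python (the function names and the 1.5 coat IOR are illustrative assumptions, not Blender's actual Principled BSDF implementation):

```python
def schlick_fresnel(cos_theta: float, ior: float = 1.5) -> float:
    """Schlick's approximation of Fresnel reflectance for a dielectric coat."""
    f0 = ((ior - 1.0) / (ior + 1.0)) ** 2  # reflectance at normal incidence (~0.04 for IOR 1.5)
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def clearcoat_mix(base: float, coat_spec: float, cos_theta: float, coat_weight: float = 1.0) -> float:
    """Layer a glossy coat over a base shade: the coat's Fresnel term decides
    how much specular highlight covers the underlying material."""
    f = coat_weight * schlick_fresnel(cos_theta)
    return (1.0 - f) * base + f * coat_spec

# Facing the surface head-on, the coat is subtle...
head_on = clearcoat_mix(base=0.2, coat_spec=1.0, cos_theta=1.0)
# ...but at grazing angles the Fresnel term dominates, which is the wet sheen.
grazing = clearcoat_mix(base=0.2, coat_spec=1.0, cos_theta=0.05)
```

That angle dependence is why one extra shader node can read as "wet": the highlight only takes over where a film of water actually would.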
  • Game Dev Digest Issue #286 - Design Tricks, Deep Dives, and more

    This article was originally published on GameDevDigest.com. Enjoy!

    What was Radiant AI, anyway? - A ridiculously deep dive into Oblivion's controversial AI system and its legacy. (blog.paavo.me)
    Consider The Horse Game - No, I don't think every dev should make a horse game. But I do think every developer should at least look at them, maybe even play one, because it is very important that you understand the importance of genre, fandom, and how visibility works. Even if you are not making a horse game, the lessons you can learn by looking at this subgenre carry over to other genres, just not as blatantly clearly as they do with horse games. (howtomarketagame.com)
    Making a killing: The playful 2D terror of Psycasso® - I sat down with lead developer Benjamin Lavender and Omni, designer and producer, to talk about this playfully gory game that gives a classic retro style a fresh (if gruesome) twist. (Unity)
    Introduction to Asset Manager transfer methods in Unity - Unity's Asset Manager is a user-friendly digital asset management platform supporting over 70 file formats to help teams centralize, organize, discover, and use assets seamlessly across projects. It reduces redundant work by design, making cross-team collaboration smoother and accelerating production workflows. (Unity)

    Videos
    Rules of the Game: Five Tricks of Highly Effective Designers - Every working designer has them: unique techniques or "tricks" that they use when crafting gameplay. Sure, there's the general game design wisdom that everyone agrees on and can be found in many a game design book, but experienced game designers often have very specific rules that are personal to them, techniques that not everyone knows about or even agrees with. In this GDC 2015 session, five experienced game designers join the stage for 10 minutes each to share one game design "trick" that they use. (Game Developers Conference)
    Binding of Isaac Style Room Generator in Unity - Our third part in the series: making the rooms! (Game Dev Garnet)
    Introduction to Unity Behavior | Unity Tutorial - In this video you'll become familiar with the core concepts of Unity Behavior, including a live example. (LlamAcademy)
    How I got my demo ready for Steam Next Fest - It's Steam Next Fest, and I've got a game in the showcase. So here are 7 tips for making the most of this demo-sharing festival. (Game Maker's Toolkit)
    Optimizing lighting in Projekt Z: Beyond Order - 314 Arts studio lead and founder Justin Miersch discusses how the team used the Screen Space Global Illumination feature in Unity's High Definition Render Pipeline (HDRP), along with the Unity Profiler and Timeline, to overcome the lighting challenges they faced in building Projekt Z: Beyond Order. (Unity)
    Memory Arenas in Unity: Heap Allocation Without the GC - In this video, we explore how to build a custom memory arena in Unity using unsafe code and manual heap allocation. You'll learn how to allocate raw memory for temporary graph-like structures, such as crafting trees or decision planners, without triggering the garbage collector. We'll walk through the concept of stack frames, translate that to heap-based arena allocation, and implement a fast, disposable system that gives you full control over memory layout and lifetime. Perfect for performance-critical systems where GC spikes aren't acceptable. (git-amend)
    Cloth Animation Using The Compute Shader - In this video, we dive into cloth simulation using OpenGL compute shaders. By applying simple mathematical equations, we'll achieve smooth, dynamic movement. We'll explore particle-based simulation, tackle synchronization challenges with double buffering, and optimize rendering using triangle strips for efficient memory usage. Whether you're familiar with compute shaders or just getting started, this is the perfect way to step up your real-time graphics skills! (OGLDEV)
    How we're designing games for a broader audience - Our games are too hard. (BiteMe Games)

    Assets
    Learn Game Dev - Unity, Godot, Unreal, Gamemaker, Blender & C# - Make games like a pro. Passionate about video games? Then start making your own! Our latest bundle will help you learn vital game development skills. Master the most popular creation platforms like Unity, Godot, Unreal, GameMaker, Blender, and C#—now that's a sharp-lookin' bundle! Build a 2.5D farming RPG with Unreal Engine, create a micro turn-based RPG in Godot, explore game optimization, and so much more.
    Big Bang Unreal & Unity Asset Packs Bundle - 5000+ unrivaled assets in one bundle. Calling all game devs—build your worlds with this gigantic bundle of over 5000 assets, including realistic and stylized environments, SFX packs, and powerful tools. Perfect for hobbyists, beginners, and professional developers alike, you'll gain access to essential resources, tutorials, and beta-testing-ready content to start building immediately. The experts at Leartes Studios have curated an amazing library packed with value, featuring environments, VFX packs, and tutorial courses on Unreal Engine, Blender, Substance Painter, and ZBrush. Get the assets you need to bring your game to life—and help support One Tree Planted with your purchase! This bundle provides Unity Asset Store keys directly with your purchase, and FAB keys via redemption through Cosmos, if the product is available on those platforms. (Humble Bundle Affiliate)
    Gameplay Tools 50% Off - Core systems, half the price. Get pro-grade tools to power your gameplay: combat, cutscenes, UI, and more. Including HTrace: World Space Global Illumination, VFX Graph - Ultra Mega Pack - Vol.1, Magic Animation Blend, Utility Intelligence: Utility AI Framework for Unity 6, and Build for iOS/macOS on Windows. (Unity Affiliate)
    Free field recordings and foley sounds - "Hi guys, I created a website about 6 years ago on which I host all my field recordings and foley sounds, all free to download and use (CC0). There are currently 50+ packs with thousands of sounds and hours of field recordings, all perfect for game SFX and UI." Game designers can benefit from the wide range of sounds on the site, especially those that enhance immersion and atmosphere. (signaturesounds.org)
    SmartAddresser - Automate addressing, labeling, and version control for Unity's Addressable Asset System. (CyberAgentGameEntertainment, Open Source)
    EasyCS - An easy-to-use and flexible framework for Unity, adopting a Data-Driven Entity & Actor-Component approach. It bridges Unity's classic OOP with powerful data-oriented patterns, without forcing a complete ECS paradigm shift or a mindset change. Build smarter, not harder. (Watcher3056, Open Source)
    Binding-Of-Isaac_Map-Generator - Binding of Isaac map generator for Unity 2D. (GarnetKane99, Open Source)
    Helion - A modern fast-paced Doom FPS engine. (Helion-Engine, Open Source)
    PixelationFx - Pixelation post effect for Unity URP. (NullTale, Open Source)
    Extreme Add-Ons Bundle For Blender & ZBrush - Extraordinary quality, extreme add-ons. Get quality add-ons for Blender and ZBrush with our latest bundle! We've teamed up with the pros at FlippedNormals to deliver a gigantic library of powerful tools for your next game development project. Add new life to your creative work with standout assets like Real-time Hair ZBrush Plugin, Physical Starlight and Atmosphere, Easy Mesh ZBrush Plugin, and more. Get the add-ons you need to bring color and individuality to your next project—and help support Extra Life with your purchase! (Humble Bundle Affiliate)
    Shop up to 50% off Gabriel Aguiar Prod - Publisher Sale - Gabriel Aguiar Prod. is best known for his extensive VFX assets that help many developers prototype and ship games with special effects. His support and educational material are also invaluable resources for the game dev community. PLUS: get VFX Graph - Stylized Fire - Vol. 1 for FREE with code GAP2025. (Unity Affiliate)

    Spotlight
    Dream Garden - A simulation game about building tiny, cute garden dioramas. A large selection of tools, plants, decorations, and customization awaits you. Try all of them and create your dream garden. You can find it on Steam. (Campfire Studio)
    My game, Call Of Dookie - Demo available on Steam.

    You can subscribe to the free weekly newsletter on GameDevDigest.com. This post includes affiliate links; I may receive compensation if you purchase products or services from the different links provided in this article.
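Two of the items in this issue (the room-generator video and the open-source map generator) revolve around the same classic technique: growing an Isaac-style floor by attaching rooms to random free neighbours on a grid. A rough Python sketch of the idea, where the expansion rule and parameter names are my assumptions rather than the tutorial's actual code:

```python
import random

def generate_rooms(target: int, seed=None) -> set:
    """Grow an Isaac-style floor: start at the origin and repeatedly
    attach a new room to a random existing room's free neighbour."""
    rng = random.Random(seed)
    rooms = {(0, 0)}
    while len(rooms) < target:
        x, y = rng.choice(sorted(rooms))  # sorted() keeps the choice reproducible per seed
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        candidate = (x + dx, y + dy)
        if candidate not in rooms:
            rooms.add(candidate)
    return rooms

def is_connected(rooms: set) -> bool:
    """Every room must be reachable from the start via shared walls."""
    frontier, seen = [(0, 0)], {(0, 0)}
    while frontier:
        x, y = frontier.pop()
        for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
            n = (x + dx, y + dy)
            if n in rooms and n not in seen:
                seen.add(n)
                frontier.append(n)
    return seen == rooms

floor = generate_rooms(10, seed=42)
```

Because every new room is glued to an existing one, connectivity holds by construction; real generators layer rules on top of this (dead ends for treasure rooms, a boss room far from the start), but the skeleton is this loop.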
    #game #dev #digest #issue #design
    GAMEDEV.NET
  • Confidential Killings [Free] [Adventure] [macOS]

    Set in the glitzy world of Hollywood in the late '70s, Confidential Killings has you investigate a series of gruesome murders that seem connected. There are rumours of a mysterious cult behind them...
    Explore the crime scenes, use your detective skills to deduce what's going on!
    Wishlist on Steam!
    Our Discord: | Development log: Demo out! (10 days ago)

    Comments (log in with itch.io to leave a comment):
    - I LOVE it! The art, the gameplay, the story, it's so much fun!
    - First I was like "nah, so you just want to check if I have read everything, or what?" but later it made sense with the twists and hunting for the word you already know but need to find elsewhere.
    - Picto Games (21 hours ago): The cursor is blinking, it is very disturbing, and the game very good.
    - BRANE (15 hours ago): I recommend trying the desktop builds if you'd like to play without this issue. Or putting more fire on this PR of Godot:
    - (1 day ago): Nice game
    - lovedovey666 (1 day ago): I love this game! I like detective games and this is perfect :3
    - This is a great game! The old-style detective game ambience is superb, and the art sublime. The mysteries were entertaining and interesting enough to keep you going as you work out what truly happened!
    - I had to take notes... my memory ain't great lol. Really enjoyed it.
    - Sebbog (1 day ago): This game is kind of like the detective games The Case of the Golden Idol and its sequel, The Rise of the Golden Idol, from 2022 and 2024 respectively. It's not just nonsense; it has a coherent story. If you haven't heard of the Golden Idol games, it's basically a game where you investigate mysterious deaths and fill in the blanks of the story. You can navigate between multiple scenes and click on people and objects to gather important clues. I think it was a good game, and I like that it's similar to the Golden Idol games. I also liked that you could see the exact number of wrong slots when it's two or fewer: it said either two incorrect or one incorrect. This isn't how it works in the Golden Idol games, though it might make the game too easy; I'm not sure. I also streamed this game on YouTube.
    - MV_Zarch (3 days ago): I'm so happy I found this game. Amazing! The mysteries are just so good and well done. The art is beautiful and really sets up the atmosphere well. I am really interested to see the full game.
    - reveoncelink (5 days ago): It was amazing!! Perfect gameplay and so many clues to connect the dots. Amazing.
    - Hey, this is a great game except for the flickering of the cursor. It's the same for your other games. Hope this gets fixed!
    - BRANE (6 days ago): Heya! For the flickering issue I'm not really sure what the problem is, but a screen recording of it could help. Other than that, we're not that focused on fixing the web build as it's going to be a PC game, so I suggest trying the Windows build.
    - Really fun! Wishing you guys lots of luck!
    - Thank you!
    - kcouchpotato (8 days ago): This game is so awesome!! I've wishlisted it on Steam.
    - BRANE (8 days ago): Thank you!
    #confidential #killings #free #adventure #macos
    Confidential Killings [Free] [Adventure] [macOS]
    Set in the glitzy world of Hollywood in the late '70s, Confidential Killings have you investigate a series of gruesome murders that seem connected. There are rumours about a mysterious cult behind them...  Explore the crime scenes, use your detective skills to deduce what's going on! Wishlist on Steam! our discord:  informationDownloadDevelopment logDemo out! 10 days agoCommentsLog in with itch.io to leave a comment.I LOVE it! The art, the gameplay, the story, it's so much fun!ReplyFirst I was like "nah, so you just want to check if I have read everything, or what?" but later it made sense with the twists and hunting for the word you already know but need to find elsewhere.ReplyPicto Games21 hours agothe cursor is blinking it is very disturbing and the game very goodReplyBRANE15 hours agoI recommend trying the desktop builds if you'd like to play without this issue. Or putting more fire on this PR of Godot: day agoNice gameReplylovedovey6661 day agoI love this game! i like the detective games and this is perfect :3ReplyThis is a great game! The old style detective game ambientation is superb, and the art sublime. The misteries were pretty entertaining and interesting to keep you going as you think what truly happened!ReplyI had to take notes.... my memory aint great lol really enjoyed it ReplySebbog1 day agoThis game is kind of like the detective games the Case of the Golden Idol and its sequel, the Rise of the Golden Idol, from 2022 and 2024 respectively. It's not just bullshit. It has a coherent story. If you haven't heard of the Golden Idol games, then it's basically a game where you investigate mysterious deaths and fill in the blanks of the story. You can navigate from multiple different scenes and click on people and objects to gather important clues. I think it was a good game. I like that it's similar to the Golden Idol games. I also liked that you could see the exact amount of wrong slots when it's less than or equal to 2. 
It said either two incorrect or one incorrect. This isn't how it works in the Golden Idol games. Although, this might make the game too easy. I am not sure tho. I also streamed this game on YouTube: ReplyMV_Zarch3 days agoI’m so happy I found this game. Amazing! The mysteries are just so good and well done. The art is beautiful and really sets up the atmosphere well. I am really interested to see the full game.Replyreveoncelink5 days agoIt was amazing!! Perfect gameplay and so many clues to connect the dots. Amazing.ReplyHey, this is a great game except for the flickering of the cursor. It’s the same for your other games. Hope this gets fixed!ReplyBRANE6 days agoheya! For the flickering issue I'm not really sure what's the problem, but having a screen recording of it could help.Other than that we're not that focused on fixing the web build as it's going to be a PC game - so I suggest trying the Windows buildReplyReally fun! Wishing you guys lots of luck!ReplyThank you!Replykcouchpotato8 days agoThis game is so awesome!! I've wishlisted it on steam.ReplyBRANE8 days agoThank you!Reply #confidential #killings #free #adventure #macos
    BRANEGAMES.ITCH.IO
    Confidential Killings [Free] [Adventure] [macOS]
    Set in the glitzy world of Hollywood in the late '70s, Confidential Killings has you investigate a series of gruesome murders that seem connected. There are rumours about a mysterious cult behind them... Explore the crime scenes and use your detective skills to deduce what's going on!

    Wishlist on Steam! https://store.steampowered.com/app/2797960/Confidential_Killings
    Join our discord: https://discord.gg/xwFXgbb2xf

    Development log: Demo out! (10 days ago)

    Comments:
    - "I LOVE it! The art, the gameplay, the story, it's so much fun!"
    - "First I was like 'nah, so you just want to check if I have read everything, or what?' but later it made sense with the twists and hunting for the word you already know but need to find elsewhere."
    - Picto Games (21 hours ago): "The cursor is blinking, it is very disturbing, and the game very good."
    - BRANE (15 hours ago): "I recommend trying the desktop builds if you'd like to play without this issue. Or putting more fire on this PR of Godot: https://github.com/godotengine/godot/pull/103304"
    - beautifulDegen (1 day ago): "Nice game"
    - lovedovey666 (1 day ago): "I love this game! I like the detective games and this is perfect :3"
    - "This is a great game! The old-style detective game ambience is superb, and the art sublime. The mysteries were pretty entertaining and interesting enough to keep you going as you think about what truly happened!"
    - "I had to take notes... my memory ain't great, lol. Really enjoyed it."
    - Sebbog (1 day ago): "This game is kind of like the detective games The Case of the Golden Idol and its sequel, The Rise of the Golden Idol, from 2022 and 2024 respectively. It's not just bullshit; it has a coherent story. If you haven't heard of the Golden Idol games, it's basically a game where you investigate mysterious deaths and fill in the blanks of the story. You can navigate between multiple scenes and click on people and objects to gather important clues. I think it was a good game, and I like that it's similar to the Golden Idol games. I also liked that you could see the exact number of wrong slots when it's less than or equal to 2: it said either two incorrect or one incorrect. This isn't how it works in the Golden Idol games, although it might make the game too easy; I'm not sure. I also streamed this game on YouTube."
    - MV_Zarch (3 days ago): "I'm so happy I found this game. Amazing! The mysteries are just so good and well done. The art is beautiful and really sets up the atmosphere well. I am really interested to see the full game."
    - reveoncelink (5 days ago): "It was amazing!! Perfect gameplay and so many clues to connect the dots."
    - "Hey, this is a great game except for the flickering of the cursor. It's the same for your other games (We Suspect Foul Play, afaik). Hope this gets fixed! (I'm on Chrome)"
    - BRANE (6 days ago): "Heya! For the flickering issue I'm not really sure what the problem is, but a screen recording of it could help. Other than that, we're not that focused on fixing the web build, as it's going to be a PC game, so I suggest trying the Windows build."
    - "Really fun! Wishing you guys lots of luck!" / BRANE: "Thank you!"
    - kcouchpotato (8 days ago): "This game is so awesome!! I've wishlisted it on Steam."
    - BRANE (8 days ago): "Thank you!"
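One commenter's description of the feedback system (revealing the exact number of wrong slots only when it is two or fewer) is a small rule worth sketching. Below is a minimal Python illustration of that rule; the function name, the threshold parameter, and the example answers are illustrative assumptions, not taken from the game's actual code.

```python
# Sketch of the hint rule a commenter describes: reveal the exact count of
# incorrect slots only when that count is at or below a small threshold.
# Names and examples are hypothetical, not from Confidential Killings itself.

def wrong_slot_hint(guesses: list, solution: list, threshold: int = 2) -> str:
    """Compare filled-in slots against the solution and produce a hint."""
    wrong = sum(1 for g, s in zip(guesses, solution) if g != s)
    if wrong == 0:
        return "all correct"
    if wrong <= threshold:
        return f"{wrong} incorrect"   # precise feedback near the answer
    return "several incorrect"        # stay vague when far off

print(wrong_slot_hint(["cult", "knife"], ["cult", "rope"]))  # 1 incorrect
print(wrong_slot_hint(["a", "b", "c"], ["x", "y", "z"]))     # several incorrect
```

As the commenter notes, precise counts near the solution make the endgame easier than the Golden Idol games, which never reveal exact counts.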
  • One of the most versatile action cameras I've tested isn't from GoPro - and it's on sale

    WWW.ZDNET.COM
    DJI Osmo Action 4. (Image: Adrian Kingsley-Hughes/ZDNET)

    Multiple DJI Osmo Action 4 packages are on sale at Amazon. Both the Essential and Standard Combos have been discounted to $249, while the Adventure Combo has dropped to $349.

    DJI might not be the first name on people's lips when it comes to action cameras, but the company that's better known for its drones also has a really solid line of action cameras. And its latest device, the Osmo Action 4 camera, has some very impressive tricks up its sleeve.

    Also: One of the most versatile cameras I've used is not from Sony or Canon and it's on sale

    So, what sets this action camera apart from the competition? Let's take a look.

    First off, this is not just an action camera -- it's a pro-grade action camera. From a hardware point of view, the Osmo Action 4 features a 1/1.3-inch image sensor that can record 4K at up to 120 frames per second (fps). This sensor is combined with a wide-angle f/2.8 aperture lens that provides an ultra-wide field of view of up to 155°. And that's wide.

    Build quality and fit and finish are second to none. (Image: Adrian Kingsley-Hughes/ZDNET)

    For when the going gets rough, the Osmo Action 4 offers 360° HorizonSteady stabilization modes, including RockSteady 3.0/3.0+ for first-person video footage and HorizonBalancing/HorizonSteady modes for horizontal shots. That's pro-grade hardware right there.

    Also: This new AI video editor is an all-in-one production service for filmmakers - how to try it

    The Osmo Action 4 also features a 10-bit D-Log M color mode. This mode allows the sensor to record over one billion colors and offers a wider dynamic range, giving you video that is more vivid and shows greater detail in the highlights and shadows. Combined with an advanced color temperature sensor, this means the colors have a true-to-life feel whether you're shooting outdoors, indoors, or even underwater.

    The DJI Osmo Action 4 ready for action. (Image: Adrian Kingsley-Hughes/ZDNET)

    I've added some video output from the Osmo Action 4 below, with examples in both 1080p and 4K. To test the stabilization, I attached the camera to the truck and took it on some roads, some of which are pretty rough. The Osmo Action 4 had no problem with that terrain. I also popped the camera into the sea, just because. And again, no problem.

    I've also captured a few time-lapses with the camera -- not because I like clouds (well, actually, I do like clouds), but because pointing a camera at the sky is a good test of how it handles changing light.

    Also: I recommend this action camera to beginners and professional creators. Here's why

    Timelapses with action cameras can suffer from unsightly exposure changes that cause the image to pulse, a condition known as exposure pumping. This issue can also cause the white balance to change noticeably in a video, but the Osmo Action 4 handled this test well. All the footage I've shot is what I've come to expect from a DJI camera, whether from an action camera or a drone -- crisp, clear, vivid, and nice and stable.

    The Osmo Action 4 is packed with electronic image-stabilization (EIS) tech to ensure that your footage is smooth and on the horizon. It's worth noting the limitations of EIS -- it's not supported in slow-motion and timelapse modes, and the HorizonSteady and HorizonBalancing features are only available for video recorded at 1080p (16:9) or 2.7K (16:9) with a frame rate of 60fps or below.

    On the durability front, I've no concerns. I've subjected the Osmo Action 4 to a hard few days of testing, and it hasn't let me down or complained once. It takes impacts like a champ, and being underwater or in dirt and sand is no problem at all.

    Also: I'm a full-time Canon photographer, but this Nikon camera made me wonder if I'm missing out

    You might think that this heavy-duty testing would be hard on the camera's tiny batteries, but you'd be wrong. Remember I said the Osmo Action 4 offered hours of battery life? Well, I wasn't kidding.

    The Osmo Action 4's ultra-long-life batteries are incredible. (Image: Adrian Kingsley-Hughes/ZDNET)

    DJI says that a single battery can deliver up to 160 minutes of 1080p/24fps video recording (at room temperature, with RockSteady on, Wi-Fi off, and the screen off). That's over two and a half hours of recording time. In the real world, I was blown away by how much a single battery can deliver. I shot video and timelapse, messed around with a load of camera settings, transferred that footage to my iPhone, and still had 16% battery left. No action camera has delivered so much for me on one battery.

    The two extra batteries and the multifunction case that come as part of the Adventure Combo are worth the extra $100. (Image: Adrian Kingsley-Hughes/ZDNET)

    And when you're ready to recharge, a 30W USB-C charger can take a battery from zero to 80% in 18 minutes. That's also impressive. What's more, the batteries are resistant to cold, offering up to 150 minutes of 1080p/24fps recording in temperatures as low as -20°C (-4°F). That resistance also blows the competition away.

    Even taking into account all these strong points, the Osmo Action 4 offers even more. The camera has 2x digital zoom for better composition, Voice Prompts that let you know what the camera is doing without looking, and Voice Control that lets you operate the device without touching the screen or using the app. The Osmo Action 4 also digitally hides the selfie stick in a variety of shots, and you can even connect the DJI Mic to the camera via the USB-C port for better audio capture.

    Also: Yes, an Android tablet finally made me reconsider my iPad Pro loyalty

    As for price, the Osmo Action 4 Standard Combo bundle comes in at $399, while the Osmo Action 4 Adventure Combo, which adds two extra Osmo Action Extreme batteries, an additional mini Osmo Action quick-release adapter mount, a battery case that acts as a power bank, and a 1.5-meter selfie stick, is $499.

    I'm in love with the Osmo Action 4. It's hands down the best, most versatile, most powerful action camera on the market today, offering pro-grade features at a price that definitely isn't pro-grade.

    Everything included in the Action Combo bundle. (Image: DJI)

    DJI Osmo Action 4 tech specs:
    - Dimensions: 70.5×44.2×32.8mm
    - Weight: 145g
    - Waterproof: 18m, up to 60m with the optional waterproof case
    - Microphones: 3
    - Sensor: 1/1.3-inch CMOS
    - Lens: FOV 155°, aperture f/2.8, focus distance 0.4m to ∞
    - Max photo resolution: 3648×2736
    - Max video resolution: 4K (4:3) 3840×2880 at 24/25/30/48/50/60fps; 4K (16:9) 3840×2160 at 24/25/30/48/50/60/100/120fps
    - ISO range: 100-12800
    - Front screen: 1.4-inch, 323ppi, 320×320
    - Rear screen: 2.25-inch, 326ppi, 360×640
    - Front/rear screen brightness: 750±50 cd/m²
    - Storage: microSD (up to 512GB)
    - Battery: 1770mAh, lab-tested at up to 160 minutes of runtime (at room temperature, 25°C/77°F, 1080p/24fps, RockSteady on, Wi-Fi off, screen off)
    - Operating temperature: -20° to 45°C (-4° to 113°F)

    This article was originally published in August of 2023 and updated in March 2025.
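The battery claims above lend themselves to a quick sanity-check sketch. The 160-minutes-per-battery figure and the 0-to-80% in 18 minutes figure come from the review; the three-battery total for the Adventure Combo is a simple projection, not a measured result.

```python
# Quick arithmetic on the review's battery figures (hypothetical projection,
# not measurements): total runtime across spare batteries, and the average
# charge rate implied by the 0-80%-in-18-minutes claim.

MINUTES_PER_BATTERY = 160  # DJI's lab figure for 1080p/24fps recording

def total_runtime_min(batteries: int) -> int:
    """Projected total recording time across a set of batteries."""
    return batteries * MINUTES_PER_BATTERY

def charge_rate_pct_per_min(percent: float, minutes: float) -> float:
    """Average charge rate implied by a fast-charge claim."""
    return percent / minutes

# The Adventure Combo ships with three batteries in total.
print(total_runtime_min(3))                         # 480 minutes, i.e. 8 hours
print(round(charge_rate_pct_per_min(80, 18), 1))    # roughly 4.4% per minute
```

In practice the real totals would vary with temperature, resolution, and settings, as the review's own lab-test caveats make clear.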
  • Double-Whammy When AGI Embeds With Humanoid Robots And Occupies Both White-Collar And Blue-Collar Jobs

    AGI will be embedded into humanoid robots, which makes white-collar and blue-collar jobs a target for walking/talking automation. (Image: Getty)
    In today’s column, I examine the highly worrisome qualms expressed that the advent of artificial general intelligence (AGI) is likely to usurp white-collar jobs. The stated concern is that since AGI will be on par with human intellect, any job that relies principally on intellectual pursuits, such as typical white-collar work, will be taken over via the use of AGI. Employers will realize that rather than dealing with human white-collar workers, they can more readily get the job done via AGI. This, in turn, has led to a rising call that people should aim toward blue-collar jobs, because those forms of employment supposedly will not be undercut via AGI.

    Sorry to say, that misses the bigger picture, namely that AGI, when combined with humanoid robots, is coming not only for white-collar jobs but for blue-collar jobs too. It is a proverbial double-whammy when it comes to the attainment of AGI.

    Let’s talk about it.

    This analysis of an innovative AI breakthrough is part of my ongoing Forbes column coverage on the latest in AI, including identifying and explaining various impactful AI complexities.

    Heading Toward AGI And ASI
    First, some fundamentals are required to set the stage for this weighty discussion.
    There is a great deal of research going on to further advance AI. The general goal is to either reach artificial general intelligence (AGI) or maybe even attain the outstretched possibility of artificial superintelligence (ASI).
    AGI is AI that is considered on par with human intellect and can seemingly match our intelligence. ASI is AI that has gone beyond human intellect and would be superior in many if not all feasible ways. The idea is that ASI would be able to run circles around humans by outthinking us at every turn. For more details on the nature of conventional AI versus AGI and ASI, see my analysis at the link here.
    We have not yet attained AGI.
    In fact, it is unknown whether we will ever reach AGI; it might be achieved decades or perhaps centuries from now. The AGI attainment dates that are floating around are wildly varying and wildly unsubstantiated by any credible evidence or ironclad logic. ASI is even more beyond the pale when it comes to where we are currently with conventional AI.
    AGI Problem Only Half Seen
    Before launching into the primary matter at hand in this discussion, let’s contemplate a famous quote attributed to Charles Kettering, a legendary inventor, who said, “A problem well-stated is a problem half-solved.”

    I bring this up because those loud clamors right now about the assumption that AGI will replace white-collar workers are only seeing half of the problem. The problem as they see it is that since AGI is intellectually on par with humans, and since white-collar workers mainly use intellect in their work endeavors, AGI is going to be used in place of humans for white-collar work.
    I will in a moment explain why that’s only half of the problem and there is a demonstrative need to more carefully and fully articulate the nature of the problem.
    Will AGI Axiomatically Take White-Collar Jobs?
    On a related facet, the belief that AGI will axiomatically replace white-collar labor makes a number of other related key assumptions. I shall briefly explore those and then come back to why the problem itself is only half-baked.
    Presumably, the cost of using AGI for white-collar work will need to make it a better ROI choice than human workers. If not, then an employer would be wiser to stick with humans rather than employing AGI. There often seems to be an unstated belief that AGI is necessarily going to be a less costly route than employing humans.
    We don’t know yet what the cost of using AGI will be.
    It could be highly expensive. Indeed, some are worried that the world will divide into the AGI haves and AGI have-nots, partially due to the exorbitant cost that AGI might involve. If AGI is free to use, well, that would seem to be the nail in the coffin for using human workers in the same capacity. Another possibility is that AGI turns out to be relatively inexpensive in comparison to human labor. In that case, the use of AGI is likely to win out over human labor.
    But if the cost of AGI is nearer to the all-in cost of human labor, or exceeds it, then employers would rationally need to weigh the use of one versus the other.
    Note that when referring to the cost of human labor, there is more to that calculation than simply the dollar-hour labor rate per se. There are lots of other less apparent costs, such as the cost to manage human labor, the cost of dealing with HR-related issues, and many other factors that come into the weighty matter. Thus, an AGI versus human labor ROI will be more complex than it might seem at an initial glance. In addition, keep in mind that AGI would seemingly be readily switched on and off, and have other capacities that human labor would not equally tend to allow.
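    To make that weighing concrete, here is a toy back-of-the-envelope sketch. All figures, the flat overhead rate, and the utilization factor are hypothetical assumptions of mine for illustration, not numbers from this column:

```python
# Illustrative only: a toy ROI comparison of AGI "labor" vs. human labor.
# Every number here is a hypothetical assumption, not data.

def fully_loaded_human_cost(wage_per_hour, hours, overhead_rate=0.35):
    """Hourly wage plus the less apparent costs (management, HR, benefits),
    modeled crudely as a flat overhead percentage -- an assumption."""
    return wage_per_hour * hours * (1 + overhead_rate)

def agi_cost(rate_per_hour, hours, utilization=1.0):
    """AGI billed per active hour; unlike human labor, it could be
    switched on and off, so only utilized hours incur cost."""
    return rate_per_hour * hours * utilization

# One nominal work-month at the same hourly rate for both.
human = fully_loaded_human_cost(wage_per_hour=40, hours=160)
agi = agi_cost(rate_per_hour=40, hours=160, utilization=0.75)

# Even at identical nominal rates, overhead and utilization
# can tip the comparison either way.
print(f"human: ${human:,.0f} vs. agi: ${agi:,.0f}")
```

    The point of the sketch is simply that the comparison hinges on factors beyond the dollar-hour rate, exactly as argued above.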
    The Other Half Is Coming Too
    Assume that by and large the advent of AGI will decimate the need for white-collar human labor. The refrain right now is that people should begin tilting toward blue-collar jobs as an alternative to white-collar jobs. This is a logical form of thinking in the sense that AGI as an intellectual mechanism would be unable to compete in jobs that involve hands-on work.
    A plumber needs to come to your house and do hands-on work to fix your plumbing. This is a physicality that entails arriving at your physical home, physically bringing and using tools, and physically repairing your faulty home plumbing. A truck driver likewise needs to sit in the cab of a truck and drive the vehicle. These are physically based tasks.
    There is no getting around the fact that these are hands-on activities.
    Aha, yes, those are physical tasks, but that doesn’t necessarily mean that only human hands can perform them. The gradual emergence of humanoid robots will provide an alternative to human hands. A humanoid robot is a type of robot that is built to resemble a human in form and function. You’ve undoubtedly seen those types of robots in the many online video recordings showing them walking, jumping, grasping at objects, and so on.
    A tremendous amount of active research and development is taking place to devise humanoid robots. They look comical right now. You watch those videos and laugh when the robot trips over a mere stick lying on the ground, something that a human would seldom trip over. You scoff when a robot tries to grasp a coffee cup and inadvertently spills most of the coffee. It all seems humorous and a silly pursuit.
    Keep in mind that we are all observing the development process while it is still underway. At some point, the guffaws at humanoid robots will lessen. Humanoid robots will become as smooth and graceful as humans, and they will continue to be honed. Eventually, humanoid robots will be less prone to the physical errors that humans make. In a sense, the physicality of a humanoid robot will be on par with humans, if not better, due to its mechanical properties.
    Do not discount the coming era of quite physically capable humanoid robots.
    AGI And Humanoid Robots Pair Up
    You might remember that in The Wonderful Wizard of Oz, the fictional character known as the Scarecrow lacked a brain.
    Without seeming to anthropomorphize humanoid robots, the current situation is that those robots typically use a form of AI that is below the sophistication level of modern generative AI. That’s fine for now due to the need to first ensure that the physical movements of the robots get refined.
    I have discussed how the so-called realm of Physical AI is going to be a huge breakthrough with incredible ramifications; see my analysis at the link here. The idea underlying Physical AI is that the AI of today is being uplifted by doing data training on the physical world. This also tends to include the use of World Models, consisting of broad constructions about how the physical world works, such as that we are bound to operate under conditions of gravity and other physical laws of nature; see the link here.
    The bottom line here is that there will be a close pairing of robust AI with humanoid robots.
    Imagine what a humanoid robot can accomplish if it is paired with AGI.
    I’ll break the suspense and point out that AGI paired with humanoid robots means that those robots readily enter the blue-collar worker realm. Suppose your plumbing needs fixing. No worries, a humanoid robot that encompasses AGI will be sent to your home. The AGI is astute enough to carry on conversations with you, and the AGI also fully operates the robot to undertake the plumbing tasks.
    How did the AGI-paired humanoid robot get to your home?
    Easy-peasy, it drove a car or truck to get there.
    I’ve previously predicted that all the work on devising autonomous vehicles and self-driving cars will get shaken up once we have suitable humanoid robots devised. There won’t be a need for a vehicle to contain self-driving capabilities. A humanoid robot will simply sit in the driver’s seat and drive the vehicle. This is a much more open-ended solution than having to craft components that go into and onto a vehicle to enable self-driving. See my coverage at the link here.
    Timing Is Notable
    One of the reasons that many do not give much thought to the pairing of AGI with humanoid robots is that today’s humanoid robots seem extraordinarily rudimentary and incapable of performing physical dexterity tasks on par with human capabilities. Meanwhile, there is brazen talk that AGI is just around the corner.
    AGI is said to be within our grasp.
    Let’s give the timing considerations a bit of scrutiny.
    There are three primary timing angles:

    Option 1: AGI first, then humanoid robots. AGI is attained before humanoid robots are sufficiently devised.
    Option 2: Humanoid robots first, then AGI. Humanoid robots are physically fluently adept before AGI is attained.
    Option 3: AGI and humanoid robots arrive at about the same time. AGI is attained and, at the same time, it turns out that humanoid robots are fluently adept too, mainly by coincidence and not due to any cross-mixing.

    A skeptic would insist that there is a fourth possibility, consisting of the possibility that we never achieve AGI and/or we fail to achieve sufficiently physically capable humanoid robots. I am going to reject that possibility. Perhaps I am overly optimistic, but it seems to me that we will eventually attain AGI, and we will eventually attain physically capable humanoid robots.
    I shall next consider each of the three genuinely reasonable possibilities in turn.
    Option 1: AGI First, Then Humanoid Robots
    What if we manage to attain AGI before we manage to achieve physically fluent humanoid robots?
    That’s just fine.
    We would indubitably put AGI to work as a partner with humans in figuring out how we can push along the budding humanoid robot development process. It seems nearly obvious that with AGI’s capable assistance, we would overcome any bottlenecks and soon enough arrive at top-notch physically adept humanoid robots.
    At that juncture, we would then toss AGI into the humanoid robots and have ourselves quite an amazing combination.
    Option 2: Humanoid Robots First, Then AGI
    Suppose that we devise very physically adept humanoid robots but have not yet arrived at AGI.
    Are we in a pickle?
    Nope.
    We could use conventional advanced AI inside those humanoid robots. The combination would certainly be good enough for a wide variety of tasks. The odds are that we would need to be cautious about where such robots are utilized. Nonetheless, we would have essentially walking, talking, and productive humanoid robots.
    If AGI never happens, oh well, we end up with pretty good humanoid robots. On the other hand, once we arrive at AGI, those humanoid robots will be stellar. It’s just a matter of time.
    Option 3: AGI And Humanoid Robots At The Same Time
    Let’s consider the potential of AGI and humanoid robots perchance being attained around the same time. Assume that this timing isn’t due to an outright cross-mixing with each other. They just so happen to advance on a similar timeline.
    I tend to believe that’s the most likely of the three scenarios.
    Here’s why.
    First, despite all the hubris about AGI being within earshot, perhaps in the next year or two, which is a popular pronouncement by many AI luminaries, I tend to side with recent surveys of AI developers that put the date around the year 2040. Some AI luminaries sneakily play with the definition of AGI in hopes of making their predictions come true sooner, akin to moving the goalposts to easily score points. For my coverage of Sam Altman’s efforts at moving the cheese regarding AGI attainment, see the link here.
    Second, if you are willing to entertain the year 2040 as a potential date for achieving AGI, that’s about 15 years from now. In my estimation, the advancements being made in humanoid robots will readily progress such that by 2040 they will be very physically adept. It will probably be sooner, but let’s go with the year 2040 for ease of contemplation.
    In my view, we will likely have humanoid robots doing well enough that they will be put into use prior to arriving at AGI. The pinnacle of robust humanoid robots and the attainment of AGI will roughly coincide with each other.

    Two peas in a pod.
    Impact Of Enormous Consequences
    In an upcoming column posting, I will examine the enormous consequences of having AGI paired with fully physically capable humanoid robots. As noted above, this will have a humongous impact on white-collar work and blue-collar work. There will be gargantuan economic impacts, societal impacts, cultural impacts, and so on.
    Some final thoughts for now.
    A single whammy is already being hotly debated. The debates currently tend to be preoccupied with the loss of white-collar jobs due to the attainment of AGI. A saving grace seems to be that at least blue-collar jobs are going to be around and thriving, even once AGI is attained. The world doesn’t seem overly gloomy if you can cling to the upbeat posture that blue-collar tasks remain intact.
    The double whammy is a lot more to take in.
    But the double whammy is the truth. The truth needs to be faced. If you are having doubts as a human about the future, just remember the famous words of Vince Lombardi: “Winners never quit, and quitters never win.”
    Humankind can handle the double whammy.
    Stay tuned for my upcoming coverage of what this entails.
    #doublewhammy #when #agi #embeds #with
    Double-Whammy When AGI Embeds With Humanoid Robots And Occupies Both White-Collar And Blue-Collar Jobs
    AGI will be embedded into humanoid robots, which makes white-collar and blue-collar jobs a target ... More for walking/talking automation.getty In today’s column, I examine the highly worrisome qualms expressed that the advent of artificial general intelligenceis likely to usurp white-collar jobs. The stated concern is that since AGI will be on par with human intellect, any job that relies principally on intellectual pursuits such as typical white-collar work will be taken over via the use of AGI. Employers will realize that rather than dealing with human white-collar workers, they can more readily get the job done via AGI. This, in turn, has led to a rising call that people should aim toward blue-collar jobs, doing so becausethose forms of employment will not be undercut via AGI. Sorry to say, that misses the bigger picture, namely that AGI when combined with humanoid robots is coming not only for white-collar jobs but also blue-collar jobs too. It is a proverbial double-whammy when it comes to the attainment of AGI. Let’s talk about it. This analysis of an innovative AI breakthrough is part of my ongoing Forbes column coverage on the latest in AI, including identifying and explaining various impactful AI complexities. Heading Toward AGI And ASI First, some fundamentals are required to set the stage for this weighty discussion. There is a great deal of research going on to further advance AI. The general goal is to either reach artificial general intelligenceor maybe even the outstretched possibility of achieving artificial superintelligence. AGI is AI that is considered on par with human intellect and can seemingly match our intelligence. ASI is AI that has gone beyond human intellect and would be superior in many if not all feasible ways. The idea is that ASI would be able to run circles around humans by outthinking us at every turn. For more details on the nature of conventional AI versus AGI and ASI, see my analysis at the link here. 
We have not yet attained AGI. In fact, it is unknown as to whether we will reach AGI, or that maybe AGI will be achievable in decades or perhaps centuries from now. The AGI attainment dates that are floating around are wildly varying and wildly unsubstantiated by any credible evidence or ironclad logic. ASI is even more beyond the pale when it comes to where we are currently with conventional AI. AGI Problem Only Half Seen Before launching into the primary matter at hand in this discussion, let’s contemplate a famous quote attributed to Charles Kettering, a legendary inventor, who said, “A problem well-stated is a problem half-solved.” I bring this up because those loud clamors right now about the assumption that AGI will replace white-collar workers are only seeing half of the problem. The problem as they see it is that since AGI is intellectually on par with humans, and since white-collar workers mainly use intellect in their work endeavors, AGI is going to be used in place of humans for white-collar work. I will in a moment explain why that’s only half of the problem and there is a demonstrative need to more carefully and fully articulate the nature of the problem. Will AGI Axiomatically Take White-Collar Jobs On a related facet, the belief that AGI will axiomatically replace white-collar labor makes a number of other related key assumptions. I shall briefly explore those and then come back to why the problem itself is only half-baked. The cost of using AGI for doing white-collar work will need to be presumably a better ROI choice over human workers. If not, then an employer would be wiser to stick with humans rather than employing AGI. There seems to often be an unstated belief that AGI is necessarily going to be a less costly route than employing humans. We don’t know yet what the cost of using AGI will be. It could be highly expensive. 
Indeed, some are worried that the world will divide into the AGI haves and AGI have-nots, partially due to the exorbitant cost that AGI might involve. If AGI is free to use, well, that would seem to be the nail in the coffin related to using human workers for the same capacity. Another angle is that AGI is relatively inexpensive in comparison to human labor. In that case, the use of AGI is likely to win over human labor usage. But if the cost of AGI is nearer to the cost of human labor, or more so, then employers would rationally need to weigh the use of one versus the other. Note that when referring to the cost of human labor, there is more to that calculation than simply the dollar-hour labor rate per se. There are lots of other less apparent costs, such as the cost to manage human labor, the cost of dealing with HR-related issues, and many other factors that come into the weighty matter. Thus, an AGI versus human labor ROI will be more complex than it might seem at an initial glance. In addition, keep in mind that AGI would seemingly be readily switched on and off, and have other capacities that human labor would not equally tend to allow. The Other Half Is Coming Too Assume that by and large the advent of AGI will decimate the need for white-collar human labor. The refrain right now is that people should begin tilting toward blue-collar jobs as an alternative to white-collar jobs. This is a logical form of thinking in the sense that AGI as an intellectual mechanism would be unable to compete in jobs that involve hands-on work. A plumber needs to come to your house and do hands-on work to fix your plumbing. This is a physicality that entails arriving at your physical home, physically bringing and using tools, and physically repairing your faulty home plumbing. A truck driver likewise needs to sit in the cab of a truck and drive the vehicle. These are physically based tasks. There is no getting around the fact that these are hands-on activities. 
Aha, yes, those are physical tasks, but that doesn’t necessarily mean that only human hands can perform them. The gradual emergence of humanoid robots will provide an alternative to human hands. A humanoid robot is a type of robot that is built to resemble a human in form and function. You’ve undoubtedly seen those types of robots in the many online video recordings showing them walking, jumping, grasping at objects, and so on. A tremendous amount of active research and development is taking place to devise humanoid robots. They look comical right now. You watch those videos and laugh when the robot trips over a mere stick lying on the ground, something that a human would seldom trip over. You scoff when a robot tries to grasp a coffee cup and inadvertently spills most of the coffee. It all seems humorous and a silly pursuit. Keep in mind that we are all observing the development process while it is still taking place. At some point, those guffaws of the humanoid robots will lessen. Humanoid robots will be as smooth and graceful as humans. This will continue to be honed. Eventually, humanoid robots will be less prone to physical errors that humans make. In a sense, the physicality of a humanoid robot will be on par with humans, if not better, due to its mechanical properties. Do not discount the coming era of quite physically capable humanoid robots. AGI And Humanoid Robots Pair Up You might remember that in The Wonderful Wizard of Oz, the fictional character known as The Strawman lacked a brain. Without seeming to anthropomorphize humanoid robots, the current situation is that those robots typically use a form of AI that is below the sophistication level of modern generative AI. That’s fine for now due to the need to first ensure that the physical movements of the robots get refined. I have discussed that a said-to-be realm of Physical AI is going to be a huge breakthrough with incredible ramifications, see my analysis at the link here. 
The idea underlying Physical AI is that the AI of today is being uplifted by doing data training on the physical world. This also tends to include the use of World Models, consisting of broad constructions about how the physical world works, such as that we are bound to operate under conditions of gravity, and other physical laws of nature, see the link here. The bottom line here is that there will be a close pairing of robust AI with humanoid robots. Imagine what a humanoid robot can accomplish if it is paired with AGI. I’ll break the suspense and point out that AGI paired with humanoid robots means that those robots readily enter the blue-collar worker realm. Suppose your plumbing needs fixing. No worries, a humanoid robot that encompasses AGI will be sent to your home. The AGI is astute enough to carry on conversations with you, and the AGI also fully operates the robot to undertake the plumbing tasks. How did the AGI-paired humanoid robot get to your home? Easy-peasy, it drove a car or truck to get there. I’ve previously predicted that all the work on devising autonomous vehicles and self-driving cars will get shaken up once we have suitable humanoid robots devised. There won’t be a need for a vehicle to contain self-driving capabilities. A humanoid robot will simply sit in the driver’s seat and drive the vehicle. This is a much more open-ended solution than having to craft components that go into and onto a vehicle to enable self-driving. See my coverage at the link here. Timing Is Notable One of the reasons that many do not give much thought to the pairing of AGI with humanoid robots is that today’s humanoid robots seem extraordinarily rudimentary and incapable of performing physical dexterity tasks on par with human capabilities. Meanwhile, there is brazen talk that AGI is just around the corner. AGI is said to be within our grasp. Let’s give the timing considerations a bit of scrutiny. 
There are three primary timing angles: Option 1: AGI first, then humanoid robots. AGI is attained before humanoid robots are sufficiently devised. Option 2: Humanoid robots first, then AGI. Humanoid robots are physically fluently adept before AGI is attained. Option 3: AGI and humanoid robots arrive about at the same time. AGI is attained and at the same time, it turns out that humanoid robots are fluently adept too, mainly by coincidence and not due to any cross-mixing. A skeptic would insist that there is a fourth possibility, consisting of the possibility that we never achieve AGI and/or we fail to achieve sufficiently physically capable humanoid robots. I am going to reject that possibility. Perhaps I am overly optimistic, but it seems to me that we will eventually attain AGI, and we will eventually attain physically capable humanoid robots. I shall next respectively consider each of the three genuinely reasonable possibilities. Option 1: AGI First, Then Humanoid Robots What if we manage to attain AGI before we manage to achieve physically fluent humanoid robots? That’s just fine. We would indubitably put AGI to work as a partner with humans in figuring out how we can push along the budding humanoid robot development process. It seems nearly obvious that with AGI’s capable assistance, we would overcome any bottlenecks and soon enough arrive at top-notch physically adept humanoid robots. At that juncture, we would then toss AGI into the humanoid robots and have ourselves quite an amazing combination. Option 2: Humanoid Robots First, Then AGI Suppose that we devise very physically adept humanoid robots but have not yet arrived at AGI. Are we in a pickle? Nope. We could use conventional advanced AI inside those humanoid robots. The combination would certainly be good enough for a wide variety of tasks. The odds are that we would need to be cautious about where such robots are utilized. 
Nonetheless, we would have essentially walking, talking, and productive humanoid robots. If AGI never happens, oh well, we end up with pretty good humanoid robots. On the other hand, once we arrive at AGI, those humanoid robots will be stellar. It’s just a matter of time. Option 3: AGI And Humanoid Robots At The Same Time Let’s consider the potential of AGI and humanoid robots perchance being attained around the same time. Assume that this timing isn’t due to an outright cross-mixing with each other. They just so happen to advance on a similar timeline. I tend to believe that’s the most likely of the three scenarios. Here’s why. First, despite all the hubris about AGI being within earshot, perhaps in the next year or two, which is a popular pronouncement by many AI luminaries, I tend to side with recent surveys of AI developers that put the date around the year 2040. Some AI luminaires sneakily play with the definition of AGI in hopes of making their predictions come true sooner, akin to moving the goalposts to easily score points. For my coverage on Sam Altman’s efforts of moving the cheese regarding AGI attainment, see the link here. Second, if you are willing to entertain the year 2040 as a potential date for achieving AGI, that’s about 15 years from now. In my estimation, the advancements being made in humanoid robots will readily progress such that by 2040 they will be very physically adept. Probably be sooner, but let’s go with the year 2040 for ease of contemplation. In my view, we will likely have humanoid robots doing well enough that they will be put into use prior to arriving at AGI. The pinnacle of robust humanoid robots and the attainment of AGI will roughly coincide with each other. Two peas in a pod.Impact Of Enormous Consequences In an upcoming column posting, I will examine the enormous consequences of having AGI paired with fully physically capable humanoid robots. 
As noted above, this will have a humongous impact on white-collar work and blue-collar work. There will be gargantuan economic impacts, societal impacts, cultural impacts, and so on. Some final thoughts for now. A single whammy is already being hotly debated. The debates currently tend to be preoccupied with the loss of white-collar jobs due to the attainment of AGI. A saving grace seems to be that at least blue-collar jobs are going to be around and thriving, even once AGI is attained. The world doesn’t seem overly gloomy if you can cling to the upbeat posture that blue-collar tasks remain intact. The double whammy is a lot more to take in. But the double whammy is the truth. The truth needs to be faced. If you are having doubts as a human about the future, just remember the famous words of Vince Lombardi: “Winners never quit, and quitters never win.” Humankind can handle the double whammy. Stay tuned for my upcoming coverage of what this entails. #doublewhammy #when #agi #embeds #with
    WWW.FORBES.COM
    Double-Whammy When AGI Embeds With Humanoid Robots And Occupies Both White-Collar And Blue-Collar Jobs
    AGI will be embedded into humanoid robots, which makes white-collar and blue-collar jobs a target ... More for walking/talking automation.getty In today’s column, I examine the highly worrisome qualms expressed that the advent of artificial general intelligence (AGI) is likely to usurp white-collar jobs. The stated concern is that since AGI will be on par with human intellect, any job that relies principally on intellectual pursuits such as typical white-collar work will be taken over via the use of AGI. Employers will realize that rather than dealing with human white-collar workers, they can more readily get the job done via AGI. This, in turn, has led to a rising call that people should aim toward blue-collar jobs, doing so because (presumably) those forms of employment will not be undercut via AGI. Sorry to say, that misses the bigger picture, namely that AGI when combined with humanoid robots is coming not only for white-collar jobs but also blue-collar jobs too. It is a proverbial double-whammy when it comes to the attainment of AGI. Let’s talk about it. This analysis of an innovative AI breakthrough is part of my ongoing Forbes column coverage on the latest in AI, including identifying and explaining various impactful AI complexities (see the link here). Heading Toward AGI And ASI First, some fundamentals are required to set the stage for this weighty discussion. There is a great deal of research going on to further advance AI. The general goal is to either reach artificial general intelligence (AGI) or maybe even the outstretched possibility of achieving artificial superintelligence (ASI). AGI is AI that is considered on par with human intellect and can seemingly match our intelligence. ASI is AI that has gone beyond human intellect and would be superior in many if not all feasible ways. The idea is that ASI would be able to run circles around humans by outthinking us at every turn. 
For more details on the nature of conventional AI versus AGI and ASI, see my analysis at the link here. We have not yet attained AGI. In fact, it is unknown as to whether we will reach AGI, or that maybe AGI will be achievable in decades or perhaps centuries from now. The AGI attainment dates that are floating around are wildly varying and wildly unsubstantiated by any credible evidence or ironclad logic. ASI is even more beyond the pale when it comes to where we are currently with conventional AI. AGI Problem Only Half Seen Before launching into the primary matter at hand in this discussion, let’s contemplate a famous quote attributed to Charles Kettering, a legendary inventor, who said, “A problem well-stated is a problem half-solved.” I bring this up because those loud clamors right now about the assumption that AGI will replace white-collar workers are only seeing half of the problem. The problem as they see it is that since AGI is intellectually on par with humans, and since white-collar workers mainly use intellect in their work endeavors, AGI is going to be used in place of humans for white-collar work. I will in a moment explain why that’s only half of the problem and there is a demonstrative need to more carefully and fully articulate the nature of the problem. Will AGI Axiomatically Take White-Collar Jobs On a related facet, the belief that AGI will axiomatically replace white-collar labor makes a number of other related key assumptions. I shall briefly explore those and then come back to why the problem itself is only half-baked. The cost of using AGI for doing white-collar work will need to be presumably a better ROI choice over human workers. If not, then an employer would be wiser to stick with humans rather than employing AGI. There seems to often be an unstated belief that AGI is necessarily going to be a less costly route than employing humans. We don’t know yet what the cost of using AGI will be. It could be highly expensive. 
Indeed, some are worried that the world will divide into the AGI haves and AGI have-nots, partially due to the exorbitant cost that AGI might involve. If AGI is free to use, well, that would seem to be the nail in the coffin related to using human workers for the same capacity. Another angle is that AGI is relatively inexpensive in comparison to human labor. In that case, the use of AGI is likely to win over human labor usage. But if the cost of AGI is nearer to the cost of human labor (all in), or more so, then employers would rationally need to weigh the use of one versus the other. Note that when referring to the cost of human labor, there is more to that calculation than simply the dollar-hour labor rate per se. There are lots of other less apparent costs, such as the cost to manage human labor, the cost of dealing with HR-related issues, and many other factors that come into the weighty matter. Thus, an AGI versus human labor ROI will be more complex than it might seem at an initial glance. In addition, keep in mind that AGI would seemingly be readily switched on and off, and have other capacities that human labor would not equally tend to allow. The Other Half Is Coming Too Assume that by and large the advent of AGI will decimate the need for white-collar human labor. The refrain right now is that people should begin tilting toward blue-collar jobs as an alternative to white-collar jobs. This is a logical form of thinking in the sense that AGI as an intellectual mechanism would be unable to compete in jobs that involve hands-on work. A plumber needs to come to your house and do hands-on work to fix your plumbing. This is a physicality that entails arriving at your physical home, physically bringing and using tools, and physically repairing your faulty home plumbing. A truck driver likewise needs to sit in the cab of a truck and drive the vehicle. These are physically based tasks. There is no getting around the fact that these are hands-on activities. 
Aha, yes, those are physical tasks, but that doesn’t necessarily mean that only human hands can perform them. The gradual emergence of humanoid robots will provide an alternative to human hands. A humanoid robot is a type of robot built to resemble a human in form and function. You’ve undoubtedly seen these robots in the many online videos showing them walking, jumping, grasping objects, and so on. A tremendous amount of active research and development is going into humanoid robots.

They look comical right now. You watch those videos and laugh when a robot trips over a mere stick lying on the ground, something a human would seldom trip over. You scoff when a robot tries to grasp a coffee cup and inadvertently spills most of the coffee. It all seems humorous and a silly pursuit. Keep in mind that we are observing the development process while it is still underway. At some point, the guffaws at humanoid robots will lessen. Humanoid robots will become as smooth and graceful as humans, and this will continue to be honed. Eventually, humanoid robots will be less prone to the physical errors that humans make. In a sense, the physicality of a humanoid robot will be on par with humans, if not better, due to its mechanical properties. Do not discount the coming era of quite physically capable humanoid robots.

AGI And Humanoid Robots Pair Up

You might remember that in The Wonderful Wizard of Oz, the Scarecrow lacked a brain. Without anthropomorphizing humanoid robots, the current situation is that those robots typically use a form of AI below the sophistication level of modern generative AI. That’s fine for now, given the need to first refine the physical movements of the robots. I have discussed how the so-called realm of Physical AI is going to be a huge breakthrough with incredible ramifications; see my analysis at the link here.
The idea underlying Physical AI is that today’s AI is being uplifted by data training on the physical world. This also tends to include the use of World Models, broad constructions of how the physical world works, such as the fact that we operate under gravity and other physical laws of nature; see the link here. The bottom line is that robust AI will be closely paired with humanoid robots.

Imagine what a humanoid robot could accomplish if paired with AGI. I’ll break the suspense and point out that AGI paired with humanoid robots means those robots readily enter the blue-collar worker realm. Suppose your plumbing needs fixing. No worries: a humanoid robot that encompasses AGI will be sent to your home. The AGI is astute enough to carry on conversations with you, and the AGI also fully operates the robot to undertake the plumbing tasks.

How did the AGI-paired humanoid robot get to your home? Easy-peasy: it drove a car or truck to get there. I’ve previously predicted that all the work on devising autonomous vehicles and self-driving cars will get shaken up once we have suitable humanoid robots. There won’t be a need for a vehicle to contain self-driving capabilities; a humanoid robot will simply sit in the driver’s seat and drive the vehicle. This is a much more open-ended solution than having to craft components that go into and onto a vehicle to enable self-driving. See my coverage at the link here.

Timing Is Notable

One reason many do not give much thought to pairing AGI with humanoid robots is that today’s humanoid robots seem extraordinarily rudimentary, incapable of physical dexterity on par with human capabilities. Meanwhile, there is brazen talk that AGI is just around the corner, said to be within our grasp. Let’s give the timing considerations a bit of scrutiny.
There are three primary timing angles:

Option 1: AGI first, then humanoid robots. AGI is attained before humanoid robots are sufficiently devised.

Option 2: Humanoid robots first, then AGI. Humanoid robots become physically fluent before AGI is attained.

Option 3: AGI and humanoid robots arrive at about the same time. AGI is attained just as humanoid robots become fluently adept, mainly by coincidence and not due to any cross-mixing.

A skeptic would insist on a fourth possibility: that we never achieve AGI and/or never achieve sufficiently physically capable humanoid robots. I am going to reject that possibility. Perhaps I am overly optimistic, but it seems to me that we will eventually attain AGI, and we will eventually attain physically capable humanoid robots. I shall next consider each of the three genuinely plausible possibilities in turn.

Option 1: AGI First, Then Humanoid Robots

What if we manage to attain AGI before we achieve physically fluent humanoid robots? That’s just fine. We would indubitably put AGI to work as a partner with humans in figuring out how to push along the budding humanoid robot development process. It seems nearly obvious that with AGI’s capable assistance, we would overcome any bottlenecks and soon enough arrive at top-notch, physically adept humanoid robots. At that juncture, we would put AGI into the humanoid robots and have ourselves quite an amazing combination.

Option 2: Humanoid Robots First, Then AGI

Suppose we devise very physically adept humanoid robots but have not yet arrived at AGI. Are we in a pickle? Nope. We could use conventional advanced AI inside those humanoid robots. The combination would certainly be good enough for a wide variety of tasks. The odds are that we would need to be cautious about where such robots are utilized.
Nonetheless, we would have essentially walking, talking, productive humanoid robots. If AGI never happens, oh well, we end up with pretty good humanoid robots. On the other hand, once we arrive at AGI, those humanoid robots will be stellar. It’s just a matter of time.

Option 3: AGI And Humanoid Robots At The Same Time

Let’s consider the possibility of AGI and humanoid robots being attained around the same time. Assume this timing isn’t due to outright cross-mixing; they just happen to advance on a similar timeline. I tend to believe that’s the most likely of the three scenarios. Here’s why.

First, despite all the hubris about AGI being within reach, perhaps in the next year or two, a popular pronouncement by many AI luminaries, I tend to side with recent surveys of AI developers that put the date around the year 2040 (see my coverage at the link here). Some AI luminaries sneakily play with the definition of AGI in hopes of making their predictions come true sooner, akin to moving the goalposts to score easy points. For my coverage of Sam Altman’s efforts at moving the cheese regarding AGI attainment, see the link here.

Second, if you are willing to entertain the year 2040 as a potential date for achieving AGI, that’s about 15 years from now. In my estimation, the advances being made in humanoid robots will readily progress such that by 2040 they will be very physically adept. It will probably be sooner, but let’s go with 2040 for ease of contemplation. In my view, humanoid robots will likely be doing well enough that they will be put into use before we arrive at AGI. The pinnacle of robust humanoid robots and the attainment of AGI will roughly coincide. Two peas in a pod.

Impact Of Enormous Consequences

In an upcoming column posting, I will examine the enormous consequences of having AGI paired with fully physically capable humanoid robots.
As noted above, this will have a humongous impact on both white-collar and blue-collar work. There will be gargantuan economic, societal, and cultural impacts, and so on.

Some final thoughts for now. A single whammy is already being hotly debated. Current debates tend to be preoccupied with the loss of white-collar jobs due to the attainment of AGI. A saving grace seems to be that at least blue-collar jobs will be around and thriving, even once AGI is attained. The world doesn’t seem overly gloomy if you can cling to the upbeat posture that blue-collar tasks remain intact.

The double whammy is a lot more to take in. But the double whammy is the truth, and the truth needs to be faced. If you are having doubts as a human about the future, just remember the famous words of Vince Lombardi: “Winners never quit, and quitters never win.” Humankind can handle the double whammy. Stay tuned for my upcoming coverage of what this entails.
  • SHINING A LIGHT ON ESSENTIAL DANISH VFX WITH PETER HJORTH

    By OLIVER WEBB

    Images courtesy of Peter Hjorth and Zentropa, except where noted.

    Peter Hjorth. (Photo courtesy of Danish Film Institute)

    When Peter Hjorth first started out, visual effects were virtually non-existent in the Danish film industry. “We had one guy at the lab who did work on the Oxberry [rostrum animation camera], and I worked at a video production company,” Hjorth states. “I trained as a videotape editor, then it went into online. When the first digital tools arrived, I joined one of the hot post places where they got the first digital VTRs. All my first years of experience were with commercial clients and music videos and making the transition from analogue to digital in video post-production. I did a little bit of work for friends of mine where we actually did it at the lab. I’m old enough to have done stuff with the optical printer and waiting for weeks to get it right. There were some very early start-ups in Copenhagen doing files to film, and I started working with them.”

    Hjorth’s first feature film came in 1998 with Thomas Vinterberg’s Festen, where he served as camera operator and digital consultant. Festen also marked Hjorth’s first foray into the Dogme 95 movement. “We shot on MiniDV, and I was attached to the whole project. I shot the second camera and then was asked if I could do some advanced work in visual effects for commercials. I was then asked by Lars von Trier to help out on Dancer in the Dark when he was starting.”
    Working on Dancer in the Dark marked the beginning of Hjorth’s frequent collaborations with Lars von Trier. “That was sort of a two-fold thing because we had 100 DV cameras that needed some kind of infrastructure to work, and my television background was good for that. We also needed some visual effects work to get rid of some cameras. If you put 100 cameras in the same set, you’re going to get into a visual effects situation. So, I did that and worked on the editing. At that time, people were a little bit afraid of Lars, but I’m up for anything. We had a great time, especially during the editing and post-production.”

    Hjorth was pleased with his collaboration with director Tarik Saleh on the U.S. film The Contractor (2022), on which he served as Production Visual Effects Supervisor. (Image courtesy of Paramount Pictures)

    “There’s a special thing about Denmark, which is that we tend to all stick together… It’s not competitive in this way because people will get the jobs they get. Everybody realizes we have to work together, and what really matters is that we put something on screen that gives the audience a good experience.”
    —Peter Hjorth, Visual Effects Supervisor

    Initially, the production experimented with a wall of cameras, with Hjorth doing a test that composited the feeds into a single image. Von Trier found it interesting, but felt it wasn’t right for Dancer in the Dark. He later came back to Hjorth with Dogville and explained that he wanted to implement the multi-camera technique for that project. “Lars didn’t want linear perspective, instead he wanted something more like visual arts, fine arts, a notion of perspective, even cubism maybe,” Hjorth adds. “At that point in between those two projects, I did the first big Vinterberg film, It’s All About Love.” Hjorth worked as Visual Effects Supervisor for the film. “We did lots of precise visual effects, matched lenses, matched camera heights, everything by the book. Then I went into this totally crazy project for Lars and really developed a close understanding of what Lars wanted. We’ve done eight feature films and a TV series together. The last one was the third season of The Kingdom. I also did his last feature film, The House That Jack Built. I was Production Visual Effects Supervisor on all the stuff in-between, such as Antichrist and Melancholia.” Hjorth explains that he was very lucky to be in the right place at the right time. “Working on those projects has given me a network all over Europe with good people. We had some decent budgets, and people were thrilled to work on Lars’ films. I’ve made some excellent friends and good connections. If you wanted VFX for a movie in the early 2000s, you hired someone from a post house for a specific scene. The notion of a production visual effects supervisor was not very common in Denmark, and the role has since developed. I find that my contribution is now mostly in pre-production. With post-production, I usually take a step back and leave it to the vendors to get right, but I’m happy I’ve been able to assist when the need arose.”

    Throughout his career, Hjorth has worked across the board as camera operator, colorist and editor. “I did some camera work on the side for music videos, and so on,” he explains. “When I speak to the DP and the gaffers, I know the language. I wouldn’t say I did great work as a cinematographer, but I know the language, the equipment and the limitations. Actually, my first job before even going into post-production was as an electrician. I used to work on really old, heavy movie lights back in the day, so I also know a little bit about departments on set and how it works. That has made it a little bit easier for me to be on set because as a visual effects supervisor, it can be a super scary experience. If you feel like a tourist, it’s just horrible. I, of course, worked on the Dogme 95 films, where we worked closely with the actors, and I’m not afraid to have a conversation with an actor. No matter how good the VFX is, if the actors don’t believe a scene they are in, it doesn’t work. So, I’ve been lucky to do a bit of everything, and I feel blessed that things turned out the way that they did.”
    Starting out in Dogme 95 also proved to be a huge learning curve when it came to film language and understanding how to work within a set of specific rules and guidelines. The movement was founded by Lars von Trier and Thomas Vinterberg, who created the Dogme 95 Manifesto. The Manifesto consisted of 10 rules, which included: the camera must be handheld, shooting must be done on location, and special lighting isn’t allowed. “It’s a good background to have,” Hjorth states. “We’ve had rules for all of the films I’ve made with Lars, even on projects such as Melancholia.”

    Setting rules hasn’t been limited to Danish cinema and extends beyond that. “We made kind of a set of rules for the films I’ve made with Ali Abbasi, and that’s always made things easier,” Hjorth says. “He first called me when he was in film school. He was doing some early tests and was audacious enough to ask me for a VFX shot. It was hard to understand what he was saying, but then he talked about a scene with a guy coming out of a cake and he kills his brother, slicing his throat with a knife, and he wanted to see that in close-up. I appreciate younger directors calling and asking me to work with them, and it has really paid off.”

    Hjorth was the Visual Effects Supervisor for several episodes of the 1994-2022 TV series The Kingdom and The Kingdom: Exodus.

    Hjorth has worked on eight feature films and a TV series with director Lars von Trier.
    Hjorth with director Lars von Trier, left, on the set of The Kingdom: Exodus.
    Hjorth was Visual Effects Supervisor on The House That Jack Built, directed by Lars von Trier.
    Peter Hjorth was recognized for his work as European Visual Effects Supervisor for the Swedish-Danish feature and Cannes winner Border, directed by Ali Abbasi.
    Hjorth was Production Visual Effects Supervisor on Lamb, directed by Valdimar Jóhannsson.
    Hjorth with Simone Grau Roney, Production Designer on The House That Jack Built, directed by Lars von Trier.

    Choosing a favorite visual effects shot from his career, however, is a difficult task for Hjorth, though he’s particularly proud of the work achieved on Dogville. “Nobody noticed how messed up it was,” he explains. “Toward the end of the movie, you can see the masks, and you can see that we didn’t bother to match the grain between layers and all that. We did the first test on Flame, and when we went to layer 99, it just stopped working. We ended up doing it with Combustion software, which was crummy, but it worked, and we got the shots done. I think we went to 170 layers on the opening shot. It was a learning experience for everybody involved, and I still work with some of those same people, most recently on the Netflix series I did this spring.”
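    Stacking that many layers boils down to repeated application of the standard “over” compositing operator. The sketch below is a minimal single-pixel illustration of that idea, using premultiplied RGBA with invented values; it is not a description of how Flame or Combustion implement it.

```python
# Minimal "over" compositing sketch: premultiplied RGBA tuples in [0, 1].

def over(fg, bg):
    """Composite premultiplied fg over bg: out = fg + bg * (1 - fg_alpha)."""
    fr, fgreen, fb, fa = fg
    br, bgreen, bb, ba = bg
    return (fr + br * (1 - fa),
            fgreen + bgreen * (1 - fa),
            fb + bb * (1 - fa),
            fa + ba * (1 - fa))

def flatten(layers):
    """Fold a stack of layers (topmost first) down to a single pixel."""
    result = (0.0, 0.0, 0.0, 0.0)
    for layer in reversed(layers):   # composite from the bottom layer up
        result = over(layer, result)
    return result

# A stack of 170 identical half-transparent layers converges to full coverage.
stack = [(0.25, 0.1, 0.05, 0.5)] * 170
print(flatten(stack))
```

    Each extra layer halves the remaining transparency here, which hints at why grain mismatches between layers stop being visible long before layer 170.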

    Hjorth worked as Visual Effects Supervisor on Holy Spider, directed by Ali Abbasi. 

    Hjorth served as Visual Effects Supervisor on Antichrist, directed by Lars von Trier.

    Hjorth believes there has been an immense upgrade in professionalism in Denmark over the years he has worked in the business. “The beginning was much less industrial. The directors that I have worked with tend to work with me multiple times. A lot of the stuff I say in the first meeting is really defining for how that is going to go. I’ve been so lucky to work on films that I actually think made a difference. It has mostly been art house films with limited budgets and resources. When we work together with the same producer or director a few times, sometimes they come back and say, ‘We’d like to have a creature or some special thing.’ It’s an evolving process.”

    Hjorth worked as Visual Effects Supervisor on the Lars von Trier-directed Melancholia, and was also credited for his astrophotography of auroras for the film.
    Hjorth was Visual Effects Supervisor on Dogville, directed by Lars von Trier.

    Hjorth worked with director Lars von Trier to develop the Automavision technique, which received the cinematography credit on The Boss of It All. A computer algorithm randomly changes the camera’s tilt, pan, focal length and/or positioning, as well as the sound recording, without active operation by the cinematographer.
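    The randomized-parameter idea described above can be illustrated with a short sketch. The function name, parameter ranges, and structure below are all invented for illustration; the actual rules Automavision used are not detailed in this article.

```python
# Illustrative sketch of an Automavision-style random perturbation of
# camera settings per shot. All ranges are made-up placeholders.
import random

def automavision_settings(base, seed=None):
    """Return a randomly perturbed copy of the base camera setup."""
    rng = random.Random(seed)  # seedable for reproducibility
    return {
        "tilt_deg": base["tilt_deg"] + rng.uniform(-5, 5),
        "pan_deg": base["pan_deg"] + rng.uniform(-10, 10),
        # scale focal length, clamped to a plausible wide-angle minimum
        "focal_mm": max(18.0, base["focal_mm"] * rng.uniform(0.8, 1.25)),
        # small random repositioning of the camera (meters, x/y offset)
        "offset_m": (rng.uniform(-0.3, 0.3), rng.uniform(-0.3, 0.3)),
    }

base = {"tilt_deg": 0.0, "pan_deg": 0.0, "focal_mm": 35.0}
print(automavision_settings(base, seed=1))
```

    The design point the technique exploits is that the operator sets up a sensible base framing, then surrenders the final framing to the algorithm, which is what removes the cinematographer from active operation.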

    Hjorth works closely with stunts, special effects makeup, animal wranglers and other specialists. “I know the craft and what they need from me. They know more about what’s going to be effective on screen, so I just leave them to it and make sure they have what they need. Same thing with animals and visual effects, makeup and stuff like that, physical things. You know I have a bit of a reputation for trying to get as many pieces of the puzzle as possible with a camera. Some production VFX people get quotes from, say, three different vendors, and then they pick all the cheapest bids for each sequence or shot, and that’s how they get down in budget. I tried to avoid that. I’d rather actually sit down with the director and say for example, ‘We should have some breathing space here.’”
    When it comes to the future of visual effects in Denmark, Hjorth takes an optimistic view. “I think this trend that we have more production supervisors is basically going to continue, in the way that even if you have very little work, you hire someone from the get-go and you make sure there’s a balance in ambition and resources. There’s a special thing about Denmark, which is that we tend to all stick together, even people who are not in the same line of work. We have lots of experience sharing. There are no limits to who you can call and ask questions. It’s not competitive in this way because people will get the jobs they get. Everybody realizes we have to work together, and what really matters is that we put something on screen that gives the audience a good experience.”
    #shining #light #essential #danish #vfx
    SHINING A LIGHT ON ESSENTIAL DANISH VFX WITH PETER HJORTH
    By OLIVER WEBB Images courtesy of Peter Hjorth and Zentropa, except where noted. Peter Hjorth.When Peter Hjorth first started out, visual effects were virtually non-existent in the Danish film industry. “We had one guy at the lab who did work on the Oxberry, and I worked at a video production company,” Hjorth states. “I trained as a videotape editor, then it went into online. When the first digital tools arrived, I joined one of the hot post places where they got the first digital VTRs. All my first years of experience were with commercial clients and music videos and making the transition from analogue to digital in video post-production. I did a little bit of work for friends of mine where we actually did it at the lab. I’m old enough to have done stuff with the optical printer and waiting for weeks to get it right. There were some very early start-ups in Copenhagen doing files to film, and I started working with them.” Hjorth’s first feature film came in 1998 with Thomas Vinterberg’s Festen, where he served as camera operator and digital consultant. Festen also marked Hjorth’s first foray into the Dogme 95 movement. “We shot on MiniDV, and I was attached to the whole project. I shot the second camera and then was asked if I could do some advanced work in visual effects for commercials. I was then asked by Lars von Trier to help out on Dancer in the Dark when he was starting.” Working on Dancer in the Dark marked the beginning of Hjorth’s frequent collaborations with Lars von Trier. “That was sort of a two-fold thing because we had 100 DV cameras that needed some kind of infrastructure to work, and my television background was good for that. We also needed some visual effects work to get rid of some cameras. If you put 100 cameras in the same set, you’re going to get into a visual effects situation. So, I did that and worked on the editing. At that time, people were a little bit afraid of Lars, but I’m up for anything. 
We had a great time, especially during the editing and post-production.” Hjorth was pleased with his collaboration with director Tarik Saleh on the U.S. film The Contractor, on which he served as Production Visual Effects Supervisor.“There’s a special thing about Denmark, which is that we tend to all stick together… It’s not competitive in this way because people will get the jobs they get. Everybody realizes we have to work together, and what really matters is that we put something on screen that gives the audience a good experience.” —Peter Hjorth, Visual Effects Supervisor Initially, production experimented with a wall of cameras, where Hjorth did a test compositing that into an image. Von Trier found it interesting, but felt it wasn’t right for Dancer in the Dark. He later came back to Hjorth with Dogville and explained that he wanted to implement the multi-camera technique for this project. “Lars didn’t want linear perspective, instead he wanted something more like visual arts, fine arts, a notion of perspective, even cubism maybe,” Hjorth adds. “At that point in between those two projects, I did the first big Vinterberg film, It’s All About Love.” Hjorth worked as Visual Effects Supervisor for the film. “We did lots of precise visual effects, matched lenses, matched camera heights, everything by the book. Then I went into this totally crazy project for Lars and really developed a close understanding of what Lars wanted. We’ve done eight feature films and a TV series together. The last one was the third season of The Kingdom. I also did his last feature film, The House That Jack Built. I was Production Visual Effects Supervisor on all the stuff in-between, such as Antichrist and Melancholia.” Hjorth explains that he was very lucky to be in the right place at the right time. “Working on those projects has given me a network all over Europe with good people. We had some decent budgets, and people were thrilled to work on Lars’ films. 
I’ve made some excellent friends and good connections. If you wanted VFX for a movie in the early 2000s you hired someone from a post house for a specific scene. The notion of a production visual effects supervisor was not very common in Denmark, and the role has since developed. I find that my contribution is now mostly in pre-production. With post-production, I usually take a step back and leave it to the vendors to get right, but I’m happy I’ve been able to assist when the need arose.”  Throughout his career, Hjorth has worked across the board as camera operator, colorist and editor. “I did some camera work on the side for music videos, and so on,” he explains. “When I speak to the DP and the gaffers, I know the language. I wouldn’t say I did great work as a cinematographer, but I know the language, the equipment and the limitations. Actually, my first job before even going into post-production was as an electrician. I used to work on really old, heavy movie lights back in the day, so I also know a little bit about departments on set and how it works. That has made it a little bit easier for me to be on set because as a visual effects supervisor, it can be a super scary experience. If you feel like a tourist, it’s just horrible. I, of course, worked on the Dogme 95 films, where we worked closely with the actors, and I’m not afraid to have a conversation with an actor. No matter how good the VFX is, if the actors don’t believe a scene they are in, it doesn’t work. So, I’ve been lucky to do a bit of everything, and I feel blessed that things turned out the way that they did.” Starting out in Dogme 95 also proved to be a huge learning curve when it came to film language and understanding how to work within a set of specific rules and guidelines. The movement was founded by Lars von Trier and Thomas Vinterberg, who created the Dogme 95 Manifesto. 
The Manifesto consisted of 10 rules, which included: camera must be handheld, shooting must be done on location and special lighting isn’t allowed. “It’s a good background to have,” Hjorth states. “We’ve had rules for all of the films I’ve made with Lars, even on projects such as Melancholia.” Setting rules hasn’t been limited to Danish cinema and extends beyond that. “We made kind of a set of rules for the films I’ve made with Ali Abbasi, and that’s always made things easier,” Hjorth says. “He first called me when he was in film school. He was doing some early tests and was audacious enough to ask me for a VFX shot. It was hard to understand what he was saying, but then he talked about a scene with a guy coming out of a cake and he kills his brother, slicing his throat with a knife, and he wanted to see that in close-up. I appreciate younger directors calling and asking me to work with them, and it has really paid off.” Hjorth was the Visual Effects Supervisor for several episodes of the 1994-2022 TV series The Kingdom and The Kingdom: Exodus. Hjorth has worked on eight feature films and a TV series with director Lars von Trier.Hjorth with director Lars von Trier, left, on the set of The Kingdom: Exodus.Hjorth was Visual Effects Supervisor on The House That Jack Built, directed by Lars von Trier.Peter Hjorth was recognized for his work as European Visual Effects Supervisor for the Swedish-Danish feature and Cannes winner Border, directed by Ali Abbasi.Hjorth was Production Visual Effects Supervisor on Lamb, directed by Valdimar Jóhannsson.Hjorth with Simone Grau Roney, Production Designer on The House That Jack Built, directed by Lars von Trier. Choosing a favorite visual effect shot from his career, however, is a difficult task for Hjorth, though he’s particularly proud of the work achieved on Dogville. “Nobody noticed how messed up it was,” he explains. 
“Toward the end of the movie, you can see the masks, and you can see that we didn’t bother to match the grain between layers and all that. We did the first test on Flame, and when we went to layer 99, it just stopped working. We ended up doing it with combustion software, which was crummy, but it worked, and we got the shots done. I think we went to 170 layers on the opening shot. It was a learning experience for everybody involved, and I still work with some of those same people, most recently on the Netflix series I did this spring.” Hjorth worked as Visual Effects Supervisor on Holy Spider, directed by Ali Abbasi.  Hjorth served as Visual Effects Supervisor on Antichrist, directed by Lars von Trier. Hjorth believes that there’s been an immense upgrade in professionalism in Denmark in the years since he’s worked in the business. “The beginning was much less industrial. The directors that I have worked with tend to work with me multiple times. A lot of the stuff I say in the first meeting is really defining for how thatis going to go. I’ve been so lucky to work on films that I actually think made a difference. It has mostly been art house films with limited budgets and resources. When we work together with the same producer or director a few times, sometimes they come back and say, ‘We’d like to have a creature or some special thing.’ It’s an evolving process.” Hjorth worked as Visual Effects Supervisor on the Lars von Trier-directed Melancholia, and was also credited for his astrophotography of auroras for the film.Hjorth was Visual Effects Supervisor on Dogville, directed by Lars von Trier. Hjorth worked with director Lars von Trier to develop the Automavision technique, which was credited with the cinematography for The Boss of It All. A computer algorithm randomly changes the camera’s tilt, pan, focal length and/or positioning as well as the sound recording without being actively operated by the cinematographer. 
Hjorth works closely with stunts, special effects makeup, animal wranglers and other specialists. “I know the craft and what they need from me. They know more about what’s going to be effective on screen, so I just leave them to it and make sure they have what they need. Same thing with animals and visual effects, makeup and stuff like that, physical things. You know I have a bit of a reputation for trying to get as many pieces of the puzzle as possible with a camera. Some production VFX people get quotes from, say, three different vendors, and then they pick all the cheapest bids for each sequence or shot, and that’s how they get down in budget. I tried to avoid that. I’d rather actually sit down with the director and say for example, ‘We should have some breathing space here.’” When it comes to the future of visual effects in Denmark, Hjorth takes an optimistic view. “I think this trend that we have more production supervisors is basically going to continue in the way that even if you have very little work, you hire someone from the get-go and you make sure that’s a balance in ambition and resources. There’s a special thing about Denmark, which is that we tend to all stick together, even people who are not in the same line of work. We have lots of experience sharing. There are no limits to who you can call and ask questions. It’s not competitive in this way because people will get the jobs they get. Everybody realizes we have to work together, and what really matters is that we put something on screen that gives the audience a good experience.” #shining #light #essential #danish #vfx
    WWW.VFXVOICE.COM
    SHINING A LIGHT ON ESSENTIAL DANISH VFX WITH PETER HJORTH
    By OLIVER WEBB Images courtesy of Peter Hjorth and Zentropa, except where noted. Peter Hjorth. (Photo courtesy of Danish Film Institute) When Peter Hjorth first started out, visual effects were virtually non-existent in the Danish film industry. “We had one guy at the lab who did work on the Oxberry [rostrum animation camera], and I worked at a video production company,” Hjorth states. “I trained as a videotape editor, then it went into online. When the first digital tools arrived, I joined one of the hot post places where they got the first digital VTRs. All my first years of experience were with commercial clients and music videos and making the transition from analogue to digital in video post-production. I did a little bit of work for friends of mine where we actually did it at the lab. I’m old enough to have done stuff with the optical printer and waiting for weeks to get it right. There were some very early start-ups in Copenhagen doing files to film, and I started working with them.” Hjorth’s first feature film came in 1998 with Thomas Vinterberg’s Festen, where he served as camera operator and digital consultant. Festen also marked Hjorth’s first foray into the Dogme 95 movement. “We shot on MiniDV, and I was attached to the whole project. I shot the second camera and then was asked if I could do some advanced work in visual effects for commercials. I was then asked by Lars von Trier to help out on Dancer in the Dark when he was starting.” Working on Dancer in the Dark marked the beginning of Hjorth’s frequent collaborations with Lars von Trier. “That was sort of a two-fold thing because we had 100 DV cameras that needed some kind of infrastructure to work, and my television background was good for that. We also needed some visual effects work to get rid of some cameras. If you put 100 cameras in the same set, you’re going to get into a visual effects situation. So, I did that and worked on the editing. 
At that time, people were a little bit afraid of Lars, but I’m up for anything. We had a great time, especially during the editing and post-production.” Hjorth was pleased with his collaboration with director Tarik Saleh on the U.S. film The Contractor (2022), on which he served as Production Visual Effects Supervisor. (Image courtesy of Paramount Pictures) Initially, the production experimented with a wall of cameras, with Hjorth doing a test that composited their feeds into a single image. Von Trier found it interesting but felt it wasn’t right for Dancer in the Dark. He later came back to Hjorth with Dogville and explained that he wanted to use the multi-camera technique for that project. “Lars didn’t want linear perspective; instead he wanted something more like visual arts, fine arts, a notion of perspective, even cubism maybe,” Hjorth adds. “At that point, in between those two projects, I did the first big Vinterberg film, It’s All About Love.” Hjorth worked as Visual Effects Supervisor for the film. “We did lots of precise visual effects, matched lenses, matched camera heights, everything by the book. Then I went into this totally crazy project for Lars and really developed a close understanding of what Lars wanted. We’ve done eight feature films and a TV series together. The last one was the third season of The Kingdom. I also did his last feature film, The House That Jack Built. I was Production Visual Effects Supervisor on all the stuff in between, such as Antichrist and Melancholia.” Hjorth explains that he was very lucky to be in the right place at the right time.
“Working on those projects has given me a network all over Europe with good people. We had some decent budgets, and people were thrilled to work on Lars’ films. I’ve made some excellent friends and good connections. If you wanted VFX for a movie in the early 2000s, you hired someone from a post house for a specific scene. The notion of a production visual effects supervisor was not very common in Denmark, and the role has since developed. I find that my contribution is now mostly in pre-production. With post-production, I usually take a step back and leave it to the vendors to get right, but I’m happy I’ve been able to assist when the need arose.” Throughout his career, Hjorth has worked across the board as camera operator, colorist and editor. “I did some camera work on the side for music videos, and so on,” he explains. “When I speak to the DP and the gaffers, I know the language. I wouldn’t say I did great work as a cinematographer, but I know the language, the equipment and the limitations. Actually, my first job before even going into post-production was as an electrician. I used to work on really old, heavy movie lights back in the day, so I also know a little bit about the departments on set and how they work. That has made it a little bit easier for me to be on set because, as a visual effects supervisor, it can be a super scary experience. If you feel like a tourist, it’s just horrible. I, of course, worked on the Dogme 95 films, where we worked closely with the actors, and I’m not afraid to have a conversation with an actor. No matter how good the VFX is, if the actors don’t believe a scene they are in, it doesn’t work. So, I’ve been lucky to do a bit of everything, and I feel blessed that things turned out the way that they did.” Starting out in Dogme 95 also proved to be a steep learning curve when it came to film language and understanding how to work within a set of specific rules and guidelines.
The movement was founded by Lars von Trier and Thomas Vinterberg, who created the Dogme 95 Manifesto. The Manifesto consisted of 10 rules, which included: the camera must be handheld, shooting must be done on location, and special lighting isn’t allowed. “It’s a good background to have,” Hjorth states. “We’ve had rules for all of the films I’ve made with Lars, even on projects such as Melancholia.” Setting rules hasn’t been limited to his work in Danish cinema. “We made kind of a set of rules for the films I’ve made with Ali Abbasi, and that’s always made things easier,” Hjorth says. “He first called me when he was in film school. He was doing some early tests and was audacious enough to ask me for a VFX shot. It was hard to understand what he was saying, but then he talked about a scene with a guy coming out of a cake and killing his brother, slicing his throat with a knife, and he wanted to see that in close-up. I appreciate younger directors calling and asking me to work with them, and it has really paid off.” Hjorth was the Visual Effects Supervisor for several episodes of the 1994-2022 TV series The Kingdom and The Kingdom: Exodus. Hjorth has worked on eight feature films and a TV series with director Lars von Trier. (Photo: Peter Hjorth) Hjorth with director Lars von Trier, left, on the set of The Kingdom: Exodus. (Photo: Peter Hjorth) Hjorth was Visual Effects Supervisor on The House That Jack Built, directed by Lars von Trier. (Photo: Christian Geisnæs) Peter Hjorth was recognized for his work as European Visual Effects Supervisor for the Swedish-Danish feature and Cannes winner Border, directed by Ali Abbasi. (Image courtesy of Meta Film Stockholm) Hjorth was Production Visual Effects Supervisor on Lamb (2021), directed by Valdimar Jóhannsson. (Image courtesy of Go To Sheep and A24) Hjorth with Simone Grau Roney, Production Designer on The House That Jack Built (2018), directed by Lars von Trier.
Choosing a favorite visual effects shot from his career is a difficult task for Hjorth, though he’s particularly proud of the work achieved on Dogville. “Nobody noticed how messed up it was,” he explains. “Toward the end of the movie, you can see the masks, and you can see that we didn’t bother to match the grain between layers and all that. We did the first test on Flame, and when we went to layer 99, it just stopped working. We ended up doing it with Combustion, which was crummy, but it worked, and we got the shots done. I think we went to 170 layers on the opening shot. It was a learning experience for everybody involved, and I still work with some of those same people, most recently on the Netflix series I did this spring.” Hjorth worked as Visual Effects Supervisor on Holy Spider, directed by Ali Abbasi. (Photo: Nadim Carlsen. Image courtesy of Profile Pictures) Hjorth served as Visual Effects Supervisor on Antichrist (2009), directed by Lars von Trier. Hjorth believes there has been an immense upgrade in professionalism in Denmark in the years since he started in the business. “The beginning was much less industrial. The directors that I have worked with tend to work with me multiple times. A lot of the stuff I say in the first meeting is really defining for how that [job] is going to go. I’ve been so lucky to work on films that I actually think made a difference. It has mostly been art house films with limited budgets and resources. When we work together with the same producer or director a few times, sometimes they come back and say, ‘We’d like to have a creature or some special thing.’ It’s an evolving process.” Hjorth worked as Visual Effects Supervisor on the Lars von Trier-directed Melancholia (2011), and was also credited for his astrophotography of auroras for the film. (Image courtesy of Magnolia Pictures) Hjorth was Visual Effects Supervisor on Dogville (2003), directed by Lars von Trier.
Hjorth worked with director Lars von Trier to develop the Automavision technique, which was credited with the cinematography for The Boss of It All (2006). A computer algorithm randomly changes the camera’s tilt, pan, focal length and/or positioning, as well as the sound recording, without the camera being actively operated by the cinematographer. Hjorth works closely with stunts, special effects makeup, animal wranglers and other specialists. “I know the craft and what they need from me. They know more about what’s going to be effective on screen, so I just leave them to it and make sure they have what they need. Same thing with animals and visual effects, makeup and stuff like that, physical things. You know, I have a bit of a reputation for trying to get as many pieces of the puzzle as possible with a camera. Some production VFX people get quotes from, say, three different vendors, and then they pick all the cheapest bids for each sequence or shot, and that’s how they get down in budget. I try to avoid that. I’d rather actually sit down with the director and say, for example, ‘We should have some breathing space here.’” When it comes to the future of visual effects in Denmark, Hjorth takes an optimistic view. “I think this trend that we have more production supervisors is basically going to continue, in the way that even if you have very little work, you hire someone from the get-go and you make sure there’s a balance between ambition and resources. There’s a special thing about Denmark, which is that we tend to all stick together, even people who are not in the same line of work. We have lots of experience sharing. There are no limits to who you can call and ask questions. It’s not competitive in this way because people will get the jobs they get. Everybody realizes we have to work together, and what really matters is that we put something on screen that gives the audience a good experience.”