• Dev snapshot: Godot 4.5 dev 5

Replicube
A game by Walaber Entertainment LLC

Dev snapshot: Godot 4.5 dev 5
By: Thaddeus Crews, 2 June 2025 (Pre-release)

Brrr… Do you feel that? That’s the cold front of the feature freeze just around the corner. It’s not upon us just yet, but this is likely to be our final development snapshot of the 4.5 release cycle. As we enter the home stretch of new features, bugs are naturally going to follow suit, meaning bug reports and feedback will be especially important for a smooth beta timeframe.

Jump to the Downloads section and give it a spin right now, or continue reading to learn more about improvements in this release. You can also try the Web editor or the Android editor for this release. If you are interested in the latter, please request to join our testing group to get access to pre-release builds.

The cover illustration is from Replicube, a programming puzzle game where you write code to recreate voxelized objects. It is developed by Walaber Entertainment LLC. You can get the game on Steam.

Highlights

In case you missed them, see the 4.5 dev 1, 4.5 dev 2, 4.5 dev 3, and 4.5 dev 4 release notes for an overview of some key features which were already in those snapshots, and are therefore still available for testing in dev 5.

Native visionOS support

Normally, our featured highlights in these development blogs come from long-time contributors. This makes sense, of course: it’s generally those users who have the familiarity necessary for the kind of major changes or additions that these highlights cover. That’s why it might surprise you to hear that visionOS support comes to us from Ricardo Sanchez-Saez, whose pull request GH-105628 is his very first contribution to the engine! It might not surprise you to hear that Ricardo is part of the visionOS engineering team at Apple, which certainly helps get his foot in the door, but that still makes visionOS the first officially supported platform integration in about a decade.

For those unfamiliar, visionOS is Apple’s XR environment.
We’re no strangers to XR as a concept, but XR platforms are as distinct from one another as traditional platforms. visionOS users have expressed a strong interest in integrating with our ever-growing XR community, and now we can make that happen. See you all in the next XR Game Jam!

GDScript: Abstract classes

While the Godot Engine frequently makes use of abstract classes (classes that cannot be directly instantiated), this was only ever supported internally. Thanks to the efforts of Aaron Franke, this paradigm is now available to GDScript users (GH-67777). If you want to introduce your own abstract class, you merely need to declare it via the new abstract keyword:

    abstract class_name MyAbstract extends Node
The purpose of an abstract class is to create a baseline for other classes to derive from:

    class_name ExtendsMyAbstract extends MyAbstract
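As a fuller sketch of how these two pieces fit together, here is a hypothetical example (the Pickup and HealthPickup class names and the collect() method are illustrative, not from the release notes):

```gdscript
# pickup.gd: an abstract base class. With the new keyword,
# instantiating Pickup directly is rejected, while subclasses
# continue to work as usual.
abstract class_name Pickup extends Area2D

func collect(player: Node) -> void:
    pass  # concrete subclasses are expected to override this


# health_pickup.gd (a separate file): a concrete subclass that
# can be instantiated and placed in scenes normally.
class_name HealthPickup extends Pickup

func collect(player: Node) -> void:
    print("Healed the player!")
```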
Shader baker

From the technical gurus behind implementing ubershaders, Darío Samo and Pedro J. Estébanez bring us another miracle of rendering via GH-102552: shader baker exporting. This is an optional feature that can be enabled at export time to massively speed up shader compilation, and it works with ubershaders automatically, without any work from the user. Shader baking is strongly recommended when targeting Apple devices or D3D12, since it makes the biggest difference there (over a 20× decrease in load times in the TPS demo).

(Before/after comparison images in the original post.)

However, it comes with tradeoffs:

- Export time will be much longer.
- Build size will be much larger, since the baked shaders can take up a lot of space.
- We have removed several MoltenVK bug workarounds from the Forward+ shader, therefore we no longer guarantee support for the Forward+ renderer on Intel Macs. If you are targeting Intel Macs, you should use the Mobile or Compatibility renderers.
- Baking for Vulkan can be done from any device, but baking for D3D12 needs to be done from a Windows device, and baking for Apple .metallib requires a Metal compiler (macOS with Xcode or Command Line Tools installed).

Web: WebAssembly SIMD support

As you might recall, Godot 4.0 initially released under the assumption that multi-threaded web support would become the standard, and only supported that format for web builds. This assumption unfortunately proved to be wishful thinking, and was reverted in 4.3 by allowing single-threaded builds once more. However, this doesn’t mean that these single-threaded environments are inherently incapable of parallel processing; it just requires alternative implementations. One such implementation, SIMD, is a perfect candidate thanks to its support across all major browsers.
To that end, web-wiz Adam Scott has integrated SIMD into our web builds by default (GH-106319).

Inline color pickers

While it has always been possible to see what color is assigned to an exported color variable in the inspector, some users have expressed a keen interest in having this functionality within the script editor itself. This would show what color a variable represents without it needing to be exposed, and make it more intuitive at a glance which color a name or code corresponds to. Koliur Rahman has blessed us with this quality-of-life goodness, adding an inline color picker to the script editor (GH-105724). Now, no matter where a color is declared, users can immediately and intuitively see what it represents, in a non-intrusive manner.

Rendering goodies

The renderer got a fair amount of love this snapshot; not from any one PR, but rather from a multitude of community members bringing some long-awaited features to light. Raymond DiDonato helped SMAA 1x make its transition from addon to fully-fledged engine feature (GH-102330). Capry brings bent normal maps to further enhance specular occlusion and indirect lighting (GH-89988). Our very own Clay John converted our Compatibility backend to use a fragment shader copy instead of a blit copy, working around common sample rate issues on mobile devices (GH-106267).
More technical information on these rendering changes can be found in their associated PRs.

(SMAA off/on and bent normal map before/after comparison images in the original post.)

And more!

There are too many exciting changes to list them all here, but here’s a curated selection:

- Animation: Add alphabetical sorting to Animation Player (GH-103584).
- Animation: Add animation filtering to animation editor (GH-103130).
- Audio: Implement seek operation for Theora video files, improve multi-channel audio resampling (GH-102360).
- Core: Add --scene command line argument (GH-105302).
- Core: Overhaul resource duplication (GH-100673).
- Core: Use Grisu2 algorithm in String::num_scientific to fix serializing (GH-98750).
- Editor: Add “Quick Load” button to EditorResourcePicker (GH-104490).
- Editor: Add PROPERTY_HINT_INPUT_NAME for use with @export_custom to allow using input actions (GH-96611).
- Editor: Add named EditorScripts to the command palette (GH-99318).
- GUI: Add file sort to FileDialog (GH-105723).
- I18n: Add translation preview in editor (GH-96921).
- Import: Add Channel Remap settings to ResourceImporterTexture (GH-99676).
- Physics: Improve performance with non-monitoring areas when using Jolt Physics (GH-106490).
- Porting: Android: Add export option for custom theme attributes (GH-106724).
- Porting: Android: Add support for 16 KB page sizes, update to NDK r28b (GH-106358).
- Porting: Android: Remove the gradle_build/compress_native_libraries export option (GH-106359).
- Porting: Web: Use actual PThread pool size for get_default_thread_pool_size() (GH-104458).
- Porting: Windows/macOS/Linux: Use SSE 4.2 as a baseline when compiling Godot (GH-59595).
- Rendering: Add new StandardMaterial properties to allow users to control FPS-style objects (hands, weapons, tools close to the camera) (GH-93142).
- Rendering: FTI - Optimize SceneTree traversal (GH-106244).

Changelog

109 contributors submitted 252 fixes for this release. See our interactive changelog for the complete list of changes since the previous 4.5-dev4 snapshot.

This release is built from commit 64b09905c.

Downloads

Godot exists thanks to donations from people like you.
Help us continue our work: Make a Donation.

The Standard build includes support for GDScript and GDExtension. The .NET build (marked as “mono”) includes support for C#, as well as GDScript and GDExtension.

While engine maintainers try their best to ensure that each preview snapshot and release candidate is stable, this is by definition a pre-release piece of software. Be sure to make frequent backups, or use a version control system such as Git, to preserve your projects in case of corruption or data loss.

Known issues

Windows executables (both the editor and export templates) have been signed with an expired certificate. You may see warnings from Windows Defender’s SmartScreen when running this version, or be outright prevented from running the executables with a double-click (GH-106373). Running Godot from the command line can circumvent this. We will soon have a renewed certificate, which will be used for future builds.

With every release, we accept that there are going to be various issues which have already been reported but haven’t been fixed yet. See the GitHub issue tracker for a complete list of known bugs.

Bug reports

As a tester, we encourage you to open bug reports if you experience issues with this release. Please check the existing issues on GitHub first, using the search function with relevant keywords, to ensure that the bug you experience is not already known. In particular, any change that would cause a regression in your projects is very important to report (e.g. something that worked fine in previous 4.x releases but no longer works in this snapshot).

Support

Godot is a non-profit, open source game engine developed by hundreds of contributors in their free time, as well as a handful of part-time and full-time developers hired thanks to generous donations from the Godot community. A big thank you to everyone who has contributed their time or their financial support to the project!

If you’d like to support the project financially and help us secure our future hires, you can do so using the Godot Development Fund.

Donate now
    GODOTENGINE.ORG
  • CASTING A BLACK MIRROR ON USS CALLISTER: INTO INFINITY

    By TREVOR HOGG

    Images courtesy of Netflix.

    Unlike North America, where episodes tend to be no longer than an hour, it is not uncommon in Britain to have feature-length episodes, which explains why the seasons are shorter. Season 7 of Black Mirror has six episodes, and the first sequel for the Netflix anthology series that explores the dark side of technology has a run time of 90 minutes. “USS Callister: Into Infinity” comes eight years after “USS Callister” went on to win four Emmys as part of Season 4, and expands the tale in which illegally constructed digital clones made from human DNA struggle to survive in a multiplayer online video game environment. Returning creative talent includes filmmaker Toby Haynes, writers Charlie Brooker and William Bridges, and cast members Cristin Milioti, Jimmi Simpson, Osy Ikhile, Milanka Brooks, Paul Raymond and Jesse Plemons. Stepping into the Star Trek-meets-The Twilight Zone proceedings for the first time is VFX Supervisor James MacLachlan, who previously handled the digital augmentation for Ted Lasso.

    “… We got on a train and went to the middle of Anglesey to a copper mine. The copper mine was absolutely stunning. … You’re a good 50 meters down, and there were little tunnels and caves where over the years things have been mined and stopped. … It was shot there, and we augmented some of it to help sell the fact that it wasn’t Earth. We put in these big beautiful arches of rock, Saturn-like planets up in the sky, a couple of moons, and clean-up of giveaways.”
    —James MacLachlan, Visual Effects Supervisor

    Taking advantage of the reflective quality of the bridge set was the LED wall utilized for the main viewscreen.

    Dealing with a sequel to a critically-acclaimed episode was not a daunting task. “It’s almost like I have a cheat code for what we need to do, which I quite like because there’s a language from the previous show, so we have a certain number of outlines and guidelines,” MacLachlan states. “But because this was set beyond where the previous one was, it’s a different kind of aesthetic. I didn’t feel the pressure.” No assets were reused. “We were lucky that the company that previously did the USS Callister ship packaged it out neatly for us, and we were able to take that model; however, it doesn’t fit in pipelines anymore in the same way with the layering and materials. It was different visual effects vendors as well. Union VFX was smashing out all our new ships, planets and the Heart of Infinity. There was a significant number of resources put into new content.” Old props were helpful. “The Metallica ship that shows up in this episode is actually the Valdack ship turned backwards, upside down, re-textured and re-modeled off a prop I happened to wander past and saw in Charlie Brooker’s and Jessica Rhoades’ office,” MacLachlan notes.

    Greenscreens were placed outside of the set windows for the USS Callister.

    “USS Callister: Into Infinity” required 669 visual effects shots while the other five episodes totaled 912. “Josie Henwood, the Visual Effects Producer, sat down with a calculator and did an amazing job of making sure that the budget distribution was well-weighted for each of the scripts,” MacLachlan remarks. “We shot this one third and posted it all the way to the end, so it overlapped a lot with some of the others. It was almost an advantage because we could work out where we were at with the major numbers and balance things out. It was a huge benefit that Toby had directed ‘USS Callister’. We had conversations about how we could approach the visual effects and make sure they sat within the budget and timeframe.” Working across the series were Crafty Apes, Jam VFX, Jellyfish Pictures, Magic Lab, One of Us, Stargate Studios, Terraform Studios, Union VFX, and Bigtooth Studios. “We had a spectrum of vendors that were brilliant and weighted so Union VFX took the heavy load on ‘USS Callister: Into Infinity,’ One of Us on ‘Eulogy’ and Jam VFX on ‘Hotel Reverie’ while the other vendors were used for all the shows.”

    “[W]e had a matte painter at Territory Studio create some generic space looks like exteriors of planets in pre-production. We gave those to Union VFX who animated them so the stars gently drifted and the planets would slowly rotate. Everything in that set was chrome, so no matter where the camera was pointing, when we went to hyperspace, outside planets or in space, there were all of these beautiful reflections all over the surfaces of the USS Callister. What I did not anticipate is when the actors came onto the set not knowing it was going to be an LED wall. Their reaction was enough to say that we had made the right choice.”
    —James MacLachlan, Visual Effects Supervisor

    Miranda Jones looked after the production design and constructed a number of practical sets for the different sections of the USS Callister.

    A clever visual effect was deployed when a digital clone of Robert Daly (Jesse Plemons) is in his garage crafting a new world for Infinity, which transforms from a horizontal landscape into a spherical planetary form. “A lot of it was based off current UI when you use your phone and scroll,” MacLachlan remarks. “It is weighted and slows down through an exponential curve, so we tried to do that with the rotational values. We also looked at people using HoloLenses and Minority Report with those gestural moments. It has a language that a number of people are comfortable with, and we have gotten there with AR.” Union VFX spent a lot of time working on the transition. “They had three levels of detail for each of the moments. We had the mountain range and talked about the Himalayas. Union VFX had these moments where they animated between the different sizes and scales of each of these models. The final one is a wrap and reveal to the sphere, so it’s like you’re scaling down and out of the moment, then it folds out from itself. It was really nice.”
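    The weighted, scroll-like rotation MacLachlan describes, a "fling" that slows down through an exponential curve, can be sketched as follows. This is only an illustrative approximation of that kind of easing; the function name and damping value are hypothetical and not from the production's pipeline.

```python
import math

def fling_rotation(initial_velocity_deg: float, damping: float, t: float):
    """Scroll-style 'fling' easing: rotational velocity decays
    exponentially, so the rotation coasts and settles smoothly
    instead of stopping abruptly.

        velocity(t) = v0 * exp(-damping * t)
        angle(t)    = (v0 / damping) * (1 - exp(-damping * t))

    angle(t) is the time integral of velocity(t), and it settles at
    v0 / damping degrees past the release point.
    """
    decay = math.exp(-damping * t)
    angle = (initial_velocity_deg / damping) * (1.0 - decay)
    velocity = initial_velocity_deg * decay
    return angle, velocity

# Example: released at 360 deg/s with damping 4.0, the rotation
# coasts to rest 90 degrees past the release point.
```

    The same curve drives phone scrolling: the exponential decay means the motion feels heavy at release and eases gently into place, which is the behavior the team matched for the landscape-to-planet transition.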

    For safety reasons, weapons were digitally thrown. “We had a 3D prop printed for the shuriken and were able to get that out in front of the camera onstage,” MacLachlan explains. “Then we decided to have it stand out more, so as [the Infinity Player] throws it, it intentionally lights up. On set we couldn’t throw anything at Cristin, so some tracking markers were put on her top where it needed to land. Then we did that in CGI. When she is pulling it off her chest with her hand, the shuriken is all CGI. Because of the shape of the shuriken, we were able to have it poke through the fingers and be visible, so it worked well. Cristin did a convincing job of yanking the shuriken out. We added some blood and increased the size of the wound on her top, which we had to do for a couple of other scenes because blood goes dark when it’s dry, so it needed to be made redder.” Nanette Cole (Cristin Milioti) throws a ceremonial knife that hits Robert Daly directly in the head. “That was a crazy one. We had the full prop on the shelf in the beginning that she picks up and throws. The art department made a second one with a cutout section that was mounted to his head. Lucy Cain [Makeup & Hair Designer] and I constructed a cage of hair clips and wire to hold it onto his head. Beyond that, we put tracking markers on his forehead, and we were able to add all of the blood. What we didn’t want to do was have too much blood and then have to remove it later. The decision was made to do the blood in post because you don’t want to be redressing it if you’re doing two or three takes; that can take a lot of time out of production.”

    “USS Callister: Into Infinity” required 669 visual effects shots.

    A digital clone of Robert Daly placed inside the game engine is responsible for creating the vast worlds found inside of Infinity.

    “We had a 3D prop printed for the shuriken [hidden hand weapon]… Then we decided to have it stand out more, so as [the Infinity Player] throws it, it intentionally lights up. On set we couldn’t throw anything at Cristin, so some tracking markers were put on her top where it needed to land. Then we did that in CGI. When she is pulling it off her chest with her hand, the shuriken is all CGI. Because of the shape of the shuriken, we were able to have it poke through the fingers and be visible… Cristin did a convincing job of yanking the shuriken out.”
    —James MacLachlan, Visual Effects Supervisor

    A cross between 2001: A Space Odyssey and Cast Away is the otherworldly planet where the digital clone of James Walton (Jimmi Simpson) is found. “We got on a train and went to the middle of Anglesey [island in Wales] to a copper mine,” MacLachlan recounts. “The copper mine was absolutely stunning. It’s not as saturated. You’re a good 50 meters down, and there were little tunnels and caves where over the years things have been mined and stopped. We found moments that worked for the different areas. It was shot there, and we augmented some of it to help sell the fact that it wasn’t Earth. We put in these big beautiful arches of rock, Saturn-like planets up in the sky, a couple of moons, and clean-up of giveaways.”

    The blue teleportation ring was practically built and digitally enhanced.

    Set pieces were LiDAR scanned. “What was interesting about the ice planet [was that] the art department built these amazing structures in the foreground and beyond that we had white drapes the whole way around, which fell off into darkness beautifully and naturally because of where the light was pulled by Stephan Pehrsson [Cinematographer],” MacLachlan states. “On top of that, there was the special effects department, which was wafting in a lot of atmospherics. Some of the atmospherics were in-camera and others were augmented to even it out and boost it in places to help the situation. We did add foreground snow. There is a big crane shot in the beginning where Unreal Engine assisted in generating some material. Then we did matte painting and set extensions beyond that to create a larger scale and cool rock shapes that were on an angle.” The jungle setting was an actual location. “That’s Black Park [in England], and because of the time of year, there are a lot of protected plants. We had a couple of moments where we weren’t allowed to walk in certain places. There is one big stunt where Nanette steps on a mine, and it explodes her back against a tree. That was a protected tree, so the whole thing was wrapped in this giant stunt mat while the stunt woman got thrown across it. Areas would be filled in with dressed plants to help the foreground, but we got most of the background in-camera. There were bits of clean-up where we spotted crew or trucks.”

    Large-scale and distinct rock shapes were placed at an angle to give the ice planet more of an alien quality.

    An exterior space shot of the USS Callister that is entirely CG.

    Twin versions of Nanette Cole and James Walton appear within the same frame. “Literally, we used every trick in the book the whole way through. Stephan and I went to see a motion control company that had a motion control camera on a TechnoDolly. Stephan could put it on his shoulder and record a move on a 20-foot crane. Once Stephan had done that first take, he would step away, then the motion control guys would do the same move again. You get this handheld feel through motion control rather than plotting two points and having it mechanical. You get a wide of a scene of clone Nanette in a chair and real Nanette standing in white, and you’ll notice the two Waltons in the background interacting with one another. Those shots were done on this motion control rig. We had motion control where we could plot points to make it feel like a tracking dolly. Then we also had our cameraman doing handheld moves pushing in and repeating himself. We had a wonderful double for Cristin who was excellent at mirroring what she was achieving, and they would switch and swap. You would have a shoulder or hair in the foreground in front of you, but then we would also stitch plates together that were handheld.”

    The USS Callister approaches the game engine situated at the Heart of Infinity.

    A homage to the fighter cockpit shots featured in the Star Wars franchise.

    USS Callister flies into the game engine while pursued by other Infinity players.

    A major story point is that the game engine is made to look complex but is in fact a façade.

    A copper mine served as the location for the planet where the digital clone of James Walton is found.

    Principal photography for the jungle planet took place at Black Park in England.

    The blue skin of Elena Tulaska was achieved with practical makeup.

    Assisting the lighting were some cool tools such as the teleportation ring. “We had this beautiful two-meter blue ring that we were able to put on the ground and light up as people step into it,” MacLachlan remarks. “You get these lovely reflections on their visors, helmets and kits. Then we augmented the blue ring in visual effects where it was replaced with more refined edging and lighting effects that stream up from it, which assisted with the integration with the teleportation effect because of their blue cyan tones.” Virtual production was utilized for the main viewscreen located on the bridge of the USS Callister. “In terms of reflections, the biggest boon for us in visual effects was the LED wall. The last time they did the big screen in the USS Callister was a greenscreen. We got a small version of an LED screen when the set was being built and did some tests. Then we had a matte painter at Territory Studio create some generic space looks like exteriors of planets in pre-production. We gave those to Union VFX who animated them so the stars gently drifted and the planets would slowly rotate. Everything in that set was chrome, so no matter where the camera was pointing, when we went to hyperspace or outside planets or in space, there were all of these beautiful reflections all over the surfaces of the USS Callister. What I did not anticipate is when the actors came onto the set not knowing it was going to be an LED wall. Their reaction was enough to say that we had made the right choice.”
    CASTING A BLACK MIRROR ON USS CALLISTER: INTO INFINITY
    By TREVOR HOGG. Images courtesy of Netflix.
    WWW.VFXVOICE.COM
Once Stephan had done that first take, he would step away, then the motion control guys would do the same move again. You get this handheld feel through motion control rather than plotting two points and having it mechanical. You get a wide of a scene of clone Nanette in a chair and real Nanette standing in white, and you’ll notice the two Waltons in the background interacting with one another. Those shots were done on this motion control rig. We had motion control where we could plot points to make it feel like a tracking dolly. Then we also had our cameraman doing handheld moves pushing in and repeating himself. We had a wonderful double for Cristin who was excellent at mirroring what she was achieving, and they would switch and swap. You would have a shoulder or hair in the foreground in front of you, but then we would also stitch plates together that were handheld.” The USS Callister approaches the game engine situated at the Heart of Infinity. A homage to the fighter cockpit shots featured in the Star Wars franchise. USS Callister flies into the game engine while pursued by other Infinity players. A major story point is that the game engine is made to look complex but is in fact a façade. A copper mine served as the location for the planet where the digital clone of James Walton (Jimmi Simpson) is found. Principal photography for the jungle planet took place at Black Park in England. The blue skin of Elena Tulaska (Milanka Brooks) was achieved with practical makeup. Assisting the lighting were some cool tools such as the teleportation ring. “We had this beautiful two-meter blue ring that we were able to put on the ground and light up as people step into it,” MacLachlan remarks. “You get these lovely reflections on their visors, helmets and kits. 
Then we augmented the blue ring in visual effects where it was replaced with more refined edging and lighting effects that stream up from it, which assisted with the integration with the teleportation effect because of their blue cyan tones.” Virtual production was utilized for the main viewscreen located on the bridge of the USS Callister. “In terms of reflections, the biggest boon for us in visual effects was the LED wall. The last time they did the big screen in the USS Callister was a greenscreen. We got a small version of a LED screen when the set was being built and did some tests. Then we had a matte painter at Territory Studio create some generic space looks like exteriors of planets in pre-production. We gave those to Union VFX who animated them so the stars gently drifted and the planets would slowly rotate. Everything in that set was chrome, so no matter where the camera was pointing, when we went to hyperspace or outside planets or in space, there were all of these beautiful reflections all over the surfaces of the USS Callister. What I did not anticipate is when the actors came onto the set not knowing it was going to be a LED wall. Their reaction was enough to say that we had made the right choice.”
  • Battlefront 2 is seeing a sudden surge in players

    Saucycarpdog
    Member

    Oct 25, 2017

    20,317

    On Steam the game is reaching CCU numbers it hasn't gotten in years.

    Console is harder to see but the game is #22 on Xbox Most Played charts right behind Overwatch 2. It's usually out of the top 50.

    Crazy to see. I'm guessing it's the Andor effect? 

    giancarlo123x
    One Winged Slayer
    Member

    Oct 25, 2017

    28,002

    The game still fucks. The game was in such a great spot with the anniversary update, then they pulled the plug on content.
     

    AgentOtaku
    Member

    Oct 27, 2017

    4,586

    Saucycarpdog said:

    On Steam the game is reaching CCU numbers it hasn't gotten in years.

    Console is harder to see but the game is #22 on Xbox Most Played charts right behind Overwatch 2. It's usually out of the top 50.

    Crazy to see. I'm guessing it's the Andor effect?

    Yeah, noticed it creeping up the last few weeks. Insane and probably the Andor effect. 

    ASleepingMonkey
    The Fallen

    Oct 26, 2017

    4,579

    Iowa

    This game is very, very popular among gen Z in particular. It is hard for me to scroll Instagram or TikTok without seeing a post about it.

    Younger people adore this game, particularly those who love the prequels. 

    Hella
    Member

    Oct 27, 2017

    24,492

    Guess it's time to reinstall Battlefront 2. Game is super fun

    Thank you for the PSA, OP. 

    HockeyBird
    Member

    Oct 27, 2017

    13,798

    You have friends everywhere when you play Battlefront II.
     

    Neverx
    Prophet of Truth - One Winged Slayer
    Member

    Sep 17, 2020

    3,854

    Florida

    It's a perfect storm of the yearly May 4th resurgence, Revenge of the Sith rerelease and Andor S2. Been playing a bit recently and the game is still great, such a shame they pulled the plug. Really excited for Zero Company and Jedi 3 but we really need a new Battlefront.
     

    Fuchsia
    Member

    Oct 28, 2017

    7,249

    It's honestly a fantastic game after all the work that was put into it post launch. It's still a blast to fire up today.

    Really sad that they basically killed support right when it was starting to become an all timer. 

    Nocturnowl
    Member

    Oct 25, 2017

    28,294

    It's all fun and games until the god players who have been playing eternally earn a hero like Yoda early on and the rest of the game becomes a slasher sim where your lil trooper or droid lives in constant terror of the unstoppable turbo speed lightsaber spam racing towards you

    Neat to see it have a resurgence though, not big on star wars these days but I did have a lot of fun playing this messy arse game with friends 

    JakeNoseIt
    Catch My Drift
    Verified

    Oct 27, 2017

    4,751

    Woah I don't watch Andor but I randomly redownloaded the game like a week ago and have been having fun poking around
     

    Zebesian-X
    Member

    Dec 3, 2018

    25,362

    Great game kneecapped by its monetization strategy. Sad that they abandoned it right as it was hitting its stride. Glad to see people playing! Shame we never got a third one
     

    Prison_mike
    Banned

    Oct 26, 2017

    1,676

    Hell yeah, I bet Fortnite is a reason too.

    I need to renew my online for this, dammit 

    DrScruffleton
    Member

    Oct 26, 2017

    14,887

    Game was killed way too soon. Imagine all the crazy promotional stuff we could've gotten over the years with show tie-ins and stuff.
     

    dodo
    Member

    Oct 27, 2017

    4,293

    I'm part of the wave!
     

    Creed Bratton
    Member

    Aug 29, 2019

    782

    I jumped back in after playing the new Star Wars Fortnite content. It made me want a true SW experience.
     

    Gleethor
    Member

    Oct 25, 2017

    4,215

    Dot Matrix with stereo sound

    Coulda had so much new content had they stuck with it. Or at least a third game to break the curse.
     

    AgentOtaku
    Member

    Oct 27, 2017

    4,586

    ASleepingMonkey said:

    This game is very, very popular among gen Z in particular. It is hard for me to scroll Instagram or TikTok without seeing a post about it.

    Younger people adore this game, particularly those who love the prequels.

    Haha, makes sense. Both my sons adored it.

    We really should have gotten another one :/ 

    Parker
    Member

    Feb 5, 2018

    698

    Also, the people who got it for free on EGS
     

    Lord Vatek
    Avenger

    Jan 18, 2018

    24,748

    Fortnite, May 4th, and Andor all likely play a part.

    I haven't checked, but I'd wager that Star Wars The Old Republic is seeing a similar bump. 

    dodo
    Member

    Oct 27, 2017

    4,293

    I'm still not a big fan of the card loadout system and I wish the vehicles were handled differently, but when it's hitting it has that DICE magic. Great visuals, amazing sound, loving attention to detail.
     

    Corv
    Member

    Aug 5, 2022

    645

    Prison_mike said:

    Hell yeah, I bet Fortnite is a reason too.


    Yeah the current Fortnite season made me want to boot it up
     

    Forerunner
    Resetufologist
    The Fallen

    Oct 30, 2017

    18,835

    Dice dropping support of this right when all the new SW content was being released was weird, they could have added so much to it.
     

    Strikerrr
    Member

    Oct 25, 2017

    2,858

    I've seen a big push from Arc Raiders fans who are suffering from tech test withdrawal.

    Even though they're very different games, there are some aspects of the movement and shooting that feel somewhat similar, considering that a lot of the staff at DICE went to Embark after BF1/V.
    IIRC some devs who were at Motive during Battlefront 2's development commented that the initial Arc Raiders reveal, when it was still a PvE game, looked similar to some gameplay concepts originally pitched for Battlefront 2.

    I could see this as built from a discarded Battlefront co-op mode where you take down AT-ATs.

    OhhEldenRing
    Member

    Aug 14, 2024

    2,860

    The Jade Raymond effect. It's been in the news that was her last big release.
     

    Santos
    Member

    Oct 25, 2017

    1,376

    Have they addressed hacking on PC? Tried playing again on PC a couple of years ago and there were tons of hackers everywhere, to the point they were removing map objectives lol.
     

    Jona Basta
    Member

    May 13, 2025

    27

    Santos said:

    Have they addressed hacking on PC? Tried playing again on PC a couple of years ago and there were tons of hackers everywhere, to the point they were removing map objectives lol.


    There's a new project called Kyber that fixes that and introduces a lot of new things, but they keep pushing back the release date while pushing their Patreon, so I'm not feeling too hot on it personally.
     

    Fuchsia
    Member

    Oct 28, 2017

    7,249

    Now, more than ever, I feel a new Battlefront would be so incredible. The issue is how do you feasibly make a game that would have enough content from each era of Star Wars to do it all justice? It would take so much manpower and time.
     

    OP
    Saucycarpdog
    Member

    Oct 25, 2017

    20,317

    Also, if any of y'all have Insurgency Sandstorm on PC, you can try the new Star Wars Sagas mod.

     

    Cheesy
    Member

    Oct 30, 2017

    2,564

    Saucycarpdog said:

    Also, if any of y'all have Insurgency Sandstorm on PC, you can try the new Star Wars Sagas mod.



    Oh damn I didn't know that game got any mods that weren't just a bunch of really tacticool guns. I'll have to take a look. 

    Jona Basta
    Member

    May 13, 2025

    27

    Saucycarpdog said:

    Also, if any of y'all have Insurgency Sandstorm on PC, you can try the new Star Wars Sagas mod.



    There's also the fantastic Squad mod Galactic Contention.


    Prasino95
    Member

    Feb 18, 2025

    6

    Such a shame they messed up the remaster of the original so badly; bet that would have gotten a surge too.
     

    Mocha Joe
    Member

    Jun 2, 2021

    13,433

    I should reinstall. I never gave it an honest try after strongly disliking the first one. Spent on basically a tech demo and I was a fucking idiot and bought the season pass. I never have bought a season pass since then
     

    danhz
    Member

    Apr 20, 2018

    3,539

    Hopefully I can play some games tomorrow. Game was fire
     

    luca
    Member

    Oct 25, 2017

    19,417

    Never mind. It's the fps Battlefront game. I loved the single player campaign.
     

    tjh282
    Member

    May 29, 2019

    1,097

    My youngest brother and his friends have this in their MP rotation and they're not even huge SW fans. Marketing campaign killed this on arrival, but I loved both of the new BFs
     

    Dance Inferno
    Member

    Oct 29, 2017

    2,739

    Saucycarpdog said:

    Also, if any of y'all have Insurgency Sandstorm on PC, you can try the new Star Wars Sagas mod.



    Woah WTF how have I never heard of this? This looks dope, downloading this ASAP. 

    Richietto
    One Winged Slayer
    Member

    Oct 25, 2017

    26,052

    North Carolina

    There's also a huge mod called Kyber that'll really bring life to the game.
     

    Man Called Aerodynamics
    Member

    Oct 29, 2017

    8,327

    I'm not a fan of BF2 in particular, but it's always nice to see older games get rediscovered or retain popularity.
     

    Papercuts
    Prophet of Truth
    Member

    Oct 25, 2017

    12,726

    Feel like this happened a few years back too, I remember actually picking it back up and having a surprisingly good time.
     

    panic
    Member

    Oct 26, 2024

    83

    Last time I tried to play this my team was being completely obliterated on spawn by players using starfighters.
     

    thediamondage
    Member

    Oct 25, 2017

    13,613

    I am jonesing hard for Star Wars after Andor, so no surprise. Rewatching the movies now, but I'll probably play Star Wars Outlaws next, although Battlefront 2 isn't a bad idea; I loved that game in the first year. It's dumb as hell, but it can be super fun too if you are just in it for the vibes.

    Publishers should be discounting Star Wars games heavily right now; it's an automatic impulse buy, I think 

    ASleepingMonkey
    The Fallen

    Oct 26, 2017

    4,579

    Iowa

    AgentOtaku said:

    Haha, makes sense. Both my sons adored it.

    We really should have gotten another one :/

    I would like to see DICE give it another go after Battlefield 6. I assume no matter what, they're going to want to shake it up a bit after close to another decade of nothing but Battlefield and so long as Battlefield 6 is good, it can probably sustain itself for a while. Going back to Battlefront would be great.

    I just want a better campaign this time, taking some cues from Andor and placing you in the shoes of just an ordinary person in this galactic war would be a great starting point. 

    BrickArts295
    GOTY Tracking Thread Master
    Member

    Oct 26, 2017

    15,642

    It's 100% Andor + the Fortnite Star Wars event / recent rerelease of Episode III. I caught the fever and even went back to get the platinum for Jedi Survivor cause I just wanted to play anything Star Wars.

    It's crazy how cursed Battlefront 3 is. Seems like both times we could have had the ultimate Star Wars game, which would have built up from their last two iterations. EA seems to be fine with working on Star Wars despite their latest stance on licensed titles; maybe DICE will be given one more go at it. 

    Javier23
    Member

    Oct 28, 2017

    3,230

    I for one have been thinking about giving this a try because of Fortnite's current SW season.
     

    wellpapp
    Member

    Aug 21, 2018

    531

    Gothenburg

    Weren't the servers 100% broken by all the cheaters?
     

    The Quentulated Mox
    Corrupted by Vengeance
    Member

    Jun 10, 2022

    6,566

    On the one hand, I love Andor in large part because of how much it does differently from the rest of the franchise, and I think blending it in with all the rest of the stuff risks diluting it.

    On the other hand, imagine a Ferrix map in Battlefront. Imagine a Narkina 5 map. Imagine blasting a truck full of troopers on Mina-Rau from your TIE Avenger. Imagine switching Cassian's gun into sniper configuration. Imagine spinning your Fondor haulcraft a billion times and shredding everything around you. Need 

    BubbaKrumpz
    The Fallen

    Oct 25, 2017

    3,849

    Yay Area

    Papercuts said:

    Feel like this happened a few years back too, I remember actually picking it back up and having a surprisingly good time.


    Yeah, I went in with some of my buds and it was a great time. I've enjoyed the game since release though.

    I have a friend who only plays Battlefront 2 and, most recently, Fortnite 

    Excelsior
    Member

    Oct 28, 2017

    1,058

    wellpapp said:

    Weren't the servers 100% broken by all the cheaters?


    This is what I heard too, is it playable? 

    Chromie
    Member

    Dec 4, 2017

    5,742

    AgentOtaku said:

    Yeah, noticed it creeping up the last few weeks. Insane and probably the Andor effect.


    I will always love Star Wars but man, Andor has made Star Wars chatter online so positive that I jumped back into BF2.
     

    Doctor Shatner
    The Fallen

    Oct 25, 2017

    252

    Lol, I literally reinstalled this because of Andor. So at least for me it was that! Funny to see such a weird hivemind-esque effect going on there.

    Game is still fun, but I have to pretend I'm a literal stormtrooper with my skill level. The number of crazy-high ranks who fly around as Vader/Yoda, etc. a couple minutes into the match and insta-gib me would drive me mad if I played seriously. 
    #battlefront #seeing #sudden #surge #players
    Battlefront 2 is seeing a sudden surge in players
    Saucycarpdog Member Oct 25, 2017 20,317 On Steam the game is reaching CCU numbers it hasn't gotten in years. Console is harder to see but the game is #22 on Xbox Most Played charts right behind Overwatch 2. It's usually out of the top 50. Crazy to see. I'm guessing it's the Andor effect?  giancarlo123x One Winged Slayer Member Oct 25, 2017 28,002 The game still fucks. The game was in such a great spot with the anniversary update then they pulled the plug on content.   AgentOtaku Member Oct 27, 2017 4,586 Saucycarpdog said: On Steam the game is reaching CCU numbers it hasn't gotten in years. Console is harder to see but the game is #22 on Xbox Most Played charts right behind Overwatch 2. It's usually out of the top 50. Crazy to see. I'm guessing it's the Andor effect? Click to expand... Click to shrink... Yeah, noticed it creeping up the last few weeks. Insane and probably the Andor effect.  ASleepingMonkey The Fallen Oct 26, 2017 4,579 Iowa This game is very, very popular among gen Z in particular. It is hard for me to scroll Instagram or TikTok without seeing a post about it. Younger people adore this game, particularly those who love the prequels.  Hella Member Oct 27, 2017 24,492 Guess it's time to reinstall Battlefront 2. Game is super fun Thank you for the PSA, OP.  HockeyBird Member Oct 27, 2017 13,798 You have friends everywhere when you play Battlefront II.   Neverx Prophet of Truth - One Winged Slayer Member Sep 17, 2020 3,854 Florida It's a perfect storm of the yearly May 4th resurgence, Revenge of the Sith rerelease and Andor S2. Been playing a bit recently and the game is still great, such a shame they pulled the plug. Really excited for Zero Company and Jedi 3 but we really need a new Battlefront.   Fuchsia Member Oct 28, 2017 7,249 It's honestly a fantastic game after all the work that was put into it post launch. It's still a blast to fire up today. Really sad that they basically killed support right when it was starting to become an all timer.  
Nocturnowl Member Oct 25, 2017 28,294 It's all fun and games until the god players who have been playing eternally earn a hero like Yoda early on and the rest of the game becomes a slasher sim where your lil trooper or droid lives in constant terror of the unstoppable turbo speed lightsaber spam racing towards you Neat to see it have a resurgence though, not big on star wars these days but I did have a lot of fun playing this messy arse game with friends  JakeNoseIt Catch My Drift Verified Oct 27, 2017 4,751 Woah I don't watch Andor but I randomly redownloaded the game like a week ago and have been having fun poking around   Zebesian-X Member Dec 3, 2018 25,362 great game kneecapped by its monetization strategy. Sad that they abandoned it right as it was hitting its stride. Glad to see people playing! Shame we never got a third one   Prison_mike Banned Oct 26, 2017 1,676 Hell yeah I bet Fortnight is a reason too. I need to renew my online for this, dammit  DrScruffleton Member Oct 26, 2017 14,887 game was killed way too soon. Imagine all the crazy promotional stuff we couldve gotten over the years with show tie ins and stuff.   dodo Member Oct 27, 2017 4,293 I'm part of the wave!   Creed Bratton Member Aug 29, 2019 782 I jumped back in after playing the new Star Wars Fortnite content. It made me want a true SW experience.   Gleethor Member Oct 25, 2017 4,215 Dot Matrix with stereo sound Coulda had so much new content had they stuck with it. Or at least a third game to break the curse.   AgentOtaku Member Oct 27, 2017 4,586 ASleepingMonkey said: This game is very, very popular among gen Z in particular. It is hard for me to scroll Instagram or TikTok without seeing a post about it. Younger people adore this game, particularly those who love the prequels. Click to expand... Click to shrink... Haha, makes sense. Both my sons adored it. 
We really should have gotten another one :/  Parker Member Feb 5, 2018 698 Also, the people who got it for free on EGS   Lord Vatek Avenger Jan 18, 2018 24,748 Fortnite, May 4th, and Andor all likely play a part. I haven't checked, but I'd wager that Star Wars The Old Republic is seeing a similar bump.  dodo Member Oct 27, 2017 4,293 I'm still not a big fan of the card loadout system and I wish the vehicles were handled differently, but when it's hitting it has that DICE magic. Great visuals, amazing sound, loving attention to detail.   Corv Member Aug 5, 2022 645 Prison_mike said: Hell yeah I bet Fortnight is a reason too. Click to expand... Click to shrink... Yeah the current Fortnite season made me want to boot it up   Forerunner Resetufologist The Fallen Oct 30, 2017 18,835 Dice dropping support of this right when all the new SW content was being released was weird, they could have added so much to it.   Strikerrr Member Oct 25, 2017 2,858 I've seen a big push from Arc Raiders fans who are suffering from tech test withdrawl. Even though they're very different games, there's some aspects about the movement and shooting which feel somewhat similar considering that a lot of the staff at DICE went to Embark after BF1/V. IIRC some devs who were at Motive during Battlefront 2's development commented that the initial Arc Raiders reveal when it was still a PvE game looked similar to some gameplay concepts that were originally pitched from Battlefront 2. View: I could see this as built from a discarded Battlefront coop mode where you take down AT-ATs  OhhEldenRing Member Aug 14, 2024 2,860 The Jade Raymond effect. It's been in the news that was her last big release.   Santos Member Oct 25, 2017 1,376 Have they addressed hacking on PC? Tried play again on PC a couple of years ago and there were tons of hackers everywhere to the point they were removing map objectives lol.   Jona Basta Member May 13, 2025 27 Santos said: Have they addressed hacking on PC? 
Tried play again on PC a couple of years ago and there were tons of hackers everywhere to the point they were removing map objectives lol. Click to expand... Click to shrink... There's a New Project Called Kyber that fixes that + introduces a lot of new things, but they keep pushing back the release date on that while pushing their Patreon so I'm not feeling to hot on it personally.   Fuchsia Member Oct 28, 2017 7,249 Now, more than ever, I feel a new Battlefront would be so incredible. The issue is how do you feasibly make a game that would have enough content from each era of Star Wars to do it all justice? It would take so much manpower and time.   OP OP Saucycarpdog Member Oct 25, 2017 20,317 Also, if any of y'all have Insurgency Sandstorm on PC, you can try the new Star Wars Sagas mod. View:   Cheesy Member Oct 30, 2017 2,564 Saucycarpdog said: Also, if any of y'all have Insurgency Sandstorm on PC, you can try the new Star Wars Sagas mod. View: Click to expand... Click to shrink... Oh damn I didn't know that game got any mods that weren't just a bunch of really tacticool guns. I'll have to take a look.  Jona Basta Member May 13, 2025 27 Saucycarpdog said: Also, if any of y'all have Insurgency Sandstorm on PC, you can try the new Star Wars Sagas mod. View: Click to expand... Click to shrink... there's also the fantastic Squad Mod Galactic Contention: View:   Prasino95 Member Feb 18, 2025 6 Such a shame they messed up the remaster of the original so bad, bet that would have got a surge too.   Mocha Joe Member Jun 2, 2021 13,433 I should reinstall. I never gave it an honest try after strongly disliking the first one. Spent on basically a tech demo and I was a fucking idiot and bought the season pass. I never have bought a season pass since then   danhz Member Apr 20, 2018 3,539 Hopefully i can play sone games tomorrow, Game was fire 🔥🔥   luca Member Oct 25, 2017 19,417 Never mind. It's the fps Battlefront game. I loved the single player campaign.   
    WWW.RESETERA.COM
    Battlefront 2 is seeing a sudden surge in players
Saucycarpdog (Member, Oct 25, 2017, 20,317): On Steam the game is reaching CCU numbers it hasn't hit in years. Console is harder to see, but the game is #22 on the Xbox Most Played charts, right behind Overwatch 2. It's usually out of the top 50. Crazy to see. I'm guessing it's the Andor effect?

giancarlo123x (One Winged Slayer, Member, Oct 25, 2017, 28,002): The game still fucks. The game was in such a great spot with the anniversary update, then they pulled the plug on content.

AgentOtaku (Member, Oct 27, 2017, 4,586), replying to Saucycarpdog: Yeah, noticed it creeping up the last few weeks. Insane, and probably the Andor effect.

ASleepingMonkey (The Fallen, Oct 26, 2017, 4,579, Iowa): This game is very, very popular among Gen Z in particular. It is hard for me to scroll Instagram or TikTok without seeing a post about it. Younger people adore this game, particularly those who love the prequels.

Hella (Member, Oct 27, 2017, 24,492): Guess it's time to reinstall Battlefront 2. Game is super fun. Thank you for the PSA, OP.

HockeyBird (Member, Oct 27, 2017, 13,798): You have friends everywhere when you play Battlefront II.

Neverx (Prophet of Truth - One Winged Slayer, Member, Sep 17, 2020, 3,854, Florida): It's a perfect storm of the yearly May 4th resurgence, the Revenge of the Sith rerelease, and Andor S2. Been playing a bit recently and the game is still great; such a shame they pulled the plug. Really excited for Zero Company and Jedi 3, but we really need a new Battlefront.

Fuchsia (Member, Oct 28, 2017, 7,249): It's honestly a fantastic game after all the work that was put into it post launch. It's still a blast to fire up today. Really sad that they basically killed support right when it was starting to become an all-timer.

Nocturnowl (Member, Oct 25, 2017, 28,294): It's all fun and games until the god players who have been playing eternally earn a hero like Yoda early on, and the rest of the game becomes a slasher sim where your lil trooper or droid lives in constant terror of the unstoppable turbo-speed lightsaber spam racing towards you. Neat to see it have a resurgence though; not big on Star Wars these days, but I did have a lot of fun playing this messy arse game with friends.

JakeNoseIt (Catch My Drift, Verified, Oct 27, 2017, 4,751): Woah, I don't watch Andor, but I randomly redownloaded the game like a week ago and have been having fun poking around.

Zebesian-X (Member, Dec 3, 2018, 25,362): Great game kneecapped by its monetization strategy. Sad that they abandoned it right as it was hitting its stride. Glad to see people playing! Shame we never got a third one.

Prison_mike (Banned, Oct 26, 2017, 1,676): Hell yeah, I bet Fortnite is a reason too. I need to renew my online for this, dammit.

DrScruffleton (Member, Oct 26, 2017, 14,887): Game was killed way too soon. Imagine all the crazy promotional stuff we could've gotten over the years with show tie-ins and stuff.

dodo (Member, Oct 27, 2017, 4,293): I'm part of the wave!

Creed Bratton (Member, Aug 29, 2019, 782): I jumped back in after playing the new Star Wars Fortnite content. It made me want a true SW experience.

Gleethor (Member, Oct 25, 2017, 4,215, Dot Matrix with stereo sound): Coulda had so much new content had they stuck with it. Or at least a third game to break the curse.

AgentOtaku (Member, Oct 27, 2017, 4,586), replying to ASleepingMonkey: Haha, makes sense. Both my sons adored it. We really should have gotten another one :/

Parker (Member, Feb 5, 2018, 698): Also, the people who got it for free on EGS.

Lord Vatek (Avenger, Jan 18, 2018, 24,748): Fortnite, May 4th, and Andor all likely play a part. I haven't checked, but I'd wager that Star Wars: The Old Republic is seeing a similar bump.

dodo (Member, Oct 27, 2017, 4,293): I'm still not a big fan of the card loadout system, and I wish the vehicles were handled differently, but when it's hitting, it has that DICE magic. Great visuals, amazing sound, loving attention to detail.

Corv (Member, Aug 5, 2022, 645), replying to Prison_mike: Yeah, the current Fortnite season made me want to boot it up.

Forerunner (Resetufologist, The Fallen, Oct 30, 2017, 18,835): DICE dropping support of this right when all the new SW content was being released was weird; they could have added so much to it.

Strikerrr (Member, Oct 25, 2017, 2,858): I've seen a big push from Arc Raiders fans who are suffering from tech-test withdrawal. Even though they're very different games, there are some aspects of the movement and shooting which feel somewhat similar, considering that a lot of the staff at DICE went to Embark after BF1/V. IIRC some devs who were at Motive during Battlefront 2's development commented that the initial Arc Raiders reveal, when it was still a PvE game, looked similar to some gameplay concepts that were originally pitched for Battlefront 2. View: https://www.youtube.com/watch?v=xuftkDxjGT4 I could see this as built from a discarded Battlefront co-op mode where you take down AT-ATs.

OhhEldenRing (Member, Aug 14, 2024, 2,860): The Jade Raymond effect. It's been in the news that was her last big release.

Santos (Member, Oct 25, 2017, 1,376): Have they addressed hacking on PC? Tried to play again on PC a couple of years ago and there were tons of hackers everywhere, to the point they were removing map objectives lol.

Jona Basta (Member, May 13, 2025, 27), replying to Santos: There's a new project called Kyber that fixes that and introduces a lot of new things, but they keep pushing back the release date on that while pushing their Patreon, so I'm not feeling too hot on it personally.

Fuchsia (Member, Oct 28, 2017, 7,249): Now, more than ever, I feel a new Battlefront would be so incredible. The issue is how do you feasibly make a game that would have enough content from each era of Star Wars to do it all justice? It would take so much manpower and time.

Saucycarpdog (OP, Member, Oct 25, 2017, 20,317): Also, if any of y'all have Insurgency: Sandstorm on PC, you can try the new Star Wars Sagas mod. View: https://youtu.be/1qfHDopTd3k?si=W_vKAw-W4NWNK2ic

Cheesy (Member, Oct 30, 2017, 2,564), replying to Saucycarpdog: Oh damn, I didn't know that game got any mods that weren't just a bunch of really tacticool guns. I'll have to take a look.

Jona Basta (Member, May 13, 2025, 27), replying to Saucycarpdog: There's also the fantastic Squad mod Galactic Contention. View: https://www.youtube.com/watch?v=uIgrE5Bhvxs

Prasino95 (Member, Feb 18, 2025, 6): Such a shame they messed up the remaster of the original so bad; bet that would have gotten a surge too.

Mocha Joe (Member, Jun 2, 2021, 13,433): I should reinstall. I never gave it an honest try after strongly disliking the first one. Spent $40 on basically a tech demo, and I was a fucking idiot and bought the season pass. I have never bought a season pass since then.

danhz (Member, Apr 20, 2018, 3,539): Hopefully I can play some games tomorrow. Game was fire 🔥🔥

luca (Member, Oct 25, 2017, 19,417): Never mind. It's the FPS Battlefront game. I loved the single-player campaign.

tjh282 (Member, May 29, 2019, 1,097): My youngest brother and his friends have this in their MP rotation and they're not even huge SW fans. The marketing campaign killed this on arrival, but I loved both of the new BFs.

Dance Inferno (Member, Oct 29, 2017, 2,739), replying to Saucycarpdog: Woah, WTF, how have I never heard of this? This looks dope, downloading this ASAP.

Richietto (One Winged Slayer, Member, Oct 25, 2017, 26,052, North Carolina): There's also a huge mod called Kyber that'll really bring life to the game.

Man Called Aerodynamics (Member, Oct 29, 2017, 8,327): I'm not a fan of BF2 in particular (that campaign sucked), but it's always nice to see older games get rediscovered or retain popularity.

Papercuts (Prophet of Truth, Member, Oct 25, 2017, 12,726): Feel like this happened a few years back too; I remember actually picking it back up and having a surprisingly good time.

panic (Member, Oct 26, 2024, 83): Last time I tried to play this, my team was being completely obliterated on spawn by players using starfighters.

thediamondage (Member, Oct 25, 2017, 13,613): I am jonesing hard for Star Wars after Andor, so no surprise. Rewatching the movies now, but I'll probably play Star Wars Outlaws next, although Battlefront 2 isn't a bad idea; I loved that game in the first year. It's dumb as hell, but it can be super fun too if you are just in it for vibes. Publishers should be discounting Star Wars games heavily right now; it's an automatic impulse buy, I think.

ASleepingMonkey (The Fallen, Oct 26, 2017, 4,579, Iowa), replying to AgentOtaku: I would like to see DICE give it another go after Battlefield 6. I assume no matter what, they're going to want to shake it up a bit after close to another decade of nothing but Battlefield, and as long as Battlefield 6 is good, it can probably sustain itself for a while. Going back to Battlefront would be great. I just want a better campaign this time; taking some cues from Andor and placing you in the shoes of just an ordinary person in this galactic war would be a great starting point.

BrickArts295 (GOTY Tracking Thread Master, Member, Oct 26, 2017, 15,642): It's 100% Andor + the Fortnite Star Wars event + the recent rerelease of Episode III. I caught the fever and even went back to get the platinum for Jedi: Survivor because I just wanted to play anything Star Wars. It's crazy how cursed Battlefront 3 is. Seems like both times we could have had the ultimate Star Wars game, which would have built up from their last two iterations. EA seems to be fine with working on Star Wars despite their latest stance on licensed titles; maybe DICE will be given one more go at it.

Javier23 (Member, Oct 28, 2017, 3,230): I for one have been thinking about giving this a try because of Fortnite's current SW season.

wellpapp (Member, Aug 21, 2018, 531, Gothenburg): Weren't the servers 100% broken by all the cheaters?

The Quentulated Mox (Corrupted by Vengeance, Member, Jun 10, 2022, 6,566): On the one hand, I love Andor in large part because of how much it does differently from the rest of the franchise, and I think blending it in with all the rest of the stuff risks diluting it. On the other hand, imagine a Ferrix map in Battlefront. Imagine a Narkina 5 map. Imagine blasting a truck full of troopers on Mina-Rau from your TIE Avenger. Imagine switching Cassian's gun into sniper configuration. Imagine spinning your Fondor haulcraft a billion times and shredding everything around you. Need.

BubbaKrumpz (The Fallen, Oct 25, 2017, 3,849, Yay Area), replying to Papercuts: Yeah, I went in with some of my buds and it was a great time. I've enjoyed the game since release though. I have a friend who only plays Battlefront 2 and, most recently, Fortnite.

Excelsior (Member, Oct 28, 2017, 1,058), replying to wellpapp: This is what I heard too. Is it playable?

Chromie (Member, Dec 4, 2017, 5,742), replying to AgentOtaku: I will always love Star Wars, but man, Andor has made Star Wars chatter online so positive that I jumped back into BF2.

Doctor Shatner (The Fallen, Oct 25, 2017, 252): Lol, I literally reinstalled this because of Andor (have had zero content creators mention it outside of watching old Giant Bomb E3 videos). So at least for me it was that! Funny to see such a weird hivemind-esque effect going on there. Game is still fun, but I have to pretend I'm a literal stormtrooper with my skill level. The number of crazy-high ranks who fly around as Vader/Yoda, etc. a couple minutes into the match and insta-gib me would drive me mad if I played seriously.
  • Google's Veo 3 Is Already Deepfaking All of YouTube's Most Smooth-Brained Content

    By James Pero | Published May 22, 2025

    Google Veo 3 man-on-the-street video generation. © Screenshot by Gizmodo

    Wake up, babe, new viral AI video generator dropped. This time, it’s not OpenAI’s Sora model in the spotlight, it’s Google’s Veo 3, which was announced on Tuesday during the company’s annual I/O keynote. Naturally, people are eager to see what chaos Veo 3 can wreak, and the results have been, well, chaotic. We’ve got disjointed Michael Bay fodder, talking muffins, self-aware AI sims, puppy-centric pharmaceutical ads—the list goes on. One thing that I keep seeing over and over, however, is—to put it bluntly—AI slop, and a very specific variety. For whatever reason, all of you seem to be absolutely hellbent on getting Veo to conjure up a torrent of smooth-brain YouTube content. The worst part is that this thing is actually kind of good at cranking it out, too. Don’t believe me? Here are the receipts. Is this 100% convincing? No. No, it is not. At a glance, though, most people wouldn’t be able to tell the difference if they’re just scrolling through their social feed mindlessly as one does when they’re using literally any social media site/app. Unboxing not cutting it for you? Well, don’t worry, we’ve got some man-on-the-street slop for your viewing pleasure. Sorry, hawk-tuah girl, it’s the singularity’s turn to capitalize on viral fame.

    Again, Veo’s generation is not perfect by any means, but it’s not exactly unconvincing, either. And there’s more bad news: Your Twitch-like smooth-brain content isn’t safe either. Here’s one of a picture-in-picture-style “Fortnite” stream that simulates gameplay and everything. I say “Fortnite” in scare quotes because this is just an AI representation of what Fortnite looks like, not the real thing. Either way, the only thing worse than mindless game streams is arguably mindless game streams that never even happened. And to be honest, the idea of simulating a simulation makes my brain feel achey, so for that reason alone, I’m going to hard pass. Listen, I’m not trying to be an alarmist here. In the grand scheme of things, AI-generated YouTube, Twitch, or TikTok chum isn’t going to hurt anyone, exactly, but it also doesn’t paint a rosy portrait of our AI-generated future. If there’s one thing we don’t need more of, it’s filler. Social media, without AI entering the equation, is already mostly junk, and it does make one wonder what the results of widespread generative video will really be in the end. Maybe I’ll wind up with AI-generated egg on my face, and video generators like Flow, Google’s “AI filmmaker,” will be a watershed product for real creators, but I have my doubts.

    At the very least, I’d like to see some safeguards if video generation is going to go mainstream. As harmless as AI slop might be, the ability to generate fairly convincing video isn’t one that should be taken lightly. There’s obviously huge potential for misinformation and propaganda, and if all it takes to help mitigate that is watermarking videos created in Veo 3, then it feels like an easy first step. For now, we’ll just have to take the explosion of Veo 3-enabled content with a spoonful of molasses, because there’s a lot of slop to get to, and this might be just the first course.

    GIZMODO.COM
  • Live Updates From Google I/O 2025

    © Gizmodo

    I wish I was making this stuff up, but chaos seems to follow me at all tech events. After waiting an hour to try out Google's hyped-up Android XR smart glasses for five minutes, I was actually given a three-minute demo, where I actually had 90 seconds to use Gemini in an extremely controlled environment. And actually, if you watch the video in my hands-on write-up below, you'll see that I spent even less time with it because Gemini fumbled a few times in the beginning. Oof. I really hope there's another chance to try them again because it was just too rushed. I think it might be the most rushed product demo I've ever had in my life, and I've been covering new gadgets for the past 15 years. —Raymond Wong

    Google, a company valued in the trillions, seemingly brought one pair of Android XR smart glasses for press to demo… and one pair of Samsung's Project Moohan mixed reality headset running the same augmented reality platform. I'm told the wait is one hour to try either device for five minutes. Of course, I'm going to try out the smart glasses. But if I want to demo Moohan, I need to get back in line and wait all over again. This is madness! —Raymond Wong

    May 20: Keynote Fin

    © Raymond Wong / Gizmodo

    Talk about a loooooong keynote. Total duration: 1 hour and 55 minutes, and then Sundar Pichai walked off stage. What do you make of all the AI announcements? Let's hang in the comments! I'm headed over to a demo area to try out a pair of Android XR smart glasses. I can't lie, even though the video stream from the live demo lagged for a good portion, I'm hyped! It really feels like Google is finally delivering on Google Glass over a decade later. Shoulda had Google co-founder Sergey Brin jump out of a helicopter and land on stage again, though. —Raymond Wong

    Pieces of Project Astra, Google's computer vision-based UI, are winding up in various different products, it seems, and not all of them are geared toward smart glasses specifically. One of the most exciting updates to Astra is "computer control," which allows one to do a lot more on their devices with computer vision alone. For instance, you could just point your phone at an object and then ask Astra to search for the bike, find some brakes for it, and then even pull up a YouTube tutorial on how to fix it, all without typing anything into your phone. —James Pero

    Shopping bots aren't just for scalpers anymore. Google is putting the power of automated consumerism in your hands with its new AI shopping tool. There are some pretty wild ideas here, too, including a virtual shopping avatar that's supposed to represent your own body; the idea is you can make it try on clothes to see how they fit. How all that works in practice is TBD, but if you're ready for a full AI shopping experience, you've finally got it. For the whole story, check out our story from Gizmodo's Senior Editor, Consumer Tech, Raymond Wong. —James Pero

    I got what I wanted. Google showed off what its Android XR tech can bring to smart glasses. In a live demo, Google showcased how a pair of unspecified smart glasses did a few of the things that I've been waiting to do, including projecting live navigation and remembering objects in your environment—basically the stuff that it pitched with Project Astra last year, but in a glasses form factor. There's still a lot that needs to happen, both hardware- and software-wise, before you can walk around wearing glasses that actually do all those things, but it was exciting to see that Google is making progress in that direction. It's worth noting that not all of the demos went off smoothly—there was lots of stutter in the live translation demo—but I guess props to them for giving it a go. When we'll actually get to walk around wearing functional smart glasses with some kind of optical passthrough or virtual display is anyone's guess, but the race is certainly heating up. —James Pero

    Google's SynthID has been around for nearly three years, but it's been largely kept out of the public eye. The system disturbs AI-generated images, video, or audio with an invisible, undetectable watermark that can be observed with Google DeepMind's proprietary tool. At I/O, Google said it was working with both Nvidia and GetReal to introduce the same watermarking technique with those companies' AI image generators. Users may be able to detect these watermarks themselves, even if only part of the media was modified with AI. Early testers are getting access to it "today," but hopefully more people can access it at a later date from labs.google/synthid. —Kyle Barr

    This keynote has been going on for 1.5 hours now. Do I run to the restroom now or wait? But how much longer until it ends??? Can we petition Sundar Pichai to make these keynotes shorter or at least have an intermission? Update: I ran for it right near the end, before the Android XR news hit. I almost made it… —Raymond Wong

    © Raymond Wong / Gizmodo

    Google's new video generator, Veo, is getting a big upgrade that includes sound generation, and it's not just dialogue. Veo 3 can also generate sound effects and music. In a demo, Google showed off an animated forest scene that includes all three—dialogue, sound effects, and video. The length of clips, I assume, will be short at first, but the results look pretty sophisticated if the demo is to be believed. —James Pero

    If you pay for a Google One subscription, you'll start to see Gemini in your Google Chrome browser later this week. This will appear as the sparkle icon at the top of your browser app. You can use this to bring up a prompt box to ask a question about the current page you're browsing, such as if you want to consolidate a number of user reviews for a local campsite. —Kyle Barr

    © Google / GIF by Gizmodo

    Google's high-tech video conferencing tech, now called Beam, looks impressive. You can make eye contact! It feels like the person in the screen is right in front of you! It's glasses-free 3D! Come back down to Earth, buddy: it's not coming out as a consumer product. Commercial first, with partners like HP. Time to apply for a new job? —Raymond Wong

    Google doesn't want Search to be tied to your browser or apps anymore. Search Live is akin to the video and audio comprehension capabilities of Gemini Live, but with the added benefit of getting quick answers based on sites from around the web. Google showed how Search Live could comprehend queries about an at-home science experiment and bring in answers from sites like Quora or YouTube. —Kyle Barr

    Google is getting deep into augmented reality with Android XR, its operating system built specifically for AR glasses and VR headsets. Google showed us how users may be able to see a holographic live Google Maps view directly on their glasses or set up calendar events, all without needing to touch a single screen. This uses Gemini AI to comprehend your voice prompts and follow through on your instructions. Google doesn't have its own device to share at I/O, but it's planning to work with companies like XReal and Samsung to craft new devices across both AR and VR. —Kyle Barr

    I know how much you all love subscriptions! Google does too, apparently, and is now offering a per-month AI bundle that groups some of its most advanced AI services. Subscribing to Google AI Ultra will get you:

    • Gemini and its full capabilities
    • Flow, a new, more advanced AI filmmaking tool based on Veo
    • Whisk, which allows text-to-image creation
    • NotebookLM, an AI note-taking app
    • Gemini in Gmail and Docs
    • Gemini in Chrome
    • Project Mariner, an agentic research AI
    • 30TB of storage

    I'm not sure who needs all of this, but maybe there are more AI superusers than I thought. —James Pero

    Google CEO Sundar Pichai was keen to claim that users are big, big fans of AI Overviews in Google Search results. If there wasn't already enough AI on your search bar, Google will now stick an entire "AI Mode" tab on your search bar next to the Google Lens button. This encompasses the Gemini 2.5 model. This opens up an entirely new UI for searching via a prompt with a chatbot. After you input your rambling search query, it will bring up an assortment of short-form textual answers, links, and even a Google Maps widget, depending on what you were looking for. AI Mode should be available starting today. Google said AI Mode pulls together information from the web alongside its other data, like weather or academic research through Google Scholar. It should also eventually encompass your "personal context," which will be available later this summer. Eventually, Google will add more AI Mode capabilities directly to AI Overviews. —Kyle Barr

    May 20: News Embargo Has Lifted!

    © Xreal

    Get your butt over to Gizmodo.com's home page because the Google I/O news embargo just lifted. We've got a bunch of stories, including this one about Google partnering up with Xreal for a new pair of "optical see-through" smart glasses called Project Aura. The smart glasses run Android XR and are powered by a Qualcomm chip. You can see three cameras. Wireless, these are not: you'll need to tether to a phone or other device. Update: Little scoop: I've confirmed that Project Aura has a 70-degree field of view, which is way wider than the One Pro's FOV, which is 57 degrees. —Raymond Wong

    © Raymond Wong / Gizmodo

    Google's DeepMind CEO showed off the updated version of Project Astra running on a phone and drove home how its "personal, proactive, and powerful" AI features are the groundwork for a "universal assistant" that truly understands and works on your behalf. If you think Gemini is a fad, it's time to get familiar with it because it's not going anywhere. —Raymond Wong

    May 20: Gemini 2.5 Pro Is Here

    © Gizmodo

    Google says Gemini 2.5 Pro is its "most advanced model yet," comes with "enhanced reasoning" and better coding ability, and can even create interactive simulations. You can try it now via Google AI Studio. —James Pero

    There are two major types of transformer AI used today: LLMs, AKA large language models, and diffusion models, which are mostly used for image generation. The Gemini Diffusion model blurs the line between these types of models. Google said its new research model can iterate on a solution quickly and correct itself while generating an answer. For math or coding prompts, Gemini Diffusion can potentially output an entire response much faster than a typical chatbot. Unlike a traditional LLM, which may take a few seconds to answer a question, Gemini Diffusion can create a response to a complex math equation in the blink of an eye, and still share the steps it took to reach its conclusion. —Kyle Barr

    © Gizmodo

    New Gemini 2.5 Flash and Gemini Pro models are incoming and, naturally, Google says both are faster and more sophisticated across the board. One of the improvements for Gemini 2.5 Flash is even more inflection when speaking. Unfortunately for my ears, Google demoed the new Flash speaking in a whisper that sent chills down my spine. —James Pero

    Is anybody keeping track of how many times Google execs have said "Gemini" and "AI" so far? Oops, I think I'm already drunk, and we're only 20 minutes in. —Raymond Wong

    © Raymond Wong / Gizmodo

    Google's Project Astra is supposed to be getting much better at avoiding hallucinations, AKA when the AI makes stuff up. Project Astra's vision and audio comprehension capabilities are supposed to be far better at knowing when you're trying to trick it. In a video, Google showed how its Gemini Live AI wouldn't buy your bullshit if you tell it that a garbage truck is a convertible, a lamp pole is a skyscraper, or your shadow is some stalker.
This should hopefully mean the AI doesn’t confidently lie to you, as well. Google CEO Sundar Pichai said “Gemini is really good at telling you when you’re wrong.” These enhanced features should be rolling out today for Gemini app on iOS and Android. — Kyle Barr May 20Release the Agents Like pretty much every other AI player, Google is pursuing agentic AI in a big way. I’d prepare for a lot more talk about how Gemini can take tasks off your hands as the keynote progresses. —James Pero © Gizmodo Google has finally moved Project Starline—its futuristic video-calling machine—into a commercial project called Google Beam. According to Pichai, Google Beam can take a 2D image and transform it into a 3D one, and will also incorporate live translate. —James Pero © Gizmodo Google’s CEO, Sundar Pichai, says Google is shipping at a relentless pace, and to be honest, I tend to agree. There are tons of Gemini models out there already, even though it’s only been out for two years. Probably my favorite milestone, though, is that it has now completed Pokémon Blue, earning all 8 badges according to Pichai. —James Pero May 20Let’s Do This Buckle up, kiddos, it’s I/O time. Methinks there will be a lot to get to, so you may want to grab a snack now. —James Pero Counting down until the keynote… only a few more minutes to go. The DJ just said AI is changing music and how it’s made. But don’t forget that we’re all here… in person. Will we all be wearing Android XR smart glasses next year? Mixed reality headsets? —Raymond Wong © Raymond Wong / Gizmodo Fun fact: I haven’t attended Google I/O in person since before Covid-19. The Wi-Fi is definitely stronger and more stable now. It’s so great to be back and covering for Gizmodo. Dream job, unlocked! 
—Raymond Wong © Raymond Wong / Gizmodo Mini breakfast burritos… bagels… but these bagels can’t compare to real Made In New York City bagels with that authentic NY water —Raymond Wong © Raymond Wong / Gizmodo © Raymond Wong / Gizmodo © Raymond Wong / Gizmodo © Raymond Wong / Gizmodo I’ve arrived at the Shoreline Amphitheatre in Mountain View, Calif., where the Google I/O keynote is taking place in 40 minutes. Seats are filling up. But first, must go check out the breakfast situation because my tummy is growling… —Raymond Wong May 20Should We Do a Giveaway? © Raymond Wong / Gizmodo Google I/O attendees get a special tote bag, a metal water bottle, a cap, and a cute sheet of stickers. I always end up donating this stuff to Goodwill during the holidays. A guy living in NYC with two cats only has so much room for tote bags and water bottles… Would be cool to do giveaway. Leave a comment to let us know if you’d be into that and I can pester top brass to make it happen —Raymond Wong May 20Got My Press Badge! In 13 hours, Google will blitz everyone with Gemini AI, Gemini AI, and tons more Gemini AI. Who’s ready for… Gemini AI? —Raymond Wong May 19Google Glass: The Redux © Google / Screenshot by Gizmodo Google is very obviously inching toward the release of some kind of smart glasses product for the first time sinceGoogle Glass, and if I were a betting man, I’d say this one will have a much warmer reception than its forebearer. I’m not saying Google can snatch the crown from Meta and its Ray-Ban smart glasses right out of the gate, but if it plays its cards right, it could capitalize on the integration with its other hardwarein a big way. Meta may finally have a real competitor on its hands. ICYMI: Here’s Google’s President of the Android Ecosystem, Sameer Samat, teasing some kind of smart glasses device in a recorded demo last week. —James Pero Hi folks, I’m James Pero, Gizmodo’s new Senior Writer. 
There’s a lot we have to get to with Google I/O, so I’ll keep this introduction short. I like long walks on the beach, the wind in my nonexistent hair, and I’m really, really, looking forward to bringing you even more of the spicy, insightful, and entertaining coverage on consumer tech that Gizmodo is known for. I’m starting my tenure here out hot with Google I/O, so make sure you check back here throughout the week to get those sweet, sweet blogs and commentary from me and Gizmodo’s Senior Consumer Tech Editor Raymond Wong. —James Pero © Raymond Wong / Gizmodo Hey everyone! Raymond Wong, senior editor in charge of Gizmodo’s consumer tech team, here! Landed in San Francisco, and I’ll be making my way over to Mountain View, California, later today to pick up my press badge and scope out the scene for tomorrow’s Google I/O keynote, which kicks off at 1 p.m. ET / 10 a.m. PT. Google I/O is a developer conference, but that doesn’t mean it’s news only for engineers. While there will be a lot of nerdy stuff that will have developers hollering, what Google announces—expect updates on Gemini AI, Android, and Android XR, to name a few headliners—will shape consumer productsfor the rest of this year and also the years to come. I/O is a glimpse at Google’s technology roadmap as AI weaves itself into the way we compute at our desks and on the go. This is going to be a fun live blog! —Raymond Wong
    #live #updates #google
    Live Updates From Google I/O 2025 🔴
© Gizmodo I wish I was making this stuff up, but chaos seems to follow me at all tech events. After waiting an hour to try out Google’s hyped-up Android XR smart glasses for five minutes, I was actually given a three-minute demo, where I had 90 seconds to use Gemini in an extremely controlled environment. And actually, if you watch the video in my hands-on write-up below, you’ll see that I spent even less time with it because Gemini fumbled a few times in the beginning. Oof. I really hope there’s another chance to try them again because it was just too rushed. I think it might be the most rushed product demo I’ve ever had in my life, and I’ve been covering new gadgets for the past 15 years. —Raymond Wong Google, a company valued at $2 trillion, seemingly brought one pair of Android XR smart glasses for press to demo… and one pair of Samsung’s Project Moohan mixed reality headset running the same augmented reality platform. I’m told the wait is 1 hour to try either device for 5 minutes. Of course, I’m going to try out the smart glasses. But if I want to demo Moohan, I need to get back in line and wait all over again. This is madness! —Raymond Wong May 20: Keynote Fin © Raymond Wong / Gizmodo Talk about a loooooong keynote. Total duration: 1 hour and 55 minutes, and then Sundar Pichai walked off stage. What do you make of all the AI announcements? Let’s hang in the comments! I’m headed over to a demo area to try out a pair of Android XR smart glasses. I can’t lie, even though the video stream from the live demo lagged for a good portion, I’m hyped! It really feels like Google is finally delivering on Google Glass over a decade later. Shoulda had Google co-founder Sergey Brin jump out of a helicopter and land on stage again, though. —Raymond Wong Pieces of Project Astra, Google’s computer vision-based UI, are winding up in various products, it seems, and not all of them are geared toward smart glasses specifically. 
One of the most exciting updates to Astra is “computer control,” which allows you to do a lot more on your devices with computer vision alone. For instance, you could just point your phone at an object (say, a bike) and then ask Astra to search for the bike, find some brakes for it, and then even pull up a YouTube tutorial on how to fix it—all without typing anything into your phone. —James Pero Shopping bots aren’t just for scalpers anymore. Google is putting the power of automated consumerism in your hands with its new AI shopping tool. There are some pretty wild ideas here, too, including a virtual shopping avatar that’s supposed to represent your own body—the idea is you can make it try on clothes to see how they fit. How all that works in practice is TBD, but if you’re ready for a full AI shopping experience, you’ve finally got it. For the whole story, check out our story from Gizmodo’s Senior Editor, Consumer Tech, Raymond Wong. —James Pero I got what I wanted. Google showed off what its Android XR tech can bring to smart glasses. In a live demo, Google showcased how a pair of unspecified smart glasses did a few of the things that I’ve been waiting to do, including projecting live navigation and remembering objects in your environment—basically the stuff that it pitched with Project Astra last year, but in a glasses form factor. There’s still a lot that needs to happen, both hardware- and software-wise, before you can walk around wearing glasses that actually do all those things, but it was exciting to see that Google is making progress in that direction. It’s worth noting that not all of the demos went off smoothly—there was lots of stutter in the live translation demo—but I guess props to them for giving it a go. When we’ll actually get to walk around wearing functional smart glasses with some kind of optical passthrough or virtual display is anyone’s guess, but the race is certainly heating up. 
—James Pero Google’s SynthID has been around for nearly three years, but it’s been largely kept out of the public eye. The system embeds an invisible watermark in AI-generated images, video, or audio that can be detected with Google DeepMind’s proprietary tool. At I/O, Google said it was working with both Nvidia and GetReal to introduce the same watermarking technique with those companies’ AI image generators. Users may be able to detect these watermarks themselves, even if only part of the media was modified with AI. Early testers are getting access to it “today,” but hopefully more people can access it at a later date from labs.google/synthid. — Kyle Barr This keynote has been going on for 1.5 hours now. Do I run to the restroom now or wait? But how much longer until it ends??? Can we petition to Sundar Pichai to make these keynotes shorter or at least have an intermission? Update: I ran for it right near the end before Android XR news hit. I almost made it… —Raymond Wong © Raymond Wong / Gizmodo Google’s new video generator, Veo, is getting a big upgrade that includes sound generation, and it’s not just dialogue. Veo 3 can also generate sound effects and music. In a demo, Google showed off an animated forest scene that includes all three—dialogue, sound effects, and music. The length of clips, I assume, will be short at first, but the results look pretty sophisticated if the demo is to be believed. —James Pero If you pay for a Google One subscription, you’ll start to see Gemini in your Google Chrome browser (and—judging by this developer conference—everywhere else) later this week. This will appear as the sparkle icon at the top of your browser app. You can use this to bring up a prompt box to ask a question about the current page you’re browsing, such as asking it to consolidate a number of user reviews for a local campsite. — Kyle Barr © Google / GIF by Gizmodo Google’s high-tech video conferencing tech, now called Beam, looks impressive. You can make eye contact! 
It feels like the person on the screen is right in front of you! It’s glasses-free 3D! Come back down to Earth, buddy—it’s not coming out as a consumer product. Commercial first with partners like HP. Time to apply for a new job? —Raymond Wong Read more here: Google doesn’t want Search to be tied to your browser or apps anymore. Search Live is akin to the video and audio comprehension capabilities of Gemini Live, but with the added benefit of getting quick answers based on sites from around the web. Google showed how Search Live could comprehend queries about an at-home science experiment and bring in answers from sites like Quora or YouTube. — Kyle Barr Google is getting deep into augmented reality with Android XR—its operating system built specifically for AR glasses and VR headsets. Google showed us how users may be able to see a holographic live Google Maps view directly on their glasses or set up calendar events, all without needing to touch a single screen. This uses Gemini AI to comprehend your voice prompts and follow through on your instructions. Google doesn’t have its own device to share at I/O, but it’s planning to work with companies like XReal and Samsung to craft new devices across both AR and VR. — Kyle Barr Read our full report here: I know how much you all love subscriptions! Google does too, apparently, and is now offering a $250 per month AI bundle that groups some of its most advanced AI services. Subscribing to Google AI Ultra will get you: Gemini and its full capabilities; Flow, a new, more advanced AI filmmaking tool based on Veo; Whisk, which allows text-to-image creation; NotebookLM, an AI note-taking app; Gemini in Gmail and Docs; Gemini in Chrome; Project Mariner, an agentic research AI; and 30TB of storage. I’m not sure who needs all of this, but maybe there are more AI superusers than I thought. —James Pero Google CEO Sundar Pichai was keen to claim that users are big, big fans of AI overviews in Google Search results. 
If there wasn’t already enough AI on your search bar, Google will now stick an entire “AI Mode” tab on your search bar next to the Google Lens button. It’s powered by the Gemini 2.5 model. This opens up an entirely new UI for searching via a prompt with a chatbot. After you input your rambling search query, it will bring up an assortment of short-form textual answers, links, and even a Google Maps widget depending on what you were looking for. AI Mode should be available starting today. Google said AI Mode pulls together information from the web alongside its other data like weather or academic research through Google Scholar. It should also eventually encompass your “personal context,” which will be available later this summer. Eventually, Google will add more AI Mode capabilities directly to AI Overviews. — Kyle Barr May 20: News Embargo Has Lifted! © Xreal Get your butt over to Gizmodo.com’s home page because the Google I/O news embargo just lifted. We’ve got a bunch of stories, including this one about Google partnering up with Xreal for a new pair of “optical see-through” (OST) smart glasses called Project Aura. The smart glasses run Android XR and are powered by a Qualcomm chip. You can see three cameras. Wireless, these are not—you’ll need to tether to a phone or other device. Update: Little scoop: I’ve confirmed that Project Aura has a 70-degree field of view, which is way wider than the One Pro’s FOV, which is 57 degrees. —Raymond Wong © Raymond Wong / Gizmodo Google’s DeepMind CEO showed off the updated version of Project Astra running on a phone and drove home how its “personal, proactive, and powerful” AI features are the groundwork for a “universal assistant” that truly understands and works on your behalf. If you think Gemini is a fad, it’s time to get familiar with it because it’s not going anywhere. 
—Raymond Wong May 20: Gemini 2.5 Pro Is Here © Gizmodo Google says Gemini 2.5 Pro is its “most advanced model yet,” and comes with “enhanced reasoning,” better coding ability, and can even create interactive simulations. You can try it now via Google AI Studio. —James Pero There are two major types of transformer AI used today: LLMs, AKA large language models, and diffusion models, which are mostly used for image generation. The Gemini Diffusion model blurs the line between these two types of models. Google said its new research model can iterate on a solution quickly and correct itself while generating an answer. For math or coding prompts, Gemini Diffusion can potentially output an entire response much faster than a typical chatbot. Unlike a traditional LLM, which may take a few seconds to answer a question, Gemini Diffusion can create a response to a complex math equation in the blink of an eye, and still share the steps it took to reach its conclusion. — Kyle Barr © Gizmodo New Gemini 2.5 Flash and Gemini Pro models are incoming and, naturally, Google says both are faster and more sophisticated across the board. One of the improvements for Gemini 2.5 Flash is even more inflection when speaking. Unfortunately for my ears, Google demoed the new Flash speaking in a whisper that sent chills down my spine. —James Pero Is anybody keeping track of how many times Google execs have said “Gemini” and “AI” so far? Oops, I think I’m already drunk, and we’re only 20 minutes in. —Raymond Wong © Raymond Wong / Gizmodo Google’s Project Astra is supposed to be getting much better at avoiding hallucinations, AKA when the AI makes stuff up. Project Astra’s vision and audio comprehension capabilities are supposed to be far better at knowing when you’re trying to trick it. In a video, Google showed how its Gemini Live AI wouldn’t buy your bullshit if you tell it that a garbage truck is a convertible, a lamp pole is a skyscraper, or your shadow is some stalker. 
This should hopefully mean the AI doesn’t confidently lie to you, as well. Google CEO Sundar Pichai said “Gemini is really good at telling you when you’re wrong.” These enhanced features should be rolling out today for the Gemini app on iOS and Android. — Kyle Barr May 20: Release the Agents Like pretty much every other AI player, Google is pursuing agentic AI in a big way. I’d prepare for a lot more talk about how Gemini can take tasks off your hands as the keynote progresses. —James Pero © Gizmodo Google has finally moved Project Starline—its futuristic video-calling machine—into a commercial project called Google Beam. According to Pichai, Google Beam can take a 2D image and transform it into a 3D one, and will also incorporate live translation. —James Pero © Gizmodo Google’s CEO, Sundar Pichai, says Google is shipping at a relentless pace, and to be honest, I tend to agree. There are tons of Gemini models out there already, even though it’s only been out for two years. Probably my favorite milestone, though, is that it has now completed Pokémon Blue, earning all 8 badges, according to Pichai. —James Pero May 20: Let’s Do This Buckle up, kiddos, it’s I/O time. Methinks there will be a lot to get to, so you may want to grab a snack now. —James Pero Counting down until the keynote… only a few more minutes to go. The DJ just said AI is changing music and how it’s made. But don’t forget that we’re all here… in person. Will we all be wearing Android XR smart glasses next year? Mixed reality headsets? —Raymond Wong © Raymond Wong / Gizmodo Fun fact: I haven’t attended Google I/O in person since before Covid-19. The Wi-Fi is definitely stronger and more stable now. It’s so great to be back and covering for Gizmodo. Dream job, unlocked! 
—Raymond Wong © Raymond Wong / Gizmodo Mini breakfast burritos… bagels… but these bagels can’t compare to real Made In New York City bagels with that authentic NY water 😏 —Raymond Wong © Raymond Wong / Gizmodo © Raymond Wong / Gizmodo © Raymond Wong / Gizmodo © Raymond Wong / Gizmodo I’ve arrived at the Shoreline Amphitheatre in Mountain View, Calif., where the Google I/O keynote is taking place in 40 minutes. Seats are filling up. But first, must go check out the breakfast situation because my tummy is growling… —Raymond Wong May 20: Should We Do a Giveaway? © Raymond Wong / Gizmodo Google I/O attendees get a special tote bag, a metal water bottle, a cap, and a cute sheet of stickers. I always end up donating this stuff to Goodwill during the holidays. A guy living in NYC with two cats only has so much room for tote bags and water bottles… Would be cool to do a giveaway. Leave a comment to let us know if you’d be into that and I can pester top brass to make it happen 🤪 —Raymond Wong May 20: Got My Press Badge! In 13 hours, Google will blitz everyone with Gemini AI, Gemini AI, and tons more Gemini AI. Who’s ready for… Gemini AI? —Raymond Wong May 19: Google Glass: The Redux © Google / Screenshot by Gizmodo Google is very obviously inching toward the release of some kind of smart glasses product for the first time since Google Glass, and if I were a betting man, I’d say this one will have a much warmer reception than its forebear. I’m not saying Google can snatch the crown from Meta and its Ray-Ban smart glasses right out of the gate, but if it plays its cards right, it could capitalize on the integration with its other hardware in a big way. Meta may finally have a real competitor on its hands. ICYMI: Here’s Google’s President of the Android Ecosystem, Sameer Samat, teasing some kind of smart glasses device in a recorded demo last week. —James Pero Hi folks, I’m James Pero, Gizmodo’s new Senior Writer. 
There’s a lot we have to get to with Google I/O, so I’ll keep this introduction short. I like long walks on the beach, the wind in my nonexistent hair, and I’m really, really looking forward to bringing you even more of the spicy, insightful, and entertaining coverage on consumer tech that Gizmodo is known for. I’m starting my tenure here out hot with Google I/O, so make sure you check back here throughout the week to get those sweet, sweet blogs and commentary from me and Gizmodo’s Senior Consumer Tech Editor Raymond Wong. —James Pero © Raymond Wong / Gizmodo Hey everyone! Raymond Wong, senior editor in charge of Gizmodo’s consumer tech team, here! Landed in San Francisco, and I’ll be making my way over to Mountain View, California, later today to pick up my press badge and scope out the scene for tomorrow’s Google I/O keynote, which kicks off at 1 p.m. ET / 10 a.m. PT. Google I/O is a developer conference, but that doesn’t mean it’s news only for engineers. While there will be a lot of nerdy stuff that will have developers hollering, what Google announces—expect updates on Gemini AI, Android, and Android XR, to name a few headliners—will shape consumer products for the rest of this year and the years to come. I/O is a glimpse at Google’s technology roadmap as AI weaves itself into the way we compute at our desks and on the go. This is going to be a fun live blog! —Raymond Wong
    GIZMODO.COM
    Live Updates From Google I/O 2025 🔴
    © Gizmodo I wish I was making this stuff up, but chaos seems to follow me at all tech events. After waiting an hour to try out Google’s hyped-up Android XR smart glasses for five minutes, I was actually given a three-minute demo, where I actually had 90 seconds to use Gemini in an extremely controlled environment. And actually, if you watch the video in my hands-on write-up below, you’ll see that I spent even less time with it because Gemini fumbled a few times in the beginning. Oof. I really hope there’s another chance to try them again because it was just too rushed. I think it might be the most rushed product demo I’ve ever had in my life, and I’ve been covering new gadgets for the past 15 years. —Raymond Wong Google, a company valued at $2 trillion, seemingly brought one pair of Android XR smart glasses for press to demo… and one pair of Samsung’s Project Moohan mixed reality headset running the same augmented reality platform. I’m told the wait is 1 hour to try either device for 5 minutes. Of course, I’m going to try out the smart glasses. But if I want to demo Moohan, I need to get back in line and wait all over again. This is madness! —Raymond Wong May 20Keynote Fin © Raymond Wong / Gizmodo Talk about a loooooong keynote. Total duration: 1 hour and 55 minutes, and then Sundar Pichai walked off stage. What do you make of all the AI announcements? Let’s hang in the comments! I’m headed over to a demo area to try out a pair of Android XR smart glasses. I can’t lie, even though the video stream from the live demo lagged for a good portion, I’m hyped! It really feels like Google is finally delivering on Google Glass over a decade later. Shoulda had Google co-founder Sergey Brin jump out of a helicopter and land on stage again, though. —Raymond Wong Pieces of Project Astra, Google’s computer vision-based UI, are winding up in various different products, it seems, and not all of them are geared toward smart glasses specifically. 
One of the most exciting updates to Astra is “computer control,” which allows one to do a lot more on their devices with computer vision alone. For instance, you could just point your phone at an object (say, a bike) and then ask Astra to search for the bike, find some brakes for it, and then even pull up a YouTube tutorial on how to fix it—all without typing anything into your phone. —James Pero Shopping bots aren’t just for scalpers anymore. Google is putting the power of automated consumerism in your hands with its new AI shopping tool. There are some pretty wild ideas here, too, including a virtual shopping avatar that’s supposed to represent your own body—the idea is you can make it try on clothes to see how they fit. How all that works in practice is TBD, but if you’re ready for a full AI shopping experience, you’ve finally got it. For the whole story, check out our story from Gizmodo’s Senior Editor, Consumer Tech, Raymond Wong. —James Pero I got what I wanted. Google showed off what its Android XR tech can bring to smart glasses. In a live demo, Google showcased how a pair of unspecified smart glasses did a few of the things that I’ve been waiting to do, including projecting live navigation and remembering objects in your environment—basically the stuff that it pitched with Project Astra last year, but in a glasses form factor. There’s still a lot that needs to happen, both hardware and software-wise, before you can walk around wearing glasses that actually do all those things, but it was exciting to see that Google is making progress in that direction. It’s worth noting that not all of the demos went off smoothly—there was lots of stutter in the live translation demo—but I guess props to them for giving it a go. When we’ll actually get to walk around wearing functional smart glasses with some kind of optical passthrough or virtual display is anyone’s guess, but the race is certainly heating up. 
—James Pero Google’s SynthID has been around for nearly three years, but it’s been largely kept out of the public eye. The system disturbs AI-generated images, video, or audio with an invisible, undetectable watermark that can be observed with Google DeepMind’s proprietary tool. At I/O, Google said it was working with both Nvidia and GetReal to introduce the same watermarking technique with those companies’ AI image generators. Users may be able to detect these watermarks themselves, even if only part of the media was modified with AI. Early testers are getting access to it “today,” but hopefully more people can acess it at a later date from labs.google/synthid. — Kyle Barr This keynote has been going on for 1.5 hours now. Do I run to the restroom now or wait? But how much longer until it ends??? Can we petiton to Sundar Pichai to make these keynotes shorter or at least have an intermission? Update: I ran for it right near the end before Android XR news hit. I almost made it… —Raymond Wong © Raymond Wong / Gizmodo Google’s new video generator Veo, is getting a big upgrade that includes sound generation, and it’s not just dialogue. Veo 3 can also generate sound effects and music. In a demo, Google showed off an animated forest scene that includes all three—dialogue, sound effects, and video. The length of clips, I assume, will be short at first, but the results look pretty sophisticated if the demo is to be believed. —James Pero If you pay for a Google One subscription, you’ll start to see Gemini in your Google Chrome browser (and—judging by this developer conference—everywhere else) later this week. This will appear as the sparkle icon at the top of your browser app. You can use this to bring up a prompt box to ask a question about the current page you’re browsing, such as if you want to consolidate a number of user reviews for a local campsite. — Kyle Barr © Google / GIF by Gizmodo Google’s high-tech video conferencing tech, now called Beam, looks impressive. 
You can make eye contact! It feels like the person in the screen is right in front of you! It’s glasses-free 3D! Come back down to Earth, buddy—it’s not coming out as a consumer product. Commercial first with partners like HP. Time to apply for a new job? —Raymond Wong Read more here: Google doesn’t want Search to be tied to your browser or apps anymore. Search Live is akin to the video and audio comprehension capabilities of Gemini Live, but with the added benefit of getting quick answers based on sites from around the web. Google showed how Search Live could comprehend queries about at-home science experiment and bring in answers from sites like Quora or YouTube. — Kyle Barr Google is getting deep into augmented reality with Android XR—its operating system built specifically for AR glasses and VR headsets. Google showed us how users may be able to see a holographic live Google Maps view directly on their glasses or set up calendar events, all without needing to touch a single screen. This uses Gemini AI to comprehend your voice prompts and follow through on your instructions. Google doesn’t have its own device to share at I/O, but its planning to work with companies like XReal and Samsung to craft new devices across both AR and VR. — Kyle Barr Read our full report here: I know how much you all love subscriptions! Google does too, apparently, and is now offering a $250 per month AI bundle that groups some of its most advanced AI services. Subscribing to Google AI Ultra will get you: Gemini and its full capabilities Flow, a new, more advanced AI filmmaking tool based on Veo Whisk, which allows text-to-image creation NotebookLM, an AI note-taking app Gemini in Gmail and Docs Gemini in Chrome Project Mariner, an agentic research AI 30TB of storage I’m not sure who needs all of this, but maybe there are more AI superusers than I thought. —James Pero Google CEO Sundar Pichai was keen to claim that users are big, big fans of AI overviews in Google Search results. 
If there wasn’t already enough AI in your search bar, Google will now stick an entire “AI Mode” tab in your search bar next to the Google Lens button. It’s built on the Gemini 2.5 model. This opens up an entirely new UI for searching via a prompt with a chatbot. After you input your rambling search query, it will bring up an assortment of short-form textual answers, links, and even a Google Maps widget, depending on what you were looking for. AI Mode should be available starting today. Google said AI Mode pulls together information from the web alongside its other data, like weather or academic research through Google Scholar. It should also eventually encompass your “personal context,” which will be available later this summer. Eventually, Google will add more AI Mode capabilities directly to AI Overviews. — Kyle Barr

May 20: News Embargo Has Lifted!

© Xreal

Get your butt over to Gizmodo.com’s home page because the Google I/O news embargo just lifted. We’ve got a bunch of stories, including this one about Google partnering up with Xreal on a new pair of “optical see-through” (OST) smart glasses called Project Aura. The smart glasses run Android XR and are powered by a Qualcomm chip. You can see three cameras. Wireless, these are not—you’ll need to tether to a phone or other device. Update: Little scoop: I’ve confirmed that Project Aura has a 70-degree field of view, which is way wider than the One Pro’s 57-degree FOV. —Raymond Wong

© Raymond Wong / Gizmodo

Google DeepMind’s CEO showed off the updated version of Project Astra running on a phone and drove home how its “personal, proactive, and powerful” AI features are the groundwork for a “universal assistant” that truly understands and works on your behalf. If you think Gemini is a fad, it’s time to get familiar with it, because it’s not going anywhere.
—Raymond Wong

May 20: Gemini 2.5 Pro Is Here

© Gizmodo

Google says Gemini 2.5 Pro is its “most advanced model yet”: it comes with “enhanced reasoning” and better coding ability, and can even create interactive simulations. You can try it now via Google AI Studio. —James Pero

There are two major types of transformer AI used today: the LLM, AKA the large language model, and the diffusion model, which is mostly used for image generation. The Gemini Diffusion model blurs the line between these two types of models. Google said its new research model can iterate on a solution quickly and correct itself while generating an answer. For math or coding prompts, Gemini Diffusion can potentially output an entire response much faster than a typical chatbot. Unlike a traditional LLM, which may take a few seconds to answer a question, Gemini Diffusion can create a response to a complex math equation in the blink of an eye, and still share the steps it took to reach its conclusion. — Kyle Barr

© Gizmodo

New Gemini 2.5 Flash and Gemini 2.5 Pro models are incoming and, naturally, Google says both are faster and more sophisticated across the board. One of the improvements for Gemini 2.5 Flash is even more inflection when speaking. Unfortunately for my ears, Google demoed the new Flash speaking in a whisper that sent chills down my spine. —James Pero

Is anybody keeping track of how many times Google execs have said “Gemini” and “AI” so far? Oops, I think I’m already drunk, and we’re only 20 minutes in. —Raymond Wong

© Raymond Wong / Gizmodo

Google’s Project Astra is supposed to be getting much better at avoiding hallucinations, AKA when the AI makes stuff up. Project Astra’s vision and audio comprehension capabilities are supposed to be far better at knowing when you’re trying to trick it. In a video, Google showed how its Gemini Live AI wouldn’t buy your bullshit if you tell it that a garbage truck is a convertible, a lamp pole is a skyscraper, or your shadow is some stalker.
This should hopefully mean the AI doesn’t confidently lie to you, as well. Google CEO Sundar Pichai said “Gemini is really good at telling you when you’re wrong.” These enhanced features should be rolling out today for the Gemini app on iOS and Android. — Kyle Barr

May 20: Release the Agents

Like pretty much every other AI player, Google is pursuing agentic AI in a big way. I’d prepare for a lot more talk about how Gemini can take tasks off your hands as the keynote progresses. —James Pero

© Gizmodo

Google has finally moved Project Starline—its futuristic video-calling machine—into a commercial project called Google Beam. According to Pichai, Google Beam can take a 2D image and transform it into a 3D one, and will also incorporate live translation. —James Pero

© Gizmodo

Google’s CEO, Sundar Pichai, says Google is shipping at a relentless pace, and to be honest, I tend to agree. There are tons of Gemini models out there already, even though Gemini has only been out for two years. Probably my favorite milestone, though, is that it has now completed Pokémon Blue, earning all eight badges, according to Pichai. —James Pero

May 20: Let’s Do This

Buckle up, kiddos, it’s I/O time. Methinks there will be a lot to get to, so you may want to grab a snack now. —James Pero

Counting down until the keynote… only a few more minutes to go. The DJ just said AI is changing music and how it’s made. But don’t forget that we’re all here… in person. Will we all be wearing Android XR smart glasses next year? Mixed reality headsets? —Raymond Wong

© Raymond Wong / Gizmodo

Fun fact: I haven’t attended Google I/O in person since before Covid-19. The Wi-Fi is definitely stronger and more stable now. It’s so great to be back and covering for Gizmodo. Dream job, unlocked!
—Raymond Wong

© Raymond Wong / Gizmodo

Mini breakfast burritos… bagels… but these bagels can’t compare to real Made in New York City bagels with that authentic NY water 😏 —Raymond Wong

© Raymond Wong / Gizmodo

I’ve arrived at the Shoreline Amphitheatre in Mountain View, Calif., where the Google I/O keynote is taking place in 40 minutes. Seats are filling up. But first, must go check out the breakfast situation because my tummy is growling… —Raymond Wong

May 20: Should We Do a Giveaway?

© Raymond Wong / Gizmodo

Google I/O attendees get a special tote bag, a metal water bottle, a cap, and a cute sheet of stickers. I always end up donating this stuff to Goodwill during the holidays. A guy living in NYC with two cats only has so much room for tote bags and water bottles… Would be cool to do a giveaway. Leave a comment to let us know if you’d be into that, and I can pester top brass to make it happen 🤪 —Raymond Wong

May 20: Got My Press Badge!

In 13 hours, Google will blitz everyone with Gemini AI, Gemini AI, and tons more Gemini AI. Who’s ready for… Gemini AI? —Raymond Wong

May 19: Google Glass: The Redux

© Google / Screenshot by Gizmodo

Google is very obviously inching toward the release of some kind of smart glasses product for the first time since (gulp) Google Glass, and if I were a betting man, I’d say this one will have a much warmer reception than its forebear. I’m not saying Google can snatch the crown from Meta and its Ray-Ban smart glasses right out of the gate, but if it plays its cards right, it could capitalize on the integration with its other hardware (hello, Pixel devices) in a big way. Meta may finally have a real competitor on its hands. ICYMI: Here’s Google’s President of the Android Ecosystem, Sameer Samat, teasing some kind of smart glasses device in a recorded demo last week. —James Pero

Hi folks, I’m James Pero, Gizmodo’s new Senior Writer.
There’s a lot to get to with Google I/O, so I’ll keep this introduction short. I like long walks on the beach, the wind in my nonexistent hair, and I’m really, really looking forward to bringing you even more of the spicy, insightful, and entertaining coverage of consumer tech that Gizmodo is known for. I’m starting my tenure here hot with Google I/O, so make sure you check back throughout the week to get those sweet, sweet blogs and commentary from me and Gizmodo’s Senior Consumer Tech Editor Raymond Wong. —James Pero

© Raymond Wong / Gizmodo

Hey everyone! Raymond Wong, senior editor in charge of Gizmodo’s consumer tech team, here! Landed in San Francisco (the sunrise was *chef’s kiss*), and I’ll be making my way over to Mountain View, California, later today to pick up my press badge and scope out the scene for tomorrow’s Google I/O keynote, which kicks off at 1 p.m. ET / 10 a.m. PT. Google I/O is a developer conference, but that doesn’t mean it’s news only for engineers. While there will be a lot of nerdy stuff that will have developers hollering, what Google announces—expect updates on Gemini AI, Android, and Android XR, to name a few headliners—will shape consumer products (hardware, software, and services) for the rest of this year and the years to come. I/O is a glimpse at Google’s technology roadmap as AI weaves itself into the way we compute at our desks and on the go. This is going to be a fun live blog! —Raymond Wong
  • On the road: My year of conferences on Unity’s Diversity Recruiting team

    My name is Kaylynn, and I joined Unity as a member of the Diversity Recruiting team in 2022 as a project coordinator. My first priority was to head our conference engagements for the year. While I was initially nervous to lead such an important project, I am very happy to have done so alongside an amazing team, and am excited to share a behind-the-scenes recap of our recruiting travels.

    To kick off our first in-person conference of the year, members of the University Recruiting team and I headed to Washington, D.C., in September for the Center for Minorities and People with Disabilities in Information Technology’s Richard Tapia Celebration of Diversity in Computing Conference. The Tapia Conference was a great experience and held true to its 2022 theme, “A Time to Celebrate!” Attendees included undergraduate and graduate students, faculty, and professionals in computing from all backgrounds and ethnicities. This was my first in-person recruiting event coming out of the pandemic and my first with Unity overall. I can still remember experiencing jitters before we opened our booth on the first day, but the University team brought me up to speed fast, which boosted my confidence in speaking to students about our internship programs.

    Looking back at my time at Tapia, my favorite memories come from the engagement opportunities – connecting with others over common backgrounds, ethnicities, disabilities, and genders. Our team really did a great job in fostering the opportunity to develop relationships with candidates that extended beyond the conference itself.

    At the end of September, I joined several other Unity representatives in Orlando, Florida, for the Grace Hopper Celebration (GHC). While I have been to many recruiting events before, I had never been to a conference as big as this one.
    In 2021, GHC virtually hosted 29,000 people, and it holds the record for the world’s largest gathering of women in computing.

    During our time at the conference, our recruiters had a lot of early mornings and late nights but, nonetheless, it was always fun. We usually started our day at 8:00 am sharp, when we could be spotted running to the shuttle stop to make it to the convention center on time. During the day, attendees could find the smiling and welcoming faces of our team at the booth, talking to potential candidates about the business and open opportunities. The career fair closed each evening at 5:00 pm; however, because we had so many candidates who wanted to speak with us, we would usually stay for an extra 30–45 minutes. Other conference events included participating in onsite internship interviews led by our University team or volunteers from across Unity. Having volunteers available allowed us to carry out all interviews and, later, make some internship offers.

    Although GHC was full of long days and nights, it was one of my favorite conference engagements because of the talent and drive evident in the room. It was rewarding to connect with so many candidates who were really excited to talk about Unity and eager to show us the projects they have built or are working on with our software. A second reason I enjoyed GHC was that it was my first chance to meet all my wonderful teammates in person. These days, since we primarily meet and chat virtually, it was nice to see each other face-to-face. The conference gave us the chance to spend five days working together and getting to know one another.

    In November, my team, our inclusion partners, and members of ComUnidad – the Latinx Employee Resource Group (ERG) – packed our bags and headed to Charlotte, North Carolina, for the annual get-together of the Society of Hispanic Professional Engineers (SHPE).
    At the SHPE 2022 National Convention, we met hundreds of candidates from the Latinx community and created relationships with these prospective candidates for upcoming programs. With more than 13,000 students and professionals attending the event, we spoke to many aspiring engineers from different specialties. At the time, I was unaware that there were so many different types of engineering. Other than meeting so many talented engineers, networking, and giving out cool swag at the booth, our team also participated in onsite interviews, which – much like at the other events – allowed us to speak to many candidates and, ultimately, make some internship offers.

    During my time at the booth, candidates shared their interests, projects, and career goals, and it was exciting to hear stories about how their persistence and hard work got them to where they are today. Seeing undergraduate and graduate students juggle finding internships at conferences on top of their busy school and home lives made me feel very proud of all those attending.

    A week after we wrapped up at SHPE, we headed to Austin, Texas, for Blavity Inc.’s AFROTECH Conference. AFROTECH is known for being the place for all things Black in tech and Web3, and it was quite different from our other in-person engagements in 2022, because we were not only there to recruit but also participated in two additional conference activations: hosting a learning lab and an event in our Austin office.

    On Monday, we started with a career fair, which included several team members from Recruiting and B-United – the Black ERG – helping out at the booth. We spent three days speaking to internship candidates as well as advanced professionals and tech entrepreneurs. On the second day, Unity hosted a learning lab that showcased our real-time 3D capabilities. As part of the lab, an expert panel featured Unity’s Raymond Graham, Krystal Cooper, and Nick Straughn.
    Over 300 attendees packed the workshop to engage with our experts and learn more about the panelists’ respective career journeys. On the final day, we wrapped up our AFROTECH activations with an in-office Unity @ AFROTECH event. Unity for Humanity grantee Black Terminus AR joined us, and its founder, Damien McDuffie, wowed the crowd with an interactive VR exhibit. The in-office event was one of my favorite parts of this conference because it gave such a personal touch to the week. As a company, we were able to connect with candidates and creators on a more intimate level in our cool workspace.

    Throughout 2022, our team attended and represented Unity at seven virtual or in-person diversity recruiting engagements. In addition to the conferences I shared above, we participated in the National Society of Black Engineers’ annual convention, Latinx in Gaming’s CONEXION, and three QueerTech activations. During my time as a coordinator on some of these projects, I traveled to places I’d never been before, worked with amazing team members, and met candidates with great passion for technology and Unity.

    While our diversity recruiting approach at Unity includes many additional strategies to build relationships with candidates traditionally underrepresented in tech, I really enjoyed being able to create sustainable relationships with candidates face-to-face during my first year on the team. As we know, it is hard to be what you can’t see. At Unity, this means that showing up to events like these, with representation as a focus, helps us connect with candidates we might never have had the chance to get to know.
    It was an honor to be a part of these efforts in 2022, and, from a diversity, equity, and inclusion (DEI) perspective, it is great to know that, with each conference we attend and each person we meet, we inch one step closer to connecting underrepresented communities to the world of tech and to Unity.

    To learn more about Unity’s DEI strategies and initiatives, visit our Inclusion and Diversity page. To explore open roles, check out the Unity Careers site.
    #road #year #conferences #unitys #diversity
    On the road: My year of conferences on Unity’s Diversity Recruiting team
    My name is Kaylynn, and I joined Unity as a member of the Diversity Recruiting team in 2022 as a project coordinator. My first priority was to head our conference engagements for the year. While I was initially nervous to lead such an important project, I am very happy to have done so alongside an amazing team, and am excited to share a behind-the-scenes recap of our recruiting travels.To kick off our first in-person conference of the year, members of the University Recruiting team and I headed to Washington, D.C., in September for the Center for Minorities and People with Disabilities in Information Technology’s Richard Tapia Celebration of Diversity in Computing Conference. The Tapia Conference was a great experience and held true to its 2022 theme, “A Time to Celebrate!” Attendees included undergraduate and graduate students, faculty, and professionals in computing from all backgrounds and ethnicities. This was my first in-person recruiting event coming out of the pandemic and my first with Unity overall. I can still remember experiencing jitters before we opened our booth on the first day, but the University team brought me up to speed fast, which boosted my confidence in speaking to students about our internship programs.Looking back at my time at Tapia, my favorite memories come from the engagement opportunities – connecting with others over common backgrounds, ethnicities, disabilities, and genders. Our team really did a great job in fostering the opportunity to develop relationships with candidates that extended beyond the conference itself.At the end of September, I joined several other Unity representatives in Orlando, Florida, for the Grace Hopper Celebration. While I have been to many recruiting events before, I had never been to a conference as big as this one. 
In 2021, GHC virtually hosted 29,000 people, and holds the record for the world’s largest gathering of women in computing.During our time at the conference, our recruiters had a lot of early mornings and late nights but, nonetheless, it was always fun. We usually started our day at 8:00 am sharp, when we could be spotted running to the shuttle stop to make it to the convention center on time.During the day, attendees could find the smiling and welcoming faces of our team at the booth, talking to potential candidates about the business and open opportunities. The career fair closed each evening at 5:00 pm; however, because we had so many candidates who wanted to speak with us, we would usually stay for an extra 30–45 minutes. Other conference events included participating in onsite internship interviews led by our University team or volunteers from across Unity. Having volunteers available allowed us to carry out all interviews and, later, make some internship offers.Although GHC was full of long days and nights, it was one of my favorite conference engagements because of the talent and drive evident in the room. It was rewarding to connect with so many candidates who were really excited to talk about Unity and eager to show us the projects they have built or are working on with our software.A second reason I enjoyed GHC was due to it being my first chance to meet all my wonderful teammates in person. These days, since we primarily meet and chat virtually, it was nice to see each other in person. The conference gave us the chance to spend five days working together and getting to know one another.In November, my team, our inclusion partners, and ComUnidad – the Latinx Employee Resource Group– members packed our bags and headed to Charlotte, North Carolina, for the annual get-together of the Society of Hispanic Professional Engineers. 
At the SHPE 2022 National Convention, we met hundreds of candidates from the Latinx community and created relationships with these prospective candidates for upcoming programs.With more than 13,000 students and professionals attending the event, we spoke to many aspiring engineers from different specialties. At the time, I was unaware that there were so many different types of engineering. Other than meeting so many talented engineers, networking, and giving out cool swag at the booth, our team also participated in onsite interviews, which – much like the other events – allowed us to speak to many candidates and, ultimately, make some internship offers.During my time at the booth, candidates shared their interests, projects, and career goals, and it was exciting to hear stories about how their persistence and hard work got them to where they are today. Seeing undergraduate and graduate students juggling finding internships at conferences on top of their busy school and home lives made me feel very proud of all those attending.A week after we wrapped up at SHPE, we headed to Austin, Texas, for Blavity Inc.’s AFROTECH Conference. AFROTECH is known for being the place for all things Black in tech and Web3, and was quite different from our other in-person engagements in 2022. This was because we were not only there to recruit, but we also participated in two additional conference activations: hosting a learning lab and an event in our Austin office.On the Monday, we started with a career fair, which included several team members from Recruiting and B-United – the Black ERG – helping out at the booth. We spent three days speaking to both internship candidates and advanced professionals and tech entrepreneurs.On the second day, Unity hosted a learning lab that showcased our real-time 3D capabilities. As part of the lab, an expert panel featured Unity’s Raymond Graham, Krystal Cooper, and Nick Straughn. 
Over 300 attendees packed the workshop to engage with our experts and learn more about the panelists’ respective career journeys.On the final day, we wrapped up our AFROTECH activations with an in-office Unity @ AFROTECH event. Unity for Humanity grantee Black Terminus AR joined us, and its founder Damien McDuffie wowed the crowd with an interactive VR exhibit. The in-office event was one of my favorite parts of this conference because it gave such a personal touch to the week. As a company, we were able to connect with candidates and creators on a more intimate level in our cool workspace.Throughout 2022, our team attended and represented Unity at seven virtual or in-person diversity recruiting engagements. In addition to the conferences I shared above, we participated in the National Society of Black Engineers’ annual convention, Latinx in Gaming’s CONEXION, and three QueerTech activations.During my time as a coordinator on some of these projects, I traveled to places I’d never been before, worked with amazing team members, and met candidates with great passion for technology and Unity.While our diversity recruiting approach at Unity includes many additional strategies to build relationships with candidates traditionally underrepresented in tech, I really enjoyed being able to create sustainable relationships with candidates face-to-face during my first year on the team.As we know, it is hard to be what you can’t see. At Unity, this means that showing up to events like these, with representation as a focus, helps us connect with candidates we might never have had the chance to get to know. 
It was an honor to be a part of these efforts in 2022, and, from a diversity, equity, and inclusionperspective, it is great to know that, with each conference we attend and each person we meet, we inch one step closer to connecting underrepresented communities to the world of tech and to Unity.To learn more about Unity’s DEI strategies and initiatives, visit our Inclusion and Diversity page. To explore open roles, check out the Unity Careers site. #road #year #conferences #unitys #diversity
    UNITY.COM
    On the road: My year of conferences on Unity’s Diversity Recruiting team
    My name is Kaylynn, and I joined Unity as a member of the Diversity Recruiting team in 2022 as a project coordinator. My first priority was to head our conference engagements for the year. While I was initially nervous to lead such an important project, I am very happy to have done so alongside an amazing team, and am excited to share a behind-the-scenes recap of our recruiting travels.To kick off our first in-person conference of the year, members of the University Recruiting team and I headed to Washington, D.C., in September for the Center for Minorities and People with Disabilities in Information Technology’s Richard Tapia Celebration of Diversity in Computing Conference. The Tapia Conference was a great experience and held true to its 2022 theme, “A Time to Celebrate!” Attendees included undergraduate and graduate students, faculty, and professionals in computing from all backgrounds and ethnicities. This was my first in-person recruiting event coming out of the pandemic and my first with Unity overall. I can still remember experiencing jitters before we opened our booth on the first day, but the University team brought me up to speed fast, which boosted my confidence in speaking to students about our internship programs.Looking back at my time at Tapia, my favorite memories come from the engagement opportunities – connecting with others over common backgrounds, ethnicities, disabilities, and genders. Our team really did a great job in fostering the opportunity to develop relationships with candidates that extended beyond the conference itself.At the end of September, I joined several other Unity representatives in Orlando, Florida, for the Grace Hopper Celebration (GHC). While I have been to many recruiting events before, I had never been to a conference as big as this one. 
In 2021, GHC virtually hosted 29,000 people, and holds the record for the world’s largest gathering of women in computing.During our time at the conference, our recruiters had a lot of early mornings and late nights but, nonetheless, it was always fun. We usually started our day at 8:00 am sharp, when we could be spotted running to the shuttle stop to make it to the convention center on time.During the day, attendees could find the smiling and welcoming faces of our team at the booth, talking to potential candidates about the business and open opportunities. The career fair closed each evening at 5:00 pm; however, because we had so many candidates who wanted to speak with us, we would usually stay for an extra 30–45 minutes. Other conference events included participating in onsite internship interviews led by our University team or volunteers from across Unity. Having volunteers available allowed us to carry out all interviews and, later, make some internship offers.Although GHC was full of long days and nights, it was one of my favorite conference engagements because of the talent and drive evident in the room. It was rewarding to connect with so many candidates who were really excited to talk about Unity and eager to show us the projects they have built or are working on with our software.A second reason I enjoyed GHC was due to it being my first chance to meet all my wonderful teammates in person. These days, since we primarily meet and chat virtually, it was nice to see each other in person. The conference gave us the chance to spend five days working together and getting to know one another.In November, my team, our inclusion partners, and ComUnidad – the Latinx Employee Resource Group (ERG) – members packed our bags and headed to Charlotte, North Carolina, for the annual get-together of the Society of Hispanic Professional Engineers (SHPE). 
At the SHPE 2022 National Convention, we met hundreds of candidates from the Latinx community and created relationships with these prospective candidates for upcoming programs.With more than 13,000 students and professionals attending the event, we spoke to many aspiring engineers from different specialties. At the time, I was unaware that there were so many different types of engineering. Other than meeting so many talented engineers, networking, and giving out cool swag at the booth, our team also participated in onsite interviews, which – much like the other events – allowed us to speak to many candidates and, ultimately, make some internship offers.During my time at the booth, candidates shared their interests, projects, and career goals, and it was exciting to hear stories about how their persistence and hard work got them to where they are today. Seeing undergraduate and graduate students juggling finding internships at conferences on top of their busy school and home lives made me feel very proud of all those attending.A week after we wrapped up at SHPE, we headed to Austin, Texas, for Blavity Inc.’s AFROTECH Conference. AFROTECH is known for being the place for all things Black in tech and Web3, and was quite different from our other in-person engagements in 2022. This was because we were not only there to recruit, but we also participated in two additional conference activations: hosting a learning lab and an event in our Austin office.On the Monday, we started with a career fair, which included several team members from Recruiting and B-United – the Black ERG – helping out at the booth. We spent three days speaking to both internship candidates and advanced professionals and tech entrepreneurs.On the second day, Unity hosted a learning lab that showcased our real-time 3D capabilities. As part of the lab, an expert panel featured Unity’s Raymond Graham, Krystal Cooper, and Nick Straughn. 
Over 300 attendees packed the workshop to engage with our experts and learn more about the panelists’ respective career journeys.On the final day, we wrapped up our AFROTECH activations with an in-office Unity @ AFROTECH event. Unity for Humanity grantee Black Terminus AR joined us, and its founder Damien McDuffie wowed the crowd with an interactive VR exhibit. The in-office event was one of my favorite parts of this conference because it gave such a personal touch to the week. As a company, we were able to connect with candidates and creators on a more intimate level in our cool workspace.Throughout 2022, our team attended and represented Unity at seven virtual or in-person diversity recruiting engagements. In addition to the conferences I shared above, we participated in the National Society of Black Engineers’ annual convention, Latinx in Gaming’s CONEXION, and three QueerTech activations.During my time as a coordinator on some of these projects, I traveled to places I’d never been before, worked with amazing team members, and met candidates with great passion for technology and Unity.While our diversity recruiting approach at Unity includes many additional strategies to build relationships with candidates traditionally underrepresented in tech, I really enjoyed being able to create sustainable relationships with candidates face-to-face during my first year on the team.As we know, it is hard to be what you can’t see. At Unity, this means that showing up to events like these, with representation as a focus, helps us connect with candidates we might never have had the chance to get to know. 
It was an honor to be a part of these efforts in 2022, and, from a diversity, equity, and inclusion (DEI) perspective, it is great to know that, with each conference we attend and each person we meet, we inch one step closer to connecting underrepresented communities to the world of tech and to Unity.To learn more about Unity’s DEI strategies and initiatives, visit our Inclusion and Diversity page. To explore open roles, check out the Unity Careers site.
  • Roman and Williams, STUDIO V, Raymond / Nicolas, Subtila, and Jenkins are Archinect Jobs' latest featured employers

    In this week's curated employer highlight from Archinect Jobs, we are featuring five architecture and design firms with current openings in New York City, Los Angeles, and Miami.
    For even more opportunities, visit the Archinect job board and explore our active community of job seekers, firms, and schools.
    New York City-based design firm Roman and Williams Buildings and Interiors has three exciting job opportunities: a Sr. Designer, Architecture - Residential requiring five to eight years of experience and fluency in AutoCAD; an Architectural Designer - Residential requiring two to five years of experience and fluency in AutoCAD and SketchUp; and a Sr. Interior Designer, Residential requiring seven to ten years of experience and fluency in AutoCAD, Adobe Creative Suite, and Microsoft Office.
    The Standard NY Hotel by Roman and Williams Buildings and Interiors.
    STUDIO V Architecture is in search of a Project Manager in New York City with seven-plus years of...