Samsung cracks the AI puzzle with Galaxy S25, finally
Editor's take: After years of half-fulfilled promises and underwhelming realities, it looks like Samsung has finally succeeded in bringing the kind of seamless experience we all hoped AI, digital assistants, and agents would bring to our mobile devices. Well, to be fair, it's Samsung in conjunction with Google (along with some help from Qualcomm) that's making the magic happen inside the just-launched Galaxy S25.

From the outside, Samsung's new S25 is simply the latest iteration of the company's long-running line of premium Android smartphones powered by Qualcomm's latest-generation Snapdragon processor. It's also got a pleasant, rounded-edge design, better cameras, a fresh set of color options, and a few cool new features: all the things you'd expect from a next-generation device.

But what really stands out on the Galaxy S25, based on some of the demos and the brief hands-on time we've had with the device, is the fact that it has digital assistant capabilities that actually work. Plus, it offers options for learning individual preferences that go well beyond what's been available on other devices. In other words, it brings the remarkable breadth, impressive accuracy, and highly individualized personalization promised by modern LLM-enabled AI to life in an always-on, always-connected, always-with-us mobile device.

Part of this is due to a new level of partnership between Samsung and Google, one that suggests deeper cooperation and co-design than has existed before. In the past, for example, it often felt like Samsung tried to replicate software functionality that Google already offered in Android. With the Galaxy S25, however, the companies are working to bring the best of both worlds together to enable the best possible user experience.

The most notable example is the default action of automatically launching Google's Gemini personal assistant with a long press of the side button on the S25. Yes, you can enable that option on Google's latest Pixel phones, but it's not on by default, and Samsung sells a significantly larger number of phones than Google. That means most people will experience this for the first time on a new S25, or on previous generations of Galaxy smartphones once Samsung provides upgrades to them.

Our experience with Gemini so far, both voice-based and text-based, has been extremely impressive, offering a huge range of options for requesting information, getting suggestions, and even carrying on long conversations about virtually any topic imaginable via Gemini Live. In addition, Google announced that two future extensions of Gemini (Screen Share and Live Video, which add new multimodal intelligence capabilities that can understand what's currently on your phone's screen and what the phone's camera is seeing) will be coming to the S25 first.

But it's not just the Gemini integration that makes the S25 impressive. Samsung has also integrated Bixby, as well as several other custom AI models it created, for on-device data personalization. Samsung smartly recognized that Gemini can offer a much more comprehensive set of cloud-based personal assistant features than Bixby, but Bixby has the advantage of running directly on the device and having access to both the actions we perform there and things like device settings.
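To illustrate that division of labor, here's a minimal, purely hypothetical Kotlin sketch of a router that sends broad knowledge questions to a cloud assistant while keeping device-setting actions with an on-device handler. None of these interface or class names (CloudAssistant, OnDeviceAssistant, AssistantRouter) correspond to real Samsung or Google APIs; they're assumptions made only to show the general pattern.

```kotlin
// Purely illustrative sketch -- not Samsung's or Google's actual code.
// It models the division of labor the article describes: cloud-scale
// questions go to a Gemini-style service, while device-level actions
// stay with an on-device, Bixby-style handler.

sealed interface AssistantRequest {
    data class GeneralQuery(val text: String) : AssistantRequest
    data class DeviceAction(val setting: String, val value: String) : AssistantRequest
}

// Hypothetical interfaces standing in for the two assistants.
interface CloudAssistant { suspend fun answer(query: String): String }
interface OnDeviceAssistant { fun applySetting(setting: String, value: String): Boolean }

class AssistantRouter(
    private val cloud: CloudAssistant,
    private val local: OnDeviceAssistant
) {
    suspend fun handle(request: AssistantRequest): String = when (request) {
        is AssistantRequest.GeneralQuery ->
            cloud.answer(request.text)                 // broad knowledge lives in the cloud
        is AssistantRequest.DeviceAction ->
            if (local.applySetting(request.setting, request.value))
                "Done: ${request.setting} set to ${request.value}"  // settings never leave the device
            else
                "Couldn't change ${request.setting}"
    }
}
```

The real products are obviously far more sophisticated, but the basic split, broad knowledge handled in the cloud and device state handled locally, is the same one the S25's design reflects.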
Because they run locally, Bixby and these other on-device models can start to learn the types of actions we regularly perform, the data we search for, and so on, and they can store all that information securely on the device through Samsung's Knox security framework.

Best of all, Gemini and Samsung's onboard models can work together in some pretty compelling ways. For example, we can ask Gemini for information about an upcoming event and have it added to our personal calendar. This public-information-to-private-calendar integration is possible because Samsung worked with Google to allow data to be passed from Gemini to several of Samsung's own apps, which have access to the data stored on the phone. While this might seem like a small step, it's hugely important because it's one of the first times these two "data worlds" have been combined. More importantly, it means the experience is as seamless and intuitive as it needs to be for regular people to actually use these kinds of capabilities. At present, these integrations are limited to Samsung's native apps, Google's suite of apps, Spotify, and WhatsApp, but a number of integrations with other popular third-party apps are apparently in the works.

Eventually, we'll also be able to do things like press the side button to call up Gemini, request a change to the phone's settings, and have Bixby perform the operation. In the meantime, the two can work side by side, and we can launch either of them by voice using the appropriate wake word if we don't want to use the hardware button. Regardless, hiding the workings of multiple models behind the simple, unified interface of Samsung's One UI 7 is exactly what makes the combined design efforts on the S25 so intriguing.

One of the other important advantages of having Bixby run on the device, with help from the NPU in the Snapdragon 8 Elite Mobile Platform, is the ability to discover and store information about our personal preferences and routines on the phone. The new Snapdragon SoC, by the way, will be built into every S25 worldwide (a sign of increased collaboration between Samsung and Qualcomm as well), and instead of just offering slightly higher speeds, it includes new custom circuitry to help with camera processing and other AI features on the S25.

Leveraging what Samsung is calling the Personal Data Engine, the onboard Samsung models are able to see what happens on our phone screen, regardless of the app we're running, and then learn from it: the kinds of information we request, the types of activities we perform on a regular basis, and so on. From that, using what the company refers to as knowledge graph technology, the system can eventually start to make recommendations or automatically program routines and perform them on our behalf (with our permission, that is). Importantly, all of this knowledge graph data stays on the device and never goes to the cloud. The Personal Data Engine (PDE) is also what powers the new Now Brief and Now Bar functions, which serve up information based on our preferences and what's been gleaned from the knowledge graph.

It's this level of customization that has the potential to turn on-device AI from a clever parlor trick into an indispensable personal digital assistant. Of course, it also has the potential to create an incredible privacy and security nightmare.
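To see why, it helps to make the knowledge-graph idea concrete. Samsung hasn't published the Personal Data Engine's internals, but here's a minimal, purely hypothetical Kotlin sketch of the general shape: activity observations stored locally as simple triples and queried for routine-based suggestions. The names used here (Fact, OnDeviceKnowledgeGraph, topHabits) are illustrative assumptions, not Samsung APIs.

```kotlin
// Purely illustrative sketch -- Samsung has not published the Personal Data
// Engine's internals. This just shows the general shape of an on-device
// knowledge graph: observations kept locally as triples and queried to
// surface routine-based suggestions.

data class Fact(val subject: String, val predicate: String, val obj: String)

class OnDeviceKnowledgeGraph {
    // In a real system this would live in encrypted local storage, not memory.
    private val facts = mutableListOf<Fact>()

    fun observe(subject: String, predicate: String, obj: String) {
        facts += Fact(subject, predicate, obj)
    }

    // Return the objects most often linked to a subject by a given predicate,
    // e.g. which app the user usually opens on weekday mornings.
    fun topHabits(subject: String, predicate: String, limit: Int = 3): List<String> =
        facts.filter { it.subject == subject && it.predicate == predicate }
            .groupingBy { it.obj }
            .eachCount()
            .entries
            .sortedByDescending { it.value }
            .take(limit)
            .map { it.key }
}

fun main() {
    val graph = OnDeviceKnowledgeGraph()
    graph.observe("user", "opensOnWeekdayMorning", "Spotify")
    graph.observe("user", "opensOnWeekdayMorning", "Spotify")
    graph.observe("user", "opensOnWeekdayMorning", "Calendar")
    println(graph.topHabits("user", "opensOnWeekdayMorning"))  // [Spotify, Calendar]
}
```

Even a toy version like this makes it obvious how sensitive the accumulated data would become over time.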
Data of this type could provide the most detailed dossier of what any individual does online that we've probably ever seen, making it a huge magnet (a "honeypot") for the bad guys. Thankfully, Samsung recognized that and embedded the PDE data into its hardware-based, on-device Knox Vault security solution, which has now been upgraded to offer post-quantum cryptography. Because of these potential security concerns, Samsung of course allows us to turn off the data tracking features if we don't want to use them. It's an issue that any kind of AI-powered personalization device or service is going to face.

Collectively, Samsung is calling all of these various AI-related capabilities Galaxy AI, a phrase that has expanded to cover the company's own AI models, the extensions that integrate Gemini with Bixby, the AI features built into several of its apps, and the personalization enabled by the Personal Data Engine. Trying to make sense of it all isn't particularly easy, nor is it anything the vast majority of consumers will ever really care to understand. But it's the combination of these Galaxy AI features, along with the clean integration of Google's Gemini, that makes the S25 such a compelling offering.

To be clear, there's still a great deal more work to be done at the model integration and app ecosystem level, but at last, the promise of AI on devices seems to be coming to life.

Bob O'Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on Twitter @bobodtech.