![](https://www.computerworld.com/wp-content/uploads/2025/02/3815564-0-84436900-1738839760-Screenshot-2025-01-14-at-12.33.15.png)
Why I want glasses that are always listening
www.computerworld.com
The trouble with virtual assistants is that they're just so darn needy. Specifically, they need to be told exactly what to do. They sit around doing nothing until explicitly directed. They don't take any initiative. They're, for lack of a better word, lazy.

Science fiction writers, industry prognosticators, and techno-futurists like me have been predicting and promising for decades that once we get real AI, our computer assistants will do our bidding unbidden. It's called agency, or proactivity. But where is it? Where are those go-getter virtual assistants that do things on our behalf without being explicitly directed?

Most of today's agency or proactivity features are found in enterprise applications, leveraging machine learning, sensor data, and user behavior analysis to act autonomously. For example:

- Hospitals affiliated with Johns Hopkins analyze real-time data to predict health crises, alerting staff at least six hours in advance.
- A mobile app called MoodTrainer uses location data and behavior monitoring to trigger cognitive behavioral therapy exercises when loneliness or stress patterns emerge.
- Tools like MonkeyLearn detect frustration in chat logs, prompting human agents to intervene with empathy-driven solutions.
- An IT tool called Workgrid uses AI to monitor network health, automatically resolving connectivity issues or scheduling updates during off-peak hours.
- Siemens uses vibration and temperature data from machinery to schedule repairs before breakdowns occur, saving money on repairs and avoiding costly downtime.
- The HR tool Paradox AI scans prospective employees' applications, schedules interviews, and sends follow-ups without recruiter involvement.

The growing emergence of proactive features in enterprise applications is great. But these features benefit organizations more than their employees. What about empowering individual users and individual employees? What about the Augmented Connected Workforce concept?

Don't get me wrong. Agency has existed in mainstream personal assistants for 13 years. Google launched the Google Now assistant in 2012 as part of Android 4.1 Jelly Bean. The feature pioneered context-aware assistance by anticipating user needs through email, location, and search history analysis. It provided real-time travel alerts such as flight updates, traffic-optimized commute times, and location-triggered reminders for tasks or reservations. (Google discontinued Google Now in 2019, rolling some of its features into Google Assistant.)

Other assistants offer limited proactivity. Apple's Siri has AI-driven Proactive Intelligence to auto-summarize notifications and suggest context-aware actions. Amazon's Alexa predicts user intent through Latent Goals and autonomously manages smart home devices. The proactivity features of these assistants go largely unnoticed, unappreciated, and even unused, simply because their agency is often limited to bland, needless, and less-than-earth-shaking tasks.

Unprompted help is on the way

One of the best-demoed smart glasses products at this year's CES was the Halliday smart glasses. The $489 glasses (due to ship in March) are different from (and potentially superior to) most competing products in several ways. One is that while the glasses can be used via voice and touch controls, control is expanded and enhanced with an optional ring worn on a finger.
(If that idea sounds bonkers, you should know that Apple's future smart glasses might do the same; at least, Apple has a big pile of patents suggesting that direction.)

Another differentiator is that, instead of projecting visual feedback onto special lenses via a light engine, the electronics beam the image directly into the eye when the user looks up slightly. The company says its approach lowers costs and weight and improves visibility in bright sunlight.

And finally, the glasses don't have a camera. "Hooray!" you might be thinking. "No camera means they're prioritizing privacy, right?" (Wait till you hear what they're doing with the microphones.)

In general, Halliday glasses can listen to everything all the time. By combining AI analysis of what they hear with location and other data, the glasses can figure out how to offer help in a variety of ways. Halliday calls this subscription-based feature Proactive AI, and what it describes is a powerful personal enhancement of the user's capabilities, if it all works as advertised.

Listening to your conversations, the glasses can fact-check claims made by the person you're talking to, showing text that challenges falsehoods. They can interpret idioms, explain cultural references, summarize the content of meetings, and list action items. If the other person is speaking a different language than you, the glasses can translate their words into your language. And if music is playing, the glasses can show you the lyrics.

A Proactive AI subscription provides other features not triggered by audio, such as walking directions, teleprompter functionality, and conversation starters in social settings.

Halliday isn't the only company advancing proactivity.

Google makes the call

Google Duplex, announced in 2018 at the Google I/O developer conference, is an AI feature of Google Assistant that can make phone calls to book reservations, schedule appointments, or check business hours. Recently, Search Labs extended Duplex in a feature called "Ask for me." It's an experimental tool that finds out information for you by calling businesses on the phone, conversing with people at those businesses, and then reporting back on what they said. (The current iteration is for users who opted into Google Search Labs. It calls only auto repair shops and nail salons in the United States, but other business types and countries will be added in the future, according to Google.)

The feature appears in search results as an "Ask for Me" card. Users enter specifics (car type, fingernail matters, etc.); Google AI places a call and uses natural-language speech technology to ask the questions that will get the user's answers, and the results are delivered via SMS or email. The automated voice identifies itself as Google AI, and Google offers businesses the ability to opt out.

Proactive AI: What could go right?

It's become a cliché in technology circles that replacing people with AI is bad; enhancing people with AI, partnering with it, is a better way forward. AI that acts on our behalf with our knowledge but without our explicit advance permission, finding out information by searching or calling, feeding us information as we need it, and enabling us to understand and learn from what other people are saying regardless of what language they're speaking, is a stunning vision for realizing what Reid Hoffman calls Superagency.
In his book of the same name, Hoffman presents an optimistic vision of AI as a transformative force that, when developed inclusively, can empower people by enhancing human ability and potential.

Maybe proactive AI could even help me understand why this vision of the future is coming from a dinky startup rather than Apple, Google, or Meta.