AI's on Duty, But I'm the One Staying Late
www.informationweek.com
Asaff Zamir, VP of Global Customer Success, Solution Architecture, and Business Operations at AI21
January 14, 2025

Workplace efficiency is a significant challenge in today's fast-paced business environment. With new technologies entering our workplaces daily, employees spend more time upskilling and adapting to new tools than ever before.

While these innovations are intended to enhance productivity and efficiency, employees often report the opposite experience. An Upwork study (July 2023) reveals that while 96% of C-suite leaders expect AI to boost worker productivity, 77% of employees report that AI has increased their workload instead. This disconnect suggests a gap between the potential of AI technologies and their current implementation in workplaces.

In a survey I conducted earlier this year within a professional community of 5,000 practitioners in customer success and other go-to-market functions, several clear challenges emerged -- challenges that are echoed in multiple research studies and offer a deeper look at these gaps.

Adoption of AI Workflows in Enterprise Environments

I believe there are two distinct phases that illustrate the progression of AI adoption in enterprise workflows: first, the implementation of large language models (LLMs) for specific, high-impact use cases with a clear return on investment (ROI); then, the adoption of intelligent, proactive personal assistants that will revolutionize the way employees engage with their work.

Phase 1: Adoption of LLMs, with or without retrieval-augmented generation (RAG), for a specific use case with a clear and easy-to-achieve ROI.

The technology is already here. The primary challenge is not the technology itself but identifying use cases that generate significant value relatively easily, while addressing concerns over data privacy, data structuring, and availability.

Hence, use cases like automatic text generation or personalized, grounded chat solutions built on LLMs and enhanced by RAG are in increasing demand and face far less resistance from employees, particularly in industries where a significant amount of time is invested in research and in environments that involve a large amount of manual, repetitive work.

Here are a few use cases we have already seen in practice, and that I believe will spearhead this phase:

Boosting customer service. Banks, for example, have relied on chatbots for 24/7 customer support, but generic chatbots struggle to provide the personalized and specific answers customers expect. As a result, banks spend too much time on inquiries and too little on customer engagement and experience. By integrating LLMs with a RAG engine, banks can offer personalized, grounded, and real-time assistance.

According to a study by Salesforce, customer expectations for personalization have increased, with 81% of customers now expecting more personalized experiences than in the past.

A leading bank that implemented a chat + RAG solution with us reported a 40% reduction in support tickets post-implementation, unlocking human agents' time for strategic, proactive conversations. This is a win-win for both customer satisfaction and employee fulfillment.
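To make the chat + RAG pattern concrete, here is a minimal sketch of the retrieve-then-generate loop. It assumes an in-memory knowledge base, a naive keyword-overlap retriever as a stand-in for embedding-based vector search, and a stubbed call_llm function in place of a real LLM API; it illustrates the flow of grounding an answer in retrieved passages, not any particular vendor's implementation.

```python
# Minimal retrieve-then-generate (RAG) sketch.
# Assumptions: the knowledge base, keyword-overlap retriever, and stubbed
# call_llm below are illustrative stand-ins. A real deployment would use an
# embedding model plus a vector database for retrieval and an actual LLM API.

KNOWLEDGE_BASE = [
    "Wire transfers above $10,000 require two-factor confirmation in the mobile app.",
    "Lost cards can be frozen instantly from Settings > Cards > Freeze.",
    "International transfer fees are waived for premium checking accounts.",
]

def retrieve(question: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by keyword overlap (stand-in for vector similarity search)."""
    terms = set(question.lower().split())
    ranked = sorted(docs, key=lambda d: len(terms & set(d.lower().split())), reverse=True)
    return ranked[:k]

def build_prompt(question: str, context: list[str]) -> str:
    """Ground the model: instruct it to answer only from the retrieved passages."""
    passages = "\n".join(f"- {c}" for c in context)
    return (
        "Answer the customer's question using only the context below.\n"
        f"Context:\n{passages}\n\n"
        f"Question: {question}\nAnswer:"
    )

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM API call."""
    return f"[LLM response grounded in {prompt.count('- ')} retrieved passages]"

if __name__ == "__main__":
    question = "How do I freeze a lost card?"
    context = retrieve(question, KNOWLEDGE_BASE)
    print(call_llm(build_prompt(question, context)))
```

Grounding the prompt in the bank's own retrieved policies, rather than relying on the model's general knowledge, is what makes the answers specific and personalized rather than generic.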
Empowering research/medical-oriented workflows. We learned that doctors in several healthcare institutions, for example, often search manually for medical research documentation and guidelines, a tedious and time-consuming process. Oncologists face significant time burdens when searching for and managing medical literature, including time spent navigating electronic health records and managing documentation, rather than engaging directly in patient care.

In a field where time equals life, inefficiencies can have serious consequences. A chatbot + RAG system that lets doctors interact directly with medical guidelines and literature gives them accurate answers in real time, eliminating the need to navigate through multiple articles.

Enhancing e-commerce efficiency. The e-commerce sector has seen phenomenal growth in recent years, but many sellers struggle to provide high-quality product information. Incomplete descriptions, missing specifications, and poor-quality images often result in customer dissatisfaction, increased returns, and eroded trust. A Syndigo (2024) report highlighted that 65% of product returns are due to incomplete or inaccurate product descriptions.

Additionally, 83% of global respondents stated they would abandon a website if they couldn't find sufficient product information, and 73% of shoppers think less of a brand if they find inconsistent or incorrect product details. These issues not only hurt sales but also damage brand trust and lead to higher return rates.

LLM-powered product description generation can automate the creation of detailed, engaging, and accurate product descriptions, enabling sellers to offer a richer shopping experience at scale.
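As an illustration of how that automation might work, here is a minimal sketch that turns a seller's structured catalog attributes into a constrained prompt for an LLM. The attribute fields and the stubbed call_llm function are assumptions made for the example, not a specific vendor's schema or API.

```python
# Minimal sketch: generate a product description from structured attributes.
# Assumptions: the attribute schema and stubbed call_llm are illustrative;
# a real pipeline would call an actual LLM API and validate the generated
# copy against the source data before publishing.

def build_description_prompt(product: dict) -> str:
    """Turn catalog attributes into a constrained prompt so the copy stays accurate."""
    specs = "\n".join(f"- {key}: {value}" for key, value in product.items())
    return (
        "Write a concise, accurate product description using only these attributes.\n"
        "Do not invent features that are not listed.\n"
        f"{specs}\nDescription:"
    )

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM API call."""
    return f"[LLM-generated description based on {prompt.count(': ')} attributes]"

if __name__ == "__main__":
    product = {
        "name": "Trailhead 45L Backpack",
        "material": "Recycled ripstop nylon",
        "capacity": "45 liters",
        "warranty": "Lifetime",
    }
    print(call_llm(build_description_prompt(product)))
```

Constraining the model to the listed attributes is the key design choice: it keeps the generated copy from introducing the very inaccuracies that drive returns and erode trust.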
Phase 2: We will witness the rise of personal genius assistants that go beyond expertise in a specific use case and integrate seamlessly into workplace ecosystems. These assistants will not only automate repetitive tasks or answer simple questions but also proactively suggest relevant context and resources ahead of important milestones. Instead of merely responding to prompts, they will anticipate needs, provide actionable insights, and help employees stay one step ahead.

Imagine an assistant that can research complex questions, providing fast, curated information and recommendations. This will allow employees to focus on higher-order, creative tasks while feeling more fulfilled at work. The assistant will act as an intelligent collaborator, enhancing productivity and fostering a sense of accomplishment among employees.

The journey toward fulfilling employees' real potential lies in leveraging AI to work for us, not the other way around. By adopting technologies like LLMs with RAG and developing personal genius assistants, enterprises can transform workflows, enhance productivity, and, most importantly, allow employees to focus on meaningful, value-generating tasks.

About the Author

Asaff Zamir, VP of Global Customer Success, Solution Architecture, and Business Operations at AI21

Asaff Zamir is VP of Global Customer Success, Solution Architecture, and Business Operations at AI21. A recognized thought leader in customer success and operations, he was named one of the Top 100 Customer Success Strategists from 2020 to 2023. Eight years ago, Asaff founded the Israel CS community and continues to contribute to its growth. He is also a guest lecturer at several academic institutions and developed Israel's first-ever customer success academic course as part of the MTA (The Academic College of Tel Aviv-Yaffo) MBA program, which launched in 2022. Previously, Asaff served as COO at Zencity and, before that, he built and led customer success teams at Siemplify (acquired by Google) and Mobilogy (Cellebrite).