AWS AI Data Lead: Pushing Past Prototypes In Generative AI
AI is ahead of its time. For many organizations, the push to adopt full-blown cloud-native technologies (or at least those stemming from public cloud services, rather than a private cloud or hybrid halfway house) is still quite a leap. To suggest that cloud is a ubiquitous de facto standard for many application types is something of an exaggeration, even though migration and penetration are indeed moving on an upward trajectory (if not quite exponentially). With that grounding in mind, can we perhaps suggest that a chasm leap to productionized, functional AI-based applications is also still some way off?

Suitably sanguine, yet technically realistic given his experience and position on this matter, is Swami Sivasubramanian, VP for AI and data at AWS. Sivasubramanian suggests that as generative AI transitions to production systems in 2025, early adopters are reaping rewards like accelerated productivity and improved customer experiences.

So then, how do software engineers build this stuff?

How To Build AI Applications

"The lines between data analytics and machine learning are blurring, reshaping the way we access and interact with our data, create predictions with ML and build the next wave of foundation models," suggests Sivasubramanian. "Through a convergence of tools, data and AI, we are making it easier for ML scientists, data engineers and developers to access all they need to build generative AI applications in a single place, with principles like security and privacy built in from the start. This has allowed organizations of all sizes and across industries to unlock productivity improvements, delivering improved experiences and realizing substantial cost reductions by leveraging our services for generative AI."

But there are some practical considerations to raise here: when it comes to building and operating modern data strategies, at what level should architectural provisioning for data ingestion be part of the process, and are organizations ready to swallow what could be a relatively big spike in their data ingestion throughput, requirements and responsibilities?

"Building generative AI models requires a data ingestion infrastructure that can scale effectively with demand. Think of it like a highway that needs to accommodate fluctuating traffic patterns - if it's too narrow, bottlenecks will form," explained Sivasubramanian. "For organizations, this means planning data ingestion pipelines that not only handle current volumes but also adapt to future needs. With AWS, we've created flexible tools like AWS Glue for data preparation and integration, Amazon Kinesis for real-time streaming, Amazon S3 for durable storage to build data lakes and Amazon Redshift for data warehousing. These services scale as demand grows, so customers can ingest, manage and transform data efficiently and maintain control over costs and performance."
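To make that pipeline shape concrete, here is a minimal boto3 sketch of the dual path Sivasubramanian describes: pushing events onto an Amazon Kinesis stream for real-time consumers while landing a raw copy in an Amazon S3 data lake that AWS Glue and Amazon Redshift can later work from. The stream name, bucket name and event fields are illustrative placeholders, not configurations drawn from AWS or this interview.

# Minimal ingestion sketch: one event goes to Kinesis (real-time path)
# and to S3 (durable data-lake path). Names below are placeholders.
import json
import uuid
from datetime import datetime, timezone

import boto3

kinesis = boto3.client("kinesis")
s3 = boto3.client("s3")

def ingest_event(event: dict) -> None:
    payload = json.dumps(event).encode("utf-8")

    # Real-time path: push the record onto a Kinesis data stream.
    kinesis.put_record(
        StreamName="clickstream-events",
        Data=payload,
        PartitionKey=event.get("user_id", "anonymous"),
    )

    # Durable path: land the raw record in S3, date-partitioned so it can
    # later be cataloged with Glue and queried from Redshift.
    day = datetime.now(timezone.utc).strftime("%Y/%m/%d")
    s3.put_object(
        Bucket="example-data-lake",
        Key=f"raw/clickstream/{day}/{uuid.uuid4()}.json",
        Body=payload,
    )

if __name__ == "__main__":
    ingest_event({"user_id": "u-42", "action": "view", "item": "sku-123"})

The point of the sketch is less the specific services than the provisioning question raised above: the streaming and storage paths scale independently, so spikes in ingestion throughput can be absorbed without re-architecting.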
By provisioning architecture that's adaptable and cost-effective, businesses can support the growing data needs of AI-driven applications without getting bogged down in complexity.

Synthetic Data Is Getting Real

As we have been analyzing repeatedly this year, the use of synthetic data is also surfacing as an important element in the total data pipeline, especially in areas where sensitive personal information is involved, where data generation sources are hard to find (such as rare medical event data) or where data diversity is limited. It's not surprising to see an organization as expansive as AWS working in this space.

"This approach allows for safer experimentation, faster model training and more equitable AI development. By incorporating synthetic data, we can accelerate model creation, reduce costs and build more inclusive AI solutions." However, advises Sivasubramanian, synthetic data is not a panacea: fully relying on it to train a model can, in his view, potentially lead to "model loss", so we need to have the right mix. With all these factors to juggle, how does the AWS AI leader think we can push past prototypes and start to productize AI?

"Today, the tipping point for productionizing AI is about providing organizations with robust, scalable and secure tools that integrate into existing workflows while maintaining high standards of governance. Rapid advancements in AI technology are transforming AI from an experimental tool into a production-ready, business-critical capability," said Sivasubramanian. "At AWS, we focus on making AI not only more powerful but also practical, so it can be reliably deployed across industries. Our scalable infrastructure powered by Trainium and GPU instances, advanced model training powered by Amazon SageMaker and easy application development using Amazon Bedrock and generative AI assistants like Amazon Q allow organizations to integrate AI into real-world use cases, moving beyond prototypes into full-scale deployment."

The AWS team says that we're now seeing real-world AI being used in scenarios stretching from software development acceleration to supply chain optimization and customer support enhancement. The key enablers are said to be trust, cost-efficiency and enterprise-grade performance. Sivasubramanian doesn't think the reason for this tipping point is a single chatbot or model; it's a wider total play that encompasses a whole variety of factors. He suggests that generative AI application development requires developers to know how to pick the large language models they need and then have the competency to customize them with their own data, figure out the guardrails and then understand how to build AI agents.
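To make those steps a little more concrete, here is a minimal, hedged sketch of the first two of them using Amazon Bedrock's Converse API via boto3: choosing a foundation model and calling it, with an optional guardrail attached. The model ID, prompt and guardrail identifiers are placeholders for illustration only, not choices endorsed by AWS or this interview.

# Minimal Amazon Bedrock sketch: call a chosen foundation model through the
# Converse API. Model ID, prompt and guardrail identifiers are placeholders.
import boto3

bedrock = boto3.client("bedrock-runtime")

response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # the "pick your LLM" step
    messages=[
        {"role": "user", "content": [{"text": "Summarize yesterday's support tickets."}]}
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
    # Optional "figure out the guardrails" step; requires a guardrail already
    # created in the account (identifiers here are hypothetical):
    # guardrailConfig={"guardrailIdentifier": "gr-example", "guardrailVersion": "1"},
)

print(response["output"]["message"]["content"][0]["text"])

Customizing a model with an organization's own data and building agents sit on top of this same call pattern, typically via Bedrock's knowledge base and agent capabilities, which is where the production questions about governance and cost really begin.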
Amazon Q Developer

"Ultimately, generative AI needs to meet people where they are working. That's why we launched Amazon Q. With Amazon Q Developer, we provide a coding assistant which is used by Accenture, BT Group and National Australia Bank, amongst others. Additionally, with Amazon Q Business, business analysts can analyze data. For instance, Smartsheet can enable more than 3,000 workers to get insights from their data using Q without writing a single line of code," said Sivasubramanian. "Generative AI is set to reshape the developer experience and we're advancing quickly to make that transformation a reality."

Amazon Q Developer gives organizations an AI-powered assistant that supports all phases of software development. It boosts productivity by generating code, providing real-time recommendations and enabling efficient workflows like debugging, security scanning and resource optimization across the AWS ecosystem.

Software developers are able to customize Amazon Q with their organization's own codebase to get more relevant recommendations, which seems to be a pretty logical feature. That data is never shared with AI model providers or used to train the underlying models, and this is a key aspect of any enterprise-grade solution.

Also here we come into the realm of inline chat functions designed to enhance productivity in developer workflows, a capability that is also powered by generative AI. Developers need contextual support right where they work, whether in their integrated development environment, the AWS Management Console or directly within code repositories. So Amazon Q Developer's inline chat functions let developers ask questions, request code suggestions, troubleshoot issues or perform environment-related tasks within the same workflow and tools. This is said to reduce cognitive load and help developers make faster, more informed decisions.

AWS also recently launched an enhanced local IDE experience for AWS Lambda developers. That sounds fairly technical, so is there a more digestible explanation of this technology that we need to take on board?

"The enhanced local IDE experience for AWS Lambda is an exciting development because it makes it far easier and faster for developers to build, test and deploy their applications directly from their local environment," enthused Sivasubramanian. "This update brings AWS Lambda closer to the developer's workflow, letting them code and debug in a familiar setup without needing to constantly switch between their local machine and the cloud. This means developers can test their code locally, troubleshoot issues quickly and iterate faster, all of which lead to shorter development cycles and quicker time to market."

From a business perspective, AWS says that this improvement helps organizations be more agile and efficient, as it reduces downtime, accelerates feature development and ensures a smoother transition to production. Plus, since it works with the AWS ecosystem, teams can get additional scalability and security.
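As an illustration of the local test-and-iterate loop that quote describes, here is a minimal sketch of a Python Lambda handler exercised directly on a developer's machine before deployment; the event shape and handler logic are hypothetical, not taken from AWS or this article.

# handler.py - a minimal AWS Lambda handler that can be invoked locally
# while iterating, before it is deployed. The event shape is illustrative.
import json

def lambda_handler(event, context):
    # In production this might be triggered by API Gateway or S3; here it
    # simply totals the items on a hypothetical order event.
    order = event.get("order", {})
    total = sum(item.get("price", 0) * item.get("qty", 0) for item in order.get("items", []))
    return {
        "statusCode": 200,
        "body": json.dumps({"order_id": order.get("id"), "total": total}),
    }

if __name__ == "__main__":
    # Local test loop: call the handler with a sample event, no cloud
    # round-trip required while debugging.
    sample_event = {
        "order": {"id": "A-123", "items": [{"price": 9.5, "qty": 2}, {"price": 3.0, "qty": 1}]}
    }
    print(lambda_handler(sample_event, context=None))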
No More Developers?

There's a huge amount of smart tooling for programmers on offer here, so any suggestion that AI will replace coding and coders as a profession seems somewhat premature and unlikely. Sivasubramanian agrees with this sentiment and says that generative AI assistants, when done right (he again brings up Amazon Q as his favorite illustrative example), will come to take on tasks that developers don't want to do. So in more realistic terms, we'll see AI coding assistants handle things like software upgrades, security scans and writing tests. Moreover, with generative AI assistants we can perhaps look forward to enabling everyone (including business analysts) to be deep experts in data and not have to wait for their data engineers or business intelligence experts to get them answers.

"While some jobs will inevitably become obsolete, using AI tools will transform the way people work for the better. The shift we're seeing is about eliminating the undifferentiated heavy lifting so that developers spend less time on tedious, time-consuming work and have more time to focus on strategic and creative work. This shift is about augmenting developers' capabilities. AI will open up new opportunities for developers to drive innovation at a faster pace. This is about equipping developers with better tools to enhance their work," said Sivasubramanian. "Additionally, generative AI will democratize IT-related tasks because no-code and low-code tools will enable more tech-adjacent roles like sales, HR and marketing with the ability to create chatbots, analyze and summarize data and so on. We encourage organizations to upskill and reskill people to offset job loss as much as possible."

AI As A Functionality, Eventually

As we know then, AI discussions are ubiquitous. Every technology vendor wants as much share of voice as possible and there's a lot of potential AI washing out there, with what appears to be the entire tech trade now bolting on (sorry, they'd prefer us to say innovating) new AI services to their existing platforms and tools. This landgrab throws up obvious questions: surely not every database specialist, ERP vendor or CRM, HR and FinOps player can all have constructed new AI services at an advanced level that operate safely, without bias and within appropriate guardrails and governance, right?

Given the level of prototyping and productization pandemonium, it is perhaps worthwhile listening to a player as large as AWS to attempt to get a realistic view of where we really are right now with AI. Sivasubramanian has said that AI is already on its way to becoming embedded functionality and that the transition is happening faster than many realize.

"At AWS, we're focused on integrating AI into everyday workflows, where it can provide real, tangible value without requiring special attention or complex setups. We see this happening in applications like (and I know I've mentioned this tool already) Amazon Q Developer, where AI is already helping developers with tasks like code generation, debugging and even security scanning - all with minimal friction," concluded Sivasubramanian, in a private AI deep dive for press and analysts this month. "In other industries, AI is powering customer support chatbots and virtual assistants - helping organizations address inquiries instantly and at scale - or handling predictive fraud detection in finance, where it identifies and flags potential risks in real time. As AI continues to evolve, it will become more deeply embedded in business processes, powering everything from customer support to supply chain optimization."

In a few years, he believes, AI will be ubiquitous in the background of many applications, working quietly and seamlessly to improve user experiences, drive efficiency and help organizations make data-driven decisions, just as we now take for granted technologies like spellcheck that once seemed groundbreaking. Now that'll be smart.