AI Projects at the Edge: How To Plan for Success
www.informationweek.com
Przemysław Krokosz, Edge and Embedded Technology Solutions Expert, Mobica | January 27, 2025
(Image: Dragos Condrea via Alamy Stock)

Artificial intelligence continues to gain traction as one of the hottest areas in the technology sector. To meet AI's requirements for processing power, US vendors are racing to establish data centers worldwide. Google recently announced a $1 billion investment in cloud infrastructure in Thailand, followed almost immediately by Oracle's promise of $6.5 billion in Malaysia. Added to this are many similar ventures in Europe, all under the flag of AI development. It's hardly surprising, then, that people thinking about AI investment typically think of a cloud-based project. Yet we are also seeing significant growth in AI deployments at the edge, and there's good reason for this.

The Case for the Edge

Two of the most compelling reasons are the speed and security advantages that edge computing can offer. Edge computing's independence from connectivity provides low latency and makes it possible to create air gaps that cybercriminals cannot penetrate. Both are vitally important.

Speed is of the essence in many applications -- in hospitals, industrial sites, or transportation, for example. A delay in machine calculations in a critical care unit is literally a matter of life and death. The same applies to an autonomous vehicle detecting an imminent collision. There's no time for the technology to wait for a cellular connection.

Meanwhile, cybercrime increasingly poses a major threat throughout the world. The 2024 Cloud Security Report from Check Point Software and Cybersecurity Insiders, based on a survey of 800 cloud and cybersecurity professionals, found that 96% of respondents were concerned about their capacity to manage cloud security risks, with 39% describing themselves as very concerned. For sectors such as energy, utilities, and pharmaceuticals, security is a top priority for obvious reasons.

Another reason to consider an edge deployment for an AI implementation is cost. If your user base is likely to grow substantially, operational expenditure may increase significantly in a cloud model. It may do so even more if the AI solution also requires the regular transfer of large amounts of data, such as video imagery. In these cases, a cloud-based approach may not be financially sustainable in the long term.

Developments at the Edge

While the edge will never compete with the cloud in terms of sheer processing power, a new class of system-on-chip (SoC) processors designed for AI inference has emerged. Many vendors in this space have also designed chipsets dedicated to specific use cases, which allows further cost optimization.

Some examples of these new products are Intel's platforms supporting computer vision deployments at the edge, Qualcomm's improved chips for mobile and wearable devices, and Ambarella's advances in video and image processing. Meanwhile, Nvidia is producing versatile solutions for applications in autonomous vehicles, healthcare, industry, and more.

These are just some of the contributory factors in the growth of the global edge AI market. One market research company recently estimated that it will grow to $61.63 billion in 2028, from $24.48 billion in 2024.

Taking AI to the Edge

So how do you bring your AI project to the edge? The answer is: carefully. Perhaps counter-intuitively, an edge AI project often should begin in the cloud.
The initial development of edge AI inference usually requires a level of processing power that can only be found in a cloud environment. But once the development and training of the AI model is complete, the fully mature version can be deployed at the edge.

The next step is to consider how the data processing requirements can be kept to a minimum. The insatiable demand for computing power from the most capable AI models is widely known, but this applies at every scale of AI -- even to smaller models at the edge. At this point, a range of optimization techniques will be required to minimize both the processing power and the data inputs the solution needs.

This will involve reviewing the specific use case and the capabilities of the selected SoC, along with all the edge device components, such as cameras and sensors, that may be supplying the data. The process is likely to involve a sizeable degree of experimentation and adjustment to find how far decision-making accuracy can be reduced without unduly compromising the quality of the solution.

The AI model itself also needs to be iteratively optimized to enable inference at the edge. Achieving this will almost certainly involve several transformations, as the model goes through processes such as quantization and simplification (a brief illustrative sketch of the quantization step appears at the end of this article).

Businesses also need to address openness and extensibility to ensure that the system will be interoperable with third-party products. This will likely involve developing a dedicated API to support the integration of internal and external plugins, and creating a software development kit to ensure smooth deployments.

Finally, AI solutions are progressing at an unprecedented rate, with better models being released all the time. So there needs to be a reliable method for quickly updating the ML models at the core of an edge solution. This is where MLOps kicks in, alongside DevOps methodology, to provide the complete development pipeline. Tools and techniques developed for traditional DevOps, such as containerization, can be applied to maintain competitive advantage.

Given the speed of AI development, most organizations will soon be considering its adoption in one form or another. With edge technology advancing rapidly as well, businesses need to seriously consider the benefits it can provide before they invest.
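To make the quantization step described above a little more concrete, the sketch below applies post-training dynamic quantization to a small placeholder network. The framework (PyTorch), layer sizes, and file names are assumptions for illustration only, not part of the article; a real project would pair a step like this with accuracy testing against the runtime supported by the chosen SoC.

```python
# Minimal, illustrative sketch of post-training dynamic quantization in PyTorch.
# The model, layer sizes, and file names are hypothetical placeholders.
import os
import torch
import torch.nn as nn

# Stand-in for a trained model coming out of cloud-based development.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
model.eval()

# Convert the weights of Linear layers from 32-bit floats to 8-bit integers.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Compare on-disk size before and after, as a rough proxy for the footprint
# the edge device has to accommodate.
torch.save(model.state_dict(), "model_fp32.pt")
torch.save(quantized.state_dict(), "model_int8.pt")
print("fp32:", os.path.getsize("model_fp32.pt"), "bytes")
print("int8:", os.path.getsize("model_int8.pt"), "bytes")

# Sanity-check that the quantized model still produces output of the right shape.
with torch.no_grad():
    print(quantized(torch.randn(1, 512)).shape)  # torch.Size([1, 10])
```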