WWW.COMPUTERWEEKLY.COM
Red Hat acquires tech to lower the cost of machine learning
The acquisition of Neural Magic by Red Hat is being positioned as a way to democratise machine learning and reduce the need for GPUs

By Cliff Saran, Managing Editor
Published: 13 Nov 2024 14:55

Red Hat has announced its intention to acquire Neural Magic, the lead developer behind the open source vLLM project.

The acquisition is being positioned as a way for Red Hat and its parent IBM to lower the barrier to entry for organisations that want to run machine learning workloads without deploying servers equipped with graphics processing units (GPUs). Reliance on GPUs creates a barrier to entry, hindering the widespread adoption of artificial intelligence (AI) across industries and limiting its potential to revolutionise how we live and work.

The GitHub entry for vLLM describes the software as "a high-throughput and memory-efficient inference and serving engine for LLMs [large language models]".

In a blog post discussing the deal, Red Hat president and CEO Matt Hicks said Neural Magic had developed a way to run machine learning (ML) algorithms without expensive and often difficult-to-source GPU server hardware.

He said the founders of Neural Magic wanted to empower anyone, regardless of their resources, to harness the power of AI. "Their groundbreaking approach involved leveraging techniques like pruning and quantisation to optimise machine learning models, starting by allowing ML models to run efficiently on readily available CPUs without sacrificing performance," he wrote.

Hicks also spoke about the shift towards smaller, more specialised AI models, which can deliver exceptional performance with greater efficiency.
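Quantisation's effect on memory footprint, and hence on whether a model fits on a commodity CPU, is straightforward arithmetic: storing each weight as an 8-bit integer rather than a 32-bit float cuts weight storage fourfold. A minimal sketch (the 7-billion-parameter figure is illustrative, not taken from the article):

```python
# Approximate in-memory size of a model's weights at different precisions.
def model_size_gb(num_params: int, bits_per_weight: int) -> float:
    """Return the weight storage needed, in gigabytes (1 GB = 1e9 bytes)."""
    return num_params * bits_per_weight / 8 / 1e9

params = 7_000_000_000  # a 7-billion-parameter model, for illustration

fp32 = model_size_gb(params, 32)  # full precision
int8 = model_size_gb(params, 8)   # 8-bit quantised
int4 = model_size_gb(params, 4)   # 4-bit quantised

print(f"FP32: {fp32:.1f} GB, INT8: {int8:.1f} GB, INT4: {int4:.1f} GB")
# FP32: 28.0 GB, INT8: 7.0 GB, INT4: 3.5 GB
```

The FP32 weights alone exceed the RAM of many desktop machines; the INT8 version brings the same model within reach of readily available CPUs, which is the trade-off Hicks describes.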
"These models are not only more efficient to train and deploy, but they also offer significant advantages in terms of customisation and adaptability," he wrote.

Red Hat is pushing the idea of sparsification, which, according to Hicks, strategically removes unnecessary connections within a model. This approach, he said, reduces the size and computational requirements of the model without sacrificing accuracy or performance. Quantisation is then used to shrink the model further, enabling it to run on platforms with less memory.

"All of this translates to lower costs, faster inference and the ability to run AI workloads on a wider range of hardware," he added.

Red Hat's intention to acquire Neural Magic fits into parent company IBM's strategy to help enterprise customers use AI models.

In a recent interview with Computer Weekly, Kareem Yusuf, product management lead for IBM's software portfolio, said the supplier had identified a business opportunity to support customers that want to easily mash their data into large language models. This, he said, allows them to take advantage of LLMs in a way that enables protection and control of enterprise data.

IBM has developed a project called InstructLab that provides the tools to create and merge changes to LLMs without having to retrain the model from scratch. It is available in the open source community, along with IBM Granite, a foundation AI model for enterprise datasets.

Dario Gil, IBM's senior vice-president and director of research, said: "As our clients look to scale AI across their hybrid environments, virtualised, cloud-native LLMs built on open foundations will become the industry standard."
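In its simplest form, the sparsification Hicks describes is magnitude pruning: zero out every connection whose weight falls below a threshold, so the model can be stored and computed sparsely. A minimal sketch (the weight matrix and threshold here are illustrative, not from any Neural Magic model):

```python
# Magnitude pruning: drop weights whose absolute value falls below a threshold.
def prune(weights: list[list[float]], threshold: float) -> list[list[float]]:
    """Return a copy of the weight matrix with small weights zeroed out."""
    return [[w if abs(w) >= threshold else 0.0 for w in row] for row in weights]

def sparsity(weights: list[list[float]]) -> float:
    """Fraction of weights that are exactly zero."""
    flat = [w for row in weights for w in row]
    return flat.count(0.0) / len(flat)

weights = [
    [0.80, -0.02, 0.31],
    [0.01, 0.54, -0.03],
    [-0.67, 0.04, 0.92],
]

pruned = prune(weights, threshold=0.1)
print(f"Sparsity after pruning: {sparsity(pruned):.0%}")
# Sparsity after pruning: 44%
```

Sparse storage formats then keep only the surviving weights, which is where the size and compute savings come from; quantisation of the remaining values compounds the reduction.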
"Red Hat's leadership in open source, combined with the choice of efficient, open source models like IBM Granite and Neural Magic's offerings for scaling AI across platforms, empowers businesses with the control and flexibility they need to deploy AI across the enterprise," Gil added.

Read more about IBM's AI strategy

- IBM's latest Z mainframe offers lessons in building AI systems: Studying the engineering behind IBM's mainframe architecture could help enterprises build higher reliability into the GPU clusters used to run AI applications.
- IBM throws its Red Hat into open source AI ring with RHEL AI: IBM and Red Hat open source their first LLMs, but IT experts say RHEL AI is more likely to stand out in the ways it links AI to hybrid cloud infrastructure.