-
WWW.COMPUTERWEEKLY.COM
Public cloud: Data sovereignty and data security in the UK

The UK government's decision to designate datacentres as critical national infrastructure (CNI) in September 2024 signalled its ambition to build a digital economy that is secure and globally competitive. But behind the headlines about protecting against cyber crime and IT blackouts lies a more complicated reality: a sector grappling with policy uncertainty, reliance on foreign cloud giants and a data sovereignty agenda that looks increasingly compromised.

In a blog post, Forrester principal analyst Tracy Woo wrote: "New sovereignty requirements such as SecNumCloud, Cloud de Confiance from France, and the Cloud Computing Compliance Controls Catalog (C5) from Germany, along with the push to keep data in-country, have created a broader push for private and sovereign clouds."

But the promise of protected infrastructure rings hollow when hyperscalers openly admit they cannot guarantee that UK government data stored in cloud services such as Microsoft 365 and Azure will remain within national borders. Woo points out that countries in the European Union (EU) and Asia-Pacific (APAC) have been attempting to leverage non-US-based cloud providers more heavily, create sovereign clouds, or leave workloads on-premise.

In the UK, regulatory scrutiny is exposing the fragile state of the country's digital independence. Looking at the UK's approach to data sovereignty, law firm Kennedys describes the Data Use and Access (DUA) Bill, which was published in October 2024, as a more flexible, risk-based approach for international data transfers. Kennedys notes that the new test requires that the data protection standards in the destination jurisdiction must not be "materially lower" than those in the UK. According to Kennedys, this standard is less rigid than the EU's "essential equivalence" requirement, but it raises questions about how "materially lower" will be interpreted in practice.

Understandably, given the government's reliance on cloud-based productivity tools, concerns about compliance with UK data protection laws have intensified. The Competition and Markets Authority (CMA) is now investigating cloud market practices that could lock customers into foreign providers. A provisional report is expected in early 2025, setting the stage for potential regulatory reforms aimed at boosting data sovereignty and curbing monopolistic practices.

This is not before time for Mark Boost, CEO of Civo, a UK-based cloud hosting specialist. "The inability to ensure data remains within UK borders underscores the risks of depending on hyperscalers," warns Boost. "If we keep outsourcing critical data infrastructure, we risk losing more than just technical control; we lose national independence."

The CMA's review could reshape the country's digital future, potentially mandating greater transparency and requiring UK data storage guarantees from global cloud providers. This is something Boost has been talking about for some time. "Transparency isn't just about where data is stored; it's about how datacentres are powered, maintained and secured," he says. His argument highlights the essential connection between data sovereignty and operational clarity, urging providers to adopt clearer accountability measures.
Despite these challenges around transparency, the UK datacentre industry has seen promising signs, particularly in regional investment. The government's recent announcement of a £250m datacentre project in Salford showcases how local government cooperation and targeted investment can drive growth. But such projects remain exceptions rather than the rule.

Luisa Cardani, head of datacentres at TechUK and author of the report Foundations for the future: How datacentres can supercharge UK economic growth, warns that without a national policy statement (NPS), the datacentre sector risks becoming fragmented. Local planning authorities lack the expertise and resources to approve projects efficiently, creating bottlenecks that could delay critical infrastructure developments for years.

"The industry wants to work with local people and authorities, but clear national planning guidance is missing," says Cardani. "Without a coherent strategy, we're stuck in a cycle of fragmented decisions and regulatory inertia."

The proposed inclusion of datacentres under the nationally significant infrastructure projects (NSIP) regime could streamline the approval process, ensuring faster decision-making. For the moment, however, this remains more of an aspiration. In reality, investment will remain stalled until the UK develops a coherent, national approach that balances public and private interests while streamlining the project approval process.

Data sovereignty and security requirements are fundamental to this, and to a large extent it will be market forces that determine the shape and size of the UK's datacentre industry. On this front, Alvin Nguyen, senior analyst at Forrester, says businesses must recognise the different risk profiles posed by local and hyperscaler-operated datacentres.

"It should be expected that hyperscalers will have more bandwidth, more scalability and more redundancy than their more localised counterparts, but having datacentres classified as critical to the UK's infrastructure may help with mitigating some, but not all, security risks," he says.

Nguyen also questions whether data sovereignty debates might be over-simplified in some cases. "With data security, it comes down to what the organisation's requirements are to determine whether or not to go to a hyperscaler or a local datacentre," he says. "With sovereignty, that is a bit different. If there are components to the sovereignty laws that restrict access to or use of data outside of the local datacentres, hyperscalers will need to ensure that guardrails are in place."

Nguyen's comments underscore the complexity of managing sensitive data across hybrid environments. Rather than focusing solely on whether to choose a local or global provider, businesses should consider managing workloads across hybrid cloud environments more strategically. "Many organisations will find a mix of cloud and datacentres makes the most sense ... the risk profile of each is different, and that blend of risk when combining cloud and datacentres can be optimised for them," he says.

The security risks associated with data sovereignty are multifaceted, extending far beyond simple data storage concerns.
For businesses in regulated sectors, particularly financial services, the stakes are immense. Jon Cosson, head of IT and chief information security officer at wealth management firm JM Finn, underscores the potential dangers when businesses assume that using a large cloud provider automatically guarantees security. "It's absolutely imperative you know where your data is and how to secure it," he warns. "You would not believe how many businesses still just rely on somebody else."

The issue is compounded by the jurisdictional complexity of global cloud services. When sensitive data crosses borders, it may fall under multiple regulatory regimes, raising questions about legal access and government overreach. This concern has been amplified by legislation such as the US Cloud Act. In 2019, the then home secretary, Priti Patel, signed a US Cloud Act agreement covering the United Kingdom of Great Britain and Northern Ireland, in which the US and UK governments agreed to provide timely access to electronic data for authorised law enforcement purposes. The Cloud Act could compel US-based hyperscalers to provide foreign-stored data to US authorities, bypassing local laws.

"I want to know exactly where my data goes, how it's encrypted and how quickly I can get out if needed," says Cosson, reflecting a broader industry concern that opaque data paths and limited contractual assurances can expose businesses to significant compliance risks. "We use the cloud when we have to, but still run key systems on-premise for control," he adds. This approach is typical of companies handling sensitive financial data: there is a lack of trust, with organisations not prepared to take promises of secure cloud storage at face value.

While Cosson acknowledges that cloud adoption is inevitable for some services, such as Microsoft 365, he underscores the enduring role of on-premise infrastructure for businesses that require absolute control over sensitive data. This, of course, raises an additional problem: how to manage hybrid data environments securely and efficiently. According to Cosson, companies like Nutanix play a critical role here, enabling organisations to manage workloads across cloud and on-premise environments while maintaining data control. Nutanix's infrastructure services are designed to address sovereignty concerns, he says, by ensuring businesses have clear data management policies and remain compliant with local regulations.

"The next five years will be decisive," says Civo's Boost. "If transparency becomes a legal requirement, we'll see businesses demanding more from providers, not just about where data resides, but also how infrastructure is managed and powered."

TechUK's Cardani believes public-private partnerships will play a crucial role here. "We need coordinated efforts between government, industry and local authorities to build a resilient datacentre ecosystem," she says. "This means shared responsibility, clearer policy frameworks, and incentives for both hyperscalers and UK-based providers."

Boost and Cardani agree that the balance of power between hyperscalers and local operators may shift, particularly if future policies mandate data localisation or prohibit cross-border data transfers without explicit guarantees.
Sovereignty-by-design, where infrastructure is built to meet local compliance from the start, could become the new standard. Until that point, organisations need to work out how they can meet existing standards. Cardani argues that adherence to standards must be supported by national policies that enable transparent reporting and clear accountability structures. In practice, this means enforcing mandatory audits, data residency certifications and security benchmarks tailored to UK-specific legal frameworks. Without these measures, businesses risk falling into compliance gaps that could expose them to data breaches, fines and legal disputes.

Frameworks such as ISO 27001 for information security management, the General Data Protection Regulation (GDPR) for data privacy and the Payment Card Industry Data Security Standard (PCI DSS) for payment security set clear operational expectations. Yet these standards are only part of the equation, as evolving regulations increasingly emphasise data sovereignty and security-by-design. Ensuring that datacentres comply with such frameworks while offering sovereignty guarantees has become a pressing challenge. Hyperscalers operating across multiple jurisdictions complicate audits and compliance checks due to varying legal obligations and data transfer rules. The CMA's investigation is therefore urgently needed, if only to provide some clarity around what, for most buyers, has become a confusing subject.

For IT leaders, the critical takeaway is that responsibility cannot be outsourced. Security, compliance and sovereignty must be actively managed through risk assessments, compliance audits and multi-supplier strategies. And as the UK's digital infrastructure evolves, only businesses that stay ahead of regulation and demand transparency from their providers will be able to navigate the uncertainties.

On that score, the UK's datacentre industry stands at a crossroads, but with policy clarity, local investment and industry transparency, it has the potential to become a global digital leader. It's about trust and everyone playing by the same, fair rules, but from a UK perspective it is also about protecting that most valuable national asset: data. As JM Finn's Cosson puts it: "Data sovereignty is not a buzzword, it's survival."

Read more stories about data regulations

How to build an effective third-party risk assessment framework: Don't overlook the threats associated with connecting vendors and partners to internal systems. Do your due diligence and use third-party risk assessments to prevent supply chain attacks.

Security in the public cloud explained: a guide for IT and security admins: In this guide, IT security and industry experts share their top recommendations for protecting public cloud deployments.
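The data residency audits mentioned above can start very small. As a purely illustrative sketch (assuming data held in AWS S3 and the boto3 SDK; the approved-regions list is an example, not a recommendation), a compliance team might periodically verify that every bucket it owns sits in a UK region:

```python
# Illustrative data-residency spot check: flag any S3 bucket created outside
# an approved list of UK regions. Assumes AWS credentials are already
# configured and boto3 is installed; the approved-regions set is an example.
import boto3

APPROVED_REGIONS = {"eu-west-2"}  # London; extend with other approved regions as needed

def audit_bucket_residency():
    s3 = boto3.client("s3")
    findings = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        # LocationConstraint is None for us-east-1, otherwise the region name.
        region = s3.get_bucket_location(Bucket=name)["LocationConstraint"] or "us-east-1"
        if region not in APPROVED_REGIONS:
            findings.append((name, region))
    return findings

if __name__ == "__main__":
    for name, region in audit_bucket_residency():
        print(f"NON-COMPLIANT: bucket '{name}' resides in {region}")
```

A check like this only covers storage location, not processing or support access, so it complements rather than replaces the contractual and audit measures discussed above.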
-
WWW.FORBES.COM
Google Reveals 2025's Android Upgrade: Does This Beat Your iPhone?

Android is catching up fast (Getty Images)

Despite some recent headlines, iPhones remain safer than Androids, given Apple's tighter restrictions and control of its ecosystem. But Google is catching up, and Samsung is moving even faster to narrow the gap. Now, a new Android upgrade for 2025 could make Androids more like iPhones than ever before, but there's a twist.

The upgrade news comes by way of Android Authority, which has delved into the new Android 16 Beta 1 to discover more detail on the new Advanced Protection Mode feature that was first discussed last year. This fully disables app installs from outside Play Store or other pre-installed stores, and it disables 2G.

The Advanced Protection Mode is an Android settings mode that works alongside Google's Advanced Protection Program (APP). This account setting safeguards "users with high visibility and sensitive information from targeted online attacks, [with] new protections automatically added to defend against today's wide range of threats."

APP mandates the use of a passkey or security key to log into your Google account, flags dangerous downloads, protects web browsing and locks down access to your account. In short, it makes Android safer, kind of more like iPhone.

Apple has its own uber security mode, which it calls Lockdown Mode. But while Apple warns that this is "an optional, extreme protection that's designed for the very few individuals who might be personally targeted by some of the most sophisticated digital threats," Google is not quite so restrictive.

In contrast with Apple's "most people are never targeted by attacks of this nature, and so should not enable Lockdown Mode," Google says people whose accounts contain particularly valuable files or sensitive information should consider Advanced Protection, and it strongly recommends that business executives enable the mode, as well as journalists, activists and people involved in elections. Notice the difference?

Almost no one reading this needs Lockdown Mode; it really is for the sub-1%, and Apple warns anyone tempted that "when Lockdown Mode is enabled, your device won't function like it usually would. To reduce the attack surface that could potentially be exploited by highly targeted mercenary spyware, certain apps, websites and features will be strictly limited for security, and some experiences may not be available at all." This will make your iPhone painful to use.

I have warned before that despite headlines suggesting Lockdown Mode is simply an extra security setting, it's not. Unless you're in one of those highly sensitive roles or have reason to fear nation-state level attacks, you don't need this. It will remove attachments, restrict web browsing, and even block shared photo albums.

Google's APP has more mass appeal, especially to business users, because the risks on Android are greater than the risks on iPhone. This new mode is something of a leveler. Passkey access, restricting account data, and blocking sideloading are all sensible measures. And they're all areas where iPhones are safer than Androids.

Android 15 is an exceptional update when it comes to security and privacy, and it looks like Android 16 will bring more of the same. That's very welcome. The other theme here is the appeal to enterprises, to assure them that Android can be locked down. Cue Samsung and its "25 reasons to switch" appeal to iPhone users.
The enhanced levels of corporate control and device lockdowns are central to this. And so while my advice to almost all iPhone users is not to enable Lockdown Mode, my advice to Android users is likely to be that they enable Advanced Protection Mode as and when it's available, subject to the final small print, of course.
-
WWW.FORBES.COM
Red Hat: A Solution Means Eminently Adaptable Software

Foreign tourists and travelers learn yoga on the beach in Arambol, Goa, India (Photo by Ami Vitale/Getty Images)

Disks are dead. To be clear, this comment is in no way related to the still-functioning hard disks inside your laptop (or desktop) machine, the industrious blade server disks located in datacenters around the globe that provide us all with the power of always-on computing and the cloud, or indeed the USB drives and hard disk extensions that most of us use for backup and additional memory.

But disks as we once knew them are a thing of the past if we consider how we used to install Windows 98 from over 30 pre-formatted 3.5-inch floppy disks, with Microsoft Office also requiring almost 20 disks. As we moved through the age of the CD-ROM into the nineties and the noughties, we started to enjoy a more ubiquitous connection to the world wide web (as the Internet was once known) and users started to appreciate the need for continuously composed patches, updates and downloads.

What that brief history of desktop computing allows us to realize today is that data engineering has moved beyond its traditional roots, i.e. it's no longer just about building static pipelines to move and transform data; it's about designing adaptable systems that thrive in a complex world. In other words, when we talk about software "solutions" - that oft-hackneyed and over-used term that the IT industry loves to use - what we really mean is software that can change, morph, grow, extend and adapt.

The Age Of Adaptability

Today we know that workloads evolve, technologies shift and data sprawls across hybrid and multi-cloud environments. "In this context, while scale is still vital, adaptability has overtaken it as the key driver of success for modern data systems," explained Erica Langhi, associate principal solution architect at Red Hat. "For data engineers, this means rethinking how pipelines are made. They must no longer function simply as static workflows, but become real-time, modular and productized systems designed to adapt and consistently deliver value to their consumers."

Langhi and team of course base their opinions on the open source DNA that beats at the heart of Red Hat, which has now (very arguably, given the widespread embrace of open systems architecture by previously proprietary-only protagonists) proven itself to be among the more manageably malleable ways to create enterprise software applications for the post-Covid "things could disrupt at any moment" world we now live in.

As such, she reminds us that open source technologies and hybrid cloud architectures provide the essential building blocks for the evolved systems that we need today. But without thoughtful data engineering practices that prioritize usability, collaboration, lifecycle management and adaptability, even the best tools risk becoming just another layer of complexity. What this truth leads us to is a need to think about our enterprise data and its full-flowing pipeline as a product that we use in a more agile way.

Data Pipelines As Products

Traditional data pipelines were designed for linear workflows: they ingested, processed and delivered data outputs.
While sufficient for the static environments of the past, this model falls short in addressing the demands of modern, dynamic use cases. "Treating data pipelines as products flips this approach on its head," said Langhi. "Productized pipelines are built as modular components, each handling specific functions like data ingestion or enrichment. These components can be updated, replaced or scaled independently, making the pipeline adaptable to changing requirements."

For instance, she explains, when a new data format or source is introduced, only the relevant module needs adjustment, minimising disruption and downtime. Versioning each iteration of the pipeline ensures downstream consumers, like AI models or analytics dashboards, can trace data lineage and access accurate datasets. This supports auditing, compliance and confidence in the data. Strong governance practices further enhance these pipelines by ensuring data quality and consistency. If data is oil, metadata is gold: a vital resource for ensuring traceability and unlocking actionable insights.

Let's consider what this could look like in a healthcare context. "A productized pipeline might automate the ingestion and anonymisation of patient imaging data from edge devices. It could enrich the data in real time, add metadata for regulatory compliance and make information immediately accessible to researchers or AI diagnostic models. Unlike traditional pipelines, the system would evolve to handle new data sources, scale with growing patient data and integrate emerging tools for advanced analysis," clarified Red Hat's Langhi.

Breaking Down Silos

For data pipelines to function as adaptable products, the Red Hat team are adamant that they must break free from silos. Data locked within department-specific systems or proprietary platforms leads to rigid workflows. This makes it nearly impossible to create pipelines that deliver value across an organisation. Open source is widely agreed to help with this. Pipelines built with open source can harness community expertise to provide a shared, reusable foundation. This empowers users to design pipelines that are portable, interoperable and adaptable to new tools and evolving business needs.

Open source data pipelines provide the flexibility needed to bridge hybrid cloud environments by combining data from on-premise systems and private and public cloud platforms into unified workflows, without requiring major re-architecture. Take Kafka: an open source data streaming platform, it can accelerate data delivery, enable real-time insights, support regulatory compliance and underpin AI use cases, regardless of the data's origin. "Kafka benefits from continuous growth and optimization through open collaboration with innovators," said Langhi. "As workloads evolve and expand, combining technologies like Kafka and Kubernetes enables the development of scalable, reliable and highly available data pipelines, essential for machine learning applications. New tools can be added, workloads can be shifted across environments and processes can evolve with minimal disruption."

AI Needs Quality Data

One of the most transformative applications of modern data engineering is in artificial intelligence. AI's value lies in its ability to turn data into insights. But this is only possible if the data itself is prepared to meet AI models' demands. Raw data, in its unstructured and inconsistent form, holds little value until it is transformed into a usable state.
This is where data engineering plays a key role, bridging the gap between raw inputs and the refined, enriched datasets that fuel AI. As AI adoption grows, data engineers are tasked with managing the ever-increasing volume, variety and velocity of data. It's no longer enough to simply collect and store information; data must be accessible, trustworthy and ready to use in real time. "The evolving role of data engineers reflects this complexity. They now focus on building pipelines that deliver high-quality data for fine-tuning models, handle real-time streaming from edge devices, and adapt seamlessly to new tools and requirements," concluded Langhi.

Talking about the future of data engineering, Langhi feels strongly that we need to realize how important it is to cultivate systems that thrive in uncertainty and deliver real, ongoing value. As the data landscape grows more complex, the true measure of success will be the ability to adapt, iterate and deliver quality outputs.
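To ground the "pipelines as products" idea, here is a minimal sketch of a single, independently versioned enrichment stage. It assumes the confluent-kafka Python client and a broker on localhost; the topic names, consumer group and enrichment fields are hypothetical, chosen only to show how one module can be replaced or scaled without touching ingestion or downstream consumers.

```python
# Minimal sketch of one modular pipeline stage: consume raw events,
# enrich them with metadata, and republish to a versioned output topic.
# Assumes the confluent-kafka client; topic names are illustrative.
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "enrichment-stage-v2",      # version the stage, not the whole pipeline
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})

consumer.subscribe(["patient-imaging-raw"])   # hypothetical upstream topic

def enrich(event: dict) -> dict:
    """Attach lineage and compliance metadata; swap this out without touching ingestion."""
    event["pipeline_stage"] = "enrichment-v2"
    event["anonymised"] = True
    return event

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        producer.produce("patient-imaging-enriched-v2", json.dumps(enrich(event)).encode())
        producer.poll(0)                      # serve delivery callbacks
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
    producer.flush()
```

Because the stage owns its own consumer group and output topic, a v3 of the enrichment logic can run side by side with v2 while downstream consumers migrate at their own pace, which is the practical meaning of treating each pipeline component as a versioned product.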
-
Developer creates a subpixel version of Snake that requires a microscope to play

In brief: Is Snake becoming the new Doom? A few weeks after the classic mobile title was shrunk down to a 56-byte QR code, someone has developed what's likely to be the world's smallest version of the game. It's so tiny that it requires a microscope to play correctly.

The microscopic version of Snake is the work of software developer Patrick Gillespie, who demonstrated the amazing feat on his YouTube channel. Gillespie explains in the video that he created a JavaScript Snake game 15 years ago. His goal was to shrink it down to the point where the game uses the individual subpixels of a monitor.

Subpixels are the smaller components that make up a single pixel on a digital display. They typically come in red, green, and blue lights, and their brightness is adjusted to create the different colors that we see while looking at a display.

Gillespie used his iMac for the browser-based project as its pixel geometry is an RGB stripe formation. The project didn't get off to a smooth start, as he struggled to make the game show just one color in each subpixel. The green subpixel was showing some red and blue, requiring him to switch to an LED color space with a wider gamut.

For those who don't own a microscope but still want to try this subpixel version of Snake, it can be played in a web browser at maximum zoom, with the Windows Magnifier function also turned up to maximum. This won't be as effective as using a microscope, of course.

You can check out Gillespie's Snake game on his personal website. You can also take a look at the code over on GitHub to discover more about how it was put together.

This is the second unconventional version of Snake we've seen this month. A couple of weeks ago, developer donno2048 managed to squeeze the game down to just 56 bytes, making it small enough to be encoded into a single QR code. You can check out the demo of that project right here.
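To make the subpixel trick concrete, here is a small illustrative sketch (not Gillespie's code; it assumes an RGB-stripe panel and uses the Pillow imaging library) that treats each red, green and blue subpixel as an independent on/off cell, effectively tripling the horizontal resolution of a one-row game board:

```python
# Illustrative only: render a row of game cells at subpixel resolution on an
# RGB-stripe display by packing three cells into each physical pixel.
# Not Gillespie's implementation; requires the Pillow library.
from PIL import Image

def render_subpixel_row(cells):
    """cells: sequence of 0/1 flags, one per subpixel (3 subpixels per pixel)."""
    width = (len(cells) + 2) // 3
    img = Image.new("RGB", (width, 1), (0, 0, 0))
    for x in range(width):
        chunk = list(cells[x * 3:x * 3 + 3])
        chunk += [0] * (3 - len(chunk))                   # pad the last pixel
        # Lighting only one channel lights only that physical subpixel stripe.
        img.putpixel((x, 0), tuple(255 * c for c in chunk))
    return img

# A 9-subpixel board with a 3-subpixel "snake" in the middle physical pixel.
frame = render_subpixel_row([0, 0, 0, 1, 1, 1, 0, 0, 0])
frame.resize((300, 100), Image.NEAREST).save("subpixel_row_preview.png")
```

On a real RGB-stripe panel the middle pixel here would appear white while its neighbours stay dark, and a single lit channel would show up under a microscope as one lit stripe, which is the effect the game exploits.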
-
WWW.WSJ.COM
Chip Stocks Tumble After China's DeepSeek AI Models Raise Doubts Over U.S. Tech Dominance

Global chip stocks slumped Monday after DeepSeek revealed it had developed AI models that nearly matched American rivals despite using inferior chips.
-
WWW.NEWSCIENTIST.COM
Covid smell loss eased by injecting blood cells into the nose

Transmission electron micrograph of SARS-CoV-2 virus particles (gold) within a nasal cell (BSIP SA/Alamy)

People who had lost their sense of smell after catching covid-19 partly regained it following the injection of blood cells called platelets into their noses, which could help to improve their quality of life. Since the beginning of the pandemic, a loss or change to your sense of smell or taste has been considered a common covid-19 symptom. The SARS-CoV-2 virus enters cells in the nose, causing inflammation that can damage neurons…
-
WWW.TECHNOLOGYREVIEW.COM
Useful quantum computing is inevitable, and increasingly imminent

On January 8, Nvidia CEO Jensen Huang jolted the stock market by saying that practical quantum computing is still 15 to 30 years away, at the same time suggesting those computers will need Nvidia GPUs in order to implement the necessary error correction. However, history shows that brilliant people are not immune to making mistakes. Huang's predictions miss the mark, both on the timeline for useful quantum computing and on the role his company's technology will play in that future.

I've been closely following developments in quantum computing as an investor, and it's clear to me that it is rapidly converging on utility. Last year, Google's Willow device demonstrated that there is a promising pathway to scaling up to bigger and bigger computers. It showed that errors can be reduced exponentially as the number of quantum bits, or qubits, increases. It also ran a benchmark test in under five minutes that would take one of today's fastest supercomputers 10 septillion years. While too small to be commercially useful with known algorithms, Willow shows that quantum supremacy (executing a task that is effectively impossible for any classical computer to handle in a reasonable amount of time) and fault tolerance (correcting errors faster than they are made) are achievable.

For example, PsiQuantum, a startup my company is invested in, is set to break ground on two quantum computers that will enter commercial service before the end of this decade. The plan is for each one to be 10 thousand times the size of Willow, big enough to tackle important questions about materials, drugs, and the quantum aspects of nature. These computers will not use GPUs to implement error correction. Rather, they will have custom hardware, operating at speeds that would be impossible with Nvidia hardware.

At the same time, quantum algorithms are improving far faster than hardware. A recent collaboration between the pharmaceutical giant Boehringer Ingelheim and PsiQuantum demonstrated a more than 200x improvement in algorithms to simulate important drugs and materials. Phasecraft, another company we have invested in, has improved the simulation performance for a wide variety of crystal materials and has published a quantum-enhanced version of a widely used algorithm.

Advances like these lead me to believe that useful quantum computing is inevitable and increasingly imminent. And that's good news, because the hope is that these machines will be able to perform calculations that no amount of AI or classical computation could ever achieve.

We should care about the prospect of useful quantum computers because today we don't really know how to do chemistry. We lack knowledge about the mechanisms of action for many of our most important drugs. The catalysts that drive our industries are generally poorly understood, require expensive exotic materials, or both. Despite appearances, we have significant gaps in our agency over the physical world; our achievements belie the fact that we are, in many ways, stumbling around in the dark.

Nature operates on the principles of quantum mechanics. Our classical computational methods fail to accurately capture the quantum nature of reality, even though much of our high-performance computing resources are dedicated to this pursuit.
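The exponential error suppression mentioned above in connection with Willow can be illustrated with a few lines of arithmetic. This is a rough sketch only, assuming the textbook surface-code scaling in which the logical error rate falls by a factor Λ each time the code distance grows by two; the prefactor and Λ below are placeholders, not Google's published figures.

```python
# Illustrative only: exponential suppression of logical errors in a distance-d
# surface code, assuming error rate ~ A / Lambda**((d + 1) / 2).
# A and LAMBDA are placeholder values, not measured results.
A = 0.1          # prefactor (illustrative)
LAMBDA = 2.0     # suppression factor per increase of d by 2 (illustrative)

for d in (3, 5, 7, 9, 11):
    physical_qubits = 2 * d * d - 1          # data + measure qubits in one surface-code patch
    logical_error = A / LAMBDA ** ((d + 1) / 2)
    print(f"d={d:2d}  ~{physical_qubits:3d} physical qubits  logical error ≈ {logical_error:.5f}")
```

In this illustrative regime the qubit overhead grows only quadratically with code distance while the logical error rate falls exponentially, which is the sense in which adding qubits suppresses errors rather than compounding them.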
Despite all the intellectual and financial capital expended, we still don't understand why the painkiller acetaminophen works, how type-II superconductors function, or why a simple crystal of iron and nitrogen can produce a magnet with such incredible field strength. We search for compounds in Amazonian tree bark to cure cancer and other maladies, manually rummaging through a pitifully small subset of a design space encompassing 10^60 small molecules. It's more than a little embarrassing.

We do, however, have some tools to work with. In industry, density functional theory (DFT) is the workhorse of computational chemistry and materials modeling, widely used to investigate the electronic structure of many-body systems, such as atoms, molecules, and solids. When DFT is applied to systems where electron-electron correlations are weak, it produces reasonable results. But it fails entirely on a broad class of interesting problems.

Take, for example, the buzz in the summer of 2023 around the room-temperature superconductor LK-99. Many accomplished chemists turned to DFT to try to characterize the material and determine whether it was, indeed, a superconductor. Results were, to put it politely, mixed, so we abandoned our best computational methods, returning to mortar and pestle to try to make some of the stuff. Sadly, although LK-99 might have many novel characteristics, a room-temperature superconductor it isn't. That's unfortunate, as such a material could revolutionize energy generation, transmission, and storage, not to mention magnetic confinement for fusion reactors, particle accelerators, and more.

AI will certainly help with our understanding of materials, but it is no panacea. New AI techniques have emerged in the last few years, with some promising results. DeepMind's Graph Networks for Materials Exploration (GNoME), for example, found 380,000 new potentially stable materials. The fundamental issue is that an AI model is only as good as the data it's trained on. Training an LLM on the entire internet corpus, for instance, can yield a model that has a reasonable grasp of most human culture and can process language effectively. But if DFT fails for any non-trivially correlated quantum systems, how useful can a DFT-derived training set really be?

We could also turn to synthesis and experimentation to create training data, but the number of physical samples we can realistically produce is minuscule relative to the vast design space, leaving a great deal of potential untapped. Only once we have reliable quantum simulations to produce sufficiently accurate training data will we be able to create AI models that answer quantum questions on classical hardware. And that means we need quantum computers.

They afford us the opportunity to shift from a world of discovery to a world of design. Today's iterative process of guessing, synthesizing, and testing materials is comically inadequate. In a few tantalizing cases, we have stumbled on materials, like superconductors, with near-magical properties. How many more might these new tools reveal in the coming years? We will eventually have machines with millions of qubits that, when used to simulate crystalline materials, open up a vast new design space. It will be like waking up one day and finding a million new elements with fascinating properties on the periodic table.

Of course, building a million-qubit quantum computer is not for the faint of heart.
Such machines will be the size of supercomputers, and require large amounts of capital, cryoplant, electricity, concrete, and steel. They also require silicon photonics components that perform well beyond anything in industry, error correction hardware that runs fast enough to chase photons, and single-photon detectors with unprecedented sensitivity. But after years of research and development, and more than a billion dollars of investment, the challenge is now moving from science and engineering to construction.

It is impossible to fully predict how quantum computing will affect our world, but a thought exercise might offer a mental model of some of the possibilities. Imagine our world without metal. We could have wooden houses built with stone tools, agriculture, wooden plows, movable type, printing, poetry, and even thoughtfully edited science periodicals. But we would have no inkling of phenomena like electricity or electromagnetism: no motors, generators, radio, MRI machines, silicon, or AI. We wouldn't miss them, as we'd be oblivious to their existence.

Today, we are living in a world without quantum materials, oblivious to the unrealized potential and abundance that lie just out of sight. With large-scale quantum computers on the horizon and advancements in quantum algorithms, we are poised to shift from discovery to design, entering an era of unprecedented dynamism in chemistry, materials science, and medicine. It will be a new age of mastery over the physical world.

Peter Barrett is a general partner at Playground Global, which invests in early-stage deep-tech companies including several in quantum computing, quantum algorithms, and quantum sensing: PsiQuantum, Phasecraft, NVision, and Ideon.
-
WWW.BUSINESSINSIDER.COM
Airlines would buy the A380 if Airbus gives it a makeover, says Emirates boss

The boss of Emirates thinks Airbus should make a new version of the double-decker Airbus A380.
Tim Clark suggested new lighter materials and more fuel-efficient engines could make it more viable.
"If we were to put $20 billion on the table for Airbus, they'd probably build it for us," he told BI.

A revamped version of the Airbus A380 could get orders from several airlines, the president of Emirates told Business Insider. Asked if he'd like Airbus to resume production of the superjumbo, Tim Clark replied, "Well, they know we do. I've given them the designs." The "compelling nature" of a four-engine plane remains "quite clear to many, many people," he said.

Tim Clark of Emirates thinks the A380 remains a "compelling" aircraft for some airlines. (BRENDAN SMIALOWSKI/AFP via Getty Images)

Emirates is by far the largest operator of the double-decker plane, with a fleet of 118. Singapore Airlines is next with just 13. Airbus ended production of the A380 in 2021, 18 years after it began. The four-engined plane received 251 orders from 14 customers, with many airlines wary of its high operating costs.

However, Clark suggested that a modernized version of the A380 could be up to 25% more fuel efficient. He pointed to using lighter and more aerodynamic materials, as well as new engines with UltraFan technology being developed by Rolls-Royce. Clark said the Airbus A380 is "probably the most profitable asset we've got," while a more fuel-efficient version would be cheaper to operate as well as more environmentally friendly.

"I believe there is a case," Clark told BI. "The risk-averse nature of my peer group, CEOs, and boards is probably a major inhibitor to that ... But if we were to put $20 billion on the table for Airbus, they'd probably build it for us." Airbus did not respond to a request for comment from BI.

While some airlines, such as Air France and Thai Airways, retired their A380s during the pandemic, the superjumbo has since seen a resurgence. Lufthansa brought eight of its 14 out of retirement, and Etihad has reactivated six A380s. Global Airlines, a British startup, has acquired one formerly used by China Southern Airlines and hopes to launch commercial flights between London and New York this year.

Emirates is fitting premium economy cabins on many of its A380s. (Ryan Lim/AFP/Getty Images)

The A380 has been popular with passengers because its size offers more comfort, and it's quieter than other wide-body jets, especially when seated on the upper deck. Its mammoth size has also allowed airlines to install luxurious amenities, like Emirates' bar and shower for first-class passengers. But its huge capacity of about 500 passengers means it needs to be used on very popular routes. This works well for Emirates' hub-and-spoke route model connecting passengers to destinations around the world via Dubai, but less so for others. Airbus did not get any orders from airlines in North or South America, for example.

Capacity constraints

Yet Clark thinks the A380 could be a solution as some major airports face constraints while demand for air travel keeps rising. "If you look at the demand as it stands for all of us, not just Emirates, all of us today, there is a high-class problem in the making," he said.

A British Airways A380 takes off from London Heathrow Airport. (Tejas Sandhu/SOPA/Getty Images)
Clark pointed to increasing passenger numbers at New York's JFK, Boston, Paris, Frankfurt, and London Heathrow, where a debate has been ongoing for many years about constructing a third runway to cope with demand.

"It's a no-brainer for the aviation community, particularly in the airport world, to see the passengers getting off, say, an Emirates A380, 500 at a time into Heathrow or join it, empty their pockets in the departure lounge or the fast food or the merchandising, rather than a slot occupied at 50 seats," Clark said.
-
WWW.BUSINESSINSIDER.COM
New York City is still the center of the hedge fund universe. Here are the numbers.

A review of regulatory filings of the biggest multimanagers reveals New York's continued supremacy.
Managers like Citadel and Balyasny have most of their PMs in New York, even if they are headquartered elsewhere.
Despite interest in places like Miami, investing talent remains in established locales.

This week, the $4.5 trillion hedge fund industry is gathered in Miami for iConnections' annual Global Alts conference. Nearly all of them will leave the Magic City after the conference concludes. While some big names have fled high-tax cities like New York and Chicago for sunny spots like Miami and West Palm Beach, especially amid the pandemic-era buzz over "Wall Street South," the data show that New York is still the place to be for money-managing talent.

Regulatory filings for the industry's largest multimanagers, including Citadel, Millennium, and Point72, show that a vast majority of those who "perform investment advisory functions" work from the Big Apple. Including the three aforementioned managers as well as Balyasny, Schonfeld, ExodusPoint, Verition, Walleye, and Hudson Bay, more than 75% of investing talent works in New York, a Business Insider review of ADVs and internal metrics from certain funds shows.

(Story continues after graphic. The ADVs, while updated throughout the year, show a snapshot of the investing head count for each firm from March, so the data reflects firms' staffing from last spring.)

Even managers not based in New York, such as Citadel, Point72, Verition, Hudson Bay, and Balyasny, have more investing talent in Gotham than in their respective headquarters in Florida, Connecticut, and Illinois. Walleye, which was once based in Minnesota and still has 21 investors in Minneapolis, moved its headquarters to New York at the end of 2023 and now has dozens more traders there than in any other office.

"It's an apprenticeship business," Adam Kahn, founder of headhunter firm Odyssey Search Partners, told Business Insider. "For the most part, the opportunity to surround yourself with the best people is going to be in major money centers." "If you want to sit next to your PM, you need to be where your PM is," he said. And while there are senior leaders who have decamped to sunnier, cheaper spots around the country, New York, Chicago, and San Francisco still have a significant concentration of people.

Greenwich and Stamford, a pair of bedroom communities in the New York metro area, have also continued to be important centers of gravity for hedge funds, which have become a part of the social fabric of these Connecticut towns.

Citadel's talent breakdown, according to its ADV showing data from last March, is interesting given the firm's billionaire founder's preference for Miami. Ken Griffin, who is originally from Florida, moved his firm's headquarters south from Chicago in 2022 and has commented that one day the city could surpass New York as a financial center, though he still referred to Manhattan as the "epicenter of thoughtful people passionately engaged in their careers" in 2023. While Griffin and some of his executives, including chief risk officer Joanna Welsh and commodities head Sebastian Barrack, have relocated, the firm's investing talent has not yet moved en masse to Florida.

The data showed that more investment-focused staffers were based in the firm's two Texas offices, Houston and Dallas, than in its two Sunshine State offices, Miami and Tampa.
The $66 billion fund has more investors in each of its two New York offices, as well as its Greenwich, Chicago, and San Francisco offices, than it did in its Miami outpost last March. (Story continues below the graphic.)

A person close to the manager said the firm is committed to Miami and plans to break ground on its new 54-story waterfront building, which will serve as company headquarters, sometime later this year or early next year. "We've welcomed roughly 400 team members to this vibrant city since establishing the firm's global headquarters here in 2022, and we have exciting plans to keep growing our presence in the months and years to come," a statement from the firm reads. The 400 people include employees from both Citadel and Citadel Securities, Griffin's market maker.

Can always catch a flight

Investors on the ground for these managers say different locations provide different benefits. Stockpickers focused on certain industries, like energy or technology, find places like Houston and San Francisco useful for staying plugged into the companies they invest in. Quants who do not need to meet with corporate leaders to run their strategies say they're generally more flexible about where they can work than stockpickers who want to hear directly from CEOs. Meanwhile, young analysts working for portfolio managers based in Connecticut offices often reverse commute from New York to places like Greenwich or Stamford so they can still enjoy the nightlife and culture of the Big Apple.

The ongoing talent war for top-shelf PMs means funds are generally more flexible on location, said Vikram Tandon, the head of Durlston Partners US, a recruiting firm. But that flexibility has a limit. "The only people who demand to be somewhere and get it are the senior people who are setting up a whole team," Tandon said.

This isn't to say Florida isn't on the rise. Data from hedge-fund seeder Borealis Strategic Partners shows that 11% of US hedge fund launches in 2024 were in Florida, compared with just 3% in 2020. The Tri-State area was at 52%, down from 56% the year prior, according to Borealis.

Two portfolio managers who moved to Miami and West Palm Beach in recent years for two different multimanagers told BI that it has been a net positive for them and their families. Plus, one of these PMs said, "It's only a two-hour flight back to New York."