• Amazon Teamsters Authorize Third Strike at U.S. Facility
    www.wsj.com
    Illinois union members join workers in New York City in calling for a work stoppage.
    0 Comments ·0 Shares ·92 Views
  • TikTok Asks Supreme Court to Intervene on U.S. Ban
    www.wsj.com
    The Chinese-backed app is seeking to delay the ban's Jan. 19 effective date.
    0 Comments ·0 Shares ·122 Views
  • 'The Furys Live in Brooklyn' and Jason Palmer's 'The Cross Over' Review: A Borough's Jazz
    www.wsj.com
    Two albums recorded at the same intimate Brooklyn club are marked by immediacy, agility and gracefully rendered musical complexity.
    0 Comments ·0 Shares ·116 Views
  • Joe Walsh's 'So What' Turns 50
    www.wsj.com
    Shortly before he joined the Eagles, the guitarist and singer released this forceful yet elegiac rock album, expressing his grief over his young daughter's death.
    0 Comments ·0 Shares ·106 Views
  • In IT? Need cash? Cybersecurity whistleblowers are earning big payouts.
    arstechnica.com
    The US government now relies on whistleblowers to bring many cases. Nate Anderson | Dec 16, 2024

    Matthew Decker is the former chief information officer for Penn State University's Applied Research Laboratory. As of October, he's also $250,000 richer.

    In his Penn State position, Decker was well placed to see that the university was not implementing all of the cybersecurity controls that were required by its various contracts with NASA and the Department of Defense (DoD). It did not, for instance, use an external cloud services provider that met the DoD's security guidelines, and it fudged some of the self-submitted "scores" it made to the government about Penn State's IT security.

    So Decker sued the school under the False Claims Act, which lets private individuals bring cases against organizations on behalf of the government if they come across evidence of wrongdoing related to government contracts. In many of these cases, the government later "intervenes" to assist with the case (as it did here), but whether it does so or not, whistleblowers stand to collect a percentage of any fines if they win.

    In October, Penn State agreed to a $1.25 million settlement with the government; Decker got $250,000 of the money.

    On the regular

    This now happens in IT with some regularity. In November, Dell, Dell Federal Systems, and Iron Bow Technologies settled with the government for $4.3 million over claims that they "violated the False Claims Act by submitting and causing the submission of non-competitive bids to the Army and thereby overcharging the Army under the Army Desktop and Mobile Computing 3 (ADMC-3) contract."

    But once again, this wasn't something the government uncovered on its own; a whistleblower named Brent Lillard, who was an executive at another company in the industry, brought the initial complaint. For his work, Lillard just made $345,000.

    In early December, Gen Digital (formerly Symantec) paid a much larger fee, $55.1 million, after losing a trial in 2022. Gen Digital/Symantec was found liable for charging the government higher prices than it charged to companies.

    Once again, the issue was brought to light by a whistleblower, Lori Morsell, who oversaw the contract for Gen Digital/Symantec. Morsell's award has not yet been determined by the court, but given the amount of the payout, it should be substantial.

    False Claims Act goes digital

    Because of the complexity of investigating, or even finding out about, technical failures and False Claims Act cases from outside an organization, the government has increasingly relied on whistleblowers to kick-start these sorts of IT cases.

    The False Claims Act goes back to the Civil War, when it was used against unscrupulous vendors who sold poor-quality goods to the Union army. Today, it has become the tool of choice for prosecuting cyber-failures by government contractors, largely because of the Act's robust whistleblower rules (technically known as its "qui tam" provisions).

    This was, even just a few years ago, a novel proposition. In 2020, the law firm Carlton Fields noted that "two significant whistleblower cases sent ripples through the False Claims Act (FCA) community by demonstrating the specter of FCA liability resulting from the failure to comply with cybersecurity requirements in government contracts."

    In one of these cases, Brian Markus earned $2.61 million for his False Claims Act case against Aerojet Rocketdyne. In the other, James Glenn sued Cisco over a video surveillance product that had known security flaws and yet was sold to numerous government agencies. Cisco eventually paid $8.6 million, of which Glenn walked away with more than $1 million.

    By 2021, however, False Claims Act cases against government contractors, especially in the IT sector, had become downright normal. The Department of Justice even stood up a special program called the Civil Cyber-Fraud Initiative to assist with such cases. In a late 2021 speech, Acting Assistant Attorney General Brian Boynton said that the initiative would use whistleblowers and the False Claims Act to focus on three things:

    - Knowing failures to comply with contractual cyber standards
    - Knowing misrepresentation of security controls and practices
    - Knowing failure to report suspected breaches in a timely fashion

    In the last four years, the initiative has brought in judgments and settlements against major companies like Boeing (which paid $8.1 million in 2023; several whistleblowers split $1.5 million), and it has gone after huge universities like Penn State (see above) and Georgia Tech (earlier this year, still tied up in court).

    Blowing a whistle for years

    These cases all rely on insiders, and the payouts can be hefty, but the cases can also take years to reach their conclusions. The Cisco case, for instance, lasted eight years before the whistleblower got his money. The Penn State case was relatively speedy by contrast: a mere two years from its filing in October 2022 to the university's payout earlier this year.

    To report fraud against the federal government, contact the Department of Justice. But be aware that, if you're hoping to collect a share of any future payout, you generally need to retain a lawyer and file a whistleblower case first.
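    As a rough illustration of how these payouts relate to the settlements, the minimal sketch below recomputes each whistleblower's share using only the figures reported in the article above; the Cisco entry is a lower bound, since the article says only that Glenn received "more than $1 million."

        # Whistleblower share of each settlement, using figures reported in the article above.
        cases = {
            "Penn State (Decker)": (250_000, 1_250_000),
            "Dell / Iron Bow (Lillard)": (345_000, 4_300_000),
            "Cisco (Glenn, lower bound)": (1_000_000, 8_600_000),
            "Boeing (whistleblowers, combined)": (1_500_000, 8_100_000),
        }

        for name, (award, settlement) in cases.items():
            # Print the award, the settlement, and the award as a share of the settlement.
            print(f"{name}: ${award:,} of ${settlement:,} ({award / settlement:.1%})")

    Run as-is, the shares work out to roughly 8 to 20 percent of the amounts the government collected in these particular cases.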
    0 Comments ·0 Shares ·113 Views
  • Huge math error corrected in black plastic study; authors say it doesn't matter
    arstechnica.com
    Correction issued for black plastic study that had people tossing spatulas. Beth Mole | Dec 16, 2024

    Editors of the environmental chemistry journal Chemosphere have posted an eye-catching correction to a study reporting that toxic flame retardants from electronics wind up in some household products made of black plastic, including kitchen utensils. The study sparked a flurry of media reports a few weeks ago that urgently implored people to ditch their kitchen spatulas and spoons. Wirecutter even offered a buying guide for what to replace them with.

    The correction, posted Sunday, will likely take some heat off the beleaguered utensils. The authors made a math error that put the estimated risk from kitchen utensils off by an order of magnitude.

    Specifically, the authors estimated that if a kitchen utensil contained middling levels of a key toxic flame retardant (BDE-209), the utensil would transfer 34,700 nanograms of the contaminant a day based on regular use while cooking and serving hot food. The authors then compared that estimate to a reference level of BDE-209 considered safe by the Environmental Protection Agency. The EPA's safe level is 7,000 ng per kilogram of body weight per day, and the authors used 60 kg (about 132 pounds) as the adult weight for their estimate. So, the safe EPA limit would be 7,000 multiplied by 60, yielding 420,000 ng per day. That's 12 times more than the estimated exposure of 34,700 ng per day.

    However, the authors missed a zero and reported the EPA's safe limit as 42,000 ng per day for a 60 kg adult. The error made it seem like the estimated exposure was nearly at the safe limit, even though it was actually less than a tenth of the limit.

    "[W]e miscalculated the reference dose for a 60 kg adult, initially estimating it at 42,000 ng/day instead of the correct value of 420,000 ng/day," the correction reads. "As a result, we revised our statement from 'the calculated daily intake would approach the U.S. BDE-209 reference dose' to 'the calculated daily intake remains an order of magnitude lower than the U.S. BDE-209 reference dose.' We regret this error and have updated it in our manuscript."

    Unchanged conclusion

    While being off by an order of magnitude seems like a significant error, the authors don't seem to think it changes anything. "This calculation error does not affect the overall conclusion of the paper," the correction reads. The corrected study still ends by saying that the flame retardants "significantly contaminate" the plastic products, which have "high exposure potential."

    Ars has reached out to the lead author, Megan Liu, but has not received a response. Liu works for the environmental health advocacy group Toxic-Free Future, which led the study.

    The study highlighted that flame retardants used in plastic electronics may, in some instances, be recycled into household items. "Companies continue to use toxic flame retardants in plastic electronics, and that's resulting in unexpected and unnecessary toxic exposures," Liu said in a press release from October. "These cancer-causing chemicals shouldn't be used to begin with, but with recycling, they are entering our environment and our homes in more ways than one. The high levels we found are concerning."

    BDE-209, aka decabromodiphenyl ether or deca-BDE, was a dominant component of TV and computer housings before it was banned by the European Union in 2006 and by some US states in 2007. China only began restricting BDE-209 in 2023. The flame retardant is linked to carcinogenicity, endocrine disruption, neurotoxicity, and reproductive harm.

    Uncommon contaminant

    The presence of such toxic compounds in household items is important for noting the potential hazards in the plastic waste stream. However, in addition to finding levels that were an order of magnitude below safe limits, the study also suggested that the contamination is not very common.

    The study examined 203 black plastic household products, including 109 kitchen utensils, 36 toys, 30 hair accessories, and 28 food serviceware products. Of those 203 products, only 20 (10 percent) had any bromine-containing compounds at levels that might indicate contamination from bromine-based flame retardants like BDE-209. Of the 109 kitchen utensils tested, only nine (8 percent) contained concerning bromine levels.

    "[A] minority of black plastic products are contaminated at levels >50 ppm [bromine]," the study states.

    But that's just bromine compounds. Overall, only 14 of the 203 products contained BDE-209 specifically.

    The product that contained the highest level of bromine compounds was a disposable sushi tray, at 18,600 ppm. Given that heating is a significant contributor to chemical leaching, it's unclear what exposure risk the sushi tray poses. Of the 28 food serviceware products assessed in the study, the sushi tray was one of only two found to contain bromine compounds. The other was a fast food tray that sat at the threshold of contamination, with 51 ppm.
    0 Comments ·0 Shares ·112 Views
  • The Cloud You Want Versus the Cloud You Need
    www.informationweek.com
    How do operational needs compare with organizations' ambitions when it comes to using the cloud? Do plans for the cloud get ahead of what companies need?
    0 Comments ·0 Shares ·142 Views
  • 9 Cloud Service Adoption Trends
    www.informationweek.com
    Lisa Morgan, Freelance Writer | December 16, 2024 | 11 Min Read

    As the competitive landscape changes and the mix of cloud services available continues to grow, organizations are moving deeper into the cloud to stay competitive. Many are adopting a cloud-first strategy.

    "Organizations are adopting more advanced, integrated cloud strategies that include multi-cloud environments and expanded services such as platform as a service (PaaS) and infrastructure as a service (IaaS)," says Bryant Robinson, principal consultant at management consulting firm Sendero Consulting. "This shift is driven by increasing demands for flexibility, scalability, and the need to support emerging technologies such as remote collaboration, real-time data processing and AI-powered diagnostics."

    Recent surges in cyberattacks have also accelerated these changes, highlighting the need for adaptable digital infrastructure to ensure continuity of business processes, enhance user accessibility, and protect sensitive customer data.

    "Companies that are succeeding with cloud adoption are investing in improved security frameworks, focusing on interoperability, and leveraging cloud-native tools to build scalable applications," says Robinson. "In addition, certain industries have to prioritize technology with regulation and compliance mechanisms that add a level of complexity. Within healthcare, for example, regulations like HIPAA are [considered] and prioritized through implementing secure data-sharing practices across cloud environments."

    However, some organizations struggle with managing multi-cloud complexity and the resulting inability to access, share, and seamlessly use data across those environments. Organizations may also lack the in-house expertise needed to implement and operationalize cloud platforms effectively, leading to the inefficient use of resources and potential security risks.

    "Organizations should develop a clear, long-term cloud strategy that aligns with organizational goals, focusing on interoperability, scalability, and security. Prioritize upskilling IT teams to manage cloud environments effectively and invest in disaster recovery and cybersecurity solutions to protect sensitive customer data," says Robinson. "Embrace multi-cloud approaches for flexibility, simplifying management with automation and centralized control systems. Finally, select cloud vendors with a strong track record and expertise in supporting compliance within heavily regulated environments."

    Following are more trends driving cloud service shifts.

    1. Innovation

    Previously, the demand for cloud data services was largely driven by flexibility, convenience and cost, but Emma McGrattan, CTO at Actian, a division of HCL Software, has seen a dramatic shift in how cloud data services are leveraged to accelerate innovation.

    "AI and ML use cases, specifically a desire to deliver on GenAI initiatives, are causing organizations to rethink their traditional approach to data and use cloud data services to provide a shortcut to seamless data integration, efficient orchestration, accelerated data quality, and effective governance," says McGrattan. "[The] successful companies understand the importance of investing in data preparation, governance, and management to prepare for GenAI-ready data. They also understand that high-quality data is essential, not only for success but also to mitigate the reputational and financial risks associated with inaccurate AI-driven decisions, including the very real danger of automating actions based on AI hallucinations."

    The advantages of embracing these data trends include accelerated insights, enhanced customer experiences, and significant gains in operational efficiency. However, substantial challenges persist. Data integration across diverse systems remains a complex undertaking, and the scarcity of skilled data professionals presents a significant hurdle. Furthermore, keeping pace with the relentless acceleration of technological advancements demands continuous adaptation and learning. Successfully navigating these challenges requires sound data governance.

    "My advice is to focus on encouraging data literacy across the organization and to foster a culture of data curiosity," says McGrattan. "I believe the most successful companies will be staffed with teams fluent in the language of data and empowered to ask questions of the data, explore trends, and uncover insights without encountering complexity or fearing repercussions for challenging the status quo. It is this curiosity that will lead to breakthrough insights and innovation because it pushes people to go beyond surface-level metrics."

    2. Cloud computing applications

    Most organizations are building modern cloud computing applications to enable greater scalability while reducing costs and consumption. They're also more focused on the security and compliance of cloud systems and how providers are validating and ensuring data protection.

    "Their main focus is really around cost, but a second focus would be whether providers can meet or exceed their current compliance requirements," says Will Milewski, SVP of cloud infrastructure and operations at content management solution provider Hyland. "Customers across industries are very cost-conscious. They want technology that's good, safe and secure at a much cheaper rate."

    Providers are now shifting to more container-based or server-free workloads to control cost, because these allow providers to scale up to meet the needs of customer activity while also scaling back when systems are not heavily utilized.

    "You want to unload as many apps as possible to vendors whose main role is to service those apps. That hasn't changed. What has changed is how much they're willing to spend on moving forward on their digital transformation objectives," says Milewski.

    3. Artificial intelligence and machine learning

    There's a fundamental shift in cloud adoption patterns, driven largely by the emergence of AI and ML capabilities. Unlike previous cycles focused primarily on infrastructure migration, organizations are now having to balance traditional cloud ROI metrics with strategic technology bets, particularly around AI services. According to Kyle Campos, chief technology and product officer at cloud management platform provider CloudBolt Software, this evolution is being catalyzed by two major forces: First, cloud providers are aggressively pushing AI capabilities as key differentiators rather than competing on cost or basic services. Second, organizations are realizing that cloud strategy decisions today have more profound implications for future innovation capabilities than ever before.

    "The most successful organizations are maintaining disciplined focus on cloud ROI while exploring AI capabilities. They're treating AI services as part of their broader cloud fabric rather than isolated initiatives, ensuring that investments align with actual business value rather than just chasing the next shiny object," says Campos. "[However,] many organizations are falling into the trap of making strategic cloud provider commitments based on current AI capabilities without fully understanding the long-term implications. We're seeing some get burned by premature all-in strategies, reminiscent of early cloud adoption mistakes. There's also a tendency to underestimate the importance of maintaining optionality in this rapidly evolving landscape."

    4. Global collaboration and remote work

    More organizations are embracing global collaboration and remote work, and they are facing an unprecedented quantity of data to manage.

    "Companies are recognizing that with the exponential growth of data, the status quo for their IT stack can't accommodate their evolving performance, scalability and budget requirements. Both large enterprises and agile, innovative SMBs are seeking new ways to manage their data, and they understand that cloud services enable the future and accelerate business," says Colby Winegar, CEO at cloud storage company Storj. "The companies on the leading edge are trying to incorporate non-traditional architectures and tools to deliver new services at lower cost without compromising on performance, security or, ultimately, their customers' experience."

    Some companies are struggling to adapt traditional IT infrastructure to future IT requirements when many of those solutions just can't accommodate burgeoning data growth and sustainability, legal and regulatory requirements. Other companies are facing data lock-ins.

    5. Business requirements

    Most of today's enterprises have adopted hybrid cloud and multi-cloud strategies to avoid vendor lock-in and to optimize their utilization of cloud resources.

    "The need for flexibility, cost control, and improved security are some factors driving this movement. Businesses are realizing various workloads could function better on various platforms, which helps to maximize efficiency and save expenses," says Roy Benesh, chief technology officer and co-founder of eSIMple, an eSIM offering.

    However, managing cloud costs is a challenge for many companies, and some lack the security they need to minimize the potential for data breaches and non-compliance. There are also lingering issues with integrating new cloud services with current IT infrastructure.

    "It is vital to start with a well-defined strategy that involves assessing present requirements and potential expansion. Cost and security management will be aided by the implementation of strong governance and monitoring mechanisms," says Benesh. "Meanwhile, staff members can fully exploit cloud technology if training is invested in, resulting in optimization."

    6. Operational improvement

    Cloud was initially adopted for cost efficiency, though many enterprises learned the hard way that cloud costs need to be constantly monitored and managed. Today's companies are increasingly using cloud for greater agility and innovation, to be closer to customers, ensure business continuity, and reduce overall risk.

    "Companies are getting it right when they invest in [a] cloud-native approach, including design, deployment and operational processes, while automating infrastructure management, enhancing cloud security and using data to drive decisions," says Sanjay Macwan, CIO/CISO at cloud communications company Vonage. "These steps make operations more efficient and secure."

    However, challenges arise when decision-makers underestimate the complexity of managing multiple cloud environments. Why does this matter? Because it often leads to inefficient use of resources, security gaps and spiraling costs that hurt long-term strategic goals. To stay ahead, businesses must remain adaptable and resilient.

    "My advice is to take a cloud-smart approach. This means balancing innovation with a strong governance framework. Invest in solutions for cloud cost optimization and implement comprehensive security measures from the start," says Macwan. "This is crucial to staying ahead of security and cost management issues to ensure that your cloud strategy remains sustainable and effective while capturing the full innovation agility that the cloud can offer. Train your teams to handle these complex environments, and always prioritize a design that is both secure and resilient."

    7. Performance, security and cost

    Many organizations have questioned whether their wholesale migrations to cloud were worth it. Common concerns include security, performance and cost, which has driven the move to hybrid cloud. Instead of going back to the old way of doing things, they want to take the lessons learned in public cloud and apply them on premises.

    "Performance, security, and cost concerns are driving change. As cloud has become more popular, it's also become more expensive. [Workload security] is now a bigger concern than ever, especially with modern speculative execution attacks at the CPU level. Lastly, some applications need to be physically close for latency and/or bandwidth reasons," says Kenny Van Alstyne, CTO at private cloud infrastructure company SoftIron. "[M]igrating back to the legacy way of doing on-premises infrastructure will lead to the desire to move back to cloud again. To succeed and be accepted, on-premises must be managed as if it were your own cloud."

    One reason private cloud is gaining popularity is that organizations can gain the efficiencies of cloud while maintaining control over cost, performance and security on-prem, assuming they have the prerequisite knowledge and experience to succeed or the help necessary to avoid common pitfalls.

    8. Specific workload requirements

    Organizations deploying AI at scale are discovering that while traditional cloud infrastructure works well for general-purpose compute workloads, it presents challenges for AI operations, such as the unpredictable availability of GPUs, prohibitive at-scale costs, the operational complexity of energy-dense workloads and performance bottlenecks in storage and networking. Complicating matters further, edge inferencing, initially touted as a darling AI deployment model, has been deprioritized by global telecommunications carriers due to 5G's underwhelming commercial returns.

    "Large language models demand high-performance storage systems capable of sustaining consistent, high-throughput data flows to keep pace with GPU processing speeds. While traditional cloud storage [and] enterprise SAN deployments work well for many use cases, AI training often requires vast sequential bandwidth to manage reduction operations effectively. Storage limitations can bottleneck training times and lead to costly delays," says Brennen Smith, head of infrastructure at cloud computing platform provider RunPod. "While building these specialized systems in-house reduces overall [operating expenses], this requires deep internal architectural knowledge and is capital-intensive, further complicated by Nvidia's release cadence, which is rendering GPUs outdated before their full depreciation cycle."

    These dynamics are leading to a different type of hybrid strategy, one that uses resources for what they do best. This includes combining public cloud, AI/ML-specific cloud offerings and on-premises infrastructure. (A rough, purely illustrative bandwidth sketch after this article shows how quickly storage can become the limiting factor.)

    9. Healthcare agility

    Healthcare organizations made the same mistake many enterprises did: they started by lifting and shifting infrastructure to the cloud, essentially recreating their on-premises environment in a cloud setting. While this provided some benefits, particularly around disaster recovery, it failed to unlock the cloud's full potential.

    "Today, we're witnessing a more mature approach. Organizations are increasingly understanding that true cloud value comes from embracing cloud-native architectures and principles. This means building new applications as cloud-first and modernizing existing systems to leverage native cloud capabilities rather than just hosting them there," says Nandy Vaisman, CISO and VP of operations at health data integration platform Vim.

    Given the value of EHRs, healthcare organizations cannot afford to take a lift-and-shift approach to cybersecurity. When they do, it creates potential vulnerabilities.

    Vaisman recommends the following:

    - Moving beyond simple lift-and-shift to truly embrace cloud-native architectures
    - Investing in cloud security expertise and training
    - Adapting security practices specifically for cloud environments
    - Focusing on privacy-by-design in cloud implementations
    - Leveraging cloud-native tools for compliance and security monitoring
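    To make the storage-bandwidth concern in trend 8 concrete, here is the rough back-of-envelope sketch referenced there. Every figure in it (cluster size, per-GPU ingest rate, storage throughput) is a hypothetical assumption for illustration only; the article gives no such numbers.

        # Hypothetical back-of-envelope check: can a storage tier keep a GPU training cluster fed?
        # None of these figures come from the article; they are illustrative assumptions only.
        gpu_count = 512                      # assumed cluster size
        ingest_per_gpu_gb_s = 2.0            # assumed sustained data ingest each GPU needs (GB/s)
        storage_throughput_gb_s = 600.0      # assumed aggregate sequential read bandwidth of the storage tier (GB/s)

        required_gb_s = gpu_count * ingest_per_gpu_gb_s
        supply_ratio = storage_throughput_gb_s / required_gb_s

        print(f"Required aggregate ingest: {required_gb_s:.0f} GB/s")
        if supply_ratio < 1.0:
            print(f"Storage sustains only {supply_ratio:.0%} of demand, so it becomes the training bottleneck.")
        else:
            print("Storage bandwidth is sufficient; GPUs are not starved in this scenario.")

    With these assumed numbers, the cluster needs about 1,024 GB/s but the storage tier delivers 600 GB/s, which is the kind of mismatch that stretches training times in the way the quote describes.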
    0 Comments ·0 Shares ·129 Views
  • Plastic chemicals linked to hundreds of thousands of deaths worldwide
    www.newscientist.com
    Plastic food packaging can expose people to chemicals such as bisphenol A (BPA). Shutterstock/Trong Nguyen

    Hundreds of thousands of deaths and millions of cases of heart disease worldwide may be linked to chemicals in common plastic products, suggesting that more stringent regulations on such toxins could benefit public health.

    Maureen Cropper at the University of Maryland and her colleagues assessed the public health impact of exposure to three types of chemicals primarily used in plastics: bisphenol A (BPA), di(2-ethylhexyl) phthalate (DEHP) and polybrominated diphenyl ethers (PBDEs). BPA and DEHP are found in plastic food packaging, and PBDEs are flame retardants used in some household goods, such as furniture and electronics.

    Drawing on more than 1700 previously published studies, the team estimated people's exposure to these three classes of chemicals across 38 countries, which represent roughly a third of the world's population. Three of the countries (the US, Canada and South Korea) also have public databases that monitor levels of these chemicals in urine and blood samples, providing even more accurate data.

    In combination with medical records and toxicology reports, the researchers calculated health outcomes attributable to these chemicals. They found that in 2015, about 5.4 million cases of coronary artery disease and 346,000 strokes were associated with BPA exposure, and that roughly 164,000 deaths in people between 55 and 64 years old may have been due to DEHP.

    Thanks to regulations enacted in the late 2000s, the prevalence of these chemicals has since decreased in many countries, such as the US, Canada and those in Europe. The researchers estimate that about 515,000 deaths could have been avoided if BPA and DEHP exposures in the US had been at post-regulation levels since 2003. "This underscores the importance of governments and manufacturers limiting the use of toxic chemicals in plastic products before they reach consumers," says Cropper.

    However, it is important to remember these findings are only approximations. "I think one of the real limitations, frankly, is the lack of exposure data on these substances," says Cropper, meaning estimates for some countries may be less accurate than others. "It would be a good idea if more countries actually monitored [exposures to] these and other substances, which would improve our understanding of their public health burden," she says.

    Journal reference: PNAS, DOI: 10.1073/pnas.2412714121
    0 Comments ·0 Shares ·113 Views
  • Liquid metal particles can self-assemble into electronics
    www.newscientist.com
    A crosshatch pattern of wires created by self-assembling liquid metal particles. Julia Chang / North Carolina State University

    Self-assembling electronics made from liquid metal particles could provide a cheaper way of manufacturing computer chips, simply by harnessing the basic physics of how fluids flow through tiny structures.

    "The cost of entry in manufacturing electronics and building new chip fabrication plants in the US right now, we're talking billions of dollars," says Martin Thuo at North Carolina State University. "It's not cheap."

    Thuo and his colleagues first created a mixture of
    0 Comments ·0 Shares ·109 Views