• Amazon Is Having a Buy Two, Get One 50% Off Sale on the Fourth Wing Books Today
    www.ign.com
    The Empyrean series has had at least one book on Amazon's bestsellers list for years now. Just last year, Onyx Storm was the second best-selling book of the entire year, and it didn't even come out until January. When it did arrive, it became the fastest-selling adult novel of the past 20 years. Now you can pick up all three books in the series (Fourth Wing, Iron Flame, and Onyx Storm) in Amazon's buy two, get one 50% off promotion.

    The Empyrean Series Is Discounted at Amazon Today
    - Book One: Fourth Wing
    - Book Two: Iron Flame
    - Onyx Storm (Deluxe Limited Edition)

    The discounts and promotion only include the hardcover editions of each of these books, rather than the paperback or Kindle versions. It's also worth noting that the deluxe edition of Onyx Storm, which was genuinely hard to find when the book was first released back in January, is included in this sale.

    To take advantage of Amazon's buy two, get one 50% off sale on these books, just add all three to your cart at the same time. The discount is automatically applied to the least expensive item, which in this case is Fourth Wing. That means you can get all three hardcover books in the series for $45.75. All in all, this is a great price if you're trying to add physical copies to your book collection or simply want to read the series for the first time.

    See more books in this sale:
    - You're My Little Cuddle Bug
    - The Housemaid
    - Atomic Habits
    - Good Energy

    What Is the Empyrean Series About?
    If you've never read Fourth Wing or any of the other books in the Empyrean series, you may be wondering why the Rebecca Yarros novels are currently enjoying a Hunger Games level of popularity. In short, there are elements familiar from the Harry Potter books, romance in the vein of the Twilight series, and dragons reminiscent of the Inheritance Cycle. Reading these books feels familiar, and yet somehow very new. Here's a quick synopsis of the series:

    The story follows Violet Sorrengail, a seemingly fragile young woman who has been forced by her powerful mother to enter the very dangerous academy of dragon riders. As she finds ways to overcome her weaknesses and survive, Violet must also battle her own complex emotions about her mother, her old friend, and a boy she thinks wants nothing more than to see her dead. All the while, there's more going on with the dragons and her world than meets the eye, and she finds herself at the center of it all while in the heat of an epic romance.

    Looking for more Presidents' Day sales like this? Amazon is also currently offering the first Kindle Paperwhite deal of the year.
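    For the curious, here is a minimal sketch of how the promotion math described above works out: half off the cheapest of three qualifying items in the cart. The individual hardcover prices are hypothetical placeholders, since the article only quotes the $45.75 three-book total.

```python
# Rough sketch of Amazon's "buy two, get one 50% off" cart math.
# The individual hardcover prices below are hypothetical placeholders;
# the article only quotes the final three-book total of $45.75.

def promo_total(prices):
    """Half off the cheapest item when three qualifying items are in the cart."""
    if len(prices) < 3:
        return sum(prices)  # promotion does not apply
    discount = min(prices) * 0.50  # 50% off the least expensive item
    return sum(prices) - discount

cart = {
    "Fourth Wing (hardcover)": 17.00,   # hypothetical price
    "Iron Flame (hardcover)": 18.00,    # hypothetical price
    "Onyx Storm (deluxe)": 19.25,       # hypothetical price
}

total = promo_total(list(cart.values()))
print(f"Cart total with promotion: ${total:.2f}")
# With these placeholder prices the cheapest item (Fourth Wing) gets the
# discount, and the total lands around the $45.75 quoted above.
```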
  • Anker's Newest High-Capacity Power Bank Now Includes Two Built-In USB Type-C Cables
    www.ign.com
    Anker quietly released a new high-capacity power bank earlier this year that sits alongside its Anker 737 and Prime series of power banks. This particular model boasts a massive 25,000mAh battery capacity, 165W of total charging output, and two built-in USB Type-C cables in case you forget to bring your own. It's also attractively priced at under $100, and there's even a deal today that drops the price by another $10 to $89.99. It's a great companion for power-hungry gaming handhelds like the Steam Deck, Asus ROG Ally, or Lenovo Legion Go.

    New Release: Anker 25,000mAh 165W Power Bank with Two Built-In USB Type-C Cables
    The new Anker power bank features a 25,000mAh battery capacity, the second largest we've seen from Anker in a compact form factor. So how much juice does that offer today's gaming handheld PCs? A 25,000mAh battery equates to a 95Wh capacity. An 80% power efficiency rating (which is standard for power banks) means you get about 76Wh of usable charge. That means this power bank will charge a Steam Deck or ROG Ally (40Wh) from empty to full about 2 times, an Asus ROG Ally X (80Wh) about 1 time, and a Nintendo Switch (16Wh) about 4.75 times.

    The Anker power bank has one USB Type-C port and one USB Type-A port. In addition, there are two built-in USB Type-C cables. One is a retractable cable that extends up to 2.3 feet. The other is a fixed 1-foot cable that doubles as a lanyard when not in use. Each USB Type-C output is capable of up to 100W of Power Delivery, with a 165W total maximum output. That means any of the three USB Type-C outputs can charge a gaming handheld PC at its fastest rate, including the Asus ROG Ally X, which supports up to 100W of fast charging.

    Another feature common to Anker's other premium power banks is the digital LCD readout. It displays a wealth of information, like remaining battery capacity, current charging rate, input/output wattage, battery temperature, battery health, charge cycle count, and more.

    TSA-Approved
    The TSA requires power banks to be under 100Wh and carried in your carry-on luggage (checked baggage isn't allowed under any circumstances). This Anker power bank is rated at 95Wh. Your bag might get a second look simply because this is a relatively hefty power bank, but you shouldn't have any problems getting it cleared.

    See more power banks we've recommended:
    - Great for Laptops: Anker 737 Power Bank
    - Great Compact Option: INIU Portable Charger
    - Great for iPhones: Baseus Wireless MagSafe Battery Pack
    - Solar Powered Option: Solar Power Bank

    Why Should You Trust IGN's Deals Team?
    IGN's deals team has a combined 30+ years of experience finding the best discounts in gaming, tech, and just about every other category. We don't try to trick our readers into buying things they don't need at prices that aren't worth paying. Our ultimate goal is to surface the best possible deals from brands we trust and that our editorial team has personal experience with. You can check out our deals standards here for more information on our process, or keep up with the latest deals we find on IGN's Deals account on Twitter.

    Eric Song is the IGN commerce manager in charge of finding the best gaming and tech deals every day. When Eric isn't hunting for deals for other people at work, he's hunting for deals for himself during his free time.
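    The charge-count estimates earlier in this post reduce to a simple calculation: the rated 95Wh times an assumed 80% conversion efficiency, divided by each handheld's battery size. A minimal sketch of that arithmetic, using only the capacities quoted above:

```python
# Back-of-the-envelope charge counts for the Anker 25,000mAh (95Wh) power bank,
# using the figures quoted in the article: 80% conversion efficiency and the
# battery capacity of each handheld in watt-hours.

RATED_CAPACITY_WH = 95      # 25,000mAh pack, as rated
EFFICIENCY = 0.80           # typical power bank conversion efficiency

usable_wh = RATED_CAPACITY_WH * EFFICIENCY  # about 76Wh of deliverable energy

devices_wh = {
    "Steam Deck / ROG Ally": 40,
    "ROG Ally X": 80,
    "Nintendo Switch": 16,
}

for name, battery_wh in devices_wh.items():
    charges = usable_wh / battery_wh
    print(f"{name}: about {charges:.2f} full charges")

# Prints roughly 1.9 charges for a Steam Deck or ROG Ally, about 0.95 for a
# ROG Ally X, and about 4.75 for a Switch, in line with the estimates above.
```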
  • Discord introduces button to ignore people without them knowing
    9to5mac.com
    As part of the Safer Internet Day 2025 awareness campaign, the popular messaging platform Discord announced on Monday a new feature that helps users ignore other people without them knowing. With just a button, you can now easily mute someone on Discord.

    Discord now has an Ignore button
    Called Ignore, the new feature does exactly what it promises: it helps you ignore other people. Discord says the idea is to let users take space from specific people without them knowing. The Ignore button can be used to take some time away, or to discreetly avoid talking to someone specific you don't want to talk to.

    Discord already has an option to block another user, which prevents any interactions coming from the blocked account. However, the company says it has heard from users that blocking someone sometimes feels confrontational and scary, especially among teenagers. The Ignore button also limits interactions, but the other person won't know they've been ignored.

    When you choose to ignore a person, their messages, DMs, notifications, and any other status updates are hidden from you. All you have to do is go to the person's profile, tap the three-dots button, and then choose Ignore.

    "While tools like Ignore help you manage your interactions, it's equally important to stay aware of potential threats online," Discord warns its users.

    The platform is also teaming up with a new non-profit organization called ROOST, or Robust Open Online Safety Tools, which will offer free, open-source tools for detecting and reporting child sexual abuse material (CSAM). ROOST is also supported by companies such as OpenAI, Google, and Roblox.

    You can access Discord on the web or download the iOS app from the App Store.
  • Roborock Saros 10 and 10R bring robot cleaners into the AI era
    9to5mac.com
    Robot cleaners have evolved significantly since their early days, offering deeper cleaning, smarter routing, auto-emptying, and more. But larger homes with mixed flooring surfaces and complex layouts can still prove challenging for conventional robots. Roborock's new Saros 10 and Saros 10R models bring robocleaners into the AI era, employing advanced machine learning algorithms for both cleaning and navigation tasks.

    The most advanced cleaning features
    Both models offer the most advanced cleaning features yet, starting with the thinnest design ever seen in a Roborock. At just 3.15 inches (7.98cm) high, the machines slip easily beneath low-sitting furniture that would block other cleaners.

    Four different lifting modes allow the Saros 10 models to cope with a wide range of flooring surfaces. The cleaner can independently raise and lower the chassis, mop, main brush, and side brush for unrivalled flexibility. This adaptability enables the most efficient cleaning of everything from hardwood surfaces to deep-pile carpets, avoiding the common problem of cleaners that drag dirt from one surface to another. The AdaptiLift chassis can also easily cross the most challenging doorway thresholds.

    Pet hair is another pain point for many robocleaners, with hair tangles a common reason for devices to lose cleaning efficiency. The Saros 10 models have a clever anti-tangle system that divides the main brush and FlexiArm rising side brush to enable 100% hair removal and ensure full cleaning performance is retained.

    Both models are also able to use their intelligence to adapt suction power and cleaning modes in real time to balance efficiency and thoroughness, and they can learn your own preferences as well. If particular areas of the home need to be cleaned more frequently than others, for example, the Saros 10 models will mark those as higher priority.

    Saros 10: The AI-powered deep cleaning specialist
    The Saros 10 is the successor to the S8 MaxV Ultra, and is specifically designed for larger homes with the most challenging cleaning demands. You get maximum cleaning power, with 22,000Pa suction and the most sophisticated mopping system, VibraRise 4.0. This increases the vibrating area of the mop by 27%, and the wheels can lift the front of the cleaner to increase pressure on the mop during a secondary pass for the most stubborn of stains.

    One of the most common problems experienced with conventional cleaning robots is coping with cables running across the floor to lights and other devices. The Saros 10 is equipped with a wide vertical laser in addition to the standard horizontal one, allowing it to detect both cables and other irregular shapes and clean around them.

    That secondary laser also comes into its own when the robot is faced with very low height clearance beneath furniture. The RetractSense Navigation System enables the robot to measure the available height and, where necessary, retract the main LiDAR module into the body of the robot, further reducing its height. While retracted, the Saros 10 switches navigation to the secondary laser, which offers an ultra-wide 100-degree scan. The primary module then raises again once clear of the height restriction.

    Finally, the Saros 10 brings cleaning to a new level through the use of AI algorithms that automatically identify the different types of dirt and stains encountered, work out which cleaning modes will be most effective, and make all the necessary adjustments.

    Saros 10R: The ultimate AI-powered navigation system
    For homes with the most complex layouts, the Saros 10R model introduces a whole new generation of navigation technology. Instead of a traditional laser distance sensor, the 10R combines two different types of scanning. The first is dual-light Time-of-Flight sensors, which offer a sampling frequency 21 times higher than standard laser modules. The second is a set of RGB cameras providing a video feed to the robot. The combined inputs allow the robot to create not just the most detailed and precise mapping of your home, but a true 3D model rather than the 2D version provided by LiDAR.

    Both feeds are processed by the StarSight Autonomous System 2.0, an AI system that can intelligently work out what's in front of it. It can distinguish more than 100 different types of obstacle, from loose cables through children's toys to socks. The AI capabilities also enable the 10R to automatically adjust its cleaning modes based on both what it sees and usage history. For example, it knows when it has cleaned a kitchen or a bathroom, and will trigger immediate mop-washing so that dirt from those rooms isn't spread to the rest of the home. It also takes into account the time of day, automatically reducing suction power during quiet hours.

    The Saros 10R offers voice control, either through its own built-in "Hello, Rocky" commands, or through Alexa, Google Home, and Siri Shortcuts. Matter support will be added later as a free over-the-air update, offering even greater integration with smart home ecosystems like Apple Home.

    Of course, the Saros 10R isn't just about smarts; you also get all the high-performance cleaning capabilities you'd expect. That includes the same advanced multi-lifting and anti-tangle technology as the Saros 10 model.

    Launch discounts offer $200 savings
    To celebrate the launch of its most innovative models, Roborock is offering an introductory discount of $200 on both models.
    - Saros 10: $1,599, now $1,399
    - Saros 10R: $1,599, now $1,399
    Both discounts are available from February 10 to 16 inclusive.
  • Elon Musk Makes Huge Bid to Seize Control of OpenAI
    futurism.com
    Billionaire and White House advisor Elon Musk is leading a gigantic $97.4 billion bid to buy the nonprofit that controls OpenAI, the Wall Street Journal reports.

    Musk's lawyer, Marc Toberoff, delivered the unsolicited offer to the ChatGPT maker's board of directors today, signaling a dramatic escalation in his years-long beef with the company he co-founded alongside CEO Sam Altman in 2015. Musk rage quit the AI company after growing frustrated with its direction in 2019.

    "It's time for OpenAI to return to the open-source, safety-focused force for good it once was," Musk said in a statement provided by Toberoff, as quoted by the WSJ. "We will make sure that happens."

    The timing of the bid is especially interesting, as Altman revealed in a Reddit AMA late last month that he regrets leading OpenAI away from its open-source roots. "I personally think we have been on the wrong side of history here and need to figure out a different open source strategy," he wrote.

    Musk has repeatedly lashed out at Altman, going as far as to sue OpenAI last year for having been "transformed into a closed-source de facto subsidiary of the largest technology company in the world: Microsoft," citing the latter's major investment in the AI company. There's a bit of a double standard there, it's worth pointing out; Musk has raised billions of dollars for his own AI company, xAI, which is for-profit.

    Meanwhile, OpenAI has defended itself against Musk's lawsuit, even demonstrating that Musk had once supported turning OpenAI into a for-profit himself, but later ditched the firm since he couldn't assume full control. Musk ended up dropping the lawsuit three months later.

    In short, the latest bid to take back control of OpenAI could easily be seen as the richest man in the world's dissatisfaction with not being in charge of one of the key players in the ongoing AI race. The bid may have also been inspired by plain jealousy. Altman signed onto President Donald Trump's $500 billion Stargate AI infrastructure deal, which had Musk, who has grown extremely close to Trump, seething with rage last month as he was left out.

    According to the WSJ, xAI is backing Musk's bid. The Grok AI creator would presumably merge with the ChatGPT maker if the bid were accepted. Musk has also secured several other investors, including a number of VC funds and founders. Independent of Musk's latest bid, OpenAI is reportedly trying to raise another $40 billion, which would value it at a gargantuan $340 billion.

    "No thank you but we will buy twitter for $9.74 billion if you want," Altman posted on X after the story went public, in a clear dig at Musk. "Swindler," Musk replied.
  • The Cost of AI: Power Hunger -- Why the Grid Can't Support AI
    www.informationweek.com
    Joao-Pierre S. Ruth, Senior Editor | February 10, 2025 | 8 Min Read

    Remember when plans to use geothermal energy from volcanoes to power bitcoin mining turned heads as examples of skyrocketing, tech-driven power consumption? If it possessed feelings, AI would probably say that was cute as it gazes hungrily at the power grid.

    InformationWeek's The Cost of AI series previously explored how energy bills might rise with demand from artificial intelligence, but what happens if the grid cannot meet escalating needs? Would regions be forced to ration power with rolling blackouts? Will companies have to wait their turn for access to AI and the power needed to drive it? Will more sources of power go online fast enough to absorb demand?

    Answers to those questions might not be as simple as adding windmills, solar panels, and more nuclear reactors to the grid. Experts from KX, GlobalFoundries, and Infosys shared their perspectives on AI's energy demands and the power grid's struggle to accommodate this escalation.

    "I think the most interesting benchmark to talk about is the Stargate [project] that was just announced," says Thomas Barber, vice president, communications infrastructure and data center at GlobalFoundries. The multiyear Stargate effort, announced in late January, is a $500 billion plan to build AI infrastructure for OpenAI with data centers in the United States. "You're talking about building upwards of 50 to 100 gigawatts of new IT capacity every year for the next seven to eight years, and that's really just one company."

    That is in addition to Microsoft and Google developing their own data center buildouts, he says. "The scale of that, if you think about it, is the Hoover Dam generates two gigawatts per year. You need 50 new Hoover Dams per year to do it."

    The Stargate site planned for Abilene, Texas would include power from green energy sources, Barber says. "It's wind and solar power in West Texas that's being used to supply power for that." Business Insider reported that developers also filed permits to operate natural gas turbines at Stargate's site in Abilene.

    Barber says as power gets allocated to data centers, in a broad sense, some efforts to go green are being applied. "It depends on whether or not you consider nuclear green," he says. "Nuclear is one option, which is not carbon-centric. There's a lot of work going into colocated data centers in areas where solar is available, where wind is available."

    Barber says very few exponentials, such as Moore's Law on microchips, last, but AI is now on the upslope of the performance curve of these models. Even as AI gets tested against more difficult problems, these are still the early training days in the technology's development. When AI moves from training and more into inference -- where AI draws conclusions -- Barber says demand could be significantly greater, maybe even 10 times so, than with training data. "Right now, the slope is driven by training," he says. "As these models roll out, as people start adopting them, the demand for inference is going to pick up and the capacity is going to go into serving inference."
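    Purely as a scale check on the figures Barber quotes above (50 to 100 GW of new capacity per year against a roughly 2 GW Hoover Dam), here is a minimal sketch of the division behind his comparison; it restates the quoted numbers rather than offering an independent estimate.

```python
# Quick scale check using the figures quoted above: 50-100 GW of new IT
# capacity per year versus roughly 2 GW for the Hoover Dam. This restates
# Barber's comparison; it is not an independent estimate.

HOOVER_DAM_GW = 2.0

for new_capacity_gw in (50, 100):
    dams = new_capacity_gw / HOOVER_DAM_GW
    print(f"{new_capacity_gw} GW/year is roughly {dams:.0f} Hoover Dams' worth of capacity per year")

# 50 GW/year works out to about 25 Hoover Dams; 100 GW/year to the 50 cited above.
```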
    A Nuclear Scale Matter
    The world already sees very hungry AI models, says Neil Kanungo, vice president of product-led growth for KX, and that demand is expected to rise. According to research released in May by the Electric Power Research Institute (EPRI), data centers currently account for about 4% of electricity use in the United States, and that number is projected to rise as high as 9.1% by 2030.

    While AI training drives high power consumption, Kanungo says the ubiquity of AI inference makes its draw on power significant as well. One way to improve efficiency, he says, would be to remove the transmission side of power from the equation by placing data centers closer to power plants. "You get huge efficiency gains by cutting inefficiency out, where you're having over 30% losses traditionally in power generation," Kanungo says. He is also a proponent of the use of nuclear power, considering its energy load and land usage impact. "The ability to put these data centers near nuclear power plants and what you're transmitting out is not power," he says. "You're transmitting data out. You're not having losses on data transmission."

    Nuclear power development in the United States, he says, has seen some stalling due to negative perceptions of safety and potential environmental concerns. Rising energy demands might be a catalyst to revisit such conversations. "This might be the right time to switch those perceptions," Kanungo says, "because you have tech giants that are willing to take the risks and handle the waste, and go through the red tape, and make this a profitable endeavor."

    He believes these are still the very early stages of AI adoption, and as more agents are used with LLMs -- with agents completing tasks such as shopping for users, filling out tabular data, or doing deep research -- more computation is needed. "We're just at the tip of the iceberg of agents," Kanungo says. "The use cases for these transformer-based LLMs are so great, I think the demand for them is going to continue to go up, and therefore we should be investing power to ensure that you're not jeopardizing residential power: you're not having blackouts, you're not stealing base load."

    Energy Hungry GPUs
    There is an unprecedented load being put on the grid, according to Ashiss Kumar Dash, executive vice president and global head of services, utilities, resources, energy and sustainability at Infosys. He says the power conundrum as it relates to AI is three-pronged.

    "The increase in demand for electricity, increase in demand for energy, is unprecedented," Dash says. "No other general-purpose technology has put this much demand in the past. They say a ChatGPT query consumes 10 times the energy that a Google search would." (According to research published in 2024 by the International Energy Agency, the average electricity demand of a typical Google search without AI is 0.3 watt-hours, while the average demand of a ChatGPT request is 2.9 Wh.)
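    A minimal sketch of what those per-query figures imply at volume, using only the IEA numbers quoted above; the daily query count is a hypothetical placeholder, not a figure from the article.

```python
# Per-query energy comparison using the IEA figures cited above:
# about 0.3 Wh for a basic Google search vs about 2.9 Wh for a ChatGPT request.
# The daily query volume below is a hypothetical placeholder for illustration.

GOOGLE_SEARCH_WH = 0.3
CHATGPT_QUERY_WH = 2.9

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"A ChatGPT request uses about {ratio:.1f}x the electricity of a basic search")

queries_per_day = 100_000_000  # hypothetical volume, for scale only
extra_wh = (CHATGPT_QUERY_WH - GOOGLE_SEARCH_WH) * queries_per_day
print(f"At {queries_per_day:,} queries/day, the difference is about {extra_wh / 1e6:,.0f} MWh per day")
```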
    Dash also cited a CNBC documentary that posited that training an LLM today effectively emits as much carbon dioxide as five gas-fueled cars over their entire lifetimes. "There is this dimension of unprecedented load," he says. "There are energy-hungry GPUs, energy-hungry data centers, and the cloud infrastructure that it needs."

    The second part of the problem, Dash says, is that data centers tend to be concentrated geographically. "If you look at the global data centers, we have [thousands of] data centers in the world, but you can pretty much name where the data centers are," he says. "Seventy percent of the world's internet traffic goes through Virginia." According to research released in May by the Electric Power Research Institute (EPRI), data centers accounted for 25.6% of Virginia's total electricity consumption. There is some debate over the actual amount of internet traffic funneled through Virginia, with some sources, such as TeleGeography, debunking the 70% figure, while Amazon affirmed it just last year. Regardless, Virginia is noted by Visual Capitalist for the energy consumption tied directly to the concentration of data centers there, as cited by Inside Climate News.

    That grid must obviously serve residents and local commercial businesses as well. "When you concentrate the demand like this, it's very difficult for the local grid to manage," Dash says. "Same thing in Europe -- Ireland. Seventeen or 18% of Ireland's electricity demand is on data centers." EPRI estimates that 20% of Ireland's electricity use is attributable to data centers.

    The third aspect of the problem, he says, is load growth. "Utility companies tend to base their grid resiliency models on 2% to 3% maximum growth on a yearly basis," Dash says. "That's how the funding works. That's how the rate cases are built. But now we're talking, in some parts of the US, 20% growth year-on-year. Portland is going to see massive growth. California is seeing the demand."

    The grid and utility models are not designed to handle such fast growth, he says. For utilities to invest in the infrastructure and to build up transmission lines, substations, and transformers is going to be a big challenge. That does not include recurring spikes in energy load in parts of the country, Dash says. "If you have the data centers running at 20% higher energy demand and summer peak hits, the grid is not going to survive -- it's going to go down."

    However, there is some hope such outages might be avoided. AI companies, energy companies, and multiple partners are building an ecosystem to think about the problem, he says. There was even a discussion at the International Energy Agency conference in December, he says, on using AI to work on AI's energy needs. "It was good to hear tech companies, regulators, energy companies, oil and gas and utilities equally."

    Dash says he sees encouragement in redesigning and rethinking the grid, for example with the power usage effectiveness (PUE) metric, which can help drive more efficiency in data centers. "I look at the reports and I find that quite a few organizations are able to optimize their power usage to a level where the power used for IT or tech is almost similar to the power used for the entire operations of the company," he says.

    Initiatives such as the creation of coolants that are more energy efficient, the creation of renewable microgrids close to data centers, and AI modeling to help utilities envision load growth are also encouraging, Dash says. "It's AI solving the problem AI created."

    About the Author
    Joao-Pierre S. Ruth, Senior Editor: Joao-Pierre S. Ruth covers tech policy, including ethics, privacy, legislation, and risk; fintech; code strategy; and cloud and edge computing for InformationWeek. He has been a journalist for more than 25 years, reporting on business and technology first in New Jersey, then covering the New York tech startup community, and later as a freelancer for such outlets as TheStreet, Investopedia, and Street Fight.
  • How to Avoid Common Hybrid Cloud Pitfalls
    www.informationweek.com
    Lisa Morgan, Freelance Writer | February 10, 2025 | 9 Min Read

    Organizations continue to fine-tune how they approach data, applications, and infrastructure. The latest move is pulling some data and workloads back to on-premises data centers. The question is whether they're applying what they learned in the cloud, especially when it comes to private cloud and hybrid implementations.

    "They need to be hybrid by design in terms of intentionally aligning business and technology. It has to be thought through and designed [as] a hybrid cloud architecture, so they don't think, 'Hey, I'm going on prem, so I need to do virtualization here,'" says Nataraj Nagaratnam, CTO for AI governance and cloud security at IBM. "The policies, processes, and organizational construct will enable this transformation, and we see this happening increasingly." What they should be doing is taking the learnings from their move to cloud and having an intentional hybrid strategy.

    "I think AI is an opportunity to get your data right because data feeds the AI. To create value, you need to know where your data is, so governance is important, which in turn means do you have a hybrid landscape in place and a view of your digital assets, data assets and applications?" says Nagaratnam.

    Common Pitfalls
    One common pitfall that organizations experience is moving entirely to cloud without being intentional about workload placement, Nagaratnam says. Another issue is underestimating the management complexity when they've built different management control planes and have lost visibility. The third issue is that they didn't understand the cloud services' shared responsibility model.

    "It's not only infrastructure, like cloud providers, but it is also business applications, software, software providers, SaaS providers, so bringing that together becomes important," says Nagaratnam. "AI will shed more [light] on that shared responsibility, because it's no longer infrastructure only. If you think of a model provider, what's the risk? What's the responsibility of the model provider? So that notion of shared responsibility will continue to increase as you deal with data."

    More fundamentally, the complexity issue has been exacerbated by siloed departmental operations and mergers or acquisitions. Add to that inconsistent policies and significant skills gaps, and it's a recipe for disaster.

    "As companies grow their cloud infrastructure, it becomes more complex and presents a significant challenge to keep under control. This leads to unplanned cloud costs, security risks, production downtime, non-compliant cloud assets and misconfigurations in production," says AJ Thompson, chief commercial officer at IT consultancy Northdoor. Losing control means more cloud expenses and more potential downtime. While most companies appear to have mastered their migration to cloud and the modernization of their applications so they are cloud native, many struggle with the operation of cloud and cost containment. This is why we have seen some organizations move workloads back on-premises, and why many operate in a hybrid environment.

    Brian Oates, product manager of cloud VPS and cloud metal at Liquid Web, says the greatest failure in implementing hybrid clouds has to do with on-premises and cloud systems that are not integrated consistently. And without clear governance, there is also an inability to handle data sprawl.

    "Most of the hybrid cloud pitfalls are basically related to poor planning and strategy.
    Most of the time, organizations use a hybrid solution because of urgent needs, not in the frame of a long-term architectural strategy. Thus, there is a misjudgment in understanding workload compliance, performance and latency requirements of major importance," says Oates. Most organizations have also taken the management of hybrid environments too lightly, assuming that integrating modern cloud with legacy systems would be seamless.

    Northdoor's Thompson says monitoring hybrid clouds can be challenging because cloud services may not integrate easily with existing on-premises solutions. Interoperability issues, and ensuring secure communication and seamless integration across the entire organization's infrastructure, can be challenging. And underestimating network latency can undermine hybrid cloud performance.

    One of the key reasons organizations keep some of their workloads on premises is that they must adhere to strict industry standards surrounding the safeguarding of their data. "Businesses must understand the implications of these regulations, including the most recent DORA and NIS2 regulations, and how they apply to their hybrid cloud environment," says Thompson. "This can become even more complicated for global businesses, as different territories often have their own unique requirements. Therefore, organizations must make sure they implement the appropriate governance and policies for their cloud resources."

    Ferris Ellis, CEO and principal at software engineering firm Urban Dynamics, says people make the mistake of taking the network for granted and just assume that it provides reasonably low latency and high bandwidth. This is not the case with hybrid cloud, where connectivity is a problem and can cause an SLA failure. There are also potential cloud egress fees to consider, depending on how the hybrid cloud network design is done.

    "Networks tend to be taken for granted so people don't think about them," says Ellis. "Second, it requires a more advanced network design than many IT departments are familiar with. They may be familiar with connecting a bunch of offices using a VPN or SD-WAN solution. But for serious workloads you need to have 10, 40, or even 100+ Gbps of reliable, low latency connectivity between one or more of your locations and multiple cloud regions. There are known ways to do this, but they require familiarity with the internals of the internet that remain hidden to most."
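    As a rough illustration of why Ellis's bandwidth and egress points matter, here is a minimal sketch that estimates transfer time at the link speeds he mentions and the egress bill for moving a dataset out of a public cloud; the dataset size and per-GB egress price are hypothetical placeholders, not figures from the article.

```python
# Rough illustration of the bandwidth and egress considerations described above.
# The dataset size and per-GB egress price are hypothetical placeholders;
# the 10/40/100 Gbps link speeds are the ones Ellis mentions.

DATASET_GB = 5_000          # hypothetical: 5 TB moved between on-prem and cloud
EGRESS_PER_GB = 0.09        # hypothetical list price in USD per GB of egress

for link_gbps in (10, 40, 100):
    seconds = (DATASET_GB * 8) / link_gbps  # GB -> gigabits, divided by Gbps
    print(f"{link_gbps} Gbps link: about {seconds / 3600:.1f} hours to move {DATASET_GB:,} GB")

print(f"Estimated egress cost: ${DATASET_GB * EGRESS_PER_GB:,.2f}")
```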
    Get the Right People Involved
    Aside from the product teams whose workloads are moving, the obvious players are the platform and infrastructure teams. There are also a couple of less obvious groups, notably the security, risk, and compliance teams.

    "In my experience, you need these teams to be bought in so they don't become barriers. You must ensure that the conversion doesn't increase risk, and that you are not giving up controls," says Jacob Rosenberg, head of infrastructure at observability platform provider Chronosphere. "In many cases, you can decrease risk, so while getting them bought in may take some work upfront, I think it can be a real win-win."

    Liquid Web's Oates believes a hybrid cloud strategy needs many stakeholder groups to define and implement. "It has to be IT-driven, as this is the team that will have expertise in infrastructure and system integrations," says Oates. On the other hand, business leaders need to be on board to ensure the strategy fits with organizational priorities. Compliance officers and the legal department should review requirements from a regulatory perspective and discuss the best ways to mitigate risk. Financial managers provide insights on cost implications and budgeting. Cybersecurity experts are also necessary for robust defense and data integrity assurance. It's essential to include the right business stakeholders to ensure the implementation meets the needs of the operation. Next is tapping external consultants or managed service providers who bring in new ideas and specialized expertise. "The approach is sure to be comprehensive, yet practical," says Oates.

    How to Ensure a Smoother Transformation
    A hybrid-by-design approach is wise because it forces organizations to be mindful of what data and workloads they have and where they should be to meet business objectives. It also requires business and technology leaders to work together. Architecture that factors in the application layer and infrastructure is another critical consideration.

    "Do you have a view of your data and the end state of your IT? Does the landscape accommodate a hybrid cloud architecture [using] a hybrid cloud platform like OpenShift and Kubernetes for applications? Where is your data? How are you consuming data? Is it as a service? What does your data pipeline look like?" says IBM's Nagaratnam. "Because data is not just going to stay somewhere. [It] has to move. It moves closer to application." Data must also move for AI models, inferencing, and agents, which means thinking about data pipelines in a hybrid context.

    "Hybrid cloud architecture [should] take into account your workload placement and data decisions so that nothing can go to the public cloud or everything needs to stay on prem and whatever decisions there are, but take a risk-based approach, based on data sensitivity," says Nagaratnam. "Create a path to continuous compliance and get ahead with AI governance."

    An ethos of continuous improvement is necessary because it helps ensure agility and more accurate alignment with business requirements. "A hybrid cloud strategy should develop and evolve as your business and technology evolve. Base this on a small pilot project to refine the approach to find any challenges early in the process," says Liquid Web's Oates. "Second, prioritize security, making extensive use of the zero-trust model and applying policy consistency across all environments. [Make sure to have] a great IT staff or partner who can help manage hybrid environment complexities. Invest in tools that will provide your team with a single source of visibility and automate routine tasks. In this way, enable your team to focus on more strategic work."

    Collaboration across departments ensures that the strategy fits business and regulatory purposes. It's also important to review workload placement to ensure effective cost control. "Unexpected cloud costs [come] down to several factors, including inadequate planning, unforeseen disruptions, underutilized cloud instances, a lack of visibility and/or the need for additional resources. Therefore, a key requirement is to understand hybrid cloud pricing structures, as these can be extremely complex and vary from provider to provider," says Northdoor's Thompson.
    Utilizing cloud without knowing what the business needs to pay for can lead to overspending on redundant or underutilized services.

    Chronosphere's Rosenberg has observed two approaches to hybrid cloud that tend to have very different outcomes. "The first is to make your public cloud look like your on-prem infrastructure, and the second method is making your on-prem infrastructure look as cloud-native as possible," says Rosenberg. "The former is often quicker and enables a lift-and-shift migration of workloads, but the second method maximizes the benefits of the cloud environment. For many companies, this means bringing in Kubernetes and refactoring applications to be cloud-native. I find the second method is more appealing because not only do you make your deployment, management, and observability of all your applications uniform across both environments, you also get the advantages of cloud architecture combined with retaining the security and compliance benefits of remaining on-prem."

    About the Author
    Lisa Morgan, Freelance Writer: Lisa Morgan is a freelance writer who covers business and IT strategy and emerging technology for InformationWeek. She has contributed articles, reports, and other types of content to many technology, business, and mainstream publications and sites, including tech pubs, The Washington Post, and The Economist Intelligence Unit. Frequent areas of coverage include AI, analytics, cloud, cybersecurity, mobility, software development, and emerging cultural issues affecting the C-suite.
  • Sony Removes 'Eslop' Games From PlayStation Store, Report Says
    www.cnet.com
    A number of games have recently disappeared from Sony's PlayStation Store, leading to speculation that they were removed by Sony, according to a report from Eurogamer on Monday. The games in question have been referred to as spam or "eslop": games that rely heavily on AI and use misleading names and descriptions to attract gamers. Sony didn't immediately respond to a request for confirmation on the removal of the games.

    These games were the subject of a report from IGN earlier in the month detailing the current problematic trend primarily affecting the PS Store and the Nintendo eShop. A small number of developers, usually consisting of just one person, flood both stores with low-cost games that sound similar to popular titles and use generative AI images to give the impression of cutting-edge graphics. This is intended to entice gamers looking for a deal.

    These eslop titles typically use assets from other games, or free assets available from game developer tools, slapped together to create an experience that isn't worth even the low price charged. Since the games are sold so cheaply, the goal for developers is to sell a large number of copies before people catch on.

    Also sometimes called "AI slop," these games take advantage of the certification process to reach the storefronts with little oversight from the store owners. Sony's and Nintendo's stores reportedly get flooded with them, while Microsoft's store is more of a challenge to release games onto. Similarly, Steam has so many games being added that attempting to release multiple titles fails to get any attention from PC gamers.
  • No, That's Not Patrick Mahomes: Millions View Deepfake Super Bowl Video
    www.cnet.com
    Quarterback Patrick Mahomes and the Kansas City Chiefs didn't have the best day Sunday, as they were thoroughly defeated 40-22 in Super Bowl LIX. Video creators seized the moment to use AI to poke fun at the three-time Super Bowl MVP in a surprisingly realistic deepfake video. Viewer discretion is advised, as some of the jokes are NSFW.

    "I'm a choker, I'm a trash can. Jalen Hurts absolutely destroyed me tonight," the fake Mahomes said in the video, which was posted to both TikTok and YouTube. Fake Mahomes goes on to criticize teammate Travis Kelce, Chiefs coach Andy Reid, and even Kelce's girlfriend, superstar singer Taylor Swift. "And Travis Kelce was too worried about fixing his hair the whole game trying to look cute for Taylor Swift," fake Mahomes says. "He went ghost when I needed him most."

    This wasn't just a small joke video that no one saw. As of Monday afternoon, the TikTok video had a whopping 7.7 million views and more than 557,000 likes. The video creator did make it clear that the video was AI-generated. They used the "creator labeled as AI-generated" label that TikTok introduced last year, and the video itself features a watermark.

    Read more: Here's How To Rewatch the Kendrick Lamar Super Bowl Halftime Show

    Look for signs of fakes
    The realistic AI was made using Parrot AI, according to its creator. On a smartphone at a distance, it's difficult to tell that it's generated by AI at all. However, a closer look reveals the fake Mahomes' entire mouth and jaw quivering unnaturally multiple times throughout the video. It was still enough to fool many people on TikTok who watched the video.

    "This is the most realistic AI I've ever seen," remarked one commenter, followed by a "bruh the AI is getting scary good" from another. Most Chiefs fans took the joke in stride. "I'm a Chiefs fan and this is funny as hell," remarked another TikTok user.

    This is one more deepfake AI video in a long-running series of joke videos poking fun at the Kansas City Chiefs. Most of them involve the conspiracy theory that NFL referees call games in the Chiefs' favor. One example features AI-generated audio of a referee declaring that "during the play, personal foul, violating Patrick Mahomes' personal bubble. The result of the play is first down, touchdown, and Chiefs win by 50 points."

    Latest in a long line of celebrity deepfakes
    Mahomes is one of many celebrities depicted in deepfake AI videos. Tom Hanks and CBS Mornings host Gayle King had AI-generated deepfakes used in ads without their permission. Deepfake videos of Donald Trump, Elon Musk, and Joe Biden were popular during the most recent election. Meanwhile, a terrifying deepfake shows Bill Hader's face slowly warping into Tom Cruise's face.

    Facebook has been stepping up its effort to crack down on celebrity deepfakes and misinformation campaigns after a video of Ukrainian President Volodymyr Zelenskyy was posted falsely directing Ukrainian soldiers to surrender to Russian forces. Other tech companies like Google are also taking steps, with the search giant making it easier to remove deepfake images from Google Search.
  • NIH Funding Cuts Would Hobble U.S. Medical Research, Insider Says
    www.scientificamerican.com
    February 10, 2025 | 5 min read

    Laboratories would literally go dark, says a medical research insider, if Trump administration cuts to NIH funding go through. Patients will suffer from lost medical advances, he tells Scientific American.

    By Dan Vergano, edited by Jeanna Bryner

    The U.S. National Institutes of Health announced on February 7 it was immediately cutting some $4 billion a year in funding to biomedical researchers nationwide. The move would reduce the share of NIH grants paid to indirect costs (lab upkeep, administration, and operation) to 15 percent, cutting their historical rate almost in half, overnight. In the announcement, NIH said that of roughly $35 billion spent funding 300,000 researchers nationwide in 2023, $9 billion went to indirect costs. The move to a lower indirect cost rate, it argued, would put NIH rates more in line with those set by private foundations.

    On February 10, in response, 22 states filed a federal lawsuit "to protect their states and residents from unlawful action by the National Institutes of Health (NIH) that will devastate critical public health research at universities and research institutions in the United States."

    Donald Trump proposed dropping NIH's indirect cost rate to 10 percent in 2017 but faced congressional resistance. As was the case then, the newly proposed cuts have triggered widespread criticism from scientists, who say the move endangers patients and the U.S. strategic advantage in research. "Frankly, this means that the lives of my children and grandchildren, and maybe yours, will be shorter and sicker," medical professor Theodore Iwashyna of Johns Hopkins University told CNN.

    Indirect costs eating into lab grants have long triggered complaints from scientists, but a 2014 Nature analysis concluded that "overall, the data support administrators' assertions that their actual recovery of indirect costs often falls well below their negotiated rates."

    Scientific American spoke to David Skorton, president of the Association of American Medical Colleges, which represents all the medical degree-granting schools in the U.S., about this shift and its effects on medicine. [An edited transcript of the interview follows.]

    How does this affect people who may have never heard of indirect NIH grants before, but who get sick or know people who could benefit from better medicine?

    So the idea of biomedical research is multifaceted. Some of it is meant to help understand the way life works. Over a decade of research led to the idea that messenger RNA, a basic building block of biology, for example, could actually be used as a platform for vaccines. That knowledge was very basic, very fundamental, and eventually fed into Operation Warp Speed and the development of vaccines against COVID. So that's one thing.

    Then there are research projects that you might call applied research, like cancer clinical trials. Someone unfortunately has cancer, and basic research has shown that perhaps a new approach, like immunotherapy, harnessing the immune system to fight off cancer cells, might help. We need to find out, so it goes to human clinical trials.
    Those clinical trials are also research projects. And then there are research projects that have to do with diagnosing illnesses, not treating them. I did some research early in my career on computer processing of medical images from the cardiovascular system. The idea there was to develop better diagnostic techniques that could lead to a quicker way to diagnose an illness, so that you know the right treatment.

    All of those obviously have different requirements. Fundamental research will often require a complicated and expensive laboratory, one that has the right kind of water, utilities and capabilities. And in that laboratory, there can be five research groups, and they're all studying different questions. So those individual research groups, doing their individual projects, will apply for [NIH] funding. But the overall laboratory itself, the cost of running it, the utilities, all those things, can sometimes not be attributed exactly to any one project, because it really applies to all the projects. And so the physical lab operation, utility cost, the libraries that back it up, are considered a so-called facility and administrative cost, sometimes referred to as indirect costs.

    Decades ago the federal government came up with the idea that it would periodically, carefully audit medical schools or universities that were doing the research to find out how much money they were actually spending on things like high-tech lab [equipment], high-speed data processing, security, data storage. And these are the things that are reimbursed through facilities and administrative costs, or indirect costs.

    Just to be clear, these costs are needed to do research. These are all costs that are vital to running labs at universities. They're just not things like an individual pipette. They are functions, say libraries or data centers, things you need to run the lab. The reason the federal government has a careful auditing system is that it's recognized that the direct grant itself is really funding the specific problem that's being studied. But you need this infrastructure to study many, many problems. So no, it's not frills.

    So what happens when you cut this kind of funding?

    In fact, if the facility and administrative costs are cut very, very severely as was announced by the NIH, laboratories would literally go dark. The research would stop. The march of knowledge that someone, a relative, friend, neighbor, co-worker, needs to survive an illness, or to have a diagnosis or move on with their lives after an accident, whatever it might happen to be, they would suffer because the research wouldn't be able to go on. Indirect costs are not frills, and NIH grants do not pay the full cost of doing research for just that reason.

    And that's the intention, right? The idea for decades behind NIH paying this grant money was to build a gigantic biomedical research colossus in the United States, which it has become. These indirect costs are the way we did just that.

    That's exactly correct. But the enterprise was not being built just for the sake of bragging rights. It was built to serve the American people.

    What do you make of the comparison to indirect cost rates for foundation grants in the NIH announcement? Is that a fair comparison?

    We have something online about this, but a couple of high points: Foundations often allow more flexibility than the federal government does for administrative expenses. So that's an apples and oranges comparison.
    And foundations often have a research focus that may differ from the federal government's. When you think about the differences (in other words, what a foundation would pay as a direct cost that the federal government will not) and figure that in, a head-to-head comparison shows the rates are not very much different.
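    The budget figures quoted at the top of the piece imply a rough effective indirect rate, and a minimal sketch of that arithmetic follows. It is a simplification: real NIH rates are negotiated institution by institution and applied to modified direct costs, and, as the 2014 Nature analysis cited above notes, actual recovery often falls below negotiated rates, so treat the numbers as illustrative only.

```python
# Back-of-the-envelope look at the NIH figures cited above: roughly $35B in
# 2023 grant spending, of which about $9B went to indirect (F&A) costs, and a
# proposed 15% cap on the indirect rate. This is a simplification; real rates
# are negotiated per institution and applied to modified direct costs.

total_spend_b = 35.0       # total NIH grant funding, in billions (2023, per the article)
indirect_b = 9.0           # portion paid as indirect costs, in billions
proposed_rate = 0.15       # proposed cap: indirect costs as a share of direct costs

direct_b = total_spend_b - indirect_b
implied_rate = indirect_b / direct_b
print(f"Implied average indirect rate today: about {implied_rate:.0%} of direct costs")

capped_indirect_b = direct_b * proposed_rate
print(f"Indirect payments under a 15% cap: about ${capped_indirect_b:.1f}B "
      f"(versus ${indirect_b:.1f}B), a reduction of roughly ${indirect_b - capped_indirect_b:.1f}B")
# The article reports the change as a cut of some $4 billion a year; the gap
# between that figure and this crude estimate reflects the simplifications above.
```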