• 15 Best Nintendo Switch RPGs Ranked
    www.denofgeek.com
    Whether you just have a few minutes to level up, or you're taking a longer trip that allows you plenty of time to dive into a story and explore a massive new world, the RPG genre is perfect for playing on the go. Or gaming on the couch too, since we're talking about the Nintendo Switch after all. The Switch's massive success over the last eight years has helped it accumulate one of the most impressive RPG libraries in history. From beloved classics to modern favorites, these are the 15 best RPGs on the Nintendo console.

    15. Ni No Kuni: Wrath of the White Witch
    It may be more than a decade old at this point, but Ni No Kuni is still a wonderful mix of all the best parts of classic JRPGs and newer ideas. Gameplay leans a little bit into the Tales series, with a bit of Pokémon influence as well, but very much has its own identity. Either way, the combination of solid combat and exploration, along with a beautifully written story about a boy trying to resurrect his recently deceased mother, will likely keep you playing until the end. Plus, all of the game's animated sequences were done by the legendary Studio Ghibli. If you like what's here, the sequel, Ni no Kuni II: Revenant Kingdom, is also on the Switch, and while it's a very good game, it doesn't quite recapture the same magic as the original, and it lacks the involvement of Studio Ghibli as well.

    14. Tales of Vesperia: Definitive Edition
    While its reputation seems to improve a little more with every new release, the Tales franchise still doesn't have the name recognition of heavy hitters in the genre like Dragon Quest and Final Fantasy. Several games from the series have made their way to Switch (including fan favorite Tales of Symphonia), but Vesperia is just a little bit better. Yuri is just such a compelling main character, who's more cynical than your average JRPG protagonist. While the story may not break much new ground, this is some of the best combat in the Tales series, with plenty of customization and big combos at your fingertips. And it still looks great for a game that originally came out in 2008.

    13. Omori
    A lot of indie games deal with mental health issues, but few are as effective and impactful as Omori. This is a game that uses its JRPG aesthetic to fully explore the issues of depression, anxiety, and isolation, but with a surreal horror twist. Omori is a game that will make you feel a lot of complex emotions. Even the happier moments come with a sense of looming dread. The game's look has been favorably compared to EarthBound, another JRPG darling that dealt with some heavy issues, but while that game mostly keeps its Nintendo family-friendly charm, Omori isn't afraid of venturing into some very dark spaces.

    12. Final Fantasy VII
    Most of the mainline Final Fantasy games (plus quite a few spin-offs) have made their way to the Switch by now, but the seventh game in the series still seems to have the most ardent fanbase. This is the original PS1 classic with all of the turn-based combat and blocky graphics still in place, and it's still fantastic. This is just such a timeless story focusing on loss, identity, and environmentalism, filled with big, iconic moments. It's still worth checking out this version over the recent remakes. Plus, it has several quality of life improvements over the original release, such as the options to speed up gameplay and eliminate random encounters. This makes it well worth revisiting if you haven't played it in a while.

    11. Sea of Stars
    From top to bottom, Sea of Stars is the ultimate love letter to 16-bit RPGs, just a beautiful pixel art adventure with a soaring soundtrack, tight turn-based battles, and an engaging story about saving the world. If you grew up on games like Chrono Trigger and Final Fantasy VI, this game was made for you. Admittedly, it does sometimes lack some of the depth of those older RPGs, but what Sea of Stars lacks in complexity, it makes up for with plenty of charm, and its old school design makes it a perfect fit for lengthy play sessions on the go.

    10. Super Mario RPG
    All of the negatives you've heard about Super Mario RPG are absolutely true. It is an incredibly easy game, and short by any measure. You'll probably beat it in about 12 hours. But even if you just spend a weekend with it, that will be one of the greatest gaming experiences of your life. Super Mario RPG always keeps in mind that games are supposed to be fun. While simple, the battle system (tweaked slightly in this remake to be even easier than the original) is always engaging, and the classic Mario characters are written hilariously. With the updated 3D graphics and new post-game content, this is the definitive way to experience one of the greatest RPGs ever made.

    9. Pokémon Legends: Arceus
    There's certainly no shortage of Pokémon games on the Switch between the Let's Go titles, the Generation IV remakes, plus Sword and Shield and Scarlet and Violet. That's even without going into the many spin-offs. But the one title that really feels like it's trying to do something different with the series and move it forward is Arceus. Obviously, it's still Pokémon at its core, but the large maps, full of wild Pokémon who can attack you when provoked, combined with the simpler catching mechanics, just make it all feel like the next-gen monster catching experience fans have been dreaming about for decades instead of just another rehash of the same old formula.

    8. Octopath Traveler II
    The first Octopath Traveler was a fine example of Square Enix experimenting with several different ideas to see what would stick, and it turns out, pretty much everything did. Gamers loved being able to follow eight different stories that eventually converged, and the new HD-2D graphical style was the perfect blend of old school charm with modern technology. Aside from following a new cast of characters, Octopath Traveler II doesn't do much different from the original game. It doesn't have to. This is just a really well-written, foundationally strong RPG. Hopefully, this isn't the last we've heard of the series.

    7. Paper Mario: The Thousand-Year Door
    The Thousand-Year Door was originally released for the GameCube back in 2004, but it's still widely regarded as the very best of the many Mario RPGs. This just seems like the RPG title that Nintendo put the most care into. The battle system, which requires timed button presses to get the most out of attacks, feels pitch perfect here, and the treasure-seeking story is actually really good. After this game, Nintendo just got too experimental with the sequels, always to their detriment. Wisely, they didn't do much to change the core gameplay or script with this remake. This is mostly just a prettier version of The Thousand-Year Door, but now much more readily available.

    6. Dragon Quest XI S: Echoes of an Elusive Age - Definitive Edition
    Unlike Final Fantasy, which keeps reinventing itself every few years, the Dragon Quest series has remained stubbornly true to its roots for more than three decades now. Though Dragon Quest XI's graphics are top tier, the actual gameplay isn't all that different from the original NES titles. Some gamers might find that to be a little too old school, but if it's not broken, why fix it? This is quite simply a JRPG at its very best, with an epic, sprawling story about saving the world, an eclectic cast of likeable party members, and tried and true turn-based gameplay. Dragon Quest XI doesn't do anything to reinvent the wheel, but it's just so well made that it's hard to pass up.

    5. Star Ocean: The Second Story R
    Since its first release on the original PlayStation, Star Ocean: The Second Story has been continually overlooked despite plenty of praise from gamers and critics alike. Thankfully, Square Enix keeps remaking it to get it the attention it deserves. With its combination of gorgeous 3D backgrounds and sprite-based characters, this is one of the most beautiful and unique looking RPGs on the Switch, especially when combat ramps up and spells start flying. Combat is the real star here. Most battles are quick, action-packed affairs, but more difficult fights require some real strategy, and there are quite a few options in how you approach these fights. As for the story, it's frankly pretty ridiculous the longer it goes on, but still oddly endearing. And if the game really gets its hooks into you, there are a whopping 99 endings to unlock.

    4. Persona 5 Royal
    There's a very good argument to be made that Persona 5 Royal is the greatest JRPG ever made. At the very least, it has one of the greatest stories, a lengthy tale about a silent teenager who grows into the confident Joker, as he and his friends venture into the subconscious minds of adults to change their hearts, all while balancing the demands of high school during the day. The writing is top notch throughout, but even better is how the game drips super cool style, from how thoughtfully designed every last menu is to its award-winning soundtrack. Most playthroughs clock in around 100 hours, and yet, it never outstays its welcome.

    3. The Witcher 3: Wild Hunt
    It's really a miracle of programming that The Witcher 3 works on the Switch as well as it does. This is a game with one of the largest and most detailed maps in any RPG, a complex narrative where seemingly small choices can have major ramifications, and one of the deepest real-time combat systems ever devised in the genre. When it was first released in 2015, it pushed the Xbox One and PS4 pretty hard. And to be fair, playing this in docked mode is not the best way to experience The Witcher 3, even if it's perfectly serviceable. But what's really impressive is just how great the game looks and plays in handheld mode. You can still get the full experience of playing one of the greatest RPGs ever made anywhere, anytime, with minimal compromises.

    2. Fire Emblem: Three Houses
    Just when the Fire Emblem series seemed to have settled into a routine, Nintendo shook things up with a game that may be the very best in the venerable franchise. Three Houses feels very much like two different games, but it works well here. The opening hours mainly take place at a school where you teach classes, build relationships, and choose which of the titular three houses you'll represent. After a time skip, the game opens up into a more traditional Fire Emblem game with all its trademark strategic depth, but the increased focus on social aspects means you'll spend a lot more time getting to know the characters than in earlier titles. Thankfully, the writing is among the series' best, and there are plenty of likable characters you won't mind getting to know better. Best of all, once you've gone through the game with one house, there's plenty more to see if you go back and replay it as the other two.

    1. Xenoblade Chronicles 3
    Given how reluctant Nintendo was to even release the first Xenoblade in the West, it's shocking that this game exists, let alone that it's the best RPG on the Switch. If you played the first two games, you're familiar with a lot of what's here. Xenoblade Chronicles 3 is full of frenetic combat, beautiful open worlds you're free to explore at your leisure, and an intriguing story about soldiers with 10-year lifespans. Monolith Soft took everything they learned from the prior games to craft the best combat in the series, with so many skills and options that you'll still be discovering new things dozens of hours into the game. Yes, this is another particularly lengthy RPG, but then again, it kind of has to be since it ties up so many storylines from the entire trilogy, and actually does so surprisingly well given the sprawling nature of the series. This is RPG perfection.
  • Fantastic Four Red Ghost Theory Makes Perfect Sense for the Marvel Movie
    www.denofgeek.com
    The first trailer for Fantastic Four: First Steps brims with 1960s Marvel goodness. We've got the Marvel-1, we've got H.E.R.B.I.E. with a reel-to-reel for a face, we've got Baxter Building in all of its glory. However, we don't have much of another defining aspect of 1960s Marvel: Cold War politics. Unless, of course, the mystery villain played by John Malkovich is not the Puppet Master, as was initially rumored, but rather Ivan Kragoff, aka the Red Ghost. If Malkovich is indeed the Red Ghost, then First Steps is embracing the '60s even more than we previously thought. The character also makes a lot of sense for this take on Marvel's First Family.

    Red Ghost Is the Wacky B-Villain the Fantastic Four Needs
    "Who is He? What is He?" asks the cover to 1962's Fantastic Four #13, an early chapter in Stan Lee and Jack Kirby's defining run on the series. The issue doesn't answer that question right away, and instead opens with Reed Richards nearly destroying his lab with an amazing discovery: a substance that would allow them to send a rocket to the moon, thus winning the space race. While Ben and Reed argue about Mr. Fantastic attempting a solo mission instead of putting everyone at risk, we cut to Russia, where scientist Ivan Kragoff prepares his team for a Soviet moon mission, except it's a trio of trained apes. The apes allow Kragoff to launch his expedition at the same time as the FF. Like the already-irradiated foursome, Kragoff and his apes get exposed to cosmic rays and gain powers, including shape-shifting and super-strength. Because of his newfound power to become intangible, Kragoff takes the codename Red Ghost. Fantastic Four #13 perfectly encapsulates the tone of 1960s Marvel Comics. Like the comics published by DC, they took a goofy sci-fi tone, working as many monkeys as possible into stories about rockets and lasers. In other words, pulling Kragoff off the page and into First Steps is a way for Marvel to tap into the team's wacky early adventures, just as things are also about to get cosmic with Galactus and the Silver Surfer. Through Kragoff, Marvel can also explore another key part of the Fantastic Four's early days. You see, while Superman and Green Lantern largely stayed out of politics, 1960s Marvel stories traded on the Cold War. Soviets sent Natasha Romanoff aka the Black Widow to steal Tony Stark's secrets. The Russians created their own Hulk out of Emil Blonsky, aka the Abomination. They created their own Captain America with the Red Guardian to battle the Avengers. In short, Lee may have sided with the hippies during his college campus visits, but he sure hated Commies in the pages of Marvel Comics.

    Red Ghost Is a Necessary Part of the 1960s Setting
    Although Black Widow, the Abomination, and Red Guardian have all appeared in the MCU, the modern setting means that Marvel doesn't have to deal with the characters' original politics. The MCU Abomination is English, just like actor Tim Roth. David Harbour plays Red Guardian as a lovable buffoon whose pride in Mother Russia is cartoonish instead of scary. Even Natasha aligns herself more in opposition to the Red Room than to Russia, as indicated by the accent that Scarlett Johansson avoids and that Florence Pugh, as her sister Yelena Belova, adopts. None of these characters feel like they came from the Cold War. Which makes sense, given that Iron Man premiered almost 20 years after the fall of the Berlin Wall. But First Steps can't ignore that tension, at least not if the movie wants to harness some of the '60s optimism of the original Stan and Jack comics. Tony Stark, Reed Richards, Bruce Banner, and other first wave Marvel heroes were free market capitalists and individualists, whose ingenuity and hard work allowed them to create inventions that would prove the superiority of the West. Sure, there was some tragedy in there, as when Bruce risked his life to save a dumb teenager hanging around a gamma bomb test site and became the Hulk. But even Bruce's bravery was framed as something the Soviets lacked. These comics believed that America would inevitably defeat the USSR, and that faith motivated their high-flying imagination. Red Ghost was one of the villains who provided a counter-point. Ugly and old, the Red Ghost represented everything retrograde, as did his decision to rely not on inventions but on apes, an evolutionary step backwards. The Red Ghost may cause trouble for the FF, but it's their boundless faith in the Good Ol' U.S. of A. that charges every wisecrack the Thing makes at Ivan's expense.

    Red Ghost Gets a Good Laugh
    Given how hard Marvel has worked to avoid even calling Nazis bad, separating Red Skull and Hydra from Hitler, it's hard to believe that First Steps will lean so directly into a political message, even one as conservative as the anti-Communist rhetoric of the era. Still, it's impossible to separate the FF's sense of achievement and their wild celebrity status from their ability to win the Space Race for America. Even if First Steps doesn't want to foreground that tension, having a character like Red Ghost around will allow the movie to recognize the Cold War politics indirectly. And as long as we're laughing at his goofy set of super apes, we probably won't notice the political perspective the movie takes, at least not enough to champion it or get mad about it. After all, who can get mad at First Steps when we're watching rockets, robots, super-monkeys, and other fantastic sights? Fantastic Four: First Steps zooms into theaters on July 25, 2025.
  • macOS 15.3 fixes backup bugs affecting multiple apps
    9to5mac.com
    Update: SuperDuper developer Dave Nanian reports that Apple has fixed the issue in macOS 15.3. One or more backup bugs in macOS 15.2 Sequoia are affecting Apple's own Time Machine utility, as well as third-party apps SuperDuper and CarbonCopyCloner. Initially the problem appeared to affect bootable backups only, but it now appears that it is either more general than this, or there is more than one bug affecting Mac backups.

    Things started with SuperDuper
    SuperDuper developer Shirt Pocket originally used its own code to create bootable backups of Mac volumes, before Apple locked things down so that developers had to use the company's own replicator functionality. After SuperDuper bootable backups began failing, the company investigated and found that the bug was in Apple's replicator code in macOS 15.2. "macOS 15.2 was released a few days ago, with a surprise. A terrible, awful surprise. Apple broke the replicator. Towards the end of replicating the Data volume, seemingly when it's about to copy either Preboot or Recovery, it fails with a Resource Busy error. In the past, Resource Busy could be worked around by ensuring the system was kept awake. But this new bug means, on most systems, there's no fix. It just fails. Since Apple took away the ability for 3rd parties (e.g., us) to copy the OS, and took on the responsibility themselves, it's been up to them to ensure this functionality continues to work. And in that, they've failed in macOS 15.2. Because this is their code, and we're forced to rely on it to copy the OS, OS copying will not work until they fix it." Synology said that the same issue is impacting bootable backups to its NAS systems.

    Wider macOS 15 Sequoia backup bug(s)
    CarbonCopyCloner was also affected, which again normally supports bootable backups, but then users reported that (non-bootable) Time Machine backups were also failing. "macOS 15.2 breaks Time Machine backups. A friend had his entire backup history blown away by Time Machine." "Can confirm. I use Time Machine to back up to a Western Digital drive and will (only occasionally) get an error about not all files being available and the backup will try again when the iMac is unlocked." The issue appears to be specific to Apple Silicon Macs. We've reached out to Apple for comment, and will update with any response. Via Daring Fireball.
  • Mac malware after your passwords and credit cards will get much worse this year
    9to5mac.com
    So-called macOS "stealers," malware that seeks to extract personal data like passwords and credit card numbers from your machine, are expected to be significantly more prevalent this year. A new annual report on the state of malware says that Mac owners could be at almost as much risk as Windows PC users this year. Malwarebytes describes the growing security and privacy threat in its 2025 State of Malware report. "Mac malware is undergoing a revolution as an old guard of threats gives way to a dangerous new breed of information stealers that use the same feature set and distribution channels as Windows malware [...] In 2024, a new generation of information stealers emerged to challenge the status quo and give Mac-using businesses a much more serious problem to worry about. Stealers make money for criminals by finding and stealing valuable information on the computers they infect, such as credit card details, authentication cookies, passwords and cryptocurrency. Although they do not discriminate between computers on home or corporate networks, stealers' appetite for passwords and authentication cookies should be a serious concern to organizations using Macs." The report cites Poseidon and Atomic Stealer as examples. "Poseidon boasts that it can steal cryptocurrency from over 160 different wallets, and passwords from web browsers, the Bitwarden and KeePassC password managers, the FileZilla file transfer app, and VPN configurations including Fortinet and OpenVPN [...] Information stealers like Atomic Stealer and Poseidon are a serious and growing threat on the Mac platform. Criminals can use stolen credentials to steal information, access sensitive resources, and create convincing social engineering attacks. In 2025, AI agents will be used to carry out a lot of the legwork for these attacks, meaning that they are likely to be carried out on an unprecedented scale." The company suggests that while Mac owners have historically been much safer than Windows PC users, the threat levels this year could be much closer.

    9to5Mac's Take
    Malwarebytes is in the business of selling corporate defenses against malware attacks, so it's to be expected that it will talk up the risks. However, it's certainly true that macOS stealers have become a much bigger problem in the past year, and the use of autonomous AI agents to carry out attacks is a question of when rather than if. Most Mac malware relies on tricking users into installing it, so your best protection is to be very careful about where you source your Mac software. The Mac App Store is the safest place, followed by the websites of developers you trust. It shouldn't even need saying, but pirate software sites are of course rife with malware.
  • Cybercriminals Use Go Resty and Node Fetch in 13 Million Password Spraying Attempts
    thehackernews.com
    Feb 05, 2025 | Ravie Lakshmanan | Cybersecurity / Cloud Security
    Cybercriminals are increasingly leveraging legitimate HTTP client tools to facilitate account takeover (ATO) attacks on Microsoft 365 environments. Enterprise security company Proofpoint said it observed campaigns using HTTP clients Axios and Node Fetch to send HTTP requests and receive HTTP responses from web servers with the goal of conducting ATO attacks. "Originally sourced from public repositories like GitHub, these tools are increasingly used in attacks like Adversary-in-the-Middle (AitM) and brute force techniques, leading to numerous account takeover (ATO) incidents," security researcher Anna Akselevich said. The use of HTTP client tools for brute-force attacks has been a long-observed trend since at least February 2018, with successive iterations employing variants of OkHttp clients to target Microsoft 365 environments at least until early 2024. But by March 2024, Proofpoint said it began to observe a wide range of HTTP clients gaining traction, with the attacks reaching a new high such that 78% of Microsoft 365 tenants were targeted at least once by an ATO attempt by the second half of last year. "In May 2024, these attacks peaked, leveraging millions of hijacked residential IPs to target cloud accounts," Akselevich said. The volume and diversity of these attack attempts is evidenced by the emergence of HTTP clients such as Axios, Go Resty, Node Fetch, and Python Requests, with those combining precision targeting with AitM techniques achieving a higher compromise rate. Axios, per Proofpoint, is designed for Node.js and browsers and can be paired with AitM platforms like Evilginx to enable theft of credentials and multi-factor authentication (MFA) codes. The threat actors have also been observed setting up new mailbox rules to conceal evidence of malicious activities, stealing sensitive data, and even registering a new OAuth application with excessive permission scopes to establish persistent remote access to the compromised environment. The Axios campaign is said to have primarily singled out high-value targets like executives, financial officers, account managers, and operational staff across transportation, construction, finance, IT, and healthcare verticals. Over 51% of the targeted organizations have been assessed to be successfully impacted between June and November 2024, compromising 43% of targeted user accounts. The cybersecurity company said it also detected a large-scale password spraying campaign using Node Fetch and Go Resty clients, recording no less than 13 million login attempts since June 9, 2024, averaging over 66,000 malicious attempts per day. The success rate, however, remained low, affecting only 2% of targeted entities. More than 178,000 targeted user accounts across 3,000 organizations have been identified to date, a majority of which belong to the education sector, particularly student user accounts that are likely to be less protected and can be weaponized for other campaigns or sold to different threat actors. "Threat actors' tools for ATO attacks have greatly evolved, with various HTTP client tools used for exploiting APIs and making HTTP requests," Akselevich said.
"These tools offer distinct advantages, making attacks more efficient.""Given this trend, attackers are likely to continue switching between HTTP client tools, adapting strategies to leverage new technologies and evade detection, reflecting a broader pattern of constant evolution to enhance their effectiveness and minimize exposure."Found this article interesting? Follow us on Twitter and LinkedIn to read more exclusive content we post.SHARE
  • Silent Lynx Using PowerShell, Golang, and C++ Loaders in Multi-Stage Cyberattacks
    thehackernews.com
    Feb 05, 2025 | Ravie Lakshmanan | Threat Intelligence / Malware
    A previously undocumented threat actor known as Silent Lynx has been linked to cyber attacks targeting various entities in Kyrgyzstan and Turkmenistan. "This threat group has previously targeted entities around Eastern Europe and Central Asian government think tanks involved in economic decision making and banking sector," Seqrite Labs researcher Subhajeet Singha said in a technical report published late last month. Targets of the hacking group's attacks include embassies, lawyers, government-backed banks, and think tanks. It has been assessed to be a Kazakhstan-origin threat actor with a medium level of confidence. The infections commence with a spear-phishing email containing a RAR archive attachment that ultimately acts as a delivery vehicle for malicious payloads responsible for granting remote access to the compromised hosts. The first of the two campaigns, detected by the cybersecurity company on December 27, 2024, leverages the RAR archive to launch an ISO file that, in turn, includes a malicious C++ binary and a decoy PDF file. The executable subsequently proceeds to run a PowerShell script that uses Telegram bots (named "@south_korea145_bot" and "@south_afr_angl_bot") for command execution and data exfiltration. Some of the commands executed via the bots include curl commands to download and save additional payloads from a remote server ("pweobmxdlboi[.]com") or Google Drive. The other campaign, in contrast, employs a malicious RAR archive containing two files: a decoy PDF and a Golang executable, the latter of which is designed to establish a reverse shell to an attacker-controlled server ("185.122.171[.]22:8082"). Seqrite Labs said it observed some level of tactical overlaps between the threat actor and YoroTrooper (aka SturgeonPhisher), which has been linked to attacks targeting the Commonwealth of Independent States (CIS) countries using PowerShell and Golang tools. "Silent Lynx's campaigns demonstrate a sophisticated multi-stage attack strategy using ISO files, C++ loaders, PowerShell scripts, and Golang implants," Singha said. "Their reliance on Telegram bots for command and control, combined with decoy documents and regional targeting, also highlights their focus on espionage in Central Asia and SPECA based nations."
  • It Takes a Village: New Infrastructure Costs for AI -- Utility Bills
    www.informationweek.com
    Joao-Pierre S. Ruth, Senior Editor | February 5, 2025 | 7 Min Read
    Demand for artificial intelligence, from generative AI to the development of artificial general intelligence, puts greater burdens on power plants and water resources, which might also put the pinch on surrounding communities. The need to feed power to the digital beast to support trends, such as the rise of cryptocurrency, is not new, but the persistent demand to build and grow AI calls new attention to the limits of such resources and inevitable rises in price. The growth in power utilized by data centers is unprecedented, says David Driggers, CTO for cloud services provider Cirrascale. With the AI boom that's occurred in the last 18 to 24 months, it is literally unprecedented on the amount of power that's going to data centers and the projected amount of power going into data centers. Dot-com didn't do this. Linux clustering did not do this. The hunger for AI led to a new race for energy and water that can be very precious in some regions. The goal might be to find a wary balance, but for now stakeholders are just looking for ways to keep up. Data centers used to take up 1% of the world's power, and that's now tripled, and it's still going up, Driggers says. That's just insane growth. In recent years, chipmakers such as Nvidia and AMD saw their sales to data centers ramp up in response to demand and expectations for AI, he says, as more users and companies dove into the technology. A big part of it is just the power density of these platforms is significantly higher than anything that's been seen before, Driggers says.

    Feeding the Machines
    There was a time when an entire data center might need one megawatt of power, he says. Then that became the power scale to support just a suite -- now it can take five megawatts to do the job. We're not a hyperscaler, but even within our requirements, we're seeing over six months our minimum capacity requirements are doubling, Driggers says. That's hard to keep up with. The runaway demand might not be simple to respond to given the complexities of regulations, supply, and the costs this all brings. Evan Caron, co-founder and chief investment officer, Montauk Climate, says a very complicated interdependency exists between public and private infrastructure. Who bears the cost of infrastructure buildout? What markets are you in? There's a lot of nuance associated with where, what, when, how, et cetera. There is no catchall answer to this demand, he says, given local and regional differences in resources and regulations. It's very hard to assume the same story works for every part, every region in the US, every region globally, Caron says: who ultimately bears the cost, whether it's inflationary, whether it's ultimately deflationary. Even before the heightened demand for AI, data centers already came with significant utility price tags. Generally speaking, a data center uses a lot of land, a lot of water -- fresh water -- a lot of power, Caron says. And you need to be able to build infrastructure to support the needs of that customer. Depending on where in the US the data center is located, he says there can be requirements for data centers to build substations, transmission infrastructure, pipeline infrastructure, and roads, which all add to the final bill. Some of it will be borne by the consumers in the market, Caron says. The residential customers, the commercial customers that aren't the data center are going to get charged a share of the cost to interconnect that data center. Still, it is not as simple as hiking up prices any time demand increases. Utility companies typically must present before their respective utility commissions the plans to provide those services, their need to build transmission lines, and more to determine whether it is worth making such upgrades, Caron says. That's why you're seeing a lot of pushback, he says, because the assets that are going behind the meter get unfair subsidies from a utility, from a transmission company, from a generation company. This can increase costs passed on to other consumers.

    Footing the Bill
    It does not have to be that way though. If hyperscalers were required to front the entire bill for such new infrastructure, Caron says, it could be argued that it would be a benefit to the rest of the customers and community. However, that is not the current state of affairs. They're not interested in bearing the cost across the board, he says, so they're pushing a lot of those costs back to consumers. The first several years of such buildouts could be very inflationary, Caron says. The promise of AI -- to deliver smarter systems that are more efficient with lower costs of living -- would ultimately be deflationary. In the near term, however, there is a supply and demand imbalance, he says. You have more demand than supply; prices have to rise to meet that. That could lead to increased costs across technology-driven regions with elevated competition for resources. It's going to be very inflationary for a long time, Caron says. He foresees the Trump administration moving to rip out regulation based on a narrative that these processes can be easier, but state governments and the federal government have distinct powers that can make this more complex than solving the problem with the stroke of one pen. Utilities are regulated monopolies in the state, Caron says. There are almost 3,000 separate utilities in North America. Multiple stakeholders, incumbent energy companies, independent power producers, and the fairness doctrine around antitrust are all elements that come into play in this energy race. You're not going to get everyone to be aligned around the same set of expectations, Caron says. Consumers want prices to go down, he says, while energy generators can want prices to go up, transmission companies get a regulated rate of return, and public utility commissions are responsible for the protection of consumer interests. You don't have a situation where this is a cooperative game, Caron says. It is a multi-stakeholder systems approach, and it's not going to be that easy to solve all the problems in a short period of time. A complex lattice of operators, state law, co-ops, government agencies, commissions, and federal involvement comes into play as well. It is not obvious how this can be solved quickly. The near-term demand for power could have a historic impact. It's probably the second time in modern history where we've had to completely rethink how power markets evolve and how power markets grow and scale, he says.

    Not a Drop to Drink
    That still does not even include water in the equation yet. Water is a scarce resource, Caron says. Data centers use five million gallons a day of water. That water's got to come from somewhere. It can come from brackish water or greywater systems, he says, as well as from fresh water. That demand can compete with residential water systems and hospital water systems. Could demand and the cost of these resources push systems to their breaking point, where supply simply cannot keep up? He says the recent executive orders issued around creating a national energy emergency likely would not have emerged if demand had remained moderate. Improved efficiencies and upgraded systems contributed to deflationary energy loads in some energy markets, Caron says. We weren't in an energy crisis, he says. We were actually retiring power plants. We had too much. We were in an abundance scenario. That honeymoon with energy seems to be over with the swelling demand for power to support technology such as data centers and AI. The reason why we're in an energy crisis now, and why the Trump administration has issued an executive order declaring an energy emergency, is we do not have the resources today, Caron says. The national priority, including national security, placed on owning AI and data center infrastructure means more power and other resources will be necessary. Without mobilizing every bit of the economy, like it's almost wartime mobilization, we will run out of those resources to be able to support the load growth that people are predicting for AGI, AI, inference, and LLMs. We just don't have it.
  • The Cost of AI Infrastructure: New Gear for AI Liftoff
    www.informationweek.com
    Richard Pallardy, Freelance Writer | February 5, 2025 | 14 Min Read
    Optimizing an organization for AI utilization is challenging -- not least due to the difficulty in determining which equipment and services are actually necessary and balancing those demands with how much it will cost. In a rapidly changing landscape, companies must decide how much they want to depend on AI and make highly consequential decisions in short order. A 2024 Expereo report found that 69% of businesses are planning on adopting AI in some form. According to a 2024 Microsoft report, 41% of leaders surveyed are in search of assistance in improving their AI infrastructure. Two-thirds of executives were dissatisfied with how their organizations were progressing in AI adoption, according to a BCG survey last year. Circumstances vary wildly, from actively training AI programs to simply deploying them -- or both. Regardless of the use case, a complex array of chips is required -- central processing units (CPUs), graphics processing units (GPUs), and potentially data processing units (DPUs) and tensor processing units (TPUs). Enormous amounts of data are required to train and run AI models, and these chips are essential to doing so. Discerning how much compute power will be required for a given AI application is crucial to deciding how many of these chips are needed -- and where to get them. Solutions must be simultaneously cost-effective and adaptable. Cloud services are accessible and easily scalable, but costs can add up quickly. Pricing structures are often opaque and budgets can balloon in short order even with relatively constrained use. And depending on the applications of the technology, some hardware may be required as well. On-premise solutions can be eye-wateringly expensive too -- and they come with maintenance and updating costs. Setting up servers in-office or in data centers requires an even more sophisticated understanding of projected computing needs -- the amount of hardware that will be needed and how much it will cost to run it. Still, they are also customizable, and users have more direct control. Then, the technicalities of how to store the data used to train and operate AI models and how to transmit that data at high bandwidths and with low latency come into play. So, too, privacy is a concern, especially in the development of new AI models that often use sensitive data. It is a messy and highly volatile ecosystem, making it even more crucial to make informed decisions on technological investment. Here, InformationWeek investigates the complexities of establishing an AI-optimized organization, with insights from Rick Bentley, founder of AI surveillance and remote guarding company Cloudastructure and crypto-mining company Hydro Hash, Adnan Masood, chief AI architect for digital solutions company UST, and Lars Nyman, chief marketing officer of cloud computing company CUDO Compute.

    All About the Chips
    Training and deploying AI programs hinges on CPUs, GPUs, and in some cases TPUs. CPUs provide basic services -- running operating systems, delivering code, and wrangling data. While newer CPUs are capable of the parallel processing required for AI workloads, they are best at sequential processing. An ecosystem only using CPUs is capable of running very moderate AI workloads -- typically, inference only. GPUs of course are the linchpin of AI technology. They allow the processing of multiple streams of data in parallel -- AI is reliant on massive amounts of data and it is crucial that systems can handle these workloads without interruption. Training and running AI models of any significant size -- particularly those using any form of deep learning -- will require GPU power. GPUs may be up to 100 times as efficient as CPUs at performing certain deep learning tasks. Whether they are purchased or rented, GPUs cost a pretty penny. They are also sometimes hard to come by given the high demand. They can crunch data and run training models at hyperspeed. SMEs might go for mid-tier Nvidia GPUs like the A100s, while larger enterprises may dive headfirst into specialized systems like Nvidia DGX SuperPODs, Nyman says. A single high-performance GPU server can cost $40,000 to $400,000, depending on scale and spec. Certain specialized tasks may benefit from the implementation of application specific integrated circuits (ASICs) such as TPUs, which can accelerate workloads that use neural networks.

    Where Does the Data Live?
    AI relies on enormous amounts of data -- words, images, recordings. Some of it is structured and some of it is not. Data can exist either in data lakes -- unstructured pools of raw data that must be processed for use -- or data warehouses -- structured repositories of data that can be more easily accessed by AI applications. Data processing protocols can help filter the former into the latter. Organizations looking to optimize their operations through AI need to figure out where to store that data securely while still allowing machine learning algorithms to access and utilize it. Hard disk drives or flash-based solid-state drive arrays may be sufficient for some projects. Good old spindle hard drives are delightfully cheap, Bentley says. They store a lot of data. But they're not that fast compared to the solid state drives that are out now. It depends on what you're trying to do. Organizations that rely on larger amounts of data may need non-volatile memory express (NVMe)-based storage arrays. These systems are primed to communicate with CPUs and channel the data into the AI program where it can be analyzed and deployed. That data needs to be backed up, too. AI systems obviously thrive on data, but that data can be fragile, Nyman observes. At minimum, SMEs need triple-redundancy storage: local drives, cloud backup, and cold storage. Object storage systems like Ceph or S3-compatible services run around $100/TB a month, scaling up fast with your needs.

    Networking for AI
    An efficient network is essential for establishing an effective AI operation. High-speed networking fools the computer into thinking that it actually has the whole model loaded up, Masood says. Ethernet and fiber connections are generally considered optimal due to their high bandwidth and low latency. Remote direct memory access (RDMA) over Converged Ethernet protocols are considered superior to standard Ethernet-based networks due to their smooth handling of large data transfers. InfiniBand may also be an option for AI applications that require high performance. Low-latency, high-bandwidth networking gear, such as 100 gigabit per second (Gbps) switches, fiber cabling, and SDN (software-defined networking), keeps your data moving fast -- a necessity, Nyman claims. Bandwidth for AI must be high. Enormous amounts of data must be transferred at high speeds even for relatively constrained AI models. If that data is held up because it simply cannot be transferred in time to complete an operation, the model will not provide the promised service to the end user. Latency is a major hang-up. According to findings by Meta, 30% of wasted time in an AI application is due to slow network speeds. Ensuring that no compute node is idle for any significant amount of time can save enormous amounts of money. Failing to utilize a GPU, for example, can result in lost investment and operational costs. Front-end networks handle the non-AI component of the compute necessary to complete the operations as well as the connectivity and management of the actual AI components. Back-end networks handle the compute involved in training and inference -- communication between the chips. Both Ethernet and fiber are viable choices for the front-end network. Ethernet is increasingly the preferred choice for back-end networks. Infrastructure as a service (IaaS) arrangements may take some of the burden off of organizations attempting to navigate the construction of their networks. If you have a large data setup, you don't want to run it with Ethernet, Masood cautions, however. If you're using a protocol like InfiniBand or RDMA, you have to use fiber. Though superior for some situations, these solutions come at a premium. The switches, the transceivers, the fiber cables -- they are expensive, and the maintenance cost is very high, he adds. While some level of onsite technology is likely necessary in some cases, these networking services can be taken offsite, allowing for easier management of the complex array of transfers between the site, data centers, and cloud locations. Still, communication between on-premise devices must also be handled rapidly. Private 5G networks may be useful in some cases. Automation of these processes is key -- this can be facilitated by the implementation of a network operating system (NOS) that can handle the various inputs and outputs and scale as the operation grows. Interoperability is key given that many organizations will utilize a hybrid of cloud, data center, and onsite resources. DPUs can be used to further streamline network operations by processing data packets, taking some of the workload from CPUs and allowing them to focus on more complex computations.

    Where Oh Where Do I Site My Compute?
    AI implementation is tricky: everything, it seems, must happen everywhere and all at once. It is thus challenging to develop a balance of on-site technology, data center resources, and cloud technologies that meets the unique needs of a given application. I've seen 30% of people go with the on-prem route and 70% of the people go with the cloud route, Masood says. Some organizations may be able to get away with using their existing technology, leaning on cloud solutions to keep things running. Implementing a chatbot does not necessarily mean dumping funds into cutting edge hardware and expensive data center storage. Others, however, may find themselves needing more complex workstations, in-house and off-site storage and processing capabilities facilitated by bespoke networks. Training and inference of more complex models requires specialized technology that must be fine-tuned to the task at hand -- balancing exigent costs with scalability and privacy as the project progresses.

    Onsite Solutions
    All organizations will need some level of onsite hardware.
    Small-scale implementation of AI in cloud-based applications will likely require only minor upgrades, if any. The computers that people need to run anything on the cloud are just browsers. It's just a dumb terminal, Bentley says. So you don't really need anything in the office. Larger projects will likely need more specialized setups. The gap, however, is closing rapidly. According to Gartner, AI-enabled PCs containing neural processing units (NPUs) will comprise 43% of PC purchases in 2025. Canalys expects this ratio to rise to 60% by 2027. The transition may be accelerated by the end of support for Windows 10 this year. This suggests that as organizations modernize their basic in-office hardware in the next several years, some level of AI capability will almost certainly be embedded. Some hardware companies are more aggressively rolling out purpose-built, AI-capable devices as well. Thus, some of the compute power required to power AI will be moved to the edge by default -- likely reducing reliance on cloud and data centers to an extent, especially for organizations treading lightly with their early AI use. Speeds will likely be improved by the simple proximity of the necessary hardware. Organizations considering more advanced equipment must consider the amount of compute power they need from their devices in comparison to what they can get from their cloud or data center services -- and how easily it can be upgraded in the future. It's worth noting, for example, that many laptops are difficult to upgrade because the CPUs and GPUs are soldered to the motherboard. The cost for a good workstation with high-end machines is usually between $5,000 and $15,000, depending on your setup, Masood reports. That's really valuable, because the workload people have is constantly increasing. Bentley suggests that in some cases, a simpler solution is available. One of the best bangs for the buck as a step up is a gaming PC. It's just an Intel i9. The CPU almost doesn't matter. It has an RTX 4090 graphics card, he says. Organizations that are going all in will benefit from the increasing sophistication of this type of hardware. But they may also require on-premise servers out of practicality. Siting servers in-house allows for easier customization, maintenance, and scaling. Bandwidth requirements and latency may be reduced. And it is also a privacy safeguard -- organizations handling high volumes of proprietary data and developing their own algorithms to utilize it need to ensure that it is housed and moved with the greatest of care. The upfront costs of installation, in addition to maintenance and staffing, present a challenge. It's harder to procure hardware, Masood notes. Unless you are running a very sophisticated shop where you have a lot of data privacy restrictions and other concerns, you probably want to still go with the cloud approach. For an SME starting from scratch, you're looking at $500,000 to $1 million for a modest AI-ready setup: a handful of GPU servers, a solid networking backbone, and basic redundancy, Nyman says. Add more if your ambitions include large-scale training or real-time AI inference. Building in-house data centers is a heavy lift. We're looking at $20 to $50 million for a mid-sized operation, Nyman estimates. Then there's of course the ongoing cost of cooling, electricity, and maintenance.
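    Those ongoing costs can be sanity-checked with rough arithmetic. The sketch below estimates the electricity line item alone for a nominal 1 MW facility; the PUE and per-kWh rate are illustrative assumptions, not figures quoted by the people interviewed here.

```python
# Back-of-envelope electricity cost for a nominal 1 MW (IT load) data center.
# PUE and $/kWh are illustrative assumptions, not vendor quotes.
IT_LOAD_MW = 1.0        # critical IT load
PUE = 1.4               # assumed power usage effectiveness (cooling, losses)
HOURS_PER_YEAR = 8_760
PRICE_PER_KWH = 0.08    # assumed industrial rate, USD

facility_kwh = IT_LOAD_MW * 1_000 * PUE * HOURS_PER_YEAR
annual_cost = facility_kwh * PRICE_PER_KWH
print(f"Facility draw: {facility_kwh:,.0f} kWh/year")
print(f"Electricity alone: ${annual_cost:,.0f}/year")
# ~12.3 million kWh and roughly $980,000/year under these assumptions,
# before staffing, maintenance, and hardware refresh are counted.
```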
    A 1 megawatt (MW) data center -- enough to power about 10 racks of high-end GPUs -- can cost around $1 million annually just to keep the lights on. But for organizations confident in the profitability of their product, it is likely a worthwhile investment. It may in fact be cheaper than utilizing cloud services in some cases. Further, the cloud is likely to be subjected to an increasing level of strain -- and thus may become less reliable.

    Off-Site Solutions
    Data center co-location services may be suitable solutions for organizations that wish to maintain some level of control over their equipment but do not wish to maintain it themselves. They can customize their servers in the same way they might in an on-premise situation -- installing exactly the number of GPUs and other components they require to operate their programs. SMEs may invest in a shared space in a data center -- they will have 100 GPUs, which they're using to handle training or dev-based workloads. That costs around $100,000 to $200,000 upfront, Masood says. People have been experimenting with it. They can then pay the data center to maintain the servers -- which of course results in additional costs. The tools get increasingly sophisticated the more data you're dealing with, and that gets expensive, Bentley says. Support plans can be like $50,000 a month for the guy who sold you the storage array to keep it running well for you. Still, data centers obviate the need for retrofitting on-premise conditions -- proper connections, cooling infrastructure, and power needs. And at least some maintenance and costs are standardized and predictable. Security protocols will also already be in place, reducing separate security costs.

    Cloud Solutions
    Organizations that prefer minimal hardware infrastructure -- or none at all -- have the option of utilizing cloud computing providers such as Amazon, Google, and Microsoft. These services offer flexible and scalable solutions without the complexity of setting up servers and investing in specialized workstations. Major cloud providers offer a shared responsibility model -- they provide you the GPU instances, they provide the setup. They provide everything for you, Masood says. It's easier. This may be a good option for organizations just beginning to experiment with AI integration or still deciding how to scale up their existing AI applications without spending more on hardware. A wide variety of advanced resources are available, allowing companies to decide on which ones are most useful to them without any overhead aside from the cost of the service and the work itself. Further, they typically offer intuitive interfaces that allow beginners to play with the technology and learn as they go. If companies are using a public cloud provider, they have two options. They can either use managed AI services or they can use the GPU instances the companies provide, Masood says. When they use the GPU instances which companies provide, that is divided into two different categories: spot instances, which means you buy them on demand right away, and renting them. If you rent over longer periods, of course, the cost is cheaper. But cloud is not always the most cost-efficient option. Those bills can get fantastically huge, Bentley says. They start charging for storing data while it's there. There are companies who exist just to help you understand your bill so you can reduce it. They kind of leave you to do the math a lot of the time. I think it's somewhat obfuscated on purpose, he adds.
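    Since much of that math is left to the customer, it is worth modeling a cloud GPU bill explicitly before committing. The following sketch uses hypothetical hourly rates, not any provider's published pricing, to show how quickly the on-demand and committed-use numbers diverge.

```python
# Rough monthly cloud GPU spend under assumed (hypothetical) hourly rates --
# real prices vary by provider, region, GPU model, and commitment term.
ON_DEMAND_RATE = 4.00    # assumed USD/hour per high-end GPU, on demand
RESERVED_RATE = 2.40     # assumed USD/hour with a longer-term commitment
GPU_COUNT = 8
HOURS_PER_MONTH = 730
UTILIZATION = 0.65       # fraction of the month instances are provisioned and running

def monthly_cost(rate: float) -> float:
    return rate * GPU_COUNT * HOURS_PER_MONTH * UTILIZATION

print(f"On demand: ${monthly_cost(ON_DEMAND_RATE):,.0f}/month")
print(f"Reserved:  ${monthly_cost(RESERVED_RATE):,.0f}/month")
# Storage, egress, and idle-but-provisioned instances are billed on top of this,
# which is where bills tend to surprise teams that do not model them up front.
```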
    You still need to have at least one full-time DevOps person whose job it is to run these things well. In the current environment, organizations are compelled to piece together the solutions that work best for their needs. There are no magic formulas that work for everyone -- it pays to solicit the advice of knowledgeable parties and devise custom setups. AI definitely isn't a plug-and-play solution -- yet, Nyman says. It's more like building a spaceship where each part is critical and the whole greater than the sum. Costs can be staggering, but the potential ROI (process automation, faster insights, and market disruption) can justify the investment. Nonetheless, Masood is encouraged. People used to have this idea that AI was a very capital-intensive business. I think that's unfounded. Models are maturing and things are becoming much more accessible, he says.
  • The Download: smart glasses in 2025, and China's AI scene
    www.technologyreview.com
This is today's edition of The Download, our weekday newsletter that provides a daily dose of what's going on in the world of technology.

What's next for smart glasses

For every technological gadget that becomes a household name, there are dozens that never catch on. This year marks a full decade since Google confirmed it was stopping production of Google Glass, and for a long time it appeared as though mixed-reality products would remain the preserve of enthusiasts rather than casual consumers. Fast-forward 10 years, and smart glasses are on the verge of becoming -- whisper it -- cool. Sleeker designs are certainly making this new generation of glasses more appealing. But more importantly, smart glasses are finally on the verge of becoming useful, and it's clear that Big Tech is betting that augmented specs will be the next big consumer device category. Here's what to expect from smart glasses in 2025 and beyond. Rhiannon Williams

This story is part of MIT Technology Review's What's Next series, which looks across industries, trends, and technologies to give you a first look at the future. You can read the rest of them here.

Four Chinese AI startups to watch beyond DeepSeek

The meteoric rise of DeepSeek -- the Chinese AI startup now challenging global giants -- has stunned observers and put the spotlight on China's AI sector. Since ChatGPT's debut in 2022, the country's tech ecosystem has been in relentless pursuit of homegrown alternatives, giving rise to a wave of startups and billion-dollar bets. Today, the race is dominated by tech titans like Alibaba and ByteDance, alongside well-funded rivals backed by heavyweight investors. But two years into China's generative AI boom, we are seeing a shift: smaller innovators have to carve out their own niches or risk missing out. What began as a sprint has become a high-stakes marathon -- China's AI ambitions have never been higher. We have identified these four Chinese AI companies as the ones to watch. Caiwei Chen

The must-reads

I've combed the internet to find you today's most fun/important/scary/fascinating stories about technology.

1 The US Postal Service has stopped accepting parcels from China
And plunged the ecommerce industry into utter chaos. (Wired $)
+ Trump's China tariffs are coming for Amazon, too. (Insider $)

2 Elon Musk has weaponized X in his war on government spending
The billionaire is conducting polls asking users which agency he should gut next. (NYT $)
+ Musk's staffers reportedly entered NOAA headquarters yesterday. (The Guardian)
+ DOGE now appears to have access to Treasury payment systems. (Fast Company $)
+ But it does appear as though Trump blocked Musk from hiring a noncitizen. (The Atlantic $)

3 Google has quietly dropped its promise not to use its AI to build weapons
Just weeks after rival OpenAI also reversed its anti-weapons development stance. (CNN)
+ OpenAI's new defense contract completes its military pivot. (MIT Technology Review)

4 The metaverse's future isn't looking so rosy
Meta's CTO has conceded that this year is critical to its success or failure. (Insider $)

5 OpenAI is attempting to court Hollywood's filmmakers
But its Sora video tool has been met with a frosty reception. (Bloomberg $)
+ How to use Sora, OpenAI's video generating tool. (MIT Technology Review)

6 These drones are launching drones to attack other drones
Ukraine is continuing to produce innovative battlefield technologies. (Ars Technica)
+ Meet the radio-obsessed civilian shaping Ukraine's drone defense. (MIT Technology Review)

7 How to make artificial blood
We're running out of the real stuff. Is fake blood a viable alternative? (New Yorker $)

8 Students have worked out how to hack schools' phone prisons
Teachers should know that smart kids will always find a workaround. (NY Mag $)

9 Social media can't give you validation
So stop trying to find it there. (Vox)

10 Internet slang is out of control
Skibidi, gigachad, or deeve, anyone? (WSJ $)

Quote of the day

"While we encourage people to use AI systems during their role to help them work faster and more effectively, please do not use AI assistants during the application process."

AI company Anthropic urges people applying to work there not to use chatbots and other tools during the process, the Financial Times reports.

The big story

The race to save our online lives from a digital dark age
August 2024

There is a photo of my daughter that I love. She is sitting, smiling, in our old back garden, chubby hands grabbing at the cool grass. It was taken on a digital camera in 2013, when she was almost one, but now lives on Google Photos. But what if, one day, Google ceased to function? What if I lost my treasured photos forever?

For many archivists, alarm bells are ringing. Across the world, they are scraping up defunct websites or at-risk data collections to save as much of our digital lives as possible. Others are working on ways to store that data in formats that will last hundreds, perhaps even thousands, of years.

The endeavor raises complex questions. What is important to us? How and why do we decide what to keep -- and what do we let go? And how will future generations make sense of what we're able to save? Read the full story. Niall Firth

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet 'em at me.)
+ Lets-a go -- Nintendo has added 49 Super Mario World tracks to its music app!
+ Congratulations are in order for New Zealand's Mount Taranaki, which is now legally recognized as a person.
+ I've got something in common with these Hollywood greats at last: they never won an Oscar, either.
+ Do you prefer music or silence in your yoga class?
  • Lords committee finds grey belt policy unlikely to have significant impact on housebuilding
    www.bdonline.co.uk
The government's plan to introduce a new "grey belt" category of land within the green belt is unlikely to have a significant impact on housebuilding, an influential committee has found.

The House of Lords Built Environment Committee has concluded its inquiry into the government policy, which was envisaged as a way for local planning authorities to overcome opposition to development of green belt land with limited environmental value. However, following a five-month inquiry, the committee has concluded the policy itself is now likely to make little direct difference.

Committee chair Lord Moylan, in a letter to housing secretary Angela Rayner, said: "Our assessment is that the grey belt policy has been implemented in a somewhat rushed and incoherent manner, and we do not believe that it is likely to have any significant or lasting impact on planning decision-making or on achieving your target of 1.5 million new homes by the end of this parliament."

The committee said it originally saw potential in the grey belt policy to expand rural settlements and unlock sites on the boundaries of existing communities. It said that by making grey belt land a distinct category and highlighting that it is land that makes a limited contribution to the original green belt principles, it might have been possible to mitigate local opposition to development.

However, in December, the final National Planning Policy Framework was published, including a requirement for councils to review green belt boundaries and propose alterations if they are not able to satisfy their identified needs for homes through other means.

Lord Moylan wrote: "We suspect that the concept of grey belt land may now be largely redundant and that it has been eclipsed by more significant changes to other aspects of the NPPF, which will be likely to see land released from the green belt through existing channels instead."

He said the most likely effect of the finalised grey belt policy, if any, will be to nudge councils and developers towards using the existing recognised processes to allow slightly more development in the green belt.

He added: "A policy that once had the potential to be innovative and unique is now, at best, relegated to the margins."

Lord Moylan also said the committee was not satisfied that the government has a sufficient understanding of the implications raised when introducing concurrent intersecting planning policies, risking its ability to deliver them in a coherent way.

>> See also: Pennycook convinced 1.5 million homes are deliverable but won't commit to annual targets

The committee was also critical of the government's monitoring of performance against its housing and planning policies, citing housing minister Matthew Pennycook, who told the committee the government has no specified annual target for progress towards its 1.5 million homes goal.

Lord Moylan said: "We see no evidence that the government has a clear plan to track the progress and assess the effectiveness of its new policies. We appreciate the difficulty of determining a precise trajectory for reaching targets of this nature, but the proposed approach does not support the measurement or tracking of progress either in terms of the government's housing target, or in terms of evaluating the successes or failures of new or existing planning policies."

The committee also found:
- Grey belt sites have the potential to support SME housebuilders, as the smaller size of some grey belt sites would be less economically attractive to larger builders. However, the affordable housing requirement makes it financially difficult for smaller firms.
- Local authority planning departments will lack sufficient resourcing and expertise to be able to deliver change at the pace demanded of them.
- The introduction of the concept of grey belt land could have the undesirable effect of encouraging ad hoc and speculative applications for development on land within the green belt.

A spokesperson for the Ministry of Housing, Communities and Local Government said: "Our green belt reforms are informed through widespread consultation and will unlock more land for the homes and infrastructure communities desperately need, delivering sustainable, affordable and well-designed developments on low-quality grey belt."