• Patch Notes #9: Xbox debuts its first handhelds, Hong Kong authorities ban a video game, and big hopes for Big Walk

We did it, gang. We completed another week in the impossible survival sim that is real life. Give yourself an appreciative pat on the back and gaze wistfully towards whatever adventures or blissful respite the weekend might bring.

This week I've mostly been recovering from my birthday celebrations, which entailed a bountiful Korean barbecue that left me with a rampant case of the meat sweats and a pub crawl around one of Manchester's finest suburbs. There was no time for video games, but that's not always a bad thing. Distance makes the heart grow fonder, after all.

I was welcomed back to the imaginary office with a news bludgeon to the face. The headlines this week have come thick and fast, bringing hardware announcements, more layoffs, and some notable sales milestones. As always, there's a lot to digest, so let's venture once more into the fray.

The first Xbox handhelds have finally arrived
via Game Developer // Microsoft finally stopped flirting with the idea of launching a handheld this week and unveiled not one, but two devices: the ROG Xbox Ally and ROG Xbox Ally X. The former is pitched towards casual players, while the latter aims to entice hardcore video game aficionados. Both devices were designed in collaboration with Asus and will presumably retail at price points that reflect their respective innards. We don't actually know yet, mind, because Microsoft didn't state how much they'll cost. You get the feeling that's where the company really needs to stick the landing.

Switch 2 tops 3.5 million sales to deliver Nintendo's biggest console launch
via Game Developer // Four days. That's all it took for the Switch 2 to shift over 3.5 million units worldwide and deliver Nintendo's biggest console launch ever. The original Switch, by contrast, needed a month to reach 2.74 million sales, while the PS5 needed two months to sell 4.5 million units worldwide. Xbox sales remain a mystery because Microsoft just doesn't talk about that sort of thing anymore, which is decidedly frustrating for those oddballs (read: this writer) who actually enjoy sifting through financial documents in search of those juicy juicy numbers.

Inside the ‘Dragon Age’ Debacle That Gutted EA’s BioWare Studio
via Bloomberg (paywalled) // How do you kill a franchise like Dragon Age and leave a studio with the pedigree of BioWare in turmoil? According to a new report from Bloomberg, the answer will likely resonate with developers across the industry: corporate meddling. Sources speaking to the publication explained how Dragon Age: The Veilguard, which failed to meet the expectations of parent company EA, was in constant disarray because the American publisher couldn't decide whether it should be a live-service or single-player title. Indecision from leadership within EA and an eventual pivot away from the live-service model only caused more confusion, with BioWare being told to implement foundational changes within impossible timelines. It's a story that's all the more alarming because of how familiar it feels.

Sony is making layoffs at Days Gone developer Bend Studio
via Game Developer // Sony has continued its Tony Award-winning turn as the Grim Reaper by cutting even more jobs within PlayStation Studios. Days Gone developer Bend Studio was the latest casualty, with the first-party developer confirming a number of employees were laid off just months after the cancellation of a live-service project. Sony didn't confirm how many people lost their jobs, but Bloomberg reporter Jason Schreier heard that around 40 people (roughly 30 percent of the studio's headcount) were let go.

Embracer CEO Lars Wingefors to become executive chair and focus on M&A
via Game Developer // Somewhere, in a deep, dark corner of the world, the monkey's paw has curled. Embracer CEO Lars Wingefors, who demonstrated his leadership nous by spending years embarking on a colossal merger and acquisition spree only to immediately start downsizing, has announced he'll be stepping down as CEO. The catch? Wingefors has been proposed as executive chair of Embracer's board. In his new role, he'll apparently focus on strategic initiatives, capital allocation, and mergers and acquisitions. And people wonder why satire is dead.

Hong Kong Outlaws a Video Game, Saying It Promotes 'Armed Revolution'
via The New York Times (paywalled) // National security police in Hong Kong have banned a Taiwanese video game called Reversed Front: Bonfire for supposedly "advocating armed revolution." Authorities in the region warned that anybody who downloads or recommends the online strategy title will face serious legal charges. The game has been pulled from Apple's marketplace in Hong Kong but is still available for download elsewhere; it was never available in mainland China. Developer ESC Taiwan, part of a group of volunteers who are vocal detractors of China's Communist Party, thanked Hong Kong authorities for the free publicity in a social media post and said the ban shows how political censorship remains prominent in the territory.

RuneScape developer accused of ‘catering to American conservatism’ by rolling back Pride Month events
via PinkNews // RuneScape developers inside Jagex have reportedly been left reeling after the studio decided to pivot away from Pride Month content to focus more on "what players wanted." Jagex's CEO broke the news to staff with a post on an internal message board, prompting a rush of complaints—with many workers explaining the content was either already complete or easy to implement. Though Jagex is based in the UK, its parent company CVC Capital Partners operates multiple companies in the United States. It's a situation that left one employee who spoke to PinkNews questioning whether the studio has caved to "American conservatism."

SAG-AFTRA suspends strike and instructs union members to return to work
via Game Developer // It has taken almost a year, but performer union SAG-AFTRA has finally suspended strike action and instructed members to return to work. The decision comes after protracted negotiations with the major studios that employ performers under the Interactive Media Agreement. SAG-AFTRA had been striking to secure better working conditions and AI protections for its members, and feels it has now secured a deal that will install vital "AI guardrails."

A Switch 2-exclusive Splatoon spinoff was just shadow-announced on Nintendo Today
via Game Developer // Nintendo did something peculiar this week when it unveiled a Splatoon spinoff out of the blue. That in itself might not sound too strange, but for a short window the announcement was only accessible via the company's new Nintendo Today mobile app. It's a situation that left people without access to the app questioning whether the news was even real. Nintendo Today prevented users from capturing screenshots or footage, only adding to the sense of confusion. It led to this reporter branding the move a "shadow announcement," which in turn left some of our readers perplexed. Can you ever announce an announcement? What does that term even mean? Food for thought.

A wonderful new Big Walk trailer melted this reporter's heart
via House House (YouTube) // The mad lads behind Untitled Goose Game are back with a new jaunt called Big Walk. This one has been on my radar for a while, but the studio finally debuted a gameplay overview during Summer Game Fest and it looks extraordinary in its purity. It's about walking and talking—and therein lies the charm. Players are forced to cooperate to navigate a lush open world, solve puzzles, and embark upon hijinks. Proximity-based communication is the core mechanic in Big Walk—whether that takes the form of voice chat, written text, hand signals, blazing flares, or pictograms—and it looks like it'll lead to all sorts of weird and wonderful antics. It's a pitch that cuts through because it's so unashamedly different, and there's a lot to love about that. I'm looking forward to this one.
  • Archaeologists Stumble Onto Sprawling Ancient Roman Villa During Construction of a Road in France

    Cool Finds

    Located near Auxerre, the grand estate once possessed an exorbitant level of wealth, with thermal baths and heated floors

    Aerial view of the villa, with thermal baths at the bottom right, the garden and fountain in the center, and the agricultural fields expanding to the left
    Ch. Fouquin / INRAP

    In ancient times, all roads led to Rome—or so the saying goes. Nowadays, new roads can lead to Roman ruins.
During construction on an alternative route to the D606, a regional road just under two miles outside of Auxerre, in central France, salvage archaeologists unearthed a sprawling Roman villa complete with a stately garden, a fountain and an elaborate system of underfloor heating known as a hypocaust, according to a statement from the French National Institute for Preventive Archaeological Research (INRAP).
While researchers have been aware of the ruins on the outskirts of the Gallo-Roman settlement of Autissiodorum (as Auxerre was once known) since the 19th century, previous excavations have been limited. The most recent dig, in 1966, found a 7,500-square-foot building with ten rooms and amenities that suggested its residents enjoyed great wealth and regional power.

    The site of Sainte-Nitasse, adjacent to a regional highway

    Ch. Fouquin / INRAP

    But until now, the true scale of the villa known as Sainte-Nitasse and its surrounding agricultural estates along the River Yonne was unclear. Archaeologists at INRAP have since discovered a 43,000-square-foot building thought to date to between the first and third centuries C.E. It suggests a previously unimagined level of grandeur.
    INRAP identifies the site as one of the “grand villas of Roman Gaul,” according to the statement. Grand villas are typified by their vast dimensions and sophisticated architectural style. They typically encompass both agricultural and residential portions, known in Latin as pars rustica and pars urbana, respectively. In the pars urbana, grand villas tend to feature stately construction materials like marble; extensive mosaics and frescoes; and amenities like private baths, fountains and gardens.
    So far, the excavations at Sainte-Nitasse have revealed all these features and more.
    The villa’s development is extensive. A 4,800-square-foot garden is enclosed by a fountain to the south and a water basin, or an ornamental pond, to the north. The hypocaust, an ancient system of central heating that circulated hot air beneath the floors of the house, signals a level of luxury atypical for rural estates in Roman Gaul.

    A section of the villa's hypocaust heating system, which circulated hot air beneath the floor

    Ch. Fouquin / INRAP

    “We can imagine it as an ‘aristocratic’ villa, belonging to someone with riches, responsibilities—perhaps municipal, given the proximity to Auxerre—a landowner who had staff on site,” Alexandre Burgevin, the archaeologist in charge of the excavations with INRAP, tells France Info’s Lisa Guyenne.
    Near the banks of the Yonne, a thermal bath site contains several pools where the landowner and his family bathed. On the other side of the garden, workers toiled in the fields of a massive agricultural estate.
Aside from its size and amenities, the villa’s level of preservation also astounded archaeologists. “For a rural site, it’s quite exceptional,” Burgevin tells L’Yonne Républicaine’s Titouan Stücker. “You can walk on floors from the time period, circulate between rooms like the Gallo-Romans did.”
Over time, Autissiodorum grew to become a major city along the Via Agrippa, eventually earning the honor of serving as a provincial Roman capital by the fourth century C.E. As Gaul began slipping away from the Roman Empire around the same time, the prominence of the city fluctuated. INRAP archaeologists speculate that the site was repurposed during medieval times, around the 13th century.
    Burgevin offers several explanations for why the site remained so well preserved in subsequent centuries. The humid conditions along the banks of the river might have prevented excess decay. Since this portion of the River Yonne wasn’t canalized until the 19th century, engineers may have already been aware of the presence of ruins. Or, perhaps the rubble of the villa created “bumpy,” intractable soil that was “not easy to pass over with a tractor,” he tells France Info.
    While the site will briefly open to the public on June 15 for European Archaeology Days, an annual event held at sites across the continent, excavations will continue until September, at which time construction on the road will resume. Much work is to be done, including filling in large gaps of the site’s chronology between the Roman and medieval eras.
    “We have well-built walls but few objects,” says Burgevin, per L’Yonne Républicaine. “It will be necessary to continue digging to understand better.”

  • In conflict: Putting Russia’s datacentre market under the microscope

    When Russian troops invaded Ukraine on 24 February 2022, Russia’s datacentre sector was one of the fastest-growing segments of the country’s IT industry, with annual growth rates in the region of 10-12%.
    However, with the conflict resulting in the imposition of Western sanctions against Russia and an outflow of US-based tech companies from the country, including Apple and Microsoft, optimism about the sector’s potential for further growth soon disappeared.
    In early March 2025, it was reported that Google had disconnected from traffic exchange points and datacentres in Russia, leading to concerns about how this could negatively affect the speed of access to some Google services for Russian users.
    Initially, there was hope that domestic technology and datacentre providers might be able to plug the gaps left by the exodus of the US tech giants, but it seems they could not keep up with the hosting demands of Russia’s increasingly digital economy.
    Oleg Kim, director of the hardware systems department at Russian IT company Axoft, says the departure of foreign cloud providers and equipment manufacturers has led to a serious shortage of compute capacity in Russia.
    This is because the situation resulted in a sharp, initial increase in demand for domestic datacentres, but Russian providers simply did not have time to expand their capacities on the required scale, continues Kim.

    According to the estimates of Key Point, one of Russia’s largest datacentre networks, meeting Russia’s demand for datacentres will require facilities with a total capacity of 30,000 racks to be built each year over the next five years.
    On top of this, it has also become more costly to build datacentres in Russia.
Estimates suggest that prior to 2022, the cost of a datacentre rack totalled 100,000 rubles ($1,200), but now exceeds 150,000 rubles.
    And analysts at Forbes Russia expect these figures will continue to grow, due to rising logistics costs and the impact the war is having on the availability of skilled labour in the construction sector.
    The impact of these challenges is being keenly felt by users, with several of the country’s large banks experiencing serious problems when finding suitable locations for their datacentres.
    Sberbank is among the firms affected, with its chairperson, German Gref, speaking out previously about how the bank is in need of a datacentre with at least 200MW of capacity, but would ideally need 300-400MW to address its compute requirements.
    Stanislav Bliznyuk, chairperson of T-Bank, says trying to build even two 50MW datacentres to meet its needs is proving problematic. “Finding locations where such capacity and adequate tariffs are available is a difficult task,” he said.

Read more about datacentre developments

    North Lincolnshire Council has received a planning permission application for another large-scale datacentre development, in support of its bid to become an AI Growth Zone
    A proposal to build one of the biggest datacentres in Europe has been submitted to Hertsmere Borough Council, and already has the support of the technology secretary and local councillors.
    The UK government has unveiled its 50-point AI action plan, which commits to building sovereign artificial intelligence capabilities and accelerating AI datacentre developments – but questions remain about the viability of the plans.

    Despite this, T-Bank is establishing its own network of data processing centres – the first of which should open in early 2027, he confirmed in November 2024.
Kirill Solyev, head of the engineering infrastructure department at the Softline Group of Companies, which specialises in IT, says many large Russian companies are resorting to building their own datacentres because compute capacity is in such short supply.
    The situation is, however, complicated by the lack of suitable locations for datacentres in the largest cities of Russia – Moscow and St Petersburg. “For example, to build a datacentre with a capacity of 60MW, finding a suitable site can take up to three years,” says Solyev. “In Moscow, according to preliminary estimates, there are about 50MW of free capacity left, which is equivalent to 2-4 large commercial datacentres.
    “The capacity deficit only in the southern part of the Moscow region is predicted at 564MW by 2030, and up to 3.15GW by 2042.”
    As a result, datacentre operators and investors are now looking for suitable locations outside of Moscow and St Petersburg, and seeking to co-locate new datacentres in close proximity to renewable energy sources.
And this will be important as demand for datacentre capacity in Russia is expected to increase, as it is in most of the rest of the world, due to the growing use of artificial intelligence (AI) tools and services.
    The energy-intensive nature of AI workloads will put further pressure on operators that are already struggling to meet the compute capacity demands of their customers.

    Speaking at the recent Ural Forum on cyber security in finance, Alexander Kraynov, director of AI technology development at Yandex, says solving the energy consumption issue of AI datacentres will not be easy.
    “The world is running out of electricity, including for AI, while the same situation is observed in Russia,” he said. “In order to ensure a stable energy supply of a newly built large datacentre, we will need up to one year.”
According to a recent report in the Russian business paper Vedomosti, as of April 2024, Russian datacentres were consuming about 2.6GW, equivalent to about 1% of the installed capacity of the Unified Energy System of Russia.
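Those figures invite a quick back-of-envelope check. The sketch below (Python, added here for illustration; its only inputs are the numbers reported in this article, and it makes no independent measurement) shows what the reported 2.6GW and ~1% share imply about total grid capacity, alongside the per-rack cost increase cited earlier:

    # Back-of-envelope check on the figures reported above.
    # All inputs are the article's own numbers; nothing here is measured independently.

    datacentre_load_gw = 2.6   # reported Russian datacentre consumption, April 2024
    share_of_ues = 0.01        # reported share (~1%) of Unified Energy System capacity

    implied_ues_capacity_gw = datacentre_load_gw / share_of_ues
    print(f"Implied UES installed capacity: ~{implied_ues_capacity_gw:.0f} GW")  # ~260 GW

    # Rack cost increase cited earlier: 100,000 rubles before 2022, 150,000+ now.
    cost_before_rub, cost_now_rub = 100_000, 150_000
    increase = (cost_now_rub - cost_before_rub) / cost_before_rub
    print(f"Per-rack cost increase: at least {increase:.0%}")  # at least 50%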
Accommodating AI workloads will also mean operators will need to purchase additional equipment, including expensive accelerators based on graphics processing units (GPUs) and higher-performing data storage systems.
    The implementation of these plans and the viability of these purchases is likely to be seriously complicated by the current sanctions regime against Russia.
    That said, Russia’s prime minister, Mikhail Mishustin, claims this part of the datacentre supply equation is being partially solved by an uptick in the domestic production of datacentre kit.
According to Mishustin, more than half of the server equipment and industrial storage and information processing systems needed for datacentres are already being produced in Russia – and these figures will continue to grow.

The government also plans to provide additional financial support to the industry, as building datacentres in Russia has – to date – been held back by the relatively long payback periods of such projects, which can stretch to 10 years in some cases.
    One of the possible support measures on offer could include the subsidisation of at least part of the interest rates on loans to datacentre developers and operators.
    At the same time, though, the government’s actions in other areas have made it harder for operators to build new facilities.
For example, in March 2025, the Russian government significantly tightened the rules governing the establishment of new datacentres, introducing new design regulations for data processing centres that came into force after approval by the Russian Ministry of Construction.
According to Nikita Tsaplin, CEO of Russian hosting provider RUVDS, the rules have added bureaucracy to the sector because they treat datacentres as standard construction projects.
He predicts the change could stretch the construction cycle of a datacentre from around five years to seven.
The government’s intervention was intended to stop servers being installed in residential settings, such as garages, but it looks set to complicate an already complex situation – prompting questions about whether Russia’s datacentre market will ever reach its full potential.
    #conflict #putting #russias #datacentre #market
    In conflict: Putting Russia’s datacentre market under the microscope
    When Russian troops invaded Ukraine on 24 February 2022, Russia’s datacentre sector was one of the fastest-growing segments of the country’s IT industry, with annual growth rates in the region of 10-12%. However, with the conflict resulting in the imposition of Western sanctions against Russia and an outflow of US-based tech companies from the country, including Apple and Microsoft, optimism about the sector’s potential for further growth soon disappeared. In early March 2025, it was reported that Google had disconnected from traffic exchange points and datacentres in Russia, leading to concerns about how this could negatively affect the speed of access to some Google services for Russian users. Initially, there was hope that domestic technology and datacentre providers might be able to plug the gaps left by the exodus of the US tech giants, but it seems they could not keep up with the hosting demands of Russia’s increasingly digital economy. Oleg Kim, director of the hardware systems department at Russian IT company Axoft, says the departure of foreign cloud providers and equipment manufacturers has led to a serious shortage of compute capacity in Russia. This is because the situation resulted in a sharp, initial increase in demand for domestic datacentres, but Russian providers simply did not have time to expand their capacities on the required scale, continues Kim. According to the estimates of Key Point, one of Russia’s largest datacentre networks, meeting Russia’s demand for datacentres will require facilities with a total capacity of 30,000 racks to be built each year over the next five years. On top of this, it has also become more costly to build datacentres in Russia. Estimates suggest that prior to 2022, the cost of a datacentre rack totalled 100,000 rubles, but now exceeds 150,000 rubles. And analysts at Forbes Russia expect these figures will continue to grow, due to rising logistics costs and the impact the war is having on the availability of skilled labour in the construction sector. The impact of these challenges is being keenly felt by users, with several of the country’s large banks experiencing serious problems when finding suitable locations for their datacentres. Sberbank is among the firms affected, with its chairperson, German Gref, speaking out previously about how the bank is in need of a datacentre with at least 200MW of capacity, but would ideally need 300-400MW to address its compute requirements. Stanislav Bliznyuk, chairperson of T-Bank, says trying to build even two 50MW datacentres to meet its needs is proving problematic. “Finding locations where such capacity and adequate tariffs are available is a difficult task,” he said. about datacentre developments North Lincolnshire Council has received a planning permission application for another large-scale datacentre development, in support of its bid to become an AI Growth Zone A proposal to build one of the biggest datacentres in Europe has been submitted to Hertsmere Borough Council, and already has the support of the technology secretary and local councillors. The UK government has unveiled its 50-point AI action plan, which commits to building sovereign artificial intelligence capabilities and accelerating AI datacentre developments – but questions remain about the viability of the plans. Despite this, T-Bank is establishing its own network of data processing centres – the first of which should open in early 2027, he confirmed in November 2024. 
Kirill Solyev, head of the engineering infrastructure department of the Softline Group of Companies, who specialise in IT, says many large Russian companies are resorting to building their own datacentres – because compute capacity is in such short supply. The situation is, however, complicated by the lack of suitable locations for datacentres in the largest cities of Russia – Moscow and St Petersburg. “For example, to build a datacentre with a capacity of 60MW, finding a suitable site can take up to three years,” says Solyev. “In Moscow, according to preliminary estimates, there are about 50MW of free capacity left, which is equivalent to 2-4 large commercial datacentres. “The capacity deficit only in the southern part of the Moscow region is predicted at 564MW by 2030, and up to 3.15GW by 2042.” As a result, datacentre operators and investors are now looking for suitable locations outside of Moscow and St Petersburg, and seeking to co-locate new datacentres in close proximity to renewable energy sources. And this will be important as demand for datacentre capacity in Russia is expected to increase, as it is in most of the rest of the world, due to the growing use of artificial intelligencetools and services. The energy-intensive nature of AI workloads will put further pressure on operators that are already struggling to meet the compute capacity demands of their customers. Speaking at the recent Ural Forum on cyber security in finance, Alexander Kraynov, director of AI technology development at Yandex, says solving the energy consumption issue of AI datacentres will not be easy. “The world is running out of electricity, including for AI, while the same situation is observed in Russia,” he said. “In order to ensure a stable energy supply of a newly built large datacentre, we will need up to one year.” According to a recent report of the Russian Vedomosti business paper, as of April 2024, Russian datacentres have used about 2.6GW, which is equivalent to about 1% of the installed capacity of the Unified Energy System of Russia. Accommodating AI workloads will also mean operators will need to purchase additional equipment, including expensive accelerators based on graphic processing units and higher-performing data storage systems. The implementation of these plans and the viability of these purchases is likely to be seriously complicated by the current sanctions regime against Russia. That said, Russia’s prime minister, Mikhail Mishustin, claims this part of the datacentre supply equation is being partially solved by an uptick in the domestic production of datacentre kit. According to the Mishustin, more than half of the server equipment and industrial storage and information processing systems needed for datacentres are already being produced in Russia – and these figures will continue to grow. The government also plans to provide additional financial support to the industry, as – to date – building datacentres in Russia has been prevented by relatively long payback periods, of up to 10 years in some cases, of such projects. One of the possible support measures on offer could include the subsidisation of at least part of the interest rates on loans to datacentre developers and operators. At the same time, though, the government’s actions in other areas have made it harder for operators to build new facilities. 
For example, in March 2025, the Russian government significantly tightened the existing norms for the establishment of new datacentres in the form of new rules for the design of data processing centres, which came into force after the approval by the Russian Ministry of Construction. According to Nikita Tsaplin, CEO of Russian hosting provider RUVDS, the rules led to additional bureaucracy in the sector. And, according to his predictions, that situation can extend the construction cycle of a datacentre from around five years to seven years. The government’s intervention here was to prevent the installation of servers in residential areas, such as garages, but it looks set to complicate an already complex situation – prompting questions about whether Russia’s datacentre market will ever reach its full potential. #conflict #putting #russias #datacentre #market
    WWW.COMPUTERWEEKLY.COM
    In conflict: Putting Russia’s datacentre market under the microscope
    When Russian troops invaded Ukraine on 24 February 2022, Russia’s datacentre sector was one of the fastest-growing segments of the country’s IT industry, with annual growth rates in the region of 10-12%. However, with the conflict resulting in the imposition of Western sanctions against Russia and an outflow of US-based tech companies from the country, including Apple and Microsoft, optimism about the sector’s potential for further growth soon disappeared. In early March 2025, it was reported that Google had disconnected from traffic exchange points and datacentres in Russia, leading to concerns about how this could negatively affect the speed of access to some Google services for Russian users. Initially, there was hope that domestic technology and datacentre providers might be able to plug the gaps left by the exodus of the US tech giants, but it seems they could not keep up with the hosting demands of Russia’s increasingly digital economy. Oleg Kim, director of the hardware systems department at Russian IT company Axoft, says the departure of foreign cloud providers and equipment manufacturers has led to a serious shortage of compute capacity in Russia. This is because the situation resulted in a sharp, initial increase in demand for domestic datacentres, but Russian providers simply did not have time to expand their capacities on the required scale, continues Kim. According to the estimates of Key Point, one of Russia’s largest datacentre networks, meeting Russia’s demand for datacentres will require facilities with a total capacity of 30,000 racks to be built each year over the next five years. On top of this, it has also become more costly to build datacentres in Russia. Estimates suggest that prior to 2022, the cost of a datacentre rack totalled 100,000 rubles ($1,200), but now exceeds 150,000 rubles. And analysts at Forbes Russia expect these figures will continue to grow, due to rising logistics costs and the impact the war is having on the availability of skilled labour in the construction sector. The impact of these challenges is being keenly felt by users, with several of the country’s large banks experiencing serious problems when finding suitable locations for their datacentres. Sberbank is among the firms affected, with its chairperson, German Gref, speaking out previously about how the bank is in need of a datacentre with at least 200MW of capacity, but would ideally need 300-400MW to address its compute requirements. Stanislav Bliznyuk, chairperson of T-Bank, says trying to build even two 50MW datacentres to meet its needs is proving problematic. “Finding locations where such capacity and adequate tariffs are available is a difficult task,” he said. Read more about datacentre developments North Lincolnshire Council has received a planning permission application for another large-scale datacentre development, in support of its bid to become an AI Growth Zone A proposal to build one of the biggest datacentres in Europe has been submitted to Hertsmere Borough Council, and already has the support of the technology secretary and local councillors. The UK government has unveiled its 50-point AI action plan, which commits to building sovereign artificial intelligence capabilities and accelerating AI datacentre developments – but questions remain about the viability of the plans. Despite this, T-Bank is establishing its own network of data processing centres – the first of which should open in early 2027, he confirmed in November 2024. 
Kirill Solyev, head of the engineering infrastructure department at IT specialist Softline Group of Companies, says many large Russian companies are resorting to building their own datacentres because compute capacity is in such short supply. The situation is, however, complicated by the lack of suitable locations for datacentres in Russia’s largest cities, Moscow and St Petersburg. “For example, to build a datacentre with a capacity of 60MW, finding a suitable site can take up to three years,” says Solyev. “In Moscow, according to preliminary estimates, there are about 50MW of free capacity left, which is equivalent to 2-4 large commercial datacentres. The capacity deficit only in the southern part of the Moscow region is predicted at 564MW by 2030, and up to 3.15GW by 2042.”
    As a result, datacentre operators and investors are now looking for suitable locations outside of Moscow and St Petersburg, and seeking to co-locate new datacentres in close proximity to renewable energy sources. This will be important, as demand for datacentre capacity in Russia is expected to increase – as it is in most of the rest of the world – due to the growing use of artificial intelligence (AI) tools and services. The energy-intensive nature of AI workloads will put further pressure on operators that are already struggling to meet the compute capacity demands of their customers.
    Speaking at the recent Ural Forum on cyber security in finance, Alexander Kraynov, director of AI technology development at Yandex, said solving the energy consumption issue of AI datacentres will not be easy. “The world is running out of electricity, including for AI, while the same situation is observed in Russia,” he said. “In order to ensure a stable energy supply of a newly built large datacentre, we will need up to one year.” According to a recent report in the Russian business paper Vedomosti, as of April 2024 Russian datacentres consumed about 2.6GW, equivalent to roughly 1% of the installed capacity of the Unified Energy System of Russia.
    Accommodating AI workloads will also mean operators need to purchase additional equipment, including expensive accelerators based on graphics processing units and higher-performing data storage systems. The implementation of these plans, and the viability of such purchases, is likely to be seriously complicated by the current sanctions regime against Russia. That said, Russia’s prime minister, Mikhail Mishustin, claims this part of the datacentre supply equation is being partially solved by an uptick in the domestic production of datacentre kit. According to Mishustin, more than half of the server equipment and industrial storage and information processing systems needed for datacentres are already being produced in Russia – and these figures will continue to grow.
    The government also plans to provide additional financial support to the industry, as – to date – datacentre construction in Russia has been held back by relatively long payback periods, of up to 10 years in some cases. One possible support measure could be the subsidisation of at least part of the interest rates on loans to datacentre developers and operators. At the same time, though, the government’s actions in other areas have made it harder for operators to build new facilities.
For example, in March 2025, the Russian government significantly tightened the rules governing the establishment of new datacentres, with new design standards for data processing centres coming into force following approval by the Russian Ministry of Construction. According to Nikita Tsaplin, CEO of Russian hosting provider RUVDS, the rules have added bureaucracy to the sector, because they treat datacentres as standard construction projects, and he predicts they could stretch the construction cycle of a datacentre from around five years to seven. The government’s intervention was intended to prevent servers being installed in unsuitable residential settings, such as garages, but it looks set to complicate an already complex situation – prompting questions about whether Russia’s datacentre market will ever reach its full potential.
  • Resilience Spacecraft Likely Crashed Into the Moon, Ispace Confirms

    Japan-based Ispace confirmed its Resilience lander likely crashed during its second failed attempt at a lunar landing, after a sensor malfunction prevented proper deceleration. Despite the setback, the company remains committed to future missions, with funding secured for a third attempt using a new lander, Apex 1.0, scheduled for 2027. "Until then, Ispace has its work cut out for it," reports CNN. "[Ispace CEO and founder Takeshi Hakamada] said during the news briefing he will need to work to regain the trust of investors, and the company will need to deeply investigate what went wrong on the Resilience mission to ensure similar issues don't plague Apex 1.0."

    The company has ambitious "plans to eventually build a city on the lunar surface that would house a thousand people and welcome thousands more for tourist visits," notes ABC News. "If ispace is going to establish a colony on the moon, it will need to identify an ample supply of ice or water, which it will convert into fuel for a future lunar fueling station. The ability to produce fuel on the moon will enable the company to transport people back and forth between the Earth and the moon."

    Read more of this story at Slashdot.
  • Japanese Private Lunar Lander Resilience Fails Mission, Crashes on Moon

    Photo Credit: ispace
    A delay in rangefinder data prevented timely deceleration, causing a hard landing on the lunar surface

    Highlights

    Resilience lander lost signal one minute before scheduled moon touchdown
    Delayed laser rangefinder data caused a failure to adjust landing speed
    ispace lunar lander crashes on final descent, marking its second failed mission


    A Japanese spacecraft attempting to achieve the country's first private moon landing instead crashed on the lunar surface, according to mission officials. The Resilience lander, developed by Tokyo-based ispace, lost communication one minute and 45 seconds before its scheduled soft touchdown on June 5 at 3:17 p.m. EDT. The descent was targeted at the Mare Frigoris region on the Moon's near side. The failure of its laser rangefinder makes this ispace's second unsuccessful landing attempt, following the loss of its first lander in April 2023.
    As per an official statement from ispace, telemetry from Resilience revealed that the rangefinder's delayed data caused a failure to adjust landing speed. This likely led to a “hard landing”, meaning the spacecraft hit the Moon's surface too fast to survive or complete its mission. The lander was carrying five payloads, including the Tenacious rover and scientific instruments, none of which survived. The firm's CEO, Takeshi Hakamada, apologised and said the company would apply lessons from this mission to future ones.
    The Hakuto-R Mission 2 team launched the 7.5-foot-tall, 2,200-pound Resilience lander aboard a SpaceX Falcon 9 rocket, and the spacecraft reached lunar orbit in early May. Despite a nominal orbit, the lander smashed into the lunar surface, with its descent reportedly going wrong at around 192 metres – an echo of Mission 1's failure in 2023, which crashed because a fault in one of its altitude sensors was not corrected.
    The Resilience crash adds to the list of troubled private attempts to explore the Moon, including the unsuccessful Beresheet and Peregrine missions, although successful uncrewed commercial landings such as Odysseus and Blue Ghost show that these ambitions are achievable. The loss of the second Hakuto-R mission is a blow to Japan's commercial space hopes, but it has not stopped ispace from developing Mission 3 and Mission 4, which will use its larger Apex 1.0 lander.
    Hakamada said the team's priority now is to find out what caused the crash. “Supporters are disappointed,” CFO Nozaki said, “but ispace has yet to cover the moon, and the road does not end, even if Mission 2 didn't go as planned.”

    Gadgets 360 Staff

  • Can AI Mistakes Lead to Real Legal Exposure?

    Posted on June 5, 2025, by Tech World Times, in AI

    Artificial intelligence tools now touch nearly every corner of modern business, from customer service and marketing to supply chain management and HR. These powerful technologies promise speed, accuracy, and insight, but their missteps can cause more than temporary inconvenience. A single AI-driven error can result in regulatory investigations, civil lawsuits, or public scandals that threaten the foundation of a business. Understanding how legal exposure arises from AI mistakes—and how a skilled attorney protects your interests—is no longer optional, but a requirement for any forward-thinking business owner.
    What Types of AI Errors Create Legal Liability?
    AI does not think or reason like a human; it follows code and statistical patterns, sometimes with unintended results. These missteps can create a trail of legal liability for any business owner. For example, an online retailer’s AI recommends discriminatory pricing, sparking allegations of unfair trade practices. An HR department automates hiring decisions with AI, only to face lawsuits for violating anti-discrimination laws. Even an AI-driven chatbot, when programmed without proper safeguards, can inadvertently give health advice or misrepresent product claims—exposing the company to regulatory penalties. Cases like these are regularly reported in legal news as businesses discover the high cost of digital shortcuts.
    When Is a Business Owner Liable for AI Mistakes?
    Liability rarely rests with the software developer or the tool itself. Courts and regulators expect the business to monitor, supervise, and, when needed, override AI decisions. Suppose a financial advisor uses AI to recommend investments, but the algorithm suggests securities that violate state regulations. Even if the AI was “just following instructions,” the advisor remains responsible for client losses. Similarly, a marketing team cannot escape liability if their AI generates misleading advertising. The bottom line: outsourcing work to AI does not outsource legal responsibility.
    How Do AI Errors Harm Your Reputation and Operations?
    AI mistakes can leave lasting marks on a business’s reputation, finances, and operations. A logistics firm’s route-optimization tool creates data leaks that breach customer privacy and trigger costly notifications. An online business suffers public backlash after an AI-powered customer service tool sends offensive responses to clients. Such incidents erode public trust, drive customers to competitors, and divert resources into damage control rather than growth. Worse, compliance failures can result in penalties or shutdown orders, putting the entire enterprise at risk.
    What Steps Reduce Legal Risk From AI Deployments?
    Careful planning and continuous oversight keep AI tools working for your business—not against it. Compliance is not a “set it and forget it” matter. Proactive risk management transforms artificial intelligence from a liability into a valuable asset.
    Routine audits, staff training, and transparent policies form the backbone of safe, effective AI use in any organization.
    You should review the AI risk mitigation strategies below; a brief, illustrative code sketch of the first strategy follows the list.

    Implement Manual Review of Sensitive Outputs: Require human approval for high-risk tasks, such as legal filings, financial transactions, or customer communications. A payroll company’s manual audits prevented the accidental overpayment of employees by catching AI-generated errors before disbursement.
    Update AI Systems for Regulatory Changes: Stay ahead of new laws and standards by regularly reviewing AI algorithms and outputs. An insurance brokerage avoided regulatory fines by updating their risk assessment models as privacy laws evolved.
    Document Every Incident and Remediation Step: Keep records of AI errors, investigations, and corrections. A healthcare provider’s transparency during a patient data mix-up helped avoid litigation and regulatory penalties.
    Limit AI Access to Personal and Sensitive Data: Restrict the scope and permissions of AI tools to reduce the chance of data misuse. A SaaS provider used data minimization techniques, lowering the risk of exposure in case of a system breach.
    Consult With Attorneys for Custom Policies and Protocols: Collaborate with experienced attorneys to design, review, and update AI compliance frameworks.
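
    To make the first of these strategies concrete, below is a minimal, illustrative Python sketch of a human-in-the-loop gate for high-risk AI output. It is a sketch under stated assumptions rather than a definitive implementation: the channel labels, the RISK_KEYWORDS set, and the approve callable are hypothetical stand-ins for whatever review workflow and risk policy a business actually defines with counsel.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    # Hypothetical risk policy; a real deployment would define these with legal counsel.
    RISK_KEYWORDS = {"diagnosis", "refund", "contract", "wire transfer"}
    HIGH_RISK_CHANNELS = {"legal_filing", "financial_transaction", "customer_email"}

    @dataclass
    class Draft:
        channel: str   # e.g. "customer_email" or "internal_note" (illustrative labels)
        text: str
        created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def needs_human_review(draft: Draft) -> bool:
        # Route a draft to a person when either the channel or the wording is high risk.
        risky_channel = draft.channel in HIGH_RISK_CHANNELS
        risky_wording = any(kw in draft.text.lower() for kw in RISK_KEYWORDS)
        return risky_channel or risky_wording

    def dispatch(draft: Draft, approve, audit_log: list) -> str:
        # `approve` stands in for a review UI or ticket queue; `audit_log` records every
        # decision, in line with the advice above to document incidents and remediation.
        decision = "sent"
        if needs_human_review(draft) and not approve(draft):
            decision = "held_for_review"
        audit_log.append((draft.created_at.isoformat(), draft.channel, decision))
        return decision

    if __name__ == "__main__":
        log = []
        draft = Draft(channel="customer_email", text="We can offer a full refund today.")
        print(dispatch(draft, approve=lambda d: False, audit_log=log))  # held_for_review
        print(log)

    The point is simply that automation and accountability travel together: anything routed down the high-risk path waits for a person to sign off, and every decision leaves a record that can be produced later.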

    How Do Attorneys Shield Your Business From AI Legal Risks?
    Attorneys provide a critical safety net as AI integrates deeper into business operations. They draft tailored contracts, establish protocols for monitoring and escalation, and assess risks unique to your industry. In the event of an AI-driven incident, legal counsel investigates the facts, manages communication with regulators, and builds a robust defense. By providing training, ongoing guidance, and crisis management support, attorneys ensure that innovation doesn’t lead to exposure—or disaster. With the right legal partner, businesses can harness AI’s power while staying firmly on the right side of the law.
  • Probiotics can help heal ravaged coral reefs


    Probiotics are everywhere, claiming to help us poop, restore gut health, and more. They can also be used to help threatened coral reefs. A bacterial probiotic has helped slow the spread of stony coral tissue loss disease (SCTLD) in wild corals in Florida that were already infected with the disease. The findings are detailed in a study published June 5 in the journal Frontiers in Marine Science and show that applying this new probiotic treatment across coral colonies helped prevent further tissue loss.
    What is stony coral tissue loss disease?
    SCTLD first emerged in Florida in 2014. In the 11 years since, it has rapidly spread throughout the Caribbean. This mysterious ailment has been confirmed in at least 20 other countries and territories.
    Other coral pathogens typically target specific species. SCTLD infects more than 30 different species of stony corals, including pillar corals and brain corals. The disease causes the soft tissue in the corals to slough off, leaving behind white patches of exposed skeleton. The disease can devastate an entire coral colony in only a few weeks to months. 
    A great star coral (Montastraea cavernosa) colony infected with stony coral tissue loss disease (SCTLD) on the coral reef in Fort Lauderdale, FL. The lesion, where the white band of tissue occurs, typically moves across the coral, killing coral tissue along the way. CREDIT: Kelly Pitts, Smithsonian.
    The exact cause of SCTLD is still unknown, but it appears to be linked to some kind of harmful bacteria. Currently, the most common treatment for SCTLD is using a paste that contains the antibiotic amoxicillin on diseased corals. However, antibiotics are not a silver bullet. This amoxicillin balm can temporarily halt SCTLD’s spread, but it needs to be frequently reapplied to the lesions on the corals. This takes time and resources, while increasing the likelihood that the microbes causing SCTLD might develop resistance to amoxicillin and related antibiotics.
    “Antibiotics do not stop future outbreaks,” Valerie Paul, a study co-author and the head scientist at the Smithsonian Marine Station at Fort Pierce, Florida, said in a statement. “The disease can quickly come back, even on the same coral colonies that have been treated.”
    Finding the right probiotic
    Paul and her colleagues have spent over six years investigating whether beneficial microorganisms (aka probiotics) could be a longer-lasting alternative to combat this pathogen.
    Just like humans, corals are host to communities known as microbiomes that are bustling with all different types of bacteria. Some of these minuscule organisms produce antioxidants and vitamins that can help keep their coral hosts healthy. 
    First, the team looked at the microbiomes of corals that are impervious to SCTLD to try and harvest probiotics from these disease-resistant species. In theory, these could be used to strengthen the microbiomes of susceptible corals. 
    They tested over 200 strains of bacteria from disease-resistant corals and published a study in 2023 about the probiotic Pseudoalteromonas sp. McH1-7 (or McH1-7 for short). Taken from the great star coral (Montastraea cavernosa), this probiotic produces several antibacterial compounds. Having such a stacked antibacterial toolbox made McH1-7 an ideal candidate to combat a pathogen like SCTLD.
    They initially tested McH1-7 on live pieces of M. cavernosa and found that the probiotic reliably prevented the spread of SCTLD in the lab. After these successful lab tests, the wild ocean called next.
    Testing in the ocean
    The team conducted several field tests on a shallow reef near Fort Lauderdale, focusing on 40 M. cavernosa colonies that showed signs of SCTLD. Some of the corals in these colonies received a paste containing the probiotic McH1-7 that was applied directly to the disease lesions. They treated the other corals with a solution of seawater containing McH1-7 and covered them using weighted plastic bags. The probiotics were administered inside the bag in order to cover the entire coral colony.  
    “This created a little mini-aquarium that kept the probiotics around each coral colony,” Paul said.
    For two and a half years, they monitored the colonies, taking multiple rounds of tissue and mucus samples to see how the corals’ microbiomes were changing over time. They found that  the McH1-7 probiotic successfully slowed the spread of SCTLD when it was delivered to the entire colony using the bag and solution method. According to the samples, the probiotic was effective without dominating the corals’ natural microbes. 
    Kelly Pitts, a research technician with the Smithsonian Marine Station at Fort Pierce, Florida, and co-lead author of the study, treats great star coral (Montastraea cavernosa) colonies infected with SCTLD with probiotic strain McH1-7 by covering the coral colony in a plastic bag, injecting a probiotic bacteria solution into the bag, and leaving the bag for two hours to allow the bacteria to colonize the coral. CREDIT: Hunter Noren.
    Fighting nature with nature
    While using this probiotic appears to be an effective treatment for SCTLD among the reefs of northern Florida, additional work is needed to see how it could work in other regions. Similar tests on reefs in the Florida Keys have been conducted, with mixed preliminary results, likely due to regional differences in SCTLD.
    The team believes that probiotics still could become a crucial tool for combating SCTLD across the Caribbean, especially as scientists fine-tune how to administer them. Importantly, these beneficial bacteria support what corals already do naturally. 
    “Corals are naturally rich with bacteria and it’s not surprising that the bacterial composition is important for their health,” Paul said. “We’re trying to figure out which bacteria can make these vibrant microbiomes even stronger.”
  • Dev snapshot: Godot 4.5 dev 5

    Replicube
    A game by Walaber Entertainment LLC
    Dev snapshot: Godot 4.5 dev 5
    By: Thaddeus Crews, 2 June 2025 – Pre-release
    Brrr… Do you feel that? That’s the cold front of the feature freeze just around the corner. It’s not upon us just yet, but this is likely to be our final development snapshot of the 4.5 release cycle. As we enter the home stretch of new features, bugs are naturally going to follow suit, meaning bug reports and feedback will be especially important for a smooth beta timeframe.
    Jump to the Downloads section, and give it a spin right now, or continue reading to learn more about improvements in this release. You can also try the Web editor or the Android editor for this release. If you are interested in the latter, please request to join our testing group to get access to pre-release builds.
    The cover illustration is from Replicube, a programming puzzle game where you write code to recreate voxelized objects. It is developed by Walaber Entertainment LLC. You can get the game on Steam.
    Highlights
    In case you missed them, see the 4.5 dev 1, 4.5 dev 2, 4.5 dev 3, and 4.5 dev 4 release notes for an overview of some key features which were already in those snapshots, and are therefore still available for testing in dev 5.
    Native visionOS support
    Normally, our featured highlights in these development blogs come from long-time contributors. This makes sense of course, as it’s generally those users that have the familiarity necessary for major changes or additions that are commonly used for these highlights. That’s why it might surprise you to hear that visionOS support comes to us from Ricardo Sanchez-Saez, whose pull request GH-105628 is his very first contribution to the engine! It might not surprise you to hear that Ricardo is part of the visionOS engineering team at Apple, which certainly helps get his foot in the door, but that still makes visionOS the first officially-supported platform integration in about a decade.
    For those unfamiliar, visionOS is Apple’s XR environment. We’re no strangers to XR as a concept, but XR platforms are as distinct from one another as traditional platforms. visionOS users have expressed a strong interest in integrating with our ever-growing XR community, and now we can make that happen. See you all in the next XR Game Jam!
    GDScript: Abstract classes
    While the Godot Engine utilizes abstract classes—a class that cannot be directly instantiated—frequently, this was only ever supported internally. Thanks to the efforts of Aaron Franke, this paradigm is now available to GDScript users. Now if a user wants to introduce their own abstract class, they merely need to declare it via the new abstract keyword:

    abstract class_name MyAbstract extends Node
    The purpose of an abstract class is to create a baseline for other classes to derive from:

    class_name ExtendsMyAbstract extends MyAbstract
    Shader baker
    From the technical gurus behind implementing ubershaders, Darío Samo and Pedro J. Estébanez bring us another miracle of rendering via GH-102552: shader baker exporting. This is an optional feature that can be enabled at export time to speed up shader compilation massively. This feature works with ubershaders automatically without any work from the user. Using shader baking is strongly recommended when targeting Apple devices or D3D12 since it makes the biggest difference there! (Before/after compilation-time comparisons are not reproduced here.) However, it comes with tradeoffs:
    Export time will be much longer.
    Build size will be much larger since the baked shaders can take up a lot of space.
    We have removed several MoltenVK bug workarounds from the Forward+ shader, therefore we no longer guarantee support for the Forward+ renderer on Intel Macs. If you are targeting Intel Macs, you should use the Mobile or Compatibility renderers.
    Baking for Vulkan can be done from any device, but baking for D3D12 needs to be done from a Windows device and baking for Apple .metallib requires a Metal compiler.
    Web: WebAssembly SIMD support
    As you might recall, Godot 4.0 initially released under the assumption that multi-threaded web support would become the standard, and only supported that format for web builds. This assumption unfortunately proved to be wishful thinking, and was reverted in 4.3 by allowing for single-threaded builds once more. However, this doesn’t mean that these single-threaded environments are inherently incapable of parallel processing; it just requires alternative implementations. One such implementation, SIMD, is a perfect candidate thanks to its support across all major browsers. To that end, web-wiz Adam Scott has taken to integrating this implementation for our web builds by default.
    Inline color pickers
    While it’s always been possible to see what kind of variable is assigned to an exported color in the inspector, some users have expressed a keen interest in allowing for this functionality within the script editor itself. This is because it would mean seeing what kind of color is represented by a variable without it needing to be exposed, as well as making it more intuitive at a glance as to what color a name or code corresponds to. Koliur Rahman has blessed us with this quality-of-life goodness, which adds an inline color picker (GH-105724). Now no matter where the color is declared, users will be able to immediately and intuitively know what is actually represented in a non-intrusive manner.
    Rendering goodies
    The renderer got a fair amount of love this snapshot; not from any one PR, but rather a multitude of community members bringing some long-awaited features to light. Raymond DiDonato helped SMAA 1x make its transition from addon to fully-fledged engine feature. Capry brings bent normal maps to further enhance specular occlusion and indirect lighting. Our very own Clay John converted our Compatibility backend to use a fragment shader copy instead of a blit copy, working around common sample rate issues on mobile devices. 
Rendering goodies

The renderer got a fair amount of love this snapshot; not from any one PR, but rather from a multitude of community members bringing some long-awaited features to light. Raymond DiDonato helped SMAA 1x make its transition from addon to fully-fledged engine feature (GH-102330). Capry brings bent normal maps to further enhance specular occlusion and indirect lighting (GH-89988). Our very own Clay John converted our Compatibility backend to use a fragment shader copy instead of a blit copy, working around common sample rate issues on mobile devices (GH-106267).

More technical information on these rendering changes can be found in their associated PRs. (The original post includes an SMAA off/on comparison and a bent normal map before/after comparison.)

And more!

There are too many exciting changes to list them all here, but here’s a curated selection:

- Animation: Add alphabetical sorting to Animation Player (GH-103584).
- Animation: Add animation filtering to animation editor (GH-103130).
- Audio: Implement seek operation for Theora video files, improve multi-channel audio resampling (GH-102360).
- Core: Add --scene command line argument (GH-105302).
- Core: Overhaul resource duplication (GH-100673).
- Core: Use Grisu2 algorithm in String::num_scientific to fix serializing (GH-98750).
- Editor: Add “Quick Load” button to EditorResourcePicker (GH-104490).
- Editor: Add PROPERTY_HINT_INPUT_NAME for use with @export_custom to allow using input actions (GH-96611).
- Editor: Add named EditorScripts to the command palette (GH-99318).
- GUI: Add file sort to FileDialog (GH-105723).
- I18n: Add translation preview in editor (GH-96921).
- Import: Add Channel Remap settings to ResourceImporterTexture (GH-99676).
- Physics: Improve performance with non-monitoring areas when using Jolt Physics (GH-106490).
- Porting: Android: Add export option for custom theme attributes (GH-106724).
- Porting: Android: Add support for 16 KB page sizes, update to NDK r28b (GH-106358).
- Porting: Android: Remove the gradle_build/compress_native_libraries export option (GH-106359).
- Porting: Web: Use actual PThread pool size for get_default_thread_pool_size() (GH-104458).
- Porting: Windows/macOS/Linux: Use SSE 4.2 as a baseline when compiling Godot (GH-59595).
- Rendering: Add new StandardMaterial properties to allow users to control FPS-style objects (hands, weapons, tools close to the camera) (GH-93142).
- Rendering: FTI - Optimize SceneTree traversal (GH-106244).

Changelog

109 contributors submitted 252 fixes for this release. See our interactive changelog for the complete list of changes since the previous 4.5-dev4 snapshot. This release is built from commit 64b09905c.

Downloads

Godot exists thanks to donations from people like you. Help us continue our work: Make a Donation.

The Standard build includes support for GDScript and GDExtension. The .NET build (marked as mono) includes support for C#, as well as GDScript and GDExtension.

While engine maintainers try their best to ensure that each preview snapshot and release candidate is stable, this is by definition a pre-release piece of software. Be sure to make frequent backups, or use a version control system such as Git, to preserve your projects in case of corruption or data loss.

Known issues

Windows executables (both the editor and export templates) have been signed with an expired certificate. You may see warnings from Windows Defender’s SmartScreen when running this version, or be outright prevented from running the executables with a double-click (GH-106373). Running Godot from the command line can circumvent this. We will soon have a renewed certificate which will be used for future builds.

With every release, we accept that there are going to be various issues which have already been reported but haven’t been fixed yet. See the GitHub issue tracker for a complete list of known bugs.

Bug reports

As a tester, we encourage you to open bug reports if you experience issues with this release. Please check the existing issues on GitHub first, using the search function with relevant keywords, to ensure that the bug you experience is not already known. In particular, any change that would cause a regression in your projects is very important to report (e.g. if something that worked fine in previous 4.x releases no longer works in this snapshot).

Support

Godot is a non-profit, open source game engine developed by hundreds of contributors in their free time, as well as a handful of part-time and full-time developers hired thanks to generous donations from the Godot community.
A big thank you to everyone who has contributed their time or their financial support to the project! If you’d like to support the project financially and help us secure our future hires, you can do so using the Godot Development Fund.
  • SpaceX Starship Flight 9 Reuses Booster, Gathers Key Data Despite Loss


    Highlights

    First reuse of Super Heavy booster marks key Starship milestone
    Ship reached space but failed to deploy dummy Starlink satellites
    Valuable reentry and tile test data collected despite stage losses


SpaceX launched its ninth Starship test flight on May 27, featuring the first-ever significant reuse of Starship hardware. As planned on Flight 9, Starship's two stages separated successfully, and the upper stage even reached space. However, both were ultimately lost before completing their objectives. Despite these setbacks, the mission yielded valuable data that feeds SpaceX's iterative approach to innovation as it works toward a fully reusable launch system for space missions. The flight demonstrated the successful reuse of a Super Heavy booster and aimed to show improved hardware performance.

Previous test flights

According to SpaceX's official site, Starship's two stages are one giant booster called Super Heavy and a 171-foot-tall (52 meters) upper-stage spacecraft known as Starship, or simply "Ship." Both are powered by SpaceX's new Raptor engine — 33 of them for Super Heavy and six for Ship. On Flight 7 and Flight 8 the Super Heavy performed flawlessly, acing its engine burn and then returning to Starbase for a catch by the launch tower's "chopstick" arms. But Ship had problems: it exploded less than 10 minutes after launch on both missions, raining debris down on the Turks and Caicos Islands and The Bahamas, respectively.

Advancements in Flight 9

On Flight 9, SpaceX reused a Super Heavy booster for the first time, swapping out just four of its 33 Raptor engines after its initial flight in January. The booster also conducted a new atmospheric entry experiment, entering at a higher angle to collect data on aerodynamic control. Meanwhile, Ship (the upper stage) was tasked with deploying eight dummy Starlink satellites. Despite the promising advances, Flight 9 encountered several failures. Super Heavy broke apart roughly six minutes after launch during its return burn, and Ship lost control due to a fuel tank leak. The upper stage began tumbling, which prevented a planned in-space engine relight and led to a destructive reentry over the Indian Ocean. Still, SpaceX gained critical data, particularly on tile performance and active cooling systems.

     



    Gadgets 360 Staff

  • Europe threatens Apple with additional fines

The European Commission has published its full Digital Markets Act (DMA) decision against Apple, and it’s far, far worse than anybody expected. The Commission, the executive arm of the European Union, has accepted absolutely none of Apple’s arguments against being fined, and the decision threatens yet more existential damage to the company.

    Apple isn’t winning the argument, and, right or wrong, the decision has fangs.

    Huge fines, big threats

Europe announced in April that it would fine Apple an eye-popping €500 million for noncompliance with the DMA, giving Apple 60 days to comply with its decision. One month later, the Commission published the full ruling against Apple, which details that the changes the company made to its App Store rules did not go far enough to bring it into compliance.

The decision warns that Apple is subject to additional periodic fines in the future if it fails to comply with the Commission’s strict interpretation of the DMA, no matter how inherently punitive some of its demands may be. (Can anyone else spell “tariffs”?) We’ll know soon enough if there are to be wider consequences to Europe’s demands. Apple now has 30 days to fully comply with the DMA (in Europe’s opinion) or face additional fines.

    The act itself came into force in November 2022 and began to be implemented against companies defined as ‘gatekeepers’ in 2023. The intention is to stop Apple and others from using their market position to impose anticompetitive limitations on developers. 

    Who is steering?

    The big bugbear relates to Apple’s anti-steering restrictions, which prevent developers from telling customers they can purchase services outside the App Store. The DMA demands that Apple let developers offer this option, which Apple does, but Europe argues that the limitations the company makes on doing so are not in compliance with the law.

    Europe also says Apple’s existing restrictions, fees, and technical limitations undermine the effectiveness of the DMA. That seems to mean Apple cannot charge a commission and cannot warn users of the consequences they face when shopping outside the App Store. 

The Commission even plays dumb to the potential significance of permitting developers to link out to any website from within their apps, rather than being constrained to approved (and secure) sites. It says Apple has provided insufficient justification for this restriction and also wants Apple to remove messages warning users when they are about to make a transaction outside the App Store.

That’s going to be particularly pleasing to fraudsters, who may now attempt to create fake payment portals that look like reputable ones. Apple prevented $2 billion in fraud last year, the company has confirmed. Perhaps once the first big frauds take place, the EU may catch up to the online risks we all know exist.

    While I understand the original aim of Europe’s Digital Markets Act, the demands the Commission is making of Apple appear to go far beyond the original objective, which was to open up Apple’s platforms to competition. 

    The decisions now open Apple’s platform up to competitors. 

    There is a difference between the two, and, as described, it means Apple must now create and manage its platforms while permitting competitors to profit from those platforms at little or no cost.

    Apple rejects Europe

    Apple will fight in Europe. 

    “There is nothing in the 70-page decision released today that justifies the European Commission’s targeted actions against Apple, which threaten the privacy and security of our users in Europe and force us to give away our technology for free,” the company said. “Their decision and unprecedented fine came after the Commission continuously moved the goalposts on compliance, and repeatedly blocked Apple’s months-long efforts to implement a new solution. The decision is bad for innovation, bad for competition, bad for our products, and bad for users. While we appeal, we’ll continue engaging with the Commission to advocate on behalf of our European customers.”

    When the fine was initially revealed, the company also said: 

    “Today’s announcements are yet another example of the European Commission unfairly targeting Apple in a series of decisions that are bad for the privacy and security of our users, bad for products, and force us to give away our technology for free. We have spent hundreds of thousands of engineering hours and made dozens of changes to comply with this law, none of which our users have asked for. Despite countless meetings, the Commission continues to move the goal posts every step of the way.”

    My take? 

    Far from saving Europe’s tech industry, the manner in which the DMA is being applied will make the region even less relevant. Lacking a significant platform of its own, Europe’s approach will reduce choice and increase insecurity.

    As the clear first target of the DMA, Apple will inevitably be forced to increase prices, charge developers more for access to its developer tools, and will I think simply stop selling some products and services in Europe, rather than threaten customer security. We know it can do this because it has done so before.

Fundamentally, of course, the big question remains unaddressed: how much profit is it legitimate to make on any product or service? I imagine the European Commission doesn’t want to go near a question as fundamental to capitalist wealth extraction as that. Can you imagine the collapse in executive bonuses that would follow a decision to define what the maximum profit made in any business transaction should be?

    Lobbyists across the political spectrum would be appalled — that extra profit pays for their meals. Looking to the extent to which the current application of the DMA seems to favor Apple’s biggest competitors, I can’t help but imagine it’s been paying for a few European meals already. Nice work, if you can get it. 

    You can follow me on social media! Join me on BlueSky,  LinkedIn, Mastodon, and MeWe.