• Into the Omniverse: World Foundation Models Advance Autonomous Vehicle Simulation and Safety

    Editor’s note: This blog is a part of Into the Omniverse, a series focused on how developers, 3D practitioners and enterprises can transform their workflows using the latest advances in OpenUSD and NVIDIA Omniverse.
    Simulated driving environments enable engineers to safely and efficiently train, test and validate autonomous vehicles (AVs) across countless real-world and edge-case scenarios without the risks and costs of physical testing.
    These simulated environments can be created through neural reconstruction of real-world data from AV fleets or generated with world foundation models (WFMs) — neural networks that understand physics and real-world properties. WFMs can be used to generate synthetic datasets for enhanced AV simulation.
    To help physical AI developers build such simulated environments, NVIDIA unveiled major advances in WFMs at the GTC Paris and CVPR conferences earlier this month. These new capabilities enhance NVIDIA Cosmos — a platform of generative WFMs, advanced tokenizers, guardrails and accelerated data processing tools.
    Key innovations like Cosmos Predict-2, the Cosmos Transfer-1 preview NVIDIA NIM microservice and Cosmos Reason are improving how AV developers generate synthetic data, build realistic simulated environments and validate safety systems at unprecedented scale.
    Universal Scene Description (OpenUSD), a unified data framework and standard for physical AI applications, enables seamless integration and interoperability of simulation assets across the development pipeline. OpenUSD standardization plays a critical role in ensuring 3D pipelines are built to scale.
    NVIDIA Omniverse, a platform of application programming interfaces, software development kits and services for building OpenUSD-based physical AI applications, enables simulations from WFMs and neural reconstruction at world scale.
    Leading AV organizations — including Foretellix, Mcity, Oxa, Parallel Domain, Plus AI and Uber — are among the first to adopt Cosmos models.

    Foundations for Scalable, Realistic Simulation
    Cosmos Predict-2, NVIDIA’s latest WFM, generates high-quality synthetic data by predicting future world states from multimodal inputs like text, images and video. This capability is critical for creating temporally consistent, realistic scenarios that accelerate training and validation of AVs and robots.

    In addition, Cosmos Transfer, a control model that adds variations in weather, lighting and terrain to existing scenarios, will soon be available to 150,000 developers on CARLA, a leading open-source AV simulator. This greatly expands the broad AV developer community’s access to advanced AI-powered simulation tools.
    Developers can start integrating synthetic data into their own pipelines using the NVIDIA Physical AI Dataset. The latest release includes 40,000 clips generated using Cosmos.
    Building on these foundations, the Omniverse Blueprint for AV simulation provides a standardized, API-driven workflow for constructing rich digital twins, replaying real-world sensor data and generating new ground-truth data for closed-loop testing.
    The blueprint taps into OpenUSD’s layer-stacking and composition arcs, which enable developers to collaborate asynchronously and modify scenes nondestructively. This helps create modular, reusable scenario variants to efficiently generate different weather conditions, traffic patterns and edge cases.
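As a rough illustration of why layer stacking enables nondestructive, reusable scenario variants, here is a conceptual sketch in plain Python — this is not the actual pxr/USD API, just the resolution idea it is built on: each layer holds sparse "opinions", and a property resolves to the opinion of the strongest layer that defines it.

```python
# Conceptual sketch of OpenUSD-style layer stacking (plain Python, NOT the
# pxr/USD API): layers hold sparse opinions; a property resolves to the
# opinion from the strongest layer that expresses one.

def resolve(layer_stack, prop):
    """Return the strongest opinion for prop; layers ordered strongest-first."""
    for layer in layer_stack:
        if prop in layer:
            return layer[prop]
    return None  # no layer expresses an opinion

base_scenario = {"weather": "clear", "traffic": "light", "time_of_day": "noon"}
rain_override = {"weather": "heavy_rain"}   # sparse, nondestructive edit
rush_hour     = {"traffic": "congested"}    # another reusable variant layer

# Stacking override layers over the shared base yields a new scenario variant
# without ever modifying base_scenario itself:
stack = [rush_hour, rain_override, base_scenario]
print(resolve(stack, "weather"))      # heavy_rain (from rain_override)
print(resolve(stack, "traffic"))      # congested  (from rush_hour)
print(resolve(stack, "time_of_day"))  # noon       (falls through to the base)
```

In actual OpenUSD the same idea is expressed with sublayers, references and variant sets, and the resolved value is computed by the composition engine rather than by application code — but the "strongest opinion wins, weaker layers stay untouched" behavior is the property the blueprint relies on.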
    Driving the Future of AV Safety
    To bolster the operational safety of AV systems, NVIDIA earlier this year introduced NVIDIA Halos — a comprehensive safety platform that integrates the company’s full automotive hardware and software stack with AI research focused on AV safety.
    The new Cosmos models — Cosmos Predict-2, Cosmos Transfer-1 NIM and Cosmos Reason — deliver further safety enhancements to the Halos platform, enabling developers to create diverse, controllable and realistic scenarios for training and validating AV systems.
    These models, trained on massive multimodal datasets including driving data, amplify the breadth and depth of simulation, allowing for robust scenario coverage — including rare and safety-critical events — while supporting post-training customization for specialized AV tasks.

    At CVPR, NVIDIA was recognized as an Autonomous Grand Challenge winner, highlighting its leadership in advancing end-to-end AV workflows. The challenge used OpenUSD’s robust metadata and interoperability to simulate sensor inputs and vehicle trajectories in semi-reactive environments, achieving state-of-the-art results in safety and compliance.
    Learn more about how developers are leveraging tools like CARLA, Cosmos and Omniverse to advance AV simulation in this livestream replay.

    Hear NVIDIA Director of Autonomous Vehicle Research Marco Pavone on the NVIDIA AI Podcast share how digital twins and high-fidelity simulation are improving vehicle testing, accelerating development and reducing real-world risks.
    Get Plugged Into the World of OpenUSD
    Learn more about what’s next for AV simulation with OpenUSD by watching the replay of NVIDIA founder and CEO Jensen Huang’s GTC Paris keynote.
    Looking for more live opportunities to learn more about OpenUSD? Don’t miss sessions and labs happening at SIGGRAPH 2025, August 10–14.
    Discover why developers and 3D practitioners are using OpenUSD and learn how to optimize 3D workflows with the self-paced “Learn OpenUSD” curriculum, available for free through the NVIDIA Deep Learning Institute.
    Explore the Alliance for OpenUSD forum and the AOUSD website.
    Stay up to date by subscribing to NVIDIA Omniverse news, joining the community and following NVIDIA Omniverse on Instagram, LinkedIn, Medium and X.
    BLOGS.NVIDIA.COM
  • Over 8M patient records leaked in healthcare data breach

    Published June 15, 2025, 10:00 a.m. EDT
    In the past decade, healthcare data has become one of the most sought-after targets in cybercrime. From insurers to clinics, every player in the ecosystem handles some form of sensitive information. However, breaches do not always originate from hospitals or health apps. Increasingly, patient data is managed by third-party vendors offering digital services such as scheduling, billing and marketing. One such breach at a digital marketing agency serving dental practices recently exposed approximately 2.7 million patient profiles and more than 8.8 million appointment records.
    Sign up for my FREE CyberGuy Report: Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide — free when you join.
    Massive healthcare data leak exposes millions: What you need to know
    Cybernews researchers have discovered a misconfigured MongoDB database exposing 2.7 million patient profiles and 8.8 million appointment records. The database was publicly accessible online, unprotected by passwords or authentication protocols. Anyone with basic knowledge of database scanning tools could have accessed it.
    The exposed data included names, birthdates, addresses, emails, phone numbers, gender, chart IDs, language preferences and billing classifications. Appointment records also contained metadata such as timestamps and institutional identifiers.
    Clues within the data structure point toward Gargle, a Utah-based company that builds websites and offers marketing tools for dental practices. While not a confirmed source, several internal references and system details suggest a strong connection. Gargle provides appointment scheduling, form submission and patient communication services. These functions require access to patient information, making the firm a likely link in the exposure.
    After the issue was reported, the database was secured. The duration of the exposure remains unknown, and there is no public evidence indicating whether the data was downloaded by malicious actors before being locked down. We reached out to Gargle for a comment but did not hear back before our deadline.
    How healthcare data breaches lead to identity theft and insurance fraud
    The exposed data presents a broad risk profile. On its own, a phone number or billing record might seem limited in scope. Combined, however, the dataset forms a complete profile that could be exploited for identity theft, insurance fraud and targeted phishing campaigns.
    Medical identity theft allows attackers to impersonate patients and access services under a false identity. Victims often remain unaware until significant damage is done, ranging from incorrect medical records to unpaid bills in their names. The leak also opens the door to insurance fraud, with actors using institutional references and chart data to submit false claims.
    This type of breach raises questions about compliance with the Health Insurance Portability and Accountability Act, which mandates strong security protections for entities handling patient data. Although Gargle is not a healthcare provider, its access to patient-facing infrastructure could place it under the scope of that regulation as a business associate.
    5 ways you can stay safe from healthcare data breaches
    If your information was part of the healthcare breach or any similar one, it’s worth taking a few steps to protect yourself.
    1. Consider identity theft protection services: Since the healthcare data breach exposed personal and financial information, it’s crucial to stay proactive against identity theft. Identity theft protection services offer continuous monitoring of your credit reports, Social Security number and even the dark web to detect if your information is being misused. These services send you real-time alerts about suspicious activity, such as new credit inquiries or attempts to open accounts in your name, helping you act quickly before serious damage occurs. Beyond monitoring, many identity theft protection companies provide dedicated recovery specialists who assist you in resolving fraud issues, disputing unauthorized charges and restoring your identity if it’s compromised. See my tips and best picks on how to protect yourself from identity theft.
    2. Use personal data removal services: The healthcare data breach leaked loads of information about you, and all of it could end up in the public domain, which essentially gives anyone an opportunity to scam you. One proactive step is to consider personal data removal services, which specialize in continuously monitoring and removing your information from various online databases and websites. While no service promises to remove all your data from the internet, a removal service is great if you want to constantly monitor and automate the process of removing your information from hundreds of sites over a longer period of time. Check out my top picks for data removal services here. Get a free scan to find out if your personal information is already out on the web.
    3. Have strong antivirus software: Hackers have people’s email addresses and full names, which makes it easy for them to send you a phishing link that installs malware and steals all your data. These messages are socially engineered to look legitimate, and spotting them is nearly impossible if you’re not careful. However, you’re not without defenses. The best way to safeguard yourself from malicious links that install malware, potentially accessing your private information, is to have strong antivirus software installed on all your devices. This protection can also alert you to phishing emails and ransomware scams, keeping your personal information and digital assets safe. Get my picks for the best 2025 antivirus protection winners for your Windows, Mac, Android and iOS devices.
    4. Enable two-factor authentication: While passwords weren’t part of the data breach, you should still enable two-factor authentication (2FA). It gives you an extra layer of security on all your important accounts, including email, banking and social media. 2FA requires you to provide a second piece of information, such as a code sent to your phone, in addition to your password when logging in. This makes it significantly harder for hackers to access your accounts, even if they have your password, and greatly reduces the risk of unauthorized access to your sensitive data.
    5. Be wary of mailbox communications: Bad actors may also try to scam you through snail mail. The data leak gives them access to your address. They may impersonate people or brands you know and use themes that require urgent attention, such as missed deliveries, account suspensions and security alerts.
    Kurt’s key takeaway
    If nothing else, this latest leak shows just how poorly patient data is being handled today. More and more, non-medical vendors are getting access to sensitive information without facing the same rules or oversight as hospitals and clinics. These third-party services are now a regular part of how patients book appointments, pay bills or fill out forms. But when something goes wrong, the fallout is just as serious. Even though the database was taken offline, the bigger problem hasn’t gone away. Your data is only as safe as the least careful company that gets access to it.
    Do you think healthcare companies are investing enough in their cybersecurity infrastructure? Let us know by writing us at Cyberguy.com/Contact.
    For more of my tech tips and security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter.
    Copyright 2025 CyberGuy.com. All rights reserved.
    Kurt “CyberGuy” Knutsson is an award-winning tech journalist who has a deep love of technology, gear and gadgets that make life better, with his contributions for Fox News and FOX Business beginning mornings on “FOX & Friends.” Got a tech question? Get Kurt’s free CyberGuy Newsletter, share your voice, a story idea or comment at CyberGuy.com.
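The one-time codes behind two-factor authentication (tip 4 above) are typically standard TOTP codes. As a minimal illustrative sketch of how they are derived — this follows RFC 6238 in its SHA-1 variant; real authenticator apps additionally handle secret provisioning and clock skew:

```python
# Minimal TOTP sketch (RFC 6238, SHA-1 variant): the short code your
# authenticator app shows is an HMAC of the current 30-second time window.
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP (RFC 4226): HMAC-SHA1 over the counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # low nibble picks a 4-byte window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, at_time=None, step: int = 30, digits: int = 6) -> str:
    """TOTP (RFC 6238): HOTP with the counter derived from wall-clock time."""
    now = time.time() if at_time is None else at_time
    return hotp(secret, int(now // step), digits)

# RFC 6238 test vector (shared secret "12345678901234567890", SHA-1 mode):
print(totp(b"12345678901234567890", at_time=59, digits=8))  # 94287082
```

Because the code is bound to a secret the attacker does not hold and to a narrow time window, a stolen password alone is not enough to log in, which is why enabling 2FA blunts the phishing and credential-abuse risks this breach creates.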
    WWW.FOXNEWS.COM
    Over 8M patient records leaked in healthcare data breach
    Published June 15, 2025 10:00am EDT

In the past decade, healthcare data has become one of the most sought-after targets in cybercrime. From insurers to clinics, every player in the ecosystem handles some form of sensitive information. However, breaches do not always originate from hospitals or health apps. Increasingly, patient data is managed by third-party vendors offering digital services such as scheduling, billing and marketing. One such breach at a digital marketing agency serving dental practices recently exposed approximately 2.7 million patient profiles and more than 8.8 million appointment records.

Massive healthcare data leak exposes millions: What you need to know

Cybernews researchers have discovered a misconfigured MongoDB database exposing 2.7 million patient profiles and 8.8 million appointment records. The database was publicly accessible online, unprotected by passwords or authentication protocols. Anyone with basic knowledge of database scanning tools could have accessed it.

The exposed data included names, birthdates, addresses, emails, phone numbers, gender, chart IDs, language preferences and billing classifications.
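Exposures like this usually come down to a server that was never told to require authentication or to stay off the public internet. A minimal `mongod.conf` sketch of the two settings involved (an illustration only — the report does not say which settings the breached instance was actually missing):

```yaml
# mongod.conf — hardening sketch (illustrative; not the breached system's config)
security:
  authorization: enabled   # clients must authenticate before reading data
net:
  bindIp: 127.0.0.1        # listen only on localhost, not on every interface
```

With `authorization` enabled, an unauthenticated client can still ping the server but can no longer list or read databases; `bindIp` keeps the port from being reachable by internet-wide scanners in the first place.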
Appointment records also contained metadata such as timestamps and institutional identifiers.

Clues within the data structure point toward Gargle, a Utah-based company that builds websites and offers marketing tools for dental practices. While not a confirmed source, several internal references and system details suggest a strong connection. Gargle provides appointment scheduling, form submission and patient communication services. These functions require access to patient information, making the firm a likely link in the exposure.

After the issue was reported, the database was secured. The duration of the exposure remains unknown, and there is no public evidence indicating whether the data was downloaded by malicious actors before being locked down. We reached out to Gargle for a comment but did not hear back before our deadline.

How healthcare data breaches lead to identity theft and insurance fraud

The exposed data presents a broad risk profile. On its own, a phone number or billing record might seem limited in scope. Combined, however, the dataset forms a complete profile that could be exploited for identity theft, insurance fraud and targeted phishing campaigns.

Medical identity theft allows attackers to impersonate patients and access services under a false identity. Victims often remain unaware until significant damage is done, ranging from incorrect medical records to unpaid bills in their names. The leak also opens the door to insurance fraud, with actors using institutional references and chart data to submit false claims.

This type of breach raises questions about compliance with the Health Insurance Portability and Accountability Act, which mandates strong security protections for entities handling patient data.
Although Gargle is not a healthcare provider, its access to patient-facing infrastructure could place it under the scope of that regulation as a business associate.

5 ways you can stay safe from healthcare data breaches

If your information was part of the healthcare breach or any similar one, it's worth taking a few steps to protect yourself.

1. Consider identity theft protection services: Since the healthcare data breach exposed personal and financial information, it's crucial to stay proactive against identity theft. Identity theft protection services offer continuous monitoring of your credit reports, Social Security number and even the dark web to detect if your information is being misused. These services send you real-time alerts about suspicious activity, such as new credit inquiries or attempts to open accounts in your name, helping you act quickly before serious damage occurs. Beyond monitoring, many identity theft protection companies provide dedicated recovery specialists who assist you in resolving fraud issues, disputing unauthorized charges and restoring your identity if it's compromised. See my tips and best picks on how to protect yourself from identity theft.

2. Use personal data removal services: The healthcare data breach leaks loads of information about you, and all of it could end up in the public domain, which essentially gives anyone an opportunity to scam you. One proactive step is to consider personal data removal services, which specialize in continuously monitoring and removing your information from various online databases and websites. While no service promises to remove all your data from the internet, a removal service is great if you want to monitor and automate the process of removing your information from hundreds of sites over a longer period of time. Check out my top picks for data removal services here.
3. Have strong antivirus software: Hackers have people's email addresses and full names, which makes it easy for them to send you a phishing link that installs malware and steals all your data. These messages are socially engineered to catch you off guard, and spotting them is nearly impossible if you're not careful. However, you're not without defenses. The best way to safeguard yourself from malicious links that install malware, potentially accessing your private information, is to have strong antivirus software installed on all your devices. This protection can also alert you to phishing emails and ransomware scams, keeping your personal information and digital assets safe. Get my picks for the best 2025 antivirus protection winners for your Windows, Mac, Android and iOS devices.

4. Enable two-factor authentication: While passwords weren't part of the data breach, you still need to enable two-factor authentication (2FA). It gives you an extra layer of security on all your important accounts, including email, banking and social media. 2FA requires you to provide a second piece of information, such as a code sent to your phone, in addition to your password when logging in. This makes it significantly harder for hackers to access your accounts, even if they have your password. Enabling 2FA can greatly reduce the risk of unauthorized access and protect your sensitive data.

5. Be wary of mailbox communications: Bad actors may also try to scam you through snail mail. The data leak gives them access to your address. They may impersonate people or brands you know and use themes that require urgent attention, such as missed deliveries, account suspensions and security alerts.

Kurt's key takeaway

If nothing else, this latest leak shows just how poorly patient data is being handled today.
More and more, non-medical vendors are getting access to sensitive information without facing the same rules or oversight as hospitals and clinics. These third-party services are now a regular part of how patients book appointments, pay bills or fill out forms. But when something goes wrong, the fallout is just as serious. Even though the database was taken offline, the bigger problem hasn't gone away. Your data is only as safe as the least careful company that gets access to it.

Do you think healthcare companies are investing enough in their cybersecurity infrastructure? Let us know by writing us at Cyberguy.com/Contact

Copyright 2025 CyberGuy.com. All rights reserved. Kurt "CyberGuy" Knutsson is an award-winning tech journalist who has a deep love of technology, gear and gadgets that make life better with his contributions for Fox News & FOX Business beginning mornings on "FOX & Friends."
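The codes described in tip 4 are typically generated with the time-based one-time password (TOTP) algorithm from RFC 6238: your phone and the service share a secret, and each derives a short code from that secret plus the current 30-second time window. A minimal standard-library sketch (the secret below is the RFC's published test key, not anything tied to this story):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password."""
    now = time.time() if for_time is None else for_time
    return hotp(secret, int(now // step), digits)

# RFC 6238 Appendix B test vector: at t=59s the 8-digit SHA-1 code is 94287082.
print(totp(b"12345678901234567890", for_time=59, digits=8))  # -> 94287082
```

An authenticator app simply displays `totp(secret)` for the current moment; the server computes the same value (usually also accepting one adjacent window to allow for clock drift) and compares, which is why a stolen password alone is not enough to log in.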
  • Python Creator Guido van Rossum Asks: Is 'Worse is Better' Still True for Programming Languages?

    In 1989 a computer scientist argued that more functionality in software actually lowers usability and practicality — leading to the counterintuitive proposition that "worse is better". But is that still true?

    Python's original creator Guido van Rossum addressed the question last month in a lightning talk at the annual Python Language Summit 2025.

    Guido started by recounting earlier periods of Python development from 35 years ago, where he used UNIX "almost exclusively" and thus "Python was greatly influenced by UNIX's 'worse is better' philosophy"... "The fact that [Python] wasn't perfect encouraged many people to start contributing. All of the code was straightforward, there were no thoughts of optimization... These early contributors also now had a stake in the language; [Python] was also their baby"...

    Guido contrasted early development to how Python is developed now: "features that take years to produce from teams of software developers paid by big tech companies. The static type system requires an academic-level understanding of esoteric type system features." And this isn't just Python the language, "third-party projects like numpy are maintained by folks who are paid full-time to do so.... Now we have a huge community, but very few people, relatively speaking, are contributing meaningfully."
    Guido asked whether the expectation for Python contributors going forward would be that "you had to write a perfect PEP or create a perfect prototype that can be turned into production-ready code?" Guido pined for the "old days" where feature development could skip performance or feature-completion to get something into the hands of the community to "start kicking the tires". "Do we have to abandon 'worse is better' as a philosophy and try to make everything as perfect as possible?" Guido thought doing so "would be a shame", but that he "wasn't sure how to change it", acknowledging that core developers wouldn't want to create features and then break users with future releases.
    Guido referenced David Hewitt's PyO3 talk about Rust and Python, and that development "was using worse is better," where there is a core feature set that works, and plenty of work to be done and open questions. "That sounds a lot more fun than working on core CPython", Guido paused, "...not that I'd ever personally learn Rust. Maybe I should give it a try after," which garnered laughter from core developers.

    "Maybe we should do more of that: allowing contributors in the community to have a stake and care".

    Read more of this story at Slashdot.
  • Do you think Sony will make support for their rumored new handheld mandatory for developers?

    Red Kong XIX
    Member

    Oct 11, 2020

    13,560

    This is assuming that the handheld can play PS4 games natively without any issues, so they are not included in the poll.
    Hardware leaker Kepler said it should be able to run PS5 games, even without a patch, but with a performance impact potentially. 

    Hero_of_the_Day
    Avenger

    Oct 27, 2017

    19,958

    Isn't the rumor that games don't require patches to run on it? That would imply that support isn't mandatory, but automatic.
     

    Homura
    ▲ Legend ▲
    Member

    Aug 20, 2019

    7,232

    As the post above said, the rumor is the PS5 portable will be able to run natively any and all PS4/PS5 games.

    Of course, some games might not work properly or require specific patches, but the idea is automatic compatibility. 

    shadowman16
    Member

    Oct 25, 2017

    42,292

    Ideally you'd want stuff to pretty much work out of the box. The more you ask devs to do, the less I imagine will want to support it... Or suddenly games get pared down so that they can run on handhelds.

    I personally would just prefer a solution where its automatic. I dont really care about a Sony handheld, dont really want devs to be forced to support the thing 

    Modest_Modsoul
    Living the Dreams
    Member

    Oct 29, 2017

    28,418


     

    setmymindforopensky
    Member

    Apr 20, 2025

    67

    a lot of games have performance modes. it should run a lot of the library even without any patching. if there's multiplat im sure itll default to the PS4 ver. im not sure what theyd do for something like GTA6 but itll have a series S version so its clearly scalable enough.

    im guessing PSTV situation. support it or not we dont care. 

    reksveks
    Member

    May 17, 2022

    7,628

    Think Kepler is personally assuming that running without patches is the goal, and one that won't happen just cause it's too late to force it.

    It's going to be an interesting solution to an interesting problem 

    Servbot24
    The Fallen

    Oct 25, 2017

    47,826

    Obviously not. Pretty absurd question tbh.
     

    RivalGT
    Member

    Dec 13, 2017

    7,616

    This one sounds like it requires a lot of work on Sony's end, I dont think developers will need to do much for games to work.

    Granted moving forward Sony is likely to make it easier for devs to have more input on this portable mode.

    Things working out of the box is likely the goal, and thats what Sony needs if they want this to work, but devs having more input on this mode would be a plus I think. 

    Callibretto
    Member

    Oct 25, 2017

    10,445

    Indonesia

    shadowman16 said:

    Ideally you'd want stuff to pretty much work out of the box. The more you ask devs to do, the less I imagine will want to support it... Or suddenly games get pared down so that they can run on handhelds.

    I personally would just prefer a solution where its automatic. I dont really care about a Sony handheld, dont really want devs to be forced to support the thing

    depend on the game imo, asking CD Project to somehow make Witcher 4 playable on handheld might be unreasonable. but any game that can run on Switch 2 should be playable on PSPortable without much issue
     

    Pheonix1
    Member

    Jun 22, 2024

    716

    Absolutely they will. Not sure why people think it would be hard; if they hand them the right tools most ports won't take long anyhow.
     

    skeezx
    Member

    Oct 27, 2017

    23,994

    guessing there will be a "portable approved" label with the respective games going forward, regardless whether it's a PS5 or PS6 game. and when the thing is released popular past titles will be retroactively approved by sony, and up to developers if they want to patch the bigger games to be portable friendly.

    i guess where things could get tricky/laborious for developers is whether every game going forward is required to screen for portable performance, as it's not a PC so the portable will likely disallow for running "non-approved" games at all 

    AmFreak
    Member

    Oct 26, 2017

    3,245

    They need to give people some form of guarantee that it will get games, otherwise they greatly diminish their potential success.

    The best way to do this is to make it another SKU of the contemporary console. And with everything already running at 60fps and progression slowing to a crawl it's far easier than it had been in the past. 

    Ruck
    Member

    Oct 25, 2017

    3,105

    I mean, what is the handheld? PS6? Or an actual second console? If the former, then yes, if the latter then no
     

    TitanicFall
    Member

    Nov 12, 2017

    9,340

    Nah. It might be incentivized though. There's not much in it for devs if it's a cross buy situation.
     

    Callibretto
    Member

    Oct 25, 2017

    10,445

    Indonesia

    imo, PS6 will remain their main console, focusing on high fidelity visuals that Switch 2 and portable PC won't be able to run without huge compromise.

    PSPortable will be secondary console, something like PSPortal, but this time able to play any games that Switch2 can reasonably run. and for the high end games that it can't run, it will use streaming, either from PS6 you own, or PS+ Premium subs 

    bleits
    Member

    Oct 14, 2023

    373

    They have to if they want to be taken seriously
     

    Vic Damone Jr.
    Member

    Oct 27, 2017

    20,534

    Nope Sony doesn't mandate this stuff and it's why their second product always dies.
     

    fiendcode
    Member

    Oct 26, 2017

    26,514

    I think it depends on what the device really is, if it's more of a "Portal 2" or a "Series SP" or something else entirely. Streaming might be enough for PS6 games along with incentivized PS5/4 patches but whatever SIE does they need to make sure their inhouse teams are ALL on board this time. That was a big part of PSP/Vita's downfall, that the biggest or most important PS Studios snubbed them and the teams that did show up with support are mostly closed and gone now.
     

    Callibretto
    Member

    Oct 25, 2017

    10,445

    Indonesia

    bleits said:

    They have to if they want to be taken seriously


    from the last interview with PS exec about Switch 2 spec, it seems clear that PS have no plan to abandon high end console spec to switch to mobile hardware like Switch 2 and Xbox Ally.

    PS consider their high fidelity visual as advantage and differentiator from Nintendo.

    so with PS6, their top studios will eventually make games that just won't realistically run on handheld devices.

    so having a mandate where all PS6 games is playable on handheld is simply unrealistic imo 

    danm999
    Member

    Oct 29, 2017

    19,929

    Sydney

    Incentives, not mandates.
     

    NSESN
    ▲ Legend ▲
    Member

    Oct 25, 2017

    27,729

    I think people are setting themselves for disappointment in regards for how powerful this thing will be
     

    defaltoption
    Plug in a controller and enter the Konami code
    The Fallen

    Oct 27, 2017

    12,485

    Austin

    Depends on what they call it.

    If they call it anything related to ps6, expect very bad performance, and mandates

    If they call it ps5 portable, expect bad performance and no mandates as it will be handled on their end

    If they call it a ps portable expect it to have no support from Sony and get whatever it gets just be happy it functions till they abandon it. 

    Metnut
    Member

    Apr 7, 2025

    30

    Good question OP.

    I voted the middle one. I think anything that ships for PS5 will need to work for the handheld. Question is whether that works automatically or will need patches. 

    mute
    ▲ Legend ▲
    Member

    Oct 25, 2017

    29,807

    I think that would require a level of commitment to a secondary piece of hardware that Sony hasn't shown in a long time.
     

    Patison
    Member

    Oct 27, 2017

    761

    It's difficult to say without knowing what they're planning with this device exactly. If they're fully going the Switch route, or more like a Steam Deck, which will run launch games perfectly and then, as time goes on, some titles might start looking less than ideal or be unplayable at all.

    Or Series S/X, just with the Series S being portable — that would be preferable but also limiting; then again, with diminishing returns between generations it might be worth it etc.

    And if that device happens at all and its development won't be dropped soon is another question. Lots of unknowns, but I'm interested to see what Sony comes up with, as long as they'll have games to support it this time around. 

    Jammerz
    Member

    Apr 29, 2023

    1,579

    I think it will be optional support.

    However sony needs to support it with their first parties to set an example and making it as easy as possible for other devs to scale down. For sony first party games maybe use nixxes to scale down so their studios aren't bogged down. 

    Hamchan
    The Fallen

    Oct 25, 2017

    6,000

    I think 99.9% of games will be crossgen between PS5 and PS6 for the entire generation, just based on how this industry is going, so it might not be much of an issue for Sony to mandate.
     

    Advance.Wars.Sgt.
    Member

    Jun 10, 2018

    10,456

    Honestly, I'd worry more about Sony's 1st party teams than 3rd party developers, since they were notoriously averse to making software with a handheld power profile in mind.
     

    overthewaves
    Member

    Sep 30, 2020

    1,203

    Wouldn't that hamstring the games for ps6? That's PlayStation players biggest fear they don't want a series S type situation right? They treat series S like a punching bag.
     

    Neonvisions
    Member

    Oct 27, 2017

    707

    overthewaves said:

    Wouldn't that hamstring the games for ps6? That's PlayStation players biggest fear they don't want a series S type situation right? They treat series S like a punching bag.


    How would that affect PS6? Are you suggesting that the Series S hamstrings games for the X? 

    Gwarm
    Member

    Nov 13, 2017

    2,902

    I'd be shocked if Sony released a device that lets you play games that haven't been patched or confirmed to run acceptably. Imagine if certain games just hard crashed the console? This is the company that wouldn't let you play certain Vita games on the PSTV even if they actually worked.
     

    bloopland33
    Member

    Mar 4, 2020

    3,845

    I wonder if they'll just do the Steam Deck thing and do a compatibility badge. You can boot whatever software you want, but it might run at 5 fps and drain your battery.

    This would be in addition to whatever efforts they're doing to make things work out of the box, of course.

    But it's hard to imagine them mandating developers ship a PS6 profile and a PS6P profile for those heavier games 5-7 years from now…

    ….but it's also hard to imagine them shipping this PS6-gen device that doesn't play everything. So maybe they Steam Deck it 

    vivftp
    Member

    Oct 29, 2017

    23,016

    My guess: every PS6 game will be mandated to support it. Simpler PS5 games will support it natively, and the rest will require a patch, as has been rumored, to run on lesser specs.

    I think next gen we get PS3 and Vita emulation so the PS6 and portable will be able to play games from PSN from every past PlayStation 

    Mocha Joe
    Member

    Jun 2, 2021

    13,636

    Really need to take the Steam Deck approach and not make it a requirement. Just make it a complementary device where it's possible to play the majority of the games available on PSN.
     

    overthewaves
    Member

    Sep 30, 2020

    1,203

    Neonvisions said:

    How would that affect the PS6? Are you suggesting that the Series S hamstrings games for the X?


    I mean did you see the reaction here to the series S announcement lol. Everyone was saying it's gonna "hold back the generation".
     

    reksveks
    Member

    May 17, 2022

    7,628

    Neonvisions said:

    How would that affect the PS6? Are you suggesting that the Series S hamstrings games for the X?


    Or the perception is that it does, but the truth is that there are a lot of factors.
     

    Fabs
    Member

    Aug 22, 2019

    2,827

    I can't see them forcing handheld and Pro support next gen.
     

    level
    Member

    May 25, 2023

    1,427

    Definitely not

    Games already take too long to make. Extra time isn't something they'll want to reinforce to their developers. 

    gofreak
    Member

    Oct 26, 2017

    8,411

    I don't think support will be mandatory. I think they're bringing it into a reality where a growing portion of games can, or could, run without much change or effort on the developer's part on a next gen handheld. They'll lean on that natural trend rather than a policy - anything that is outside of that will just be streamable as now with the Portal.
     

    Caiusto
    Member

    Oct 25, 2017

    7,086

    If they don't want to end up with another Vita, yes they will.
     

    mute
    ▲ Legend ▲
    Member

    Oct 25, 2017

    29,807

    Advance.Wars.Sgt. said:

    Honestly, I'd worry more about Sony's 1st party teams than 3rd party developers, since they were notoriously averse to making software with a handheld power profile in mind.


    It does seem kinda unthinkable that Intergalactic would be made with a handheld in mind, for example.
     

    AmFreak
    Member

    Oct 26, 2017

    3,245

    mute said:

    It does seem kinda unthinkable that Intergalactic would be made with a handheld in mind, for example.


    Ratchet, Returnal, Cyberpunk, etc. also weren't made "with a handheld in mind".
     

    Spoit
    Member

    Oct 28, 2017

    5,599

    Given how much of a pain the Series S mandate has been, I don't see them binding even first-party studios to it, especially ones that are trying to go for the cutting edge of tech. Especially since, given AMD's timelines, it's not going to be anywhere near a base PS5.

    I'm also skeptical of the claim that it'll be able to play PS5 games without extensive patching. 

    Jawmuncher
    Crisis Dino
    Moderator

    Oct 25, 2017

    45,166

    Ibis Island

    No, I think the portable will handle portable stuff "automatically" for what it converts
     

    knightmawk
    Member

    Dec 12, 2018

    8,900

    I expect they'll do everything they can to make sure no one has to think about it and it's as automatic as possible. It'll technically still be part of cert, but the goal will be for it to be rare that a game fails that part of cert and has to be sent back.

    That being said, I imagine there will be some games that still don't work and developers will be able to submit for that exception. 

    RivalGT
    Member

    Dec 13, 2017

    7,616

    I think the concept here is similar to how PS4 games play on PS5 (the ones with patches, I mean): the game runs with a different graphics preset than it would on PS4/PS4 Pro, so in some cases that means a higher resolution or a higher frame rate cap.

    What Sony needs to work on at their end is getting this to work without any patches from developers. It's the only way this can work. 

    Vexii
    Member

    Oct 31, 2017

    3,103

    UK

    if they don't mandate support, it'll just be a death knell for the format. I don't think they could get away with a dedicated handheld platform now that the Switch and Steam Deck exist
     

    Mobius and Pet Octopus
    Member

    Oct 25, 2017

    17,065

    Just because a game can run on a handheld doesn't mean that's all that's required for support. The UI alone likely requires changes for an optimal experience, sometimes necessary just to be "playable". Small screen sizes usually need changes.
     

    SeanMN
    Member

    Oct 28, 2017

    2,437

    If PS6 game support is optional, that will create fragmentation of the platform and uncertain software support.

    If it's part of the PS6 family and support is mandatory, I can see there being concern that it would hold the generation back with a low-capability SKU.

    My thoughts are that this should be a PS6 and supported the same as the primary console. 
    #you #think #sony #will #make
    Do you think Sony will make support for their rumored new handheld mandatory for developers?
    Red Kong XIX Member Oct 11, 2020 13,560 This is assuming that the handheld can play PS4 games natively without any issues, so they are not included in the poll. Hardware leaker Kepler said it should be able to run PS5 games, even without a patch, but with a performance impact potentially.  Hero_of_the_Day Avenger Oct 27, 2017 19,958 Isn't the rumor that games don't require patches to run on it? That would imply that support isn't mandatory, but automatic.   Homura ▲ Legend ▲ Member Aug 20, 2019 7,232 As the post above said, the rumor is the PS5 portable will be able to run natively any and all PS4/PS5 games. Of course, some games might not work properly or require specific patches, but the idea is automatic compatibility.  shadowman16 Member Oct 25, 2017 42,292 Ideally you'd want stuff to pretty much work out of the box. The more you ask devs to do, the less I imagine will want to support it... Or suddenly games get parred down so that they can run on handhelds. I personally would just prefer a solution where its automatic. I dont really care about a Sony handheld, dont really want devs to be forced to support the thing  Modest_Modsoul Living the Dreams Member Oct 29, 2017 28,418 🤷‍♂️   setmymindforopensky Member Apr 20, 2025 67 a lot of games have performance modes. it should run a lot of the library even without any patching. if there's multiplat im sure itll default to the PS4 ver. im not sure what theyd do for something like GTA6 but itll have a series S version so its clearly scalable enough. im guessing PSTV situation. support it or not we dont care.  reksveks Member May 17, 2022 7,628 Think Kepler is personally assuming the goal of running without patches is a goal and one that won't happen just cause it's too late to force it. It's going to be an interesting solution to an interesting problem  Servbot24 The Fallen Oct 25, 2017 47,826 Obviously not. Pretty absurd question tbh.   
RivalGT Member Dec 13, 2017 7,616 This one sounds like it requires a lot of work on Sony's end, I dont think developers will need to do much for games to work. Granted moving forward Sony is likely to make it easier for devs to have a more input on this portable mode. Things working out of the box is likely the goal, and thats what Sony needs if they want this to work, but devs having more input on this mode would be a plus I think.  Callibretto Member Oct 25, 2017 10,445 Indonesia shadowman16 said: Ideally you'd want stuff to pretty much work out of the box. The more you ask devs to do, the less I imagine will want to support it... Or suddenly games get parred down so that they can run on handhelds. I personally would just prefer a solution where its automatic. I dont really care about a Sony handheld, dont really want devs to be forced to support the thingClick to expand... Click to shrink... depend on the game imo, asking CD Project to somehow make Witcher 4 playable on handheld might be unreasonable. but any game that can run on Switch 2 should be playable on PSPortable without much issue   Pheonix1 Member Jun 22, 2024 716 Absolutely they will. Not sure why people think it would be hard, if they hand them.the right tools most ports won't take long anyhow.   skeezx Member Oct 27, 2017 23,994 guessing there will be a "portable approved" label with the respective games going forward, regardless whether it's a PS5 or PS6 game. and when the thing is released popular past titles will be retroactively approved by sony, and up to developers if they want to patch the bigger games to be portable friendly. 
i guess where things could get tricky/laborious for developers is whether every game going forward is required to screen for portable performance, as it's not a PC so the portable will likely disallow for running "non-approved" games at all  AmFreak Member Oct 26, 2017 3,245 They need to give people some form of guarantee that it will get games, otherwise they greatly diminish their potential success. The best way to do this is to make it another SKU of the contemporary console. And witheverything already running at 60fps and progression slowing to a crawl it's far easier than it had been in the past.  Ruck Member Oct 25, 2017 3,105 I mean, what is the handheld? PS6? Or an actual second console? If the former, then yes, if the latter then no   TitanicFall Member Nov 12, 2017 9,340 Nah. It might be incentivized though. There's not much in it for devs if it's a cross buy situation.   Callibretto Member Oct 25, 2017 10,445 Indonesia imo, PS6 will remain their main console, focusing on high fidelity visuals that Switch 2 and portable PC won't be able to run without huge compromise. PSPortable will be secondary console, something like PSPortal, but this time able to play any games that Switch2 can reasonably run. and for the high end games that it can't run, it will use streaming, either from PS6 you own, or PS+ Premium subs  bleits Member Oct 14, 2023 373 They have to if they want to be taken seriously   Vic Damone Jr. Member Oct 27, 2017 20,534 Nope Sony doesn't mandate this stuff and it's why their second product always dies.   fiendcode Member Oct 26, 2017 26,514 I think it depends on what the device really is, if it's more of a "Portal 2" or a "Series SP" or something else entirely. Streaming might be enough for PS6 games along with incentivized PS5/4 patches but whatever SIE does they need to make sure their inhouse teams are ALL on board this time. 
That was a big part of PSP/Vita's downfall, that the biggest or most important PS Studios snubbed them and the teams that did show up with support are mostly closed and gone now.   Callibretto Member Oct 25, 2017 10,445 Indonesia bleits said: They have to if they want to be taken seriously Click to expand... Click to shrink... from the last interview with PS exec about Switch 2 spec, it seems clear that PS have no plan to abandon high end console spec to switch to mobile hardware like Switch 2 and Xbox Ally. PS consider their high fidelity visual as advantage and differentiator from Nintendo. so with PS6, their top studio will eventuall make games that just won't realistically run on handheld devices. so having a mandate where all PS6 games is playable on handheld is simply unrealistic imo  danm999 Member Oct 29, 2017 19,929 Sydney Incentives, not mandates.   NSESN ▲ Legend ▲ Member Oct 25, 2017 27,729 I think people are setting themselves for disappointment in regards for how powerful this thing will be   defaltoption Plug in a controller and enter the Konami code The Fallen Oct 27, 2017 12,485 Austin Depends on what they call it. If they call it anything related to ps6, expect very bad performance, and mandates If they call it ps5 portable, expect bad performance and no mandates as it will be handled on their end If they call it a ps portable expect it to have no support from Sony and get whatever it gets just be happy it functions till they abandon it.  Metnut Member Apr 7, 2025 30 Good question OP. I voted the middle one. I think anything that ships for PS5 will need to work for the handheld. Question is whether that works automatically or will need patches.  mute ▲ Legend ▲ Member Oct 25, 2017 29,807 I think that would require a level of commitment to a secondary piece of hardware that Sony hasn't shown in a long time.   Patison Member Oct 27, 2017 761 It's difficult to say without knowing what they're planning with this device exactly. 
If they're fully going Switch routeor more like a Steam Deck, which will run launch games perfectly and then, as time goes on, some titles might start looking less than ideal or be unplayable at all. Or Series S/X, just the Series S being portable — that would be preferable but also limiting but also diminishing returns between generations so might be worth it etc. And if that device happens at all and its development won't be dropped soon is another question. Lots of unknowns, but I'm interested to see what Sony comes up with, as long as they'll have games to support it this time around.  Jammerz Member Apr 29, 2023 1,579 I think it will be optional support. However sony needs to support it with their first parties to set an example and making it as easy as possible for other devs to scale down. For sony first party games maybe use nixxes to scale down so their studios aren't bogged down.  Hamchan The Fallen Oct 25, 2017 6,000 I think 99.9% of games will be crossgen between PS5 and PS6 for the entire generation, just based on how this industry is going, so it might not be much of an issue for Sony to mandate.   Advance.Wars.Sgt. Member Jun 10, 2018 10,456 Honestly, I'd worry more about Sony's 1st party teams than 3rd party developers since they were notoriously adverse making software with a handheld power profile in mind.   overthewaves Member Sep 30, 2020 1,203 Wouldn't that hamstring the games for ps6? That's PlayStation players biggest fear they don't want a series S type situation right? They treat series S like a punching bag.   Neonvisions Member Oct 27, 2017 707 overthewaves said: Wouldn't that hamstring the games for ps6? That's PlayStation players biggest fear they don't want a series S type situation right? They treat series S like a punching bag. Click to expand... Click to shrink... How would that effect PS6? Are you suggesting that the Series S hamstrings games for the X?  
Gwarm Member Nov 13, 2017 2,902 I'd be shocked if Sony released a device that let's you play games that haven't been patched or confirmed to run acceptably. Imagine if certain games just hard crashed the console? This is the company that wouldn't let you play certain Vita games on the PSTV even if they actually worked.   bloopland33 Member Mar 4, 2020 3,845 I wonder if they'll just do the Steam Deck thing and do a compatibility badge. You can boot whatever software you want, but it might run at 5 fps and drain your battery. This would be in addition to whatever efforts they're doing to make things work out of the box, of course. But it's hard to imagine them mandating developers ship a PS6 profile and a PS6P profile for those heavier games 5-7 years from now… ….but it's also hard to imagine them shipping this PS6-gen device that doesn't play everything. So maybe they Steam Deck it  vivftp Member Oct 29, 2017 23,016 My guess, every PS6 game will be mandated to support it. PS5 games will support it natively for the simpler games and will require a patch as has been rumored to run on lesser specs I think next gen we get PS3 and Vita emulation so the PS6 and portable will be able to play games from PSN from every past PlayStation  Mocha Joe Member Jun 2, 2021 13,636 Really need to take the Steam Deck approach and don't make it a requirement. Just make it a complementary device where it is possible to play majority of the games available on PSN.   overthewaves Member Sep 30, 2020 1,203 Neonvisions said: How would that effect PS6? Are you suggesting that the Series S hamstrings games for the X? Click to expand... Click to shrink... I mean did you see the reaction here to the series S announcement lol. Everyone was saying it's gonna "hold back the generation".   reksveks Member May 17, 2022 7,628 Neonvisions said: How would that effect PS6? Are you suggesting that the Series S hamstrings games for the X? Click to expand... Click to shrink... 
Or the perception is that it does but the truth is that there is a lot of factors   Fabs Member Aug 22, 2019 2,827 I can't see the forcing handheld and pro support next gen.   level Member May 25, 2023 1,427 Definitely not Games already take too long to make. Extra time isn't something they'll want to reinforce to their developers.  gofreak Member Oct 26, 2017 8,411 I don't think support will be mandatory. I think they're bringing it into a reality where a growing portion of games can, or could, run without much change or effort on the developer's part on a next gen handheld. They'll lean on that natural trend rather than a policy - anything that is outside of that will just be streamable as now with the Portal.   Caiusto Member Oct 25, 2017 7,086 If they don't want to end up with another Vita yes they will.   mute ▲ Legend ▲ Member Oct 25, 2017 29,807 Advance.Wars.Sgt. said: Honestly, I'd worry more about Sony's 1st party teams than 3rd party developers since they were notoriously adverse making software with a handheld power profile in mind. Click to expand... Click to shrink... It does seem kinda unthinkable that Intergalactic would be made with a handheld in mind, for example.   AmFreak Member Oct 26, 2017 3,245 mute said: It does seem kinda unthinkable that Intergalactic would be made with a handheld in mind, for example. Click to expand... Click to shrink... Ratchet, Returnal, Cyberpunk, etc. also weren't made "with a handheld in mind".   Spoit Member Oct 28, 2017 5,599 Given how much of a pain the series S mandate has been, I don't see them binding even first party studios to it, especially ones that are trying to go for the cutting edge of tech. Since given AMDs timelines, is not going to be anywhere near a base PS5. I'm also skeptical of the claim that'll be able to play ps5 games without extensive patching.  
Jawmuncher Crisis Dino Moderator Oct 25, 2017 45,166 Ibis Island No, I think the portable will handle portable stuff "automatically" for what it converts   knightmawk Member Dec 12, 2018 8,900 I expect they'll do everything they can to make sure no one has to think about it and it's as automatic as possible. It'll technically still be part of cert, but the goal will be for it to be rare that a game fails that part of cert and has to be sent back. That being said, I imagine there will be some games that still don't work and developers will be able to submit for that exception.  RivalGT Member Dec 13, 2017 7,616 I think the concept here is similar to how PS4 games play on PS5, the ones with patches I mean, the game will run with a different graphics preset then it would on PS4/ PS4 Pro, so in some cases this means higher resolution or higher frame rate cap. What Sony needs to work on their end is getting this to work without any patches from developers. Its the only way this can work.  Vexii Member Oct 31, 2017 3,103 UK if they don't mandate support, it'll just be a death knell for the format. I don't think they could get away with a dedicated handheld platform now when the Switch and Steam Deck exists   Mobius and Pet Octopus Member Oct 25, 2017 17,065 Just because a game can run on a handheld, doesn't mean that's all required for support. The UI alone likely requires changes for an optimal experience, sometimes necessary to be "playable". Small screen sizes usually needs changes.   SeanMN Member Oct 28, 2017 2,437 If PS6 games support is optional, that will create fragmentation of the platform and uncertain software support. If it's part of the PS6 family and support is mandatory, I can see there being concern that if would hold the generation back with a low capability sku. My thoughts are this should be a PS6 and support the same as the primary console.  #you #think #sony #will #make
    WWW.RESETERA.COM
    Do you think Sony will make support for their rumored new handheld mandatory for developers?
    Red Kong XIX Member Oct 11, 2020 13,560 This is assuming that the handheld can play PS4 games natively without any issues, so they are not included in the poll. Hardware leaker Kepler said it should be able to run PS5 games, even without a patch, but with a performance impact potentially.  Hero_of_the_Day Avenger Oct 27, 2017 19,958 Isn't the rumor that games don't require patches to run on it? That would imply that support isn't mandatory, but automatic.   Homura ▲ Legend ▲ Member Aug 20, 2019 7,232 As the post above said, the rumor is the PS5 portable will be able to run natively any and all PS4/PS5 games. Of course, some games might not work properly or require specific patches, but the idea is automatic compatibility.  shadowman16 Member Oct 25, 2017 42,292 Ideally you'd want stuff to pretty much work out of the box. The more you ask devs to do, the less I imagine will want to support it... Or suddenly games get parred down so that they can run on handhelds (which considering how people hated cross gen for that reason, they'd hate it here as well). I personally would just prefer a solution where its automatic. I dont really care about a Sony handheld, dont really want devs to be forced to support the thing (considering how shit Sony is at supporting its peripherals - like the Vita or PSVR2)  Modest_Modsoul Living the Dreams Member Oct 29, 2017 28,418 🤷‍♂️   setmymindforopensky Member Apr 20, 2025 67 a lot of games have performance modes. it should run a lot of the library even without any patching. if there's multiplat im sure itll default to the PS4 ver. im not sure what theyd do for something like GTA6 but itll have a series S version so its clearly scalable enough. im guessing PSTV situation. support it or not we dont care.  reksveks Member May 17, 2022 7,628 Think Kepler is personally assuming the goal of running without patches is a goal and one that won't happen just cause it's too late to force it. 
It's going to be an interesting solution to an interesting problem  Servbot24 The Fallen Oct 25, 2017 47,826 Obviously not. Pretty absurd question tbh.   RivalGT Member Dec 13, 2017 7,616 This one sounds like it requires a lot of work on Sony's end, I dont think developers will need to do much for games to work. Granted moving forward Sony is likely to make it easier for devs to have a more input on this portable mode. Things working out of the box is likely the goal, and thats what Sony needs if they want this to work, but devs having more input on this mode would be a plus I think.  Callibretto Member Oct 25, 2017 10,445 Indonesia shadowman16 said: Ideally you'd want stuff to pretty much work out of the box. The more you ask devs to do, the less I imagine will want to support it... Or suddenly games get parred down so that they can run on handhelds (which considering how people hated cross gen for that reason, they'd hate it here as well). I personally would just prefer a solution where its automatic. I dont really care about a Sony handheld, dont really want devs to be forced to support the thing (considering how shit Sony is at supporting its peripherals - like the Vita or PSVR2) Click to expand... Click to shrink... depend on the game imo, asking CD Project to somehow make Witcher 4 playable on handheld might be unreasonable. but any game that can run on Switch 2 should be playable on PSPortable without much issue   Pheonix1 Member Jun 22, 2024 716 Absolutely they will. Not sure why people think it would be hard, if they hand them.the right tools most ports won't take long anyhow.   skeezx Member Oct 27, 2017 23,994 guessing there will be a "portable approved" label with the respective games going forward, regardless whether it's a PS5 or PS6 game. and when the thing is released popular past titles will be retroactively approved by sony, and up to developers if they want to patch the bigger games to be portable friendly. 
i guess where things could get tricky/laborious for developers is whether every game going forward is required to screen for portable performance, as it's not a PC so the portable will likely disallow for running "non-approved" games at all  AmFreak Member Oct 26, 2017 3,245 They need to give people some form of guarantee that it will get games, otherwise they greatly diminish their potential success. The best way to do this is to make it another SKU of the contemporary console. And with (close to) everything already running at 60fps and progression slowing to a crawl it's far easier than it had been in the past.  Ruck Member Oct 25, 2017 3,105 I mean, what is the handheld? PS6? Or an actual second console? If the former, then yes, if the latter then no   TitanicFall Member Nov 12, 2017 9,340 Nah. It might be incentivized though. There's not much in it for devs if it's a cross buy situation.   Callibretto Member Oct 25, 2017 10,445 Indonesia imo, PS6 will remain their main console, focusing on high fidelity visuals that Switch 2 and portable PC won't be able to run without huge compromise. PSPortable will be secondary console, something like PSPortal, but this time able to play any games that Switch2 can reasonably run. and for the high end games that it can't run, it will use streaming, either from PS6 you own, or PS+ Premium subs  bleits Member Oct 14, 2023 373 They have to if they want to be taken seriously   Vic Damone Jr. Member Oct 27, 2017 20,534 Nope Sony doesn't mandate this stuff and it's why their second product always dies.   fiendcode Member Oct 26, 2017 26,514 I think it depends on what the device really is, if it's more of a "Portal 2" or a "Series SP" or something else entirely (PSP3?). Streaming might be enough for PS6 games along with incentivized PS5/4 patches but whatever SIE does they need to make sure their inhouse teams are ALL on board this time. 
That was a big part of PSP/Vita's downfall, that the biggest or most important PS Studios snubbed them and the teams that did show up with support are mostly closed and gone now.   Callibretto Member Oct 25, 2017 10,445 Indonesia bleits said: They have to if they want to be taken seriously Click to expand... Click to shrink... from the last interview with PS exec about Switch 2 spec, it seems clear that PS have no plan to abandon high end console spec to switch to mobile hardware like Switch 2 and Xbox Ally. PS consider their high fidelity visual as advantage and differentiator from Nintendo. so with PS6, their top studio will eventuall make games that just won't realistically run on handheld devices. so having a mandate where all PS6 games is playable on handheld is simply unrealistic imo  danm999 Member Oct 29, 2017 19,929 Sydney Incentives, not mandates.   NSESN ▲ Legend ▲ Member Oct 25, 2017 27,729 I think people are setting themselves for disappointment in regards for how powerful this thing will be   defaltoption Plug in a controller and enter the Konami code The Fallen Oct 27, 2017 12,485 Austin Depends on what they call it. If they call it anything related to ps6, expect very bad performance, and mandates If they call it ps5 portable, expect bad performance and no mandates as it will be handled on their end If they call it a ps portable expect it to have no support from Sony and get whatever it gets just be happy it functions till they abandon it.  Metnut Member Apr 7, 2025 30 Good question OP. I voted the middle one. I think anything that ships for PS5 will need to work for the handheld. Question is whether that works automatically or will need patches.  mute ▲ Legend ▲ Member Oct 25, 2017 29,807 I think that would require a level of commitment to a secondary piece of hardware that Sony hasn't shown in a long time.   Patison Member Oct 27, 2017 761 It's difficult to say without knowing what they're planning with this device exactly. 
If they're fully going Switch route (or PS Vita/PS TV route) or more like a Steam Deck, which will run launch games perfectly and then, as time goes on, some titles might start looking less than ideal or be unplayable at all. Or Series S/X, just the Series S being portable — that would be preferable but also limiting but also diminishing returns between generations so might be worth it etc. And if that device happens at all and its development won't be dropped soon is another question. Lots of unknowns, but I'm interested to see what Sony comes up with, as long as they'll have games to support it this time around.  Jammerz Member Apr 29, 2023 1,579 I think it will be optional support. However sony needs to support it with their first parties to set an example and making it as easy as possible for other devs to scale down. For sony first party games maybe use nixxes to scale down so their studios aren't bogged down.  Hamchan The Fallen Oct 25, 2017 6,000 I think 99.9% of games will be crossgen between PS5 and PS6 for the entire generation, just based on how this industry is going, so it might not be much of an issue for Sony to mandate.   Advance.Wars.Sgt. Member Jun 10, 2018 10,456 Honestly, I'd worry more about Sony's 1st party teams than 3rd party developers since they were notoriously adverse making software with a handheld power profile in mind.   overthewaves Member Sep 30, 2020 1,203 Wouldn't that hamstring the games for ps6? That's PlayStation players biggest fear they don't want a series S type situation right? They treat series S like a punching bag.   Neonvisions Member Oct 27, 2017 707 overthewaves said: Wouldn't that hamstring the games for ps6? That's PlayStation players biggest fear they don't want a series S type situation right? They treat series S like a punching bag. Click to expand... Click to shrink... How would that effect PS6? Are you suggesting that the Series S hamstrings games for the X?  
Gwarm Member Nov 13, 2017 2,902 I'd be shocked if Sony released a device that let's you play games that haven't been patched or confirmed to run acceptably. Imagine if certain games just hard crashed the console? This is the company that wouldn't let you play certain Vita games on the PSTV even if they actually worked.   bloopland33 Member Mar 4, 2020 3,845 I wonder if they'll just do the Steam Deck thing and do a compatibility badge. You can boot whatever software you want, but it might run at 5 fps and drain your battery. This would be in addition to whatever efforts they're doing to make things work out of the box, of course. But it's hard to imagine them mandating developers ship a PS6 profile and a PS6P profile for those heavier games 5-7 years from now… ….but it's also hard to imagine them shipping this PS6-gen device that doesn't play everything (depending on how they position it). So maybe they Steam Deck it  vivftp Member Oct 29, 2017 23,016 My guess, every PS6 game will be mandated to support it. PS5 games will support it natively for the simpler games and will require a patch as has been rumored to run on lesser specs I think next gen we get PS3 and Vita emulation so the PS6 and portable will be able to play games from PSN from every past PlayStation  Mocha Joe Member Jun 2, 2021 13,636 Really need to take the Steam Deck approach and don't make it a requirement. Just make it a complementary device where it is possible to play majority of the games available on PSN.   overthewaves Member Sep 30, 2020 1,203 Neonvisions said: How would that effect PS6? Are you suggesting that the Series S hamstrings games for the X? Click to expand... Click to shrink... I mean did you see the reaction here to the series S announcement lol. Everyone was saying it's gonna "hold back the generation".   reksveks Member May 17, 2022 7,628 Neonvisions said: How would that effect PS6? Are you suggesting that the Series S hamstrings games for the X? Click to expand... 
Or the perception is that it does, but the truth is that there are a lot of factors.

Fabs (Member, Aug 22, 2019):
I can't see them forcing handheld and Pro support next gen.

level (Member, May 25, 2023):
Definitely not. Games already take too long to make. Extra time isn't something they'll want to impose on their developers.

gofreak (Member, Oct 26, 2017):
I don't think support will be mandatory. I think they're bringing it into a reality where a growing portion of games can, or could, run on a next-gen handheld without much change or effort on the developer's part. They'll lean on that natural trend rather than a policy; anything outside of it will just be streamable, as now with the Portal.

Caiusto (Member, Oct 25, 2017):
If they don't want to end up with another Vita, yes, they will.

mute (▲ Legend ▲, Member, Oct 25, 2017), quoting Advance.Wars.Sgt. ("Honestly, I'd worry more about Sony's first-party teams…"):
It does seem kinda unthinkable that Intergalactic would be made with a handheld in mind, for example.

AmFreak (Member, Oct 26, 2017), quoting mute:
Ratchet, Returnal, Cyberpunk, etc. also weren't made "with a handheld in mind."

Spoit (Member, Oct 28, 2017):
Given how much of a pain the Series S mandate has been, I don't see them binding even first-party studios to it, especially ones that are trying to go for the cutting edge of tech, since, given AMD's timelines, it's not going to be anywhere near a base PS5. I'm also skeptical of the claim that it'll be able to play PS5 games without extensive patching.
Jawmuncher (Crisis Dino, Moderator, Oct 25, 2017), replying to Ibis Island:
No, I think the portable will handle portable stuff "automatically" for what it converts.

knightmawk (Member, Dec 12, 2018):
I expect they'll do everything they can to make sure no one has to think about it and it's as automatic as possible. It'll technically still be part of cert, but the goal will be for it to be rare that a game fails that part of cert and has to be sent back. That being said, I imagine there will be some games that still don't work, and developers will be able to submit for an exception.

RivalGT (Member, Dec 13, 2017):
I think the concept here is similar to how PS4 games play on PS5 (the ones with patches, I mean): the game runs with a different graphics preset than it would on PS4/PS4 Pro, so in some cases that means a higher resolution or a higher frame-rate cap. What Sony needs to work on is getting this to happen without any patches from developers. It's the only way this can work.

Vexii (Member, Oct 31, 2017):
If they don't mandate support, it'll just be a death knell for the format. I don't think they could get away with a dedicated handheld platform now that the Switch and Steam Deck exist.

Mobius and Pet Octopus (Member, Oct 25, 2017):
Just because a game can run on a handheld doesn't mean that's all that's required for support. The UI alone likely requires changes for an optimal experience, sometimes necessarily so to be "playable." Small screen sizes usually need changes.

SeanMN (Member, Oct 28, 2017):
If PS6 game support is optional, that will create fragmentation of the platform and uncertain software support. If it's part of the PS6 family and support is mandatory, I can see there being concern that it would hold the generation back with a low-capability SKU. My thought is this should be a PS6 and support the same games as the primary console.
  • Gardenful / TAOA

    © Tao Lei
    Landscape Architecture • Beijing, China

    Architects: TAOA
    Area: 227 m²
    Year: 2024
    Photographs: Tao Lei
    Text description provided by the architects. This is an urban garden built for private use. In this small corner of the city, I hoped to fill the whole garden with abundant nature. The site is an open space in a villa compound, surrounded by a cluster of the European-style single-family villas typical of Chinese real estate. Modern buildings, with their complete facilities, easily meet requirements for indoor temperature and humidity comfort, but they also impose a hard climate boundary that cuts off the connection between indoor and outdoor, and with it the continuity of nature and life.

    The project is not simply defined as a garden or a building; too simple a definition would only fall into a narrow imagination. The aim was only to establish a place that can hold a piece of real nature, give people shelter, and let them walk within it. The original intention of this design was to build a quiet place where one can be alone, a semi-indoor, semi-outdoor space that leads enclosed life back outdoors and into nature.

    The square site in the middle of the garden is a relatively independent space: the sheltering roof provides comfortable, cozy living, while an opening at its center exposes the sky, where sunshine, rain and snow play out. On the ground below, trees and vegetation from the mountains are introduced, maintaining the most primitive wildness. Held within this exquisite urban space and its abstract geometric order, that wildness naturally sheds the roughness of raw nature.

    A spatial transformation is made on the two sides to the north: through the stairway and the upward pull of the roof space, the narrow auxiliary garden is extended. This garden has no roof and is therefore bright, maintaining a light-and-shade relationship different from that of the central garden, and it is filled with rocks and plants transplanted from the mountains.

    The structure of the garden is thin, densely spaced synthetic bamboo; the cross combination of dense members partitions the space like a bamboo fence, forming a soft boundary. The interior of the space is lined with wooden panels, and the exterior is clad in thin, crisp aluminum panels. A "bridge" made of stone slabs passes through the different spaces, sometimes standing between the bamboo structures, sometimes crossing the rocks, so that one walks among them, moving between order and wildness.

    Nature is difficult to measure, and because of its rich, ever-changing qualities it lends richness to these spaces. Large trees, rocks, and small flowers and plants were brought from the mountains, avoiding artificial nursery plants as far as possible. The geometric order of the garden's structure tempers nature's wildness, yet nature's details can still be discovered and its released life force unconsciously perceived. These fragments of nature are real and wild, and the design did not want them to lose vitality and richness through artificial transplantation. The superposition of wild abundance and modern geometric space keeps the garden alive, with elegance and decency.

    This nature stands apart from the high-density urban space, becoming an independent world that shields out the noise of the city.

    These elements are integrated into a continuous, integral "pavilion" and "corridor" that together form the carrier of the family's outdoor life. While sheltering from wind and rain, the four eaves also create a relationship of light and dark space, with the bright center highlighting nature and becoming the center of life. From any angle one sees a picture of hierarchy and order, a real fragment of nature built into a new context by geometric order. The richness of nature is therefore more easily perceived, and its changes play out constantly in daily life, visible throughout the year.

    Project location: Beijing, China (location to be used only as a reference; it may indicate city/country but not the exact address). Office: TAOA.
    Published on June 15, 2025. Cite: "Gardenful / TAOA" 15 Jun 2025. ArchDaily. ISSN 0719-8884.
  • Meta’s $15 Billion Scale AI Deal Could Leave Gig Workers Behind

    Meta is reportedly set to invest $15 billion to acquire a 49% stake in Scale AI, in a deal that would make Scale CEO Alexandr Wang head of the tech giant’s new AI unit dedicated to pursuing “superintelligence.”

    Scale AI, founded in 2016, is a leading data annotation firm that hires workers around the world to label or create the data that is used to train AI systems.

    The deal is expected to greatly enrich Wang and many of his colleagues with equity in Scale AI; Wang, already a billionaire, would see his wealth grow even further. For Meta, it would breathe new life into the company’s flagging attempts to compete at the “frontier” of AI against OpenAI, Google, and Anthropic.

    However, Scale’s contract workers, many of whom earn just dollars per day via a subsidiary called RemoTasks, are unlikely to benefit at all from the deal, according to sociologists who study the sector. Typically, data workers are not formally employed and are instead paid for the tasks they complete. Those tasks can include labeling the contents of images, answering questions, or rating which of two chatbots’ answers is better, in order to teach AI systems to better comply with human preferences. (TIME has a content partnership with Scale AI.)

    “I expect few if any Scale annotators will see any upside at all,” says Callum Cant, a senior lecturer at the University of Essex, U.K., who studies gig work platforms. “It would be very surprising to see some kind of feed-through. Most of these people don’t have a stake in ownership of the company.”

    Many of those workers already suffer from low pay and poor working conditions. In a recent report by Oxford University’s Internet Institute, the Scale subsidiary RemoTasks failed to meet basic standards for fair pay, fair contracts, fair management, and fair worker representation.

    “A key part of Scale’s value lies in its data work services performed by hundreds of thousands of underpaid and poorly protected workers,” says Jonas Valente, an Oxford researcher who worked on the report. “The company remains far from safeguarding basic standards of fair work, despite limited efforts to improve its practices.”

    The Meta deal is unlikely to change that. “Unfortunately, the increasing profits of many digital labor platforms and their primary companies, such as the case of Scale, do not translate into better conditions for [workers],” Valente says.

    A Scale AI spokesperson declined to comment for this story. “We're proud of the flexible earning opportunities offered through our platforms,” the company said in a statement to TechCrunch in May.

    Meta’s investment also calls into question whether Scale AI will continue supplying data to OpenAI and Google, two of its major clients. In the increasingly competitive AI landscape, observers say Meta may see value in cutting off its rivals from annotated data, an essential means of making AI systems smarter.

    “By buying up access to Scale AI, could Meta deny access to that platform and that avenue for data annotation by other competitors?” says Cant. “It depends entirely on Meta’s strategy.”

    If that were to happen, Cant says, it could put downward pressure on the wages and tasks available to workers, many of whom already struggle to make ends meet with data work.

    A Meta spokesperson declined to comment on this story.
  • Meet Martha Swope, the Legendary Broadway Photographer Who Captured Iconic Moments From Hundreds of Productions and Rehearsals

    She spent nearly 40 years taking theater and dance pictures, providing glimpses behind the scenes and creating images that the public couldn’t otherwise access

    Stephanie Rudig, Freelance Writer

    June 11, 2025

    Photographer Martha Swope sitting on a floor covered with prints of her photos in 1987
    Andrea Legge / © NYPL

    Martha Swope wanted to be a dancer. She moved from her home state of Texas to New York to attend the School of American Ballet, hoping to start a career in dance. Swope also happened to be an amateur photographer. So, in 1957, a fellow classmate invited her to bring her camera and document rehearsals for a little theater show he was working on. The classmate was director and choreographer Jerome Robbins, and the show was West Side Story.
    One of those rehearsal shots ended up in Life magazine, and Swope quickly started getting professional bookings. It’s notoriously tough to make it on Broadway, but through photography, Swope carved out a career capturing theater and dance. Over the course of nearly four decades, she photographed hundreds more rehearsals, productions and promotional studio shots.

    Unidentified male chorus members dancing during rehearsals for musical West Side Story in 1957

    Martha Swope / © NYPL

    At a time when live performances were not often or easily captured, Swope’s photographs caught the animated moments and distilled the essence of a show into a single image: André De Shields clad in a jumpsuit as the title character in The Wiz, Patti LuPone with her arms raised overhead in Evita, the cast of Cats leaping in feline formations, a close-up of a forlorn Sheryl Lee Ralph in Dreamgirls and the row of dancers obscuring their faces with their headshots in A Chorus Line were all captured by Swope’s camera. She was also the house photographer for the New York City Ballet and the Martha Graham Dance Company and photographed other major dance companies such as the Ailey School.
    Her vision of the stage became fairly ubiquitous, with Playbill reporting that in the late 1970s, two-thirds of Broadway productions were photographed by Swope, meaning her work dominated theater and dance coverage. Carol Rosegg was early in her photography career when she heard that Swope was looking for an assistant. “I didn't frankly even know who she was,” Rosegg says. “Then the press agent who told me said, ‘Pick up any New York Times and you’ll find out.’”
    Swope’s background as a dancer likely equipped her to press the shutter at the exact right moment to capture movement, and to know when everyone on stage was precisely posed. She taught herself photography and early on used a Brownie camera, a simple box model made by Kodak. “She was what she described as ‘a dancer with a Brownie,’” says Barbara Stratyner, a historian of the performing arts who curated exhibitions of Swope’s work at the New York Public Library.

    An ensemble of dancers in rehearsal for the stage production Cats in 1982

    Martha Swope / © NYPL

    “Dance was her first love,” Rosegg says. “She knew everything about dance. She would never use a photo of a dancer whose foot was wrong; the feet had to be perfect.”
    According to Rosegg, once the photo subjects knew she was shooting, “the anxiety level came down a little bit.” They knew that they’d look good in the resulting photos, and they likely trusted her intuition as a fellow dancer. Swope moved with the bearing of a dancer and often stood with her feet in ballet’s fourth position while she shot. She continued to take dance classes throughout her life, including at the prestigious Martha Graham School. Stratyner says, “As Graham got older, [Swope] was, I think, the only person who was allowed to photograph rehearsals, because Graham didn’t want rehearsals shown.”
    Photographic technology and the theater and dance landscapes evolved greatly over the course of Swope’s career. Rosegg points out that at the start of her own career, cameras didn’t even automatically advance the film after each shot. She explains the delicate nature of working with film, saying, “When you were shooting film, you actually had to compose, because you had 35 shots and then you had to change your film.” Swope also worked during a period of changing over from all black-and-white photos to a mixture of black-and-white and color photography. Rosegg notes that simultaneously, Swope would shoot black-and-white, and she herself would shoot color. Looking at Swope’s portfolio is also an examination of increasingly crisp photo production. Advances in photography made shooting in the dark or capturing subjects under blinding stage lights easier, and they allowed for better zooming in from afar.

    Martha Graham rehearses dancer Takako Asakawa and others in Heretic, a dance work choreographed by Graham, in 1986

    Martha Swope / © NYPL

    It’s much more common nowadays to get a look behind the curtain of theater productions via social media. “The theater photographers of today need to supply so much content,” Rosegg says. “We didn’t have any of that, and getting to go backstage was kind of a big deal.”
    Photographers coming to document a rehearsal once might have been seen as an intrusion, but now, as Rosegg puts it, “everybody is desperate for you to come, and if you’re not there, they’re shooting it on their iPhone.”
    Even with exclusive behind-the-scenes access to the hottest tickets in town and the biggest stars of the day, Swope remained unpretentious. She lived and worked in a brownstone with her apartment above her studio, where the film was developed in a closet and the bathroom served as a darkroom. Rosegg recalls that a phone sat in the darkroom so they could be reached while printing, and she would be amazed at the big-name producers and theater glitterati who rang in while she was making prints in an unventilated space.

    From left to right: Paul Winfield, Ruby Dee, Marsha Jackson and Denzel Washington in the stage production Checkmates in 1988

    Martha Swope / © NYPL

    Swope’s approachability extended to how she chose to preserve her work. She originally sold her body of work to Time Life, and, according to Stratyner, she was unhappy with the way the photos became relatively inaccessible. She took back the rights to her collection and donated it to the New York Public Library, where many photos can be accessed by researchers in person, and the entire array of photos is available online to the public in the Digital Collections. Searching “Martha Swope” yields over 50,000 items from more than 800 productions, featuring a huge variety of figures, from a white-suited John Travolta busting a disco move in Saturday Night Fever to Andrew Lloyd Webber with Nancy Reagan at a performance of Phantom of the Opera.
    Swope’s extensive career was recognized in 2004 with a special Tony Award, a Tony Honors for Excellence in Theater, which are given intermittently to notable figures in theater who operate outside of traditional awards categories. She also received a lifetime achievement award from the League of Professional Theater Women in 2007. Though she retired in 1994 and died in 2017, her work still reverberates through dance and Broadway history today. For decades, she captured the fleeting moments of theater that would otherwise never be seen by the public. And her passion was clear and straightforward. As she once told an interviewer: “I’m not interested in what’s going on on my side of the camera. I’m interested in what’s happening on the other side.”

    WWW.SMITHSONIANMAG.COM
  • Climate Change Is Ruining Cheese, Scientists and Farmers Warn

    Climate change is making everything worse — including, apparently, threatening the dairy that makes our precious cheese.
    In interviews with Science News, veterinary researchers and dairy farmers alike warned that changes in climate that affect cows impact not only the nutritional value of the cheeses produced from their milk, but also the color, texture and even taste.
    Researchers from the Université Clermont Auvergne, located in the mountainous central France region that produces a delicious firm cheese known as Cantal, explained in a new paper in the Journal of Dairy Science that grass shortages caused by climate change can greatly affect how cows’ milk, and the cheese subsequently created from it, tastes.
    At regular intervals throughout a five-month testing period in 2021, the scientists sampled milk from two groups of cows, each containing 20 cows from two different breeds, that were either allowed to graze on grass as normal or allowed to graze only part-time while being fed a supplemental diet of corn and other concentrated feeds.
    As the researchers found, the corn-fed cohort consistently produced the same amount of milk and less methane than their grass-fed counterparts — but the resulting milk products tasted less savory and rich than those from the grass-fed bovines. Moreover, the milk from the grass-fed cows contained more omega-3 fatty acids, which are good for the heart, and lactic acids, which act as probiotics.
    “Farmers are looking for feed with better yields than grass or that are more resilient to droughts,” explained Matthieu Bouchon, the fittingly named lead author of the study.
    Still, those same farmers want to know how supplementing their cows’ feed will change the milk’s nutritional value and taste, Bouchon said — and as one farmer who spoke to Science News affirmed anecdotally, this effect is bearing out in other parts of the world, too.
    “We were having lots of problems with milk protein and fat content due to the heat,” Gustavo Abijaodi, a dairy farmer in Brazil, told the website. “If we can stabilize heat effects, the cattle will respond with better and more nutritious milk.”
    The heat also seems to be affecting the way cows eat and behave.
    “Cows produce heat to digest food — so if they are already feeling hot, they’ll eat less to lower their temperature,” noted Marina Danes, a dairy scientist at Brazil’s Federal University of Lavras. “This process spirals into immunosuppression, leaving the animal vulnerable to disease.”
    Whether it’s the food quality or the heat affecting the cows, the effects are palpable — or, in this case, edible.
    “If climate change progresses the way it’s going, we’ll feel it in our cheese,” remarked Bouchon, the French researcher.
    FUTURISM.COM
  • Graduate Student Develops an A.I.-Based Approach to Restore Time-Damaged Artwork to Its Former Glory

    The method could help bring countless old paintings, currently stored in the back rooms of galleries with limited conservation budgets, to light

    Scans of the painting retouched with a new technique during various stages in the process. On the right is the restored painting with the applied laminate mask.
    Courtesy of the researchers via MIT

    In a contest for jobs requiring the most patience, art restoration might take first place. Traditionally, conservators restore paintings by recreating the artwork’s exact colors to fill in the damage, one spot at a time. Even with the help of X-ray imaging and pigment analyses, several parts of the expensive process, such as the cleaning and retouching, are done by hand, as noted by Artnet’s Jo Lawson-Tancred.
    Now, a mechanical engineering graduate student at MIT has developed an artificial intelligence-based approach that can achieve a faithful restoration in just hours—instead of months of work.
    In a paper published Wednesday in the journal Nature, Alex Kachkine describes a new method that applies digital restorations to paintings by placing a thin film on top. If the approach becomes widespread, it could make art restoration more accessible and help bring countless damaged paintings, currently stored in the back rooms of galleries with limited conservation budgets, back to light.
    The new technique “is a restoration process that saves a lot of time and money, while also being reversible, which some people feel is really important to preserving the underlying character of a piece,” Kachkine tells Nature’s Amanda Heidt.

    [Video: Meet the engineer who invented an AI-powered way to restore art]

    While filling in damaged areas of a painting would seem like a logical solution to many people, direct retouching raises ethical concerns for modern conservators. That’s because an artwork’s damage is part of its history, and retouching might detract from the painter’s original vision. “For example, instead of removing flaking paint and retouching the painting, a conservator might try to fix the loose paint particles to their original places,” writes Hartmut Kutzke, a chemist at the University of Oslo’s Museum of Cultural History, for Nature News and Views. If retouching is absolutely necessary, he adds, it should be reversible.
    As such, some institutions have started restoring artwork virtually and presenting the restoration next to the untouched, physical version. Many art lovers might argue, however, that a digital restoration printed out or displayed on a screen doesn’t quite compare to seeing the original painting in its full glory.
    That’s where Kachkine, who is also an art collector and amateur conservator, comes in. The MIT student has developed a way to apply digital restorations onto a damaged painting. In short, the approach involves using pre-existing A.I. tools to create a digital version of what the freshly painted artwork would have looked like. Based on this reconstruction, Kachkine’s new software assembles a map of the retouches, and their exact colors, necessary to fill the gaps present in the painting today.
    The map is then printed onto two layers of thin, transparent polymer film—one with colored retouches and one with the same pattern in white—that attach to the painting with conventional varnish. This “mask” aligns the retouches with the gaps while leaving the rest of the artwork visible.
    “In order to fully reproduce color, you need both white and color ink to get the full spectrum,” Kachkine explains in an MIT statement. “If those two layers are misaligned, that’s very easy to see. So, I also developed a few computational tools, based on what we know of human color perception, to determine how small of a region we can practically align and restore.”
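The mapping step can be sketched in a few lines of code. This is a conceptual illustration only, not Kachkine’s actual software: the function name and the tolerance threshold are invented for the example. Given an image of the damaged painting and the A.I. reconstruction, the retouch map is essentially the set of pixels where the two differ, and the printed film carries color only at those spots while staying transparent everywhere else.

```python
import numpy as np

def retouch_map(damaged, reconstruction, tol=10):
    """Sketch of a retouch map: find losses and build a color overlay.

    damaged, reconstruction: (H, W, 3) uint8 RGB images, assumed aligned.
    Returns a boolean mask of damaged pixels and an RGBA "film" image.
    """
    # Pixels where the reconstruction differs noticeably from the damaged
    # original are treated as losses to be filled in.
    diff = np.abs(damaged.astype(int) - reconstruction.astype(int)).max(axis=-1)
    mask = diff > tol
    # The film is transparent everywhere except over the losses, where it
    # carries the reconstruction's color at full opacity.
    film = np.zeros(reconstruction.shape[:2] + (4,), dtype=np.uint8)
    film[mask, :3] = reconstruction[mask]
    film[mask, 3] = 255
    return mask, film
```

A second film printed with the same mask in white would sit beneath this color layer, as Kachkine describes, so the retouches read correctly over dark ground.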
    The method’s magic lies in the fact that the mask is removable, and the digital file provides a record of the modifications for future conservators to study.
    Kachkine demonstrated the approach on a 15th-century oil painting by a Dutch artist whose name is now unknown, a work in dire need of restoration. The retouches were generated by matching the surrounding color, replicating similar patterns visible elsewhere in the painting or copying the artist’s style in other paintings, per Nature News and Views. Overall, the painting’s 5,612 damaged regions were filled with 57,314 different colors in 3.5 hours, an estimated 66 times faster than traditional methods would likely have taken.

    Overview of Physically-Applied Digital Restoration

    “It followed years of effort to try to get the method working,” Kachkine tells the Guardian’s Ian Sample. “There was a fair bit of relief that finally this method was able to reconstruct and stitch together the surviving parts of the painting.”
    The new process still poses ethical considerations, such as whether the applied film disrupts the viewing experience or whether A.I.-generated corrections to the painting are accurate. Additionally, Kutzke writes for Nature News and Views that the effect of the varnish on the painting should be studied more deeply.
    Still, Kachkine says this technique could help address the large number of damaged artworks that live in storage rooms. “This approach grants greatly increased foresight and flexibility to conservators,” per the study, “enabling the restoration of countless damaged paintings deemed unworthy of high conservation budgets.”

    Graduate Student Develops an A.I.-Based Approach to Restore Time-Damaged Artwork to Its Former Glory
    WWW.SMITHSONIANMAG.COM