• Ankur Kothari Q&A: Customer Engagement Book Interview

    Reading Time: 9 minutes
    In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns.
    But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question, we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic.
    This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die,” explores how businesses can translate raw data into actionable insights that drive real results.
    Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.

     
    Ankur Kothari Q&A Interview
    1. What types of customer engagement data are most valuable for making strategic business decisions?
    Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interactions, purchase history, and app usage patterns.
    Second would be demographic information: age, location, income, and other relevant personal characteristics.
    Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews.
    Fourth would be the customer journey data.

    We track customers’ touchpoints across various channels to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data.

    2. How do you distinguish between data that is actionable versus data that is just noise?
    First is staying relevant to your business objectives: actionable data directly relates to your specific goals or KPIs. Then you take help from statistical significance.
    Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in.

    You also want to make sure that there is consistency across sources.
    Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory.
    Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy.

    By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions.
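The statistical-significance filter described above can be sketched as a simple test. This is an illustrative example, not a tool from the interview: a two-proportion z-test that treats a lift between two variants as signal only when it clears a p-value threshold.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is the lift between two variants statistically valid?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

def is_actionable(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Treat an observed lift as signal only if it clears the significance threshold."""
    _, p = two_proportion_z(conv_a, n_a, conv_b, n_b)
    return p < alpha
```

With 100/1000 vs 150/1000 conversions the lift is significant; with 100/1000 vs 105/1000 it is indistinguishable from noise.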

    3. How can customer engagement data be used to identify and prioritize new business opportunities?
    First, it helps us to uncover unmet needs.

    By analyzing the customer feedback, touch points, support interactions, or usage patterns, we can identify the gaps in our current offerings or areas where customers are experiencing pain points.

    Second would be identifying emerging needs.
    Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing the company to adapt its products or services accordingly.
    Third would be segmentation analysis.
    Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies.
    Last is to build competitive differentiation.

    Engagement data can highlight where our company outperforms competitors, helping us to prioritize opportunities that leverage existing strengths and unique selling propositions.

    4. Can you share an example of where data insights directly influenced a critical decision?
    I will share an example from my previous organization, a financial services company, where being very data-driven made a major impact on a critical decision regarding our credit card offerings.
    We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms.
    That insight led us to develop and launch our first digital credit card product with enhanced mobile features and rewards tailored to the millennial spending habits. Since we had access to a lot of transactional data as well, we were able to build a financial product which met that specific segment’s needs.

    That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial.

    5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time?
    When it comes to using engagement data in real time, we do quite a few things. Over the past two or three years, we have been using it for dynamic content personalization: adjusting website content, email messaging, or ad creative based on real-time user behavior and preferences.
    We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments.
    Then we also practice responsive social media engagement: monitoring social media sentiment and trending topics to quickly adapt the messaging and create timely, relevant content.

    Alongside one-on-one personalization, we do a lot of rapid A/B testing of elements like subject lines and CTAs, building on the successful variants of campaigns.
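The automated budget reallocation mentioned above can be illustrated with a simple bandit-style heuristic. This sketch is not from the interview; the metric (conversions per unit spend) and the exploration share are assumptions.

```python
def reallocate_budget(channel_stats, total_budget, explore=0.1):
    """Shift budget toward top-performing channels while reserving a small
    exploration share so underfunded channels can still prove themselves.
    channel_stats maps channel name -> (conversions, spend)."""
    # Conversions per unit spend as the performance metric
    perf = {ch: (conv / spend if spend else 0.0)
            for ch, (conv, spend) in channel_stats.items()}
    n = len(perf)
    explore_share = total_budget * explore / n          # even exploration floor
    exploit_pool = total_budget * (1 - explore)
    total_perf = sum(perf.values()) or 1.0              # guard against all-zero
    return {ch: round(explore_share + exploit_pool * (p / total_perf), 2)
            for ch, p in perf.items()}
```

For example, a channel converting five times better than another receives most of the exploit pool, but never the entire budget.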

    6. How are you doing the 1:1 personalization?
    We have advanced CDP systems, and we are tracking each customer’s behavior in real time. So the moment they move to a different channel, we know the context, the relevance, and the recent interaction points, so we can serve the right offer.
    So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer.
    That gives our customer or potential customer more one-to-one personalization instead of just segment-based or bulk interaction kind of experience.

    We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers.
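The cross-channel context Ankur describes rests on a unified profile store. A minimal sketch, with hypothetical field names, of how a CDP-style profile can let one channel (say, an in-branch agent) see what another channel (the website) recorded:

```python
from collections import defaultdict

class MiniCDP:
    """Toy customer-data-platform profile store: every channel event lands
    in one unified profile keyed by customer ID (names are illustrative)."""
    def __init__(self):
        self.profiles = defaultdict(
            lambda: {"events": [], "last_offer_viewed": None})

    def track(self, customer_id, channel, event, offer=None):
        p = self.profiles[customer_id]
        p["events"].append((channel, event))
        if offer:
            p["last_offer_viewed"] = offer   # context any channel can read back

    def context_for(self, customer_id):
        """What an agent on any channel sees in real time."""
        return self.profiles[customer_id]
```

An offer viewed on the web is immediately visible when the same customer walks into a branch the next day.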

    7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service?
    Primarily with product development: not just the financial products (or whatever products an organization sells), but also products like the mobile apps and websites customers use for transactions. That kind of product development gets improved.
    The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments.

    Customer service also benefits: engagement data helps anticipate common issues, personalize support interactions over phone, email, or chat, and proactively address potential problems, leading to improved customer satisfaction and retention.

    So in general, the cross-functional application of engagement data strengthens the customer-centric approach throughout the organization.

    8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights?
    I think the biggest is the sheer amount of data we are dealing with. As customers become more digitally savvy and move to digital channels, we are getting a lot of data, and that volume can be overwhelming, making it very difficult to identify truly meaningful patterns and insights.

    Because of the huge data overload, we create data silos in this process, so information often exists in separate systems across different departments. We are not able to build a holistic view of customer engagement.

    Because of data silos and overload of data, data quality issues appear. There is inconsistency, and inaccurate data can lead to incorrect insights or poor decision-making. Quality issues could also be due to the wrong format of the data, or the data is stale and no longer relevant.
    As we are growing and adding more people to help us understand customer engagement, I’ve also noticed that technical folks, especially data scientists and data analysts, lack skills to properly interpret the data or apply data insights effectively.
    So there’s a lack of understanding of marketing and sales as domains.
    It’s a huge effort and can take a lot of investment.

    Not being able to calculate the ROI of your overall investment is a big challenge that many organizations are facing.

    9. Why do you think the analysts don’t have the business acumen to properly do more than analyze the data?
    If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. If you cannot stop that from step one—not bringing noise into the data system—that cannot be done by just technical folks or people who do not have business knowledge.
    Business people do not know everything about what data is being collected from which source and what data they need. It’s a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don’t have a lot of exposure to that side.

    Similarly, marketing business people do not have much exposure to the technical side — what’s possible to do with data, how much effort it takes, what’s relevant versus not relevant, and how to prioritize which data sources will be most important.

    10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before?
    First, cross-functional training: training different roles to help them understand why we’re doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do.
    And giving business folks exposure to the technology side through training on different tools, strategies, and the roadmap of data integrations.
    The second is helping teams work more collaboratively. So it’s not like the technology team works in a silo and comes back when their work is done, and then marketing and sales teams act upon it.

    Now we’re making it more like one team. You work together so that you can complement each other, and we have a better strategy from day one.

    11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations?
    We present clear business cases where we demonstrate how data-driven recommendations can directly align with business objectives and potential ROI.
    We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and the implications for business goals.

    We also do a lot of POCs and pilot projects with small-scale implementations to showcase tangible results and build confidence in the data-driven approach throughout the organization.

    12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data?
    I’ve found that Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touch points.
    Having advanced analytics platforms — tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights — is a great value to us.
    We, like many organizations, use marketing automation systems to improve marketing team productivity, helping us track and analyze customer interactions across multiple channels.
    Another is social media listening tools, for tracking wherever your brand is mentioned, measuring customer sentiment, and tracking the engagement of your campaigns across social media platforms.

    Last is web analytics tools, which provide detailed insights into your website visitors’ behavior and engagement metrics across browsers, devices, and mobile apps.

    13. How do you ensure data quality and consistency across multiple channels to make these informed decisions?
    We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. Then we use data integration platforms — tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies.
    As we collect data from different sources, we clean it so it becomes cleaner at every stage of processing.
    We also conduct regular data audits — performing periodic checks to identify and rectify data quality issues, ensuring accuracy and reliability of information. We also deploy standardized data formats.

    On top of that, we have various automated data cleansing tools, specific software to detect and correct data errors, redundancies, duplicates, and inconsistencies in data sets automatically.
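The automated cleansing pass described above can be illustrated in a few lines. This is a generic sketch, not a specific tool from the interview; the field names (`email`, `updated_at`) are assumptions.

```python
from datetime import datetime, timedelta, timezone

def cleanse(records, max_age_days=365):
    """Minimal cleansing pass: normalize formats, drop stale rows,
    and de-duplicate on email. Field names are illustrative."""
    now = datetime.now(timezone.utc)
    seen, clean = set(), []
    for rec in records:
        email = rec.get("email", "").strip().lower()
        if not email or email in seen:
            continue                      # drop blanks and duplicates
        ts = rec.get("updated_at")
        if ts is None or now - ts > timedelta(days=max_age_days):
            continue                      # drop stale or undated rows
        seen.add(email)
        clean.append({**rec, "email": email})
    return clean
```

Normalizing before de-duplicating matters: “A@X.com” and “a@x.com ” are the same customer only after the format is standardized, which is why standardized formats and cleansing are paired in practice.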

    14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years?
    The first thing that’s been the biggest trend from the past two years is AI-driven decision making, which I think will become more prevalent, with advanced algorithms processing vast amounts of engagement data in real-time to inform strategic choices.
    Somewhat related to this is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with more accuracy and better predictive capabilities.
    We also touched upon hyper-personalization. We are all striving toward hyper-personalization at scale, meaning true one-on-one personalization, as we capture more engagement data and have bigger systems and infrastructure to process those large volumes so we can achieve those use cases.
    As the world is collecting more data, privacy concerns and regulations come into play.
    I believe in the next few years there will be more innovation toward how businesses can collect data ethically and what the usage practices are, leading to more transparent and consent-based engagement data strategies.
    And lastly, I think about the integration of engagement data, which is always a big challenge. I believe as we’re solving those integration challenges, we are adding more and more complex data sources to the picture.

    So I think there will need to be more innovation or sophistication brought into data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.

     
    This interview Q&A was hosted with Ankur Kothari, a previous Martech Executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die.
    Download the PDF or request a physical copy of the book here.
    The post Ankur Kothari Q&A: Customer Engagement Book Interview appeared first on MoEngage.
    #ankur #kothari #qampampa #customer #engagement
    Ankur Kothari Q&A: Customer Engagement Book Interview
    Reading Time: 9 minutes In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns. But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question, we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic. This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die” explores how businesses can translate raw data into actionable insights that drive real results. Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.   Ankur Kothari Q&A Interview 1. What types of customer engagement data are most valuable for making strategic business decisions? Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interaction, purchase history, and other app usage patterns. Second would be demographic information: age, location, income, and other relevant personal characteristics. Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews. Fourth would be the customer journey data. We track touchpoints across various channels of the customers to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data. 2. How do you distinguish between data that is actionable versus data that is just noise? First is keeping relevant to your business objectives, making actionable data that directly relates to your specific goals or KPIs, and then taking help from statistical significance. Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in. 
You also want to make sure that there is consistency across sources. Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory. Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy. By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions. 3. How can customer engagement data be used to identify and prioritize new business opportunities? First, it helps us to uncover unmet needs. By analyzing the customer feedback, touch points, support interactions, or usage patterns, we can identify the gaps in our current offerings or areas where customers are experiencing pain points. Second would be identifying emerging needs. Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing my company to adapt their products or services accordingly. Third would be segmentation analysis. Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies. Last is to build competitive differentiation. Engagement data can highlight where our companies outperform competitors, helping us to prioritize opportunities that leverage existing strengths and unique selling propositions. 4. Can you share an example of where data insights directly influenced a critical decision? I will share an example from my previous organization at one of the financial services where we were very data-driven, which made a major impact on our critical decision regarding our credit card offerings. 
We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms. That insight led us to develop and launch our first digital credit card product with enhanced mobile features and rewards tailored to the millennial spending habits. Since we had access to a lot of transactional data as well, we were able to build a financial product which met that specific segment’s needs. That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial. 5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time? When it comes to using the engagement data in real-time, we do quite a few things. In the recent past two, three years, we are using that for dynamic content personalization, adjusting the website content, email messaging, or ad creative based on real-time user behavior and preferences. We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments. Then we also build responsive social media engagement platforms like monitoring social media sentiments and trending topics to quickly adapt the messaging and create timely and relevant content. With one-on-one personalization, we do a lot of A/B testing as part of the overall rapid testing and market elements like subject lines, CTAs, and building various successful variants of the campaigns. 6. How are you doing the 1:1 personalization? We have advanced CDP systems, and we are tracking each customer’s behavior in real-time. 
So the moment they move to different channels, we know what the context is, what the relevance is, and the recent interaction points, so we can cater the right offer. So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer. That gives our customer or potential customer more one-to-one personalization instead of just segment-based or bulk interaction kind of experience. We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers. 7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service? Primarily with product development — we have different products, not just the financial products or products whichever organizations sell, but also various products like mobile apps or websites they use for transactions. So that kind of product development gets improved. The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments. Customer service also gets helped by anticipating common issues, personalizing support interactions over the phone or email or chat, and proactively addressing potential problems, leading to improved customer satisfaction and retention. So in general, cross-functional application of engagement improves the customer-centric approach throughout the organization. 8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights? I think the huge amount of data we are dealing with. 
As we are getting more digitally savvy and most of the customers are moving to digital channels, we are getting a lot of data, and that sheer volume of data can be overwhelming, making it very difficult to identify truly meaningful patterns and insights. Because of the huge data overload, we create data silos in this process, so information often exists in separate systems across different departments. We are not able to build a holistic view of customer engagement. Because of data silos and overload of data, data quality issues appear. There is inconsistency, and inaccurate data can lead to incorrect insights or poor decision-making. Quality issues could also be due to the wrong format of the data, or the data is stale and no longer relevant. As we are growing and adding more people to help us understand customer engagement, I’ve also noticed that technical folks, especially data scientists and data analysts, lack skills to properly interpret the data or apply data insights effectively. So there’s a lack of understanding of marketing and sales as domains. It’s a huge effort and can take a lot of investment. Not being able to calculate the ROI of your overall investment is a big challenge that many organizations are facing. 9. Why do you think the analysts don’t have the business acumen to properly do more than analyze the data? If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. If you cannot stop that from step one—not bringing noise into the data system—that cannot be done by just technical folks or people who do not have business knowledge. Business people do not know everything about what data is being collected from which source and what data they need. It’s a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don’t have a lot of exposure to that side. 
Similarly, marketing business people do not have much exposure to the technical side — what’s possible to do with data, how much effort it takes, what’s relevant versus not relevant, and how to prioritize which data sources will be most important. 10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before? First, cross-functional training: training different roles to help them understand why we’re doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do. And giving business folks exposure to the technology side through training on different tools, strategies, and the roadmap of data integrations. The second is helping teams work more collaboratively. So it’s not like the technology team works in a silo and comes back when their work is done, and then marketing and sales teams act upon it. Now we’re making it more like one team. You work together so that you can complement each other, and we have a better strategy from day one. 11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations? We present clear business cases where we demonstrate how data-driven recommendations can directly align with business objectives and potential ROI. We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and the implications for business goals. We also do a lot of POCs and pilot projects with small-scale implementations to showcase tangible results and build confidence in the data-driven approach throughout the organization. 12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data? I’ve found that Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touch points. 
Having advanced analytics platforms — tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights — is a great value to us. We always use, or many organizations use, marketing automation systems to improve marketing team productivity, helping us track and analyze customer interactions across multiple channels. Another thing is social media listening tools, wherever your brand is mentioned or you want to measure customer sentiment over social media, or track the engagement of your campaigns across social media platforms. Last is web analytical tools, which provide detailed insights into your website visitors’ behaviors and engagement metrics, for browser apps, small browser apps, various devices, and mobile apps. 13. How do you ensure data quality and consistency across multiple channels to make these informed decisions? We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. Then we use data integration platforms — tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies. While we collect data from different sources, we clean the data so it becomes cleaner with every stage of processing. We also conduct regular data audits — performing periodic checks to identify and rectify data quality issues, ensuring accuracy and reliability of information. We also deploy standardized data formats. On top of that, we have various automated data cleansing tools, specific software to detect and correct data errors, redundancies, duplicates, and inconsistencies in data sets automatically. 14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years? 
The first thing that’s been the biggest trend from the past two years is AI-driven decision making, which I think will become more prevalent, with advanced algorithms processing vast amounts of engagement data in real-time to inform strategic choices. Somewhat related to this is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with more accuracy and better predictive capabilities. We also touched upon hyper-personalization. We are all trying to strive toward more hyper-personalization at scale, which is more one-on-one personalization, as we are increasingly capturing more engagement data and have bigger systems and infrastructure to support processing those large volumes of data so we can achieve those hyper-personalization use cases. As the world is collecting more data, privacy concerns and regulations come into play. I believe in the next few years there will be more innovation toward how businesses can collect data ethically and what the usage practices are, leading to more transparent and consent-based engagement data strategies. And lastly, I think about the integration of engagement data, which is always a big challenge. I believe as we’re solving those integration challenges, we are adding more and more complex data sources to the picture. So I think there will need to be more innovation or sophistication brought into data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.   This interview Q&A was hosted with Ankur Kothari, a previous Martech Executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die. Download the PDF or request a physical copy of the book here. The post Ankur Kothari Q&A: Customer Engagement Book Interview appeared first on MoEngage. #ankur #kothari #qampampa #customer #engagement
    WWW.MOENGAGE.COM
    Ankur Kothari Q&A: Customer Engagement Book Interview
    Reading Time: 9 minutes In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns. But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question (and many others), we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic. This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die” explores how businesses can translate raw data into actionable insights that drive real results. Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.   Ankur Kothari Q&A Interview 1. What types of customer engagement data are most valuable for making strategic business decisions? Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interaction, purchase history, and other app usage patterns. Second would be demographic information: age, location, income, and other relevant personal characteristics. Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews. Fourth would be the customer journey data. We track touchpoints across various channels of the customers to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data. 2. How do you distinguish between data that is actionable versus data that is just noise? First is keeping relevant to your business objectives, making actionable data that directly relates to your specific goals or KPIs, and then taking help from statistical significance. 
You also want to make sure there is consistency across sources. Actionable insights are typically corroborated by multiple data points or channels, while noise tends to be isolated and contradictory. Finally, actionable data suggests clear opportunities for improvement or decision-making, whereas noise does not lead to meaningful actions or changes in strategy. By applying these criteria, I can effectively filter out the noise and focus on data that drives valuable business decisions.

3. How can customer engagement data be used to identify and prioritize new business opportunities?
First, it helps us uncover unmet needs. By analyzing customer feedback, touchpoints, support interactions, and usage patterns, we can identify gaps in our current offerings or areas where customers are experiencing pain points.

Second is identifying emerging needs. Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing a company to adapt its products or services accordingly.

Third is segmentation analysis. Detailed customer data analysis enables us to identify unserved or underserved segments, or niche markets, that may represent untapped opportunities for growth and expansion into new areas and geographies.

Last is building competitive differentiation. Engagement data can highlight where our company outperforms competitors, helping us prioritize opportunities that leverage existing strengths and unique selling propositions.

4. Can you share an example of where data insights directly influenced a critical decision?
I will share an example from my previous organization, a financial services company, where being very data-driven made a major impact on a critical decision about our credit card offerings. We analyzed the customer engagement data and discovered that a large segment of our millennial customers was underutilizing our traditional credit cards but showed high engagement with mobile payment platforms. That insight led us to develop and launch our first digital credit card product, with enhanced mobile features and rewards tailored to millennial spending habits. Since we also had access to a lot of transactional data, we were able to build a financial product that met that specific segment's needs.

That data-driven decision resulted in a 40% increase in new credit card applications from this demographic within the first quarter after launch. Subsequently, our market share improved in that specific segment, which was crucial.

5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time?
When it comes to using engagement data in real time, we do quite a few things. Over the past two or three years, we have used it for dynamic content personalization: adjusting website content, email messaging, or ad creative based on real-time user behavior and preferences.

We automate campaign optimization using AI-driven tools that continuously analyze performance metrics and automatically reallocate budget to top-performing channels or ad segments. We also practice responsive social media engagement, monitoring sentiment and trending topics so we can quickly adapt our messaging and create timely, relevant content. And alongside one-on-one personalization, we run a lot of rapid A/B tests on campaign elements like subject lines and CTAs, building on the most successful variants.

6. How are you doing the 1:1 personalization?
We have advanced CDP systems, and we track each customer's behavior in real time. The moment they move between channels, we know the context, the relevance, and the recent interaction points, so we can serve the right offer. For example, if you looked at a certain offer on the website after arriving from Google, and the next day you walk into an in-person interaction, our agent will already know you were looking at that offer. That gives our customers and potential customers a one-to-one personalized experience rather than a segment-based or bulk interaction.

We have a huge team of data scientists, data analysts, and AI model creators who help us analyze large volumes of data and bring the right insights to our marketing and sales teams so they can provide the right experience to our customers.

7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service?
Primarily product development. We have different products: not just the financial products an organization sells, but also the mobile apps and websites customers use for transactions. Engagement data improves how those products get developed.

The engagement data also helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments. Customer service benefits as well, by anticipating common issues, personalizing support interactions over phone, email, or chat, and proactively addressing potential problems, leading to improved customer satisfaction and retention. In general, cross-functional application of engagement data improves the customer-centric approach throughout the organization.

8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights?
First is the huge amount of data we are dealing with. As customers become more digitally savvy and move to digital channels, we collect a lot of data, and that sheer volume can be overwhelming, making it very difficult to identify truly meaningful patterns and insights.

Because of that data overload, we create data silos: information often exists in separate systems across different departments, so we are not able to build a holistic view of customer engagement. And because of silos and overload, data quality issues appear. Inconsistent or inaccurate data can lead to incorrect insights and poor decision-making. Quality issues can also stem from data in the wrong format, or data that is stale and no longer relevant.

As we grow and add more people to help us understand customer engagement, I've also noticed that technical folks, especially data scientists and data analysts, can lack the skills to properly interpret the data or apply its insights effectively; there is a lack of understanding of marketing and sales as domains. It's a huge effort and can take a lot of investment, and not being able to calculate the ROI of that overall investment is a big challenge many organizations face.

9. Why do you think the analysts don't have the business acumen to properly do more than analyze the data?
If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. Stopping that from step one, not bringing noise into the data system, cannot be done by technical folks alone, or by people who lack business knowledge. Business people, in turn, do not know everything about what data is being collected from which source and what data they need.
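The "stop noise at step one" idea, rejecting malformed, irrelevant, or duplicate events before they ever enter the data system, can be sketched as a small ingestion filter. This is a hedged illustration: the event schema, field names, and the business-approved event list are invented for the example, not taken from the interview.

```python
# Fields every event must carry, and the event types the business has
# tied to a goal or KPI (anything else is treated as noise).
REQUIRED = {"customer_id", "event_type", "timestamp"}
ALLOWED_EVENTS = {"page_view", "purchase", "support_ticket"}

def ingest(events):
    """Admit only well-formed, business-relevant, non-duplicate events."""
    seen, admitted = set(), []
    for e in events:
        if not REQUIRED.issubset(e):               # malformed: missing fields
            continue
        if e["event_type"] not in ALLOWED_EVENTS:  # noise: no business purpose
            continue
        key = (e["customer_id"], e["event_type"], e["timestamp"])
        if key in seen:                            # duplicate delivery
            continue
        seen.add(key)
        admitted.append(e)
    return admitted
```

Deciding what belongs in `ALLOWED_EVENTS` is exactly the point Ankur makes: it needs business knowledge, not just engineering, which is why a purely technical team cannot write this filter well on its own.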
It's a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don't have much exposure to that side. Similarly, marketing and business people do not have much exposure to the technical side: what's possible to do with data, how much effort it takes, what's relevant versus not, and how to prioritize which data sources will be most important.

10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before?
First, cross-functional training: training different roles to help them understand why we're doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do, and giving business folks exposure to the technology side through training on tools, strategies, and the roadmap of data integrations.

Second is helping teams work more collaboratively. It's no longer the technology team working in a silo, coming back when their work is done, and the marketing and sales teams then acting on it. Now we make it more like one team: you work together so you can complement each other, and we have a better strategy from day one.

11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations?
We present clear business cases that demonstrate how data-driven recommendations directly align with business objectives and potential ROI. We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and their implications for business goals. We also run a lot of POCs and pilot projects, small-scale implementations that showcase tangible results and build confidence in the data-driven approach throughout the organization.

12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data?
I've found that Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touchpoints. Advanced analytics platforms, tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights, are also of great value to us.

Like many organizations, we use marketing automation systems to improve marketing team productivity and to track and analyze customer interactions across multiple channels. Another is social media listening tools, for monitoring wherever your brand is mentioned, measuring customer sentiment, and tracking campaign engagement across social platforms. Last is web analytics tools, which provide detailed insights into website visitors' behavior and engagement metrics across browsers, devices, and mobile apps.

13. How do you ensure data quality and consistency across multiple channels to make these informed decisions?
We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. We use data integration platforms, tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies. As we collect data from different sources, we clean it, so the data gets cleaner at every stage of processing.

We also conduct regular data audits, periodic checks to identify and rectify data quality issues and ensure the accuracy and reliability of our information. We deploy standardized data formats, and on top of that we run automated data cleansing tools: software that detects and corrects errors, redundancies, duplicates, and inconsistencies in data sets automatically.

14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years?
The biggest trend of the past two years is AI-driven decision-making, which I think will become even more prevalent, with advanced algorithms processing vast amounts of engagement data in real time to inform strategic choices. Somewhat related is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with greater accuracy.

We also touched on hyper-personalization. We are all striving toward hyper-personalization at scale, true one-on-one personalization, as we capture more engagement data and build bigger systems and infrastructure to process those large volumes and support those use cases.

As the world collects more data, privacy concerns and regulations come into play. I believe the next few years will bring more innovation in how businesses collect data ethically and how they use it, leading to more transparent, consent-based engagement data strategies.

And lastly, I think about the integration of engagement data, which is always a big challenge. As we solve those integration challenges, we keep adding more complex data sources to the picture, so there will need to be more innovation and sophistication in data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.

This interview Q&A was hosted with Ankur Kothari, a former Martech executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die. Download the PDF or request a physical copy of the book here.

The post Ankur Kothari Q&A: Customer Engagement Book Interview appeared first on MoEngage.
  • Over 8M patient records leaked in healthcare data breach

Published June 15, 2025 10:00am EDT
In the past decade, healthcare data has become one of the most sought-after targets in cybercrime. From insurers to clinics, every player in the ecosystem handles some form of sensitive information. However, breaches do not always originate from hospitals or health apps. Increasingly, patient data is managed by third-party vendors offering digital services such as scheduling, billing and marketing. One such breach at a digital marketing agency serving dental practices recently exposed approximately 2.7 million patient profiles and more than 8.8 million appointment records.

Massive healthcare data leak exposes millions: What you need to know
Cybernews researchers have discovered a misconfigured MongoDB database exposing 2.7 million patient profiles and 8.8 million appointment records. The database was publicly accessible online, unprotected by passwords or authentication protocols. Anyone with basic knowledge of database scanning tools could have accessed it. The exposed data included names, birthdates, addresses, emails, phone numbers, gender, chart IDs, language preferences and billing classifications. Appointment records also contained metadata such as timestamps and institutional identifiers.

Clues within the data structure point toward Gargle, a Utah-based company that builds websites and offers marketing tools for dental practices. While not a confirmed source, several internal references and system details suggest a strong connection. Gargle provides appointment scheduling, form submission and patient communication services. These functions require access to patient information, making the firm a likely link in the exposure.

After the issue was reported, the database was secured. The duration of the exposure remains unknown, and there is no public evidence indicating whether the data was downloaded by malicious actors before being locked down. We reached out to Gargle for comment but did not hear back before our deadline.

How healthcare data breaches lead to identity theft and insurance fraud
The exposed data presents a broad risk profile. On its own, a phone number or billing record might seem limited in scope. Combined, however, the dataset forms a complete profile that could be exploited for identity theft, insurance fraud and targeted phishing campaigns. Medical identity theft allows attackers to impersonate patients and access services under a false identity. Victims often remain unaware until significant damage is done, ranging from incorrect medical records to unpaid bills in their names. The leak also opens the door to insurance fraud, with actors using institutional references and chart data to submit false claims.

This type of breach raises questions about compliance with the Health Insurance Portability and Accountability Act (HIPAA), which mandates strong security protections for entities handling patient data. Although Gargle is not a healthcare provider, its access to patient-facing infrastructure could place it under the scope of that regulation as a business associate.

5 ways you can stay safe from healthcare data breaches
If your information was part of this healthcare breach or any similar one, it's worth taking a few steps to protect yourself.

1. Consider identity theft protection services: Since the healthcare data breach exposed personal and financial information, it's crucial to stay proactive against identity theft.
Identity theft protection services offer continuous monitoring of your credit reports, Social Security number and even the dark web to detect if your information is being misused. These services send you real-time alerts about suspicious activity, such as new credit inquiries or attempts to open accounts in your name, helping you act quickly before serious damage occurs. Beyond monitoring, many identity theft protection companies provide dedicated recovery specialists who assist you in resolving fraud issues, disputing unauthorized charges and restoring your identity if it's compromised.

2. Use personal data removal services: The healthcare data breach leaks loads of information about you, and all of it could end up in the public domain, which essentially gives anyone an opportunity to scam you. One proactive step is to consider personal data removal services, which specialize in continuously monitoring and removing your information from various online databases and websites. While no service promises to remove all your data from the internet, a removal service is great if you want to continuously monitor and automate the process of removing your information from hundreds of sites over a longer period of time.

3. Have strong antivirus software: Hackers have people's email addresses and full names, which makes it easy to send you a phishing link that installs malware and steals your data. These messages are socially engineered, and catching them is nearly impossible if you're not careful. However, you're not without defenses. The best way to safeguard yourself from malicious links that install malware, potentially accessing your private information, is to have strong antivirus software installed on all your devices. This protection can also alert you to phishing emails and ransomware scams, keeping your personal information and digital assets safe.

4. Enable two-factor authentication: While passwords weren't part of this data breach, you should still enable two-factor authentication (2FA). It gives you an extra layer of security on all your important accounts, including email, banking and social media. 2FA requires a second piece of information, such as a code sent to your phone, in addition to your password when logging in. This makes it significantly harder for hackers to access your accounts even if they have your password, greatly reducing the risk of unauthorized access.

5. Be wary of mailbox communications: Bad actors may also try to scam you through snail mail. The data leak gives them access to your address. They may impersonate people or brands you know and use themes that require urgent attention, such as missed deliveries, account suspensions and security alerts.

Kurt's key takeaway
If nothing else, this latest leak shows just how poorly patient data is being handled today. More and more, non-medical vendors are getting access to sensitive information without facing the same rules or oversight as hospitals and clinics. These third-party services are now a regular part of how patients book appointments, pay bills or fill out forms. But when something goes wrong, the fallout is just as serious. Even though the database was taken offline, the bigger problem hasn't gone away. Your data is only as safe as the least careful company that gets access to it.

Copyright 2025 CyberGuy.com. All rights reserved. Kurt "CyberGuy" Knutsson is an award-winning tech journalist who has a deep love of technology, gear and gadgets that make life better, with his contributions for Fox News & FOX Business beginning mornings on "FOX & Friends."
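As a defensive aside for operators: the misconfiguration at the center of this story, a MongoDB instance with no authentication reachable from the open internet, maps to two well-known mongod configuration options. The sketch below audits a parsed config for those two issues; the function name is invented for illustration, and `security.authorization` and `net.bindIp` are the real option names from MongoDB's configuration file format.

```python
def audit_mongod_config(cfg):
    """Flag the misconfigurations behind leaks like this one: auth left
    off, and a listener bound to every network interface. `cfg` is a
    parsed mongod config (e.g. mongod.conf loaded with a YAML parser)."""
    findings = []
    auth = cfg.get("security", {}).get("authorization", "disabled")
    if auth != "enabled":
        findings.append("security.authorization is not enabled")
    bind = cfg.get("net", {}).get("bindIp", "127.0.0.1")
    if "0.0.0.0" in bind.split(","):
        findings.append("net.bindIp exposes the listener on all interfaces")
    return findings
```

Either finding alone is survivable; together they produce exactly the situation described above, a database anyone with a scanning tool can read.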
    #over #patient #records #leaked #healthcare
    Over 8M patient records leaked in healthcare data breach
    Published June 15, 2025 10:00am EDT close IPhone users instructed to take immediate action to avoid data breach: 'Urgent threat' Kurt 'The CyberGuy' Knutsson discusses Elon Musk's possible priorities as he exits his role with the White House and explains the urgent warning for iPhone users to update devices after a 'massive security gap.' NEWYou can now listen to Fox News articles! In the past decade, healthcare data has become one of the most sought-after targets in cybercrime. From insurers to clinics, every player in the ecosystem handles some form of sensitive information. However, breaches do not always originate from hospitals or health apps. Increasingly, patient data is managed by third-party vendors offering digital services such as scheduling, billing and marketing. One such breach at a digital marketing agency serving dental practices recently exposed approximately 2.7 million patient profiles and more than 8.8 million appointment records.Sign up for my FREE CyberGuy ReportGet my best tech tips, urgent security alerts, and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide — free when you join. Illustration of a hacker at work  Massive healthcare data leak exposes millions: What you need to knowCybernews researchers have discovered a misconfigured MongoDB database exposing 2.7 million patient profiles and 8.8 million appointment records. The database was publicly accessible online, unprotected by passwords or authentication protocols. Anyone with basic knowledge of database scanning tools could have accessed it.The exposed data included names, birthdates, addresses, emails, phone numbers, gender, chart IDs, language preferences and billing classifications. 
Appointment records also contained metadata such as timestamps and institutional identifiers.MASSIVE DATA BREACH EXPOSES 184 MILLION PASSWORDS AND LOGINSClues within the data structure point toward Gargle, a Utah-based company that builds websites and offers marketing tools for dental practices. While not a confirmed source, several internal references and system details suggest a strong connection. Gargle provides appointment scheduling, form submission and patient communication services. These functions require access to patient information, making the firm a likely link in the exposure.After the issue was reported, the database was secured. The duration of the exposure remains unknown, and there is no public evidence indicating whether the data was downloaded by malicious actors before being locked down.We reached out to Gargle for a comment but did not hear back before our deadline. A healthcare professional viewing heath data     How healthcare data breaches lead to identity theft and insurance fraudThe exposed data presents a broad risk profile. On its own, a phone number or billing record might seem limited in scope. Combined, however, the dataset forms a complete profile that could be exploited for identity theft, insurance fraud and targeted phishing campaigns.Medical identity theft allows attackers to impersonate patients and access services under a false identity. Victims often remain unaware until significant damage is done, ranging from incorrect medical records to unpaid bills in their names. The leak also opens the door to insurance fraud, with actors using institutional references and chart data to submit false claims.This type of breach raises questions about compliance with the Health Insurance Portability and Accountability Act, which mandates strong security protections for entities handling patient data. 
Although Gargle is not a healthcare provider, its access to patient-facing infrastructure could place it under the scope of that regulation as a business associate. A healthcare professional working on a laptop  5 ways you can stay safe from healthcare data breachesIf your information was part of the healthcare breach or any similar one, it’s worth taking a few steps to protect yourself.1. Consider identity theft protection services: Since the healthcare data breach exposed personal and financial information, it’s crucial to stay proactive against identity theft. Identity theft protection services offer continuous monitoring of your credit reports, Social Security number and even the dark web to detect if your information is being misused. These services send you real-time alerts about suspicious activity, such as new credit inquiries or attempts to open accounts in your name, helping you act quickly before serious damage occurs. Beyond monitoring, many identity theft protection companies provide dedicated recovery specialists who assist you in resolving fraud issues, disputing unauthorized charges and restoring your identity if it’s compromised. See my tips and best picks on how to protect yourself from identity theft.2. Use personal data removal services: The healthcare data breach leaks loads of information about you, and all this could end up in the public domain, which essentially gives anyone an opportunity to scam you.  One proactive step is to consider personal data removal services, which specialize in continuously monitoring and removing your information from various online databases and websites. While no service promises to remove all your data from the internet, having a removal service is great if you want to constantly monitor and automate the process of removing your information from hundreds of sites continuously over a longer period of time. Check out my top picks for data removal services here. 
GET FOX BUSINESS ON THE GO BY CLICKING HEREGet a free scan to find out if your personal information is already out on the web3. Have strong antivirus software: Hackers have people’s email addresses and full names, which makes it easy for them to send you a phishing link that installs malware and steals all your data. These messages are socially engineered to catch them, and catching them is nearly impossible if you’re not careful. However, you’re not without defenses.The best way to safeguard yourself from malicious links that install malware, potentially accessing your private information, is to have strong antivirus software installed on all your devices. This protection can also alert you to phishing emails and ransomware scams, keeping your personal information and digital assets safe. Get my picks for the best 2025 antivirus protection winners for your Windows, Mac, Android and iOS devices.4. Enable two-factor authentication: While passwords weren’t part of the data breach, you still need to enable two-factor authentication. It gives you an extra layer of security on all your important accounts, including email, banking and social media. 2FA requires you to provide a second piece of information, such as a code sent to your phone, in addition to your password when logging in. This makes it significantly harder for hackers to access your accounts, even if they have your password. Enabling 2FA can greatly reduce the risk of unauthorized access and protect your sensitive data.5. Be wary of mailbox communications: Bad actors may also try to scam you through snail mail. The data leak gives them access to your address. They may impersonate people or brands you know and use themes that require urgent attention, such as missed deliveries, account suspensions and security alerts. Kurt’s key takeawayIf nothing else, this latest leak shows just how poorly patient data is being handled today. 
More and more, non-medical vendors are getting access to sensitive information without facing the same rules or oversight as hospitals and clinics. These third-party services are now a regular part of how patients book appointments, pay bills or fill out forms. But when something goes wrong, the fallout is just as serious. Even though the database was taken offline, the bigger problem hasn't gone away. Your data is only as safe as the least careful company that gets access to it.CLICK HERE TO GET THE FOX NEWS APPDo you think healthcare companies are investing enough in their cybersecurity infrastructure? Let us know by writing us at Cyberguy.com/ContactFor more of my tech tips and security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/NewsletterAsk Kurt a question or let us know what stories you'd like us to coverFollow Kurt on his social channelsAnswers to the most asked CyberGuy questions:New from Kurt:Copyright 2025 CyberGuy.com.  All rights reserved.   Kurt "CyberGuy" Knutsson is an award-winning tech journalist who has a deep love of technology, gear and gadgets that make life better with his contributions for Fox News & FOX Business beginning mornings on "FOX & Friends." Got a tech question? Get Kurt’s free CyberGuy Newsletter, share your voice, a story idea or comment at CyberGuy.com. #over #patient #records #leaked #healthcare
    WWW.FOXNEWS.COM
    Over 8M patient records leaked in healthcare data breach
    Published June 15, 2025 10:00am EDT close IPhone users instructed to take immediate action to avoid data breach: 'Urgent threat' Kurt 'The CyberGuy' Knutsson discusses Elon Musk's possible priorities as he exits his role with the White House and explains the urgent warning for iPhone users to update devices after a 'massive security gap.' NEWYou can now listen to Fox News articles! In the past decade, healthcare data has become one of the most sought-after targets in cybercrime. From insurers to clinics, every player in the ecosystem handles some form of sensitive information. However, breaches do not always originate from hospitals or health apps. Increasingly, patient data is managed by third-party vendors offering digital services such as scheduling, billing and marketing. One such breach at a digital marketing agency serving dental practices recently exposed approximately 2.7 million patient profiles and more than 8.8 million appointment records.Sign up for my FREE CyberGuy ReportGet my best tech tips, urgent security alerts, and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide — free when you join. Illustration of a hacker at work   (Kurt "CyberGuy" Knutsson)Massive healthcare data leak exposes millions: What you need to knowCybernews researchers have discovered a misconfigured MongoDB database exposing 2.7 million patient profiles and 8.8 million appointment records. The database was publicly accessible online, unprotected by passwords or authentication protocols. Anyone with basic knowledge of database scanning tools could have accessed it.The exposed data included names, birthdates, addresses, emails, phone numbers, gender, chart IDs, language preferences and billing classifications. 
Appointment records also contained metadata such as timestamps and institutional identifiers.
Clues within the data structure point toward Gargle, a Utah-based company that builds websites and offers marketing tools for dental practices. While not a confirmed source, several internal references and system details suggest a strong connection. Gargle provides appointment scheduling, form submission and patient communication services. These functions require access to patient information, making the firm a likely link in the exposure.
After the issue was reported, the database was secured. The duration of the exposure remains unknown, and there is no public evidence indicating whether the data was downloaded by malicious actors before being locked down.
We reached out to Gargle for a comment but did not hear back before our deadline.
A healthcare professional viewing health data (Kurt "CyberGuy" Knutsson)
How healthcare data breaches lead to identity theft and insurance fraud
The exposed data presents a broad risk profile. On its own, a phone number or billing record might seem limited in scope. Combined, however, the dataset forms a complete profile that could be exploited for identity theft, insurance fraud and targeted phishing campaigns.
Medical identity theft allows attackers to impersonate patients and access services under a false identity. Victims often remain unaware until significant damage is done, ranging from incorrect medical records to unpaid bills in their names. The leak also opens the door to insurance fraud, with actors using institutional references and chart data to submit false claims.
This type of breach raises questions about compliance with the Health Insurance Portability and Accountability Act (HIPAA), which mandates strong security protections for entities handling patient data.
Although Gargle is not a healthcare provider, its access to patient-facing infrastructure could place it under the scope of that regulation as a business associate.
A healthcare professional working on a laptop (Kurt "CyberGuy" Knutsson)
5 ways you can stay safe from healthcare data breaches
If your information was part of the healthcare breach or any similar one, it’s worth taking a few steps to protect yourself.
1. Consider identity theft protection services: Since the healthcare data breach exposed personal and financial information, it’s crucial to stay proactive against identity theft. Identity theft protection services offer continuous monitoring of your credit reports, Social Security number and even the dark web to detect if your information is being misused. These services send you real-time alerts about suspicious activity, such as new credit inquiries or attempts to open accounts in your name, helping you act quickly before serious damage occurs. Beyond monitoring, many identity theft protection companies provide dedicated recovery specialists who assist you in resolving fraud issues, disputing unauthorized charges and restoring your identity if it’s compromised. See my tips and best picks on how to protect yourself from identity theft.
2. Use personal data removal services: The healthcare data breach leaks loads of information about you, and all of it could end up in the public domain, which essentially gives anyone an opportunity to scam you. One proactive step is to consider personal data removal services, which specialize in continuously monitoring and removing your information from various online databases and websites. While no service promises to remove all your data from the internet, a removal service is a good option if you want to monitor and automate the process of removing your information from hundreds of sites over a longer period of time. Check out my top picks for data removal services here.
3. Have strong antivirus software: Hackers have people’s email addresses and full names, which makes it easy for them to send you a phishing link that installs malware and steals all your data. These messages are socially engineered to look legitimate, and spotting them is nearly impossible if you’re not careful. However, you’re not without defenses.
The best way to safeguard yourself from malicious links that install malware, potentially accessing your private information, is to have strong antivirus software installed on all your devices. This protection can also alert you to phishing emails and ransomware scams, keeping your personal information and digital assets safe. Get my picks for the best 2025 antivirus protection winners for your Windows, Mac, Android and iOS devices.
4. Enable two-factor authentication: While passwords weren’t part of the data breach, you still need to enable two-factor authentication (2FA). It gives you an extra layer of security on all your important accounts, including email, banking and social media. 2FA requires you to provide a second piece of information, such as a code sent to your phone, in addition to your password when logging in. This makes it significantly harder for hackers to access your accounts, even if they have your password. Enabling 2FA can greatly reduce the risk of unauthorized access and protect your sensitive data.
5. Be wary of mailbox communications: Bad actors may also try to scam you through snail mail. The data leak gives them access to your address. They may impersonate people or brands you know and use themes that require urgent attention, such as missed deliveries, account suspensions and security alerts.
Kurt’s key takeaway
If nothing else, this latest leak shows just how poorly patient data is being handled today.
More and more, non-medical vendors are getting access to sensitive information without facing the same rules or oversight as hospitals and clinics. These third-party services are now a regular part of how patients book appointments, pay bills or fill out forms. But when something goes wrong, the fallout is just as serious. Even though the database was taken offline, the bigger problem hasn't gone away. Your data is only as safe as the least careful company that gets access to it.
Do you think healthcare companies are investing enough in their cybersecurity infrastructure? Let us know by writing us at Cyberguy.com/Contact
For more of my tech tips and security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter
Copyright 2025 CyberGuy.com. All rights reserved. Kurt "CyberGuy" Knutsson is an award-winning tech journalist who has a deep love of technology, gear and gadgets that make life better with his contributions for Fox News & FOX Business beginning mornings on "FOX & Friends."
  • Government ditches public sector decarbonisation scheme

    The government has axed a scheme for upgrading energy efficiency in public sector buildings.
    The Public Sector Decarbonisation Scheme (PSDS) delivered more than £2.5bn in its first three phases for measures such as heat pumps, solar panels, insulation and double glazing, with further funding of nearly £1bn recently announced.
    But the Department for Energy Security and Net Zero (DESNZ) has told Building Design that the scheme has been dropped after the spending review, leaving uncertainty about how upgrades will be funded when the current phase expires in 2028.

    Source: UK Government/Flickr. Ed Miliband’s Department for Energy Security and Net Zero is responsible for the scheme
    The department said it would set out plans for the period after 2028 in due course.
    In a post on LinkedIn, Dave Welkin, director of sustainability at Gleeds, said he had waited for the release of the spending review with a “sense of trepidation” and was unable to find mention of public sector decarbonisation when Treasury documents were released.
    “I hoped because it was already committed in the Budget that its omission wasn’t ominous,” he wrote.
    Yesterday, he was told by Salix Finance, the non-departmental public body that delivers funding for the scheme, that it was no longer being funded.
    It comes after the withdrawal of funding for the Low Carbon Skills Fund (LCSF) in May.
    According to the government’s website, PSDS and LCSF were intended to support the reduction of emissions from public sector buildings by 75% by 2037, compared to a 2017 baseline.
    “Neither LCSF or PSDS were perfect by any means, but they did provide a vital source of funding for local authorities, hospitals, schools and many other public sector organisations to save energy, carbon and money,” Welkin said.
    “PSDS has helped replace failed heating systems in schools, keeping students warm. It’s replaced roofs on hospitals, helping patients recover from illness. It’s replaced windows in our prisons, improving security and stopping drugs getting behind bars.”
    However, responding to Welkin’s post, Steve Connolly, chief executive at Arriba Technologies, a low carbon heating and cooling firm, said that the scheme was being “mismanaged” with a small number of professional services firms “scooping up disproportionately large grants for their clients”.
    The fourth phase of the scheme was confirmed last September, with allocations confirmed only last month.
    This latest phase, which covers the financial years between 2025/26 and 2027/28, saw the distribution of £940m across the country.
    A DESNZ spokesperson said: “Our settlement is about investing in Britain’s renewal to create energy security, sprint to clean power by 2030, encourage investment, create jobs and bring down bills for good.
    “We will deliver £1bn in current allocations of the Public Sector Decarbonisation Scheme until 2028 and, through Great British Energy, have invested in new rooftop solar power and renewable schemes to lower energy bills for schools and hospitals across the UK.
    “We want to build on this progress by incentivising the public sector to decarbonise, so they can reap the benefits in lower bills and emissions, sharing best practice across government and exploring the use of repayable finance, where appropriate.”
    A government assessment of phase 3a and 3b projects identified a number of issues with the scheme, including delays and cost inflation, with more than a tenth of projects abandoned after grants were offered.
    Stakeholders interviewed for the report also identified “difficulties in obtaining skilled contractors and equipment”, especially air source heat pumps.
    The first-come, first-served approach to awarding funding was also said to be “encouraging applicants to opt for more straightforward projects” and “potentially undermining the achievement of PSDS objective by restricting the opportunity for larger [and] more complex measures which may have delivered greater carbon reduction benefits”.
    But the consensus among stakeholders and industry representatives interviewed for the report was that the scheme was “currently key to sustaining the existing UK heat pump market” and that it was “seen as vital in enabling many public sector organisations to invest in heat decarbonisation”.
  • New Zealand’s Email Security Requirements for Government Organizations: What You Need to Know

    The Secure Government Email (SGE) Common Implementation Framework
    New Zealand’s government is introducing a comprehensive email security framework designed to protect official communications from phishing and domain spoofing. This new framework, which will be mandatory for all government agencies by October 2025, establishes clear technical standards to enhance email security and retire the outdated SEEMail service. 
    Key Takeaways

    All NZ government agencies must comply with new email security requirements by October 2025.
    The new framework strengthens trust and security in government communications by preventing spoofing and phishing.
    The framework mandates TLS 1.2+, SPF, DKIM, DMARC with p=reject, MTA-STS, and DLP controls.
    EasyDMARC simplifies compliance with our guided setup, monitoring, and automated reporting.


    What is the Secure Government Email Common Implementation Framework?
    The Secure Government Email Common Implementation Framework is a new government-led initiative in New Zealand designed to standardize email security across all government agencies. Its main goal is to secure external email communication, reduce domain spoofing in phishing attacks, and replace the legacy SEEMail service.
    Why is New Zealand Implementing New Government Email Security Standards?
    The framework was developed by New Zealand’s Department of Internal Affairs as part of its role in managing ICT Common Capabilities. It leverages modern email security controls via the Domain Name System (DNS) to enable the retirement of the legacy SEEMail service and provide:

    Encryption for transmission security
    Digital signing for message integrity
    Basic non-repudiation
    Domain spoofing protection

    These improvements apply to all emails, not just those routed through SEEMail, offering broader protection across agency communications.
    What Email Security Technologies Are Required by the New NZ SGE Framework?
    The SGE Framework outlines the following key technologies that agencies must implement:

    TLS 1.2 or higher with implicit TLS enforced
    TLS-RPT
    SPF
    DKIM
    DMARC with reporting
    MTA-STS
    Data Loss Prevention (DLP) controls

    These technologies work together to ensure encrypted email transmission, validate sender identity, prevent unauthorized use of domains, and reduce the risk of sensitive data leaks.


    When Do NZ Government Agencies Need to Comply with this Framework?
    All New Zealand government agencies are expected to fully implement the Secure Government Email Common Implementation Framework by October 2025. Agencies should begin their planning and deployment now to ensure full compliance by the deadline.
    The All of Government Secure Email Common Implementation Framework v1.0
    What are the Mandated Requirements for Domains?
    Below are the exact requirements for all email-enabled domains under the new framework.
    Control: Exact Requirement
    TLS: Minimum TLS 1.2. TLS 1.1, 1.0, SSL, or clear-text not permitted.
    TLS-RPT: All email-sending domains must have TLS reporting enabled.
    SPF: Must exist and end with -all.
    DKIM: All outbound email from every sending service must be DKIM-signed at the final hop.
    DMARC: Policy of p=reject on all email-enabled domains. adkim=s is recommended when not bulk-sending.
    MTA-STS: Enabled and set to enforce.
    Implicit TLS: Must be configured and enforced for every connection.
    Data Loss Prevention: Enforce in line with the New Zealand Information Security Manual (NZISM) and Protective Security Requirements.
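    As a rough illustration of what checking two of these mandates looks like, the sketch below tests that an SPF record ends with a hard fail and that a DMARC policy is set to reject. The helper names and domain are illustrative, and the records are supplied as plain strings here rather than fetched from DNS.

```python
# A minimal self-check sketch for two of the mandated controls above.
# Helper names and the example domain are illustrative only.

def check_spf(record: str) -> bool:
    """SPF must exist and end with a hard fail ('-all')."""
    return record.startswith("v=spf1") and record.rstrip().endswith("-all")

def check_dmarc(record: str) -> bool:
    """DMARC policy must be p=reject on all email-enabled domains."""
    tags = dict(
        part.strip().split("=", 1)
        for part in record.split(";")
        if "=" in part
    )
    return tags.get("v") == "DMARC1" and tags.get("p") == "reject"

print(check_spf("v=spf1 include:_spf.example.govt.nz -all"))  # True
print(check_dmarc("v=DMARC1; p=reject; adkim=s"))             # True
print(check_spf("v=spf1 ~all"))                               # False: soft fail not permitted
```

    A soft fail (~all) or a p=quarantine policy would fail these checks, which mirrors how the framework's monitoring treats such records as non-compliant.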
    Compliance Monitoring and Reporting
    The All of Government Service Deliveryteam will be monitoring compliance with the framework. Monitoring will initially cover SPF, DMARC, and MTA-STS settings and will be expanded to include DKIM. Changes to these settings will be monitored, enabling reporting on email security compliance across all government agencies. Ongoing monitoring will highlight changes to domains, ensure new domains are set up with security in place, and monitor the implementation of future email security technologies. 
    Should compliance changes occur, such as an agency’s SPF record being changed from -all to ~all, this will be captured so that the AoGSD Security Team can investigate. They will then communicate directly with the agency to determine if an issue exists or if an error has occurred, reviewing each case individually.
    Deployment Checklist for NZ Government Compliance

    Enforce TLS 1.2 minimum, implicit TLS, MTA-STS & TLS-RPT
    SPF with -all
    DKIM on all outbound email
    DMARC p=reject 
    adkim=s where suitable
    For non-email/parked domains: SPF -all, empty DKIM, DMARC reject strict
    Compliance dashboard
    Inbound DMARC evaluation enforced
    DLP aligned with NZISM


    How EasyDMARC Can Help Government Agencies Comply
    EasyDMARC provides a comprehensive email security solution that simplifies the deployment and ongoing management of DNS-based email security protocols like SPF, DKIM, and DMARC with reporting. Our platform offers automated checks, real-time monitoring, and a guided setup to help government organizations quickly reach compliance.
    1. TLS-RPT / MTA-STS audit
    EasyDMARC lets you enable the Managed MTA-STS and TLS-RPT option with a single click. We provide the required DNS records and continuously monitor them for issues, delivering reports on TLS negotiation problems. This helps agencies ensure secure email transmission and quickly detect delivery or encryption failures.

    Note: In this screenshot, you can see how to deploy MTA-STS and TLS Reporting by adding just three CNAME records provided by EasyDMARC. It’s recommended to start in “testing” mode, evaluate the TLS-RPT reports, and then gradually switch your MTA-STS policy to “enforce”. The process is simple and takes just a few clicks.

    As shown above, EasyDMARC parses incoming TLS reports into a centralized dashboard, giving you clear visibility into delivery and encryption issues across all sending sources.
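    The policy file behind MTA-STS is also easy to inspect by hand. As a minimal sketch (standard library only, placeholder domain), the snippet below parses a policy of the kind served at https://mta-sts.&lt;your-domain&gt;/.well-known/mta-sts.txt and confirms its mode is set to enforce:

```python
def parse_mta_sts(text: str) -> dict:
    """Parse the key/value lines of an MTA-STS policy file.
    Keys may repeat (e.g. mx), so values are collected into lists."""
    policy = {}
    for line in text.splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            policy.setdefault(key.strip(), []).append(value.strip())
    return policy

# Illustrative policy; the mx host is a placeholder.
EXAMPLE_POLICY = """\
version: STSv1
mode: enforce
mx: mail.example.govt.nz
max_age: 86400
"""

policy = parse_mta_sts(EXAMPLE_POLICY)
print(policy["mode"] == ["enforce"])  # True
```

    A policy still in "testing" mode would fail this check, which is exactly the transition the framework expects agencies to complete.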
    2. SPF with “-all”
    In the EasyDMARC platform, you can run the SPF Record Generator to create a compliant record. Publish your v=spf1 record with “-all” to enforce a hard fail for unauthorized senders and prevent spoofed emails from passing SPF checks. This strengthens your domain’s protection against impersonation.

    Note: It is highly recommended to start adjusting your SPF record only after you begin receiving DMARC reports and identifying your legitimate email sources. As we’ll explain in more detail below, both SPF and DKIM should be adjusted after you gain visibility through reports.
    Making changes without proper visibility can lead to false positives, misconfigurations, and potential loss of legitimate emails. That’s why the first step should always be setting DMARC to p=none, receiving reports, analyzing them, and then gradually fixing any SPF or DKIM issues.
    3. DKIM on all outbound email
    DKIM must be configured for all email sources sending emails on behalf of your domain. This is critical, as DKIM plays a bigger role than SPF when it comes to building domain reputation, surviving auto-forwarding, mailing lists, and other edge cases.
    As mentioned above, DMARC reports provide visibility into your email sources, allowing you to implement DKIM accordingly. If you’re using third-party services like Google Workspace, Microsoft 365, or Mimecast, you’ll need to retrieve the public DKIM key from your provider’s admin interface.
    EasyDMARC maintains a backend directory of over 1,400 email sources. We also give you detailed guidance on how to configure SPF and DKIM correctly for major ESPs. 
    Note: At the end of this article, you’ll find configuration links for well-known ESPs like Google Workspace, Microsoft 365, Zoho Mail, Amazon SES, and SendGrid – helping you avoid common misconfigurations and get aligned with SGE requirements.
    If you’re using a dedicated MTA, DKIM must be implemented manually. EasyDMARC’s DKIM Record Generator lets you generate both public and private keys for your server. The private key is stored on your MTA, while the public key must be published in your DNS.

    4. DMARC p=reject rollout
    As mentioned in previous points, DMARC reporting is the first and most important step on your DMARC enforcement journey. Always start with a p=none policy and configure RUA reports to be sent to EasyDMARC. Use the report insights to identify and fix SPF and DKIM alignment issues, then gradually move to p=quarantine and finally p=reject once all legitimate email sources have been authenticated. 
    This phased approach ensures full protection against domain spoofing without risking legitimate email delivery.
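    As a sketch, the progression amounts to three successive DMARC TXT values; the reporting address below is a placeholder, not a real mailbox:

```python
# Sketch of the phased DMARC rollout described above.
PHASES = ("none", "quarantine", "reject")

def dmarc_record(policy: str, rua: str = "mailto:dmarc@example.govt.nz") -> str:
    """Build the DMARC TXT value for one rollout phase."""
    if policy not in PHASES:
        raise ValueError(f"unknown DMARC policy: {policy}")
    return f"v=DMARC1; p={policy}; rua={rua}"

for phase in PHASES:
    print(dmarc_record(phase))
```

    Each printed value replaces the previous one in DNS only after the reports for the current phase show all legitimate sources aligned.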

    5. adkim Strict Alignment Check
    This strict alignment check is not always applicable, especially if you’re using third-party bulk ESPs, such as SendGrid, that require you to set DKIM at the subdomain level. You can set adkim=s in your DMARC TXT record, or simply enable strict mode in EasyDMARC’s Managed DMARC settings. This ensures that only emails with a DKIM signature that exactly matches your domain pass alignment, adding an extra layer of protection against domain spoofing. Only do this if you are not a bulk sender.

    6. Securing Non-Email Enabled Domains
    The purpose of deploying email security to non-email-enabled domains, or parked domains, is to prevent messages from being spoofed from those domains. This requirement remains even if the root-level domain has sp=reject set within its DMARC record.
    Under this new framework, you must bulk import and mark parked domains as “Parked.” Crucially, this requires adjusting SPF settings to an empty record, setting DMARC to p=reject, and ensuring an empty DKIM record is in place:
    • SPF record: “v=spf1 -all”.
    • Wildcard DKIM record with empty public key.
    • DMARC record: “v=DMARC1;p=reject;adkim=s;aspf=s;rua=mailto:…”.
    EasyDMARC allows you to add and label parked domains for free. This is important because it helps you monitor any activity from these domains and ensure they remain protected with a strict DMARC policy of p=reject.
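    The parked-domain record set described above can be kept as a small template. This sketch uses illustrative values; the empty p= tag is what publishes a DKIM record with no usable public key:

```python
# Record template for a parked (non-email) domain, following the
# requirements quoted above. Values are illustrative.
PARKED_DOMAIN_RECORDS = {
    "SPF":   "v=spf1 -all",                         # no host may send mail
    "DKIM":  "v=DKIM1; p=",                         # wildcard selector, empty public key
    "DMARC": "v=DMARC1; p=reject; adkim=s; aspf=s", # reject with strict alignment
}

def is_parked_spf(record: str) -> bool:
    """A parked domain's SPF must authorize nothing at all."""
    return record.strip() == "v=spf1 -all"

print(is_parked_spf(PARKED_DOMAIN_RECORDS["SPF"]))  # True
```

    Publishing the same three values on every parked domain keeps them from being usable in spoofing campaigns even though they never send mail.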
    7. Compliance Dashboard
    Use EasyDMARC’s Domain Scanner to assess the security posture of each domain with a clear compliance score and risk level. The dashboard highlights configuration gaps and guides remediation steps, helping government agencies stay on track toward full compliance with the SGE Framework.

    8. Inbound DMARC Evaluation Enforced
    You don’t need to apply any changes if you’re using Google Workspace, Microsoft 365, or other major mailbox providers. Most of them already enforce DMARC evaluation on incoming emails.
    However, some legacy Microsoft 365 setups may still quarantine emails that fail DMARC checks, even when the sending domain has a p=reject policy, instead of rejecting them. This behavior can be adjusted directly from your Microsoft Defender portal. Read more about this in our step-by-step guide on how to set up SPF, DKIM, and DMARC from Microsoft Defender.
    If you’re using a third-party mail provider that doesn’t enforce having a DMARC policy for incoming emails, which is rare, you’ll need to contact their support to request a configuration change.
    9. Data Loss Prevention Aligned with NZISM
    The New Zealand Information Security Manual (NZISM) is the New Zealand Government’s manual on information assurance and information systems security. It includes guidance on data loss prevention, which must be followed to stay aligned with the SGE Framework.
    Need Help Setting up SPF and DKIM for your Email Provider?
    Setting up SPF and DKIM for different ESPs often requires specific configurations. Some providers require you to publish SPF and DKIM on a subdomain, while others only require DKIM, or have different formatting rules. We’ve simplified all these steps to help you avoid misconfigurations that could delay your DMARC enforcement, or worse, block legitimate emails from reaching your recipients.
    Below you’ll find comprehensive setup guides for Google Workspace, Microsoft 365, Zoho Mail, Amazon SES, and SendGrid. You can also explore our full blog section that covers setup instructions for many other well-known ESPs.
    Remember, all this information is reflected in your DMARC aggregate reports. These reports give you live visibility into your outgoing email ecosystem, helping you analyze and fix any issues specific to a given provider.
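    To make those reports concrete: a DMARC aggregate (RUA) report is an XML file listing, per sending IP, how many messages were seen and whether they passed DKIM and SPF. The sketch below parses a trimmed-down, made-up report with Python's standard library; real reports arrive compressed via email and contain many more fields:

```python
# Sketch: summarize a DMARC aggregate (RUA) report using only the
# standard library. The sample below is a hypothetical, heavily
# trimmed report body with two sending sources.
import xml.etree.ElementTree as ET

SAMPLE_REPORT = """<?xml version="1.0"?>
<feedback>
  <record>
    <row>
      <source_ip>192.0.2.10</source_ip>
      <count>42</count>
      <policy_evaluated><dkim>pass</dkim><spf>pass</spf></policy_evaluated>
    </row>
  </record>
  <record>
    <row>
      <source_ip>198.51.100.7</source_ip>
      <count>5</count>
      <policy_evaluated><dkim>fail</dkim><spf>fail</spf></policy_evaluated>
    </row>
  </record>
</feedback>"""

def summarize(report_xml: str) -> dict[str, dict]:
    """Map each source IP to its message count and DKIM/SPF verdicts."""
    summary = {}
    for row in ET.fromstring(report_xml).iter("row"):
        summary[row.findtext("source_ip")] = {
            "count": int(row.findtext("count")),
            "dkim": row.findtext("policy_evaluated/dkim"),
            "spf": row.findtext("policy_evaluated/spf"),
        }
    return summary
```

    Running summarize over incoming reports is how unauthenticated sources (like the failing IP above) get spotted before you tighten your DMARC policy.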
    Here are our step-by-step guides for the most common platforms:

    Google Workspace

    Microsoft 365

    These guides will help ensure your DNS records are configured correctly as part of the Secure Government Email (SGE) Framework rollout.
    Meet New Government Email Security Standards With EasyDMARC
    New Zealand’s SGE Framework sets a clear path for government agencies to enhance their email security by October 2025. With EasyDMARC, you can meet these technical requirements efficiently and with confidence. From protocol setup to continuous monitoring and compliance tracking, EasyDMARC streamlines the entire process, ensuring strong protection against spoofing, phishing, and data loss while simplifying your transition from SEEMail.
  • NVIDIA helps Germany lead Europe’s AI manufacturing race

    Germany and NVIDIA are building possibly the most ambitious European tech project of the decade: the continent’s first industrial AI cloud.

    NVIDIA has been on a European tour over the past month, with CEO Jensen Huang charming audiences at London Tech Week before dazzling the crowds at Paris’s VivaTech. But it was his meeting with German Chancellor Friedrich Merz that might prove the most consequential stop. The resulting partnership between NVIDIA and Deutsche Telekom isn’t just another corporate handshake; it’s potentially a turning point for European technological sovereignty.

    An “AI factory” will be created with a focus on manufacturing, which is hardly surprising given Germany’s renowned industrial heritage. The facility aims to give European industrial players the computational firepower to revolutionise everything from design to robotics.

    “In the era of AI, every manufacturer needs two factories: one for making things, and one for creating the intelligence that powers them,” said Huang. “By building Europe’s first industrial AI infrastructure, we’re enabling the region’s leading industrial companies to advance simulation-first, AI-driven manufacturing.”

    It’s rare to hear such urgency from a telecoms CEO, but Deutsche Telekom’s Timotheus Höttges added: “Europe’s technological future needs a sprint, not a stroll. We must seize the opportunities of artificial intelligence now, revolutionise our industry, and secure a leading position in the global technology competition. Our economic success depends on quick decisions and collaborative innovations.”

    The first phase alone will deploy 10,000 NVIDIA Blackwell GPUs spread across various high-performance systems, making this Germany’s largest AI deployment ever; a statement that the country isn’t content to watch from the sidelines as AI transforms global industry. A Deloitte study recently highlighted the critical importance of AI technology development to Germany’s future competitiveness, particularly noting the need for expanded data centre capacity. When you consider that demand is expected to triple within just five years, this investment seems less like ambition and more like necessity.

    Robots teaching robots

    One of the early adopters is NEURA Robotics, a German firm that specialises in cognitive robotics. They’re using this computational muscle to power something called the Neuraverse, which is essentially a connected network where robots can learn from each other. Think of it as a robotic hive mind for skills ranging from precision welding to household ironing, with each machine contributing its learnings to a collective intelligence.

    “Physical AI is the electricity of the future—it will power every machine on the planet,” said David Reger, Founder and CEO of NEURA Robotics. “Through this initiative, we’re helping build the sovereign infrastructure Europe needs to lead in intelligent robotics and stay in control of its future.”

    The implications of this AI project for manufacturing in Germany could be profound. This isn’t just about making existing factories slightly more efficient; it’s about reimagining what manufacturing can be in an age of intelligent machines.

    AI for more than just Germany’s industrial titans

    What’s particularly promising about this project is its potential reach beyond Germany’s industrial titans. The famed Mittelstand – the network of specialised small and medium-sized businesses that forms the backbone of the German economy – stands to benefit. These companies often lack the resources to build their own AI infrastructure but possess the specialised knowledge that makes them perfect candidates for AI-enhanced innovation. Democratising access to cutting-edge AI could help preserve their competitive edge in a challenging global market.

    Academic and research institutions will also gain access, potentially accelerating innovation across numerous fields. The approximately 900 Germany-based startups in NVIDIA’s Inception program will be eligible to use these resources, potentially unleashing a wave of entrepreneurial AI applications.

    However impressive this massive project is, it’s viewed merely as a stepping stone towards something even more ambitious: Europe’s AI gigafactory. This planned 100,000 GPU-powered initiative, backed by the EU and Germany, won’t come online until 2027, but it represents Europe’s determination to carve out its own technological future. As other European telecom providers follow suit with their own AI infrastructure projects, we may be witnessing the beginning of a concerted effort to establish technological sovereignty across the continent.

    For a region that has often found itself caught between American tech dominance and Chinese ambitions, building indigenous AI capability represents more than economic opportunity. Whether this bold project in Germany will succeed remains to be seen, but one thing is clear: Europe is no longer content to be a passive consumer of AI technology developed elsewhere.

    Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including Intelligent Automation Conference, BlockX, Digital Transformation Week, and Cyber Security & Cloud Expo. Explore other upcoming enterprise technology events and webinars powered by TechForge here.
    WWW.ARTIFICIALINTELLIGENCE-NEWS.COM
  • Sienna Net-Zero Home / billionBricks

    Sienna Net-Zero Home / billionBricks
    © Ron Mendoza, Mark Twain C, BB team
    Houses, Sustainability • Quezon City, Philippines

    Architects: billionBricks
    Area: 45 m²
    Year: 2024
    Photographs: Ron Mendoza, Mark Twain C, BB team
    Text description provided by the architects. Built to address homelessness and climate change, the Sienna Net-Zero Home is a self-sustaining, solar-powered, cost-efficient, and compact housing solution. This climate-responsive and affordable home, located in Quezon City, Philippines, represents a revolutionary vision for social housing through its integration of thoughtful design, sustainability, and energy self-sufficiency.

    Designed with the unique tropical climate of the Philippines in mind, the Sienna Home prioritizes natural ventilation, passive cooling, and rainwater management to enhance indoor comfort and reduce reliance on artificial cooling systems. The compact 4.5m x 5.1m floor plan has been meticulously optimized for functionality, offering a flexible layout that grows and adapts with the families living in it.

    A key architectural feature is billionBricks' innovative PowerShade technology, an advanced solar roofing system that serves multiple purposes. Beyond generating clean, renewable energy, it acts as a protective heat barrier, reducing indoor temperatures and improving thermal comfort. Unlike conventional solar panels, PowerShade seamlessly integrates with the home's structure, providing reliable energy generation while doubling as a durable roof. This makes the Sienna Home energy-positive, meaning it produces more electricity than it consumes, lowering utility costs and promoting long-term energy independence. Excess power can also be stored or sold back to the grid, creating an additional financial benefit for homeowners.

    When multiple Sienna Homes are built together, the PowerShade roofing solution transcends its role as an individual energy source and transforms into a utility-scale solar rooftop farm, capable of powering essential community facilities and generating additional income. This shared energy infrastructure fosters a sense of collective empowerment, enabling residents to actively participate in a sustainable and financially rewarding energy ecosystem.

    The Sienna Home is built using lightweight prefabricated components, allowing for rapid on-site assembly while maintaining durability and structural integrity. This modular approach enables scalability, making it an ideal prototype for large-scale, cost-effective housing developments. The design also allows for future expansions, giving homeowners the flexibility to adapt their living spaces over time.

    Adhering to BP 220 social housing regulations, the unit features a 3-meter front setback and a 2-meter rear setback, ensuring proper ventilation, safety, and community-friendly spaces. Additionally, corner units include a 1.5-meter offset, enhancing privacy and accessibility within neighborhood layouts. Beyond providing a single-family residence, the Sienna Home is designed to function within a larger sustainable community model, integrating shared green spaces, pedestrian pathways, and decentralized utilities. By promoting energy independence and environmental resilience, the project sets a new precedent for affordable yet high-quality housing solutions in rapidly urbanizing regions.

    The Sienna Home in Quezon City serves as a blueprint for future developments, proving that low-cost housing can be both architecturally compelling and socially transformative. By rethinking traditional housing models, billionBricks is pioneering a future where affordability and sustainability are seamlessly integrated.

    Published on June 15, 2025. Cite: "Sienna Net-Zero Home / billionBricks" 14 Jun 2025. ArchDaily. <https://www.archdaily.com/1031072/sienna-billionbricks> ISSN 0719-8884
  • From Networks to Business Models, AI Is Rewiring Telecom

    Artificial intelligence is already rewriting the rules of wireless and telecom — powering predictive maintenance, streamlining network operations, and enabling more innovative services.
    As AI scales, the disruption will be faster, deeper, and harder to reverse than any prior shift in the industry.
    Compared to the sweeping changes AI is set to unleash, past telecom innovations look incremental.
    AI is redefining how networks operate, services are delivered, and data is secured — across every device and digital touchpoint.
    AI Is Reshaping Wireless Networks Already
    Artificial intelligence is already transforming wireless through smarter private networks, fixed wireless access, and intelligent automation across the stack.
    AI detects and resolves network issues before they impact service, improving uptime and customer satisfaction. It’s also opening the door to entirely new revenue streams and business models.
    Each wireless generation brought new capabilities. AI, however, marks a more profound shift — networks that think, respond, and evolve in real time.
    AI Acceleration Will Outpace Past Tech Shifts
    Many may underestimate the speed and magnitude of AI-driven change.
    The shift from traditional voice and data systems to AI-driven network intelligence is already underway.
    Although predictions abound, the true scope remains unclear.
    It’s tempting to assume we understand AI’s trajectory, but history suggests otherwise.

    Today, AI is already automating maintenance and optimizing performance without user disruption. The technologies we’ll rely on in the near future may still be on the drawing board.
    Few predicted that smartphones would emerge from analog beginnings — a reminder of how quickly foundational technologies can be reimagined.
    History shows that disruptive technologies rarely follow predictable paths — and AI is no exception. It’s already upending business models across industries.
    Technological shifts bring both new opportunities and complex trade-offs.
    AI Disruption Will Move Faster Than Ever
    The same cycle of reinvention is happening now — but with AI, it’s moving at unprecedented speed.
    Despite all the discussion, many still treat AI as a future concern — yet the shift is already well underway.
    As with every major technological leap, there will be gains and losses. The AI transition brings clear trade-offs: efficiency and innovation on one side; job displacement and privacy erosion on the other.
    Unlike past tech waves that unfolded over decades, the AI shift will reshape industries in just a few years — and that wave of change shows no sign of slowing.
    AI Will Reshape All Sectors and Companies
    This shift will unfold faster than most organizations or individuals are prepared to handle.
    Today’s industries will likely look very different tomorrow. Entirely new sectors will emerge as legacy models become obsolete — redefining market leadership across industries.
    Telecom’s past holds a clear warning: market dominance can vanish quickly when companies ignore disruption.
    After the 1984 breakup of the Bell System, the Baby Bells eventually moved into long-distance service, while AT&T remained barred from selling local access — undermining its advantage.
    As the market shifted and competitors gained ground, AT&T lost its dominance and became vulnerable enough that SBC, a former regional Bell, acquired it and took on its name.

    It’s a case study of how incumbents fall when they fail to adapt — precisely the kind of pressure AI is now exerting across industries.
    SBC’s acquisition of AT&T flipped the power dynamic — proof that size doesn’t protect against disruption.
    The once-crowded telecom field has consolidated into just a few dominant players — each facing new threats from AI-native challengers.
    Legacy telecom models are being steadily displaced by faster, more flexible wireless, broadband, and streaming alternatives.
    No Industry Is Immune From AI Disruption
    AI will accelerate the next wave of industrial evolution — bringing innovations and consequences we’re only beginning to grasp.
    New winners will emerge as past leaders struggle to hang on — a shift that will also reshape the investment landscape. Startups leveraging AI will likely redefine leadership in sectors where incumbents have grown complacent.
    Nvidia’s rise is part of a broader trend: the next market leaders will emerge wherever AI creates a clear competitive advantage — whether in chips, code, or entirely new markets.
    The AI-driven future is arriving faster than most organizations are ready for. Adapting to this accelerating wave of change is no longer optional — it’s essential. Companies that act decisively today will define the winners of tomorrow.
    #networks #business #models #rewiring #telecom
    From Networks to Business Models, AI Is Rewiring Telecom
    Artificial intelligence is already rewriting the rules of wireless and telecom — powering predictive maintenance, streamlining network operations, and enabling more innovative services. As AI scales, the disruption will be faster, deeper, and harder to reverse than any prior shift in the industry; compared to the sweeping changes it is set to unleash, past telecom innovations look incremental. AI is redefining how networks operate, how services are delivered, and how data is secured — across every device and digital touchpoint.

    AI Is Reshaping Wireless Networks Already

    AI is already transforming wireless through smarter private networks, fixed wireless access (FWA), and intelligent automation across the stack. It detects and resolves network issues before they impact service, improving uptime and customer satisfaction, and it is opening the door to entirely new revenue streams and business models. Each wireless generation brought new capabilities; AI marks a more profound shift — networks that think, respond, and evolve in real time.

    AI Acceleration Will Outpace Past Tech Shifts

    Many underestimate the speed and magnitude of AI-driven change. The shift from traditional voice and data systems to AI-driven network intelligence is already underway: AI is automating maintenance and optimizing performance without user disruption. Although predictions abound, the true scope remains unclear. It's tempting to assume we understand AI's trajectory, but history suggests otherwise — the technologies we'll rely on in the near future may still be on the drawing board. Few predicted that smartphones would emerge from analog beginnings, a reminder of how quickly foundational technologies can be reimagined. Disruptive technologies rarely follow predictable paths, and AI is no exception: it is already upending business models across industries, bringing both new opportunities and complex trade-offs.

    AI Disruption Will Move Faster Than Ever

    The same cycle of reinvention is happening now — but with AI, it's moving at unprecedented speed. Despite all the discussion, many still treat AI as a future concern, yet the shift is already well underway. As with every major technological leap, there will be gains and losses: efficiency and innovation on one side, job displacement and privacy erosion on the other. Unlike past tech waves that unfolded over decades, the AI shift will reshape industries in just a few years — and that wave of change will only keep moving forward.

    AI Will Reshape All Sectors and Companies

    This shift will unfold faster than most organizations or individuals are prepared to handle. Today's industries will likely look very different tomorrow: entirely new sectors will emerge as legacy models become obsolete, redefining market leadership. Telecom's own past holds a clear warning — market dominance can vanish quickly when companies ignore disruption. After the breakup of the Bell System, the Baby Bells eventually moved into long-distance service while AT&T remained barred from selling local access, undermining its advantage. As the market shifted and competitors gained ground, AT&T lost its dominance and became vulnerable enough that SBC, a former regional Bell, acquired it and took on its name — proof that size doesn't protect against disruption, and a case study of how incumbents fall when they fail to adapt. That is precisely the kind of pressure AI is now exerting across industries. The once-crowded telecom field has consolidated into just a few dominant players, each facing new threats from AI-native challengers, while legacy telecom models are steadily displaced by faster, more flexible wireless, broadband, and streaming alternatives.

    No Industry Is Immune From AI Disruption

    AI will accelerate the next wave of industrial evolution, bringing innovations and consequences we're only beginning to grasp. New winners will emerge as past leaders struggle to hang on — a shift that will also reshape the investment landscape. Startups leveraging AI will likely redefine leadership in sectors where incumbents have grown complacent. Nvidia's rise is part of a broader trend: the next market leaders will emerge wherever AI creates a clear competitive advantage, whether in chips, code, or entirely new markets. The AI-driven future is arriving faster than most organizations are ready for. Adapting to this accelerating wave of change is no longer optional — it's essential. Companies that act decisively today will define the winners of tomorrow.
  • Biofuels policy has been a failure for the climate, new report claims

    Fewer food crops

    Report: An expansion of biofuels policy under Trump would lead to more greenhouse gas emissions.

    Georgina Gustin, Inside Climate News



    Jun 14, 2025 7:10 am

    An ethanol production plant on March 20, 2024 near Ravenna, Nebraska.

    Credit:

    David Madison/Getty Images


    This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy, and the environment. Sign up for their newsletter here.
    The American Midwest is home to some of the richest, most productive farmland in the world, enabling its transformation into a vast corn- and soy-producing machine—a conversion spurred largely by decades-long policies that support the production of biofuels.
    But a new report takes a big swing at the ethanol orthodoxy of American agriculture, criticizing the industry for causing economic and social imbalances across rural communities and saying that the expansion of biofuels will increase greenhouse gas emissions, despite their purported climate benefits.
    The report, from the World Resources Institute, which has been critical of US biofuel policy in the past, draws from 100 academic studies on biofuel impacts. It concludes that ethanol policy has been largely a failure and ought to be reconsidered, especially as the world needs more land to produce food to meet growing demand.
    “Multiple studies show that US biofuel policies have reshaped crop production, displacing food crops and driving up emissions from land conversion, tillage, and fertilizer use,” said the report’s lead author, Haley Leslie-Bole. “Corn-based ethanol, in particular, has contributed to nutrient runoff, degraded water quality and harmed wildlife habitat. As climate pressures grow, increasing irrigation and refining for first-gen biofuels could deepen water scarcity in already drought-prone parts of the Midwest.”
    The conversion of Midwestern agricultural land has been sweeping. Between 2004 and 2024, ethanol production increased by nearly 500 percent. Corn and soybeans are now grown on 92 and 86 million acres of land respectively—and roughly a third of those crops go to produce ethanol. That means about 30 million acres of land that could be used to grow food crops are instead being used to produce ethanol, despite ethanol only accounting for 6 percent of the country’s transportation fuel.
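    The acreage figure follows directly from the numbers above and can be sanity-checked with quick arithmetic (an illustrative check, not a calculation from the report itself):

    ```python
    # Figures from the article: acres planted, in millions
    corn_acres = 92
    soy_acres = 86

    # Roughly a third of the corn crop goes to ethanol
    corn_for_ethanol = corn_acres / 3
    print(round(corn_for_ethanol, 1))  # roughly 30 million acres, matching the article
    ```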

    The biofuels industry—which includes refiners, corn and soy growers and the influential agriculture lobby writ large—has long insisted that corn- and soy-based biofuels provide an energy-efficient alternative to fossil-based fuels. Congress and the US Department of Agriculture have agreed.
    The country’s primary biofuels policy, the Renewable Fuel Standard, requires that biofuels provide a greenhouse gas reduction over fossil fuels: The law says that ethanol from new plants must deliver a 20 percent reduction in greenhouse gas emissions compared to gasoline.
    In addition to greenhouse gas reductions, the industry and its allies in Congress have also continued to say that ethanol is a primary mainstay of the rural economy, benefiting communities across the Midwest.
    But a growing body of research—much of which the industry has tried to debunk and deride—suggests that ethanol actually may not provide the benefits that policies require. It may, in fact, produce more greenhouse gases than the fossil fuels it was intended to replace. Recent research says that biofuel refiners also emit significant amounts of carcinogenic and dangerous substances, including hexane and formaldehyde, in greater amounts than petroleum refineries.
    The new report points to research saying that increased production of biofuels from corn and soy could actually raise greenhouse gas emissions, largely from carbon emissions linked to clearing land in other countries to compensate for the use of land in the Midwest.
    On top of that, corn is an especially fertilizer-hungry crop requiring large amounts of nitrogen-based fertilizer, which releases huge amounts of nitrous oxide when it interacts with the soil. American farming is, by far, the largest source of domestic nitrous oxide emissions already—about 50 percent. If biofuel policies lead to expanded production, emissions of this enormously powerful greenhouse gas will likely increase, too.

    The new report concludes that not only will the expansion of ethanol increase greenhouse gas emissions, but it has also failed to provide the social and financial benefits to Midwestern communities that lawmakers and the industry say it has. (The report defines the Midwest as Illinois, Indiana, Iowa, Kansas, Michigan, Minnesota, Missouri, Nebraska, North Dakota, Ohio, South Dakota, and Wisconsin.) “The benefits from biofuels remain concentrated in the hands of a few,” Leslie-Bole said. “As subsidies flow, so may the trend of farmland consolidation, increasing inaccessibility of farmland in the Midwest, and locking out emerging or low-resource farmers. This means the benefits of biofuels production are flowing to fewer people, while more are left bearing the costs.”
    New policies being considered in state legislatures and Congress, including additional tax credits and support for biofuel-based aviation fuel, could expand production, potentially causing more land conversion and greenhouse gas emissions, widening the gap between the rural communities and rich agribusinesses at a time when food demand is climbing and, critics say, land should be used to grow food instead.
    President Donald Trump’s tax cut bill, passed by the House and currently being negotiated in the Senate, would not only extend tax credits for biofuels producers, it specifically excludes calculations of emissions from land conversion when determining what qualifies as a low-emission fuel.
    The primary biofuels industry trade groups, including Growth Energy and the Renewable Fuels Association, did not respond to Inside Climate News requests for comment or interviews.
    An employee with the Clean Fuels Alliance America, which represents biodiesel and sustainable aviation fuel producers, not ethanol, said the report vastly overstates the carbon emissions from crop-based fuels by comparing the farmed land to natural landscapes, which no longer exist.
    They also noted that the economic impact of soy-based fuels in 2024 was more than $42 billion, providing over 100,000 jobs.
    “Ten percent of the value of every bushel of soybeans is linked to biomass-based fuel,” they said.

    Georgina Gustin, Inside Climate News

    ARSTECHNICA.COM
  • A shortage of high-voltage power cables could stall the clean energy transition

    In a nutshell: As nations set ever more ambitious targets for renewable energy and electrification, the humble high-voltage cable has emerged as a linchpin – and a potential chokepoint – in the race to decarbonize the global economy. A Bloomberg interview with Claes Westerlind, CEO of NKT, a leading cable manufacturer based in Denmark, explains why.
    A global surge in demand for high-voltage electricity cables is threatening to stall the clean energy revolution, as the world's ability to build new wind farms, solar plants, and cross-border power links increasingly hinges on a supply chain bottleneck few outside the industry have considered. At the center of this challenge is the complex, capital-intensive process of manufacturing the giant cables that transport electricity across hundreds of miles, both over land and under the sea.
    Despite soaring demand, cable manufacturers remain cautious about expanding capacity, raising questions about whether the pace of electrification can keep up with climate ambitions, geopolitical tensions, and the practical realities of industrial investment.
    High-voltage cables are the arteries of modern power grids, carrying electrons from remote wind farms or hydroelectric dams to the cities and industries that need them. Unlike the thin wires that run through a home's walls, these cables are engineering marvels – sometimes as thick as a person's torso, armored to withstand the crushing pressure of the ocean floor, and designed to last for decades under extreme electrical and environmental stress.

    "If you look at the very high voltage direct current cable, able to carry roughly two gigawatts through two pairs of cables – that means that the equivalent of one nuclear power reactor is flowing through one cable," Westerlind told Bloomberg.
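    Westerlind's figure is easy to verify with a back-of-envelope calculation, assuming a modern bipolar HVDC link at roughly ±525 kV and about 1,900 A per conductor (both values are illustrative assumptions, not from the interview):

    ```python
    # Assumed parameters of a modern HVDC bipole (illustrative, not from the interview)
    pole_voltage_v = 525e3   # pole-to-ground voltage, in volts
    current_a = 1900         # conductor current, in amperes

    # A bipole delivers power over both poles: P = 2 * V * I
    power_gw = 2 * pole_voltage_v * current_a / 1e9
    print(round(power_gw, 1))  # about 2 GW, consistent with "roughly two gigawatts"
    ```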
    The process of making these cables is as specialized as it is demanding. At the core is a conductor, typically made of copper or aluminum, twisted together like a rope for flexibility and strength. Around this, manufacturers apply multiple layers of insulation in towering vertical factories to ensure the cable remains perfectly round and can safely contain the immense voltages involved. Any impurity in the insulation, even something as small as an eyelash, can cause catastrophic failure, potentially knocking out power to entire cities.

    As the world rushes to harness new sources of renewable energy, the demand for high-voltage direct current (HVDC) cables has skyrocketed. HVDC technology, pioneered by NKT in the 1950s, has become the backbone of long-distance power transmission, particularly for offshore wind farms and intercontinental links. In recent years, approximately 80 to 90 percent of new large-scale cable projects have utilized HVDC, reflecting its efficiency in transmitting electricity over vast distances with minimal losses.

    But this surge in demand has led to a critical bottleneck. Factories that produce these cables are booked out for years, Westerlind reports, and every project requires custom engineering to match the power needs, geography, and environmental conditions of its route. According to the International Energy Agency, meeting global clean energy goals will require building the equivalent of 80 million kilometers of new grid infrastructure by 2040 – essentially doubling what has been constructed over the past century, but in just 15 years.
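    To put the IEA figure in perspective, a simple rate calculation (assuming the roughly 15-year window from 2025 to 2040 that the article cites):

    ```python
    target_km = 80e6          # IEA: 80 million km of new grid infrastructure by 2040
    years = 2040 - 2025       # the roughly 15-year window cited in the article
    km_per_year = target_km / years
    print(round(km_per_year / 1e6, 1))  # about 5.3 million km of new grid per year
    ```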
    Despite the clear need, cable makers have been slow to add capacity due to reasons that are as much economic and political as technical. Building a new cable factory can cost upwards of a billion euros, and manufacturers are wary of making such investments without long-term commitments from utilities or governments. "For a company like us to do investments in the realm of €1 or 2 billion, it's a massive commitment... but it's also a massive amount of demand that is needed for this investment to actually make financial sense over the next not five years, not 10 years, but over the next 20 to 30 years," Westerlind said. The industry still bears scars from a decade ago, when anticipated demand failed to materialize and expensive new facilities sat underused.
    Some governments and transmission system operators are trying to break the logjam by making "anticipatory investments" – committing to buy cable capacity even before specific projects are finalized. This approach, backed by regulators, gives manufacturers the confidence to expand, but it remains the exception rather than the rule.
    Meanwhile, the industry's structure itself creates barriers to rapid expansion, according to Westerlind. The expertise, technology, and infrastructure required to make high-voltage cables are concentrated in a handful of companies, creating what analysts describe as a "deep moat" that is difficult for new entrants to cross.
    Geopolitical tensions add another layer of complexity. China has built more HVDC lines than any other country, although Western manufacturers, such as NKT, maintain a technical edge in the most advanced cable systems. Still, there is growing concern in Europe and the US about becoming dependent on foreign suppliers for such critical infrastructure, especially in light of recent global conflicts and trade disputes. "Strategic autonomy is very important when it comes to the core parts and the fundamental parts of your society, where the grid backbone is one," Westerlind noted.
    The stakes are high. Without a rapid and coordinated push to expand cable manufacturing, the world's clean energy transition could be slowed not by a lack of wind or sun but by a shortage of the cables needed to connect them to the grid. As Westerlind put it, "We all know it has to be done... These are large investments. They are very expensive investments. So also the governments have to have a part in enabling these anticipatory investments, and making it possible for the TSOs to actually carry forward with them."
    A shortage of high-voltage power cables could stall the clean energy transition
    WWW.TECHSPOT.COM
  • Ansys: UX Designer II (Remote - US)

    Requisition #: 16391

    Our Mission: Powering Innovation That Drives Human Advancement
    When visionary companies need to know how their world-changing ideas will perform, they close the gap between design and reality with Ansys simulation. For more than 50 years, Ansys software has enabled innovators across industries to push boundaries by using the predictive power of simulation. From sustainable transportation to advanced semiconductors, from satellite systems to life-saving medical devices, the next great leaps in human advancement will be powered by Ansys. Innovate With Ansys, Power Your Career.

    Summary / Role Purpose
    The User Experience Designer II creates easy and delightful experiences for users interacting with Ansys products and services. The UX designer assesses the functional and content requirements of a product, develops storyboards, creates wireframes and task flows based on user needs, and produces visually detailed mockups. A passion for visual design and familiarity with UI trends and technologies are essential in this role, enabling the UX designer to bring fresh and innovative ideas to a project. This is an intermediate role, heavily focused on content production and communication; it exposes the UX professional to the nuts-and-bolts aspects of the UX career while building on the presentation, communication, and usability aspects of the design role.
    The User Experience Designer II will contribute to the development of a new web-based, collaborative solution for the ModelCenter and optiSLang product lines, built on an innovative modeling framework, modern web technologies, micro-services, and integrations with Ansys' core products. The role also contributes to the specification and design of user interactions and workflows for new features. The solution will be used by Ansys customers to design next-generation systems in the most innovative industries (Aerospace and Defense, Automotive, semiconductors, and others).
    Location: Can be 100% Remote within US

    Key Duties and Responsibilities
    • Designs, develops, and evaluates cutting-edge user interfaces
    • Reviews UX artifacts created by other UX team members
    • Utilizes prototyping tools and UX toolkits
    • Creates and delivers usability studies
    • Communicates design rationale across product creation disciplines and personnel
    • Records usability/UX problems with clear explanations and recommendations for improvement
    • Works closely with product managers, development teams, and other designers

    Minimum Education/Certification Requirements and Experience
    • BS or BA in Human-Computer Interaction, Design Engineering, or Industrial Design with 2 years' experience, or MS
    • Working experience with technical software development, proven by academic, research, or industry projects
    • Professional working proficiency in English

    Preferred Qualifications and Skills
    Experience with:
    • UX design and collaboration tools: Figma, Balsamiq, or similar
    • Tools and technologies for UI implementation: HTML, CSS, JavaScript, Angular, React
    • Screen-capture/editing/video-editing tools
    • Adobe Creative Suite
    Ability to:
    • Smoothly iterate on designs, taking direction, adjusting, and re-focusing toward a converged design
    • Organize deliverables for future reflection and current investigations
    • Communicate succinctly and professionally via email, chat, remote meetings, usability evaluations, etc.
    • Prototype rapidly using any tools available
    Knowledge of Model Based System Engineering (MBSE) or optimization is a plus.

    Culture and Values
    Culture and values are incredibly important to Ansys. They inform us of who we are and of how we act. Values aren't posters hanging on a wall or trite or glib slogans. They aren't about rules and regulations, and they can't just be handed down the organization. They are shared beliefs – guideposts that we all follow when we're facing a challenge or a decision. Our values tell us how we live our lives and how we approach our jobs.
    Our values are crucial for fostering a culture of winning for our company:
    • Customer focus
    • Results and Accountability
    • Innovation
    • Transparency and Integrity
    • Mastery
    • Inclusiveness
    • Sense of urgency
    • Collaboration and Teamwork

    At Ansys, we know that changing the world takes vision, skill, and each other. We fuel new ideas, build relationships, and help each other realize our greatest potential. We are ONE Ansys. We operate on three key components: our commitments to stakeholders, our values that guide how we work together, and our actions to deliver results. As ONE Ansys, we are powering innovation that drives human advancement.

    Our Commitments:
    • Amaze with innovative products and solutions
    • Make our customers incredibly successful
    • Act with integrity
    • Ensure employees thrive and shareholders prosper

    Our Values:
    • Adaptability: Be open, welcome what's next
    • Courage: Be courageous, move forward passionately
    • Generosity: Be generous, share, listen, serve
    • Authenticity: Be you, make us stronger

    Our Actions:
    • We commit to audacious goals
    • We work seamlessly as a team
    • We demonstrate mastery
    • We deliver outstanding results

    VALUES IN ACTION
    Ansys is committed to powering the people who power human advancement. We believe in creating and nurturing a workplace that supports and welcomes people of all backgrounds, encouraging them to bring their talents and experience to a workplace where they are valued and can thrive. Our culture is grounded in our four core values of adaptability, courage, generosity, and authenticity. Through our behaviors and actions, these values foster higher team performance and greater innovation for our customers. We're proud to offer programs, available to all employees, to further impact innovation and business outcomes, such as employee networks and learning communities that inform solutions for our globally minded customer base.
    WELCOME WHAT'S NEXT IN YOUR CAREER AT ANSYS
    At Ansys, you will find yourself among the sharpest minds and most visionary leaders across the globe. Collectively, we strive to change the world with innovative technology and transformational solutions. With a prestigious reputation for working with well-known, world-class companies, standards at Ansys are high – met by those willing to rise to the occasion and meet those challenges head on. Our team is passionate about pushing the limits of world-class simulation technology, empowering our customers to turn their design concepts into successful, innovative products faster and at a lower cost. Ready to feel inspired? Check out some of our recent customer stories. At Ansys, it's about the learning, the discovery, and the collaboration. It's about the "what's next" as much as the "mission accomplished." And it's about the melding of disciplined intellect with strategic direction and results that have, can, and do impact real people in real ways. All this is forged within a working environment built on respect, autonomy, and ethics.

    CREATING A PLACE WE'RE PROUD TO BE
    Ansys is an S&P 500 company and a member of the NASDAQ-100. We are proud to have been recognized for the following recent awards, although our list goes on: Newsweek's Most Loved Workplace globally and in the U.S., Gold Stevie Award Winner, America's Most Responsible Companies, Fast Company World Changing Ideas, and Great Place to Work Certified (China, Greece, France, India, Japan, Korea, Spain, Sweden, Taiwan, and the U.K.).
    Ansys is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other protected characteristics. Ansys does not accept unsolicited referrals for vacancies, and any unsolicited referral will become the property of Ansys. Upon hire, no fee will be owed to the agency, person, or entity.
    WEWORKREMOTELY.COM
    Ansys: UX Designer II (Remote - US)
    Requisition #: 16391 Our Mission: Powering Innovation That Drives Human Advancement When visionary companies need to know how their world-changing ideas will perform, they close the gap between design and reality with Ansys simulation. For more than 50 years, Ansys software has enabled innovators across industries to push boundaries by using the predictive power of simulation. From sustainable transportation to advanced semiconductors, from satellite systems to life-saving medical devices, the next great leaps in human advancement will be powered by Ansys. Innovate With Ansys, Power Your Career. Summary / Role Purpose The User Experience Designer II creates easy and delightful experiences for users interacting with ANSYS products and services. The UX designer assesses the functional and content requirements of a product, develops storyboards, creates wireframes and task flows based on user needs, and produces visually detailed mockups. A passion for visual design and familiarity with UI trends and technologies are essential in this role, enabling the UX designer to bring fresh and innovative ideas to a project. This is an intermediate role, heavily focused on content production and communication. It is intended to expose the UX professional to the nuts-and-bolts aspects of their UX career; while building on presentation, communication, and usability aspects of the design role. The User Experience Designer II will contribute to the development of a new web-based, collaborative solution for the ModelCenter and optiSLang product lines. This work will be based on an innovative modeling framework, modern web technologies, micro-services and integrations with Ansys' core products. The User Experience Designer II will contribute to the specification and design of user interactions and workflows for new features. 
The solution will be used by Ansys customers to design next generation systems in the most innovative industries (Aerospace and Defense, Automotive, semi-conductors, and others). Location: Can be 100% Remote within US Key Duties and Responsibilities Designs, develops, and evaluates cutting-edge user interfaces Reviews UX artifacts created by other UX team members Utilizes prototyping tools and UX toolkits Creates and delivers usability studies Communicates design rationale across product creation disciplines and personnel Records usability/UX problems with clear explanations and recommendations for improvement Works closely with product managers, development teams, and other designers Minimum Education/Certification Requirements and Experience BS or BA in Human-Computer Interaction, Design Engineering, or Industrial Design with 2 years' experience or MS Working experience with technical software development proven by academic, research, or industry projects. Professional working proficiency in English Preferred Qualifications and Skills Experience with: UX design and collaboration tools: Figma, Balsamiq or similar tools Tools & technologies for UI implementation: HTML, CSS, JavaScript, Angular, React Screen-capture/editing/video-editing tools Adobe Creative Suite Ability to: Smoothly iterate on designs, taking direction, adjusting, and re-focusing towards a converged design Organize deliverables for future reflection and current investigations Communicate succinctly and professionally via email, chat, remote meetings, usability evaluations, etc. Prototype rapidly using any tools available Knowledge of Model Based System Engineering (MBSE) or optimization is a plus Culture and Values Culture and values are incredibly important to ANSYS. They inform us of who we are, of how we act. Values aren't posters hanging on a wall or about trite or glib slogans. They aren't about rules and regulations. They can't just be handed down the organization. 
They are shared beliefs - guideposts that we all follow when we're facing a challenge or a decision. Our values tell us how we live our lives and how we approach our jobs. Our values are crucial for fostering a culture of winning for our company:
• Customer focus
• Results and Accountability
• Innovation
• Transparency and Integrity
• Mastery
• Inclusiveness
• Sense of urgency
• Collaboration and Teamwork

At Ansys, we know that changing the world takes vision, skill, and each other. We fuel new ideas, build relationships, and help each other realize our greatest potential. We are ONE Ansys. We operate on three key components: our commitments to stakeholders, our values that guide how we work together, and our actions to deliver results. As ONE Ansys, we are powering innovation that drives human advancement.

Our Commitments:
• Amaze with innovative products and solutions
• Make our customers incredibly successful
• Act with integrity
• Ensure employees thrive and shareholders prosper

Our Values:
• Adaptability: Be open, welcome what's next
• Courage: Be courageous, move forward passionately
• Generosity: Be generous, share, listen, serve
• Authenticity: Be you, make us stronger

Our Actions:
• We commit to audacious goals
• We work seamlessly as a team
• We demonstrate mastery
• We deliver outstanding results

VALUES IN ACTION
Ansys is committed to powering the people who power human advancement. We believe in creating and nurturing a workplace that supports and welcomes people of all backgrounds, encouraging them to bring their talents and experience to a workplace where they are valued and can thrive. Our culture is grounded in our four core values of adaptability, courage, generosity, and authenticity. Through our behaviors and actions, these values foster higher team performance and greater innovation for our customers.
We're proud to offer programs, available to all employees, to further impact innovation and business outcomes, such as employee networks and learning communities that inform solutions for our globally minded customer base.

WELCOME WHAT'S NEXT IN YOUR CAREER AT ANSYS
At Ansys, you will find yourself among the sharpest minds and most visionary leaders across the globe. Collectively, we strive to change the world with innovative technology and transformational solutions. With a prestigious reputation for working with well-known, world-class companies, standards at Ansys are high - met by those willing to rise to the occasion and meet those challenges head on. Our team is passionate about pushing the limits of world-class simulation technology, empowering our customers to turn their design concepts into successful, innovative products faster and at a lower cost.

At Ansys, it's about the learning, the discovery, and the collaboration. It's about the "what's next" as much as the "mission accomplished." And it's about the melding of disciplined intellect with strategic direction and results that have, can, and do impact real people in real ways. All this is forged within a working environment built on respect, autonomy, and ethics.

CREATING A PLACE WE'RE PROUD TO BE
Ansys is an S&P 500 company and a member of the NASDAQ-100. We are proud to have been recognized for the following recent awards, although our list goes on: Newsweek's Most Loved Workplace globally and in the U.S., Gold Stevie Award Winner, America's Most Responsible Companies, Fast Company World Changing Ideas, Great Place to Work Certified (China, Greece, France, India, Japan, Korea, Spain, Sweden, Taiwan, and U.K.).

Ansys is an Equal Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other protected characteristics.

Ansys does not accept unsolicited referrals for vacancies, and any unsolicited referral will become the property of Ansys. Upon hire, no fee will be owed to the agency, person, or entity.