• NVIDIA Scores Consecutive Win for End-to-End Autonomous Driving Grand Challenge at CVPR

    NVIDIA was today named an Autonomous Grand Challenge winner at the Computer Vision and Pattern Recognition (CVPR) conference, held this week in Nashville, Tennessee. The announcement was made at the Embodied Intelligence for Autonomous Systems on the Horizon Workshop.
    This marks the second consecutive year that NVIDIA has topped the leaderboard in the End-to-End Driving at Scale category, and the third year in a row it has won an Autonomous Grand Challenge award at CVPR.
    The theme of this year’s challenge was “Towards Generalizable Embodied Systems,” based on NAVSIM v2, a data-driven, nonreactive autonomous vehicle (AV) simulation framework.
    The challenge offered researchers the opportunity to explore ways to handle unexpected situations, beyond using only real-world human driving data, to accelerate the development of smarter, safer AVs.
    Generating Safe and Adaptive Driving Trajectories
    Participants of the challenge were tasked with generating driving trajectories from multi-sensor data in a semi-reactive simulation, where the ego vehicle’s plan is fixed at the start, but background traffic changes dynamically.
    Submissions were evaluated using the Extended Predictive Driver Model Score, which measures safety, comfort, compliance and generalization across real-world and synthetic scenarios — pushing the boundaries of robust and generalizable autonomous driving research.
    The NVIDIA AV Applied Research Team’s key innovation was the Generalized Trajectory Scoring (GTRS) method, which generates a variety of candidate trajectories and progressively filters them to select the best one.
    Figure: GTRS model architecture, a unified system for generating and scoring diverse driving trajectories using diffusion- and vocabulary-based trajectory proposals.
    GTRS combines coarse trajectory sets covering a wide range of situations with fine-grained trajectories for safety-critical situations, generated by a diffusion policy conditioned on the environment. GTRS then uses a transformer decoder distilled from perception-dependent metrics focused on safety, comfort and traffic rule compliance. This decoder progressively narrows the candidate set to the most promising trajectories by capturing subtle but critical differences between similar trajectories.
    The system has proven to generalize well, achieving state-of-the-art results on challenging benchmarks and enabling robust, adaptive trajectory selection across diverse driving conditions.
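    To make the generate-and-filter idea concrete, below is a minimal Python sketch of the pattern described above. The straight-line trajectory vocabulary, random perturbations and heuristic scorers are illustrative stand-ins for GTRS’s diffusion policy and distilled transformer decoder, not NVIDIA’s implementation; all names and numbers are made up.

```python
import numpy as np

# Trajectories are (T, 2) arrays of future (x, y) waypoints.
T = 8  # planning horizon (number of waypoints)

def coarse_vocabulary(n: int = 256) -> np.ndarray:
    """Coarse trajectory set: straight lines fanned over heading angles."""
    angles = np.linspace(-0.5, 0.5, n)          # radians
    steps = np.arange(1, T + 1)[:, None]        # (T, 1)
    xy = np.stack([steps * np.cos(angles), steps * np.sin(angles)], axis=-1)
    return xy.transpose(1, 0, 2)                # (n, T, 2)

def fine_proposals(anchor: np.ndarray, n: int = 64) -> np.ndarray:
    """Fine-grained proposals: perturbations around a promising anchor,
    standing in for the environment-conditioned diffusion policy."""
    return anchor[None] + 0.1 * np.random.randn(n, *anchor.shape)

def cheap_score(trajs: np.ndarray) -> np.ndarray:
    """Fast first-stage filter: rough progress and comfort heuristics."""
    progress = trajs[:, -1, 0]                                  # forward distance
    jerk = np.abs(np.diff(trajs, n=2, axis=1)).sum(axis=(1, 2)) # comfort proxy
    return progress - 0.5 * jerk

def fine_score(trajs: np.ndarray) -> np.ndarray:
    """Expensive stage (placeholder for a learned safety/compliance scorer)."""
    return cheap_score(trajs) - 2.0 * np.abs(trajs[:, :, 1]).max(axis=1)

# Progressive filtering: score everything cheaply, keep the top slice,
# densify around the survivors, then rank with the expensive scorer.
cands = coarse_vocabulary()
keep = cands[np.argsort(cheap_score(cands))[-16:]]
dense = np.concatenate([fine_proposals(t) for t in keep])
best = dense[np.argmax(fine_score(dense))]
print("selected trajectory endpoint:", best[-1])
```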

    NVIDIA Automotive Research at CVPR 
    More than 60 NVIDIA papers were accepted for CVPR 2025, spanning automotive, healthcare, robotics and more.
    In automotive, NVIDIA researchers are advancing physical AI with innovation in perception, planning and data generation. This year, three NVIDIA papers were nominated for the Best Paper Award: FoundationStereo, Zero-Shot Monocular Scene Flow and Difix3D+.
    The NVIDIA papers listed below showcase breakthroughs in stereo depth estimation, monocular motion understanding, 3D reconstruction, closed-loop planning, vision-language modeling and generative simulation — all critical to building safer, more generalizable AVs:

    Diffusion Renderer: Neural Inverse and Forward Rendering With Video Diffusion Models
    FoundationStereo: Zero-Shot Stereo Matching (Best Paper nominee)
    Zero-Shot Monocular Scene Flow Estimation in the Wild (Best Paper nominee)
    Difix3D+: Improving 3D Reconstructions With Single-Step Diffusion Models (Best Paper nominee)
    3DGUT: Enabling Distorted Cameras and Secondary Rays in Gaussian Splatting
    Closed-Loop Supervised Fine-Tuning of Tokenized Traffic Models
    Zero-Shot 4D Lidar Panoptic Segmentation
    NVILA: Efficient Frontier Visual Language Models
    RADIO Amplified: Improved Baselines for Agglomerative Vision Foundation Models
    OmniDrive: A Holistic Vision-Language Dataset for Autonomous Driving With Counterfactual Reasoning

    Explore automotive workshops and tutorials at CVPR, including:

    Workshop on Data-Driven Autonomous Driving Simulation, featuring Marco Pavone, senior director of AV research at NVIDIA, and Sanja Fidler, vice president of AI research at NVIDIA
    Workshop on Autonomous Driving, featuring Laura Leal-Taixe, senior research manager at NVIDIA
    Workshop on Open-World 3D Scene Understanding with Foundation Models, featuring Leal-Taixe
    Safe Artificial Intelligence for All Domains, featuring Jose Alvarez, director of AV applied research at NVIDIA
    Workshop on Foundation Models for V2X-Based Cooperative Autonomous Driving, featuring Pavone and Leal-Taixe
    Workshop on Multi-Agent Embodied Intelligent Systems Meet Generative AI Era, featuring Pavone
    LatinX in CV Workshop, featuring Leal-Taixe
    Workshop on Exploring the Next Generation of Data, featuring Alvarez
    Full-Stack, GPU-Based Acceleration of Deep Learning and Foundation Models, led by NVIDIA
    Continuous Data Cycle via Foundation Models, led by NVIDIA
    Distillation of Foundation Models for Autonomous Driving, led by NVIDIA

    Explore the NVIDIA research papers to be presented at CVPR and watch the NVIDIA GTC Paris keynote from NVIDIA founder and CEO Jensen Huang.
    Learn more about NVIDIA Research, a global team of hundreds of scientists and engineers focused on topics including AI, computer graphics, computer vision, self-driving cars and robotics.
    The featured image above shows how an autonomous vehicle adapts its trajectory to navigate an urban environment with dynamic traffic using the GTRS model.
  • Ankur Kothari Q&A: Customer Engagement Book Interview

    Reading Time: 9 minutes
    In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns.
    But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question, we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic.
    This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die,” explores how businesses can translate raw data into actionable insights that drive real results.
    Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.

     
    Ankur Kothari Q&A Interview
    1. What types of customer engagement data are most valuable for making strategic business decisions?
    Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interaction, purchase history, and other app usage patterns.
    Second would be demographic information: age, location, income, and other relevant personal characteristics.
    Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews.
    Fourth would be the customer journey data.

    We track customers’ touchpoints across various channels to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data.

    2. How do you distinguish between data that is actionable versus data that is just noise?
    First is keeping the data relevant to your business objectives: actionable data directly relates to your specific goals or KPIs. Then we take help from statistical significance.
    Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in.

    You also want to make sure that there is consistency across sources.
    Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory.
    Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy.

    By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions.
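    To make these criteria concrete, here is a minimal sketch, assuming hypothetical conversion counts, of the two tests described above: statistical significance and consistency across sources. The channels, numbers, and 1.96 threshold (a standard 95% confidence cutoff) are illustrative, not from the interview.

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)            # pooled rate
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def is_actionable(web: tuple, email: tuple, z_threshold: float = 1.96) -> bool:
    """A shift is actionable only if it is statistically significant on one
    channel AND directionally confirmed by a second, independent channel."""
    z_web = two_proportion_z(*web)
    z_email = two_proportion_z(*email)
    return abs(z_web) >= z_threshold and (z_web * z_email) > 0

# (conversions_before, visitors_before, conversions_after, visitors_after)
web = (480, 10_000, 620, 10_000)
email = (210, 5_000, 260, 5_000)
print(is_actionable(web, email))  # True: significant and consistent
```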

    3. How can customer engagement data be used to identify and prioritize new business opportunities?
    First, it helps us to uncover unmet needs.

    By analyzing the customer feedback, touch points, support interactions, or usage patterns, we can identify the gaps in our current offerings or areas where customers are experiencing pain points.

    Second would be identifying emerging needs.
    Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing my company to adapt their products or services accordingly.
    Third would be segmentation analysis.
    Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies.
    Last is to build competitive differentiation.

    Engagement data can highlight where our companies outperform competitors, helping us to prioritize opportunities that leverage existing strengths and unique selling propositions.

    4. Can you share an example of where data insights directly influenced a critical decision?
    I will share an example from my previous organization, a financial services company that was very data-driven, where this made a major impact on a critical decision regarding our credit card offerings.
    We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms.
    That insight led us to develop and launch our first digital credit card product, with enhanced mobile features and rewards tailored to millennial spending habits. Since we had access to a lot of transactional data as well, we were able to build a financial product that met that specific segment’s needs.

    That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial.

    5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time?
    When it comes to using engagement data in real time, we do quite a few things. Over the past two or three years, we have been using it for dynamic content personalization: adjusting website content, email messaging, or ad creative based on real-time user behavior and preferences.
    We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments.
    We also build responsive social media engagement, monitoring social media sentiment and trending topics to quickly adapt our messaging and create timely, relevant content.

    Alongside one-on-one personalization, we do a lot of A/B testing as part of an overall rapid-testing program, experimenting with elements like subject lines and CTAs and building on the most successful variants of our campaigns.
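    The automated budget reallocation described above is often implemented as a multi-armed bandit. Below is a hypothetical sketch using Thompson sampling: each channel’s conversion rate gets a Beta posterior, and spend flows to whichever channel samples best. The channels and rates are made up, and this is one common approach rather than the specific tooling Ankur’s team used.

```python
import random

channels = {"search": [1, 1], "social": [1, 1], "email": [1, 1]}  # Beta(a, b) priors
true_rates = {"search": 0.04, "social": 0.06, "email": 0.05}      # unknown in practice

for _ in range(5_000):  # each step = one unit of ad spend
    # Sample a plausible conversion rate per channel; spend on the best draw.
    pick = max(channels, key=lambda c: random.betavariate(*channels[c]))
    converted = random.random() < true_rates[pick]
    channels[pick][0 if converted else 1] += 1    # update the posterior

for name, (a, b) in channels.items():
    print(f"{name}: {a + b - 2} impressions, posterior mean {a / (a + b):.3f}")
```

    Over time the loop concentrates spend on the top-performing channel while still exploring the others, which is the behavior the AI-driven optimization tools automate.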

    6. How are you doing the 1:1 personalization?
    We have advanced CDP systems, and we are tracking each customer’s behavior in real-time. So the moment they move to different channels, we know what the context is, what the relevance is, and the recent interaction points, so we can cater the right offer.
    So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer.
    That gives our customer or potential customer more one-to-one personalization instead of just segment-based or bulk interaction kind of experience.

    We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers.
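    As a toy illustration of the cross-channel lookup Ankur describes, the sketch below keeps each customer’s recent interactions in an in-memory store that any channel, including an in-person agent, can query at the next touchpoint. Real CDPs are distributed systems, not a dict; every name here is hypothetical.

```python
from collections import defaultdict, deque
from datetime import datetime, timezone

# customer_id -> a bounded log of that customer's most recent interactions
events = defaultdict(lambda: deque(maxlen=50))

def track(customer_id: str, channel: str, action: str) -> None:
    """Record an interaction from any channel as it happens."""
    events[customer_id].append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "channel": channel,
        "action": action,
    })

def context_for_agent(customer_id: str, n: int = 5) -> list:
    """What an in-person agent sees: the customer's last n interactions."""
    return list(events[customer_id])[-n:]

track("cust-42", "web", "viewed_credit_card_offer")
track("cust-42", "email", "clicked_offer_link")
print(context_for_agent("cust-42"))
```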

    7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service?
    Primarily with product development. We have different products, not just the financial products an organization sells but also the mobile apps and websites customers use for transactions, and that kind of product development gets improved.
    The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments.

    Customer service also benefits, by anticipating common issues, personalizing support interactions over phone, email, or chat, and proactively addressing potential problems, leading to improved customer satisfaction and retention.

    So in general, the cross-functional application of engagement data improves the customer-centric approach throughout the organization.

    8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights?
    I think the biggest challenge is the huge amount of data we are dealing with. As customers become more digitally savvy and move to digital channels, we are getting a lot of data, and that sheer volume can be overwhelming, making it very difficult to identify truly meaningful patterns and insights.

    Because of the huge data overload, we create data silos in this process, so information often exists in separate systems across different departments. We are not able to build a holistic view of customer engagement.

    Because of data silos and overload of data, data quality issues appear. There is inconsistency, and inaccurate data can lead to incorrect insights or poor decision-making. Quality issues could also be due to the wrong format of the data, or the data is stale and no longer relevant.
    As we grow and add more people to help us understand customer engagement, I’ve also noticed that technical folks, especially data scientists and data analysts, often lack the skills to properly interpret the data or apply its insights effectively.
    So there’s a lack of understanding of marketing and sales as domains.
    It’s a huge effort and can take a lot of investment.

    Not being able to calculate the ROI of your overall investment is a big challenge that many organizations are facing.

    9. Why do you think the analysts don’t have the business acumen to properly do more than analyze the data?
    If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. Stopping that at step one, keeping noise out of the data system, cannot be done by technical folks alone or by people who lack business knowledge.
    Business people do not know everything about what data is being collected from which source and what data they need. It’s a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don’t have a lot of exposure to that side.

    Similarly, marketing business people do not have much exposure to the technical side — what’s possible to do with data, how much effort it takes, what’s relevant versus not relevant, and how to prioritize which data sources will be most important.

    10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before?
    First, cross-functional training: training different roles to help them understand why we’re doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do.
    And giving business folks exposure to the technology side through training on different tools, strategies, and the roadmap of data integrations.
    The second is helping teams work more collaboratively. So it’s not like the technology team works in a silo and comes back when their work is done, and then marketing and sales teams act upon it.

    Now we’re making it more like one team. You work together so that you can complement each other, and we have a better strategy from day one.

    11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations?
    We present clear business cases where we demonstrate how data-driven recommendations can directly align with business objectives and potential ROI.
    We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and the implications for business goals.

    We also do a lot of POCs and pilot projects with small-scale implementations to showcase tangible results and build confidence in the data-driven approach throughout the organization.

    12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data?
    I’ve found that Customer Data Platforms (CDPs) help us unify customer data from various sources, providing a comprehensive view of customer interactions across touchpoints.
    Having advanced analytics platforms — tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights — is a great value to us.
    We always use, or many organizations use, marketing automation systems to improve marketing team productivity, helping us track and analyze customer interactions across multiple channels.
    Another thing is social media listening tools, for wherever your brand is mentioned: measuring customer sentiment on social media and tracking the engagement of your campaigns across platforms.

    Last is web analytics tools, which provide detailed insights into your website visitors’ behaviors and engagement metrics across browsers, devices, and mobile apps.

    13. How do you ensure data quality and consistency across multiple channels to make these informed decisions?
    We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. Then we use data integration platforms — tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies.
    While we collect data from different sources, we clean the data so it becomes cleaner with every stage of processing.
    We also conduct regular data audits — performing periodic checks to identify and rectify data quality issues, ensuring accuracy and reliability of information. We also deploy standardized data formats.

    On top of that, we have various automated data cleansing tools, specific software to detect and correct data errors, redundancies, duplicates, and inconsistencies in data sets automatically.
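    As an illustration of those cleansing steps, here is a minimal pandas sketch, assuming a toy engagement extract with hypothetical column names, that standardizes formats, removes duplicates, and drops stale records.

```python
import pandas as pd

raw = pd.DataFrame({
    "email": [" A@x.com", "a@x.com", "b@y.com", None],
    "channel": ["Web", "web", "EMAIL", "web"],
    "last_seen": ["2025-06-01", "2025-06-01", "2023-01-15", "2025-05-20"],
})

clean = (
    raw.dropna(subset=["email"])                      # drop unusable rows
       .assign(
           email=lambda d: d["email"].str.strip().str.lower(),   # standardize format
           channel=lambda d: d["channel"].str.lower(),
           last_seen=lambda d: pd.to_datetime(d["last_seen"]),
       )
       .drop_duplicates(subset=["email", "channel"])  # remove redundancies
)

# Staleness rule (illustrative): keep records seen in the last 12 months.
cutoff = pd.Timestamp("2025-06-15") - pd.DateOffset(months=12)
clean = clean[clean["last_seen"] >= cutoff]
print(clean)
```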

    14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years?
    The first thing that’s been the biggest trend from the past two years is AI-driven decision making, which I think will become more prevalent, with advanced algorithms processing vast amounts of engagement data in real-time to inform strategic choices.
    Somewhat related to this is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with more accuracy and better predictive capabilities.
    We also touched upon hyper-personalization. We are all trying to strive toward more hyper-personalization at scale, which is more one-on-one personalization, as we are increasingly capturing more engagement data and have bigger systems and infrastructure to support processing those large volumes of data so we can achieve those hyper-personalization use cases.
    As the world is collecting more data, privacy concerns and regulations come into play.
    I believe in the next few years there will be more innovation toward how businesses can collect data ethically and what the usage practices are, leading to more transparent and consent-based engagement data strategies.
    And lastly, I think about the integration of engagement data, which is always a big challenge. I believe as we’re solving those integration challenges, we are adding more and more complex data sources to the picture.

    So I think there will need to be more innovation or sophistication brought into data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.

     
    This interview Q&A was hosted with Ankur Kothari, a former Martech executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die.
    Download the PDF or request a physical copy of the book here.
    The post Ankur Kothari Q&A: Customer Engagement Book Interview appeared first on MoEngage.
    #ankur #kothari #qampampa #customer #engagement
    Ankur Kothari Q&A: Customer Engagement Book Interview
    Reading Time: 9 minutes In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns. But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question, we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic. This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die” explores how businesses can translate raw data into actionable insights that drive real results. Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.   Ankur Kothari Q&A Interview 1. What types of customer engagement data are most valuable for making strategic business decisions? Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interaction, purchase history, and other app usage patterns. Second would be demographic information: age, location, income, and other relevant personal characteristics. Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews. Fourth would be the customer journey data. We track touchpoints across various channels of the customers to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data. 2. How do you distinguish between data that is actionable versus data that is just noise? First is keeping relevant to your business objectives, making actionable data that directly relates to your specific goals or KPIs, and then taking help from statistical significance. Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in. You also want to make sure that there is consistency across sources. Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory. Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy. By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions. 3. How can customer engagement data be used to identify and prioritize new business opportunities? First, it helps us to uncover unmet needs. By analyzing the customer feedback, touch points, support interactions, or usage patterns, we can identify the gaps in our current offerings or areas where customers are experiencing pain points. Second would be identifying emerging needs. Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing my company to adapt their products or services accordingly. Third would be segmentation analysis. Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies. Last is to build competitive differentiation. Engagement data can highlight where our companies outperform competitors, helping us to prioritize opportunities that leverage existing strengths and unique selling propositions. 4. 
Can you share an example of where data insights directly influenced a critical decision? I will share an example from my previous organization at one of the financial services where we were very data-driven, which made a major impact on our critical decision regarding our credit card offerings. We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms. That insight led us to develop and launch our first digital credit card product with enhanced mobile features and rewards tailored to the millennial spending habits. Since we had access to a lot of transactional data as well, we were able to build a financial product which met that specific segment’s needs. That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial. 5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time? When it comes to using the engagement data in real-time, we do quite a few things. In the recent past two, three years, we are using that for dynamic content personalization, adjusting the website content, email messaging, or ad creative based on real-time user behavior and preferences. We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments. Then we also build responsive social media engagement platforms like monitoring social media sentiments and trending topics to quickly adapt the messaging and create timely and relevant content. With one-on-one personalization, we do a lot of A/B testing as part of the overall rapid testing and market elements like subject lines, CTAs, and building various successful variants of the campaigns. 6. How are you doing the 1:1 personalization? We have advanced CDP systems, and we are tracking each customer’s behavior in real-time. So the moment they move to different channels, we know what the context is, what the relevance is, and the recent interaction points, so we can cater the right offer. So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer. That gives our customer or potential customer more one-to-one personalization instead of just segment-based or bulk interaction kind of experience. We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers. 7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service? Primarily with product development — we have different products, not just the financial products or products whichever organizations sell, but also various products like mobile apps or websites they use for transactions. So that kind of product development gets improved. 
The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments. Customer service also gets helped by anticipating common issues, personalizing support interactions over the phone or email or chat, and proactively addressing potential problems, leading to improved customer satisfaction and retention. So in general, cross-functional application of engagement improves the customer-centric approach throughout the organization. 8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights? I think the huge amount of data we are dealing with. As we are getting more digitally savvy and most of the customers are moving to digital channels, we are getting a lot of data, and that sheer volume of data can be overwhelming, making it very difficult to identify truly meaningful patterns and insights. Because of the huge data overload, we create data silos in this process, so information often exists in separate systems across different departments. We are not able to build a holistic view of customer engagement. Because of data silos and overload of data, data quality issues appear. There is inconsistency, and inaccurate data can lead to incorrect insights or poor decision-making. Quality issues could also be due to the wrong format of the data, or the data is stale and no longer relevant. As we are growing and adding more people to help us understand customer engagement, I’ve also noticed that technical folks, especially data scientists and data analysts, lack skills to properly interpret the data or apply data insights effectively. So there’s a lack of understanding of marketing and sales as domains. It’s a huge effort and can take a lot of investment. Not being able to calculate the ROI of your overall investment is a big challenge that many organizations are facing. 9. Why do you think the analysts don’t have the business acumen to properly do more than analyze the data? If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. If you cannot stop that from step one—not bringing noise into the data system—that cannot be done by just technical folks or people who do not have business knowledge. Business people do not know everything about what data is being collected from which source and what data they need. It’s a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don’t have a lot of exposure to that side. Similarly, marketing business people do not have much exposure to the technical side — what’s possible to do with data, how much effort it takes, what’s relevant versus not relevant, and how to prioritize which data sources will be most important. 10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before? First, cross-functional training: training different roles to help them understand why we’re doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do. And giving business folks exposure to the technology side through training on different tools, strategies, and the roadmap of data integrations. The second is helping teams work more collaboratively. 
So it’s not like the technology team works in a silo and comes back when their work is done, and then marketing and sales teams act upon it. Now we’re making it more like one team. You work together so that you can complement each other, and we have a better strategy from day one. 11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations? We present clear business cases where we demonstrate how data-driven recommendations can directly align with business objectives and potential ROI. We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and the implications for business goals. We also do a lot of POCs and pilot projects with small-scale implementations to showcase tangible results and build confidence in the data-driven approach throughout the organization. 12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data? I’ve found that Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touch points. Having advanced analytics platforms — tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights — is a great value to us. We always use, or many organizations use, marketing automation systems to improve marketing team productivity, helping us track and analyze customer interactions across multiple channels. Another thing is social media listening tools, wherever your brand is mentioned or you want to measure customer sentiment over social media, or track the engagement of your campaigns across social media platforms. Last is web analytical tools, which provide detailed insights into your website visitors’ behaviors and engagement metrics, for browser apps, small browser apps, various devices, and mobile apps. 13. How do you ensure data quality and consistency across multiple channels to make these informed decisions? We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. Then we use data integration platforms — tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies. While we collect data from different sources, we clean the data so it becomes cleaner with every stage of processing. We also conduct regular data audits — performing periodic checks to identify and rectify data quality issues, ensuring accuracy and reliability of information. We also deploy standardized data formats. On top of that, we have various automated data cleansing tools, specific software to detect and correct data errors, redundancies, duplicates, and inconsistencies in data sets automatically. 14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years? The first thing that’s been the biggest trend from the past two years is AI-driven decision making, which I think will become more prevalent, with advanced algorithms processing vast amounts of engagement data in real-time to inform strategic choices. Somewhat related to this is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with more accuracy and better predictive capabilities. We also touched upon hyper-personalization. 
We are all trying to strive toward more hyper-personalization at scale, which is more one-on-one personalization, as we are increasingly capturing more engagement data and have bigger systems and infrastructure to support processing those large volumes of data so we can achieve those hyper-personalization use cases. As the world is collecting more data, privacy concerns and regulations come into play. I believe in the next few years there will be more innovation toward how businesses can collect data ethically and what the usage practices are, leading to more transparent and consent-based engagement data strategies. And lastly, I think about the integration of engagement data, which is always a big challenge. I believe as we’re solving those integration challenges, we are adding more and more complex data sources to the picture. So I think there will need to be more innovation or sophistication brought into data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.   This interview Q&A was hosted with Ankur Kothari, a previous Martech Executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die. Download the PDF or request a physical copy of the book here. The post Ankur Kothari Q&A: Customer Engagement Book Interview appeared first on MoEngage. #ankur #kothari #qampampa #customer #engagement
    WWW.MOENGAGE.COM
    Ankur Kothari Q&A: Customer Engagement Book Interview
    Reading Time: 9 minutes In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns. But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question (and many others), we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic. This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die” explores how businesses can translate raw data into actionable insights that drive real results. Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.   Ankur Kothari Q&A Interview 1. What types of customer engagement data are most valuable for making strategic business decisions? Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interaction, purchase history, and other app usage patterns. Second would be demographic information: age, location, income, and other relevant personal characteristics. Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews. Fourth would be the customer journey data. We track touchpoints across various channels of the customers to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data. 2. How do you distinguish between data that is actionable versus data that is just noise? First is keeping relevant to your business objectives, making actionable data that directly relates to your specific goals or KPIs, and then taking help from statistical significance. Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in. You also want to make sure that there is consistency across sources. Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory. Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy. By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions. 3. How can customer engagement data be used to identify and prioritize new business opportunities? First, it helps us to uncover unmet needs. By analyzing the customer feedback, touch points, support interactions, or usage patterns, we can identify the gaps in our current offerings or areas where customers are experiencing pain points. Second would be identifying emerging needs. Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing my company to adapt their products or services accordingly. Third would be segmentation analysis. Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies. Last is to build competitive differentiation. 
Engagement data can highlight where our companies outperform competitors, helping us to prioritize opportunities that leverage existing strengths and unique selling propositions. 4. Can you share an example of where data insights directly influenced a critical decision? I will share an example from my previous organization at one of the financial services where we were very data-driven, which made a major impact on our critical decision regarding our credit card offerings. We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms. That insight led us to develop and launch our first digital credit card product with enhanced mobile features and rewards tailored to the millennial spending habits. Since we had access to a lot of transactional data as well, we were able to build a financial product which met that specific segment’s needs. That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial. 5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time? When it comes to using the engagement data in real-time, we do quite a few things. In the recent past two, three years, we are using that for dynamic content personalization, adjusting the website content, email messaging, or ad creative based on real-time user behavior and preferences. We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments. Then we also build responsive social media engagement platforms like monitoring social media sentiments and trending topics to quickly adapt the messaging and create timely and relevant content. With one-on-one personalization, we do a lot of A/B testing as part of the overall rapid testing and market elements like subject lines, CTAs, and building various successful variants of the campaigns. 6. How are you doing the 1:1 personalization? We have advanced CDP systems, and we are tracking each customer’s behavior in real-time. So the moment they move to different channels, we know what the context is, what the relevance is, and the recent interaction points, so we can cater the right offer. So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer. That gives our customer or potential customer more one-to-one personalization instead of just segment-based or bulk interaction kind of experience. We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers. 7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service? Primarily with product development — we have different products, not just the financial products or products whichever organizations sell, but also various products like mobile apps or websites they use for transactions. 
So that kind of product development gets improved. The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments. Customer service also gets helped by anticipating common issues, personalizing support interactions over the phone or email or chat, and proactively addressing potential problems, leading to improved customer satisfaction and retention. So in general, cross-functional application of engagement improves the customer-centric approach throughout the organization. 8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights? I think the huge amount of data we are dealing with. As we are getting more digitally savvy and most of the customers are moving to digital channels, we are getting a lot of data, and that sheer volume of data can be overwhelming, making it very difficult to identify truly meaningful patterns and insights. Because of the huge data overload, we create data silos in this process, so information often exists in separate systems across different departments. We are not able to build a holistic view of customer engagement. Because of data silos and overload of data, data quality issues appear. There is inconsistency, and inaccurate data can lead to incorrect insights or poor decision-making. Quality issues could also be due to the wrong format of the data, or the data is stale and no longer relevant. As we are growing and adding more people to help us understand customer engagement, I’ve also noticed that technical folks, especially data scientists and data analysts, lack skills to properly interpret the data or apply data insights effectively. So there’s a lack of understanding of marketing and sales as domains. It’s a huge effort and can take a lot of investment. Not being able to calculate the ROI of your overall investment is a big challenge that many organizations are facing. 9. Why do you think the analysts don’t have the business acumen to properly do more than analyze the data? If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. If you cannot stop that from step one—not bringing noise into the data system—that cannot be done by just technical folks or people who do not have business knowledge. Business people do not know everything about what data is being collected from which source and what data they need. It’s a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don’t have a lot of exposure to that side. Similarly, marketing business people do not have much exposure to the technical side — what’s possible to do with data, how much effort it takes, what’s relevant versus not relevant, and how to prioritize which data sources will be most important. 10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before? First, cross-functional training: training different roles to help them understand why we’re doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do. And giving business folks exposure to the technology side through training on different tools, strategies, and the roadmap of data integrations. The second is helping teams work more collaboratively. 
So it's not like the technology team works in a silo and comes back when their work is done, and then the marketing and sales teams act on it. Now we're making it more like one team: you work together so that you can complement each other, and we have a better strategy from day one.

11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations?

We present clear business cases that demonstrate how data-driven recommendations directly align with business objectives and potential ROI. We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and their implications for business goals. We also do a lot of POCs and pilot projects, small-scale implementations that showcase tangible results and build confidence in the data-driven approach throughout the organization.

12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data?

I've found that Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touchpoints. Advanced analytics platforms, meaning tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights, are of great value to us. We, like many organizations, use marketing automation systems to improve marketing team productivity and help us track and analyze customer interactions across multiple channels. Then there are social media listening tools, for tracking wherever your brand is mentioned, measuring customer sentiment, and following the engagement of your campaigns across social media platforms. Last is web analytics tools, which provide detailed insights into your website visitors' behaviors and engagement metrics across browsers, devices, and mobile apps.

13. How do you ensure data quality and consistency across multiple channels to make these informed decisions?

We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. We use data integration platforms, tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies. As we collect data from different sources, we clean it, so the data becomes cleaner with every stage of processing. We also conduct regular data audits, periodic checks to identify and rectify data quality issues, ensuring the accuracy and reliability of our information. We deploy standardized data formats, and on top of that we have automated data cleansing tools, specific software that detects and corrects errors, redundancies, duplicates, and inconsistencies in data sets automatically.

14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years?

The biggest trend of the past two years is AI-driven decision making, which I think will become even more prevalent, with advanced algorithms processing vast amounts of engagement data in real time to inform strategic choices. Somewhat related is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with greater accuracy. We also touched on hyper-personalization.
We are all striving toward hyper-personalization at scale, which is one-on-one personalization, as we capture more and more engagement data and build bigger systems and infrastructure to process those large volumes of data and support hyper-personalization use cases. As the world collects more data, privacy concerns and regulations come into play. I believe that over the next few years there will be more innovation around how businesses collect data ethically and use it transparently, leading to more consent-based engagement data strategies. And lastly, I think about the integration of engagement data, which is always a big challenge. As we solve those integration challenges, we keep adding more complex data sources to the picture, so more innovation and sophistication will need to be brought to data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.

This interview Q&A was hosted with Ankur Kothari, a former martech executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die. Download the PDF or request a physical copy of the book here.
  • Casa Sofia by Mário Martins Atelier: A Contemporary Urban Infill in Lagos

    Casa Sofia | © Fernando Guerra / FG+SG
    Located in the historic heart of Lagos, Portugal, Casa Sofia by Mário Martins Atelier is a thoughtful exercise in urban integration and contemporary reinterpretation. Occupying a site once held by a modest two-story house, the project is situated on the corner of a block facing the Church of St Sebastião. With its commanding presence, this national monument set a formidable challenge for the architects: introducing a new residence that respects the weight of history while offering a clear, contemporary expression.

    Casa Sofia Technical Information

    Architects: Mário Martins Atelier
    Location: Lagos, Portugal
    Project Completion Year: 2023
    Photographs: © Fernando Guerra / FG+SG

    It is therefore important to design a building to fit into and complete the block. A house that is quiet and solid, with rhythmic metrics, whose new design brings an identity, with the weight and scent of the times, to a city that has existed for many centuries.
    – Mário Martins Atelier

    Casa Sofia Photographs

    © Fernando Guerra / FG+SG
    Spatial Organization and Circulation
    The design’s ambition is anchored in reconciling modern residential needs with the dense urban fabric that defines the walled city. Rather than imposing a bold or disruptive form, the project embraces the existing rhythms and textures of the surrounding architecture. The result is a building that both defers to and elevates the neighborhood’s character. Its restrained profile and carefully modulated facade echo the massing and articulation of the original house while introducing an identity that is clearly of its time.
    At the core of Casa Sofia’s spatial organization is a deliberate hierarchy of spaces that transitions seamlessly between public, semi-public, and private domains. Entry from the street occurs through a modest set of steps leading to an exterior atrium. This threshold mediates the relationship between the public realm and the interior, grounding the house in its urban context. Once inside, an open hall reveals the vertical flow of the building, dominated by a staircase that appears to float, linking the house’s various levels while maintaining visual continuity throughout.
    The ground floor houses three bedrooms, each with an ensuite bathroom, radiating from the central hall. This level also contains a small basement for technical support, reinforcing the discreet layering of functional and domestic spaces. Midway up the staircase, the house opens onto a garage, a laundry room, and an intimate courtyard. These areas, essential for daily life, are seamlessly integrated into the overall composition, contributing to a spatial richness that is both pragmatic and sensorial.
    On the first floor, an open-plan arrangement accommodates the main living spaces. Around a central void, the living and dining areas, kitchen, and master suite are arranged to encourage visual interplay and shared light. This configuration enhances the spatial porosity, ensuring that despite the density of the historic center, the house retains a sense of openness and fluidity. Above, a recessed roof level recedes from the street, culminating in a panoramic terrace with a swimming pool. Here, the building dissolves into the sky, offering expansive views and light-filled leisure spaces that contrast with the more enclosed lower floors.
    Materiality and Craftsmanship
    Materiality plays a decisive role in mediating the building’s relationship with its context. White-painted plaster, a familiar element in the region, is punctuated by deep limestone moldings. These details create a play of light and shadow that emphasizes the facade’s verticality and rhythm. The generous thickness of the walls, carried over from the site’s earlier construction, lends a sense of solidity and permanence to the house, recalling the tactile traditions of the Algarve’s architecture.
    The interior and exterior detailing is characterized by an economy of means, where each material is selected for its ability to reinforce the house’s quiet presence. Local materials and craftsmanship ground the project in its immediate context while responding to environmental imperatives. High thermal comfort is achieved through careful orientation and passive design strategies, complemented by the integration of solar control and water conservation measures. These considerations underscore the project’s commitment to sustainability without resorting to superficial gestures.
    Broader Urban and Cultural Implications
    Beyond its immediate function as a family home, Casa Sofia engages in a broader dialogue with its urban and cultural surroundings. The project exemplifies a measured response to the question of how to build within a historical setting without resorting to nostalgia or pastiche. It demonstrates that contemporary architecture can find resonance within heritage contexts by prioritizing the values of continuity, scale, and material authenticity.
    In its measured dialogue with the Church of St Sebastião and the centuries-old urban landscape of Lagos, Casa Sofia illustrates the potential for architecture to enrich the experience of place through quiet, rigorous interventions. It is a project that reaffirms architecture’s capacity to negotiate between past and present, crafting spaces that are at once deeply contextual and unambiguously of their moment.
    Casa Sofia Plans

    Sketch | © Mário Martins Atelier

    Ground Level | © Mário Martins Atelier

    Level 1 | © Mário Martins Atelier

    Level 2 | © Mário Martins Atelier

    Roof Plan | © Mário Martins Atelier

    Section | © Mário Martins Atelier

    About Mário Martins Atelier
    Mário Martins Atelier is a Portuguese architecture and urbanism practice founded in 2000 by architect Mário Martins, who holds a degree from the Faculty of Architecture at the Technical University of Lisbon. Headquartered in Lagos with a secondary office in Lisbon, the firm operates with a dedicated multidisciplinary team. The office has developed a broad spectrum of work, from single-family homes and collective housing to public buildings and urban regeneration, distinguished by technical precision, contextual sensitivity, and sustainable strategies.
    Credits and Additional Notes

    Lead Architect: Mário Martins, arq.
    Project Team: Rita Rocha, Sónia Fialho, Susana Caetano, Susana Jóia, Ana Graça
    Engineering: Nuno Grave Engenharia
    Building: Marques Antunes Engenharia Lda
  • MedTech AI, hardware, and clinical application programmes

    Modern healthcare innovations span AI, devices, software, images, and regulatory frameworks, all requiring stringent coordination. Generative AI arguably has the strongest transformative potential in healthcare technology programmes, and it is already being applied across domains such as R&D, commercial operations, and supply chain management. Traditional models for medical appointments, like face-to-face consultations and paper-based processes, may not be sufficient for today's fast-paced, data-driven medical landscape. Healthcare professionals and patients are therefore seeking more convenient and efficient ways to access and share information while meeting the complex standards of modern medical science.

    According to McKinsey, Medtech companies are at the forefront of healthcare innovation; it estimates they could capture between $14 billion and $55 billion annually in productivity gains. Through GenAI adoption, an additional $50 billion plus in revenue is estimated from product and service innovations. A 2024 McKinsey survey revealed that around two thirds of Medtech executives have already implemented Gen AI, with approximately 20% scaling their solutions up and reporting substantial productivity benefits.

    While advanced technology implementation is growing across the medical industry, challenges persist. Organisations face hurdles like data integration issues, decentralised strategies, and skill gaps. Together, these highlight the need for a more streamlined approach to Gen AI deployment.

    Of all the Medtech domains, R&D is leading the way in Gen AI adoption. Being the most comfortable with new technologies, R&D departments use Gen AI tools to streamline work processes, such as summarising research papers or scientific articles, highlighting a grassroots adoption trend: individual researchers are using AI to enhance productivity even when no formal company-wide strategies are in place. While AI tools automate and accelerate R&D tasks, human review is still required to ensure final submissions are correct and satisfactory. Gen AI is proving to reduce time spent on administrative tasks and to improve research accuracy and depth, with some companies experiencing 20% to 30% gains in research productivity.

    KPIs for success in healthcare product programmes
    Measuring business performance is essential in the healthcare sector. The number one goal is, of course, to deliver high-quality care while maintaining efficient operations. By measuring and analysing KPIs, healthcare providers are better positioned to improve patient outcomes through data-based decisions. KPIs can also improve resource allocation and encourage continuous improvement in all areas of care.

    Healthcare product programmes are structured initiatives that prioritise the development, delivery, and continual optimisation of medical products. To succeed, they require cross-functional coordination of clinical, technical, regulatory, and business teams. Time to market is critical, ensuring a product moves from concept to launch as quickly as possible. Of particular note is the emphasis placed on labelling and documentation: McKinsey notes that AI-assisted labelling has resulted in a 20%-30% improvement in operational efficiency. Resource utilisation rates are also important, showing how efficiently time, budget, and headcount are used during product development.
In the healthcare sector, KPIs ought to focus on several factors, including operational efficiency, patient outcomes, the financial health of the business, and patient satisfaction. To achieve a comprehensive view of performance, these can be categorised into financial, operational, clinical quality, and patient experience metrics.

    Bridging user experience with technical precision – design awards
    Innovation is no longer judged solely by technical performance; user experience (UX) is equally important. Some of the latest innovations in healthcare are recognised at the UX Design Awards: products that exemplify the best in user experience as well as technical precision. Top products prioritise the needs and experiences of both patients and healthcare professionals while ensuring each product meets the sector's rigorous clinical and regulatory standards.

    One example is the CIARTIC Move by Siemens Healthineers, a self-driving 3D C-arm imaging system that lets surgeons operate while controlling the device wirelessly from within the sterile field. Computer hardware company ASUS has also received accolades for its HealthConnect App and VivoWatch Series, showcasing the fusion of AIoT-driven smart healthcare solutions with user-friendly interfaces – sometimes in what are essentially consumer devices. This demonstrates how technical innovation is being made accessible and increasingly intuitive as patients gain technical fluency.

    Navigating regulatory and product development pathways simultaneously
    Establishing clinical and regulatory paths early is important, as it enables healthcare teams to feed a twin stream of findings back into development. Gen AI adoption has become a transformative approach, automating the production and refinement of complex documents, mixed data sets, and structured and unstructured data. By integrating regulatory considerations early and adopting technologies like Gen AI as part of agile practices, healthcare product programmes help teams navigate a regulatory landscape that can often shift. Baking a regulatory mindset into a team early helps ensure both compliance and continued innovation.

    Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including Intelligent Automation Conference, BlockX, Digital Transformation Week, and Cyber Security & Cloud Expo. Explore other upcoming enterprise technology events and webinars powered by TechForge here.
  • New Zealand’s Email Security Requirements for Government Organizations: What You Need to Know

    The Secure Government Email (SGE) Common Implementation Framework
    New Zealand’s government is introducing a comprehensive email security framework designed to protect official communications from phishing and domain spoofing. This new framework, which will be mandatory for all government agencies by October 2025, establishes clear technical standards to enhance email security and retire the outdated SEEMail service. 
    Key Takeaways

    All NZ government agencies must comply with new email security requirements by October 2025.
    The new framework strengthens trust and security in government communications by preventing spoofing and phishing.
    The framework mandates TLS 1.2+, SPF, DKIM, DMARC with p=reject, MTA-STS, and DLP controls.
    EasyDMARC simplifies compliance with our guided setup, monitoring, and automated reporting.

    Start a Free Trial

    What is the Secure Government Email Common Implementation Framework?
    The Secure Government Email (SGE) Common Implementation Framework is a new government-led initiative in New Zealand designed to standardize email security across all government agencies. Its main goal is to secure external email communication, reduce domain spoofing in phishing attacks, and replace the legacy SEEMail service.
    Why is New Zealand Implementing New Government Email Security Standards?
    The framework was developed by New Zealand’s Department of Internal Affairs as part of its role in managing ICT Common Capabilities. It leverages modern email security controls via the Domain Name System (DNS) to enable the retirement of the legacy SEEMail service and provide:

    Encryption for transmission security
    Digital signing for message integrity
    Basic non-repudiation
    Domain spoofing protection

    These improvements apply to all emails, not just those routed through SEEMail, offering broader protection across agency communications.
    What Email Security Technologies Are Required by the New NZ SGE Framework?
    The SGE Framework outlines the following key technologies that agencies must implement:

    TLS 1.2 or higher with implicit TLS enforced
    TLS-RPT
    SPF
    DKIM
    DMARC with reporting
    MTA-STS
    Data Loss Prevention (DLP) controls

    These technologies work together to ensure encrypted email transmission, validate sender identity, prevent unauthorized use of domains, and reduce the risk of sensitive data leaks.

    Get in touch

    When Do NZ Government Agencies Need to Comply with this Framework?
    All New Zealand government agencies are expected to fully implement the Secure Government Email (SGE) Common Implementation Framework by October 2025. Agencies should begin their planning and deployment now to ensure full compliance by the deadline.
    The All of Government Secure Email Common Implementation Framework v1.0
    What are the Mandated Requirements for Domains?
    Below are the exact requirements for all email-enabled domains under the new framework.
    TLS: Minimum TLS 1.2. TLS 1.1, TLS 1.0, SSL, or clear-text is not permitted.
    TLS-RPT: All email-sending domains must have TLS reporting enabled.
    SPF: Must exist and end with -all.
    DKIM: All outbound email from every sending service must be DKIM-signed at the final hop.
    DMARC: Policy of p=reject on all email-enabled domains. adkim=s is recommended when not bulk-sending.
    MTA-STS: Enabled and set to enforce.
    Implicit TLS: Must be configured and enforced for every connection.
    Data Loss Prevention: Enforce in line with the New Zealand Information Security Manual (NZISM) and Protective Security Requirements.
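
    To make these requirements concrete, here is a rough sketch of the DNS records a compliant domain might publish. The domain, selector, and report addresses are placeholders, and the DKIM public key is truncated:

        example.govt.nz.                       TXT  "v=spf1 include:_spf.mail-gateway.example -all"
        selector1._domainkey.example.govt.nz.  TXT  "v=DKIM1; k=rsa; p=MIIBIjANBg..."
        _dmarc.example.govt.nz.                TXT  "v=DMARC1; p=reject; adkim=s; rua=mailto:dmarc@example.govt.nz"
        _mta-sts.example.govt.nz.              TXT  "v=STSv1; id=20251001000000"
        _smtp._tls.example.govt.nz.            TXT  "v=TLSRPTv1; rua=mailto:tlsrpt@example.govt.nz"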
    Compliance Monitoring and Reporting
    The All of Government Service Delivery (AoGSD) team will be monitoring compliance with the framework. Monitoring will initially cover SPF, DMARC, and MTA-STS settings and will be expanded to include DKIM. Changes to these settings will be monitored, enabling reporting on email security compliance across all government agencies. Ongoing monitoring will highlight changes to domains, ensure new domains are set up with security in place, and monitor the implementation of future email security technologies. 
    Should compliance changes occur, such as an agency’s SPF record being changed from -all to ~all, this will be captured so that the AoGSD Security Team can investigate. They will then communicate directly with the agency to determine if an issue exists or if an error has occurred, reviewing each case individually.
    Deployment Checklist for NZ Government Compliance

    Enforce TLS 1.2 minimum, implicit TLS, MTA-STS & TLS-RPT
    SPF with -all
    DKIM on all outbound email
    DMARC p=reject 
    adkim=s where suitable
    For non-email/parked domains: SPF -all, empty DKIM, DMARC reject strict
    Compliance dashboard
    Inbound DMARC evaluation enforced
    DLP aligned with NZISM
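
    As an informal spot-check of the DNS-visible items on this checklist, a minimal Python sketch using the dnspython library (an assumed dependency; the domain is a placeholder, and this is not an official compliance tool) might look like this:

        # check_sge.py - rough spot-check of SGE DNS controls for one domain.
        # Requires dnspython (pip install dnspython).
        import dns.resolver

        def txt_records(name: str) -> list[str]:
            """Return all TXT strings published at `name`, or [] if none exist."""
            try:
                answers = dns.resolver.resolve(name, "TXT")
            except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
                return []
            return [b"".join(r.strings).decode() for r in answers]

        def check_domain(domain: str) -> dict[str, bool]:
            spf = [t for t in txt_records(domain) if t.startswith("v=spf1")]
            dmarc = txt_records(f"_dmarc.{domain}")
            return {
                "SPF present, hard fail (-all)": any(t.rstrip().endswith("-all") for t in spf),
                "DMARC p=reject": any("p=reject" in t for t in dmarc),
                "MTA-STS record": any(t.startswith("v=STSv1") for t in txt_records(f"_mta-sts.{domain}")),
                "TLS-RPT record": any(t.startswith("v=TLSRPTv1") for t in txt_records(f"_smtp._tls.{domain}")),
            }

        if __name__ == "__main__":
            for control, ok in check_domain("example.govt.nz").items():
                print(("PASS" if ok else "FAIL"), control)

    Checks like these only cover the published DNS side; DKIM signing, implicit TLS, and DLP enforcement have to be verified on the mail infrastructure itself.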

    Start a Free Trial

    How EasyDMARC Can Help Government Agencies Comply
    EasyDMARC provides a comprehensive email security solution that simplifies the deployment and ongoing management of DNS-based email security protocols like SPF, DKIM, and DMARC with reporting. Our platform offers automated checks, real-time monitoring, and a guided setup to help government organizations quickly reach compliance.
    1. TLS-RPT / MTA-STS audit
    EasyDMARC lets you enable the Managed MTA-STS and TLS-RPT option with a single click. We provide the required DNS records and continuously monitor them for issues, delivering reports on TLS negotiation problems. This helps agencies ensure secure email transmission and quickly detect delivery or encryption failures.

    Note: In this screenshot, you can see how to deploy MTA-STS and TLS Reporting by adding just three CNAME records provided by EasyDMARC. It’s recommended to start in “testing” mode, evaluate the TLS-RPT reports, and then gradually switch your MTA-STS policy to “enforce”. The process is simple and takes just a few clicks.

    As shown above, EasyDMARC parses incoming TLS reports into a centralized dashboard, giving you clear visibility into delivery and encryption issues across all sending sources.
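
    For reference, the MTA-STS policy itself is a small text file served over HTTPS at https://mta-sts.<your-domain>/.well-known/mta-sts.txt. A sketch of a testing-mode policy for a hypothetical domain (the MX host is a placeholder) could look like this:

        version: STSv1
        mode: testing
        mx: mail.example.govt.nz
        max_age: 86400

    Once the TLS-RPT reports look clean, you would change mode to enforce, raise max_age, and update the id value in the _mta-sts TXT record so receiving servers fetch the new policy.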
    2. SPF with “-all”
    In the EasyDMARC platform, you can run the SPF Record Generator to create a compliant record. Publish your v=spf1 record with “-all” to enforce a hard fail for unauthorized senders and prevent spoofed emails from passing SPF checks. This strengthens your domain’s protection against impersonation.

    Note: It is highly recommended to start adjusting your SPF record only after you begin receiving DMARC reports and identifying your legitimate email sources. As we’ll explain in more detail below, both SPF and DKIM should be adjusted after you gain visibility through reports.
    Making changes without proper visibility can lead to false positives, misconfigurations, and potential loss of legitimate emails. That’s why the first step should always be setting DMARC to p=none, receiving reports, analyzing them, and then gradually fixing any SPF or DKIM issues.
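
    As an illustration, a finished record for a domain that sends only through Google Workspace and Microsoft 365 might look like the following; the include mechanisms are those providers' published SPF domains, and your own record must list your actual sources:

        v=spf1 include:_spf.google.com include:spf.protection.outlook.com -all

    The -all at the end tells receivers to hard-fail any sender not covered by the listed mechanisms, which is exactly what the SGE Framework requires.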
    3. DKIM on all outbound email
    DKIM must be configured for all email sources sending emails on behalf of your domain. This is critical, as DKIM plays a bigger role than SPF when it comes to building domain reputation, surviving auto-forwarding, mailing lists, and other edge cases.
    As mentioned above, DMARC reports provide visibility into your email sources, allowing you to implement DKIM accordingly. If you’re using third-party services like Google Workspace, Microsoft 365, or Mimecast, you’ll need to retrieve the public DKIM key from your provider’s admin interface.
    EasyDMARC maintains a backend directory of over 1,400 email sources. We also give you detailed guidance on how to configure SPF and DKIM correctly for major ESPs. 
    Note: At the end of this article, you’ll find configuration links for well-known ESPs like Google Workspace, Microsoft 365, Zoho Mail, Amazon SES, and SendGrid – helping you avoid common misconfigurations and get aligned with SGE requirements.
    If you’re using a dedicated MTA, DKIM must be implemented manually. EasyDMARC’s DKIM Record Generator lets you generate both public and private keys for your server. The private key is stored on your MTA, while the public key must be published in your DNS.
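
    A minimal sketch of that manual flow with OpenSSL (the selector, file names, and domain are placeholders) might be:

        # Generate a 2048-bit RSA key pair for DKIM signing
        openssl genrsa -out dkim-private.pem 2048
        # Extract the public key as single-line base64 for the DNS record
        openssl rsa -in dkim-private.pem -pubout -outform der | openssl base64 -A

    The private key stays in your MTA's signing configuration, and the base64 output becomes the p= value of a TXT record such as:

        selector1._domainkey.example.govt.nz.  TXT  "v=DKIM1; k=rsa; p=<base64 public key>"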

    4. DMARC p=reject rollout
    As mentioned in previous points, DMARC reporting is the first and most important step on your DMARC enforcement journey. Always start with a p=none policy and configure RUA reports to be sent to EasyDMARC. Use the report insights to identify and fix SPF and DKIM alignment issues, then gradually move to p=quarantine and finally p=reject once all legitimate email sources have been authenticated. 
    This phased approach ensures full protection against domain spoofing without risking legitimate email delivery.
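
    A hypothetical progression for the _dmarc TXT record could look like this; the report address is a placeholder, and the pct tag in the middle stage limits the policy to a fraction of failing mail while you verify alignment:

        Stage 1:  v=DMARC1; p=none; rua=mailto:dmarc@example.govt.nz
        Stage 2:  v=DMARC1; p=quarantine; pct=25; rua=mailto:dmarc@example.govt.nz
        Stage 3:  v=DMARC1; p=reject; rua=mailto:dmarc@example.govt.nz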

    5. adkim Strict Alignment Check
    This strict alignment check is not always applicable, especially if you’re using third-party bulk ESPs, such as SendGrid, that require you to set DKIM at the subdomain level. You can set adkim=s in your DMARC TXT record, or simply enable strict mode in EasyDMARC’s Managed DMARC settings. This ensures that only emails with a DKIM signature that exactly matches your domain pass alignment, adding an extra layer of protection against domain spoofing. But only do this if you are NOT a bulk sender.

    6. Securing Non-Email Enabled Domains
    The purpose of deploying email security to non-email-enabled domains, or parked domains, is to prevent messages from being spoofed from those domains. This requirement remains even if the root-level domain has sp=reject set within its DMARC record.
    Under this new framework, you must bulk import and mark parked domains as “Parked.” Crucially, this requires adjusting SPF to an empty record, setting DMARC to p=reject, and ensuring an empty DKIM record is in place:

    SPF record: “v=spf1 -all”
    Wildcard DKIM record with an empty public key
    DMARC record: “v=DMARC1;p=reject;adkim=s;aspf=s;rua=mailto:…”
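
    Put together, the published records for a hypothetical parked domain would look roughly like this (the empty p= tag in the wildcard DKIM record revokes any key for every selector, and the rua address is left elided as above):

        parked.example.govt.nz.               TXT  "v=spf1 -all"
        *._domainkey.parked.example.govt.nz.  TXT  "v=DKIM1; p="
        _dmarc.parked.example.govt.nz.        TXT  "v=DMARC1;p=reject;adkim=s;aspf=s;rua=mailto:…"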
    EasyDMARC allows you to add and label parked domains for free. This is important because it helps you monitor any activity from these domains and ensure they remain protected with a strict DMARC policy of p=reject.
    7. Compliance Dashboard
    Use EasyDMARC’s Domain Scanner to assess the security posture of each domain with a clear compliance score and risk level. The dashboard highlights configuration gaps and guides remediation steps, helping government agencies stay on track toward full compliance with the SGE Framework.

    8. Inbound DMARC Evaluation Enforced
    You don’t need to apply any changes if you’re using Google Workspace, Microsoft 365, or other major mailbox providers. Most of them already enforce DMARC evaluation on incoming emails.
    However, some legacy Microsoft 365 setups may still quarantine emails that fail DMARC checks, even when the sending domain has a p=reject policy, instead of rejecting them. This behavior can be adjusted directly from your Microsoft Defender portal. Read more about this in our step-by-step guide on how to set up SPF, DKIM, and DMARC from Microsoft Defender.
    If you’re using a third-party mail provider that doesn’t enforce having a DMARC policy for incoming emails, which is rare, you’ll need to contact their support to request a configuration change.
    9. Data Loss Prevention Aligned with NZISM
    The New Zealand Information Security Manual (NZISM) is the New Zealand Government’s manual on information assurance and information systems security. It includes guidance on data loss prevention, which must be followed to align with the SGE Framework.
    Need Help Setting up SPF and DKIM for your Email Provider?
    Setting up SPF and DKIM for different ESPs often requires specific configurations. Some providers require you to publish SPF and DKIM on a subdomain, while others only require DKIM, or have different formatting rules. We’ve simplified all these steps to help you avoid misconfigurations that could delay your DMARC enforcement, or worse, block legitimate emails from reaching your recipients.
    Below you’ll find comprehensive setup guides for Google Workspace, Microsoft 365, Zoho Mail, Amazon SES, and SendGrid. You can also explore our full blog section that covers setup instructions for many other well-known ESPs.
    Remember, all this information is reflected in your DMARC aggregate reports. These reports give you live visibility into your outgoing email ecosystem, helping you analyze and fix any issues specific to a given provider.
    Here are our step-by-step guides for the most common platforms:

    Google Workspace

    Microsoft 365

    These guides will help ensure your DNS records are configured correctly as part of the Secure Government Email (SGE) Framework rollout.
    Meet New Government Email Security Standards With EasyDMARC
    New Zealand’s SGE Framework sets a clear path for government agencies to enhance their email security by October 2025. With EasyDMARC, you can meet these technical requirements efficiently and with confidence. From protocol setup to continuous monitoring and compliance tracking, EasyDMARC streamlines the entire process, ensuring strong protection against spoofing, phishing, and data loss while simplifying your transition from SEEMail.
  • Casa Morena by Mário Martins Atelier: Architectural Dialogue with Nature

    Casa Morena | © Fernando Guerra / FG+SG
    In the coastal enclave of Lagos, Portugal, Mário Martins Atelier has crafted Casa Morena. This residence quietly asserts itself as an ode to the dialogue between architecture and its natural setting. Completed in 2024, this project demonstrates a considered response to its environment, where the interplay of light, material, and landscape defines a sense of place rather than architectural imposition.

    Casa Morena Technical Information

    Architects: Mário Martins Atelier
    Location: Lagos, Portugal
    Project Year: 2024
    Photographs: © Fernando Guerra / FG+SG

    A simple house, one that wishes to be discreet and to be influenced by its location, to become a house that is pleasant with thoughtful landscaping.
    – Mário Martins Atelier

    Casa Morena Photographs

    All photographs © Fernando Guerra / FG+SG
    A Contextual Response to Landscape and Light
    The design of Casa Morena finds its genesis in the site itself, a pine-scented plot overlooking the expanse of a bay. The pine trees, longstanding witnesses to the landscape’s evolution, provide the project’s visual anchor and spatial logic. In a move that both respects and celebrates these natural elements, Mário Martins Atelier structured the house’s reticulated plan to echo the presence of the trees, creating a composition that unfolds as a series of volumes harmonizing with the vertical rhythm of the trunks.
    The solid base of the house, built from locally sourced schist, emerges directly from the terrain. These robust walls establish a tactile continuity with the ground, their rough textures anchoring the architecture within the landscape. In contrast, the upper volumes of the house adopt a distinctly lighter expression: horizontal planes rendered in white plaster, their smooth surfaces catching and refracting the region’s luminous sun. This duality of earthbound solidity and aerial lightness establishes an architectural narrative rooted in the elemental.
    Casa Morena Experiential Flow
    Casa Morena’s spatial arrangement articulates a clear hierarchy of public and private domains. On the ground floor, the house embraces openness and transparency. An expansive entrance hall blurs the threshold between inside and out, guiding inhabitants and visitors into a luminous social heart. The lounge, kitchen, and office flow seamlessly into the garden, unified by a continuous glazed façade that invites the outside in.
    This deliberate porosity extends to a covered terrace, an intermediary space that dissolves the boundary between shelter and exposure. The terrace, framed by the garden’s green canopy and the swimming pool’s long line, becomes a place of repose and contemplation. The pool itself demarcates the transition from a cultivated garden to the looser, more rugged landscape beyond, its linear form echoing the horizon’s expanse.
    Ascending to the upper floor, the architectural language shifts towards intimacy. The bedrooms, each with direct access to terraces and patios, create secluded zones that still maintain a fluid relationship with the outdoors. A discreet rooftop terrace, accessible from these private quarters, offers a hidden sanctuary where the interplay of views and light remains uninterrupted.
    Material Tectonics and Environmental Strategy
    Casa Morena’s material palette is rooted in regional specificity and tactile sensibility. Schist, extracted from the site, is not merely a structural element but a narrative thread linking the building to its geological past. Its earthy warmth and rugged surface provide a counterpoint to the luminous white of the upper volumes, an articulation of contrast that enlivens the building’s silhouette.
    White, the chromatic signature of the Algarve region, is employed with restraint and nuance. Its reflective qualities intensify the play of shadow and light, a dynamic that shifts with the passing of the day. In this interplay, architecture becomes an instrument for registering the ephemeral, and the environment itself becomes a participant in the spatial drama.
    Environmental stewardship is also woven into the project’s DNA. Discreetly integrated systems on the roof harness solar energy and manage water resources, extending the house’s commitment to a sustainable coexistence with its setting.
    Casa Morena Plans

    Basement | © Mário Martins Atelier

    Ground Level | © Mário Martins Atelier

    Upper Level | © Mário Martins Atelier

    Roof Plan | © Mário Martins Atelier

    Elevations | © Mário Martins Atelier
    Casa Morena Image Gallery

    About Mário Martins Atelier
    Mário Martins Atelier is an architectural studio based in Lagos and Lisbon, Portugal, led by Mário Martins. The practice is known for its context-sensitive approach, crafting contemporary projects that integrate seamlessly with their surroundings while prioritizing regional materials and environmental considerations.
    Credits and Additional Notes

    Lead Architect: Mário Martins, arq.
    Project Team: Nuno Colaço, Sónia Fialho, Susana Jóia, Mariana Franco, Ana Graça
    Engineering: Nuno Grave Engenharia
    Landscape: HB-Hipolito Bettencourt – Arquitectura Paisagista, Lda.
    Building Contractor: Marques Antunes Engenharia Lda.
  • Over 269,000 Websites Infected with JSFireTruck JavaScript Malware in One Month

    Jun 13, 2025Ravie LakshmananWeb Security / Network Security

    Cybersecurity researchers are calling attention to a "large-scale campaign" that has been observed compromising legitimate websites with malicious JavaScript injections.
    According to Palo Alto Networks Unit 42, these malicious injections are obfuscated using JSFuck, which refers to an "esoteric and educational programming style" that uses only a limited set of characters to write and execute code.
    The cybersecurity company has given the technique the alternate name JSFireTruck, owing to the profanity involved.
    "Multiple websites have been identified with injected malicious JavaScript that uses JSFireTruck obfuscation, which is composed primarily of the symbols, +, {, and }," security researchers Hardik Shah, Brad Duncan, and Pranay Kumar Chhaparwal said. "The code's obfuscation hides its true purpose, hindering analysis."

    Further analysis has determined that the injected code is designed to check the website referrer ("document.referrer"), which identifies the address of the web page from which a request originated.
    Should the referrer be a search engine such as Google, Bing, DuckDuckGo, Yahoo!, or AOL, the JavaScript code redirects victims to malicious URLs that can deliver malware, exploits, traffic monetization, and malvertising.
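    To make the mechanism concrete, the gating logic described above can be illustrated with a short sketch (this is a schematic reconstruction of the reported behavior, not the actual injected code; the redirect URL is a placeholder):

    // Schematic reconstruction of referrer-based gating -- not the actual malware.
    const referrer = document.referrer.toLowerCase();
    const searchEngines = ["google.", "bing.", "duckduckgo.", "yahoo.", "aol."];
    if (searchEngines.some((engine) => referrer.includes(engine))) {
      // Only visitors arriving from a search engine get redirected.
      window.location.href = "https://redirect.example/landing"; // placeholder URL
    }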

    Unit 42 said its telemetry uncovered 269,552 web pages that have been infected with JavaScript code using the JSFireTruck technique between March 26 and April 25, 2025. A spike in the campaign was first recorded on April 12, when over 50,000 infected web pages were observed in a single day.
    "The campaign's scale and stealth pose a significant threat," the researchers said. "The widespread nature of these infections suggests a coordinated effort to compromise legitimate websites as attack vectors for further malicious activities."
    Say Hello to HelloTDS
    The development comes as Gen Digital took the wraps off a sophisticated Traffic Distribution Service (TDS) called HelloTDS that's designed to conditionally redirect site visitors to fake CAPTCHA pages, tech support scams, fake browser updates, unwanted browser extensions, and cryptocurrency scams through remotely hosted JavaScript code injected into the sites.
    The primary objective of the TDS is to act as a gateway, determining the exact nature of content to be delivered to the victims after fingerprinting their devices. If the user is not deemed a suitable target, the victim is redirected to a benign web page.

    "The campaign entry points are infected or otherwise attacker-controlled streaming websites, file sharing services, as well as malvertising campaigns," researchers Vojtěch Krejsa and Milan Špinka said in a report published this month.
    "Victims are evaluated based on geolocation, IP address, and browser fingerprinting; for example, connections through VPNs or headless browsers are detected and rejected."
    Some of these attack chains have been found to serve bogus CAPTCHA pages that leverage the ClickFix strategy to trick users into running malicious code and infecting their machines with a malware known as PEAKLIGHT (aka Emmenhtal Loader), which is known to serve information stealers like Lumma.

    Central to the HelloTDS infrastructure are .top, .shop, and .com top-level domains that host the JavaScript code and trigger the redirections following a multi-stage fingerprinting process engineered to collect network and browser information.
    "The HelloTDS infrastructure behind fake CAPTCHA campaigns demonstrates how attackers continue to refine their methods to bypass traditional protections, evade detection, and selectively target victims," the researchers said.
    "By leveraging sophisticated fingerprinting, dynamic domain infrastructure, and deception tacticsthese campaigns achieve both stealth and scale."
