• Ankur Kothari Q&A: Customer Engagement Book Interview

    Reading Time: 9 minutes
    In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns.
    But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question, we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic.
    This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die,” explores how businesses can translate raw data into actionable insights that drive real results.
    Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.

     
    Ankur Kothari Q&A Interview
    1. What types of customer engagement data are most valuable for making strategic business decisions?
    Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interactions, purchase history, and app usage patterns.
    Second would be demographic information: age, location, income, and other relevant personal characteristics.
    Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews.
    Fourth would be the customer journey data.

    We track customer touchpoints across various channels to understand the journey path and conversion. Combining these four primary sources helps us understand the engagement data.

    2. How do you distinguish between data that is actionable versus data that is just noise?
    First is keeping it relevant to your business objectives: actionable data directly relates to your specific goals or KPIs. Then we take help from statistical significance.
    Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in.

    You also want to make sure that there is consistency across sources.
    Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory.
    Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy.

    By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions.
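
    To make the signal-versus-noise test concrete, here is a minimal sketch in Python of the statistical-significance check described above; the conversion counts and the 0.05 threshold are illustrative assumptions, not figures from the interview.

```python
"""Minimal sketch: flag a metric change as 'actionable' only if it is
statistically significant. Uses a two-proportion z-test on conversion
counts; the numbers below are hypothetical."""
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the two-sided p-value for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Example: last week's email campaign vs. the previous baseline.
p_value = two_proportion_z(conv_a=540, n_a=12_000, conv_b=480, n_b=12_000)
if p_value < 0.05:
    print(f"Actionable: lift is statistically significant (p={p_value:.3f})")
else:
    print(f"Likely noise: difference is not significant (p={p_value:.3f})")
```

    The same gate can be applied to any engagement metric before it is allowed to drive a decision, which is one practical way to filter random fluctuations out of the reporting.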

    3. How can customer engagement data be used to identify and prioritize new business opportunities?
    First, it helps us to uncover unmet needs.

    By analyzing customer feedback, touchpoints, support interactions, or usage patterns, we can identify gaps in our current offerings or areas where customers are experiencing pain points.

    Second would be identifying emerging needs.
    Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing the company to adapt its products or services accordingly.
    Third would be segmentation analysis.
    Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies.
    Last is to build competitive differentiation.

    Engagement data can highlight where our company outperforms competitors, helping us prioritize opportunities that leverage existing strengths and unique selling propositions.
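
    As one way to picture the segmentation-analysis point above, the sketch below groups customers by segment and flags segments with high engagement but low product adoption; the segment labels, fields, and thresholds are hypothetical, not data from the interview.

```python
"""Minimal sketch: surface underserved segments from engagement data.
Segments, metrics, and thresholds are illustrative assumptions."""
from collections import defaultdict

# (segment, engaged_with_mobile_payments, holds_credit_card)
customers = [
    ("millennial", True, False), ("millennial", True, False),
    ("millennial", True, True),  ("gen_x", False, True),
    ("gen_x", True, True),       ("boomer", False, True),
]

stats = defaultdict(lambda: {"n": 0, "engaged": 0, "adopted": 0})
for segment, engaged, adopted in customers:
    s = stats[segment]
    s["n"] += 1
    s["engaged"] += engaged
    s["adopted"] += adopted

for segment, s in stats.items():
    engagement_rate = s["engaged"] / s["n"]
    adoption_rate = s["adopted"] / s["n"]
    # High engagement but low product adoption suggests an untapped opportunity.
    if engagement_rate > 0.6 and adoption_rate < 0.5:
        print(f"Underserved segment: {segment} "
              f"(engagement {engagement_rate:.0%}, adoption {adoption_rate:.0%})")
```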

    4. Can you share an example of where data insights directly influenced a critical decision?
    I will share an example from my previous organization, a financial services company that was very data-driven, where data made a major impact on a critical decision regarding our credit card offerings.
    We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms.
    That insight led us to develop and launch our first digital credit card product with enhanced mobile features and rewards tailored to millennial spending habits. Since we also had access to a lot of transactional data, we were able to build a financial product that met that specific segment’s needs.

    That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial.

    5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time?
    When it comes to using engagement data in real time, we do quite a few things. Over the past two or three years, we have been using it for dynamic content personalization, adjusting website content, email messaging, or ad creative based on real-time user behavior and preferences.
    We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments.
    We also build responsive social media engagement, monitoring social media sentiment and trending topics to quickly adapt messaging and create timely, relevant content.

    Alongside one-on-one personalization, we do a lot of A/B testing as part of an overall rapid-testing approach, experimenting with elements like subject lines and CTAs and building on the successful variants of campaigns.
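
    The automated budget reallocation mentioned above can be pictured with a small sketch like the following; the channel names, spend figures, and the proportional rule are illustrative assumptions rather than the AI-driven tools described in the answer.

```python
"""Minimal sketch: shift spend toward top-performing channels based on
conversions per dollar. All numbers and channel names are hypothetical."""

channels = {
    "paid_search": {"spend": 10_000, "conversions": 220},
    "social":      {"spend": 10_000, "conversions": 140},
    "email":       {"spend": 5_000,  "conversions": 180},
}

total_budget = sum(c["spend"] for c in channels.values())

# Score each channel by conversions per dollar, then split the next
# budget cycle proportionally to that score.
scores = {name: c["conversions"] / c["spend"] for name, c in channels.items()}
total_score = sum(scores.values())

next_budget = {name: round(total_budget * score / total_score, 2)
               for name, score in scores.items()}

for name, amount in next_budget.items():
    print(f"{name}: reallocate ${amount:,.2f}")
```

    In practice such a rule would run continuously against live performance metrics, but the core idea is the same: spend follows measured engagement rather than a fixed plan.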

    6. How are you doing the 1:1 personalization?
    We have advanced CDP systems, and we track each customer’s behavior in real time. The moment they move to a different channel, we know the context, the relevance, and the recent interaction points, so we can serve the right offer.
    So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer.
    That gives our customer or potential customer more one-to-one personalization instead of a segment-based or bulk-interaction kind of experience.

    We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers.
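
    As a rough sketch of the cross-channel context lookup a CDP enables, the example below keeps a per-customer event stream and picks the most recently viewed offer so the next touchpoint can continue the same conversation; the event schema and offer rule are hypothetical and far simpler than a production CDP.

```python
"""Minimal sketch: unify recent events per customer across channels so the
next interaction sees the latest context. Fields and rules are illustrative."""
from datetime import datetime, timedelta

events = [
    {"customer_id": "c123", "channel": "web", "action": "viewed_offer",
     "offer": "travel_card", "ts": datetime(2025, 6, 3, 14, 5)},
    {"customer_id": "c123", "channel": "search_ad", "action": "clicked",
     "offer": None, "ts": datetime(2025, 6, 3, 13, 58)},
]

def recent_context(customer_id, now, window=timedelta(days=7)):
    """Return this customer's events from the last `window`, newest first."""
    recent = [e for e in events
              if e["customer_id"] == customer_id and now - e["ts"] <= window]
    return sorted(recent, key=lambda e: e["ts"], reverse=True)

def next_best_offer(customer_id, now):
    """Pick the most recently viewed offer, if any, so an in-branch agent
    (or any other channel) can pick up where the website left off."""
    for event in recent_context(customer_id, now):
        if event["action"] == "viewed_offer" and event["offer"]:
            return event["offer"]
    return "default_offer"

print(next_best_offer("c123", now=datetime(2025, 6, 4, 10, 0)))
```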

    7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service?
    Primarily with product development: by products I mean not just the financial products, or whatever an organization sells, but also the mobile apps and websites customers use for transactions. That kind of product development gets improved.
    The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments.

    Customer service also benefits: by anticipating common issues, personalizing support interactions over phone, email, or chat, and proactively addressing potential problems, we improve customer satisfaction and retention.

    So in general, the cross-functional application of engagement data improves the customer-centric approach throughout the organization.

    8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights?
    The first challenge is the huge amount of data we are dealing with. As we become more digitally savvy and most customers move to digital channels, we collect a lot of data, and that sheer volume can be overwhelming, making it very difficult to identify truly meaningful patterns and insights.

    Because of that data overload, we create data silos in the process: information often exists in separate systems across different departments, and we are not able to build a holistic view of customer engagement.

    Because of data silos and data overload, data quality issues appear: inconsistent and inaccurate data can lead to incorrect insights and poor decision-making. Quality issues can also stem from data being in the wrong format, or from data that is stale and no longer relevant.
    As we grow and add more people to help us understand customer engagement, I’ve also noticed that technical folks, especially data scientists and data analysts, often lack the skills to properly interpret the data or apply data insights effectively.
    So there’s a lack of understanding of marketing and sales as domains.
    It’s a huge effort and can take a lot of investment.

    Not being able to calculate the ROI of your overall investment is a big challenge that many organizations are facing.

    9. Why do you think the analysts don’t have the business acumen to properly do more than analyze the data?
    If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. Stopping that at step one, by not bringing noise into the data system in the first place, cannot be done by technical folks alone, or by people who do not have business knowledge.
    Business people do not know everything about what data is being collected from which source and what data they need. It’s a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don’t have a lot of exposure to that side.

    Similarly, marketing business people do not have much exposure to the technical side — what’s possible to do with data, how much effort it takes, what’s relevant versus not relevant, and how to prioritize which data sources will be most important.

    10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before?
    First, cross-functional training: helping different roles understand why we’re doing this and what the business goals are, and giving technical people exposure to what marketing and sales teams do.
    And giving business folks exposure to the technology side through training on different tools, strategies, and the roadmap of data integrations.
    The second is helping teams work more collaboratively. So it’s not like the technology team works in a silo and comes back when their work is done, and then marketing and sales teams act upon it.

    Now we’re making it more like one team. You work together so that you can complement each other, and we have a better strategy from day one.

    11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations?
    We present clear business cases where we demonstrate how data-driven recommendations can directly align with business objectives and potential ROI.
    We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and the implications for business goals.

    We also do a lot of POCs and pilot projects with small-scale implementations to showcase tangible results and build confidence in the data-driven approach throughout the organization.

    12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data?
    I’ve found that Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touch points.
    Having advanced analytics platforms — tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights — is a great value to us.
    We always use, or many organizations use, marketing automation systems to improve marketing team productivity, helping us track and analyze customer interactions across multiple channels.
    Another is social media listening tools, for tracking wherever your brand is mentioned, measuring customer sentiment on social media, and tracking the engagement of your campaigns across social platforms.

    Last is web analytics tools, which provide detailed insights into your website visitors’ behaviors and engagement metrics across browsers, devices, and mobile apps.

    13. How do you ensure data quality and consistency across multiple channels to make these informed decisions?
    We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. Then we use data integration platforms — tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies.
    As we collect data from different sources, we clean it so it becomes cleaner with every stage of processing.
    We also conduct regular data audits — performing periodic checks to identify and rectify data quality issues, ensuring accuracy and reliability of information. We also deploy standardized data formats.

    On top of that, we have various automated data cleansing tools, specific software to detect and correct data errors, redundancies, duplicates, and inconsistencies in data sets automatically.
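
    A minimal sketch of the standardization and deduplication steps described in this answer might look like the following; the field names and the matching rule (normalized email address) are illustrative assumptions, not a specific vendor’s cleansing tool.

```python
"""Minimal sketch: standardize formats and remove duplicate customer records
collected from different sources. All records and rules are hypothetical."""

records = [
    {"email": "Ana@Example.com ", "phone": "(555) 010-7788", "source": "web"},
    {"email": "ana@example.com",  "phone": "5550107788",     "source": "store"},
    {"email": "bo@example.com",   "phone": "555-010-9911",   "source": "app"},
]

def standardize(record):
    """Apply consistent formats before comparing records."""
    return {
        "email": record["email"].strip().lower(),
        "phone": "".join(ch for ch in record["phone"] if ch.isdigit()),
        "source": record["source"],
    }

deduped = {}
for record in map(standardize, records):
    # Keep the first record seen per normalized email; a real pipeline would
    # merge fields and keep lineage instead of discarding the duplicate.
    deduped.setdefault(record["email"], record)

for row in deduped.values():
    print(row)
```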

    14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years?
    The first thing, which has been the biggest trend over the past two years, is AI-driven decision-making. I think it will become more prevalent, with advanced algorithms processing vast amounts of engagement data in real time to inform strategic choices.
    Somewhat related to this is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with more accuracy and better predictive capabilities.
    We also touched upon hyper-personalization. We are all striving toward hyper-personalization at scale, meaning true one-on-one personalization, as we capture more engagement data and build bigger systems and infrastructure to process those large volumes of data and support those use cases.
    As the world is collecting more data, privacy concerns and regulations come into play.
    I believe in the next few years there will be more innovation toward how businesses can collect data ethically and what the usage practices are, leading to more transparent and consent-based engagement data strategies.
    And lastly, I think about the integration of engagement data, which is always a big challenge. I believe as we’re solving those integration challenges, we are adding more and more complex data sources to the picture.

    So I think there will need to be more innovation or sophistication brought into data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.

     
    This interview Q&A was hosted with Ankur Kothari, a former Martech executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die.
    Download the PDF or request a physical copy of the book here.
    The post Ankur Kothari Q&A: Customer Engagement Book Interview appeared first on MoEngage.
  • Endangered classic Mac plastic color returns as 3D-printer filament

    The color of nostalgia

    Endangered classic Mac plastic color returns as 3D-printer filament

    Mac fan paid $900 to color-match iconic Apple beige-gray "Platinum" plastic for everyone.

    Benj Edwards – Jun 4, 2025 6:13 pm

    The Mac SE, released in 1987, was one of many classic Macs to use the "Platinum" color scheme. Credit: Apple / Polar Filament



    On Tuesday, classic computer collector Joe Strosnider announced the availability of a new 3D-printer filament that replicates the iconic "Platinum" color scheme used in classic Macintosh computers from the late 1980s through the 1990s. The PLA filament (PLA is short for polylactic acid) allows hobbyists to 3D-print nostalgic novelties, replacement parts, and accessories that match the original color of vintage Apple computers.
    Hobbyists commonly feed this type of filament into commercial desktop 3D printers, which heat the plastic and extrude it in a computer-controlled way to fabricate new plastic parts.
    The Platinum color, which Apple used in its desktop and portable computer lines starting with the Apple IIgs in 1986, has become synonymous with a distinctive era of classic Macintosh aesthetic. Over time, original Macintosh plastics have become brittle and discolored with age, so matching the "original" color can be a somewhat challenging and subjective experience.

    A close-up of "Retro Platinum" PLA filament by Polar Filament. Credit: Polar Filament

    Strosnider, who runs a website about his extensive vintage computer collection in Ohio, worked for years to color-match the distinctive beige-gray hue of the Macintosh Platinum scheme, resulting in a spool of hobby-ready plastic made by Polar Filament and priced at $21.99 per kilogram.
    According to a forum post, Strosnider paid approximately $900 to develop the color and purchase an initial 25-kilogram supply of the filament. Rather than keeping the formulation proprietary, he arranged for Polar Filament to make the color publicly available.
    "I paid them a fee to color match the speaker box from inside my Mac Color Classic," Strosnider wrote in a Tinkerdifferent forum post on Tuesday. "In exchange, I asked them to release the color to the public so anyone can use it."

    A spool of "Retro Platinum" PLA filament by Polar Filament. Credit: Polar Filament

    The development addresses a gap in the vintage computing community, where enthusiasts sometimes struggle to find appropriately colored materials for restoration projects and new accessories. The new filament is an attempt to replace previous options that were either expensive, required international shipping, or had consistency issues that Strosnider described as "chalky."
    The 1.75 mm filament works with standard 3D printers and is compatible with automated material systems used in some newer printer models. On Bluesky, Strosnider encouraged buyers to "order plenty, and let them know you want them to print it forever" to ensure continued production of the specialty color.
    Extruded nostalgia
    The timing of the filament's release coincides with growing interest in 3D-printed cases and accessories for vintage computer hardware. One example is the SE Mini desktop case, a project by "GutBomb" that transforms Macintosh SE and SE/30 logic boards into compact desktop computers that can connect to modern displays. The case, designed to be 3D-printed in multiple pieces and assembled, represents the type of project that benefits from color-accurate filament.

    A 3D-printed "SE Mini" desktop case that allows using a vintage compact Mac board in a new enclosure. Credit: Joe Strosnider

    The SE Mini case requires approximately half a spool of filament and takes a couple of days to print on consumer 3D printers. Users can outfit the case with modern components, such as Pico PSUs and BlueSCSI storage devices, while maintaining the classic Macintosh appearance.
    Why create new "retro" devices? Because it's fun, and it's a great way to merge technology's past with the benefits of recent tech developments. Projects like the Platinum PLA filament, the SE Mini case, and the dedication of hobbyists like Strosnider ensure that appreciation for Apple's computers of yore will continue for decades.

    Benj Edwards
    Senior AI Reporter


    Benj Edwards is Ars Technica's Senior AI Reporter and founder of the site's dedicated AI beat in 2022. He's also a tech historian with almost two decades of experience. In his free time, he writes and records music, collects vintage computers, and enjoys nature. He lives in Raleigh, NC.

    3 Comments
    #endangered #classic #mac #plastic #color
    Endangered classic Mac plastic color returns as 3D-printer filament
    The color of nostalgia Endangered classic Mac plastic color returns as 3D-printer filament Mac fan paid to color-match iconic Apple beige-gray "Platinum" plastic for everyone. Benj Edwards – Jun 4, 2025 6:13 pm | 3 The Mac SE, released in 1987, was one of many classic Macs to use the "Platinum" color scheme. Credit: Apple / Polar Filament The Mac SE, released in 1987, was one of many classic Macs to use the "Platinum" color scheme. Credit: Apple / Polar Filament Story text Size Small Standard Large Width * Standard Wide Links Standard Orange * Subscribers only   Learn more On Tuesday, classic computer collector Joe Strosnider announced the availability of a new 3D-printer filament that replicates the iconic "Platinum" color scheme used in classic Macintosh computers from the late 1980s through the 1990s. The PLA filamentallows hobbyists to 3D-print nostalgic novelties, replacement parts, and accessories that match the original color of vintage Apple computers. Hobbyists commonly feed this type of filament into commercial desktop 3D printers, which heat the plastic and extrude it in a computer-controlled way to fabricate new plastic parts. The Platinum color, which Apple used in its desktop and portable computer lines starting with the Apple IIgs in 1986, has become synonymous with a distinctive era of classic Macintosh aesthetic. Over time, original Macintosh plastics have become brittle and discolored with age, so matching the "original" color can be a somewhat challenging and subjective experience. A close-up of "Retro Platinum" PLA filament by Polar Filament. Credit: Polar Filament Strosnider, who runs a website about his extensive vintage computer collection in Ohio, worked for years to color-match the distinctive beige-gray hue of the Macintosh Platinum scheme, resulting in a spool of hobby-ready plastic by Polar Filament and priced at per kilogram. According to a forum post, Strosnider paid approximately to develop the color and purchase an initial 25-kilogram supply of the filament. Rather than keeping the formulation proprietary, he arranged for Polar Filament to make the color publicly available. "I paid them a fee to color match the speaker box from inside my Mac Color Classic," Strosnider wrote in a Tinkerdifferent forum post on Tuesday. "In exchange, I asked them to release the color to the public so anyone can use it." A spool of "Retro Platinum" PLA filament by Polar Filament. Credit: Polar Filament The development addresses a gap in the vintage computing community, where enthusiasts sometimes struggle to find appropriately colored materials for restoration projects and new accessories. The new filament is an attempt to replace previous options that were either expensive, required international shipping, or had consistency issues that Strosnider described as "chalky." The 1.75 mm filament works with standard 3D printers and is compatible with automated material systems used in some newer printer models. On Bluesky, Strosnider encouraged buyers to "order plenty, and let them know you want them to print it forever" to ensure continued production of the specialty color. Extruded nostalgia The timing of the filament's release coincides with growing interest in 3D-printed cases and accessories for vintage computer hardware. One example is the SE Mini desktop case, a project by "GutBomb" that transforms Macintosh SE and SE/30 logic boards into compact desktop computers that can connect to modern displays. 
The case, designed to be 3D-printed in multiple pieces and assembled, represents the type of project that benefits from color-accurate filament. A 3D-printed "SE Mini" desktop case that allows using a vintage compact Mac board in a new enclosure. Credit: Joe Strosnider The SE Mini case requires approximately half a spool of filament and takes a couple of days to print on consumer 3D printers. Users can outfit the case with modern components, such as Pico PSUs and BlueSCSI storage devices, while maintaining the classic Macintosh appearance. Why create new "retro" devices? Because it's fun, and it's a great way to merge technology's past with the benefits of recent tech developments. Projects like the Platinum PLA filament, the SE Mini case, and the dedication of hobbyists like Strosnider ensure that appreciation for Apple's computers of yore will continue for decades. Benj Edwards Senior AI Reporter Benj Edwards Senior AI Reporter Benj Edwards is Ars Technica's Senior AI Reporter and founder of the site's dedicated AI beat in 2022. He's also a tech historian with almost two decades of experience. In his free time, he writes and records music, collects vintage computers, and enjoys nature. He lives in Raleigh, NC. 3 Comments #endangered #classic #mac #plastic #color
    ARSTECHNICA.COM
    Endangered classic Mac plastic color returns as 3D-printer filament
    The color of nostalgia Endangered classic Mac plastic color returns as 3D-printer filament Mac fan paid $900 to color-match iconic Apple beige-gray "Platinum" plastic for everyone. Benj Edwards – Jun 4, 2025 6:13 pm | 3 The Mac SE, released in 1987, was one of many classic Macs to use the "Platinum" color scheme. Credit: Apple / Polar Filament The Mac SE, released in 1987, was one of many classic Macs to use the "Platinum" color scheme. Credit: Apple / Polar Filament Story text Size Small Standard Large Width * Standard Wide Links Standard Orange * Subscribers only   Learn more On Tuesday, classic computer collector Joe Strosnider announced the availability of a new 3D-printer filament that replicates the iconic "Platinum" color scheme used in classic Macintosh computers from the late 1980s through the 1990s. The PLA filament (PLA is short for polylactic acid) allows hobbyists to 3D-print nostalgic novelties, replacement parts, and accessories that match the original color of vintage Apple computers. Hobbyists commonly feed this type of filament into commercial desktop 3D printers, which heat the plastic and extrude it in a computer-controlled way to fabricate new plastic parts. The Platinum color, which Apple used in its desktop and portable computer lines starting with the Apple IIgs in 1986, has become synonymous with a distinctive era of classic Macintosh aesthetic. Over time, original Macintosh plastics have become brittle and discolored with age, so matching the "original" color can be a somewhat challenging and subjective experience. A close-up of "Retro Platinum" PLA filament by Polar Filament. Credit: Polar Filament Strosnider, who runs a website about his extensive vintage computer collection in Ohio, worked for years to color-match the distinctive beige-gray hue of the Macintosh Platinum scheme, resulting in a spool of hobby-ready plastic by Polar Filament and priced at $21.99 per kilogram. According to a forum post, Strosnider paid approximately $900 to develop the color and purchase an initial 25-kilogram supply of the filament. Rather than keeping the formulation proprietary, he arranged for Polar Filament to make the color publicly available. "I paid them a fee to color match the speaker box from inside my Mac Color Classic," Strosnider wrote in a Tinkerdifferent forum post on Tuesday. "In exchange, I asked them to release the color to the public so anyone can use it." A spool of "Retro Platinum" PLA filament by Polar Filament. Credit: Polar Filament The development addresses a gap in the vintage computing community, where enthusiasts sometimes struggle to find appropriately colored materials for restoration projects and new accessories. The new filament is an attempt to replace previous options that were either expensive, required international shipping, or had consistency issues that Strosnider described as "chalky." The 1.75 mm filament works with standard 3D printers and is compatible with automated material systems used in some newer printer models. On Bluesky, Strosnider encouraged buyers to "order plenty, and let them know you want them to print it forever" to ensure continued production of the specialty color. Extruded nostalgia The timing of the filament's release coincides with growing interest in 3D-printed cases and accessories for vintage computer hardware. One example is the SE Mini desktop case, a project by "GutBomb" that transforms Macintosh SE and SE/30 logic boards into compact desktop computers that can connect to modern displays. 
The case, designed to be 3D-printed in multiple pieces and assembled, represents the type of project that benefits from color-accurate filament.

A 3D-printed "SE Mini" desktop case that allows using a vintage compact Mac board in a new enclosure. Credit: Joe Strosnider

The SE Mini case requires approximately half a spool of filament and takes a couple of days to print on consumer 3D printers. Users can outfit the case with modern components, such as Pico PSUs and BlueSCSI storage devices, while maintaining the classic Macintosh appearance.

Why create new "retro" devices? Because it's fun, and it's a great way to merge technology's past with the benefits of recent tech developments. Projects like the Platinum PLA filament, the SE Mini case, and the dedication of hobbyists like Strosnider ensure that appreciation for Apple's computers of yore will continue for decades.
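As a footnote on the quantities mentioned above (spools are sold by the kilogram, and the SE Mini uses about half a spool), here is a rough, illustrative conversion from filament mass to printable length, assuming a typical PLA density of about 1.24 g/cm³. The numbers are back-of-the-envelope estimates, not figures from the article.

```python
import math

def filament_length_m(mass_g: float, diameter_mm: float = 1.75, density_g_cm3: float = 1.24) -> float:
    """Approximate length (in meters) of PLA filament for a given mass.

    Assumes a uniform cylinder of the stated diameter and a typical PLA density.
    """
    radius_cm = (diameter_mm / 10.0) / 2.0
    cross_section_cm2 = math.pi * radius_cm ** 2
    length_cm = mass_g / (density_g_cm3 * cross_section_cm2)
    return length_cm / 100.0

# Half of a standard 1 kg spool of 1.75 mm PLA:
print(round(filament_length_m(500.0)), "m")  # roughly 170 m of filament
```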
  • A Deadly Disease Is Eating Away at Caribbean Corals and Wreaking Havoc on Reefs. Could Probiotics Be the Solution?

    New research suggests the probiotic McH1-7 could help stop the spread of stony coral tissue loss disease among wild corals near Fort Lauderdale, Florida

    Scientists determined the most effective method of halting the disease was covering a coral colony with a weighted plastic bag, then injecting a seawater solution that contains the probiotic. They left the colony covered for two hours to allow the probiotic bacteria to colonize the coral.
    Hunter Noren

    Probiotics can be good for human health. Now, new research suggests they might also help protect coral reefs.
    A bacterial probiotic helped slow the advance of stony coral tissue loss disease—a fast-spreading and deadly condition—among wild corals in Florida, researchers report today in a new study published in the journal Frontiers in Marine Science.
    The probiotic may be a good alternative to antibiotics like amoxicillin, which temporarily curb the spread of the disease but must be reapplied frequently. In addition, scientists fear stony coral tissue loss disease may one day become resistant to these antibiotic treatments—just as “superbugs” that infect humans are building resistance to our own drugs.
    Antibiotics are meant to kill microorganisms, but probiotics are beneficial living microbes. The idea is that a probiotic can be incorporated into corals’ natural microbiomes, ideally offering them longer-lasting protection.
    First discovered in Florida in 2014, stony coral tissue loss disease attacks the soft tissue of more than 30 different species of coral. Without treatment, the disease eventually kills the corals, and their soft tissue falls off, revealing the white calcium carbonate skeleton below. In just weeks or months, it can devastate a whole colony.
    Stony coral tissue loss disease can be spread by fish that eat coral, as well as by boaters and divers who do not disinfect their gear. The condition has since expanded its range beyond Florida to reefs throughout the Caribbean.
    Several years ago, researchers looking at the great star coral (Montastraea cavernosa) discovered a probiotic called Pseudoalteromonas sp. strain McH1-7. Laboratory tests showed McH1-7 stopped or slowed the progression of stony coral tissue loss disease in infected corals. It also helped prevent the disease from spreading to healthy corals.
    But that was in the lab. Would McH1-7 be similarly effective in the ocean? Researchers were eager to find out, so they set up an experiment on a shallow reef off the coast of Fort Lauderdale.

    Study co-author Kelly Pitts, a research technician with the Smithsonian Marine Station, applies a paste containing the probiotic directly onto the disease lesion of an infected coral.

    Hunter Noren

    Experimenting with wild corals
    For the study, the scientists focused on 40 great star coral colonies that were showing symptoms of stony coral tissue loss disease. In one experimental condition, the researchers made a paste that contained McH1-7 and applied it directly onto the disease lesions. For comparison, they also applied the same paste, minus the probiotic, to some corals.
    In another condition, they covered infected coral colonies with weighted plastic bags, then filled the bags with seawater solutions made with and without McH1-7. They left the corals covered for two hours.
    “This created a little mini-aquarium that kept the probiotics around each coral colony,” says study co-author Valerie Paul, head scientist at the Smithsonian Marine Station at Fort Pierce, Florida, in a statement.
    The scientists completed all the treatments within the first 4.5 months of the project. Then, they returned periodically to gather tissue and mucus samples from the corals to measure changes to their microbiomes. Over the next 2.5 years, they took photos from a variety of different angles, which they then used to create 3D models that could track the disease’s progression.
    In the end, the results suggest covering the corals with plastic bags filled with the probiotic seawater solution was the most effective method. More than two years post-treatment, the colonies that received the probiotic bag had lost just 7 percent of their tissue, while colonies in the control bag condition faced 35 percent tissue loss.
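The study's exact measurement pipeline isn't described here, but the arithmetic behind a tissue-loss figure like "7 percent" or "35 percent" is simple: compare the living-tissue area in a colony's later 3D model against its baseline model. Below is a minimal, hypothetical sketch (invented function name and example numbers, not the researchers' code or data):

```python
def tissue_loss_percent(baseline_area_cm2: float, current_area_cm2: float) -> float:
    """Percent of living tissue lost relative to a baseline 3D model of the colony."""
    if baseline_area_cm2 <= 0:
        raise ValueError("baseline area must be positive")
    lost = baseline_area_cm2 - current_area_cm2
    return 100.0 * lost / baseline_area_cm2

# Illustrative numbers chosen to echo the reported outcomes, not measured data:
print(tissue_loss_percent(1000.0, 930.0))  # ~7%  (probiotic bag condition)
print(tissue_loss_percent(1000.0, 650.0))  # ~35% (control bag condition)
```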

    Scientists applied a probiotic paste directly to disease lesions on some corals.

    Kelly Pitts

    The probiotic paste, by contrast, appears to have made the situation worse: The corals that had the probiotic paste applied directly to their lesions lost more tissue than those treated with the control paste, which did not contain McH1-7.
    “We do not really know what is going on with the probiotic paste treatment,” Paul tells Smithsonian magazine in an email.
    But she has a few theories. It’s possible the high concentrations of McH1-7 contributed to localized hypoxia, or low-oxygen conditions that further harmed the already stressed corals, she says. Or, the probiotic could have changed the microbiome at the lesion site in some negative way. Another possibility is that McH1-7 produces antibiotics or other substances that were harmful at high concentrations.
    Amanda Alker, a marine microbiologist at the University of Rhode Island who was not involved with the study, wonders if this finding suggests McH1-7 is beneficial at specific dosages—a question future laboratory research might be able to answer, she tells Smithsonian magazine in an email. She’s also curious to know which specific molecular components of the probiotic are responsible for the increased tissue loss when applied as a paste.
    More broadly, Alker would like to see additional experiments validating the bag treatment method, but she says this “inventive” technique seems promising.
    “Their approach is a safer solution than antibiotic treatment methods that have been deployed to combat [stony coral tissue loss disease] in the field so far,” she says. “Further, this is a practical solution that could be implemented widely because it doesn’t require highly specialized equipment and has the ability to be used with any type of microbial solution.”
    Looking ahead to save reefs
    Probiotics are likely not a silver bullet for protecting corals. For one, researchers still don’t know exactly what causes stony coral tissue loss disease, which makes it difficult to determine how or why the probiotic works, Paul says. In addition, since the disease has spread to many different parts of the Caribbean, it might be challenging to use the bag treatment technique on all affected colonies.
    “We would need to develop better methods of deploying the probiotic through time release formulations or other ways to scale up treatments,” Paul says. “Right now, having divers swim around underwater with weighted bags is not a very scalable method.”
    The researchers have also conducted similar experiments on infected corals located farther south, in the Florida Keys. However, these tests have produced mixed results, probably because of regional differences in stony coral tissue loss disease. This is another hurdle scientists will likely need to overcome if they hope to expand the use of probiotics.
    “We probably need to develop different probiotics for different coral species and different regions of the Caribbean,” Paul says.

    Researchers returned to gather samples of tissues and mucus to see how the corals' microbiomes had changed.

    Hunter Noren

    Even so, scientists are heartened by the results of the experiments conducted near Fort Lauderdale. With more research, the findings suggest probiotics could be a promising tool for combatting the disease elsewhere.
    “Coral probiotics is a challenging field, because there are hundreds of different types of bacteria that associate with corals, and there are limitless experiments that need to be performed,” Amy Apprill, a marine chemist at Woods Hole Oceanographic Institution who was not involved with the research, tells Smithsonian magazine in an email. “These researchers made a major advance with their study by demonstrating the utility of whole colony treatment as well as the specific probiotic tested.”
    Apprill adds that, while antibiotics have been widely used to control stony coral tissue loss disease, scientists haven’t conducted much research to see how these treatments are affecting the plants and creatures that live nearby.
    “Using a naturally occurring bacterium for disease treatment may result in lessened impacts to other members of the coral reef ecosystem,” she says.
    Amid rising ocean temperatures, scientists expect to find even more diseased coral colonies in the future. Warmer waters may also allow other pathogens to thrive and proliferate. Against that backdrop, Apprill adds, probiotics and the different methods of applying them will be “major allies” in the fight to save coral reefs.
    Paul is also optimistic. Through research and field studies, she’s confident researchers will be able to develop interventions that can “help corals better survive changing environments and respond better to diseases and bleaching,” she says.

  • Multicolor DLP 3D printing breakthrough enables dissolvable supports for complex freestanding structures

    Researchers at the University of Texas at Austin have developed a novel resin system for multicolor digital light processing (DLP) 3D printing that enables rapid fabrication of freestanding and non-assembly structures using dissolvable supports. The work, led by Zachariah A. Page and published in ACS Central Science, combines UV- and visible-light-responsive chemistries to produce materials with distinct solubility profiles, significantly streamlining post-processing.
    Current DLP workflows are often limited by the need for manually removed support structures, especially when fabricating components with overhangs or internal joints. These limitations constrain automation and increase production time and cost. To overcome this, the team designed wavelength-selective photopolymer resins that form either an insoluble thermoset or a readily dissolvable thermoplastic, depending on the light color used during printing.
    In practical terms, this allows supports to be printed in one material and rapidly dissolved using ethyl acetate, an environmentally friendly solvent, without affecting the primary structure. The supports dissolve in under 10 minutes at room temperature, eliminating the need for time-consuming sanding or cutting.
    Illustration comparing traditional DLP 3D printing with manual support removal (A) and the new multicolor DLP process with dissolvable supports (B). Image via University of Texas at Austin.
    The research was supported by the U.S. Army Research Office, the National Science Foundation, and the Robert A. Welch Foundation. The authors also acknowledge collaboration with MonoPrinter and Lawrence Livermore National Laboratory.
    High-resolution multimaterial printing
    The research showcases how multicolor DLP can serve as a precise multimaterial platform, achieving sub-100 μm feature resolution with layer heights as low as 50 μm. By tuning the photoinitiator and photoacid systems to respond selectively to ultraviolet (365 nm), violet (405 nm), or blue (460 nm) light, the team spatially controlled polymer network formation in a single vat. This enabled the production of complex, freestanding structures such as chainmail, hooks with unsupported overhangs, and fully enclosed joints, which traditionally require extensive post-processing or multi-step assembly.
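The article describes wavelength-selective curing only at a high level, and the printer control details are not given. Purely as a hypothetical sketch of the idea, a slicer could assign each region of a layer to an exposure color depending on whether it belongs to the part (UV-cured thermoset) or the supports (visible-light-cured, dissolvable thermoplastic). All names and the wavelength mapping below are assumptions for illustration, loosely based on the wavelengths quoted above:

```python
from dataclasses import dataclass
from enum import Enum

class Role(Enum):
    PART = "part"        # cured with UV into an insoluble thermoset
    SUPPORT = "support"  # cured with visible light into a dissolvable thermoplastic

@dataclass
class ExposurePlan:
    wavelength_nm: int
    pixels: set[tuple[int, int]]  # (x, y) projector pixels to illuminate in this pass

# Hypothetical wavelength assignment per material role (illustrative only).
WAVELENGTH_FOR_ROLE = {Role.PART: 365, Role.SUPPORT: 405}

def plan_layer(layer: dict[tuple[int, int], Role]) -> list[ExposurePlan]:
    """Split one sliced layer into per-wavelength exposure passes."""
    buckets: dict[int, set[tuple[int, int]]] = {}
    for pixel, role in layer.items():
        buckets.setdefault(WAVELENGTH_FOR_ROLE[role], set()).add(pixel)
    return [ExposurePlan(wl, px) for wl, px in sorted(buckets.items())]

# Tiny example layer: one part pixel overhanging one support pixel.
layer = {(0, 0): Role.SUPPORT, (0, 1): Role.PART}
for plan in plan_layer(layer):
    print(plan.wavelength_nm, "nm ->", sorted(plan.pixels))
```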
    The supports, printed in a visible-light-cured thermoplastic, demonstrated sufficient mechanical integrity during the build, with tensile moduli around 160–200 MPa. Yet, upon immersion in ethyl acetate, they dissolved within 10 minutes, leaving the UV-cured thermoset structure intact. Surface profilometry confirmed that including a single interface layer of the dissolvable material between the support and the final object significantly improved surface finish, lowering roughness to under 5 μm without polishing. Computed tomography scans validated geometric fidelity, with dimensional deviations from CAD files as low as 126 μm, reinforcing the method’s capability for high-precision, solvent-cleared multimaterial printing.
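For context on the roughness figure, the arithmetic-mean roughness Ra reported by a profilometer is simply the mean absolute deviation of the measured height trace from its mean line. A small sketch with made-up trace values (not the study's measurements):

```python
def mean_roughness_ra(heights_um: list[float]) -> float:
    """Arithmetic-mean roughness Ra (in µm) of a profilometry height trace."""
    mean_line = sum(heights_um) / len(heights_um)
    return sum(abs(h - mean_line) for h in heights_um) / len(heights_um)

# Made-up trace: small undulations around a nominally flat surface.
trace_um = [0.0, 2.5, -1.5, 3.0, -2.0, 1.0, -3.0, 0.5]
print(round(mean_roughness_ra(trace_um), 2), "um Ra")
```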
    Comparison of dissolvable and traditional supports in DLP 3D printing. (A) Disk printed with soluble supports using violet light, with rapid dissolution in ethyl acetate. (B) Gravimetric analysis showing selective mass loss. (C) Mechanical properties of support and structural materials. (D) Manual support removal steps. (E) Surface roughness comparison across methods. (F) High-resolution test print demonstrating feature fidelity. Image via University of Texas at Austin.
    Towards scalable automation
    This work marks a significant step toward automated vat photopolymerization workflows. By eliminating manual support removal and achieving clean surface finishes with minimal roughness, the method could benefit applications in medical devices, robotics, and consumer products.
    The authors suggest that future work may involve refining resin formulations to enhance performance and print speed, possibly incorporating new reactive diluents and opaquing agents for improved resolution.
    Examples of printed freestanding and non-assembly structures, including a retainer, hook with overhangs, interlocked chains, and revolute joints, before and after dissolvable support removal. Image via University of Texas at Austin.
    Dissolvable materials as post-processing solutions
    Dissolvable supports have been a focal point in additive manufacturing, particularly for enhancing the efficiency of post-processing. In Fused Deposition Modeling (FDM), materials like Stratasys’ SR-30 have been effectively removed using specialized cleaning agents such as Oryx Additive’s SRC1, which dissolves supports at twice the speed of traditional solutions. For resin-based printing, systems like Xioneer’s Vortex EZ employ heat and fluid agitation to streamline the removal of soluble supports. In metal additive manufacturing, innovations have led to the development of chemical processes that selectively dissolve support structures without compromising the integrity of the main part. These advancements underscore the industry’s commitment to reducing manual intervention and improving the overall efficiency of 3D printing workflows.
    Read the full article in ACS Publications.
    Subscribe to the 3D Printing Industry newsletter to keep up with the latest 3D printing news.
    You can also follow us on LinkedIn and subscribe to the 3D Printing Industry YouTube channel to access more exclusive content. At 3DPI, our mission is to deliver high-quality journalism, technical insight, and industry intelligence to professionals across the AM ecosystem. Help us shape the future of 3D printing industry news with our 2025 reader survey.
    Featured image shows: Hook geometry printed using multicolor DLP with dissolvable supports. Image via University of Texas at Austin.
  • Molecular Rebar Design patents carbon nanotube dispersions for improved additive manufacturing resins

    Molecular Rebar Design, a nanomaterials company based in Austin, Texas, has patented a new additive manufacturing (AM) composition that utilizes oxidized discrete carbon nanotubes (CNTs) with bonded dispersing agents to enhance 3D printing resins. The patent, published under US20210237509A1, outlines methods to improve resin properties for applications such as vat photopolymerization, sintering, and thermoplastic fusion.
    The inventors, Clive P. Bosnyak, Kurt W. Swogger, Steven Lowder, and Olga Ivanova, propose formulations that improve electrical conductivity, thermal stability, and mechanical strength, while overcoming dispersion challenges common with CNTs in composite materials.
    Image shows a schematic diagram of functionalized carbon nanotubes. Image via Molecular Rebar Design.
    Functionalized CNTs for additive manufacturing
    At the core of the invention is the chemical functionalization of CNTs with dispersing agents bonded to their sidewalls, enabling higher aspect ratios and more homogeneous dispersions. These dispersions integrate into UV-curable acrylates, thermoplastics, and elastomers, yielding improved green strength, sinterability, and faster cure rates.
    The patent emphasizes the benefit of using bimodal or trimodal distributions of CNT diameters (single-, double-, or multi-wall) to tune material performance. Additional fillers such as carbon black, silica, and metallic powders can also be incorporated for applications ranging from electronic encapsulation to impact-resistant parts.
    Experimental validation
    To validate the invention, the applicants oxidized carbon nanotubes using nitric acid and covalently bonded them with polyether dispersing agents such as Jeffamine M2005. These modified CNTs were incorporated into photopolymer resin formulations. In tensile testing, specimens produced with the dispersions demonstrated enhanced mechanical performance, with yield strengths exceeding 50 MPa and Young’s modulus values above 2.8 GPa.
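As a refresher on where figures like these come from, Young's modulus is the slope of the stress-strain curve in the initial linear (elastic) region of a tensile test. The sketch below uses invented data points consistent with a roughly 2.8 GPa modulus; they are not the patent's measurements:

```python
def youngs_modulus_gpa(stress_mpa: list[float], strain: list[float]) -> float:
    """Least-squares slope of stress (MPa) vs. strain over the elastic region, in GPa."""
    n = len(strain)
    mean_e = sum(strain) / n
    mean_s = sum(stress_mpa) / n
    num = sum((e - mean_e) * (s - mean_s) for e, s in zip(strain, stress_mpa))
    den = sum((e - mean_e) ** 2 for e in strain)
    return (num / den) / 1000.0  # MPa per unit strain -> GPa

# Invented elastic-region data points (strain is dimensionless, stress in MPa):
strain = [0.000, 0.005, 0.010, 0.015]
stress = [0.0, 14.0, 28.0, 42.0]
print(round(youngs_modulus_gpa(stress, strain), 2), "GPa")  # 2.8 GPa
```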
    Impact strength improved by up to 90% in certain formulations compared to control samples without CNTs. These performance gains suggest suitability for applications demanding high strength-to-weight ratios, such as aerospace, electronics, and structural components.
    Nanotube innovations in AM
    Carbon nanotubes have long been explored for additive manufacturing due to their exceptional mechanical and electrical properties. However, challenges such as poor dispersion and inconsistent aspect ratios have hindered their widespread adoption in AM processes. Recent advancements aim to overcome these barriers by integrating oxidation and dispersion techniques into scalable production methods.
    For instance, researchers at Rice University have developed a novel acid-based solvent that prevents the common “spaghetti effect” of CNTs tangling together. This innovation simplifies the processing of CNTs, potentially enabling their scale-up for industrial 3D printing applications.
    Similarly, a research team led by the University of Glasgow has created a 3D printable CNT-based plastic material capable of sensing its own structural health. This material, inspired by natural porous structures, offers enhanced toughness and strength, with potential applications in medicine, prosthetics, automotive, and aerospace design.
    Subscribe to the 3D Printing Industry newsletter to keep up with the latest 3D printing news.
    You can also follow us on LinkedIn and subscribe to the 3D Printing Industry YouTube channel to access more exclusive content. At 3DPI, our mission is to deliver high-quality journalism, technical insight, and industry intelligence to professionals across the AM ecosystem. Help us shape the future of 3D printing industry news with our 2025 reader survey.
    Feature image shows schematic diagram of functionalized carbon nanotubes. Image via Molecular Rebar Design.
    #molecular #rebar #design #patents #carbon
    Molecular Rebar Design patents carbon nanotube dispersions for improved additive manufacturing resins
    Molecular Rebar Design, a nanomaterials company based in Austin, Texas, has patented a new additive manufacturingcomposition that utilizes oxidized discrete carbon nanotubeswith bonded dispersing agents to enhance 3D printing resins. The patent, published under US20210237509A1, outlines methods to improve resin properties for applications such as vat photopolymerization, sintering, and thermoplastic fusion. The inventors, Clive P. Bosnyak, Kurt W. Swogger, Steven Lowder, and Olga Ivanova, propose formulations that improve electrical conductivity, thermal stability, and mechanical strength, while overcoming dispersion challenges common with CNTs in composite materials. Image shows a schematic diagram of functionalized carbon nanotubes. Image via Molecular Rebar Design. Functionalized CNTs for additive manufacturing At the core of the invention is the chemical functionalization of CNTs with dispersing agents bonded to their sidewalls, enabling higher aspect ratios and more homogeneous dispersions. These dispersions integrate into UV-curable acrylates, thermoplastics, and elastomers, yielding improved green strength, sinterability, and faster cure rates. The patent emphasizes the benefit of using bimodal or trimodal distributions of CNT diametersto tune material performance. Additional fillers such as carbon black, silica, and metallic powders can also be incorporated for applications ranging from electronic encapsulation to impact-resistant parts. Experimental validation To validate the invention, the applicants oxidized carbon nanotubes using nitric acid and covalently bonded them with polyether dispersing agents such as Jeffamine M2005. These modified CNTs were incorporated into photopolymer resin formulations. In tensile testing, specimens produced with the dispersions demonstrated enhanced mechanical performance, with yield strengths exceeding 50 MPa and Young’s modulus values above 2.8 GPa. Impact strength improved by up to 90% in certain formulations compared to control samples without CNTs. These performance gains suggest suitability for applications demanding high strength-to-weight ratios, such as aerospace, electronics, and structural components. Nanotube innovations in AM Carbon nanotubeshave long been explored for additive manufacturingdue to their exceptional mechanical and electrical properties. However, challenges such as poor dispersion and inconsistent aspect ratios have hindered their widespread adoption in AM processes. Recent advancements aim to overcome these barriers by integrating oxidation and dispersion techniques into scalable production methods. For instance, researchers at Rice University have developed a novel acid-based solvent that prevents the common “spaghetti effect” of CNTs tangling together. This innovation simplifies the processing of CNTs, potentially enabling their scale-up for industrial 3D printing applications. Similarly, a research team led by the University of Glasgow has created a 3D printable CNT-based plastic material capable of sensing its own structural health. This material, inspired by natural porous structures, offers enhanced toughness and strength, with potential applications in medicine, prosthetics, automotive, and aerospace design. Subscribe to the 3D Printing Industry newsletter to keep up with the latest 3D printing news. You can also follow us onLinkedIn and subscribe to the 3D Printing Industry YouTube channel to access more exclusive content. 
At 3DPI, our mission is to deliver high-quality journalism, technical insight, and industry intelligence to professionals across the AM ecosystem.Help us shape the future of 3D printing industry news with our2025 reader survey. Feature image shows schematic diagram of functionalized carbon nanotubes. Image via Molecular Rebar Design. #molecular #rebar #design #patents #carbon
    3DPRINTINGINDUSTRY.COM
    Molecular Rebar Design patents carbon nanotube dispersions for improved additive manufacturing resins
    Molecular Rebar Design, a nanomaterials company based in Austin, Texas, has patented a new additive manufacturing (AM) composition that utilizes oxidized discrete carbon nanotubes (CNTs) with bonded dispersing agents to enhance 3D printing resins. The patent, published under US20210237509A1, outlines methods to improve resin properties for applications such as vat photopolymerization, sintering, and thermoplastic fusion. The inventors, Clive P. Bosnyak, Kurt W. Swogger, Steven Lowder, and Olga Ivanova, propose formulations that improve electrical conductivity, thermal stability, and mechanical strength, while overcoming dispersion challenges common with CNTs in composite materials. Image shows a schematic diagram of functionalized carbon nanotubes. Image via Molecular Rebar Design. Functionalized CNTs for additive manufacturing At the core of the invention is the chemical functionalization of CNTs with dispersing agents bonded to their sidewalls, enabling higher aspect ratios and more homogeneous dispersions. These dispersions integrate into UV-curable acrylates, thermoplastics, and elastomers, yielding improved green strength, sinterability, and faster cure rates. The patent emphasizes the benefit of using bimodal or trimodal distributions of CNT diameters (single-, double-, or multi-wall) to tune material performance. Additional fillers such as carbon black, silica, and metallic powders can also be incorporated for applications ranging from electronic encapsulation to impact-resistant parts. Experimental validation To validate the invention, the applicants oxidized carbon nanotubes using nitric acid and covalently bonded them with polyether dispersing agents such as Jeffamine M2005. These modified CNTs were incorporated into photopolymer resin formulations. In tensile testing, specimens produced with the dispersions demonstrated enhanced mechanical performance, with yield strengths exceeding 50 MPa and Young’s modulus values above 2.8 GPa. Impact strength improved by up to 90% in certain formulations compared to control samples without CNTs. These performance gains suggest suitability for applications demanding high strength-to-weight ratios, such as aerospace, electronics, and structural components. Nanotube innovations in AM Carbon nanotubes (CNTs) have long been explored for additive manufacturing (AM) due to their exceptional mechanical and electrical properties. However, challenges such as poor dispersion and inconsistent aspect ratios have hindered their widespread adoption in AM processes. Recent advancements aim to overcome these barriers by integrating oxidation and dispersion techniques into scalable production methods. For instance, researchers at Rice University have developed a novel acid-based solvent that prevents the common “spaghetti effect” of CNTs tangling together. This innovation simplifies the processing of CNTs, potentially enabling their scale-up for industrial 3D printing applications. Similarly, a research team led by the University of Glasgow has created a 3D printable CNT-based plastic material capable of sensing its own structural health. This material, inspired by natural porous structures, offers enhanced toughness and strength, with potential applications in medicine, prosthetics, automotive, and aerospace design. Subscribe to the 3D Printing Industry newsletter to keep up with the latest 3D printing news. You can also follow us onLinkedIn and subscribe to the 3D Printing Industry YouTube channel to access more exclusive content. 
    At 3DPI, our mission is to deliver high-quality journalism, technical insight, and industry intelligence to professionals across the AM ecosystem. Help us shape the future of 3D printing industry news with our 2025 reader survey.
    Feature image shows a schematic diagram of functionalized carbon nanotubes. Image via Molecular Rebar Design.
  • Eight House Problems You Can Solve With a Fresh Coat of Paint

    We may earn a commission from links on this page.
    For a property owner, paint is an incredibly powerful tool. It’s a cheap and effective renovation in a can, a fun way to add some personality to your home, and a project that can be wrapped up in a weekend. Best of all, if you mess up your paint job, you can just paint over your mistakes.
    But the power of paint goes way beyond aesthetics. Paint can be formulated in different ways, with different effects, making it an easy, low-cost solution to a host of problems you might experience in your home—and I'm not talking about covering them up to pretend they aren't there. Choosing the right kind of paint can often be the most affordable solution, and is worth considering before you start taking out home equity loans to pay for a more invasive, disruptive fix. Here are eight problems that you might be able to take care of with the right paint.
    Slippery floors and stairs
    When I first moved into my current home, I slipped on our narrow, steep old stairs. I didn’t get seriously hurt, unless humiliation and emotional damage count—but I could easily imagine a different outcome. Since changing the rise of the stairs was out of the question and my wife and I weren’t into carpeting, we decided to paint them with anti-slip paint.

    The stairs that tried to kill me, now coated in anti-slip paint.
    Credit: Jeff Somers

    It worked perfectly. Not only did the paint job turn out great, making the stairs look new, the slight grit the paint added to the surface means I haven’t slipped on those stairs in years. Anti-slip paint can be used indoors or outdoors (on slippery deck planks, for example), and on just about any surface—companies even make additives you can mix into any exterior or interior paint to transform it into anti-slip paint. If there are places in your home where you constantly worry about slipping and falling, a coat of anti-slip paint can take care of them.
    Cosmetic imperfections
    You might think that covering imperfections like minor scratches, stains, or that hideous green color the previous owner used is the whole point of paint, and you would be right. But if the wall in question is especially problematic and you want to avoid re-doing the drywall or plaster or the tedious work of adding a skim coat, you might be able to hide those imperfections with a high-opacity trade paint. A trade paint is a professional formulation of paint that’s designed to be thicker and more opaque while offering better coverage and durability (you might see this referred to as “obliterating paint,” especially outside the U.S.). The paint you buy in the store is retail paint, and it’s usually formulated to keep costs down. Trade paint is for the professionals, and it costs more, but will do a much better job of covering up the sins on your walls because of its thickness, matte finish, and opacity.
    Noise
    If the problem in your house is noise—whether from inconsiderate neighbors or roommates from hell—a sound-deadening acoustic paint will definitely help. These paints are formulated to be thick and spongy when they cure, absorbing sound and reducing echo—no need to attach all kinds of foam baffles to every surface. Sound-deadening paint won’t block all sound, especially if it’s only applied on one side of a wall. But it will reduce the level of noise that makes it through, and if you apply it to both sides of shared walls in sufficient thickness (you usually need at least three coats for maximum effectiveness), it will make an audible difference.
    Fire risk
    Your house burning down would definitely fall under the category of a “house problem.” Believe it or not, paint can help with that. Choosing a fire-retardant paint for your next interior paint project can turn your walls into firebreaks that will slow down a house fire. When these paints encounter fire, they quickly char over, forming a protective layer that resists the flames. It won’t completely stop the spread of a fire in your house, but it will buy you time to get your family to safety and call in the firefighters—and in a house fire, time is the most important factor.
    High utility bills
    If your house is crazy expensive to heat or cool (or, if you’re really lucky, crazy expensive to heat and cool), you can make the situation a little better with paint in two ways:
    Paint your roof. Painting your flat roof with an appropriate roof coating can not only extend the lifespan of your roof, it can help bounce the sun’s rays away, lowering the temperature of your roof and reducing the heat that’s transferred to your home as a result. (Choosing a white paint for this job will be the most effective in cooling things down.)
    Use an insulating interior paint. Insulating paint is designed to augment existing insulation in your home—you can’t just slap a coat of it on an uninsulated wall or ceiling and get results. But it can help reduce temperature transfer and fluctuation inside your home if it’s applied correctly and in multiple coats (the more coats, the better it will work).
    If you’ve tried everything else to get your utility bills under control, throwing some insulating paint on the walls might help.
    Too-small rooms
    It happens: You buy a house with loads of charm, and once you’re living in it you realize that the rooms are actually small and dark, because the people who built it were short and afraid of the Sun. Or something. If that’s your problem, you can try a bunch of different strategies to get more natural light into a room (or fake it), and one of the tricks you can try is paint: By choosing the right color intensity, saturation, and finish for your walls and ceilings, you can turn a small, dark space into a brighter one that at least seems larger. No, paint won’t suddenly make that huge armoire fit into your tiny bedroom, but it will at least make it feel possible.
    Moisture and mold
    If you’re worried about a damp room and mold, or have a bathroom that isn’t well-ventilated and is thus susceptible to mold infestations, paint can help you out in two ways:
    Waterproofing paint or primer can help block moisture from seeping into the room in the first place. This isn’t magic—it’s not going to stop flowing water, and if you don’t take steps to mitigate flooding or poor drainage in or around your house, no amount of waterproofing paint is going to help. But it can be very effective at reducing moisture in a room if applied correctly.
    Mold-resistant paint in damp areas like bathrooms, basements, or any room where the humidity is a concern can then help prevent mold from taking root. These paints have antimicrobial properties, so if you start off with a mold-free room and take steps to reduce moisture, using a mold-resistant paint will make a huge difference going forward.
  • Mapping the Expanding Role of 3D Printing in Micro and Nano Device Fabrication

    A new review by researchers from the Beijing University of Posts and Telecommunications, CETC 54 (54th Research Institute of Electronics Technology Group Corporation), Sun Yat-sen University, Shenzhen University, and the University of Electronic Science and Technology of China surveys the latest developments in 3D printing for microelectronic and microfluidic applications. The paper, released on Springer Nature Link, highlights how additive manufacturing methods have reached sub-micron precision, allowing the production of devices previously limited to traditional cleanroom fabrication.
    High-resolution techniques like two-photon polymerization (2PP), electrohydrodynamic jet printing, and computed axial lithography (CAL) are now being used to create structures with feature sizes down to 100 nanometers. These capabilities have broad implications for biomedical sensors, flexible electronics, and microfluidic systems used in diagnostics and environmental monitoring.
    Overview of 3D printing applications for microelectronic and microfluidic device fabrication. Image via Springer Nature.
    Classification of High-Precision Additive Processes
    Seven categories of additive manufacturing, as defined by the American Society for Testing and Materials (ASTM), serve as the foundation for modern 3D printing workflows: binder jetting, directed energy deposition (DED), material extrusion (MEX), material jetting, powder bed fusion (PBF), sheet lamination (SHL), and vat photopolymerization (VP).
    Among these, 2PP provides the finest resolution, enabling the fabrication of nanoscale features for optical communication components and MEMS support structures. Inkjet-based material jetting and direct ink writing (DIW) allow patterned deposition of conductive or biological materials, including stretchable gels and ionic polymers. Binder jetting, which operates by spraying adhesives onto powdered substrates, is particularly suited for large-volume structures using metals or ceramics with minimal thermal stress.
    Fused deposition modeling, a form of material extrusion, continues to be widely used for its low cost and compatibility with thermoplastics. Although limited in resolution, it remains practical for building mechanical supports or sacrificial molds in soft lithography.
    Various micro-scale 3D printing strategies. Image via Springer Nature.
    3D Printing in Microelectronics, MEMS, and Sensing
    Additive manufacturing is now routinely used to fabricate microsensors, microelectromechanical system (MEMS) actuators, and flexible electronics. Compared to traditional lithographic processes, 3D printing reduces material waste and bypasses the need for masks or etching steps.
    In one example cited by the review, flexible multi-directional sensors were printed directly onto skin-like substrates using a customized FDM platform. Another case involved a cantilever support for a micro-accelerometer produced via 2PP and coated with conductive materials through evaporation. These examples show how additive techniques can fabricate both support and functional layers with high geometric complexity.
    MEMS actuators fabricated with additive methods often combine printed scaffolds with conventional micromachining. A 2PP-printed spiral structure was used to house liquid metal in an electrothermal actuator. Separately, FDM was used to print a MEMS switch, combining conductive PLA and polyvinyl alcohol as the sacrificial layer. However, achieving the mechanical precision needed for switching elements remains a barrier for fully integrated use.
    3D printing material and preparation methods. Image via Springer Nature.
    Development of Functional Inks and Composite Materials
    Microelectronic applications depend on the availability of printable materials with specific electrical, mechanical, or chemical properties. MXene-based conductive inks, metal particle suspensions, and piezoelectric composites are being optimized for use in DIW, inkjet, and light-curing platforms.
    Researchers have fabricated planar asymmetric micro-supercapacitors using ink composed of nickel sulfide on nitrogen-doped MXene. These devices demonstrate increased voltage windows (up to 1.5 V) and volumetric capacitance, meeting the demands of compact power systems. Other work involves composite hydrogels with ionic conductivity and high tensile stretch, used in flexible biosensing applications.
    PEDOT:PSS, a common conductive polymer, has been formulated into a high-resolution ink using lyophilization and re-dispersion in photocurable matrices. These formulations are used to create electrode arrays for neural probes and flexible circuits. Multiphoton lithography has also been applied to print complex 3D structures from organic semiconductor resins.
    Bioelectronic applications are driving the need for biocompatible inks that can perform reliably in wet and dynamic environments. One group incorporated graphene nanoplatelets and carbon nanotubes into ink for multi-jet fusion, producing pressure sensors with high mechanical durability and signal sensitivity.
    3D printed electronics achieved through the integration of active initiators into printing materials. Image via Springer Nature.
    Microfluidic Devices Fabricated via Direct and Indirect Methods
    Microfluidic systems have traditionally relied on soft lithography techniques using polydimethylsiloxane (PDMS). Additive manufacturing now offers alternatives through both direct printing of fluidic chips and indirect fabrication using 3D printed molds.
    Direct fabrication using SLA, DLP, or inkjet-based systems allows the rapid prototyping of chips with integrated reservoirs and channels. However, achieving sub-100 µm channels requires careful calibration. One group demonstrated channels as small as 18 µm × 20 µm using a customized DLP printer.
    Indirect fabrication relies on printing sacrificial or reusable molds, followed by casting and demolding. PLA, ABS, and resin-based molds are commonly used, depending on whether water-soluble or solvent-dissolvable materials are preferred. These techniques are compatible with PDMS and reduce reliance on photolithography equipment.
    Surface roughness and optical transparency remain concerns. FDM-printed molds often introduce layer artifacts, while uncured resin in SLA methods can leach toxins or inhibit PDMS curing. Some teams address these issues by polishing surfaces post-print or chemically treating molds to improve release characteristics.
    Integration and Future Directions for Microdevices
    3D printed microfluidic devices in biology and chemistry. Image via Springer Nature.
    3D printing is increasingly enabling the integration of structural, electrical, and sensing components into single build processes. Multi-material printers are beginning to produce substrates, conductive paths, and dielectric layers in tandem, although component embedding still requires manual intervention.
    Applications in wearable electronics, flexible sensors, and soft robotics continue to expand. Stretchable conductors printed onto elastomeric backings are being used to simulate mechanoreceptors and thermoreceptors for electronic skin systems. Piezoelectric materials such as BaTiO₃-PVDF composites are under investigation for printed actuators and energy harvesters.
    MEMS fabrication remains constrained by the mechanical limitations of printable materials. Silicon continues to dominate high-performance actuators due to its stiffness and precision. Additive methods are currently better suited for producing packaging, connectors, and sacrificial scaffolds within MEMS systems.
    Multi-photon and light-assisted processes are being explored for producing active devices like microcapacitors and accelerometers. Recent work demonstrated the use of 2PP to fabricate nitrogen-vacancy center–based quantum sensors, capable of detecting thermal and magnetic fluctuations in microscopic environments.
    As materials, resolution, and system integration improve, 3D printing is poised to shift from peripheral use to a central role in microsystem design and production. 
    3D printing micro-nano devices. Image via Springer Nature.
    Ready to discover who won the 2024 3D Printing Industry Awards?
    Subscribe to the 3D Printing Industry newsletter to stay updated with the latest news and insights.
    Take the 3DPI Reader Survey — shape the future of AM reporting in under 5 minutes.
    Featured image shows an overview of 3D printing applications for microelectronic and microfluidic device fabrication. Image via Springer Nature.

    Anyer Tenorio Lara
    Anyer Tenorio Lara is an emerging tech journalist passionate about uncovering the latest advances in technology and innovation. With a sharp eye for detail and a talent for storytelling, Anyer has quickly made a name for himself in the tech community. Anyer's articles aim to make complex subjects accessible and engaging for a broad audience. In addition to his writing, Anyer enjoys participating in industry events and discussions, eager to learn and share knowledge in the dynamic world of technology.
  • How the FDA Might Make It Harder to Get COVID Shots This Year

    The U.S. government has not yet made its official recommendations for who should be able to get COVID booster shots this fall, but FDA officials published a policy position in the New England Journal of Medicine announcing that the agency intends to make some drastic policy changes. The changes could result in healthy people under age 65 losing access to COVID vaccines, according to vaccine experts who have spoken about the policies. Here’s what we know so far, and why the announced policy could be a problem.
    How COVID vaccines are currently approved
    Scientists have changed the formulation of COVID vaccines a few times over the years, because the COVID virus itself tends to mutate. Vaccines are updated to better match the strains that are circulating, and this has happened roughly once a year—similar to how flu shots are updated each year. Instead of designing new vaccine trials from scratch for each small change in the COVID vaccine, manufacturers conduct studies to show that the immunity people get from the new vaccine is equivalent to what people got from the old vaccine. After approval from the FDA, the CDC then issues a recommendation for who should get the vaccine. Currently, everyone aged 6 months and up is recommended to get a COVID vaccine.
    What might be changing
    The new policy, according to the NEJM article, would be to accept those immunobridging studies only to approve vaccines for people aged 65 and up, and people above the age of 6 months who have one of the high-risk conditions on a list maintained by the CDC. For healthy people under 65, the FDA’s policy wouldn’t approve new COVID vaccines unless they were tested against a placebo. (The type of placebo is phrased vaguely: “The control group could receive a saline placebo,” the authors write.) The FDA doesn’t have the authority to change the recommendations on who should get vaccines that are already approved (that’s the CDC’s purview), but it is in charge of approving vaccines and can approve them only for specific populations.
    Why placebo-controlled trials are an absolutely wild idea for COVID vaccines
    Public health experts are, to put it mildly, not happy with this plan. That’s because we already have COVID vaccines that work. Doing a placebo-controlled trial would require withholding COVID vaccines from people in the control group; they would get saline instead of a functional vaccine. The normal way to do this type of trial (if you do one at all, rather than relying on immunobridging) is to compare the new vaccine or medication against one that is already considered effective. To use an extreme analogy, you wouldn’t test a new design of seatbelt by randomizing people to ride around without using any seatbelts at all. Vaccine scientist Peter Hotez told CNN that the FDA’s announced approach “essentially denies access to vaccines,” since such trials are not practical for companies to do. In a post on Bluesky, toxicologist Ryan Marino said that it amounts to “scientific misconduct.” Vaccine expert Paul Offit told NPR: “I don't think it's ethical, given that we have a vaccine that works, given that we know that SARS-CoV2 [the COVID virus] continues to circulate and cause hospitalizations and death, and there's no group that has no risk.”
    More vaccine chaos may be coming
    The new policy isn’t official yet, but it’s hard to imagine the FDA and CDC being allowed to approve and recommend vaccines the way they always have in the current political climate. Biologics director Vinay Prasad and FDA Commissioner Marty Makary, whose names appear on the FDA’s policy statement, have a history of arguing against COVID vaccine access for children. And both agencies are under the umbrella of HHS, the Department of Health and Human Services, which is headed by Robert F. Kennedy, Jr—the same person whose anti-vaccine organization financed the movie Plandemic. If you don’t recall the details of that movie circulating in the early pandemic days, it implied both that COVID wasn’t real and that it was a bioweapon created by the government; the logic didn’t hold together, but ultimately the point was that we should be suspicious of vaccines. RFK, Jr has said a lot of bananas stuff about vaccines. He has compared childhood vaccines to the Holocaust, claimed that Bill Gates put microchips in vaccines, and loudly questioned whether vaccines cause autism. How this man got put in charge of a health agency, I will never understand. Recent and future vaccine approvals may be at risk in this environment. Moderna had planned to submit a combined flu/COVID vaccine for approval; it has since withdrawn its application. (It’s not clear whether recent FDA policy announcements are directly related.) Novavax’s vaccine was recently approved, but only after a delay and only for older adults and for people with high-risk health conditions. Kennedy released a report today that questions the childhood vaccine schedule and implies that vaccines are part of the “stark reality of American children's declining health.”
    #how #fda #might #make #harder
    How the FDA Might Make It Harder to Get COVID Shots This Year
    The U.S. government has not yet made its official recommendations for who should be able to get COVID booster shots this fall, but FDA officials published a policy position in the New England Journal of Medicine announcing that it intends to make some drastic policy changes. The changes could result in healthy people under age 65 losing access to COVID vaccines, according to vaccine experts who have spoken about the policies. Here’s what we know so far, and why the announced policy could be a problem. How COVID vaccines are currently approvedScientists have changed the formulation of COVID vaccines a few times over the years, because the COVID virus itself tends to mutate. Vaccines are updated to better match the strains that are circulating, and this has happened roughly once a year—similar to how flu shots are updated each year. Instead of designing new vaccine trials from scratch for each small change in the COVID vaccine, manufacturers conduct studies to show that the immunity people get from the new vaccine is equivalent to what people got from the old vaccine. After approval from the FDA, the CDC then issues a recommendation for who should get the vaccine. Currently, everyone aged 6 months and up is recommended to get a COVID vaccine. What might be changingThe new policy, according to the NEJM article, would be to accept those immunobridging studies only to approve vaccines for people aged 65 and up, and people above the age of 6 months who have one of the high-risk conditions on a list maintained by the CDC. For healthy people under 65, the FDA’s policy wouldn’t approve new COVID vaccines unless they were tested against a placebo. The FDA doesn’t have the authority to change the recommendations on who should get vaccines that are already approved, but it is in charge of approving vaccines and can approve them only for specific populations. Why placebo-controlled trials are an absolutely wild idea for COVID vaccinesPublic health experts are, to put it mildly, not happy with this plan. That’s because we already have COVID vaccines that work. Doing a placebo-controlled trial would require withholding COVID vaccines from people in the control group; they would get saline instead of a functional vaccine. The normal way to do this type of trialis to compare the new vaccine or medication against one that is already considered effective. To use an extreme analogy, you wouldn’t test a new design of seatbelt by randomizing people to ride around without using any seatbelts at all. Vaccine scientist Peter Hotez told CNN that the FDA’s announced approach “essentially denies access to vaccines,” since such trials are not practical for companies to do. In a post on Bluesky, toxicologist Ryan Marino said that it amounts to “scientific misconduct.” Vaccine expert Paul Offit told NPR “I don't think it's ethical, given that we have a vaccine that works, given that we know that SARS-CoV2continues to circulate and cause hospitalizations and death, and there's no group that has no risk.”More vaccine chaos may be comingThe new policy isn’t official yet, but it’s hard to imagine the FDA and CDC being allowed to approve and recommend vaccines the way it always has in the current political climate. Biologics director Vinay Prasad and FDA Commissioner Marty Makary, whose names appear on the FDA’s policy statement, have a history of arguing against COVID vaccine access for children. And both agencies are under the umbrella of HHS, the department of Health and Human Services, which is headed by Robert F. 
Kennedy, Jr—the same person whose anti-vaccine organization financed the movie Plandemic. If you don’t recall the details of that movie circulating in the early pandemic days, it implied both that COVID wasn’t real and that it was a bioweapon created by the government; the logic didn’t hold together but ultimately the point was that we should be suspicious of vaccines. RFK, Jr has said a lot of bananas stuff about vaccines. He has compared childhood vaccines to the holocaust, claimed that Bill Gates put microchips in vaccines, and loudly questioned whether vaccines cause autism. How this man got put in charge of a health agency, I will never understand. Recent and future vaccine approvals may be at risk in this environment. Moderna had planned to submit a combined flu/COVID vaccine for approval; it has since withdrawn its application.Novavax’s recent vaccine was approved recently, but only after a delay and only for older adults and for people with high-risk health conditions. Kennedy released a report today that questions the childhood vaccine schedule and implies that vaccines are part of the “stark reality of American children's declining health.”  #how #fda #might #make #harder
  • Feeding The Future Or Eating The Ocean? The $80 Billion Salmon Crisis

    Trawl net bycatch from shrimp fishery, Sea of Cortez, Mexico. Universal Images Group via Getty Images
    Beneath the placid surface of the global seafood market, a material financial risk is quietly escalating—one rooted deep within the industry's supply chain. It's not climate volatility or ESG scrutiny grabbing the headlines, but the fragile economics of what we’re feeding our farmed fish.

    A new report from the FAIRR Initiative—an $80 trillion-backed investor network focused on ESG risks in protein production—exposes a growing contradiction at the heart of the global salmon farming industry: a sector that markets itself as sustainable yet increasingly relies on a shrinking, finite resource—wild fish—for its survival.

    Released ahead of the 2025 UN Ocean Conference, the report follows a four-year engagement with seven of the world’s largest publicly listed salmon producers and delivers a stark warning: without urgent reform, the industry’s feed supply chain could buckle under its own expansion.

    These companies represent 58% of global farmed salmon production, with over 1.2 million tonnes produced in 2023. FAIRR’s analysis reveals systemic environmental, regulatory, and financial risks tied to dependence on wild-caught fish, exposing a deep disconnect between sustainability claims and operational reality.

    Supply Chain Risk from Finite Fish Resources
    The industry’s dependence on fishmeal and fish oil (FMFO), both derived from wild-caught fish, is a growing vulnerability. According to the UN Food and Agriculture Organization, 90% of global fisheries are already overexploited or at capacity. Yet salmon producers continue to lean heavily on this strained input to support projected production increases of 40% by 2033.
    FMFO is also used across aquaculture species like sea bass and sea bream, as well as in pet food. In 2023, when Peru cancelled its anchovy fishing season, fish oil prices surged by 107%. Mowi, the world’s largest salmon producer, reported a 70% rise in feed costs between 2021 and 2023 due to that single event. Some companies temporarily switched to algae oil during the price spike, only to revert once fisheries opened—highlighting a reactive approach that favors short-term cost savings over long-term resilience.
    “We are relying on a finite input to fuel infinite growth projections,” said Laure Boissat, Oceans Programme Manager at FAIRR. “That’s not resilience—it’s a recipe for collapse.”
    We’re Literally Feeding Fish to Fish
    Between 2020 and 2024, five of the seven companies in FAIRR’s study increased their absolute use of FMFO made from whole wild fish—by as much as 39%. Despite sustainability claims, only three firms reduced the proportion of FMFO in their feed, and none by more than three percentage points.

    In response, many companies have turned to fish trimmings—by-products from fish processing—as an alternative. While six producers have increased their use, the supply is inherently limited. One company reported purchasing all available trimmings in its operating region, raising concerns that rising demand could incentivize additional fishing.
    This exposes a fundamental flaw in the industry’s growth narrative. Farmed salmon production is projected to grow by 40% by 2033, yet fishmeal and fish oil production is forecast to rise by only 9% and 12% respectively over roughly the same period. These numbers are irreconcilable. Without scalable alternatives, or a drastic shift in feed formulation, the industry’s expansion plans appear unsustainable.
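    To see why those growth figures are hard to reconcile, here is a rough back-of-envelope sketch (not from the FAIRR report) using only the 40%, 9%, and 12% projections quoted above. It assumes, purely for illustration, that the fish-based share of salmon feed stays constant, so FMFO demand grows in step with production.

```python
# Back-of-envelope check of the growth mismatch described above.
# Illustrative assumption (not FAIRR's): the fishmeal/fish oil share per tonne
# of salmon feed stays constant, so FMFO demand scales with salmon output.

salmon_growth = 0.40     # projected growth in farmed salmon production by 2033
fishmeal_growth = 0.09   # forecast growth in fishmeal supply over the same period
fish_oil_growth = 0.12   # forecast growth in fish oil supply over the same period

def shortfall(demand_growth: float, supply_growth: float) -> float:
    """Share of projected demand that supply cannot cover, starting from a
    balanced baseline of 1.0."""
    demand_index = 1.0 + demand_growth
    supply_index = 1.0 + supply_growth
    return (demand_index - supply_index) / demand_index

print(f"Fishmeal shortfall: {shortfall(salmon_growth, fishmeal_growth):.0%}")  # ~22%
print(f"Fish oil shortfall: {shortfall(salmon_growth, fish_oil_growth):.0%}")  # ~20%
# Under these assumptions, roughly a fifth of the needed fish-based input is
# missing by 2033 unless inclusion rates fall or alternative feeds scale up.
```

    The precise gap depends on inclusion rates and feed conversion ratios, which vary by producer, but under any constant-share assumption the supply and demand projections diverge.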
    FAIRR’s report notes that none of the seven companies assessed have set absolute reduction targets for fish-based feed, even as five aim to scale up salmon output. This disconnect exposes investors to long-term risk: if feed supply can't match growth, either costs will skyrocket, margins will shrink, or the environmental impact will intensify.
    Feed Frenzy: The Domino Effect Across Industries
    This feed dependency has broader implications. Aquafeed producers face rising costs and raw material uncertainty. The pet food industry, reliant on salmon oil and trimmings, is also vulnerable to volatility. As wild fish availability declines, disruptions in one part of the supply chain can ripple across sectors, amplifying risk.
    The diversion of edible fish into feed also raises ethical concerns. Over 90% of fish used in FMFO could be eaten by people. Feedback’s Blue Empire report found that in 2020, Norwegian salmon farms used nearly 2 million tonnes of wild-caught fish for feed, including up to 144,000 tonnes harvested off West Africa, enough to feed 2.5 to 4 million people for a year.
    Novel Feeds: Big Hype, Small Bite
    Novel ingredients like insect meal, algae oil, and single-cell proteins were once seen as game-changers. Four years later, their use remains limited as challenges abound, including high production costs, scalability issues, nutritional limitations, and consumer skepticism. Only three companies have set targets to increase their inclusion, and those targets average just 4%. One aims for 10–15% by 2030, a modest target given the urgent need for action.
    “In essence, the sector is stalling,” said Boissat. “There’s no silver bullet ingredient on the horizon. What we’re seeing instead is short-term thinking packaged as long-term strategy.”
    Investors Sound the Alarm
    FAIRR’s report quantifies a growing financial risk. Feed price volatility, as demonstrated by the Peru example, threatens margins across the sector. That being the case, it’s arguable that the salmon industry’s dependence on wild-caught fish is not just environmentally unsustainable—it’s economically reckless.
    “As investors, we believe the aquaculture industry must shift towards sustainable feed solutions. Diversifying feed ingredients is not only an environmental imperative, but also a strategic necessity for long-term resilience,” Thekla Swart of FAIRR participant Steyler Ethik Bank said in a statement.
    Salmon producers often lead the protein sector in disclosure, but FAIRR warns that transparency alone is not enough. “Companies disclose intensity-based metrics, but those don’t show the absolute pressure on fish stocks,” Boissat explained. “This is the gap between reporting and reality—the system is unsustainable even as it appears progressive on paper.”
    A Fork in the Water: From Carnivores to Mussels?
    FAIRR’s recommendations are clear. Companies should set absolute reduction rather than efficiency targets for FMFO and invest in scalable alternative feed ingredients—but deeper transformation may be needed.
    That means shifting away from carnivorous species like salmon toward unfed aquaculture options—such as mussels and oysters—which require no external feed inputs. FAIRR also encourages exploration of plant-based seafood, mirroring moves by the meat industry into alternative proteins. “Fed aquaculture is simply inefficient,” Boissat emphasized. “We must rethink what seafood production looks like in the 21st century.”
    Ocean Governance on Trial: What the UN Must Confront In Nice
    FAIRR’s report arrives ahead of the June UN Ocean Conference in Nice, where global leaders will gather to address ocean sustainability. A key issue is the gap between marine protection policy and practice.
    Many marine protected areas (MPAs) still allow bottom trawling and industrial fishing, undermining conservation goals. “Even in protected waters, the absence of enforceable restrictions allows destructive practices to persist,” said Boissat. “Until regulation catches up with science, these so-called protections offer a false sense of security—for ecosystems and for markets.”
    Campaigners and investors hope the conference will lead to stronger governance—not just symbolic declarations. Without enforceable protections, risks to marine biodiversity and the industries that depend on it will only grow.
    Is The Supply Chain Eating Itself?
    Stakeholders must decide: continue with business as usual, risking biodiversity collapse, food insecurity, and supply chain disruption—or rethink how the aquaculture sector operates.
    “The industry has been talking about risk and resilience for years,” said Boissat. “But if your entire business model is based on a disappearing input, that’s not resilience. That’s denial.”