• It's infuriating to see how many businesses are still in the dark about the true power of local SEO! Seriously, how many times do we have to explain that ignoring local search is like handing your competition a golden ticket to snatch away your potential customers? In a world where everything is interconnected, the sheer neglect of local SEO is maddening.

    Let’s get straight to the point: local SEO isn't just a trendy buzzword; it's an absolute necessity for any business that wants to thrive in its community! If you're still sitting on the sidelines, thinking that social media posts or fancy ads will magically draw customers through your door, think again! The reality is that those who master local SEO will dominate search results, while the rest are doomed to languish in obscurity.

    The absurdity of this situation is mind-boggling. Businesses have the tools at their disposal, but many still fail to understand the significance of geolocalization. It’s not rocket science! Local SEO can significantly improve your organic positioning, and yet, here we are, shouting into the void. You want visibility? You want to attract local customers? Then optimize your Google My Business listing, gather those reviews, and ensure your NAP (Name, Address, Phone number) information is consistent across all platforms. It’s not that complicated, yet so many are just too lazy to put in the work!
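If you want to check that NAP consistency systematically rather than by eye, a small script is enough. The sketch below is purely illustrative: the platforms, business details, and normalization rule are made-up placeholders, not a reference to any real listing API.

```python
# Minimal sketch: flag NAP (Name, Address, Phone) mismatches across directory listings.
# The listing data below is illustrative; in practice you would export it from each platform.

listings = {
    "google_business_profile": {"name": "Acme Bakery", "address": "12 Main St, Springfield", "phone": "+1-555-0100"},
    "yelp":                    {"name": "Acme Bakery", "address": "12 Main Street, Springfield", "phone": "+1-555-0100"},
    "facebook":                {"name": "Acme Bakery & Cafe", "address": "12 Main St, Springfield", "phone": "+1-555-0199"},
}

def normalize(value: str) -> str:
    """Lower-case and strip punctuation/whitespace so pure formatting differences don't trigger a flag."""
    return "".join(ch for ch in value.lower() if ch.isalnum())

reference = listings["google_business_profile"]
for platform, info in listings.items():
    for field in ("name", "address", "phone"):
        if normalize(info[field]) != normalize(reference[field]):
            print(f"Review {platform}: {field} = {info[field]!r} (reference: {reference[field]!r})")
```

Anything the script flags is a candidate for manual review and correction on that platform.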

    And let’s talk about the content. Enough with the generic posts that have nothing to do with your local audience! If your content doesn’t resonate with the community you serve, it’s as good as throwing money out the window. Local SEO thrives on relevance and authenticity, so start creating content that speaks directly to your audience. Be the business that knows its customers, not just another faceless entity in the digital ether.

    It’s time to wake up, people! Local SEO is the lifeblood of businesses that want to thrive in today’s competitive landscape. Stop making excuses for why you can’t implement these strategies. It’s not about being tech-savvy; it’s about being smart, strategic, and willing to adapt. The longer you wait, the more customers you lose to those who understand the importance of local SEO.

    If you’re still clueless, it’s time to educate yourself because ignoring local SEO is a direct ticket to failure. Don’t let your competitors leave you in their dust. Step up, get informed, and start making the changes that will propel your business forward. Your community is waiting for you—don’t keep them waiting any longer!

    #LocalSEO #DigitalMarketing #SmallBusiness #OrganicPositioning #SEO
    Local SEO: what is it and how does it help improve organic positioning?
    In an increasingly connected world, local SEO has established itself as one of the most effective strategies for improving the visibility of local businesses that depend on…
  • Ankur Kothari Q&A: Customer Engagement Book Interview

    Reading Time: 9 minutes
    In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns.
    But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question, we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic.
    This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die,” explores how businesses can translate raw data into actionable insights that drive real results.
    Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.

     
    Ankur Kothari Q&A Interview
    1. What types of customer engagement data are most valuable for making strategic business decisions?
    Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interaction, purchase history, and other app usage patterns.
    Second would be demographic information: age, location, income, and other relevant personal characteristics.
    Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews.
    Fourth would be the customer journey data.

    We track touchpoints across the customers’ various channels to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data.

    2. How do you distinguish between data that is actionable versus data that is just noise?
    First is relevance to your business objectives: actionable data directly relates to your specific goals or KPIs. Then we take help from statistical significance.
    Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in.

    You also want to make sure that there is consistency across sources.
    Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory.
    Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy.

    By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions.
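As a rough illustration of the first two criteria, statistical significance and cross-source consistency, here is a minimal Python sketch. The conversion counts are invented, and a two-proportion z-test is just one simple way to ask whether an observed lift is more than random fluctuation.

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic: how unlikely is the observed lift under pure noise?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented example: new onboarding flow vs. control, measured on two channels.
channels = {
    "email": {"variant": (130, 2000), "control": (100, 2000)},
    "push":  {"variant": (128, 2000), "control": (101, 2000)},
}

z_scores = {ch: two_proportion_z(*d["variant"], *d["control"]) for ch, d in channels.items()}
significant = {ch: abs(z) > 1.96 for ch, z in z_scores.items()}   # ~95% confidence per channel
corroborated = all(significant.values())                          # consistent across sources?

print(z_scores, "actionable" if corroborated else "treat as noise for now")
```

In this made-up run the lift clears the significance bar on one channel but not the other, so the combined criteria would hold it back until more data corroborates it.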

    3. How can customer engagement data be used to identify and prioritize new business opportunities?
    First, it helps us to uncover unmet needs.

    By analyzing the customer feedback, touch points, support interactions, or usage patterns, we can identify the gaps in our current offerings or areas where customers are experiencing pain points.

    Second would be identifying emerging needs.
    Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing the company to adapt its products or services accordingly.
    Third would be segmentation analysis.
    Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies.
    Last is to build competitive differentiation.

    Engagement data can highlight where our companies outperform competitors, helping us to prioritize opportunities that leverage existing strengths and unique selling propositions.
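To make the segmentation-analysis point concrete, here is a minimal Python sketch that clusters customers on a few engagement features and summarizes each segment. The feature names, numbers, and choice of three clusters are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-customer engagement features: [sessions/month, purchases/quarter, support tickets].
features = np.array([
    [25, 4, 0], [30, 5, 1], [2, 0, 3], [3, 1, 4],
    [18, 1, 0], [20, 2, 0], [1, 0, 0], [28, 6, 1],
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Size and average behavior per segment: a sizable low-purchase, high-ticket segment hints
# at an unmet need, while a small but highly engaged one may be an upsell or expansion niche.
for seg in sorted(set(labels)):
    members = features[labels == seg]
    print(f"segment {seg}: n={len(members)}, mean={members.mean(axis=0).round(1)}")
```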

    4. Can you share an example of where data insights directly influenced a critical decision?
    I will share an example from my previous organization, a financial services company, where we were very data-driven; it made a major impact on a critical decision regarding our credit card offerings.
    We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms.
    That insight led us to develop and launch our first digital credit card product with enhanced mobile features and rewards tailored to the millennial spending habits. Since we had access to a lot of transactional data as well, we were able to build a financial product which met that specific segment’s needs.

    That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial.

    5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time?
    When it comes to using engagement data in real time, we do quite a few things. In the past two or three years, we have been using it for dynamic content personalization: adjusting website content, email messaging, or ad creative based on real-time user behavior and preferences.
    We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments.
    We also practice responsive social media engagement, monitoring social media sentiment and trending topics to quickly adapt the messaging and create timely, relevant content.

    With one-on-one personalization, we do a lot of A/B testing as part of overall rapid testing of marketing elements like subject lines and CTAs, and we build on the most successful variants of the campaigns.
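The automated budget reallocation described above can be sketched in a few lines. This is a deliberately simplified version: real systems typically use bandit-style algorithms with more guardrails, and all the channel numbers here are invented.

```python
# Recent performance per channel: (conversions, spend). Invented numbers for illustration.
performance = {
    "paid_search": (120, 4000),
    "social":      (60, 3000),
    "email":       (90, 1000),
}

total_budget = 10_000
# Score each channel by conversions per unit spend, then reallocate proportionally,
# keeping a 10% floor per channel so no channel is ever starved of exploration budget.
scores = {ch: conv / spend for ch, (conv, spend) in performance.items()}
floor = 0.10 * total_budget / len(performance)
flexible = total_budget - floor * len(performance)
total_score = sum(scores.values())

allocation = {ch: round(floor + flexible * s / total_score) for ch, s in scores.items()}
print(allocation)  # spend shifts toward the channels converting most efficiently
```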

    6. How are you doing the 1:1 personalization?
    We have advanced CDP systems, and we are tracking each customer’s behavior in real-time. So the moment they move to different channels, we know what the context is, what the relevance is, and the recent interaction points, so we can cater the right offer.
    So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer.
    That gives our customer or potential customer more one-to-one personalization instead of just segment-based or bulk interaction kind of experience.

    We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers.
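Below is a minimal sketch of the next-best-offer logic described above. The profile fields, segment names, and rules are hypothetical; in a real setup the profile would come from the CDP after identity resolution, and the decision would usually be model-driven rather than hand-written rules.

```python
from datetime import datetime, timedelta

# Hypothetical unified profile, as a CDP might expose it after identity resolution.
profile = {
    "customer_id": "C-1042",
    "last_viewed_offer": "travel_rewards_card",   # seen on the website yesterday
    "last_seen": datetime.now() - timedelta(hours=20),
    "segment": "millennial_mobile_first",
}

def next_best_offer(profile: dict) -> str:
    """Pick the offer to surface on whatever channel the customer shows up on next."""
    recently_active = datetime.now() - profile["last_seen"] < timedelta(days=2)
    if recently_active and profile.get("last_viewed_offer"):
        # Continue the conversation the customer started on another channel.
        return profile["last_viewed_offer"]
    # Fall back to a segment-level default when there is no fresh individual context.
    segment_defaults = {"millennial_mobile_first": "digital_card_upgrade"}
    return segment_defaults.get(profile["segment"], "generic_welcome")

print(next_best_offer(profile))   # an in-branch agent would see: travel_rewards_card
```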

    7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service?
    Primarily with product development — we have different products, not just the financial products or whatever products an organization sells, but also products like the mobile apps or websites customers use for transactions. So that kind of product development gets improved.
    The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments.

    Customer service also gets helped by anticipating common issues, personalizing support interactions over the phone or email or chat, and proactively addressing potential problems, leading to improved customer satisfaction and retention.

    So in general, cross-functional application of engagement data improves the customer-centric approach throughout the organization.

    8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights?
    I think the main challenge is the huge amount of data we are dealing with. As we get more digitally savvy and most customers move to digital channels, we are getting a lot of data, and that sheer volume can be overwhelming, making it very difficult to identify truly meaningful patterns and insights.

    Because of the huge data overload, we create data silos in this process, so information often exists in separate systems across different departments. We are not able to build a holistic view of customer engagement.

    Because of data silos and overload of data, data quality issues appear. There is inconsistency, and inaccurate data can lead to incorrect insights or poor decision-making. Quality issues could also be due to the wrong format of the data, or the data is stale and no longer relevant.
    As we are growing and adding more people to help us understand customer engagement, I’ve also noticed that technical folks, especially data scientists and data analysts, lack the skills to properly interpret the data or apply data insights effectively.
    So there’s a lack of understanding of marketing and sales as domains.
    It’s a huge effort and can take a lot of investment.

    Not being able to calculate the ROI of your overall investment is a big challenge that many organizations are facing.

    9. Why do you think the analysts don’t have the business acumen to properly do more than analyze the data?
    If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. Stopping that at step one — not bringing noise into the data system — cannot be done by technical folks alone, or by people who do not have business knowledge.
    Business people do not know everything about what data is being collected from which source and what data they need. It’s a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don’t have a lot of exposure to that side.

    Similarly, marketing business people do not have much exposure to the technical side — what’s possible to do with data, how much effort it takes, what’s relevant versus not relevant, and how to prioritize which data sources will be most important.

    10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before?
    First, cross-functional training: training different roles to help them understand why we’re doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do.
    And giving business folks exposure to the technology side through training on different tools, strategies, and the roadmap of data integrations.
    The second is helping teams work more collaboratively. So it’s not like the technology team works in a silo and comes back when their work is done, and then marketing and sales teams act upon it.

    Now we’re making it more like one team. You work together so that you can complement each other, and we have a better strategy from day one.

    11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations?
    We present clear business cases where we demonstrate how data-driven recommendations can directly align with business objectives and potential ROI.
    We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and the implications for business goals.

    We also do a lot of POCs and pilot projects with small-scale implementations to showcase tangible results and build confidence in the data-driven approach throughout the organization.

    12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data?
    I’ve found that Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touch points.
    Having advanced analytics platforms — tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights — is a great value to us.
    We always use, or many organizations use, marketing automation systems to improve marketing team productivity, helping us track and analyze customer interactions across multiple channels.
    Another thing is social media listening tools, for wherever your brand is mentioned, for measuring customer sentiment on social media, or for tracking the engagement of your campaigns across social media platforms.

    Last is web analytics tools, which provide detailed insights into your website visitors’ behaviors and engagement metrics across browsers, devices, and mobile apps.

    13. How do you ensure data quality and consistency across multiple channels to make these informed decisions?
    We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. Then we use data integration platforms — tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies.
    As we collect data from different sources, we clean it, so it becomes cleaner with every stage of processing.
    We also conduct regular data audits — performing periodic checks to identify and rectify data quality issues, ensuring accuracy and reliability of information. We also deploy standardized data formats.

    On top of that, we have various automated data cleansing tools, specific software to detect and correct data errors, redundancies, duplicates, and inconsistencies in data sets automatically.
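As a small illustration of the standardization and automated cleansing steps, here is a pandas sketch with invented records. The column names and cleaning rules are assumptions; production pipelines involve many more rules plus audit reporting.

```python
import pandas as pd

# Invented multi-channel records with typical quality problems: casing, whitespace, duplicates, missing values.
raw = pd.DataFrame({
    "email": [" Ana@Example.com", "ana@example.com", "bo@example.com", None],
    "phone": ["555-0100", "5550100", None, "555-0101"],
    "channel": ["web", "app", "email", "web"],
})

clean = raw.copy()
clean["email"] = clean["email"].str.strip().str.lower()               # standardized format
clean["phone"] = clean["phone"].str.replace(r"\D", "", regex=True)    # digits only

# Simple audit: completeness per column and duplicate customers by email.
print("missing values:\n", clean.isna().sum())
print("duplicate emails:", clean["email"].duplicated(keep=False).sum())

deduped = clean.drop_duplicates(subset=["email"], keep="first")
```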

    14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years?
    The first thing that’s been the biggest trend from the past two years is AI-driven decision making, which I think will become more prevalent, with advanced algorithms processing vast amounts of engagement data in real-time to inform strategic choices.
    Somewhat related to this is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with more accuracy and better predictive capabilities.
    We also touched upon hyper-personalization. We are all striving toward more hyper-personalization at scale, which is more one-on-one personalization, as we capture more engagement data and have bigger systems and infrastructure to process those large volumes of data, so we can achieve those hyper-personalization use cases.
    As the world is collecting more data, privacy concerns and regulations come into play.
    I believe in the next few years there will be more innovation toward how businesses can collect data ethically and what the usage practices are, leading to more transparent and consent-based engagement data strategies.
    And lastly, I think about the integration of engagement data, which is always a big challenge. I believe as we’re solving those integration challenges, we are adding more and more complex data sources to the picture.

    So I think there will need to be more innovation or sophistication brought into data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.
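To ground the predictive-analytics point from this answer, here is a toy Python sketch of a propensity model that scores engagement features and routes the customer accordingly. The features, data, and threshold are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy propensity model: predict likelihood of purchase in the next period from engagement features.
# Features: [sessions_last_30d, emails_opened, days_since_last_purchase]. All numbers are invented.
X = np.array([[12, 5, 10], [2, 0, 90], [8, 3, 20], [1, 1, 120], [15, 6, 5], [3, 0, 60]])
y = np.array([1, 0, 1, 0, 1, 0])  # purchased in the next period?

model = LogisticRegression().fit(X, y)

# Score a new customer and route them: high propensity -> upsell offer, low -> re-engagement campaign.
p = model.predict_proba([[6, 2, 35]])[0, 1]
print(f"purchase propensity: {p:.2f}", "-> upsell" if p > 0.5 else "-> re-engagement")
```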

     
    This interview Q&A was hosted with Ankur Kothari, a former Martech executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die.
    Download the PDF or request a physical copy of the book here.
    The post Ankur Kothari Q&A: Customer Engagement Book Interview appeared first on MoEngage.
  • Editorial Design: '100 Beste Plakate 24' Showcase

    06/12 — 2025

    by abduzeedo

    Explore "100 Beste Plakate 24," a stunning yearbook by Tristesse and Slanted Publishers. Dive into cutting-edge editorial design and visual identity.
    Design enthusiasts, get ready to dive into the latest from the German-speaking design scene. The "100 Beste Plakate 24" yearbook offers a compelling showcase of contemporary graphic design. It's more than just a collection; it's a deep exploration of visual identity and editorial design.
    This yearbook, published by Slanted Publishers and edited by 100 beste Plakate e. V. and Fons Hickmann, is a testament to the power of impactful poster design. The design studio Tristesse from Basel took the reins for the overall concept, delivering a fresh and cheeky aesthetic that makes the "100 best posters" feel like leading actors on a vibrant stage. Their in-house approach to layout, typography, and photography truly shines.
    Unpacking the Visuals
    The book's format (17×24 cm) and 256 pages allow for large-format images, providing ample space to appreciate each poster's intricate details. It includes detailed credits, content descriptions, and creation contexts. This commitment to detail in the editorial design elevates the reading experience.
    One notable example within the yearbook is the "To-Do: Diplome 24" poster campaign by Atelier HKB. Designed under Marco Matti's project management, this series features twelve motifs for the Bern University of the Arts graduation events. These posters highlight effective graphic design and visual communication. Another standout is the "Rettungsplakate" by klotz-studio für gestaltung. These "rescue posters," printed on actual rescue blankets, address homelessness in Germany. The raw, impactful visual approach paired with a tangible medium demonstrates powerful design with a purpose.
    Beyond the Imagery
    Beyond the stunning visuals, the yearbook offers insightful essays and interviews on current poster design trends. The introductory section features jury members, their works, and statements on the selection process, alongside forewords from the association president and jury chair. This editorial content offers valuable context and insights into the evolving landscape of graphic design.
    The book’s concept playfully questions the seriousness and benevolence of the honorary certificates awarded to the winning designers. This subtle irony adds a unique layer to the publication, transforming it from a mere compilation into a thoughtful commentary on the design world itself. It's an inspiring showcase of the cutting edge of contemporary graphic design.
    The Art of Editorial Design
    "100 Beste Plakate 24" is a prime example of exceptional editorial design. It's not just about compiling images; it's about curating a narrative. The precise layout, thoughtful typography choices, and the deliberate flow of content all contribute to a cohesive and engaging experience. This book highlights how editorial design can transform a collection of works into a compelling story, inviting readers to delve deeper into each piece.
    The attention to detail, from the softcover with flaps to the thread-stitching and hot-foil embossing, speaks volumes about the dedication to craftsmanship. This is where illustration, graphic design, and branding converge to create a truly immersive experience.
    Final Thoughts
    This yearbook is a must-have for anyone passionate about graphic design and visual identity. It offers a fresh perspective on contemporary poster design, highlighting both aesthetic excellence and social relevance. The detailed insights into the design process and the designers' intentions make it an invaluable resource. Pick up a copy and see how impactful design can be.
    You can learn more about this incredible work and acquire your copy at slanted.de/product/100-beste-plakate-24.
    Editorial design artifacts

    Tags

    editorial design
    #editorial #design #beste #plakate #showcase
    Editorial Design: '100 Beste Plakate 24' Showcase
    06/12 — 2025 by abduzeedo Explore "100 Beste Plakate 24," a stunning yearbook by Tristesse and Slanted Publishers. Dive into cutting-edge editorial design and visual identity. Design enthusiasts, get ready to dive into the latest from the German-speaking design scene. The "100 Beste Plakate 24" yearbook offers a compelling showcase of contemporary graphic design. It's more than just a collection; it's a deep exploration of visual identity and editorial design. This yearbook, published by Slanted Publishers and edited by 100 beste Plakate e. V. and Fons Hickmann, is a testament to the power of impactful poster design. The design studio Tristesse from Basel took the reins for the overall concept, delivering a fresh and cheeky aesthetic that makes the "100 best posters" feel like leading actors on a vibrant stage. Their in-house approach to layout, typography, and photography truly shines. Unpacking the Visuals The book's formatand 256 pages allow for large-format images, providing ample space to appreciate each poster's intricate details. It includes detailed credits, content descriptions, and creation contexts. This commitment to detail in the editorial design elevates the reading experience. One notable example within the yearbook is the "To-Do: Diplome 24" poster campaign by Atelier HKB. Designed under Marco Matti's project management, this series features twelve motifs for the Bern University of the Arts graduation events. These posters highlight effective graphic design and visual communication. Another standout is the "Rettungsplakate" by klotz-studio für gestaltung. These "rescue posters," printed on actual rescue blankets, address homelessness in Germany. The raw, impactful visual approach paired with a tangible medium demonstrates powerful design with a purpose. Beyond the Imagery Beyond the stunning visuals, the yearbook offers insightful essays and interviews on current poster design trends. The introductory section features jury members, their works, and statements on the selection process, alongside forewords from the association president and jury chair. This editorial content offers valuable context and insights into the evolving landscape of graphic design. The book’s concept playfully questions the seriousness and benevolence of the honorary certificates awarded to the winning designers. This subtle irony adds a unique layer to the publication, transforming it from a mere compilation into a thoughtful commentary on the design world itself. It's an inspiring showcase of the cutting edge of contemporary graphic design. The Art of Editorial Design "100 Beste Plakate 24" is a prime example of exceptional editorial design. It's not just about compiling images; it's about curating a narrative. The precise layout, thoughtful typography choices, and the deliberate flow of content all contribute to a cohesive and engaging experience. This book highlights how editorial design can transform a collection of works into a compelling story, inviting readers to delve deeper into each piece. The attention to detail, from the softcover with flaps to the thread-stitching and hot-foil embossing, speaks volumes about the dedication to craftsmanship. This is where illustration, graphic design, and branding converge to create a truly immersive experience. Final Thoughts This yearbook is a must-have for anyone passionate about graphic design and visual identity. It offers a fresh perspective on contemporary poster design, highlighting both aesthetic excellence and social relevance. 
The detailed insights into the design process and the designers' intentions make it an invaluable resource. Pick up a copy and see how impactful design can be. You can learn more about this incredible work and acquire your copy at slanted.de/product/100-beste-plakate-24.
#editorial #design #beste #plakate #showcase
  • IBM Plans Large-Scale Fault-Tolerant Quantum Computer by 2029


    By John P. Mello Jr.
    June 11, 2025 5:00 AM PT

IBM unveiled its plan to build IBM Quantum Starling, shown in this rendering. Starling is expected to be the first large-scale, fault-tolerant quantum system. (Image Credit: IBM)

    IBM revealed Tuesday its roadmap for bringing a large-scale, fault-tolerant quantum computer, IBM Quantum Starling, online by 2029, which is significantly earlier than many technologists thought possible.
The company predicts that when its new Starling computer is up and running, it will be capable of performing 20,000 times more operations than today’s quantum computers — a computational state so vast it would require the memory of more than a quindecillion (10⁴⁸) of the world’s most powerful supercomputers to represent.
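To put that comparison in perspective, here is a back-of-the-envelope sketch of why representing large quantum states overwhelms classical memory. The per-machine memory figure and the qubit counts are illustrative assumptions, not numbers from IBM's announcement.

```python
# Back-of-the-envelope: memory needed to store a dense quantum state vector.
# Assumptions (not IBM's figures): 16 bytes per complex amplitude and roughly
# 10 PB of usable memory per top-end supercomputer.

BYTES_PER_AMPLITUDE = 16            # complex128: 8 bytes real + 8 bytes imaginary
SUPERCOMPUTER_MEMORY_BYTES = 10e15  # ~10 PB, a rough figure for a leading system

def supercomputers_needed(n_qubits: int) -> float:
    """How many such machines it would take to hold all 2**n amplitudes."""
    state_bytes = (2 ** n_qubits) * BYTES_PER_AMPLITUDE
    return state_bytes / SUPERCOMPUTER_MEMORY_BYTES

for n in (100, 200, 210):
    print(f"{n} qubits -> {supercomputers_needed(n):.3e} supercomputers")
# Under these assumptions the count passes a quindecillion (10**48) at
# roughly 210 qubits, which is why full classical simulation is hopeless.
```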
    “IBM is charting the next frontier in quantum computing,” Big Blue CEO Arvind Krishna said in a statement. “Our expertise across mathematics, physics, and engineering is paving the way for a large-scale, fault-tolerant quantum computer — one that will solve real-world challenges and unlock immense possibilities for business.”
    IBM’s plan to deliver a fault-tolerant quantum system by 2029 is ambitious but not implausible, especially given the rapid pace of its quantum roadmap and past milestones, observed Ensar Seker, CISO at SOCRadar, a threat intelligence company in Newark, Del.
    “They’ve consistently met or exceeded their qubit scaling goals, and their emphasis on modularity and error correction indicates they’re tackling the right challenges,” he told TechNewsWorld. “However, moving from thousands to millions of physical qubits with sufficient fidelity remains a steep climb.”
    A qubit is the fundamental unit of information in quantum computing, capable of representing a zero, a one, or both simultaneously due to quantum superposition. In practice, fault-tolerant quantum computers use clusters of physical qubits working together to form a logical qubit — a more stable unit designed to store quantum information and correct errors in real time.
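A rough classical analogy makes the physical-to-logical relationship concrete: spread one unit of information across several noisy units and decode by majority vote. The sketch below uses an assumed physical error rate and plain classical bits; real quantum codes are more involved because quantum data cannot simply be copied and read out, but the reliability gain works the same way.

```python
import random

# Classical stand-in for a logical qubit: one bit is encoded onto several
# noisy physical bits and recovered by majority vote.

def noisy(bit: int, p: float) -> int:
    """Flip a bit with probability p (a stand-in for a physical error)."""
    return bit ^ (random.random() < p)

def logical_readout(bit: int, p: float, copies: int = 3) -> int:
    """Encode into `copies` physical bits, apply noise, majority-vote decode."""
    votes = sum(noisy(bit, p) for _ in range(copies))
    return int(votes > copies // 2)

p = 0.01          # assumed physical error rate
trials = 100_000
errors = sum(logical_readout(0, p) for _ in range(trials))
print(f"physical error rate:         {p}")
print(f"observed logical error rate: {errors / trials:.5f}")  # ~3*p**2, i.e. ~0.0003
```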
    Realistic Roadmap
    Luke Yang, an equity analyst with Morningstar Research Services in Chicago, believes IBM’s roadmap is realistic. “The exact scale and error correction performance might still change between now and 2029, but overall, the goal is reasonable,” he told TechNewsWorld.
    “Given its reliability and professionalism, IBM’s bold claim should be taken seriously,” said Enrique Solano, co-CEO and co-founder of Kipu Quantum, a quantum algorithm company with offices in Berlin and Karlsruhe, Germany.
    “Of course, it may also fail, especially when considering the unpredictability of hardware complexities involved,” he told TechNewsWorld, “but companies like IBM exist for such challenges, and we should all be positively impressed by its current achievements and promised technological roadmap.”
    Tim Hollebeek, vice president of industry standards at DigiCert, a global digital security company, added: “IBM is a leader in this area, and not normally a company that hypes their news. This is a fast-moving industry, and success is certainly possible.”
    “IBM is attempting to do something that no one has ever done before and will almost certainly run into challenges,” he told TechNewsWorld, “but at this point, it is largely an engineering scaling exercise, not a research project.”
“IBM has demonstrated consistent progress, has committed $30 billion over five years to quantum computing, and the timeline is within the realm of technical feasibility,” noted John Young, COO of Quantum eMotion, a developer of quantum random number generator technology, in Saint-Laurent, Quebec, Canada.
    “That said,” he told TechNewsWorld, “fault-tolerant in a practical, industrial sense is a very high bar.”
    Solving the Quantum Error Correction Puzzle
    To make a quantum computer fault-tolerant, errors need to be corrected so large workloads can be run without faults. In a quantum computer, errors are reduced by clustering physical qubits to form logical qubits, which have lower error rates than the underlying physical qubits.
    “Error correction is a challenge,” Young said. “Logical qubits require thousands of physical qubits to function reliably. That’s a massive scaling issue.”
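To get a feel for where those thousands come from, the sketch below applies the widely cited surface-code rule of thumb for logical error rates. The physical error rate, threshold, and target are assumed textbook values, and the surface code serves only as a familiar reference point; it is not IBM's qLDPC architecture.

```python
# Rule-of-thumb overhead estimate for one logical qubit, using the common
# surface-code scaling p_logical ~ 0.1 * (p_phys / p_threshold) ** ((d + 1) / 2).
# All numbers below are illustrative assumptions, not IBM's figures.

p_phys = 1e-3           # assumed physical error rate per operation
p_threshold = 1e-2      # assumed error-correction threshold
target_logical = 1e-12  # assumed logical error rate needed for deep circuits

d = 3
while 0.1 * (p_phys / p_threshold) ** ((d + 1) / 2) > target_logical:
    d += 2              # surface-code distances are odd

physical_per_logical = 2 * d * d   # rough count of data plus measurement qubits
print(f"code distance d = {d}")
print(f"roughly {physical_per_logical} physical qubits per logical qubit")
# With these assumed rates, d lands in the low twenties, i.e. on the order
# of a thousand physical qubits for every logical qubit.
```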
    IBM explained in its announcement that creating increasing numbers of logical qubits capable of executing quantum circuits with as few physical qubits as possible is critical to quantum computing at scale. Until today, a clear path to building such a fault-tolerant system without unrealistic engineering overhead has not been published.

Alternative and previous gold-standard error-correcting codes present fundamental engineering challenges, IBM continued. To scale, they would require an unfeasible number of physical qubits to create enough logical qubits to perform complex operations — necessitating impractical amounts of infrastructure and control electronics. This renders them unlikely to be implemented beyond small-scale experiments and devices.
    In two research papers released with its roadmap, IBM detailed how it will overcome the challenges of building the large-scale, fault-tolerant architecture needed for a quantum computer.
One paper outlines the use of quantum low-density parity check (qLDPC) codes to reduce physical qubit overhead. The other describes methods for decoding errors in real time using conventional computing.
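As a loose illustration of the data flow such a decoder handles, the toy example below uses a small classical parity-check matrix: an unseen error produces a visible syndrome, and the decoder reasons backwards from syndrome to error. This is only a conceptual sketch; IBM's qLDPC codes and real-time decoder are far more sophisticated.

```python
import numpy as np

# Toy parity-check decoding: H maps a hidden error pattern to a measurable
# syndrome, and the decoder infers the error from that syndrome.

H = np.array([[1, 1, 0, 1, 0, 0, 0],
              [0, 1, 1, 0, 1, 0, 0],
              [0, 0, 1, 1, 0, 1, 0],
              [1, 0, 0, 1, 0, 0, 1]])   # each row is a sparse parity check

error = np.zeros(7, dtype=int)
error[2] = 1                      # suppose physical unit 2 has flipped

syndrome = (H @ error) % 2        # what the hardware actually measures
print("syndrome:", syndrome)      # the checks touching the error "light up"

# A brute-force decoder: find the single-bit error that explains the syndrome.
for i in range(7):
    trial = np.zeros(7, dtype=int)
    trial[i] = 1
    if np.array_equal((H @ trial) % 2, syndrome):
        print("decoder blames unit", i)
```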
    According to IBM, a practical fault-tolerant quantum architecture must:

    Suppress enough errors for useful algorithms to succeed
    Prepare and measure logical qubits during computation
    Apply universal instructions to logical qubits
    Decode measurements from logical qubits in real time and guide subsequent operations
    Scale modularly across hundreds or thousands of logical qubits
    Be efficient enough to run meaningful algorithms using realistic energy and infrastructure resources

    Aside from the technological challenges that quantum computer makers are facing, there may also be some market challenges. “Locating suitable use cases for quantum computers could be the biggest challenge,” Morningstar’s Yang maintained.
“Only certain computing workloads, such as random circuit sampling [RCS], can fully unleash the computing power of quantum computers and show their advantage over the traditional supercomputers we have now,” he said. “However, workloads like RCS are not very commercially useful, and we believe commercial relevance is one of the key factors that determine the total market size for quantum computers.”
    Q-Day Approaching Faster Than Expected
    For years now, organizations have been told they need to prepare for “Q-Day” — the day a quantum computer will be able to crack all the encryption they use to keep their data secure. This IBM announcement suggests the window for action to protect data may be closing faster than many anticipated.
    “This absolutely adds urgency and credibility to the security expert guidance on post-quantum encryption being factored into their planning now,” said Dave Krauthamer, field CTO of QuSecure, maker of quantum-safe security solutions, in San Mateo, Calif.
    “IBM’s move to create a large-scale fault-tolerant quantum computer by 2029 is indicative of the timeline collapsing,” he told TechNewsWorld. “A fault-tolerant quantum computer of this magnitude could be well on the path to crack asymmetric ciphers sooner than anyone thinks.”

    “Security leaders need to take everything connected to post-quantum encryption as a serious measure and work it into their security plans now — not later,” he said.
    Roger Grimes, a defense evangelist with KnowBe4, a security awareness training provider in Clearwater, Fla., pointed out that IBM is just the latest in a surge of quantum companies announcing quickly forthcoming computational breakthroughs within a few years.
“It leads to the question of whether the U.S. government’s original PQC [post-quantum cryptography] preparation date of 2030 is still a safe date,” he told TechNewsWorld.
“It’s starting to feel a lot more risky for any company to wait until 2030 to be prepared against quantum attacks. It also flies in the face of the latest cybersecurity EO [Executive Order] that relaxed PQC preparation rules as compared to Biden’s last EO PQC standard order, which told U.S. agencies to transition to PQC ASAP.”
    “Most US companies are doing zero to prepare for Q-Day attacks,” he declared. “The latest executive order seems to tell U.S. agencies — and indirectly, all U.S. businesses — that they have more time to prepare. It’s going to cause even more agencies and businesses to be less prepared during a time when it seems multiple quantum computing companies are making significant progress.”
    “It definitely feels that something is going to give soon,” he said, “and if I were a betting man, and I am, I would bet that most U.S. companies are going to be unprepared for Q-Day on the day Q-Day becomes a reality.”

    John P. Mello Jr. has been an ECT News Network reporter since 2003. His areas of focus include cybersecurity, IT issues, privacy, e-commerce, social media, artificial intelligence, big data and consumer electronics. He has written and edited for numerous publications, including the Boston Business Journal, the Boston Phoenix, Megapixel.Net and Government Security News. Email John.

    #ibm #plans #largescale #faulttolerant #quantum
  • Op-ed: Canada’s leadership in solar air heating—Innovation and flagship projects

Solar air heating is among the most cost-effective applications of solar thermal energy. These systems are used for space heating and preheating fresh air for ventilation, typically using glazed or unglazed perforated solar collectors. The collectors draw in outside air, heat it using solar energy, and then distribute it through ductwork to meet building heating and fresh air needs. In 2024, Canada again led the world in solar air heating adoption, for at least the seventh year in a row. The four key suppliers – Trigo Energies, Conserval Engineering, Matrix Energy, and Aéronergie – reported a combined 26,203 m² (282,046 ft²) of collector area sold last year. Several of these providers are optimistic about the growing demand. These findings come from the newly released Canadian Solar Thermal Market Survey 2024, commissioned by Natural Resources Canada.
    Canada is the global leader in solar air heating. The market is driven by a strong network of experienced system suppliers, optimized technologies, and a few small favorable funding programs – especially in the province of Quebec. Architects and developers are increasingly turning to these cost-effective, façade-integrated systems as a practical solution for reducing onsite natural gas consumption.
    Despite its cold climate, Canada benefits from strong solar potential with solar irradiance in many areas rivaling or even exceeding that of parts of Europe. This makes solar air heating not only viable, but especially valuable in buildings with high fresh air requirements including schools, hospitals, and offices. The projects highlighted in this article showcase the versatility and relevance of solar air heating across a range of building types, from new constructions to retrofits.
Figure 1: Preheating air for industrial buildings: 2,750 m² (29,600 ft²) of Calento SL solar air collectors cover all south-west and south-east facing facades of the FAB3R factory in Trois-Rivières, Quebec. The hourly unitary flow rate is set at 41 m³/m², or 2.23 cfm/ft², of collector area, at the lower range because only a limited number of intake fans were close enough to the solar façade to avoid long ventilation ductwork. Photo: Trigo Energies
    Quebec’s solar air heating boom: the Trigo Energies story
    Trigo Energies makes almost 90 per cent of its sales in Quebec. “We profit from great subsidies, as solar air systems are supported by several organizations in our province – the electricity utility Hydro Quebec, the gas utility Energir and the Ministry of Natural Resources,” explained Christian Vachon, Vice President Technologies and R&D at Trigo Energies.
    Trigo Energies currently has nine employees directly involved in planning, engineering and installing solar air heating systems and teams up with several partner contractors to install mostly retrofit projects. “A high degree of engineering is required to fit a solar heating system into an existing factory,” emphasized Vachon. “Knowledge about HVAC engineering is as important as experience with solar thermal and architecture.”
    One recent Trigo installation is at the FAB3R factory in Trois-Rivières. FAB3R specializes in manufacturing, repairing, and refurbishing large industrial equipment. Its air heating and ventilation system needed urgent renovation because of leakages and discomfort for the workers. “Due to many positive references he had from industries in the area, the owner of FAB3R contacted us,” explained Vachon. “The existence of subsidies helped the client to go for a retrofitting project including solar façade at once instead of fixing the problems one bit at a time.” Approximately 50 per cent of the investment costs for both the solar air heating and the renovation of the indoor ventilation system were covered by grants and subsidies. FAB3R profited from an Energir grant targeted at solar preheating, plus an investment subsidy from the Government of Quebec’s EcoPerformance Programme.
     
    Blue or black, but always efficient: the advanced absorber coating
In October 2024, the majority of the new 2,750 m² solar façade at FAB3R began operation. According to Vachon, the system is expected to cover approximately 13 per cent of the factory’s annual heating demand, which is otherwise met by natural gas. Trigo Energies equipped the façade with its high-performance Calento SL collectors, featuring a notable innovation: a selective, low-emissivity coating that withstands outdoor conditions. Introduced by Trigo in 2019 and manufactured by Almeco Group from Italy, this advanced coating is engineered to maximize solar absorption while minimizing heat loss via infrared emission, enhancing the overall efficiency of the system.
The high-efficiency coating is now standard in Trigo’s air heating systems. According to the manufacturer, the improved collector design shows a 25 to 35 per cent increase in yield over the former generation of solar air collectors with black paint. Testing conducted at Queen’s University confirms this performance advantage. Researchers measured the performance of transpired solar air collectors both with and without a selective coating, mounted side-by-side on a south-facing vertical wall. The results showed that the collectors with the selective coating produced 1.3 to 1.5 times more energy than those without it. In 2024, the monitoring results were jointly published by Queen’s University and CanmetENERGY in a paper titled Performance Comparison of a Transpired Air Solar Collector with Low-E Surface Coating.
    Selective coating, also used on other solar thermal technologies including glazed flat plate or vacuum tube collectors, has a distinctive blue color. Trigo customers can, however, choose between blue and black finishes. “By going from the normal blue selective coating to black selective coating, which Almeco is specially producing for Trigo, we lose about 1 per cent in solar efficiency,” explained Vachon.
Figure 2: Building-integrated solar air heating façade with MatrixAir collectors at the firehall building in Mont Saint Hilaire, south of Montreal. The 190 m² south-facing wall preheats the fresh air, reducing natural gas consumption by 18 per cent compared to the conventional make-up system. Architect: Leclerc Architecture. Photo: Matrix Energy
    Matrix Energy: collaborating with architects and engineers in new builds
Matrix Energy’s key target customer group is public buildings – mainly new construction. “Since the pandemic, schools are more conscious about fresh air, and solar preheating of the incoming fresh air has a positive impact over the entire school year,” noted Brian Wilkinson, President of Matrix Energy.
    Matrix Energy supplies systems across Canada, working with local partners to source and process the metal sheets used in their MatrixAir collectors. These metal sheets are perforated and then formed into architectural cladding profiles. The company exclusively offers unglazed, single-stage collectors, citing fire safety concerns associated with polymeric covers.
“We have strong relationships with many architects and engineers who appreciate the simplicity and cost-effectiveness of transpired solar air heating systems,” said Wilkinson, describing the company’s sales approach. “Matrix handles system design and supplies the necessary materials, while installation is carried out by specialized cladding and HVAC contractors overseen by on-site architects and engineers,” he added.
    Finding the right flow: the importance of unitary airflow rates
    One of the key design factors in solar air heating systems is the amount of air that passes through each square meter of the perforated metal absorber,  known as the unitary airflow rate. The principle is straightforward: higher airflow rates deliver more total heat to the building, while lower flow rates result in higher outlet air temperatures. Striking the right balance between air volume and temperature gain is essential for efficient system performance.
For unglazed collectors mounted on building façades, typical hourly flow rates range between 120 and 170 m³/h/m², or 6.6 to 9.4 cfm/ft². However, Wilkinson suggests that an hourly airflow rate of around 130 m³/h/m² offers the best cost-benefit balance for building owners. If the airflow is lower, the system will deliver higher air temperatures, but it would then need a much larger collector area to achieve the same air volume and optimum performance, he explained.
It’s also crucial for the flow rate to overcome external wind pressure. As wind passes over the absorber, air flow through the collector’s perforations is reduced, resulting in heat losses to the environment. This effect becomes even more pronounced in taller buildings, where wind exposure is greater. To ensure the system performs well even in these conditions, higher hourly airflow rates, typically between 150 and 170 m³/m², are necessary.
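For readers who want to see the trade-off in numbers, here is a minimal sketch. Only the flow-rate figures come from the article; the solar irradiance, air properties, and the simple efficiency curve are illustrative assumptions rather than data from Matrix Energy or the market survey.

```python
# Rough trade-off between unitary airflow rate and collector output for a
# transpired solar air collector. Irradiance and the efficiency curve are
# assumed values for illustration only.

RHO_AIR = 1.2       # kg/m^3
CP_AIR = 1005.0     # J/(kg*K)
IRRADIANCE = 600.0  # W/m^2 striking a vertical south facade (assumed)

M3H_M2_TO_CFM_FT2 = 0.0547  # 1 m^3/h/m^2 of collector is about 0.0547 cfm/ft^2

def efficiency(flow: float) -> float:
    """Assumed efficiency curve: stronger suction recovers more of the absorbed heat."""
    return min(0.75, 0.40 + 0.002 * flow)

def collector_performance(flow: float):
    """Delivered heat (W per m^2 of collector) and outlet temperature rise (K)."""
    useful_heat = efficiency(flow) * IRRADIANCE       # W/m^2
    mass_flux = flow / 3600.0 * RHO_AIR               # kg/(s*m^2)
    delta_t = useful_heat / (mass_flux * CP_AIR)      # K
    return useful_heat, delta_t

for flow in (41, 120, 130, 170):   # unitary flow rates discussed in the article
    q, dt = collector_performance(flow)
    print(f"{flow:>3} m3/h/m2 ({flow * M3H_M2_TO_CFM_FT2:.2f} cfm/ft2): "
          f"~{q:.0f} W/m2 delivered, outlet air ~{dt:.1f} K warmer")
# Higher flow moves more total heat into the building; lower flow delivers
# warmer air from a smaller volume, which is the balance described above.
```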
    Figure 3: One of three apartment blocks of the Maple House in Toronto’s Canary District. Around 160 m2of SolarWall collectors clad the two-storey mechanical penthouse on the roof. The rental flats have been occupied since the beginning of 2024. Collaborators: architects-Alliance, Claude Cormier et Associés, Thornton Tomasetti, RWDI, Cole Engineering, DesignAgency, MVShore, BA Group, EllisDon. Photo: Conserval Engineering
    Solar air heating systems support LEED-certified building designs
    Solar air collectors are also well-suited for use in multi-unit residential buildings. A prime example is the Canary District in Toronto, where single-stage SolarWall collectors from Conserval Engineering have been installed on several MURBs to clad the mechanical penthouses. “These penthouses are an ideal location for our air heating collectors, as they contain the make-up air units that supply corridor ventilation throughout the building,” explained Victoria Hollick, Vice President of Conserval Engineering. “The walls are typically finished with metal façades, which can be seamlessly replaced with a SolarWall system – maintaining the architectural language without disruption.” To date, nine solar air heating systems have been commissioned in the Canary District, covering a total collector area of over 1,000 m².
    “Our customers have many motivations to integrate SolarWall technology into their new construction or retrofit projects, either carbon reduction, ESG, or green building certification targets,” explained Hollick.
The use of solar air collectors in the Canary District was proposed by architects from the Danish firm Cobe. The black-colored SolarWall system preheats incoming air before it is distributed to the building’s corridors and common areas, reducing reliance on natural gas heating and supporting the pursuit of LEED Gold certification. Hollick estimates the gas savings at 10 to 20 per cent of the total heating load for the corridor ventilation of the multi-unit residential buildings. Additional energy-saving strategies include a 50/50 window-to-wall ratio with high-performance glazing, green roofs, high-efficiency mechanical systems, LED lighting, and Energy Star-certified appliances.
The ideal orientation for a SolarWall system is due south. However, the systems can be built at any orientation up to 90° east or west, explained Hollick. A SolarWall at 90° would have approximately 60 per cent of the energy production of the same area facing south.

Canada’s expertise in solar air heating continues to set a global benchmark, driven by supportive R&D, innovative technologies, strategic partnerships, and a growing portfolio of high-impact projects. With strong policy support and proven performance, solar air heating is poised to play a key role in the country’s energy-efficient building future.
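A designer sizing a façade can turn that orientation guidance into a quick estimate. The sketch below anchors a simple interpolation to the two figures quoted here (100 per cent of yield due south, roughly 60 per cent at 90° east or west); the straight-line behaviour in between is an assumption, and real derating varies with climate, shading, and load timing.

```python
# Relative yield of a SolarWall-type collector versus orientation, anchored
# to the two data points quoted in the article. The linear interpolation
# between them is an assumption for illustration only.

SOUTH_FRACTION = 1.00       # relative yield facing due south
EAST_WEST_FRACTION = 0.60   # quoted relative yield at 90 degrees off south

def relative_yield(azimuth_off_south_deg: float) -> float:
    """Linearly interpolated relative yield for 0 to 90 degrees off due south."""
    offset = min(abs(azimuth_off_south_deg), 90.0)
    return SOUTH_FRACTION - (SOUTH_FRACTION - EAST_WEST_FRACTION) * offset / 90.0

for az in (0, 30, 45, 90):
    print(f"{az:>2} degrees off south: ~{relative_yield(az) * 100:.0f}% of the south-facing yield")
```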
Figure 4: The Claude-Béchard Building in Quebec is a showcase project for sustainable architecture with a 72 m² Lubi solar air heating wall from Aéronergie. It serves as a regional administrative center. Architectural firm: Goulet et Lebel Architectes. Photo: Art Massif

Bärbel Epp is the general manager of the German agency solrico, whose focus is on solar market research and international communication.
    The post Op-ed: Canada’s leadership in solar air heating—Innovation and flagship projects appeared first on Canadian Architect.
    #oped #canadas #leadership #solar #air
    Op-ed: Canada’s leadership in solar air heating—Innovation and flagship projects
    Solar air heating is among the most cost-effective applications of solar thermal energy. These systems are used for space heating and preheating fresh air for ventilation, typically using glazed or unglazed perforated solar collectors. The collectors draw in outside air, heat it using solar energy, and then distribute it through ductwork to meet building heating and fresh air needs. In 2024, Canada led again the world for the at least seventh year in a row in solar air heating adoption. The four key suppliers – Trigo Energies, Conserval Engineering, Matrix Energy, and Aéronergie – reported a combined 26,203 m2of collector area sold last year. Several of these providers are optimistic about the growing demand. These findings come from the newly released Canadian Solar Thermal Market Survey 2024, commissioned by Natural Resources Canada. Canada is the global leader in solar air heating. The market is driven by a strong network of experienced system suppliers, optimized technologies, and a few small favorable funding programs – especially in the province of Quebec. Architects and developers are increasingly turning to these cost-effective, façade-integrated systems as a practical solution for reducing onsite natural gas consumption. Despite its cold climate, Canada benefits from strong solar potential with solar irradiance in many areas rivaling or even exceeding that of parts of Europe. This makes solar air heating not only viable, but especially valuable in buildings with high fresh air requirements including schools, hospitals, and offices. The projects highlighted in this article showcase the versatility and relevance of solar air heating across a range of building types, from new constructions to retrofits. Figure 1: Preheating air for industrial buildings: 2,750 m2of Calento SL solar air collectors cover all south-west and south-east facing facades of the FAB3R factory in Trois-Rivières, Quebec. The hourly unitary flow rate is set at 41 m3/m2 or 2.23 cfm/ft2 of collector area, at the lower range because only a limited number of intake fans was close enough to the solar façade to avoid long ventilation ductwork. Photo: Trigo Energies Quebec’s solar air heating boom: the Trigo Energies story Trigo Energies makes almost 90 per cent of its sales in Quebec. “We profit from great subsidies, as solar air systems are supported by several organizations in our province – the electricity utility Hydro Quebec, the gas utility Energir and the Ministry of Natural Resources,” explained Christian Vachon, Vice President Technologies and R&D at Trigo Energies. Trigo Energies currently has nine employees directly involved in planning, engineering and installing solar air heating systems and teams up with several partner contractors to install mostly retrofit projects. “A high degree of engineering is required to fit a solar heating system into an existing factory,” emphasized Vachon. “Knowledge about HVAC engineering is as important as experience with solar thermal and architecture.” One recent Trigo installation is at the FAB3R factory in Trois-Rivières. FAB3R specializes in manufacturing, repairing, and refurbishing large industrial equipment. Its air heating and ventilation system needed urgent renovation because of leakages and discomfort for the workers. “Due to many positive references he had from industries in the area, the owner of FAB3R contacted us,” explained Vachon. 
“The existence of subsidies helped the client to go for a retrofitting project including solar façade at once instead of fixing the problems one bit at a time.” Approximately 50 per cent of the investment costs for both the solar air heating and the renovation of the indoor ventilation system were covered by grants and subsidies. FAB3R profited from an Energir grant targeted at solar preheating, plus an investment subsidy from the Government of Quebec’s EcoPerformance Programme.   Blue or black, but always efficient: the advanced absorber coating In October 2024, the majority of the new 2,750 m²solar façade at FAB3R began operation. According to Vachon, the system is expected to cover approximately 13 per cent of the factory’s annual heating demand, which is otherwise met by natural gas. Trigo Energies equipped the façade with its high-performance Calento SL collectors, featuring a notable innovation: a selective, low-emissivity coating that withstands outdoor conditions. Introduced by Trigo in 2019 and manufactured by Almeco Group from Italy, this advanced coating is engineered to maximize solar absorption while minimizing heat loss via infrared emission, enhancing the overall efficiency of the system. The high efficiency coating is now standard in Trigo’s air heating systems. According to the manufacturer, the improved collector design shows a 25 to 35 per cent increase in yield over the former generation of solar air collectors with black paint. Testing conducted at Queen’s University confirms this performance advantage. Researchers measured the performance of transpired solar air collectors both with and without a selective coating, mounted side-by-side on a south-facing vertical wall. The results showed that the collectors with the selective coating produced 1.3 to 1.5 times more energy than those without it. In 2024, the monitoring results were jointly published by Queen’s University and Canmat Energy in a paper titled Performance Comparison of a Transpired Air Solar Collector with Low-E Surface Coating. Selective coating, also used on other solar thermal technologies including glazed flat plate or vacuum tube collectors, has a distinctive blue color. Trigo customers can, however, choose between blue and black finishes. “By going from the normal blue selective coating to black selective coating, which Almeco is specially producing for Trigo, we lose about 1 per cent in solar efficiency,” explained Vachon. Figure 2: Building-integrated solar air heating façade with MatrixAir collectors at the firehall building in Mont Saint Hilaire, south of Montreal. The 190 m2south-facing wall preheats the fresh air, reducing natural gas consumption by 18 per cent compared to the conventional make-up system. Architect: Leclerc Architecture. Photo: Matrix Energy Matrix Energy: collaborating with architects and engineers in new builds The key target customer group of Matrix Energy are public buildings – mainly new construction. “Since the pandemic, schools are more conscious about fresh air, and solar preheating of the incoming fresh air has a positive impact over the entire school year,” noted Brian Wilkinson, President of Matrix Energy. Matrix Energy supplies systems across Canada, working with local partners to source and process the metal sheets used in their MatrixAir collectors. These metal sheets are perforated and then formed into architectural cladding profiles. The company exclusively offers unglazed, single-stage collectors, citing fire safety concerns associated with polymeric covers. 
“We have strong relationships with many architects and engineers who appreciate the simplicity and cost-effectiveness of transpired solar air heating systems,” said President Brian Wilkinson, describing the company’s sales approach. “Matrix handles system design and supplies the necessary materials, while installation is carried out by specialized cladding and HVAC contractors overseen by on-site architects and engineers,” Wilkinson added. Finding the right flow: the importance of unitary airflow rates One of the key design factors in solar air heating systems is the amount of air that passes through each square meter of the perforated metal absorber,  known as the unitary airflow rate. The principle is straightforward: higher airflow rates deliver more total heat to the building, while lower flow rates result in higher outlet air temperatures. Striking the right balance between air volume and temperature gain is essential for efficient system performance. For unglazed collectors mounted on building façades, typical hourly flow rates should range between 120 and 170, or 6.6 to 9.4 cfm/ft2. However, Wilkinson suggests that an hourly airflow rate of around 130 m³/h/m²offers the best cost-benefit balance for building owners. If the airflow is lower, the system will deliver higher air temperatures, but it would then need a much larger collector area to achieve the same air volume and optimum performance, he explained. It’s also crucial for the flow rate to overcome external wind pressure. As wind passes over the absorber, air flow through the collector’s perforations is reduced, resulting in heat losses to the environment. This effect becomes even more pronounced in taller buildings, where wind exposure is greater. To ensure the system performs well even in these conditions, higher hourly airflow rates typically between 150 and 170 m³/m² are necessary. Figure 3: One of three apartment blocks of the Maple House in Toronto’s Canary District. Around 160 m2of SolarWall collectors clad the two-storey mechanical penthouse on the roof. The rental flats have been occupied since the beginning of 2024. Collaborators: architects-Alliance, Claude Cormier et Associés, Thornton Tomasetti, RWDI, Cole Engineering, DesignAgency, MVShore, BA Group, EllisDon. Photo: Conserval Engineering Solar air heating systems support LEED-certified building designs Solar air collectors are also well-suited for use in multi-unit residential buildings. A prime example is the Canary District in Toronto, where single-stage SolarWall collectors from Conserval Engineering have been installed on several MURBs to clad the mechanical penthouses. “These penthouses are an ideal location for our air heating collectors, as they contain the make-up air units that supply corridor ventilation throughout the building,” explained Victoria Hollick, Vice President of Conserval Engineering. “The walls are typically finished with metal façades, which can be seamlessly replaced with a SolarWall system – maintaining the architectural language without disruption.” To date, nine solar air heating systems have been commissioned in the Canary District, covering a total collector area of over 1,000 m². “Our customers have many motivations to integrate SolarWall technology into their new construction or retrofit projects, either carbon reduction, ESG, or green building certification targets,” explained Hollick. The use of solar air collectors in the Canary District was proposed by architects from the Danish firm Cobe. 
Figure 3: One of three apartment blocks of the Maple House in Toronto’s Canary District. Around 160 m² (1,722 ft²) of SolarWall collectors clad the two-storey mechanical penthouse on the roof. The rental flats have been occupied since the beginning of 2024. Collaborators: architects-Alliance, Claude Cormier et Associés, Thornton Tomasetti, RWDI, Cole Engineering, DesignAgency, MVShore, BA Group, EllisDon. Photo: Conserval Engineering

Solar air heating systems support LEED-certified building designs

Solar air collectors are also well suited to multi-unit residential buildings (MURBs). A prime example is the Canary District in Toronto (see Figure 3), where single-stage SolarWall collectors from Conserval Engineering have been installed on several MURBs to clad the mechanical penthouses. “These penthouses are an ideal location for our air heating collectors, as they contain the make-up air units that supply corridor ventilation throughout the building,” explained Victoria Hollick, Vice President of Conserval Engineering. “The walls are typically finished with metal façades, which can be seamlessly replaced with a SolarWall system – maintaining the architectural language without disruption.” To date, nine solar air heating systems have been commissioned in the Canary District, covering a total collector area of over 1,000 m² (10,764 ft²).

“Our customers have many motivations to integrate SolarWall technology into their new construction or retrofit projects, whether carbon reduction, ESG, or green building certification targets,” explained Hollick. The use of solar air collectors in the Canary District was proposed by architects from the Danish firm Cobe.

The black-colored SolarWall system preheats incoming air before it is distributed to the building’s corridors and common areas, reducing reliance on natural gas heating and supporting the pursuit of LEED Gold certification. Hollick estimates the gas savings at 10 to 20 per cent of the total heating load for the corridor ventilation of the multi-unit residential buildings. Additional energy-saving strategies include a 50/50 window-to-wall ratio with high-performance glazing, green roofs, high-efficiency mechanical systems, LED lighting, and Energy Star-certified appliances.

The ideal orientation for a SolarWall system is due south. However, the systems can be built at any orientation up to 90° east or west, explained Hollick. A SolarWall at 90° would have approximately 60 per cent of the energy production of the same area facing south.

Canada’s expertise in solar air heating continues to set a global benchmark, driven by supportive R&D, innovative technologies, strategic partnerships, and a growing portfolio of high-impact projects. With strong policy support and proven performance, solar air heating is poised to play a key role in the country’s energy-efficient building future.

Figure 4: The Claude-Bechard Building in Quebec is a showcase project for sustainable architecture, with a 72 m² (775 ft²) Lubi solar air heating wall from Aéronergie. It serves as a regional administrative center. Architectural firm: Goulet et Lebel Architectes. Photo: Art Massif

Bärbel Epp is the general manager of the German agency solrico, whose focus is on solar market research and international communication.

The post Op-ed: Canada’s leadership in solar air heating—Innovation and flagship projects appeared first on Canadian Architect.
  • As AI faces court challenges from Disney and Universal, legal battles are shaping the industry's future | Opinion

    Silicon advances and design innovations do still push us forward – but the future landscape of the industry is also being sculpted in courtrooms and parliaments

    Image credit: Disney / Epic Games

    Opinion

    by Rob Fahey
    Contributing Editor

    Published on June 13, 2025

    In some regards, the past couple of weeks have felt rather reassuring.
    We've just seen a hugely successful launch for a new Nintendo console, replete with long queues for midnight sales events. Over the next few days, the various summer events and showcases that have sprouted amongst the scattered bones of E3 generated waves of interest and hype for a host of new games.
    It all feels like old times. It's enough to make you imagine that while change is the only constant, at least we're facing change that's fairly well understood, change in the form of faster, cheaper silicon, or bigger, more ambitious games.
    If only the winds that blow through this industry all came from such well-defined points on the compass. Nestled in amongst the week's headlines, though, was something that's likely to have profound but much harder to understand impacts on this industry and many others over the coming years – a lawsuit being brought by Disney and NBC Universal against Midjourney, operators of the eponymous generative AI image creation tool.
    In some regards, the lawsuit looks fairly straightforward; the arguments made and considered in reaching its outcome, though, may have a profound impact on both the ability of creatives and media companies (including game studios and publishers) to protect their IP rights from a very new kind of threat, and the ways in which a promising but highly controversial and risky new set of development and creative tools can be used commercially.
    A more likely tack on Midjourney's side will be the argument that they are not responsible for what their customers create with the tool
    I say the lawsuit looks straightforward from some angles, but honestly overall it looks fairly open and shut – the media giants accuse Midjourney of replicating their copyrighted characters and material, and of essentially building a machine for churning out limitless copyright violations.
    The evidence submitted includes screenshot after screenshot of Midjourney generating pages of images of famous copyrighted and trademarked characters ranging from Yoda to Homer Simpson, so "no we didn't" isn't going to be much of a defence strategy here.
    A more likely tack on Midjourney's side will be the argument that they are not responsible for what their customers create with the tool – you don't sue the manufacturers of oil paints or canvases when artists use them to paint something copyright-infringing, nor does Microsoft get sued when someone writes something libellous in Word, and Midjourney may try to argue that their software belongs in that tool category, with users alone being ultimately responsible for how they use them.

    If that argument prevails and survives appeals and challenges, it would be a major triumph for the nascent generative AI industry and a hugely damaging blow to IP holders and creatives, since it would seriously undermine their argument that AI companies shouldn't be able to include copyrighted material into training data sets without licensing or compensation.
    The reason Disney and NBCU are going after Midjourney specifically seems to be partially down to Midjourney being especially reticent to negotiate with them about licensing fees and prompt restrictions; other generative AI firms have started talking, at least, about paying for content licenses for training data, and have imposed various limitations on their software to prevent the most egregious and obvious forms of copyright violation (at least for famous characters belonging to rich companies; if you're an individual or a smaller company, it's entirely the Wild West out there as regards your IP rights).
    In the process, though, they're essentially risking a court showdown over a set of not-quite-clear legal questions at the heart of this dispute, and if Midjourney were to prevail in that argument, other AI companies would likely back off from engaging with IP holders on this topic.
    To be clear, though, it seems highly unlikely that Midjourney will win that argument, at least not in the medium to long term. Yet depending on how this case moves forward, losing the argument could have equally dramatic consequences – especially if the courts find themselves compelled to consider the question of how, exactly, a generative AI system reproduces a copyrighted character with such precision without storing copyright-infringing data in some manner.
    The 2020s are turning out to be the decade in which many key regulatory issues come to a head all at once
    AI advocates have been trying to handwave around this notion from the outset, but at some point a court is going to have to sit down and confront the fact that the precision with which these systems can replicate copyrighted characters, scenes, and other materials requires that they must have stored that infringing material in some form.
    That it's stored as a scattered mesh of probabilities across the vertices of a high-dimensional vector array, rather than a straightforward, monolithic media file, is clearly important but may ultimately be considered moot. If the data is in the system and can be replicated on request, how that differs from Napster or The Pirate Bay is arguably just a matter of technical obfuscation.
    Not having to defend that technical argument in court thus far has been a huge boon to the generative AI field; if it is knocked over in that venue, it will have knock-on effects on every company in the sector and on every business that uses their products.
    Nobody can be quite sure which of the various rocks and pebbles being kicked on this slope is going to set off the landslide, but there seems to be an increasing consensus that a legal and regulatory reckoning is coming for generative AI.
    Consequently, a lot of what's happening in that market right now has the feel of companies desperately trying to establish products and lock in revenue streams before that happens, because it'll be harder to regulate a technology that's genuinely integrated into the world's economic systems than it is to impose limits on one that's currently only clocking up relatively paltry sales and revenues.

    Keeping an eye on this is crucial for any industry that's started experimenting with AI in its workflows – none more than a creative industry like video games, where various forms of AI usage have been posited, although the enthusiasm and buzz so far massively outweighs any tangible benefits from the technology.
    Regardless of what happens in legal and regulatory contexts, AI is already a double-edged sword for any creative industry.
    Used judiciously, it might help to speed up development processes and reduce overheads. Applied in a slapdash or thoughtless manner, it can and will end up wreaking havoc on development timelines, filling up storefronts with endless waves of vaguely-copyright-infringing slop, and potentially making creative firms, from the industry's biggest companies to its smallest indie developers, into victims of impossibly large-scale copyright infringement rather than beneficiaries of a new wave of technology-fuelled productivity.
    The legal threat now hanging over the sector isn't new, merely amplified. We've known for a long time that AI-generated artwork, code, and text has significant problems from the perspective of intellectual property rights (you can infringe someone else's copyright with it, but generally can't impose your own copyright on its creations – opening careless companies up to a risk of having key assets in their game being technically public domain and impossible to protect).
    Even if you're not using AI yourself, however – even if you're vehemently opposed to it on moral and ethical grounds (which is entirely valid given the highly dubious land-grab these companies have done for their training data), the Midjourney judgement and its fallout may well impact the creative work you produce yourself and how it ends up being used and abused by these products in future.
    This all has huge ramifications for the games business and will shape everything from how games are created to how IP can be protected for many years to come – a wind of change that's very different and vastly more unpredictable than those we're accustomed to. It's a reminder of just how much of the industry's future is currently being shaped not in development studios and semiconductor labs, but rather in courtrooms and parliamentary committees.
    The ways in which generative AI can be used and how copyright can persist in the face of it will be fundamentally shaped in courts and parliaments, but it's far from the only crucially important topic being hashed out in those venues.
    The ongoing legal turmoil over the opening up of mobile app ecosystems, too, will have huge impacts on the games industry. Meanwhile, the debates over loot boxes, gambling, and various consumer protection aspects related to free-to-play models continue to rumble on in the background.
    Because the industry moves fast while governments move slow, it's easy to forget that that's still an active topic as far as governments are concerned, and hammers may come down at any time.
    Regulation by governments, whether through the passage of new legislation or the interpretation of existing laws in the courts, has always loomed in the background of any major industry, especially one with strong cultural relevance. The games industry is no stranger to that being part of the background heartbeat of the business.
    The 2020s, however, are turning out to be the decade in which many key regulatory issues come to a head all at once, whether it's AI and copyright, app stores and walled gardens, or loot boxes and IAP-based business models.
    Rulings on those topics in various different global markets will create a complex new landscape that will shape the winds that blow through the business, and how things look in the 2030s and beyond will be fundamentally impacted by those decisions.