• NVIDIA CEO Drops the Blueprint for Europe’s AI Boom

    At GTC Paris — held alongside VivaTech, Europe’s largest tech event — NVIDIA founder and CEO Jensen Huang delivered a clear message: Europe isn’t just adopting AI — it’s building it.
    “We now have a new industry, an AI industry, and it’s now part of the new infrastructure, called intelligence infrastructure, that will be used by every country, every society,” Huang said, addressing an audience gathered online and at the iconic Dôme de Paris.
    From exponential inference growth to quantum breakthroughs, and from infrastructure to industry, agentic AI to robotics, Huang outlined how the region is laying the groundwork for an AI-powered future.

    A New Industrial Revolution
    At the heart of this transformation, Huang explained, are systems like GB200 NVL72 — “one giant GPU” and NVIDIA’s most powerful AI platform yet — now in full production and powering everything from sovereign models to quantum computing.
    “This machine was designed to be a thinking machine, a thinking machine, in the sense that it reasons, it plans, it spends a lot of time talking to itself,” Huang said, walking the audience through the size and scale of these machines and their performance.
    At GTC Paris, Huang showed audience members the innards of some of NVIDIA’s latest hardware.
    There’s more coming, with Huang saying NVIDIA’s partners are now producing 1,000 GB200 systems a week, “and this is just the beginning.” He walked the audience through a range of available systems, from the tiny NVIDIA DGX Spark to rack-mounted RTX PRO Servers.
    Huang explained that NVIDIA is working to help countries use technologies like these to build both AI infrastructure — services built for third parties to use and innovate on — and AI factories, which companies build for their own use, to generate revenue.
    NVIDIA is partnering with European governments, telcos and cloud providers to deploy NVIDIA technologies across the region. NVIDIA is also expanding its network of technology centers across Europe — including new hubs in Finland, Germany, Spain, Italy and the U.K. — to accelerate skills development and quantum growth.
    Quantum Meets Classical
    Europe’s quantum ambitions just got a boost.
    The NVIDIA CUDA-Q platform is live on Denmark’s Gefion supercomputer, opening new possibilities for hybrid AI and quantum engineering. In addition, Huang announced that CUDA-Q is now available on NVIDIA Grace Blackwell systems.
    Across the continent, NVIDIA is partnering with supercomputing centers and quantum hardware builders to advance hybrid quantum-AI research and accelerate quantum error correction.
    “Quantum computing is reaching an inflection point,” Huang said. “We are within reach of being able to apply quantum computing, quantum classical computing, in areas that can solve some interesting problems in the coming years.”
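    For a sense of what programming the platform looks like, here is a minimal CUDA-Q sketch in Python that prepares and samples a two-qubit Bell state (this assumes the `cudaq` package; the kernel name and shot count are illustrative, not from the keynote):

```python
# Minimal CUDA-Q sketch: prepare and sample a two-qubit Bell state.
# Assumes the `cudaq` Python package is installed; names are illustrative.
import cudaq

@cudaq.kernel
def bell():
    qubits = cudaq.qvector(2)     # allocate two qubits in |00>
    h(qubits[0])                  # put the first qubit in superposition
    x.ctrl(qubits[0], qubits[1])  # entangle via controlled-X
    mz(qubits)                    # measure both qubits

# Sampling should return roughly 50/50 counts of "00" and "11".
counts = cudaq.sample(bell, shots_count=1000)
print(counts)
```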
    Sovereign Models, Smarter Agents
    European developers want more control over their models. Enter NVIDIA Nemotron, designed to help build large language models tuned to local needs.
    “And so now you know that you have access to an enhanced open model that is still open, that is top of the leader chart,” Huang said.
    These models will be coming to Perplexity, a reasoning search engine, enabling secure, multilingual AI deployment across Europe.
    “You can now ask and get questions answered in the language, in the culture, in the sensibility of your country,” Huang said.
    Huang explained how NVIDIA is helping countries across Europe build AI infrastructure.
    Every company will build its own agents, Huang said. To help create those agents, Huang introduced a suite of agentic AI blueprints, including an Agentic AI Safety blueprint for enterprises and governments.
    The new NVIDIA NeMo Agent toolkit and NVIDIA AI Blueprint for building data flywheels further accelerate the development of safe, high-performing AI agents.
    To help deploy these agents, NVIDIA is partnering with European governments, telcos and cloud providers to deploy the DGX Cloud Lepton platform across the region, providing instant access to accelerated computing capacity.
    “One model architecture, one deployment, and you can run it anywhere,” Huang said, adding that Lepton is now integrated with Hugging Face, giving developers direct access to global compute.
    The Industrial Cloud Goes Live
    AI isn’t just virtual. It’s powering physical systems, too, sparking a new industrial revolution.
    “We’re working on industrial AI with one company after another,” Huang said, describing work to build digital twins based on the NVIDIA Omniverse platform with companies across the continent.
    Huang explained that everything he showed during his keynote was “computer simulation, not animation” and that it looks beautiful because “it turns out the world is beautiful, and it turns out math is beautiful.”
    To further this work, Huang announced NVIDIA is launching the world’s first industrial AI cloud — to be built in Germany — to help Europe’s manufacturers simulate, automate and optimize at scale.
    “Soon, everything that moves will be robotic,” Huang said. “And the car is the next one.”
    NVIDIA DRIVE, NVIDIA’s full-stack AV platform, is now in production to accelerate the large-scale deployment of safe, intelligent transportation.
    And to show what’s coming next, Huang was joined on stage by Grek, a pint-sized robot, as he talked about how NVIDIA partnered with DeepMind and Disney to build Newton, the world’s most advanced physics training engine for robotics.
    The Next Wave
    The next wave of AI has begun — and it’s exponential, Huang explained.
    “We have physical robots, and we have information robots. We call them agents,” Huang said. “The technology necessary to teach a robot to manipulate, to simulate — and of course, the manifestation of an incredible robot — is now right in front of us.”
    This new era of AI is being driven by a surge in inference workloads. “The number of people using inference has gone from 8 million to 800 million — 100x in just a couple of years,” Huang said.
    To meet this demand, Huang emphasized the need for a new kind of computer: “We need a special computer designed for thinking, designed for reasoning. And that’s what Blackwell is — a thinking machine.”
    Huang, joined on stage by Grek, explained how AI is driving advancements in robotics.
    These Blackwell-powered systems will live in a new class of data centers — AI factories — built to generate tokens, the raw material of modern intelligence.
    “These AI factories are going to generate tokens,” Huang said, turning to Grek with a smile. “And these tokens are going to become your food, little Grek.”
    With that, the keynote closed on a bold vision: a future powered by sovereign infrastructure, agentic AI, robotics — and exponential inference — all built in partnership with Europe.
    Watch the NVIDIA GTC Paris keynote from Huang at VivaTech and explore GTC Paris sessions.
  • Who would have thought impedance matching would become the trendy new pastime? Ralph, as excited as a kid in a candy store, shows us the importance of “matching” the load to the source, as if it were about finding the perfect partner on a dating app. Because, of course, who doesn’t want their circuit to be the life of the electrical party? Impedance and fun, all in one! So the next time you feel low on energy, remember: maybe you just need a little adjustment on your “Smith Chart.”

    #Impedancia #Circuitos #RalphYSuAventura #SmithChart #Energía
    HACKADAY.COM
    Pi Networks the Smith Chart Way
    [Ralph] is excited about impedance matching, and why not? It is important to match the source and load impedance to get the most power out of a circuit. He’s got …
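    Behind the jokes is a simple formula: the Smith Chart plots the reflection coefficient Γ = (Z_L − Z_0)/(Z_L + Z_0). A minimal sketch of the arithmetic (the 75 Ω load and 50 Ω line below are made-up example values, not from the article):

```python
# Sketch: reflection coefficient and VSWR for a mismatched load.
# The 50-ohm line and 75-ohm load are illustrative values only.

def reflection_coefficient(z_load: complex, z0: float = 50.0) -> complex:
    """Gamma = (ZL - Z0) / (ZL + Z0), the quantity a Smith chart plots."""
    return (z_load - z0) / (z_load + z0)

def vswr(gamma: complex) -> float:
    """Voltage standing wave ratio; 1.0 means a perfect match."""
    mag = abs(gamma)
    return (1 + mag) / (1 - mag)

gamma = reflection_coefficient(75 + 0j)   # 75-ohm load on a 50-ohm line
print(f"|Gamma| = {abs(gamma):.3f}")      # 0.200 -> |Gamma|^2 = 4% power reflected
print(f"VSWR   = {vswr(gamma):.2f}")      # 1.50
```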
  • Looker Studio, formerly known as Google Data Studio, is the magical place where data goes to become “insights” with just a few clicks. Because who needs extensive training or complex analysis when you can slap together a dashboard and call it a day? It’s the perfect tool for those who want to look busy while avoiding the actual work of interpreting data. Just throw in some colors, charts, and voilà! You’re a data wizard. The best part? It’s free! Which is ideal for all those who enjoy the thrill of getting something for nothing. So, jump on the Looker Studio bandwagon and let the world believe you’re a data genius—at least until they ask you to explain it.

    #LookerStudio #Data
    What is Looker Studio?
    In the world of data analysis, visualization is key to interpreting and communicating insights effectively. Looker Studio, formerly known as Google Data Studio, is Google’s answer to this need, offering a platform…
  • Ankur Kothari Q&A: Customer Engagement Book Interview

    Reading Time: 9 minutes
    In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns.
    But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question, we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic.
    This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die” explores how businesses can translate raw data into actionable insights that drive real results.
    Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.

     
    Ankur Kothari Q&A Interview
    1. What types of customer engagement data are most valuable for making strategic business decisions?
    Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interaction, purchase history, and other app usage patterns.
    Second would be demographic information: age, location, income, and other relevant personal characteristics.
    Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews.
    Fourth would be the customer journey data.

    We track touchpoints across the customers’ various channels to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data.

    2. How do you distinguish between data that is actionable versus data that is just noise?
    First is keeping it relevant to your business objectives: actionable data directly relates to your specific goals or KPIs. Then we take help from statistical significance.
    Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in.

    You also want to make sure that there is consistency across sources.
    Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory.
    Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy.

    By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions.
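    As a concrete illustration of the “statistically valid patterns” criterion above (a minimal sketch with made-up conversion counts, not an example from the interview), a two-proportion z-test is one simple way to check whether a lift in an engagement metric is more than random fluctuation:

```python
# Sketch: two-proportion z-test for an engagement lift.
# Conversion counts and sample sizes below are hypothetical.
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for H0: the two underlying conversion rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pooled = (conv_a + conv_b) / (n_a + n_b)       # pooled rate under H0
    se = math.sqrt(p_pooled * (1 - p_pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(480, 12_000, 560, 12_000)
print(f"z = {z:.2f}")  # |z| > 1.96 -> significant at the 5% level
```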

    3. How can customer engagement data be used to identify and prioritize new business opportunities?
    First, it helps us to uncover unmet needs.

    By analyzing the customer feedback, touch points, support interactions, or usage patterns, we can identify the gaps in our current offerings or areas where customers are experiencing pain points.

    Second would be identifying emerging needs.
    Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing the company to adapt its products or services accordingly.
    Third would be segmentation analysis.
    Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies.
    Last is to build competitive differentiation.

    Engagement data can highlight where our companies outperform competitors, helping us to prioritize opportunities that leverage existing strengths and unique selling propositions.

    4. Can you share an example of where data insights directly influenced a critical decision?
    I will share an example from my previous organization, a financial services company where we were very data-driven, which made a major impact on a critical decision regarding our credit card offerings.
    We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms.
    That insight led us to develop and launch our first digital credit card product with enhanced mobile features and rewards tailored to the millennial spending habits. Since we had access to a lot of transactional data as well, we were able to build a financial product which met that specific segment’s needs.

    That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial.

    5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time?
    When it comes to using the engagement data in real time, we do quite a few things. In the past two or three years, we have been using it for dynamic content personalization, adjusting the website content, email messaging, or ad creative based on real-time user behavior and preferences.
    We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments.
    Then we also build responsive social media engagement platforms like monitoring social media sentiments and trending topics to quickly adapt the messaging and create timely and relevant content.

    Alongside one-on-one personalization, we do a lot of A/B testing as part of overall rapid testing of marketing elements like subject lines and CTAs, building various successful variants of the campaigns.

    6. How are you doing the 1:1 personalization?
    We have advanced CDP systems, and we are tracking each customer’s behavior in real-time. So the moment they move to different channels, we know what the context is, what the relevance is, and the recent interaction points, so we can cater the right offer.
    So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer.
    That gives our customer or potential customer more one-to-one personalization instead of just segment-based or bulk interaction kind of experience.

    We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers.
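    A minimal sketch of the cross-channel handoff described above (a generic in-memory profile store with hypothetical field names, not the actual CDP in use) shows the core idea: events from any channel merge onto one customer ID, so the next touchpoint sees the full context.

```python
# Sketch of a CDP-style profile store: merge events from any channel
# into one per-customer profile. Field names are hypothetical.
from collections import defaultdict

profiles = defaultdict(lambda: {"last_offer": None, "channels": set()})

def ingest(customer_id: str, channel: str, offer: str | None = None) -> None:
    """Record an interaction; remember the most recent offer viewed."""
    profile = profiles[customer_id]
    profile["channels"].add(channel)
    if offer is not None:
        profile["last_offer"] = offer

ingest("cust-42", "web", offer="travel-card")  # browsed an offer online
ingest("cust-42", "branch")                    # walks in the next day
print(profiles["cust-42"])  # the branch agent sees the web offer context
```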

    7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service?
    Primarily with product development — we have different products, not just the financial products (or whatever products an organization sells), but also products like the mobile apps or websites customers use for transactions. So that kind of product development gets improved.
    The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments.

    Customer service also benefits, by anticipating common issues, personalizing support interactions over the phone, email, or chat, and proactively addressing potential problems, leading to improved customer satisfaction and retention.

    So in general, cross-functional application of engagement improves the customer-centric approach throughout the organization.

    8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights?
    I think it’s the huge amount of data we are dealing with. As customers become more digitally savvy and most of them move to digital channels, we are getting a lot of data, and that sheer volume can be overwhelming, making it very difficult to identify truly meaningful patterns and insights.

    Because of the huge data overload, we create data silos in this process, so information often exists in separate systems across different departments. We are not able to build a holistic view of customer engagement.

    Because of data silos and overload of data, data quality issues appear. There is inconsistency, and inaccurate data can lead to incorrect insights or poor decision-making. Quality issues could also be due to the wrong format of the data, or the data is stale and no longer relevant.
    As we are growing and adding more people to help us understand customer engagement, I’ve also noticed that technical folks, especially data scientists and data analysts, lack the skills to properly interpret the data or apply data insights effectively.
    So there’s a lack of understanding of marketing and sales as domains.
    It’s a huge effort and can take a lot of investment.

    Not being able to calculate the ROI of your overall investment is a big challenge that many organizations are facing.

    9. Why do you think the analysts don’t have the business acumen to properly do more than analyze the data?
    If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. If you cannot stop that from step one—not bringing noise into the data system—that cannot be done by just technical folks or people who do not have business knowledge.
    Business people do not know everything about what data is being collected from which source and what data they need. It’s a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don’t have a lot of exposure to that side.

    Similarly, marketing business people do not have much exposure to the technical side — what’s possible to do with data, how much effort it takes, what’s relevant versus not relevant, and how to prioritize which data sources will be most important.

    10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before?
    First, cross-functional training: training different roles to help them understand why we’re doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do.
    And giving business folks exposure to the technology side through training on different tools, strategies, and the roadmap of data integrations.
    The second is helping teams work more collaboratively. So it’s not like the technology team works in a silo and comes back when their work is done, and then marketing and sales teams act upon it.

    Now we’re making it more like one team. You work together so that you can complement each other, and we have a better strategy from day one.

    11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations?
    We present clear business cases where we demonstrate how data-driven recommendations can directly align with business objectives and potential ROI.
    We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and the implications for business goals.

    We also do a lot of POCs and pilot projects with small-scale implementations to showcase tangible results and build confidence in the data-driven approach throughout the organization.

    12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data?
    I’ve found that Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touch points.
    Having advanced analytics platforms — tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights — is a great value to us.
    We always use, or many organizations use, marketing automation systems to improve marketing team productivity, helping us track and analyze customer interactions across multiple channels.
    Another thing is social media listening tools, for tracking wherever your brand is mentioned, measuring customer sentiment on social media, or following the engagement of your campaigns across social media platforms.

    Last is web analytics tools, which provide detailed insights into your website visitors’ behaviors and engagement metrics, for browser apps across various devices as well as mobile apps.

    13. How do you ensure data quality and consistency across multiple channels to make these informed decisions?
    We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. Then we use data integration platforms — tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies.
    While we collect data from different sources, we clean the data so it becomes cleaner with every stage of processing.
    We also conduct regular data audits — performing periodic checks to identify and rectify data quality issues, ensuring accuracy and reliability of information. We also deploy standardized data formats.

    On top of that, we have various automated data cleansing tools, specific software to detect and correct data errors, redundancies, duplicates, and inconsistencies in data sets automatically.
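    A minimal sketch of this kind of automated cleansing (a generic pandas example; the column names and values are hypothetical, not from any system mentioned in the interview):

```python
# Sketch: automated deduplication and bad-record removal with pandas.
# Column names and values are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [101, 101, 102, 103, 103],
    "channel":     ["web", "web", "email", "app", "app"],
    "event_ts":    ["2024-05-01", "2024-05-01", "2024-05-02", None, "2024-05-03"],
})

# Standardize the timestamp format; unparseable values become NaT.
df["event_ts"] = pd.to_datetime(df["event_ts"], errors="coerce")

# Remove exact duplicate events and rows with missing timestamps.
df = df.drop_duplicates(subset=["customer_id", "channel", "event_ts"])
df = df.dropna(subset=["event_ts"])
print(df)  # 3 clean rows remain out of the original 5
```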

    14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years?
    The first thing that’s been the biggest trend from the past two years is AI-driven decision making, which I think will become more prevalent, with advanced algorithms processing vast amounts of engagement data in real-time to inform strategic choices.
    Somewhat related to this is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with more accuracy and better predictive capabilities.
    We also touched upon hyper-personalization. We are all striving toward hyper-personalization at scale, meaning true one-on-one personalization, as we capture more engagement data and have bigger systems and infrastructure to process those large volumes, so we can achieve those hyper-personalization use cases.
    As the world is collecting more data, privacy concerns and regulations come into play.
    I believe in the next few years there will be more innovation toward how businesses can collect data ethically and what the usage practices are, leading to more transparent and consent-based engagement data strategies.
    And lastly, I think about the integration of engagement data, which is always a big challenge. I believe as we’re solving those integration challenges, we are adding more and more complex data sources to the picture.

    So I think there will need to be more innovation or sophistication brought into data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.

     
    This interview Q&A was hosted with Ankur Kothari, a former Martech executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die.
    Download the PDF or request a physical copy of the book here.
    #ankur #kothari #qampampa #customer #engagement
    Ankur Kothari Q&A: Customer Engagement Book Interview
    Reading Time: 9 minutes In marketing, data isn’t a buzzword. It’s the lifeblood of all successful campaigns. But are you truly harnessing its power, or are you drowning in a sea of information? To answer this question, we sat down with Ankur Kothari, a seasoned Martech expert, to dive deep into this crucial topic. This interview, originally conducted for Chapter 6 of “The Customer Engagement Book: Adapt or Die” explores how businesses can translate raw data into actionable insights that drive real results. Ankur shares his wealth of knowledge on identifying valuable customer engagement data, distinguishing between signal and noise, and ultimately, shaping real-time strategies that keep companies ahead of the curve.   Ankur Kothari Q&A Interview 1. What types of customer engagement data are most valuable for making strategic business decisions? Primarily, there are four different buckets of customer engagement data. I would begin with behavioral data, encompassing website interaction, purchase history, and other app usage patterns. Second would be demographic information: age, location, income, and other relevant personal characteristics. Third would be sentiment analysis, where we derive information from social media interaction, customer feedback, or other customer reviews. Fourth would be the customer journey data. We track touchpoints across various channels of the customers to understand the customer journey path and conversion. Combining these four primary sources helps us understand the engagement data. 2. How do you distinguish between data that is actionable versus data that is just noise? First is keeping relevant to your business objectives, making actionable data that directly relates to your specific goals or KPIs, and then taking help from statistical significance. Actionable data shows clear patterns or trends that are statistically valid, whereas other data consists of random fluctuations or outliers, which may not be what you are interested in. You also want to make sure that there is consistency across sources. Actionable insights are typically corroborated by multiple data points or channels, while other data or noise can be more isolated and contradictory. Actionable data suggests clear opportunities for improvement or decision making, whereas noise does not lead to meaningful actions or changes in strategy. By applying these criteria, I can effectively filter out the noise and focus on data that delivers or drives valuable business decisions. 3. How can customer engagement data be used to identify and prioritize new business opportunities? First, it helps us to uncover unmet needs. By analyzing the customer feedback, touch points, support interactions, or usage patterns, we can identify the gaps in our current offerings or areas where customers are experiencing pain points. Second would be identifying emerging needs. Monitoring changes in customer behavior or preferences over time can reveal new market trends or shifts in demand, allowing my company to adapt their products or services accordingly. Third would be segmentation analysis. Detailed customer data analysis enables us to identify unserved or underserved segments or niche markets that may represent untapped opportunities for growth or expansion into newer areas and new geographies. Last is to build competitive differentiation. Engagement data can highlight where our companies outperform competitors, helping us to prioritize opportunities that leverage existing strengths and unique selling propositions. 4. 
Can you share an example of where data insights directly influenced a critical decision? I will share an example from my previous organization at one of the financial services where we were very data-driven, which made a major impact on our critical decision regarding our credit card offerings. We analyzed the customer engagement data, and we discovered that a large segment of our millennial customers were underutilizing our traditional credit cards but showed high engagement with mobile payment platforms. That insight led us to develop and launch our first digital credit card product with enhanced mobile features and rewards tailored to the millennial spending habits. Since we had access to a lot of transactional data as well, we were able to build a financial product which met that specific segment’s needs. That data-driven decision resulted in a 40% increase in our new credit card applications from this demographic within the first quarter of the launch. Subsequently, our market share improved in that specific segment, which was very crucial. 5. Are there any other examples of ways that you see customer engagement data being able to shape marketing strategy in real time? When it comes to using the engagement data in real-time, we do quite a few things. In the recent past two, three years, we are using that for dynamic content personalization, adjusting the website content, email messaging, or ad creative based on real-time user behavior and preferences. We automate campaign optimization using specific AI-driven tools to continuously analyze performance metrics and automatically reallocate the budget to top-performing channels or ad segments. Then we also build responsive social media engagement platforms like monitoring social media sentiments and trending topics to quickly adapt the messaging and create timely and relevant content. With one-on-one personalization, we do a lot of A/B testing as part of the overall rapid testing and market elements like subject lines, CTAs, and building various successful variants of the campaigns. 6. How are you doing the 1:1 personalization? We have advanced CDP systems, and we are tracking each customer’s behavior in real-time. So the moment they move to different channels, we know what the context is, what the relevance is, and the recent interaction points, so we can cater the right offer. So for example, if you looked at a certain offer on the website and you came from Google, and then the next day you walk into an in-person interaction, our agent will already know that you were looking at that offer. That gives our customer or potential customer more one-to-one personalization instead of just segment-based or bulk interaction kind of experience. We have a huge team of data scientists, data analysts, and AI model creators who help us to analyze big volumes of data and bring the right insights to our marketing and sales team so that they can provide the right experience to our customers. 7. What role does customer engagement data play in influencing cross-functional decisions, such as with product development, sales, and customer service? Primarily with product development — we have different products, not just the financial products or products whichever organizations sell, but also various products like mobile apps or websites they use for transactions. So that kind of product development gets improved. 
The engagement data helps our sales and marketing teams create more targeted campaigns, optimize channel selection, and refine messaging to resonate with specific customer segments. Customer service also gets helped by anticipating common issues, personalizing support interactions over the phone or email or chat, and proactively addressing potential problems, leading to improved customer satisfaction and retention. So in general, cross-functional application of engagement improves the customer-centric approach throughout the organization. 8. What do you think some of the main challenges marketers face when trying to translate customer engagement data into actionable business insights? I think the huge amount of data we are dealing with. As we are getting more digitally savvy and most of the customers are moving to digital channels, we are getting a lot of data, and that sheer volume of data can be overwhelming, making it very difficult to identify truly meaningful patterns and insights. Because of the huge data overload, we create data silos in this process, so information often exists in separate systems across different departments. We are not able to build a holistic view of customer engagement. Because of data silos and overload of data, data quality issues appear. There is inconsistency, and inaccurate data can lead to incorrect insights or poor decision-making. Quality issues could also be due to the wrong format of the data, or the data is stale and no longer relevant. As we are growing and adding more people to help us understand customer engagement, I’ve also noticed that technical folks, especially data scientists and data analysts, lack skills to properly interpret the data or apply data insights effectively. So there’s a lack of understanding of marketing and sales as domains. It’s a huge effort and can take a lot of investment. Not being able to calculate the ROI of your overall investment is a big challenge that many organizations are facing. 9. Why do you think the analysts don’t have the business acumen to properly do more than analyze the data? If people do not have the right idea of why we are collecting this data, we collect a lot of noise, and that brings in huge volumes of data. If you cannot stop that from step one—not bringing noise into the data system—that cannot be done by just technical folks or people who do not have business knowledge. Business people do not know everything about what data is being collected from which source and what data they need. It’s a gap between business domain knowledge, specifically marketing and sales needs, and technical folks who don’t have a lot of exposure to that side. Similarly, marketing business people do not have much exposure to the technical side — what’s possible to do with data, how much effort it takes, what’s relevant versus not relevant, and how to prioritize which data sources will be most important. 10. Do you have any suggestions for how this can be overcome, or have you seen it in action where it has been solved before? First, cross-functional training: training different roles to help them understand why we’re doing this and what the business goals are, giving technical people exposure to what marketing and sales teams do. And giving business folks exposure to the technology side through training on different tools, strategies, and the roadmap of data integrations. The second is helping teams work more collaboratively. 
So it’s not like the technology team works in a silo and comes back when their work is done, and then marketing and sales teams act upon it. Now we’re making it more like one team. You work together so that you can complement each other, and we have a better strategy from day one. 11. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations? We present clear business cases where we demonstrate how data-driven recommendations can directly align with business objectives and potential ROI. We build compelling visualizations, easy-to-understand charts and graphs that clearly illustrate the insights and the implications for business goals. We also do a lot of POCs and pilot projects with small-scale implementations to showcase tangible results and build confidence in the data-driven approach throughout the organization. 12. What technologies or tools have you found most effective for gathering and analyzing customer engagement data? I’ve found that Customer Data Platforms help us unify customer data from various sources, providing a comprehensive view of customer interactions across touch points. Having advanced analytics platforms — tools with AI and machine learning capabilities that can process large volumes of data and uncover complex patterns and insights — is a great value to us. We always use, or many organizations use, marketing automation systems to improve marketing team productivity, helping us track and analyze customer interactions across multiple channels. Another thing is social media listening tools, wherever your brand is mentioned or you want to measure customer sentiment over social media, or track the engagement of your campaigns across social media platforms. Last is web analytical tools, which provide detailed insights into your website visitors’ behaviors and engagement metrics, for browser apps, small browser apps, various devices, and mobile apps. 13. How do you ensure data quality and consistency across multiple channels to make these informed decisions? We established clear guidelines for data collection, storage, and usage across all channels to maintain consistency. Then we use data integration platforms — tools that consolidate data from various sources into a single unified view, reducing discrepancies and inconsistencies. While we collect data from different sources, we clean the data so it becomes cleaner with every stage of processing. We also conduct regular data audits — performing periodic checks to identify and rectify data quality issues, ensuring accuracy and reliability of information. We also deploy standardized data formats. On top of that, we have various automated data cleansing tools, specific software to detect and correct data errors, redundancies, duplicates, and inconsistencies in data sets automatically. 14. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years? The first thing that’s been the biggest trend from the past two years is AI-driven decision making, which I think will become more prevalent, with advanced algorithms processing vast amounts of engagement data in real-time to inform strategic choices. Somewhat related to this is predictive analytics, which will play an even larger role, enabling businesses to anticipate customer needs and market trends with more accuracy and better predictive capabilities. We also touched upon hyper-personalization. 
We are all trying to strive toward more hyper-personalization at scale, which is more one-on-one personalization, as we are increasingly capturing more engagement data and have bigger systems and infrastructure to support processing those large volumes of data so we can achieve those hyper-personalization use cases. As the world is collecting more data, privacy concerns and regulations come into play. I believe in the next few years there will be more innovation toward how businesses can collect data ethically and what the usage practices are, leading to more transparent and consent-based engagement data strategies. And lastly, I think about the integration of engagement data, which is always a big challenge. I believe as we’re solving those integration challenges, we are adding more and more complex data sources to the picture. So I think there will need to be more innovation or sophistication brought into data integration strategies, which will help us take a truly customer-centric approach to strategy formulation.   This interview Q&A was hosted with Ankur Kothari, a previous Martech Executive, for Chapter 6 of The Customer Engagement Book: Adapt or Die. Download the PDF or request a physical copy of the book here. The post Ankur Kothari Q&A: Customer Engagement Book Interview appeared first on MoEngage. #ankur #kothari #qampampa #customer #engagement
  • Inside Summer Game Fest 2025: How Geoff Keighley and Producers Pulled Off Event Amid Industry Layoffs, ‘GTA 6’ Delay and Switch 2 Release

With the ongoing job cuts across the gaming industry, the shift of “Grand Theft Auto 6” from release this fall to a launch next spring, and the distraction of the first new Nintendo console in eight years, there was a chance that Summer Game Fest 2025 wouldn’t have the same allure as the annual video game showcase has had in years past. (There was also the factor of the actors strike against video game companies, which, as of June 11, has been called off by SAG-AFTRA.)

    But the gamers came out in full force for the Geoff Keighley-hosted event on June 6, which live-streamed out of the YouTube Theater at SoFi Stadium in Los Angeles.

    “Viewership was up significantly year over year,” Keighley told Variety. “Stream charts said it doubled its audience year over year for the peak concurrency to over 3 million peak concurrent viewers, which does not include China.”

In person, both the Summer Game Fest live showcase event and its subsequent weekend Play Days event for developers and press saw “significantly higher” media and creator attendance this year: more than 600 registered attendees vs. “somewhere in the 400s” in 2024, per SGF. The boost is an indicator that both the current U.S. political climate and significant changes in 2025’s game release schedule, like the delay of “Grand Theft Auto 6” until next May, didn’t affect interest in the event.

“Things happen in the industry all the time that are big, newsworthy happenings,” said Summer Game Fest producer and iam8bit co-creator Amanda White. “Switch 2 just happened and we’re here, it’s all working out, everybody’s having a great time playing games. It’s not irrelevant — it’s just part of the way things go.”

As big a hit as the Switch 2 was with consumers upon release — selling more than 3.5 million units during the first four days after its June 5 launch — and as often as it was mentioned during the Summer Game Fest live showcase on June 6, Nintendo’s new console was not the star of the three-day Play Days event for developers and media in Downtown Los Angeles, which ran June 7-9.

    “I have not seen a single attendee with a Switch 2 on campus,” SGF producer and iam8bit co-creator Jon M. Gibson said with a laugh. “There’s a few Switch 2s that Nintendo supplied. Some dev kits for Bandai and for Capcom. Of course, the launch happened on Thursday, so bandwidth from Nintendo is stretched thin with all the midnight launches and stuff. But they’re really supportive and supply some for some pre-release games, which is exciting.”

Some big video game publishers such as EA, Take-Two and Ubisoft skipped this year’s SGF, eliminating potential splashy in-show moments for eagerly anticipated games like “Grand Theft Auto 6.” But SGF still managed a few big moments, like the announcement and trailer release for “Resident Evil Requiem.” Gibson and White attribute that reveal and other moments like it to the immense trust the festival has managed to build up with video game publishers in just a few years.

    “We are very proud of our ability to keep the trust of all the publishers on campus,” Gibson said. “Six years into SGF as a whole, four years into Play Days, we’re very good. Because we have to print everything ahead of time, too. So there are lots of unannounced things that we’re very careful about who sees what. We have vendors who print and produce and manufacture physical objects under very tight wraps. We’re just very protective, because we know what it means to have to keep a secret because we’ve had our own games that we’ve had to announce, as well. Capcom is a great example with ‘Resident Evil.’ We knew that for a very long time, but they trusted us with information, and we were very careful about what our team actually knew what was going on.”

    And even though some of the gaming giants sat this year out, White says conversations were already happening on the Play Days campus about who is ready to return next year and what they’ll bring.

    “People get excited, they come and see. And each year we grow, so people see more potential,” White said.

    As for next year, the June show will take place just a few weeks after the planned May 26 release for “GTA 6.” While Switch 2 didn’t seem to distract too much, will the draw of playing the newly launched “GTA 6” prove to be so powerful it outshines whatever could be announced at SGF 2026?

    “My view is that all boats rise with ‘GTA’ launch,” Keighley said. “It is a singular cultural event that is the biggest thing in all of entertainment this decade. It will bring more people into gaming, sell lots of consoles and bring back lapsed gamers. There will never be a better time to feel the excitement and energy around gaming than SGF 2026.”

    See more from Variety‘s Q&A with Keighley about Summer Game Fest 2025 below.

    How was this year’s show impacted by the date shift for “GTA 6”? How much was planned before and after that big announcement?

    So far as I know there wasn’t any material impact, but I think the date move did allow a number of teams to feel more confident announcing their launch dates.

    Halfway through the year, what do you see as some of the biggest trends in gaming for 2025, and how did you look to reflect that in the show?

    We continue to see some of the most interesting and successful games come from smaller teams outside of the traditional publisher system – games like “Clair Obscur,” “Blue Prince” and “REPO.” So we wanted to highlight some of those projects at the show like “Ill” and “Mortal Shell 2.”

    What game announcements and trailers do you think resonated most with audiences after this show? What assets were the most popular?

    “Resident Evil Requiem” was a massive moment. Also we saw a lot of love for “Ill” from a small team in Canada and Armenia.
  • Fusion and AI: How private sector tech is powering progress at ITER

    In April 2025, at the ITER Private Sector Fusion Workshop in Cadarache, something remarkable unfolded. In a room filled with scientists, engineers and software visionaries, the line between big science and commercial innovation began to blur.  
Three organisations – Microsoft Research, Arena and Brigantium Engineering – shared how artificial intelligence (AI), already transforming everything from language models to logistics, is now stepping into a new role: helping humanity to unlock the power of nuclear fusion. 
    Each presenter addressed a different part of the puzzle, but the message was the same: AI isn’t just a buzzword anymore. It’s becoming a real tool – practical, powerful and indispensable – for big science and engineering projects, including fusion. 
    “If we think of the agricultural revolution and the industrial revolution, the AI revolution is next – and it’s coming at a pace which is unprecedented,” said Kenji Takeda, director of research incubations at Microsoft Research. 
    Microsoft’s collaboration with ITER is already in motion. Just a month before the workshop, the two teams signed a Memorandum of Understanding (MoU) to explore how AI can accelerate research and development. This follows ITER’s initial use of Microsoft technology to empower their teams.
    A chatbot built on the Azure OpenAI service was developed to help staff navigate technical knowledge across more than a million ITER documents using natural conversation. GitHub Copilot assists with coding, while AI helps to resolve IT support tickets – those everyday but essential tasks that keep the lights on. 
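    In practice, a document-grounded chatbot like the one described usually follows a retrieval-augmented pattern: fetch the passages most relevant to a question, then have the model answer from them. Below is a minimal sketch using the Azure OpenAI Python SDK; the endpoint, deployment name and the tiny in-memory "index" are placeholders for illustration, not ITER's actual system.

```python
# Minimal retrieval-augmented chat sketch. The endpoint, deployment name
# and the tiny in-memory "index" are placeholders, not ITER's real system.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
    azure_endpoint="https://example.openai.azure.com",  # placeholder endpoint
)

DOCS = [  # stand-in for an index over a million documents
    "Cryostat welding spec rev. 7: all seams require helium leak testing.",
    "Tokamak assembly note: sector modules are lifted by the heavy crane.",
]

def search_documents(query: str, k: int = 2) -> list[str]:
    # Naive keyword overlap; a real system would use a vector index.
    def score(d: str) -> int:
        return sum(w in d.lower() for w in query.lower().split())
    return sorted(DOCS, key=score, reverse=True)[:k]

def ask(question: str) -> str:
    context = "\n\n".join(search_documents(question))
    resp = client.chat.completions.create(
        model="gpt-4o",  # your Azure *deployment* name goes here
        messages=[
            {"role": "system",
             "content": "Answer only from these excerpts:\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

print(ask("How are cryostat welds tested?"))
```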
    But Microsoft’s vision goes deeper. Fusion demands materials that can survive extreme conditions – heat, radiation, pressure – and that’s where AI shows a different kind of potential. MatterGen, a Microsoft Research generative AI model for materials, designs entirely new materials based on specific properties.
    “It’s like ChatGPT,” said Takeda, “but instead of ‘Write me a poem’, we ask it to design a material that can survive as the first wall of a fusion reactor.” 
    The next step? MatterSim – a simulation tool that predicts how these imagined materials will behave in the real world. By combining generation and simulation, Microsoft hopes to uncover materials that don’t yet exist in any catalogue. 
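    The generate-then-screen pairing Takeda describes boils down to a simple loop: propose candidates conditioned on target properties, score each with a simulator, and keep the best. A schematic sketch follows; generate_candidates() and predict_stability() are toy stand-ins invented for illustration, not the real MatterGen or MatterSim interfaces.

```python
# Schematic generate-then-screen loop. generate_candidates() and
# predict_stability() are toy stand-ins, NOT the real MatterGen/MatterSim APIs.
import random

def generate_candidates(target_melting_k: float, n: int) -> list[dict]:
    # A generative model would propose structures conditioned on the target
    # properties; here we fabricate candidates with noisy property values.
    return [{"id": i, "melting_k": random.gauss(target_melting_k, 300)}
            for i in range(n)]

def predict_stability(c: dict) -> float:
    # A learned simulator would estimate behaviour under reactor conditions;
    # this toy score simply favours higher melting points.
    return c["melting_k"] / 4000.0

candidates = generate_candidates(target_melting_k=3200, n=100)  # first-wall-like target
shortlist = sorted(candidates, key=predict_stability, reverse=True)[:5]
for c in shortlist:
    print(f"candidate {c['id']}: predicted melting point {c['melting_k']:.0f} K")
```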
    While Microsoft tackles the atomic scale, Arena is focused on a different challenge: speeding up hardware development. As general manager Michael Frei put it: “Software innovation happens in seconds. In hardware, that loop can take months – or years.” 
    Arena’s answer is Atlas, a multimodal AI platform that acts as an extra set of hands – and eyes – for engineers. It can read data sheets, interpret lab results, analyse circuit diagrams and even interact with lab equipment through software interfaces. “Instead of adjusting an oscilloscope manually,” said Frei, “you can just say, ‘Verify the I2C [inter integrated circuit] protocol’, and Atlas gets it done.” 
    It doesn’t stop there. Atlas can write and adapt firmware on the fly, responding to real-time conditions. That means tighter feedback loops, faster prototyping and fewer late nights in the lab. Arena aims to make building hardware feel a little more like writing software – fluid, fast and assisted by smart tools. 
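    Frei's oscilloscope example reflects how most bench instruments are scripted today: they speak SCPI commands over a VISA connection. A minimal sketch with the pyvisa library is below; the VISA address is a placeholder and measurement commands vary by vendor, so this is illustrative rather than Atlas's actual interface.

```python
# Minimal SCPI-over-VISA sketch; not Atlas's actual interface.
import pyvisa

rm = pyvisa.ResourceManager()
print(rm.list_resources())  # enumerate attached instruments

# Placeholder address; use one reported by list_resources().
scope = rm.open_resource("USB0::0x0699::0x0401::C000000::INSTR")
print(scope.query("*IDN?"))  # IEEE-488.2 identification query, widely supported

# Measurement commands are vendor-specific; this form is typical of some
# oscilloscopes and is shown purely as an example.
freq = scope.query("MEAS:FREQ? CHAN1")
print(f"channel 1 frequency: {float(freq):.1f} Hz")
```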

    Fusion, of course, isn’t just about atoms and code – it’s also about construction. Gigantic, one-of-a-kind machines don’t build themselves. That’s where Brigantium Engineering comes in.
    Founder Lynton Sutton explained how his team uses “4D planning” – a marriage of 3D CAD models and detailed construction schedules – to visualise how everything comes together over time. “Gantt charts are hard to interpret. 3D models are static. Our job is to bring those together,” he said. 
    The result is a time-lapse-style animation that shows the construction process step by step. It’s proven invaluable for safety reviews and stakeholder meetings. Rather than poring over spreadsheets, teams can simply watch the plan come to life. 
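    Stripped to its essentials, 4D planning is a join between schedule tasks and model elements, replayed date by date. Here is a toy sketch under that assumption; the data model is invented for illustration and omits nearly everything a real CAD-and-schedule integration involves.

```python
# Toy "4D" join: schedule tasks linked to CAD elements, replayed by date.
# The data model is invented for illustration.
from dataclasses import dataclass
from datetime import date

@dataclass
class Task:
    name: str
    element_ids: list[str]  # 3D model elements this task erects
    start: date
    finish: date

schedule = [
    Task("Pour bioshield base", ["slab-01"], date(2025, 1, 6), date(2025, 2, 14)),
    Task("Set cryostat ring", ["ring-01", "ring-02"], date(2025, 2, 17), date(2025, 4, 4)),
]

def visible_elements(on: date) -> set[str]:
    """What a 4D viewer would render for this frame: every element whose
    erecting task has finished by the given date."""
    return {e for t in schedule if t.finish <= on for e in t.element_ids}

print(visible_elements(date(2025, 3, 1)))  # {'slab-01'}
```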
    And there’s more. Brigantium is bringing these models into virtual reality using Unreal Engine – the same one behind many video games. One recent model recreated ITER’s tokamak pit using drone footage and photogrammetry. The experience is fully interactive and can even run in a web browser.
    “We’ve really improved the quality of the visualisation,” said Sutton. “It’s a lot smoother; the textures look a lot better. Eventually, we’ll have this running through a web browser, so anybody on the team can just click on a web link to navigate this 4D model.” 
    Looking forward, Sutton believes AI could help automate the painstaking work of syncing schedules with 3D models. One day, these simulations could reach all the way down to individual bolts and fasteners – serving not just as impressive visuals, but as critical tools for preventing delays. 
    Despite the different approaches, one theme ran through all three presentations: AI isn’t just a tool for office productivity. It’s becoming a partner in creativity, problem-solving and even scientific discovery. 
    Takeda mentioned that Microsoft is experimenting with “world models” inspired by how video games simulate physics. These models learn about the physical world by watching pixels in the form of videos of real phenomena such as plasma behaviour. “Our thesis is that if you showed this AI videos of plasma, it might learn the physics of plasmas,” he said. 
    It sounds futuristic, but the logic holds. The more AI can learn from the world, the more it can help us understand it – and perhaps even master it. At its heart, the message from the workshop was simple: AI isn’t here to replace the scientist, the engineer or the planner; it’s here to help, and to make their work faster, more flexible and maybe a little more fun.
    As Takeda put it: “Those are just a few examples of how AI is starting to be used at ITER. And it’s just the start of that journey.” 
    If these early steps are any indication, that journey won’t just be faster – it might also be more inspired. 
  • Over 8M patient records leaked in healthcare data breach

    Published
    June 15, 2025 10:00am EDT
    In the past decade, healthcare data has become one of the most sought-after targets in cybercrime. From insurers to clinics, every player in the ecosystem handles some form of sensitive information. However, breaches do not always originate from hospitals or health apps. Increasingly, patient data is managed by third-party vendors offering digital services such as scheduling, billing and marketing. One such breach at a digital marketing agency serving dental practices recently exposed approximately 2.7 million patient profiles and more than 8.8 million appointment records.

    Massive healthcare data leak exposes millions: What you need to know
    Cybernews researchers have discovered a misconfigured MongoDB database exposing 2.7 million patient profiles and 8.8 million appointment records. The database was publicly accessible online, unprotected by passwords or authentication protocols. Anyone with basic knowledge of database scanning tools could have accessed it.
    The exposed data included names, birthdates, addresses, emails, phone numbers, gender, chart IDs, language preferences and billing classifications. Appointment records also contained metadata such as timestamps and institutional identifiers.
    Clues within the data structure point toward Gargle, a Utah-based company that builds websites and offers marketing tools for dental practices. While not a confirmed source, several internal references and system details suggest a strong connection. Gargle provides appointment scheduling, form submission and patient communication services. These functions require access to patient information, making the firm a likely link in the exposure.
    After the issue was reported, the database was secured. The duration of the exposure remains unknown, and there is no public evidence indicating whether the data was downloaded by malicious actors before being locked down. We reached out to Gargle for a comment but did not hear back before our deadline.

    How healthcare data breaches lead to identity theft and insurance fraud
    The exposed data presents a broad risk profile. On its own, a phone number or billing record might seem limited in scope. Combined, however, the dataset forms a complete profile that could be exploited for identity theft, insurance fraud and targeted phishing campaigns.
    Medical identity theft allows attackers to impersonate patients and access services under a false identity. Victims often remain unaware until significant damage is done, ranging from incorrect medical records to unpaid bills in their names. The leak also opens the door to insurance fraud, with actors using institutional references and chart data to submit false claims.
    This type of breach raises questions about compliance with the Health Insurance Portability and Accountability Act (HIPAA), which mandates strong security protections for entities handling patient data. Although Gargle is not a healthcare provider, its access to patient-facing infrastructure could place it under the scope of that regulation as a business associate.
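    The root cause here was a database that answered unauthenticated requests. One defensive habit is to audit your own instances for exactly that failure mode: an anonymous client should be refused the moment it issues a privileged command. Below is a minimal check with the pymongo library; the hostname is a placeholder, and this should only ever be pointed at servers you operate.

```python
# Defensive audit: does a MongoDB instance refuse unauthenticated access?
# The hostname is a placeholder; point this only at servers you operate.
from pymongo import MongoClient
from pymongo.errors import OperationFailure, ServerSelectionTimeoutError

def is_wide_open(host: str, port: int = 27017) -> bool:
    client = MongoClient(host, port, serverSelectionTimeoutMS=3000)
    try:
        client.list_database_names()  # privileged; should fail without auth
        return True                   # it answered: anyone can read this server
    except OperationFailure:
        return False                  # rejected: authorization is enforced
    except ServerSelectionTimeoutError:
        return False                  # unreachable from here: not exposed

print(is_wide_open("db.example.internal"))
```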
    5 ways you can stay safe from healthcare data breaches
    If your information was part of the healthcare breach or any similar one, it’s worth taking a few steps to protect yourself.
    1. Consider identity theft protection services: Since the healthcare data breach exposed personal and financial information, it’s crucial to stay proactive against identity theft. Identity theft protection services offer continuous monitoring of your credit reports, Social Security number and even the dark web to detect if your information is being misused. These services send you real-time alerts about suspicious activity, such as new credit inquiries or attempts to open accounts in your name, helping you act quickly before serious damage occurs. Beyond monitoring, many identity theft protection companies provide dedicated recovery specialists who assist you in resolving fraud issues, disputing unauthorized charges and restoring your identity if it’s compromised.
    2. Use personal data removal services: The healthcare data breach leaks loads of information about you, and all of it could end up in the public domain, which essentially gives anyone an opportunity to scam you. One proactive step is to consider personal data removal services, which specialize in continuously monitoring and removing your information from various online databases and websites. While no service promises to remove all your data from the internet, a removal service is valuable if you want to monitor and automate the process of removing your information from hundreds of sites continuously over a longer period of time.
    3. Have strong antivirus software: Hackers have people’s email addresses and full names, which makes it easy for them to send you a phishing link that installs malware and steals all your data. These messages are socially engineered, and spotting them is nearly impossible if you’re not careful. However, you’re not without defenses. The best way to safeguard yourself from malicious links that install malware, potentially accessing your private information, is to have strong antivirus software installed on all your devices. This protection can also alert you to phishing emails and ransomware scams, keeping your personal information and digital assets safe.
    4. Enable two-factor authentication: While passwords weren’t part of the data breach, you still need to enable two-factor authentication (2FA). It gives you an extra layer of security on all your important accounts, including email, banking and social media. 2FA requires you to provide a second piece of information, such as a code sent to your phone, in addition to your password when logging in. This makes it significantly harder for hackers to access your accounts, even if they have your password. Enabling 2FA can greatly reduce the risk of unauthorized access and protect your sensitive data. (A short sketch of how these one-time codes are generated appears at the end of this article.)
    5. Be wary of mailbox communications: Bad actors may also try to scam you through snail mail. The data leak gives them access to your address.
    They may impersonate people or brands you know and use themes that require urgent attention, such as missed deliveries, account suspensions and security alerts.

    Kurt’s key takeaway
    If nothing else, this latest leak shows just how poorly patient data is being handled today. More and more, non-medical vendors are getting access to sensitive information without facing the same rules or oversight as hospitals and clinics. These third-party services are now a regular part of how patients book appointments, pay bills or fill out forms. But when something goes wrong, the fallout is just as serious. Even though the database was taken offline, the bigger problem hasn't gone away. Your data is only as safe as the least careful company that gets access to it.
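    As promised in tip 4, here is a miniature of how authenticator-app codes (TOTP) work, using the pyotp library: the server and your phone share a secret, and codes are derived from that secret plus the current time, which is why a stolen password alone fails the check. The secret below is generated on the spot for illustration, not a real credential.

```python
# Miniature of an authenticator-app (TOTP) second factor.
# The secret is generated on the spot for illustration, not a real credential.
import pyotp

secret = pyotp.random_base32()  # provisioned once, usually via a QR code
totp = pyotp.TOTP(secret)

code = totp.now()  # six digits, rotates every 30 seconds
print("current code:", code)
print("correct code accepted:", totp.verify(code))      # True
print("guessed code accepted:", totp.verify("000000"))  # almost surely False
```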
  • EPFL Researchers Unveil FG2 at CVPR: A New AI Model That Slashes Localization Errors by 28% for Autonomous Vehicles in GPS-Denied Environments

    Navigating the dense urban canyons of cities like San Francisco or New York can be a nightmare for GPS systems. The towering skyscrapers block and reflect satellite signals, leading to location errors of tens of meters. For you and me, that might mean a missed turn. But for an autonomous vehicle or a delivery robot, that level of imprecision is the difference between a successful mission and a costly failure. These machines require pinpoint accuracy to operate safely and efficiently. Addressing this critical challenge, researchers from the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland have introduced a groundbreaking new method for visual localization during CVPR 2025.
    Their new paper, “FG2: Fine-Grained Cross-View Localization by Fine-Grained Feature Matching,” presents a novel AI model that significantly enhances the ability of a ground-level system, like an autonomous car, to determine its exact position and orientation using only a camera and a corresponding aerial (or satellite) image. The new approach has demonstrated a remarkable 28% reduction in mean localization error compared to the previous state-of-the-art on a challenging public dataset.
    Key Takeaways:

    Superior Accuracy: The FG2 model reduces the average localization error by a significant 28% on the VIGOR cross-area test set, a challenging benchmark for this task.
    Human-like Intuition: Instead of relying on abstract descriptors, the model mimics human reasoning by matching fine-grained, semantically consistent features—like curbs, crosswalks, and buildings—between a ground-level photo and an aerial map.
    Enhanced Interpretability: The method allows researchers to “see” what the AI is “thinking” by visualizing exactly which features in the ground and aerial images are being matched, a major step forward from previous “black box” models.
    Weakly Supervised Learning: Remarkably, the model learns these complex and consistent feature matches without any direct labels for correspondences. It achieves this using only the final camera pose as a supervisory signal.

    Challenge: Seeing the World from Two Different Angles
    The core problem of cross-view localization is the dramatic difference in perspective between a street-level camera and an overhead satellite view. A building facade seen from the ground looks completely different from its rooftop signature in an aerial image. Existing methods have struggled with this. Some create a general “descriptor” for the entire scene, but this is an abstract approach that doesn’t mirror how humans naturally localize themselves by spotting specific landmarks. Other methods transform the ground image into a Bird’s-Eye-View (BEV), but are often limited to the ground plane, ignoring crucial vertical structures like buildings.

    FG2: Matching Fine-Grained Features
    The EPFL team’s FG2 method introduces a more intuitive and effective process. It aligns two sets of points: one generated from the ground-level image and another sampled from the aerial map.

    Here’s a breakdown of their innovative pipeline:

    Mapping to 3D: The process begins by taking the features from the ground-level image and lifting them into a 3D point cloud centered around the camera. This creates a 3D representation of the immediate environment.
    Smart Pooling to BEV: This is where the magic happens. Instead of simply flattening the 3D data, the model learns to intelligently select the most important features along the vertical (height) dimension for each point. It essentially asks, “For this spot on the map, is the ground-level road marking more important, or is the edge of that building’s roof the better landmark?” This selection process is crucial, as it allows the model to correctly associate features like building facades with their corresponding rooftops in the aerial view.
    Feature Matching and Pose Estimation: Once both the ground and aerial views are represented as 2D point planes with rich feature descriptors, the model computes the similarity between them. It then samples a sparse set of the most confident matches and uses a classic geometric algorithm called Procrustes alignment to calculate the precise 3-DoF (x, y, and yaw) pose. A simplified sketch of this matching-and-alignment step follows below.
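    To make that last step concrete, here is a minimal, illustrative sketch, not the authors’ released code: toy descriptors are matched by cosine similarity, a sparse set of confident matches is kept, and a 2D Procrustes/Kabsch alignment recovers the 3-DoF pose. All shapes, noise levels and the top-20 cutoff are assumptions for the toy example.

```python
# Toy version of confidence-based matching followed by 2D Procrustes alignment.
import numpy as np

def procrustes_2d(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid alignment of matched N x 2 point sets."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    S, D = src - src_mean, dst - dst_mean
    U, _, Vt = np.linalg.svd(S.T @ D)             # 2x2 cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T            # proper rotation
    t = dst_mean - R @ src_mean                   # translation (x, y)
    return R, t, np.arctan2(R[1, 0], R[0, 0])     # yaw read off the rotation

rng = np.random.default_rng(0)

# Stand-ins for the two 2D point planes and their feature descriptors.
ground_pts = rng.normal(size=(50, 2))                     # BEV points
true_yaw, true_t = np.deg2rad(30.0), np.array([3.0, -1.0])
R_true = np.array([[np.cos(true_yaw), -np.sin(true_yaw)],
                   [np.sin(true_yaw),  np.cos(true_yaw)]])
aerial_pts = ground_pts @ R_true.T + true_t               # corresponding aerial points
desc = rng.normal(size=(50, 16))
ground_desc = desc + 0.05 * rng.normal(size=desc.shape)   # noisy ground descriptors
aerial_desc = desc

# Cosine similarity between every ground/aerial descriptor pair.
gn = ground_desc / np.linalg.norm(ground_desc, axis=1, keepdims=True)
an = aerial_desc / np.linalg.norm(aerial_desc, axis=1, keepdims=True)
sim = gn @ an.T

# Sample a sparse set of the most confident matches, then align.
best = sim.argmax(axis=1)                                 # best aerial match per point
conf = sim.max(axis=1)
keep = np.argsort(conf)[-20:]                             # top-20 most confident
R, t, yaw = procrustes_2d(ground_pts[keep], aerial_pts[best[keep]])
print(np.rad2deg(yaw), t)                                 # ~30.0 degrees, ~[3, -1]
```

    The reflection guard (the sign correction on the determinant) keeps the recovered matrix a proper rotation, which matters because the yaw estimate is read directly off that matrix.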

    Unprecedented Performance and Interpretability
    The results speak for themselves. On the challenging VIGOR dataset, which includes images from different cities in its cross-area test, FG2 reduced the mean localization error by 28% compared to the previous best method. It also demonstrated superior generalization capabilities on the KITTI dataset, a staple in autonomous driving research.

    Perhaps more importantly, the FG2 model offers a new level of transparency. By visualizing the matched points, the researchers showed that the model learns semantically consistent correspondences without being explicitly told to. For example, the system correctly matches zebra crossings, road markings, and even building facades in the ground view to their corresponding locations on the aerial map. This interpretability is extremely valuable for building trust in safety-critical autonomous systems.
    “A Clearer Path” for Autonomous Navigation
    The FG2 method represents a significant leap forward in fine-grained visual localization. By developing a model that intelligently selects and matches features in a way that mirrors human intuition, the EPFL researchers have not only shattered previous accuracy records but also made the decision-making process of the AI more interpretable. This work paves the way for more robust and reliable navigation systems for autonomous vehicles, drones, and robots, bringing us one step closer to a future where machines can confidently navigate our world, even when GPS fails them.

    Check out the Paper. All credit for this research goes to the researchers of this project. Also, feel free to follow us on Twitter and don’t forget to join our 100k+ ML SubReddit and Subscribe to our Newsletter.
    Jean-marc Mommessin is a successful AI business executive. He leads and accelerates growth for AI-powered solutions and started a computer vision company in 2006. He is a recognized speaker at AI conferences and has an MBA from Stanford.
  • Tariffed construction materials increased in price last month, ABC analysis finds

    Construction input prices rose 0.2% in May, according to a new Associated Builders and Contractors (ABC) analysis of U.S. Bureau of Labor Statistics’ Producer Price Index data. Last month, nonresidential construction input prices decreased by 0.1%.
    Overall construction input prices are 1.3% higher than levels from a year ago, and nonresidential construction prices are 1.6% higher. Prices decreased in two of three major energy categories in April. Natural gas prices fell 18.7%, unprocessed energy materials were down 3.5%, and crude petroleum prices increased by 1.3%.
    Chart credit: Associated Builders and Contractors
    “Construction materials prices continued to increase at a faster-than-ideal pace in May,” said ABC Chief Economist Anirban Basu. “While input prices are up just 1.3% over the past year, that modest escalation is entirely due to price decreases during the second half of 2024. Costs have increased rapidly since the start of this year, with input prices rising at a 6% annualize...
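    For readers who want to see how a monthly figure turns into an annualized pace like the one Basu cites, here is a small illustrative calculation; the 0.5% monthly change and the compounding convention are assumptions for the example, not ABC’s published methodology.

```python
# Annualizing a monthly price change by compounding it over 12 months.
# The 0.5% input is an assumed example, not a figure from the ABC release.
monthly_change = 0.005                       # +0.5% in one month
annualized = (1 + monthly_change) ** 12 - 1  # compound over a year
print(f"{annualized:.1%}")                   # -> 6.2%, roughly a 6% annualized pace
```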
  • Block’s CFO explains Gen Z’s surprising approach to money management

    One stock recently impacted by a whirlwind of volatility is Block—the fintech powerhouse behind Square, Cash App, Tidal Music, and more. The company’s COO and CFO, Amrita Ahuja, shares how her team is using new AI tools to find opportunity amid disruption and reach customers left behind by traditional financial systems. Ahuja also shares lessons from the video game industry and discusses Gen Z’s surprising approach to money management.  

    This is an abridged transcript of an interview from Rapid Response, hosted by Robert Safian, former editor-in-chief of Fast Company. From the team behind the Masters of Scale podcast, Rapid Response features candid conversations with today’s top business leaders navigating real-time challenges. Subscribe to Rapid Response wherever you get your podcasts to ensure you never miss an episode.

    As a leader, when you’re looking at all of this volatility—the tariffs, consumer sentiment’s been unclear, the stock market’s been all over the place. You guys had a huge one-day drop in early May, and it quickly bounced back. How do you make sense of all these external factors?

    Yeah, our focus is on what we can control. And ultimately, the thing that we are laser-focused on for our business is product velocity. How quickly can we start small with something, launch something for our customers, and then test and iterate and learn so that ultimately, that something that we’ve launched scales into an important product?

    I’ll give you an example. Cash App Borrow, which is a product where our customers can get access to a line of credit, often $100, $200, that bridges them from paycheck to paycheck. We know so many Americans are living paycheck to paycheck. That’s a product that we launched about three years ago and have now scaled to serve 9 million actives with $15 billion in credit supply to our customers in a span of a couple short years.

    The more we can be out testing and launching product at a pace, the more we know we are ultimately delivering value to our customers, and the right things will happen from a stock perspective.

    Block is a financial services provider. You have Square, the point-of-sale system; the digital wallet Cash App, which you mentioned, which competes with Venmo and Robinhood; and a bunch of others. Then you’ve got the buy-now, pay-later leader Afterpay. You chair Square Financial Services, which is Block’s chartered bank. But you’ve said that in the fintech world, Block is only a little bit fin—that comparatively, it’s more tech. Can you explain what you mean by that?

    What we think is unique about us is our ability as a technology company to completely change innovation in the space, such that we can help solve systemic issues across credit, payments, commerce, and banking. What that means ultimately is we use technologies like AI and machine learning and data science, and we use these technologies in a unique way, in a way that’s different from a traditional bank. We are able to underwrite those who are often frankly forgotten by the traditional financial ecosystems.

    Our Square Loans product has almost triple the rate of women-owned businesses that we underwrite. Fifty-eight percent of our loans go to women-owned businesses versus 20% for the industry average. For that Cash App Borrow product I was talking about, 70% of those actives, the 9 million actives that we underwrote, fell below 580 as a FICO score. That’s considered a poor FICO score, and yet 97% of repayments are made on time. And this is because we have unique access to data and these technology and tools which can help us uniquely underwrite this often forgotten customer base.

    Yeah. I mean, credit—sometimes it’s been blamed for financial excesses. But access to credit is also, as you say, an advantage that’s not available to everyone. Do you have a philosophy between those poles—between risk and opportunity? Or is what you’re saying that the tech you have allows you to avoid that risk?

    That’s right. Let’s start with how do the current systems work? It works using inferior data, frankly. It’s more limited data. It’s outdated. Sometimes it’s inaccurate. And it ignores things like someone’s cash flows, the stability of your income, your savings rate, how money moves through your accounts, or how you use alternative forms of credit—like buy now, pay later, which we have in our ecosystem through Afterpay.

    We have a lot of these signals for our 57 million monthly actives on the Cash App side and for the 4 million small businesses on the Square side, and those, frankly, billions of transaction data points that we have on any given day paired with new technologies. And we intend to continue to be on the forefront of AI, machine learning, and data science to be able to empower more people into the economy. The combination of the superior data and the technologies is what we believe ultimately helps expand access.

    You have a financial background, but not in the financial services industry. Before Block, you were a video game developer at Activision. Are financial businesses and video games similar? Are there things that are similar about them?

    There are. There actually are some things that are similar, I will say. There are many things that are unique to each industry. Each industry is incredibly complex. You find that when big technology companies try to do gaming. They’ve taken over the world in many different ways, but they can’t always crack the nut on putting out a great game. Similarly, some of the largest technology companies have dabbled in fintech but haven’t been able to go as deep, so they’re both very nuanced and complex industries.

    I would say another similarity is that design really matters. Industrial design, the design of products, the interface of products, is absolutely mission-critical to a great game, and it’s absolutely mission-critical to the simplicity and accessibility of our products, be it on Square or Cash App.

    And then maybe the third thing that I would say is that when I was in gaming, at least the business models were rapidly changing from an intermediary distribution mechanism, like releasing a game once and then selling it through a retailer, to an always-on, direct-to-consumer connection. And similarly with banking, people don’t want to bank from 9 to 5, six days a week. They want 24/7 access to their money and the ability to, again, grow their financial livelihood, move their money around seamlessly. So, some similarities are there in that shift from an intermediary model, or a slower model, to an always-on, direct-to-consumer connection.

    Part of your target audience or your target customer base at Block are Gen Z folks. Did you learn things at Activision about Gen Z that has been useful? Are there things that businesses misunderstand about younger generations still?

    What we’ve learned is that Gen Z, millennial customers, aren’t going to do things the way their parents did. Some of our stats show that 63% of Gen Z customers have moved away from traditional credit cards, and over 80% are skeptical of them. Which means they’re not using a credit card to manage expenses; they’re using a debit card, but then layering on, on a transaction-by-transaction basis, tools like buy now, pay later or Cash App Borrow, the means by which they’re managing their consistent cash flows. So that’s an example of how things are changing, and you’ve got to get up to speed with how the next generation of customers expects to manage their money.
    #blocks #cfo #explains #gen #surprising
    Block’s CFO explains Gen Z’s surprising approach to money management
• Block’s CFO explains Gen Z’s surprising approach to money management
    One stock recently impacted by a whirlwind of volatility is Block—the fintech powerhouse behind Square, Cash App, Tidal Music, and more. The company’s COO and CFO, Amrita Ahuja, shares how her team is using new AI tools to find opportunity amid disruption and reach customers left behind by traditional financial systems. Ahuja also shares lessons from the video game industry and discusses Gen Z’s surprising approach to money management.
    This is an abridged transcript of an interview from Rapid Response, hosted by Robert Safian, former editor-in-chief of Fast Company. From the team behind the Masters of Scale podcast, Rapid Response features candid conversations with today’s top business leaders navigating real-time challenges. Subscribe to Rapid Response wherever you get your podcasts to ensure you never miss an episode.
    As a leader, when you’re looking at all of this volatility—the tariffs, unclear consumer sentiment, a stock market that’s been all over the place (you had a huge one-day drop in early May, and it quickly bounced back)—how do you make sense of all these external factors?
    Yeah, our focus is on what we can control. Ultimately, the thing we are laser-focused on for our business is product velocity: how quickly we can start small, launch something for our customers, and then test, iterate, and learn so that what we’ve launched scales into an important product. I’ll give you an example. Cash App Borrow is a product where our customers can get access to a line of credit, often $100 or $200, that bridges them from paycheck to paycheck. We know so many Americans are living paycheck to paycheck. That’s a product we launched about three years ago and have now scaled to serve 9 million actives, with $15 billion in credit supplied to our customers, in a span of a couple short years. The more we are out testing and launching product at a pace, the more we know we are delivering value to our customers, and the right things will happen from a stock perspective.
    Block is a financial services provider. You have Square, the point-of-sale system; the digital wallet Cash App, which you mentioned, which competes with Venmo and Robinhood; and a bunch of others. Then you’ve got the buy-now, pay-later leader Afterpay, and you chair Square Financial Services, which is Block’s chartered bank. But you’ve said that in the fintech world, Block is only a little bit fin—that comparatively, it’s more tech. Can you explain what you mean by that?
    What we think is unique about us is our ability as a technology company to completely change innovation in the space, so that we can help solve systemic issues across credit, payments, commerce, and banking. What that means, ultimately, is that we use technologies like AI, machine learning, and data science in a way that’s different from a traditional bank. We are able to underwrite those who are often, frankly, forgotten by traditional financial ecosystems. Our Square Loans product underwrites women-owned businesses at almost triple the industry rate: 58% of our loans go to women-owned businesses, versus an industry average of 20%. For the Cash App Borrow product I was talking about, 70% of the 9 million actives we underwrote fell below a 580 FICO score. That’s considered a poor FICO score, and yet 97% of repayments are made on time. And this is because we have unique access to data, and the technology and tools that help us underwrite this often forgotten customer base.
    Credit has sometimes been blamed for financial excesses. But access to credit is also, as you say, an advantage that’s not available to everyone. Do you have a philosophy between those poles—between risk and opportunity? Or are you saying that the tech you have allows you to avoid that risk?
    That’s right. Let’s start with how the current systems work. They run on inferior data, frankly: more limited data that’s outdated and sometimes inaccurate. And they ignore things like someone’s cash flows, the stability of their income, their savings rate, how money moves through their accounts, or how they use alternative forms of credit—like buy now, pay later, which we have in our ecosystem through Afterpay. We have a lot of these signals for our 57 million monthly actives on the Cash App side and the 4 million small businesses on the Square side: frankly, billions of transaction data points on any given day, paired with new technologies. We intend to continue to be at the forefront of AI, machine learning, and data science to empower more people into the economy. The combination of that superior data and those technologies is what we believe ultimately expands access.
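    Ahuja doesn’t describe Block’s models in detail, but the underwriting approach she outlines, scoring cash-flow signals such as income stability, savings rate, and repayment behavior rather than relying on a FICO score alone, can be made concrete with a toy sketch. The example below is purely illustrative: the AccountHistory fields, the feature weights, and the approval cutoff are hypothetical stand-ins, not anything Block has published.

```python
# Illustrative only: a toy cash-flow-based underwriting sketch.
# Every feature and threshold here is a hypothetical stand-in for
# the kinds of signals Ahuja describes (cash flows, income
# stability, savings rate, alternative-credit repayment history).
from dataclasses import dataclass
from statistics import mean, pstdev


@dataclass
class AccountHistory:
    """Hypothetical per-customer signals of the kind Ahuja describes."""
    monthly_inflows: list[float]   # e.g. paycheck deposits, month by month
    monthly_outflows: list[float]  # spending, month by month
    on_time_bnpl_payments: int     # buy-now-pay-later installments paid on time
    late_bnpl_payments: int        # installments paid late


def cash_flow_features(h: AccountHistory) -> dict[str, float]:
    """Turn raw transaction history into underwriting signals, each in [0, 1]."""
    inflow_avg = mean(h.monthly_inflows) if h.monthly_inflows else 0.0
    spend_avg = mean(h.monthly_outflows) if h.monthly_outflows else 0.0
    # Income stability: low month-to-month variance in inflows scores high.
    stability = 1.0 - min(1.0, pstdev(h.monthly_inflows) / inflow_avg) if inflow_avg else 0.0
    # Savings rate: share of income left over after spending.
    savings_rate = max(0.0, 1.0 - spend_avg / inflow_avg) if inflow_avg else 0.0
    # Repayment behavior on alternative credit; neutral 0.5 with no history.
    total_bnpl = h.on_time_bnpl_payments + h.late_bnpl_payments
    repayment_rate = h.on_time_bnpl_payments / total_bnpl if total_bnpl else 0.5
    return {"stability": stability,
            "savings_rate": savings_rate,
            "repayment_rate": repayment_rate}


def approve_small_advance(h: AccountHistory) -> bool:
    """Toy decision rule standing in for a trained ML model."""
    f = cash_flow_features(h)
    score = 0.5 * f["repayment_rate"] + 0.3 * f["stability"] + 0.2 * f["savings_rate"]
    return score >= 0.6  # hypothetical cutoff, not a real product threshold


# A thin-file borrower: steady income, modest savings, good BNPL history.
borrower = AccountHistory([2400.0, 2400.0, 2500.0], [2200.0, 2100.0, 2300.0], 11, 1)
print(approve_small_advance(borrower))  # True for this toy profile
```

    In a real system the hand-tuned weights would be replaced by a model trained on repayment outcomes; the point is only that transaction history can stand in for a thin or absent credit file.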
    You have a financial background, but not in the financial services industry. Before Block, you were at video game maker Activision. Are financial businesses and video games similar?
    There actually are some things that are similar, though many things are unique to each industry. Each industry is incredibly complex. You see that when big technology companies try to do gaming: they’ve taken over the world in many different ways, but they can’t always crack the nut of putting out a great game. Similarly, some of the largest technology companies have dabbled in fintech but haven’t been able to go as deep. They’re both very nuanced, complex industries.
    Another similarity is that design really matters. Industrial design, the design of products, the interface of products, is absolutely mission-critical to a great game, and it’s absolutely mission-critical to the simplicity and accessibility of our products, be it Square or Cash App.
    And the third thing I would say is that when I was in gaming, business models were rapidly changing from an intermediated distribution mechanism, like releasing a game once and selling it through a retailer, to an always-on, direct-to-consumer connection. Similarly, with banking, people don’t want to bank 9 to 5, six days a week. They want 24/7 access to their money and the ability to grow their financial livelihood and move their money around seamlessly. So there is a similarity in that shift from an intermediated, slower model to an always-on, direct-to-consumer connection.
    Part of your target customer base at Block is Gen Z. Did you learn things at Activision about Gen Z that have been useful? Are there things businesses still misunderstand about younger generations?
    What we’ve learned is that Gen Z and millennial customers aren’t going to do things the way their parents did. Some of our stats show that 63% of Gen Z customers have moved away from traditional credit cards, and over 80% are skeptical of them. That means they’re not using a credit card to manage expenses; they’re using a debit card and then layering on credit on a transaction-by-transaction basis, using tools like buy now, pay later, or Cash App Borrow, as the means by which they manage their consistent cash flows. That’s an example of how things are changing, and you’ve got to get up to speed with how the next generation of customers expects to manage their money.