Building an Architectural Visualization Community: The Case for Physical Gatherings

    Barbara Betlejewska is a PR consultant and manager with extensive experience in architecture and real estate, currently involved with World Visualization Festival, a global event bringing together CGI and digital storytelling professionals for 3 days of presentations, workshops, and networking in Warsaw, Poland, this October.
    Over the last twenty years, visualization and 3D rendering have evolved from supporting tools to become central pillars of architectural storytelling, design development, and marketing across various industries. As digital technologies have advanced, the landscape of creative work has changed dramatically. Artists can now collaborate with clients worldwide without leaving their homes, and their careers can flourish without ever setting foot in a traditional studio.
    In this hyper-connected world, where access to knowledge, clients, and inspiration is just a click away, do we still need to gather in person? Do conferences, festivals and meetups in the CGI and architectural visualization world still carry weight?

    The People Behind the Pixels
    Professionals from the visualization industry exchanging ideas at WVF 2024.
    For a growing number of professionals — especially those in creative and tech-driven fields — remote work has become the norm. The shift to digital workflows, accelerated by the pandemic, has brought freedom and flexibility that many are reluctant to give up. It’s easier than ever to work for clients in distant cities or countries, to build a freelance career from a laptop, or to pursue the lifestyle of a digital nomad.
    On the surface, it is a broadening of horizons. But for many, the freedom of remote work comes with a cost: isolation. For visualization artists, the reality often means spending long hours alone, rarely interacting face-to-face with peers or collaborators. And while there are undeniable advantages to independent work, the lack of human connection can lead to creative stagnation, professional burnout, and a sense of detachment from the industry as a whole.
    Despite being a highly technical and often solitary craft, visualization and CGI thrive on the exchange of ideas, feedback and inspiration. The tools and techniques evolve rapidly, and staying relevant usually means learning not just from tutorials but from honest conversations with others who understand the nuances of the field.

    A Community in the Making
    Professionals from the visualization industry exchanging ideas at WVF 2024.
    That need for connection is what pushed Michał Nowak, a Polish visualizer and founder of Nowak Studio, to organize Poland’s first-ever architectural visualization meetup in 2017. With no background in event planning, he wasn’t sure where to begin, but he knew something was missing. The Polish Arch Viz scene lacked a shared space for meetings, discussions, and idea exchange. Michał wanted more than screen time; he wanted honest conversations, spontaneous collaboration and a chance to grow alongside others in the field.
    What began as a modest gathering quickly grew into something much bigger. That original meetup evolved into what is now the World Visualization Festival, an international event that welcomes artists from across Europe and beyond.
    “I didn’t expect our small gathering to grow into a global festival,” Michał says. “But I knew I wanted a connection. I believed that through sharing ideas and experiences, we could all grow professionally, creatively, and personally. And that we’d enjoy the journey more.”
    The response was overwhelming. Each year, more artists from across Poland and Europe join the event in Wrocław, located in south-western Poland. Michał also traveled to other festivals in countries like Portugal and Austria, where he observed the same thing: a spirit of openness, generosity, and shared curiosity. No matter the country or the maturity of the market, the needs were the same — people wanted to connect, learn and grow.
    And beyond the professional side, there was something else: joy. These events were simply fun. They were energizing. They gave people a reason to step away from their desks and remember why they love what they do.

    The Professional Benefits
    Hands-on learning at the AI-driven visualization workshop in Warsaw, October 2024.
    The professional benefits of attending industry events are well documented. These gatherings provide access to mentorship, collaboration and knowledge that can be challenging to find online. Festivals and industry meetups serve as platforms for emerging trends, new tools and fresh workflows — often before they hit the mainstream. They’re places where ideas collide, assumptions are challenged and growth happens.
    The range of topics covered at such events is broad, encompassing everything from portfolio reviews and deep dives into particular rendering engines to conversations about pricing your work and building a sustainable business. At the 2024 edition of the World Visualization Festival, panels focused on scaling creative businesses and navigating industry rates drew some of the biggest crowds, proving that artists are hungry for both artistic and entrepreneurial insights.
    Being part of a creative community also shapes professional identity. It’s not just about finding clients — it’s about finding your place. In a field as fast-moving and competitive as Arch Viz, connection and conversation aren’t luxuries. They’re tools for survival.
    There’s also the matter of building your social capital. Online interactions can only go so far. Meeting someone in person builds relationships that stick. The coffee-break conversations, the spontaneous feedback — these are the moments that cement a community and have the power to spark future projects or long-lasting partnerships. This usually doesn’t happen in Zoom calls.
    And let’s not forget the symbolic power of industry awards, such as Architizer’s Vision Awards or CGArchitect’s 3D Awards. These aren’t just celebrations of talent; they’re affirmations of the craft itself. They contribute to the growth and cohesion of the industry while helping to establish and promote best practices. These events clearly define the role and significance of CGI and visualization as a distinct profession, positioned at the intersection of architecture, marketing, and sales. They advocate for the field to be recognized on its own terms, not merely as a support service, but as an independent discipline. For its creators, they bring visibility, credit, and recognition — elements that inspire growth and fuel motivation to keep pushing the craft forward. Occasions like these remind us that what we do has real value, impact and meaning.

    The Energy We Take Home
    The WVF 2024 afterparty provided a vibrant space for networking and celebration in Warsaw.
    Many artists describe the post-event glow: a renewed sense of purpose, a fresh jolt of energy, an eagerness to get back to work. Sometimes, new projects emerge, new clients appear, or long-dormant ideas finally gain momentum. These events aren’t just about learning — they’re about recharging.
    One of the most potent moments of last year’s WVF was a series of talks focused on mental health and creative well-being. Co-organized by Michał Nowak and the Polish Arch Viz studio ELEMENT, the festival addressed the emotional realities of the profession, including burnout, self-doubt, and the pressure to constantly produce. These conversations resonated deeply because they were real.
    Seeing that others face the same struggles — and come through them — is profoundly reassuring. Listening to someone share a business strategy that worked, or a failure they learned from, turns competition into camaraderie. Vulnerability becomes strength. Shared experiences become the foundation of resilience.

    Make a Statement. Show up!
    Top industry leaders shared insights during presentations at WVF 2024.
    In an era when nearly everything can be done online, showing up in person is a powerful statement. It says: I want more than just efficiency. I want connection, creativity and conversation.
    As the CGI and visualization industries continue to evolve, the need for human connection hasn’t disappeared — it’s grown stronger. Conferences, festivals and meetups, such as World Viz Fest, remain vital spaces for knowledge sharing, innovation and community building. They give us a chance to reset, reconnect and remember that we are part of something bigger than our screens.
    So, yes, despite the tools, the bandwidth, and the ever-faster workflows, we still need to meet in person. Not out of nostalgia, but out of necessity. Because, no matter how far technology takes us, creativity remains a human endeavor.
    Architizer’s Vision Awards are back! The global awards program honors the world’s best architectural concepts, ideas and imagery. Start your entry ahead of the Final Entry Deadline on July 11th. 
    The post Building an Architectural Visualization Community: The Case for Physical Gatherings appeared first on Journal.
Alec Haase Q&A: Customer Engagement Book Interview

    What is marketing without data? Assumptions. Guesses. Fluff.
    For Chapter 6 of our book, “The Customer Engagement Book: Adapt or Die,” we spoke with Alec Haase, Product GTM Lead, Commerce and AI at Hightouch, to explore how engagement data can truly inform critical business decisions. 
    Alec discusses the different types of customer behaviors that matter most, how to separate meaningful information from the rest, and the role of systems that learn over time to create tailored customer experiences.
    This interview provides insights into using data for real-time actions and shaping the future of marketing. Prepare to learn about AI decision-making and how a focus on data is changing how we engage with customers.

    Alec Haase Q&A Interview
    1. What types of customer engagement data are most valuable for making strategic business decisions?
    It’s a culmination of everything.
    Behavioral signals — the actual conversions and micro-conversions that users take within your product or website.
    Obviously, that’s things like purchases. But there are also other behavioral signals marketers should be using and thinking about. Things like micro-conversions — maybe that’s shopping for a product, clicking to learn more about a product, or visiting a certain page on your website.
    Behind that, you also need to have all your user data to tie that to.

    So I know someone took said action; I can follow up with them in email or out on paid social. I need the user identifiers to do that.

    2. How do you distinguish between data that is actionable versus data that is just noise?
    Data that’s actionable includes the conversions and micro-conversions — very clear instances of “someone did this.” I can react to or measure those.
    What’s becoming a bit of a challenge for marketers is understanding that there’s other data that is valuable for machine learning or reinforcement learning models, things like tags on the types of products customers are interacting with.
    Maybe there’s category information about that product, or color information. That would otherwise look like noise to the average marketer. But behind the scenes, it can be used for reinforcement learning.

    There is definitely the “clear-cut” actionable data, but marketers shouldn’t be quick to classify things as noise because the rise in machine learning and reinforcement learning will make that data more valuable.

    3. How can customer engagement data be used to identify and prioritize new business opportunities?
    At Hightouch, we don’t necessarily think about retroactive analysis. We have a system where customer engagement data fires in and real-time scores react to it.
    An interesting example is when you have machine learning and reinforcement learning models running. In the pet retailer example I gave you, the system is able to figure out what to prioritize.
    The concept of reinforcement learning is not a marketer making rules to say, “I know this type of thing works well on this type of audience.”

    It’s the machine itself using the data to determine what attribute responds well to which offer, recommendation, or marketing campaign.
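A minimal sketch of that idea is an epsilon-greedy bandit that learns which offer converts best for each audience attribute, rather than relying on marketer-written rules. All names here are illustrative, not Hightouch's API:

```python
import random
from collections import defaultdict

class OfferBandit:
    """Epsilon-greedy bandit: learns which offer converts best per audience attribute."""

    def __init__(self, offers, epsilon=0.1):
        self.offers = offers
        self.epsilon = epsilon
        # (attribute, offer) -> [conversions, impressions]
        self.stats = defaultdict(lambda: [0, 0])

    def choose(self, attribute):
        """Pick an offer: mostly exploit the best-known one, sometimes explore."""
        if random.random() < self.epsilon:
            return random.choice(self.offers)  # explore a random offer
        # exploit: offer with the best observed conversion rate for this attribute
        return max(self.offers, key=lambda o: self._rate(attribute, o))

    def record(self, attribute, offer, converted):
        """Feed a behavioral signal (conversion or not) back into the model."""
        s = self.stats[(attribute, offer)]
        s[1] += 1
        s[0] += int(converted)

    def _rate(self, attribute, offer):
        conv, imp = self.stats[(attribute, offer)]
        return conv / imp if imp else 0.0
```

The key point matches the answer above: the marketer supplies outcomes (conversions), and the system itself works out which attribute responds to which offer.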

    4. How can marketers ensure their use of customer engagement data aligns with the broader business objectives?
    It starts with the objectives. It’s starting with the desired outcome and working your way back. That whole flip of the paradigm is starting with outcomes and letting the system optimize. What are you trying to drive, and then back into the types of experiences that can make that happen?
    There’s personalization.
    When we talk about data-driven experiences and personalization, Spotify Wrapped is the North Star. For Spotify Wrapped, you want to drive customer stickiness and create a brand. To make that happen, you want to send a personalized email. What components do you want in that email?

    Maybe it’s top five songs, top five artists, and then you can back into the actual event data you need to make that happen.
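Backing from that email component into the event data could look like the following sketch: derive a "top five songs" list from raw play events. The event shape is illustrative, not Spotify's actual schema:

```python
from collections import Counter

def top_five_songs(events, user_id):
    """From raw play events, derive the 'top five songs' component of a
    Wrapped-style personalized email. Event fields are illustrative."""
    plays = Counter(
        e["song"]
        for e in events
        if e["user_id"] == user_id and e["type"] == "play"
    )
    # most_common returns (song, count) pairs sorted by play count
    return [song for song, _ in plays.most_common(5)]
```

Working backwards like this tells you exactly which events you need to collect (here, per-user play events with a song identifier) to power the experience you want.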

    5. What role does engagement data play in influencing cross-functional decisions such as those in product development, sales, or customer service?
    For product development, it’s product analytics — knowing what features users are using, or seeing in heat maps where users are clicking.
    Sales is similar. We’re using behavioral signals like what types of content they’re reading on the site to help inform what they would be interested in — the types of products or the types of use cases.

    For customer service, you can look at errors they’ve run into in the past or specific purchases they’ve made, so that when you’re helping them the next time they engage with you, you know exactly what their past behaviors were and what products they could be calling about.

    6. What are some challenges marketers face when trying to translate customer engagement data into actionable insights?
    Access to data is one challenge. Marketers might not know what data they have, because historically they weren’t the ones working in the systems where data is stored. Rich behavioral data and other data across the business sat somewhere else, siloed away from them.
    Now, as more companies embrace the data warehouse at the center of their business, it gives everyone a true single place where data can be stored.

    Marketers are working more with data teams, understanding more about the data they have, and using that data to power downstream use cases, personalization, reinforcement learning, or general business insights.

    7. How do you address skepticism or resistance from stakeholders when presenting data-driven recommendations?
    As a marketer, I think proof is key. The best thing is if you’ve actually run a test. “I think we should do this. I ran a small test, and it’s showing that this is actually proving out.” Being able to clearly explain and justify your reasoning with data is super important.

    8. What technology or tools have you found most effective for gathering and analyzing customer engagement data?
    Any type of behavioral event collection, specifically one that writes to the cloud data warehouse, is the critical component. Your data team is operating off the data warehouse.
    Having an event collection product that stores data in that central spot is really important if you want to use the other data when making recommendations.
    You want to get everything into the data warehouse where it can be used both for insights and for putting into action.

    For Spotify Wrapped, you want to collect behavioral event signals like songs listened to or concerts attended, writing to the warehouse so that you can get insights back — how many songs were played this year, projections for next month — but then you can also use those behavioral events in downstream platforms to fire off personalized emails with product recommendations or Spotify Wrapped-style experiences.
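    As a rough sketch of that pattern, with sqlite3 standing in for the cloud data warehouse and every event name invented for illustration: a single `track` call writes each behavioral signal to one central table, which then serves both insight queries and downstream activation.

```python
import json
import sqlite3
from datetime import datetime, timezone

# An in-memory sqlite3 database stands in for the cloud data warehouse here.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE events (
    user_id TEXT, event_name TEXT, properties TEXT, occurred_at TEXT)""")

def track(user_id, event_name, **properties):
    """Minimal event-collection call: every behavioral signal
    lands in one central table, ready for insights or activation."""
    conn.execute(
        "INSERT INTO events VALUES (?, ?, ?, ?)",
        (user_id, event_name, json.dumps(properties),
         datetime.now(timezone.utc).isoformat()),
    )

track("user_1", "song_played", song="Song A")
track("user_1", "song_played", song="Song B")
track("user_1", "concert_viewed", artist="Artist X")

# Insight side: the same table answers "how many songs were played?"
plays = conn.execute(
    "SELECT COUNT(*) FROM events WHERE user_id = ? AND event_name = ?",
    ("user_1", "song_played")).fetchone()[0]
print(plays)  # → 2
```

    The same rows that power this count could feed a downstream email platform, which is the "insights and activation from one store" idea in miniature.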

    9. How do you see the role of customer engagement data evolving in shaping business strategies over the next five years?

    What we’re excited about is the concept of AI Decisioning — having AI agents actually using customer data to train their own models and decision-making to create personalized experiences.
    We’re sitting on top of all this behavioral data, engagement data, and user attributes, and our system is learning from all of that to make the best decisions across downstream systems.
    Whether that’s as simple as driving a loyalty program and figuring out what emails to send or what on-site experiences to show, or exposing insights that might lead you to completely change your business strategy, we see engagement data as the fuel to the engine of reinforcement learning, machine learning, AI agents, this whole next wave of Martech that’s just now coming.
    But it all starts with having the data to train those systems.

    I think that behavioral data is the fuel of modern Martech, and that only holds more true as Martech platforms adopt these decisioning and AI capabilities, because they’re only as good as the data that’s training the models.

    This interview Q&A was hosted with Alec Haase, Product GTM Lead, Commerce and AI at Hightouch, for Chapter 6 of The Customer Engagement Book: Adapt or Die.
    Download the PDF or request a physical copy of the book here.
    The post Alec Haase Q&A: Customer Engagement Book Interview appeared first on MoEngage.
  • Sony is Still Putting Its Faith in ‘Marathon’

    Bungie’s Marathon is still coming out, and when it does, PlayStation plans on giving the extraction shooter a fair shot. During a recent investor interview, Sony Interactive Entertainment head Hermen Hulst assured the game would come out before March 31, 2026, when Sony’s fiscal year ends. Touching on its recent alpha test, he described the feedback as “varied, but super useful. […] The constant testing, the constant re-validation of assumptions that we just talked about, to me is just so valuable to iterate and to constantly improve the title, so when launch comes, we’re going to give the title the optimal chance of success.” Hanging over PlayStation is 2024’s sci-fi shooter Concord, which shut down weeks after launch and later led to developer Firewalk Studios closing down. That’s just one of PlayStation’s several botched attempts to enter live-service games, a run that includes several canceled projects and layoffs across its first-party studios. While acknowledging these “unique challenges” and attributing Concord’s failure to the “hypercompetitive market” of hero shooters, Hulst talked up how they’re avoiding the same mistakes with Marathon. “It’s going to be the first new Bungie title in over a decade, and it’s our goal to release a very bold, very innovative, and deeply engaging title. We’re monitoring the closed alpha cycle the team has just gone through. We’re taking all the lessons learned, we’re using the capabilities we’ve built and analytics and user testing to understand how audiences are engaging with the title.”

    One thing Hulst didn’t touch on, though, was the recent accusations of art plagiarism levied against Bungie. In May, artist Fern “Antireal” Hook released evidence alleging the studio stole assets she made from previous work and failed to credit her. After investigating, Bungie attributed the theft to the work of a former employee, publicly apologized, and said it would do “everything we can to make this right” with Hook. It also promised to review all in-game assets and replace “questionably sourced” art with original, in-house work. With the mention of it arriving before the fiscal year ends, Marathon may be delayed past its current September 23 launch date. At time of writing, Bungie and PlayStation have kept mum on a potential delay, but the game failed to make an appearance at PlayStation’s recent State of Play in early June. [via IGN] Want more io9 news? Check out when to expect the latest Marvel, Star Wars, and Star Trek releases, what’s next for the DC Universe on film and TV, and everything you need to know about the future of Doctor Who.
    GIZMODO.COM
  • PlayStation Studios boss confident Marathon won't repeat the mistakes of Concord

    PlayStation Studios boss Hermen Hulst has insisted that Bungie's upcoming live service shooter Marathon won't make the same mistakes as Concord. Discussing the company's live service ambitions during a fireside chat aimed at investors, Hulst said the market remains a "great opportunity" for PlayStation despite the company having a decidedly patchy track record when it comes to live service offerings. Last year, the company launched and swiftly scrapped live service hero shooter Concord after it failed to hit the ground running. It shuttered developer Firewalk weeks later after conceding the title "did not hit our targets." Sony scrapped two more live service titles in development at internal studios Bluepoint Games and Bend Studio in January this year. Earlier this week, it confirmed an undisclosed number of workers at Bend had been laid off as the studio transitions to its next project. Hulst said the company has learned hard lessons from those failures, and believes Marathon is well positioned to succeed as a result. "There are some unique challenges associated. We've had some early successes, as with Helldivers II. We've also faced some challenges, as with the release of Concord," said Hulst. "I think that some really good work went into that title. Some really big efforts. But ultimately that title entered into a hyper-competitive segment of the market. I think it was insufficiently differentiated to be able to resonate with players. So we have reviewed our processes in light of this to deeply understand how and why that title failed to meet expectations—and to ensure that we are not going to make the same mistakes again." Hulst said PlayStation Studios has now implemented more rigorous processes for validating and revalidating its creative, commercial, and development assumptions and hypotheses. "We do that on a much more ongoing basis," he added. 
"That's the plan that will ensure we're investing in the right opportunities at the right time, all while maintaining much more predictable timelines for Marathon." The upcoming shooter is set to be the first new Bungie title in over a decade—and the first project outside of Destiny the studio has worked on since it was acquired by PlayStation in 2022. Hulst said the aim is to release a "very bold, very innovative, and deeply engaging title." He explained Marathon is currently navigating test cycles that have yielded "varied" feedback, but said those mixed impressions have been "super useful." "That's why you do these tests. The constant testing and constant revalidation of assumptions that we just talked about, to me, is so valuable to iterate and to constantly improve the title," he added. "So when launch comes we're going to give the title the optimal chance of success." Hulst might be exuding confidence, but a recent report from Forbes claimed morale is in "free fall" at Bungie after the studio admitted to using stolen art assets in Marathon. That "varied" player feedback has also reportedly caused concern internally ahead of Marathon's proposed September 23 launch date. The studio was also made to endure layoffs earlier this year, with Sony cutting 220 roles after exceeding "financial safety margins."
    PlayStation Studios boss confident Marathon won't repeat the mistakes of Concord
    WWW.GAMEDEVELOPER.COM
  • Rethinking AI: DeepSeek’s playbook shakes up the high-spend, high-compute paradigm


    When DeepSeek released its R1 model this January, it wasn’t just another AI announcement. It was a watershed moment that sent shockwaves through the tech industry, forcing industry leaders to reconsider their fundamental approaches to AI development.
    What makes DeepSeek’s accomplishment remarkable isn’t that the company developed novel capabilities; rather, it was how it achieved comparable results to those delivered by tech heavyweights at a fraction of the cost. In reality, DeepSeek didn’t do anything that hadn’t been done before; its innovation stemmed from pursuing different priorities. As a result, we are now experiencing rapid-fire development along two parallel tracks: efficiency and compute. 
    As DeepSeek prepares to release its R2 model, and as it concurrently faces the potential of even greater chip restrictions from the U.S., it’s important to look at how it captured so much attention.
    Engineering around constraints
    DeepSeek’s arrival, as sudden and dramatic as it was, captivated us all because it showcased the capacity for innovation to thrive even under significant constraints. Faced with U.S. export controls limiting access to cutting-edge AI chips, DeepSeek was forced to find alternative pathways to AI advancement.
    While U.S. companies pursued performance gains through more powerful hardware, bigger models and better data, DeepSeek focused on optimizing what was available. It implemented known ideas with remarkable execution — and there is novelty in executing what’s known and doing it well.
    This efficiency-first mindset yielded incredibly impressive results. DeepSeek’s R1 model reportedly matches OpenAI’s capabilities at just 5 to 10% of the operating cost. According to reports, the final training run for DeepSeek’s V3 predecessor cost a mere $6 million, which was described by former Tesla AI scientist Andrej Karpathy as “a joke of a budget” compared to the tens or hundreds of millions spent by U.S. competitors. More strikingly, while OpenAI reportedly spent $500 million training its recent “Orion” model, DeepSeek achieved superior benchmark results for just $5.6 million, less than 1.2% of OpenAI’s investment.
    If you get starry-eyed believing these incredible results were achieved even as DeepSeek was at a severe disadvantage because it couldn’t access advanced AI chips, I hate to tell you, but that narrative isn’t entirely accurate (even though it makes a good story). Initial U.S. export controls focused primarily on compute capabilities, not on memory and networking, two crucial components for AI development.
    That means that the chips DeepSeek had access to were not poor quality chips; their networking and memory capabilities allowed DeepSeek to parallelize operations across many units, a key strategy for running their large model efficiently.
    This, combined with China’s national push toward controlling the entire vertical stack of AI infrastructure, resulted in accelerated innovation that many Western observers didn’t anticipate. DeepSeek’s advancements were an inevitable part of AI development, but they brought known advancements forward a few years earlier than would have been possible otherwise, and that’s pretty amazing.
    Pragmatism over process
    Beyond hardware optimization, DeepSeek’s approach to training data represents another departure from conventional Western practices. Rather than relying solely on web-scraped content, DeepSeek reportedly leveraged significant amounts of synthetic data and outputs from other proprietary models. This is a classic example of model distillation, or the ability to learn from really powerful models. Such an approach, however, raises questions about data privacy and governance that might concern Western enterprise customers. Still, it underscores DeepSeek’s overall pragmatic focus on results over process.
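    Model distillation is easy to state concretely. The sketch below (a minimal illustration in NumPy with made-up logits, not DeepSeek’s actual training code) shows the core objective: the student model is trained to minimize the KL divergence between its softened output distribution and the teacher’s, so it imitates the stronger model’s full distribution rather than just hard labels.

    ```python
    import numpy as np

    def softmax(logits, temperature=1.0):
        # Convert logits to a probability distribution, softened by temperature.
        z = np.asarray(logits, dtype=float) / temperature
        z -= z.max()  # numerical stability
        p = np.exp(z)
        return p / p.sum()

    def distillation_loss(teacher_logits, student_logits, temperature=2.0):
        """KL(teacher || student) over temperature-softened distributions.

        Minimizing this trains the student to reproduce the teacher's
        output distribution ("soft targets"), the essence of distillation.
        """
        p = softmax(teacher_logits, temperature)  # teacher soft targets
        q = softmax(student_logits, temperature)  # student predictions
        return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))

    # A student that matches the teacher incurs (near-)zero loss;
    # a student with a very different distribution incurs a positive loss.
    print(distillation_loss([4.0, 1.0, 0.5], [4.0, 1.0, 0.5]))
    print(distillation_loss([4.0, 1.0, 0.5], [0.5, 1.0, 4.0]))
    ```

    In practice the teacher’s logits (or sampled outputs) come from the stronger proprietary model, and this loss is combined with an ordinary task loss while training the student.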
    The effective use of synthetic data is a key differentiator. Synthetic data can be very effective when it comes to training large models, but you have to be careful; some model architectures handle synthetic data better than others. For instance, transformer-based models with mixture of experts (MoE) architectures like DeepSeek’s tend to be more robust when incorporating synthetic data, while more traditional dense architectures like those used in early Llama models can experience performance degradation or even “model collapse” when trained on too much synthetic content.
    This architectural sensitivity matters because synthetic data introduces different patterns and distributions compared to real-world data. When a model architecture doesn’t handle synthetic data well, it may learn shortcuts or biases present in the synthetic data generation process rather than generalizable knowledge. This can lead to reduced performance on real-world tasks, increased hallucinations or brittleness when facing novel situations. 
    Still, DeepSeek’s engineering teams reportedly designed their model architecture specifically with synthetic data integration in mind from the earliest planning stages. This allowed the company to leverage the cost benefits of synthetic data without sacrificing performance.
    Market reverberations
    Why does all of this matter? Stock market aside, DeepSeek’s emergence has triggered substantive strategic shifts among industry leaders.
    Case in point: OpenAI. Sam Altman recently announced plans to release the company’s first “open-weight” language model since 2019. This is a pretty notable pivot for a company that built its business on proprietary systems. It seems DeepSeek’s rise, on top of Llama’s success, has hit OpenAI’s leader hard. Just a month after DeepSeek arrived on the scene, Altman admitted that OpenAI had been “on the wrong side of history” regarding open-source AI. 
    With OpenAI reportedly spending $7 to 8 billion annually on operations, the economic pressure from efficient alternatives like DeepSeek has become impossible to ignore. As AI scholar Kai-Fu Lee bluntly put it: “You’re spending $7 billion or $8 billion a year, making a massive loss, and here you have a competitor coming in with an open-source model that’s for free.” This necessitates change.
    This economic reality prompted OpenAI to pursue a massive $40 billion funding round that valued the company at an unprecedented $300 billion. But even with a war chest of funds at its disposal, the fundamental challenge remains: OpenAI’s approach is dramatically more resource-intensive than DeepSeek’s.
    Beyond model training
    Another significant trend accelerated by DeepSeek is the shift toward “test-time compute” (TTC). As major AI labs have now trained their models on much of the available public data on the internet, data scarcity is slowing further improvements in pre-training.
    To get around this, DeepSeek announced a collaboration with Tsinghua University to enable “self-principled critique tuning” (SPCT). This approach trains AI to develop its own rules for judging content and then uses those rules to provide detailed critiques. The system includes a built-in “judge” that evaluates the AI’s answers in real time, comparing responses against core rules and quality standards.
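    The loop can be caricatured in a few lines of Python. This is an illustrative sketch only: the function names, candidate answers, and hand-written scoring rules below are invented for illustration, whereas the real system derives natural-language principles with the model itself and uses a learned, model-based judge.

    ```python
    # Toy SPCT-style loop: derive judging principles, score candidate
    # answers against them, and return the best-scoring answer.

    def derive_principles(prompt):
        # Stand-in for the model writing its own judging rules for this
        # prompt; here each "principle" is a simple checkable predicate.
        return [
            lambda a: prompt.split()[0].lower() in a.lower(),  # on-topic
            lambda a: any(ch.isdigit() for ch in a),           # concrete
            lambda a: len(a) < 120,                            # concise
        ]

    def judge(answer, principles):
        # The built-in "judge": score an answer by how many rules it satisfies.
        return sum(rule(answer) for rule in principles)

    def self_principled_critique(prompt, candidates):
        principles = derive_principles(prompt)
        return max(candidates, key=lambda a: judge(a, principles))

    best = self_principled_critique(
        "Marathon launch timing?",
        ["I don't know.",
         "Marathon is slated for September 23, pending test feedback.",
         "Games ship when they ship."],
    )
    print(best)
    ```

    The point of the sketch is the shape of the mechanism: extra inference-time work (generating principles, critiquing candidates) substitutes for a bigger model, which is exactly the test-time compute trade described above.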
    The development is part of a movement towards autonomous self-evaluation and improvement in AI systems, in which models use inference time to improve results rather than simply growing larger during training. DeepSeek calls its system “DeepSeek-GRM” (generalist reward modeling). But, as with its model distillation approach, this could be considered a mix of promise and risk.
    For example, if the AI develops its own judging criteria, there’s a risk those principles diverge from human values, ethics or context. The rules could end up being overly rigid or biased, optimizing for style over substance, and/or reinforce incorrect assumptions or hallucinations. Additionally, without a human in the loop, issues could arise if the “judge” is flawed or misaligned. It’s a kind of AI talking to itself, without robust external grounding. On top of this, users and developers may not understand why the AI reached a certain conclusion — which feeds into a bigger concern: Should an AI be allowed to decide what is “good” or “correct” based solely on its own logic? These risks shouldn’t be discounted.
    At the same time, this approach is gaining traction: once again, DeepSeek builds on the body of work of others to create what is likely the first full-stack application of SPCT in a commercial effort.
    This could mark a powerful shift in AI autonomy, but there still is a need for rigorous auditing, transparency and safeguards. It’s not just about models getting smarter, but that they remain aligned, interpretable, and trustworthy as they begin critiquing themselves without human guardrails.
    Moving into the future
    So, taking all of this into account, the rise of DeepSeek signals a broader shift in the AI industry toward parallel innovation tracks. While companies continue building more powerful compute clusters for next-generation capabilities, there will also be intense focus on finding efficiency gains through software engineering and model architecture improvements to offset the challenges of AI energy consumption, which far outpaces power generation capacity. 
    Companies are taking note. Microsoft, for example, has halted data center development in multiple regions globally, recalibrating toward a more distributed, efficient infrastructure approach. While still planning to invest approximately billion in AI infrastructure this fiscal year, the company is reallocating resources in response to the efficiency gains DeepSeek introduced to the market.
    Meta has also responded.
    With so much movement in such a short time, it becomes somewhat ironic that the U.S. sanctions designed to maintain American AI dominance may have instead accelerated the very innovation they sought to contain. By constraining access to materials, DeepSeek was forced to blaze a new trail.
    Moving forward, as the industry continues to evolve globally, adaptability for all players will be key. Policies, people and market reactions will continue to shift the ground rules — whether it’s eliminating the AI diffusion rule, a new ban on technology purchases or something else entirely. It’s what we learn from one another and how we respond that will be worth watching.
    Jae Lee is CEO and co-founder of TwelveLabs.

    #rethinking #deepseeks #playbook #shakes #highspend
    Rethinking AI: DeepSeek’s playbook shakes up the high-spend, high-compute paradigm
    Join the event trusted by enterprise leaders for nearly two decades. VB Transform brings together the people building real enterprise AI strategy. Learn more When DeepSeek released its R1 model this January, it wasn’t just another AI announcement. It was a watershed moment that sent shockwaves through the tech industry, forcing industry leaders to reconsider their fundamental approaches to AI development. What makes DeepSeek’s accomplishment remarkable isn’t that the company developed novel capabilities; rather, it was how it achieved comparable results to those delivered by tech heavyweights at a fraction of the cost. In reality, DeepSeek didn’t do anything that hadn’t been done before; its innovation stemmed from pursuing different priorities. As a result, we are now experiencing rapid-fire development along two parallel tracks: efficiency and compute.  As DeepSeek prepares to release its R2 model, and as it concurrently faces the potential of even greater chip restrictions from the U.S., it’s important to look at how it captured so much attention. Engineering around constraints DeepSeek’s arrival, as sudden and dramatic as it was, captivated us all because it showcased the capacity for innovation to thrive even under significant constraints. Faced with U.S. export controls limiting access to cutting-edge AI chips, DeepSeek was forced to find alternative pathways to AI advancement. While U.S. companies pursued performance gains through more powerful hardware, bigger models and better data, DeepSeek focused on optimizing what was available. It implemented known ideas with remarkable execution — and there is novelty in executing what’s known and doing it well. This efficiency-first mindset yielded incredibly impressive results. DeepSeek’s R1 model reportedly matches OpenAI’s capabilities at just 5 to 10% of the operating cost. 
According to reports, the final training run for DeepSeek’s V3 predecessor cost a mere million — which was described by former Tesla AI scientist Andrej Karpathy as “a joke of a budget” compared to the tens or hundreds of millions spent by U.S. competitors. More strikingly, while OpenAI reportedly spent million training its recent “Orion” model, DeepSeek achieved superior benchmark results for just million — less than 1.2% of OpenAI’s investment. If you get starry eyed believing these incredible results were achieved even as DeepSeek was at a severe disadvantage based on its inability to access advanced AI chips, I hate to tell you, but that narrative isn’t entirely accurate. Initial U.S. export controls focused primarily on compute capabilities, not on memory and networking — two crucial components for AI development. That means that the chips DeepSeek had access to were not poor quality chips; their networking and memory capabilities allowed DeepSeek to parallelize operations across many units, a key strategy for running their large model efficiently. This, combined with China’s national push toward controlling the entire vertical stack of AI infrastructure, resulted in accelerated innovation that many Western observers didn’t anticipate. DeepSeek’s advancements were an inevitable part of AI development, but they brought known advancements forward a few years earlier than would have been possible otherwise, and that’s pretty amazing. Pragmatism over process Beyond hardware optimization, DeepSeek’s approach to training data represents another departure from conventional Western practices. Rather than relying solely on web-scraped content, DeepSeek reportedly leveraged significant amounts of synthetic data and outputs from other proprietary models. This is a classic example of model distillation, or the ability to learn from really powerful models. 
Such an approach, however, raises questions about data privacy and governance that might concern Western enterprise customers. Still, it underscores DeepSeek’s overall pragmatic focus on results over process. The effective use of synthetic data is a key differentiator. Synthetic data can be very effective when it comes to training large models, but you have to be careful; some model architectures handle synthetic data better than others. For instance, transformer-based models with mixture of expertsarchitectures like DeepSeek’s tend to be more robust when incorporating synthetic data, while more traditional dense architectures like those used in early Llama models can experience performance degradation or even “model collapse” when trained on too much synthetic content. This architectural sensitivity matters because synthetic data introduces different patterns and distributions compared to real-world data. When a model architecture doesn’t handle synthetic data well, it may learn shortcuts or biases present in the synthetic data generation process rather than generalizable knowledge. This can lead to reduced performance on real-world tasks, increased hallucinations or brittleness when facing novel situations.  Still, DeepSeek’s engineering teams reportedly designed their model architecture specifically with synthetic data integration in mind from the earliest planning stages. This allowed the company to leverage the cost benefits of synthetic data without sacrificing performance. Market reverberations Why does all of this matter? Stock market aside, DeepSeek’s emergence has triggered substantive strategic shifts among industry leaders. Case in point: OpenAI. Sam Altman recently announced plans to release the company’s first “open-weight” language model since 2019. This is a pretty notable pivot for a company that built its business on proprietary systems. It seems DeepSeek’s rise, on top of Llama’s success, has hit OpenAI’s leader hard. 
Just a month after DeepSeek arrived on the scene, Altman admitted that OpenAI had been “on the wrong side of history” regarding open-source AI.  With OpenAI reportedly spending to 8 billion annually on operations, the economic pressure from efficient alternatives like DeepSeek has become impossible to ignore. As AI scholar Kai-Fu Lee bluntly put it: “You’re spending billion or billion a year, making a massive loss, and here you have a competitor coming in with an open-source model that’s for free.” This necessitates change. This economic reality prompted OpenAI to pursue a massive billion funding round that valued the company at an unprecedented billion. But even with a war chest of funds at its disposal, the fundamental challenge remains: OpenAI’s approach is dramatically more resource-intensive than DeepSeek’s. Beyond model training Another significant trend accelerated by DeepSeek is the shift toward “test-time compute”. As major AI labs have now trained their models on much of the available public data on the internet, data scarcity is slowing further improvements in pre-training. To get around this, DeepSeek announced a collaboration with Tsinghua University to enable “self-principled critique tuning”. This approach trains AI to develop its own rules for judging content and then uses those rules to provide detailed critiques. The system includes a built-in “judge” that evaluates the AI’s answers in real-time, comparing responses against core rules and quality standards. The development is part of a movement towards autonomous self-evaluation and improvement in AI systems in which models use inference time to improve results, rather than simply making models larger during training. DeepSeek calls its system “DeepSeek-GRM”. But, as with its model distillation approach, this could be considered a mix of promise and risk. For example, if the AI develops its own judging criteria, there’s a risk those principles diverge from human values, ethics or context. 
The rules could end up being overly rigid or biased, optimizing for style over substance, and/or reinforce incorrect assumptions or hallucinations. Additionally, without a human in the loop, issues could arise if the “judge” is flawed or misaligned. It’s a kind of AI talking to itself, without robust external grounding. On top of this, users and developers may not understand why the AI reached a certain conclusion — which feeds into a bigger concern: Should an AI be allowed to decide what is “good” or “correct” based solely on its own logic? These risks shouldn’t be discounted. At the same time, this approach is gaining traction, as again DeepSeek builds on the body of work of othersto create what is likely the first full-stack application of SPCT in a commercial effort. This could mark a powerful shift in AI autonomy, but there still is a need for rigorous auditing, transparency and safeguards. It’s not just about models getting smarter, but that they remain aligned, interpretable, and trustworthy as they begin critiquing themselves without human guardrails. Moving into the future So, taking all of this into account, the rise of DeepSeek signals a broader shift in the AI industry toward parallel innovation tracks. While companies continue building more powerful compute clusters for next-generation capabilities, there will also be intense focus on finding efficiency gains through software engineering and model architecture improvements to offset the challenges of AI energy consumption, which far outpaces power generation capacity.  Companies are taking note. Microsoft, for example, has halted data center development in multiple regions globally, recalibrating toward a more distributed, efficient infrastructure approach. While still planning to invest approximately billion in AI infrastructure this fiscal year, the company is reallocating resources in response to the efficiency gains DeepSeek introduced to the market. 
Meta has also responded, With so much movement in such a short time, it becomes somewhat ironic that the U.S. sanctions designed to maintain American AI dominance may have instead accelerated the very innovation they sought to contain. By constraining access to materials, DeepSeek was forced to blaze a new trail. Moving forward, as the industry continues to evolve globally, adaptability for all players will be key. Policies, people and market reactions will continue to shift the ground rules — whether it’s eliminating the AI diffusion rule, a new ban on technology purchases or something else entirely. It’s what we learn from one another and how we respond that will be worth watching. Jae Lee is CEO and co-founder of TwelveLabs. Daily insights on business use cases with VB Daily If you want to impress your boss, VB Daily has you covered. We give you the inside scoop on what companies are doing with generative AI, from regulatory shifts to practical deployments, so you can share insights for maximum ROI. Read our Privacy Policy Thanks for subscribing. Check out more VB newsletters here. An error occured. #rethinking #deepseeks #playbook #shakes #highspend
    VENTUREBEAT.COM
    Rethinking AI: DeepSeek’s playbook shakes up the high-spend, high-compute paradigm
    Join the event trusted by enterprise leaders for nearly two decades. VB Transform brings together the people building real enterprise AI strategy. Learn more When DeepSeek released its R1 model this January, it wasn’t just another AI announcement. It was a watershed moment that sent shockwaves through the tech industry, forcing industry leaders to reconsider their fundamental approaches to AI development. What makes DeepSeek’s accomplishment remarkable isn’t that the company developed novel capabilities; rather, it was how it achieved comparable results to those delivered by tech heavyweights at a fraction of the cost. In reality, DeepSeek didn’t do anything that hadn’t been done before; its innovation stemmed from pursuing different priorities. As a result, we are now experiencing rapid-fire development along two parallel tracks: efficiency and compute.  As DeepSeek prepares to release its R2 model, and as it concurrently faces the potential of even greater chip restrictions from the U.S., it’s important to look at how it captured so much attention. Engineering around constraints DeepSeek’s arrival, as sudden and dramatic as it was, captivated us all because it showcased the capacity for innovation to thrive even under significant constraints. Faced with U.S. export controls limiting access to cutting-edge AI chips, DeepSeek was forced to find alternative pathways to AI advancement. While U.S. companies pursued performance gains through more powerful hardware, bigger models and better data, DeepSeek focused on optimizing what was available. It implemented known ideas with remarkable execution — and there is novelty in executing what’s known and doing it well. This efficiency-first mindset yielded incredibly impressive results. DeepSeek’s R1 model reportedly matches OpenAI’s capabilities at just 5 to 10% of the operating cost. 
According to reports, the final training run for DeepSeek’s V3 predecessor cost a mere $6 million — which was described by former Tesla AI scientist Andrej Karpathy as “a joke of a budget” compared to the tens or hundreds of millions spent by U.S. competitors. More strikingly, while OpenAI reportedly spent $500 million training its recent “Orion” model, DeepSeek achieved superior benchmark results for just $5.6 million — less than 1.2% of OpenAI’s investment. If you get starry eyed believing these incredible results were achieved even as DeepSeek was at a severe disadvantage based on its inability to access advanced AI chips, I hate to tell you, but that narrative isn’t entirely accurate (even though it makes a good story). Initial U.S. export controls focused primarily on compute capabilities, not on memory and networking — two crucial components for AI development. That means that the chips DeepSeek had access to were not poor quality chips; their networking and memory capabilities allowed DeepSeek to parallelize operations across many units, a key strategy for running their large model efficiently. This, combined with China’s national push toward controlling the entire vertical stack of AI infrastructure, resulted in accelerated innovation that many Western observers didn’t anticipate. DeepSeek’s advancements were an inevitable part of AI development, but they brought known advancements forward a few years earlier than would have been possible otherwise, and that’s pretty amazing. Pragmatism over process Beyond hardware optimization, DeepSeek’s approach to training data represents another departure from conventional Western practices. Rather than relying solely on web-scraped content, DeepSeek reportedly leveraged significant amounts of synthetic data and outputs from other proprietary models. This is a classic example of model distillation, or the ability to learn from really powerful models. 
Such an approach, however, raises questions about data privacy and governance that might concern Western enterprise customers. Still, it underscores DeepSeek's overall pragmatic focus on results over process.

The effective use of synthetic data is a key differentiator. Synthetic data can be very effective for training large models, but you have to be careful; some model architectures handle synthetic data better than others. For instance, transformer-based models with mixture-of-experts (MoE) architectures like DeepSeek's tend to be more robust when incorporating synthetic data, while more traditional dense architectures like those used in early Llama models can experience performance degradation or even "model collapse" when trained on too much synthetic content.

This architectural sensitivity matters because synthetic data introduces different patterns and distributions than real-world data. When a model architecture doesn't handle synthetic data well, it may learn shortcuts or biases present in the synthetic data generation process rather than generalizable knowledge. This can lead to reduced performance on real-world tasks, increased hallucinations or brittleness when facing novel situations.

Still, DeepSeek's engineering teams reportedly designed their model architecture with synthetic data integration in mind from the earliest planning stages. This allowed the company to capture the cost benefits of synthetic data without sacrificing performance.

Market reverberations

Why does all of this matter? Stock market aside, DeepSeek's emergence has triggered substantive strategic shifts among industry leaders. Case in point: OpenAI. Sam Altman recently announced plans to release the company's first "open-weight" language model since 2019, a notable pivot for a company that built its business on proprietary systems. It seems DeepSeek's rise, on top of Llama's success, has hit OpenAI's leader hard.
Just a month after DeepSeek arrived on the scene, Altman admitted that OpenAI had been "on the wrong side of history" regarding open-source AI. With OpenAI reportedly spending $7 to $8 billion annually on operations, the economic pressure from efficient alternatives like DeepSeek has become impossible to ignore. As AI scholar Kai-Fu Lee bluntly put it: "You're spending $7 billion or $8 billion a year, making a massive loss, and here you have a competitor coming in with an open-source model that's for free." This necessitates change.

This economic reality prompted OpenAI to pursue a massive $40 billion funding round that valued the company at an unprecedented $300 billion. But even with that war chest at its disposal, the fundamental challenge remains: OpenAI's approach is dramatically more resource-intensive than DeepSeek's.

Beyond model training

Another significant trend accelerated by DeepSeek is the shift toward test-time compute (TTC). Because major AI labs have now trained their models on much of the available public data on the internet, data scarcity is slowing further improvements in pre-training. To get around this, DeepSeek announced a collaboration with Tsinghua University to enable "self-principled critique tuning" (SPCT). This approach trains AI to develop its own rules for judging content, then uses those rules to provide detailed critiques. The system includes a built-in "judge" that evaluates the AI's answers in real time, comparing responses against core rules and quality standards.

The development is part of a broader movement toward autonomous self-evaluation and improvement, in which models use inference time to improve results rather than simply growing larger during training. DeepSeek calls its system "DeepSeek-GRM" (generalist reward modeling). But, as with its model distillation approach, this could be considered a mix of promise and risk.
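The generator-plus-judge pattern behind such systems can be illustrated with a toy best-of-N loop. Everything below (the principles, the scoring rule, the candidate answers) is invented purely for illustration and is not a representation of DeepSeek-GRM's actual implementation:

```python
# Toy "judge": score each candidate answer against a list of principles,
# then spend extra inference-time compute picking the best of N candidates
# instead of training a bigger model.
PRINCIPLES = [
    ("cites a number", lambda a: any(c.isdigit() for c in a)),
    ("stays concise", lambda a: len(a.split()) <= 12),
    ("hedges uncertainty", lambda a: "about" in a or "roughly" in a),
]

def judge(answer: str) -> int:
    """Return how many principles the answer satisfies (a crude reward)."""
    return sum(check(answer) for _, check in PRINCIPLES)

def best_of_n(candidates: list[str]) -> str:
    """More candidates means more inference compute and, on average,
    a better selected answer: the essence of test-time compute."""
    return max(candidates, key=judge)

candidates = [
    "It costs a lot.",
    "Training cost roughly 6 million dollars.",
    "The final training run cost 6 million dollars, which is a figure "
    "that surprised many observers across the entire industry.",
]
best = best_of_n(candidates)
```

The design point is that quality improves by scoring more candidates at inference time rather than by enlarging the model; the catch, as discussed next, is that the judge is itself just another fallible component.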
If the AI develops its own judging criteria, for example, there's a risk those principles diverge from human values, ethics or context. The rules could end up overly rigid or biased, optimizing for style over substance or reinforcing incorrect assumptions and hallucinations. And without a human in the loop, issues could arise if the "judge" is flawed or misaligned; it's a kind of AI talking to itself, without robust external grounding. On top of this, users and developers may not understand why the AI reached a certain conclusion, which feeds into a bigger concern: Should an AI be allowed to decide what is "good" or "correct" based solely on its own logic?

These risks shouldn't be discounted. At the same time, the approach is gaining traction, as DeepSeek again builds on the body of work of others (think OpenAI's "critique and revise" methods, Anthropic's constitutional AI, or research on self-rewarding agents) to create what is likely the first full-stack application of SPCT in a commercial effort. This could mark a powerful shift in AI autonomy, but it demands rigorous auditing, transparency and safeguards. It's not just about models getting smarter; they must remain aligned, interpretable and trustworthy as they begin critiquing themselves without human guardrails.

Moving into the future

Taking all of this into account, the rise of DeepSeek signals a broader shift in the AI industry toward parallel innovation tracks. While companies continue building more powerful compute clusters for next-generation capabilities, there will also be intense focus on efficiency gains through software engineering and model architecture improvements, offsetting the challenges of AI energy consumption, which far outpaces power-generation capacity. Companies are taking note.
Microsoft, for example, has halted data center development in multiple regions globally, recalibrating toward a more distributed, efficient infrastructure approach. While still planning to invest approximately $80 billion in AI infrastructure this fiscal year, the company is reallocating resources in response to the efficiency gains DeepSeek introduced to the market. Meta has also responded.

With so much movement in so short a time, it becomes somewhat ironic that the U.S. sanctions designed to maintain American AI dominance may instead have accelerated the very innovation they sought to contain. By constraining access to materials, they forced DeepSeek to blaze a new trail.

Moving forward, as the industry continues to evolve globally, adaptability will be key for all players. Policies, people and market reactions will continue to shift the ground rules, whether through elimination of the AI diffusion rule, a new ban on technology purchases or something else entirely. What we learn from one another, and how we respond, will be worth watching.

Jae Lee is CEO and co-founder of TwelveLabs.