• Hexagon Taps NVIDIA Robotics and AI Software to Build and Deploy AEON, a New Humanoid

    As a global labor shortage leaves 50 million positions unfilled across industries like manufacturing and logistics, Hexagon — a global leader in measurement technologies — is developing humanoid robots that can lend a helping hand.
    Industrial sectors depend on skilled workers to perform a variety of error-prone tasks, including operating high-precision scanners for reality capture — the process of capturing digital data to replicate the real world in simulation.
    At the Hexagon LIVE Global conference, Hexagon’s robotics division today unveiled AEON — a new humanoid robot built in collaboration with NVIDIA that’s engineered to perform a wide range of industrial applications, from manipulation and asset inspection to reality capture and operator support. Hexagon plans to deploy AEON across automotive, transportation, aerospace, manufacturing, warehousing and logistics.
    Future use cases for AEON include:

    Reality capture, which involves automatic planning and then scanning of assets, industrial spaces and environments to generate 3D models. The captured data is then used for advanced visualization and collaboration in the Hexagon Digital Reality (HxDR) platform powering Hexagon Reality Cloud Studio (RCS).
    Manipulation tasks, such as sorting and moving parts in various industrial and manufacturing settings.
    Part inspection, which includes checking parts for defects or ensuring adherence to specifications.
    Industrial operations, including highly dexterous technical tasks like machinery operations, teleoperation and scanning parts using high-end scanners.

    “The age of general-purpose robotics has arrived, due to technological advances in simulation and physical AI,” said Deepu Talla, vice president of robotics and edge AI at NVIDIA. “Hexagon’s new AEON humanoid embodies the integration of NVIDIA’s three-computer robotics platform and is making a significant leap forward in addressing industry-critical challenges.”

    Using NVIDIA’s Three Computers to Develop AEON 
    To build AEON, Hexagon used NVIDIA’s three computers for developing and deploying physical AI systems. They include AI supercomputers to train and fine-tune powerful foundation models; the NVIDIA Omniverse platform, running on NVIDIA OVX servers, for testing and optimizing these models in simulation environments using real and physically based synthetic data; and NVIDIA IGX Thor robotic computers to run the models.
    Hexagon is exploring using NVIDIA accelerated computing to post-train the NVIDIA Isaac GR00T N1.5 open foundation model to improve robot reasoning and policies, and tapping Isaac GR00T-Mimic to generate vast amounts of synthetic motion data from a few human demonstrations.
    AEON learns many of its skills through simulations powered by the NVIDIA Isaac platform. Hexagon uses NVIDIA Isaac Sim, a reference robotic simulation application built on Omniverse, to simulate complex robot actions like navigation, locomotion and manipulation. These skills are then refined using reinforcement learning in NVIDIA Isaac Lab, an open-source framework for robot learning.
    https://blogs.nvidia.com/wp-content/uploads/2025/06/Copy-of-robotics-hxgn-live-blog-1920x1080-1.mp4
    This simulation-first approach enabled Hexagon to fast-track its robotic development, allowing AEON to master core locomotion skills in just 2-3 weeks — rather than 5-6 months — before real-world deployment.
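    The refine-in-simulation idea described above can be sketched generically. The following is illustrative Python only, not Isaac Lab code; the toy "locomotion" environment, reward, and hyperparameters are all invented for the sketch. A policy is trained entirely against a simulated reward signal, and only the learned behavior would be carried to a real robot:

```python
import random

# Toy sketch of simulation-based skill refinement via reinforcement
# learning. A 1-D "walk forward" task stands in for a locomotion
# simulator; none of this reflects Isaac Lab's actual API.

N_STATES = 5          # positions 0..4; reaching 4 means a stable gait
ACTIONS = [0, 1]      # 0 = small step, 1 = large step

def step(state, action):
    """Large steps move faster but sometimes stumble back to the start."""
    if action == 1 and random.random() < 0.3:
        return 0, -1.0                      # stumble penalty
    nxt = min(state + action + 1, N_STATES - 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward

def train(episodes=2000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    random.seed(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]
    for _ in range(episodes):
        s = 0
        while s != N_STATES - 1:
            if random.random() < eps:
                a = random.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: q[s][x])
            nxt, r = step(s, a)
            q[s][a] += alpha * (r + gamma * max(q[nxt]) - q[s][a])
            s = nxt
    return q

# The "deployed skill": the greedy action per state after training.
policy = [max(ACTIONS, key=lambda a: q_row[a]) for q_row in train()]
print(policy)
```

In the real pipeline the simulator, reward shaping, and policy class are vastly more complex, but the structure (train against simulated rollouts, export the resulting policy) is the same.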
    In addition, AEON taps into NVIDIA Jetson Orin onboard computers to autonomously move, navigate and perform its tasks in real time, enhancing its speed and accuracy while operating in complex and dynamic environments. Hexagon is also planning to upgrade AEON with NVIDIA IGX Thor to enable functional safety for collaborative operation.
    “Our goal with AEON was to design an intelligent, autonomous humanoid that addresses the real-world challenges industrial leaders have shared with us over the past months,” said Arnaud Robert, president of Hexagon’s robotics division. “By leveraging NVIDIA’s full-stack robotics and simulation platforms, we were able to deliver a best-in-class humanoid that combines advanced mechatronics, multimodal sensor fusion and real-time AI.”
    Data Comes to Life Through Reality Capture and Omniverse Integration 
    AEON will be piloted in factories and warehouses to scan everything from small precision parts and automotive components to large assembly lines and storage areas.

    Captured data comes to life in RCS, a platform that allows users to collaborate, visualize and share reality-capture data by tapping into HxDR and NVIDIA Omniverse running in the cloud. This removes the constraint of local infrastructure.
    “Digital twins offer clear advantages, but adoption has been challenging in several industries,” said Lucas Heinzle, vice president of research and development at Hexagon’s robotics division. “AEON’s sophisticated sensor suite enables the integration of reality data capture with NVIDIA Omniverse, streamlining workflows for our customers and moving us closer to making digital twins a mainstream tool for collaboration and innovation.”
    AEON’s Next Steps
    By adopting the OpenUSD framework and developing on Omniverse, Hexagon can generate high-fidelity digital twins from scanned data — establishing a data flywheel to continuously train AEON.
    This latest work with Hexagon is helping shape the future of physical AI — delivering scalable, efficient solutions to address the challenges faced by industries that depend on capturing real-world data.
    Watch the Hexagon LIVE keynote, explore presentations and read more about AEON.
    All imagery courtesy of Hexagon.
    BLOGS.NVIDIA.COM
  • So, Ruud is back, and this time he's tackling the age-old dilemma of dust separation like it's the final frontier of modern science. I mean, who knew optimizing dust collection could be the hot new trend? Forget world hunger or climate change—apparently, the real challenge we’ve all been ignoring is how to capture every last speck of sawdust in Ruud's shop!

    The dust collection system was already "solved," but hey, why not reinvent the wheel? Let’s all gather ‘round and hold our breaths for the groundbreaking revelations that will surely change our lives forever… or at least our workshop floors. Dust, the real MVP.

    #DustSeparation #WorkshopWisdom #ExtremeEfficiency #CapturingDust
    HACKADAY.COM
    Optimizing Dust Separation for Extreme Efficiency
    [Ruud], the creator of [Capturing Dust], started his latest video with what most of us would consider a solved problem: the dust collection system for his shop already had a …

    ## Introduction

    In the evolving landscape of digital marketing, the term Generative Engine Optimization (GEO) has surfaced as a significant development. This practice revolves around optimizing content specifically for AI-powered search engines, such as ChatGPT and Google. As search engines continue to incorp...
    Generative Engine Optimization: The New Era of Search
  • How to optimize your hybrid waterfall with CPM buckets

    In-app bidding has automated most waterfall optimization, yet developers still manage multiple hybrid waterfalls, each with dozens of manual instances. Naturally, this can be time-consuming and overwhelming to maintain, keeping you from optimizing to perfection and focusing on other opportunities to boost revenue. Rather than analyzing each individual network and checking if instances are available at each price point, breaking down your waterfall into different CPM ranges allows you to visualize the waterfall and easily identify the gaps. Here are some tips on how to use CPM buckets to better optimize your waterfall's performance.

    What are CPM buckets?

    CPM buckets show you exactly how much revenue and how many impressions you're getting from each CPM price range, giving you a more granular idea of how different networks are competing in the waterfall. CPM buckets are a feature of real-time pivot reports, available on ironSource LevelPlay.

    Identifying and closing the gaps

    Typically in a waterfall, you can only see each ad network's average CPM. But this keeps you from seeing ad network distribution across all price points and understanding exactly where ad networks are bidding. Bottom line - you don't know where in the waterfall you should add a new instance. By separating CPM into buckets (for example, seeing all the ad networks generating a CPM of $10-$20), you understand exactly which networks are driving impressions and revenue, and which CPMs aren't being filled.

    Now how do you do it? As a LevelPlay client, simply use ironSource's real-time pivot reports: choose the CPM bucket filter option and sort by "average bid price." From here, you'll see how your revenue spreads out among CPM ranges and you'll start to notice gaps in your bar graph. Every gap in revenue - where revenue is much lower than the neighboring CPM group - indicates an opportunity to optimize your monetization strategy.
    The buckets can range from small increments like $1 to larger increments like $10, so it's important to compare CPM buckets of the same incremental value.

    Pro tip: To best set up your waterfall, create one tab with the general waterfall (filter app, OS, ad unit, geo/geos from a specific group) and make sure to look at Revenue and eCPM in the "measures" dropdown. In the "show" section, choose CPM buckets and sort by average bid price. From here, you can mark down any gaps.

    But where do these gaps come from? Gaps in revenue are often due to friction in the waterfall, like not enough instances, instances that aren't working, or a waterfall setup mistake. But gaps can also be adjusted and fixed. Once you've found a gap, you can look at the CPM buckets around it to better understand the context. Let's say you see a strong instance generating significant revenue in the CPM bucket right below it, in the $70-80 group. This instance from this specific ad network has a lot of potential, so it's worth trying to push it to a higher CPM bucket. In fact, when you look at higher CPM buckets, you don't see this ad network anywhere else in the waterfall - what a missed opportunity! Try adding another instance of this network higher up in the waterfall. If you're profiting well with a $70-80 CPM, imagine how much more revenue you could bring at a $150 CPM.

    Pro tip: Focusing on higher areas in the waterfall makes a larger financial impact, leading to bigger increases in ARPDAU.

    Let's say you decide to add five instances of that network to higher CPM buckets. You can use LevelPlay's quick A/B test to understand if this adjustment boosts your revenue - not just for this gap, but for any and all that you find. Simply compare your existing waterfall against the new waterfall with these five higher instances, then implement the one that drives the highest revenue. Božo Janković, Head of Ad Monetization at GameBiz Consulting, uses CPM buckets "to understand at which CPMs the bidding networks are filling.
From there, I can pinpoint exactly where in the waterfall to add more traditional instances - which creates more competition, especially for the bidding networks, and creates an opportunity for revenue growth."

    Finding new insights

    You can dig even deeper into your data by filtering by ad source. Before CPM buckets, you were limited to seeing an average eCPM for each bidding network. Maybe you knew that one ad source had an average CPM of $50, but the distribution of impressions across the waterfall was a black box. Now, we know exactly which CPMs the bidders are filling. "I find the ironSource CPM buckets feature very insightful and use it daily. It's an easy way to identify opportunities to optimize the waterfall and earn even more revenue."

    - Božo Janković, Head of Ad Monetization at GameBiz Consulting

    Understanding your CPM distribution empowers you not only to identify your revenue sources, but also to promote revenue growth. Armed with the knowledge of which buckets some of their stronger bidding networks are performing in, some publishers actively add instances from traditional networks above those ranges. This creates better competition and also helps drive up the bids from the bidders. There's no need for deep analysis - once you see the gaps, you can quickly understand who's performing in the lower and higher buckets, and see exactly what's missing. This way, you won't leave any revenue on the table. Learn more about CPM buckets, available exclusively to ironSource LevelPlay, here.
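    The A/B test mentioned above ultimately reduces to computing ARPDAU (revenue per daily active user) for each waterfall variant and rolling out the winner. A minimal sketch, with invented numbers rather than LevelPlay's actual report format:

```python
# Hypothetical A/B results for two waterfall setups; the variant
# names and all figures are invented for illustration.
variants = {
    "control":              {"revenue": 5_400.0, "dau": 120_000},
    "test_5_new_instances": {"revenue": 5_950.0, "dau": 119_500},
}

def arpdau(v):
    """Average revenue per daily active user."""
    return v["revenue"] / v["dau"]

winner = max(variants, key=lambda name: arpdau(variants[name]))
for name, v in variants.items():
    print(f"{name}: ARPDAU ${arpdau(v):.4f}")
print("roll out:", winner)
```

In practice you would also want enough traffic in each arm for the ARPDAU difference to be statistically meaningful before committing to the new setup.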
    UNITY.COM
    How to optimize your hybrid waterfall with CPM buckets
    In-app bidding has automated most waterfall optimization, yet developers still manage multiple hybrid waterfalls, each with dozens of manual instances. Naturally, this can be timely and overwhelming to maintain, keeping you from optimizing to perfection and focusing on other opportunities to boost revenue.Rather than analyzing each individual network and checking if instances are available at each price point, breaking down your waterfall into different CPM ranges allows you to visualize the waterfall and easily identify the gaps.Here are some tips on how to use CPM buckets to better optimize your waterfall’s performance.What are CPM buckets?CPM buckets show you exactly how much revenue and how many impressions you’re getting from each CPM price range, giving you a more granular idea of how different networks are competing in the waterfall. CPM buckets are a feature of real time pivot reports, available on ironSource LevelPlay.Identifying and closing the gapsTypically in a waterfall, you can only see each ad network’s average CPM. But this keeps you from seeing ad network distribution across all price points and understanding exactly where ad networks are bidding. Bottom line - you don’t know where in the waterfall you should add a new instance.By separating CPM into buckets, (for example, seeing all the ad networks generating a CPM of $10-$20) you understand exactly which networks are driving impressions and revenue and which CPMs aren’t being filledNow how do you do it? As a LevelPlay client, simply use ironSource’s real time pivot reports - choose the CPM bucket filter option and sort by “average bid price.” From here, you’ll see how your revenue spreads out among CPM ranges and you’ll start to notice gaps in your bar graph. Every gap in revenue - where revenue is much lower than the neighboring CPM group - indicates an opportunity to optimize your monetization strategy. 
    The buckets can range from small increments like $1 to larger increments like $10, so it’s important to compare CPM buckets of the same incremental value.
    Pro tip: To best set up your waterfall, create one tab with the general waterfall (filter by app, OS, ad unit, and geo/geos from a specific group) and make sure to look at revenue and eCPM in the “measures” dropdown. In the “show” section, choose CPM buckets and sort by average bid price. From here, you can mark down any gaps.
    But where do these gaps come from? Gaps in revenue are often due to friction in the waterfall, like not enough instances, instances that aren’t working, or a waterfall setup mistake. The good news is that gaps can be fixed.
    Once you’ve found a gap, you can look at the CPM buckets around it to better understand the context. Let’s say you see a strong instance generating significant revenue in the CPM bucket right below the gap, in the $70-80 group. This instance from this specific ad network has a lot of potential, so it’s worth trying to push it to a higher CPM bucket. In fact, when you look at higher CPM buckets, you don’t see this ad network anywhere else in the waterfall - what a missed opportunity! Try adding another instance of this network higher up in the waterfall. If you’re profiting well at a $70-80 CPM, imagine how much more revenue you could bring in at a $150 CPM.
    Pro tip: Focusing on higher areas of the waterfall makes a larger financial impact, leading to bigger increases in ARPDAU.
    Let’s say you decide to add 5 instances of that network to higher CPM buckets. You can use LevelPlay’s quick A/B test to understand if this adjustment boosts your revenue - not just for this gap, but for any and all that you find.
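The gap heuristic described above - a bucket earning far less than its neighbors - is easy to automate once you have per-bucket totals. A rough sketch; the 50% threshold is my own arbitrary choice for illustration, not an ironSource recommendation:

```python
def find_gaps(bucket_totals, threshold=0.5):
    """Flag buckets earning less than `threshold` times BOTH neighbors.

    `bucket_totals` is an ordered list of (bucket_label, revenue) pairs
    for buckets of the same incremental value.
    """
    gaps = []
    for i in range(1, len(bucket_totals) - 1):
        label, rev = bucket_totals[i]
        prev_rev = bucket_totals[i - 1][1]
        next_rev = bucket_totals[i + 1][1]
        if rev < threshold * prev_rev and rev < threshold * next_rev:
            gaps.append(label)
    return gaps

# Made-up per-bucket revenue pulled from a pivot-style report
report = [("$60-70", 900.0), ("$70-80", 1200.0), ("$80-90", 150.0),
          ("$90-100", 800.0), ("$100-110", 700.0)]
print(find_gaps(report))  # the $80-90 bucket stands out
```

Each flagged bucket is a candidate spot for a new instance from a traditional network.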
    Simply compare your existing waterfall against the new waterfall with these 5 higher instances - then implement the one that drives the highest revenue.
    Božo Janković, Head of Ad Monetization at GameBiz Consulting, uses CPM buckets “to understand at which CPMs the bidding networks are filling. From there, I can pinpoint exactly where in the waterfall to add more traditional instances - which creates more competition, especially for the bidding networks, and creates an opportunity for revenue growth.”
    Finding new insights
    You can dig even deeper into your data by filtering by ad source. Before CPM buckets, you were limited to seeing an average eCPM for each bidding network. Maybe you knew that one ad source had an average CPM of $50, but the distribution of impressions across the waterfall was a black box. Now, you know exactly which CPMs the bidders are filling.
    “I find ironSource’s CPM buckets feature very insightful and use it daily. It’s an easy way to identify opportunities to optimize the waterfall and earn even more revenue.” - Božo Janković, Head of Ad Monetization at GameBiz Consulting
    Understanding your CPM distribution empowers you not only to identify your revenue sources, but also to promote revenue growth. Armed with the knowledge of which buckets some of their stronger bidding networks are performing in, some publishers actively add instances from traditional networks above those ranges. This creates better competition and also helps drive up the bids from the bidders.
    There’s no need for deep analysis - once you see the gaps, you can quickly understand who’s performing in the lower and higher buckets, and see exactly what’s missing. This way, you won’t miss out on any revenue.
    Learn more about CPM buckets, available exclusively to ironSource LevelPlay, here.
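The waterfall A/B test boils down to comparing revenue per user between the two setups. LevelPlay computes this for you; the sketch below just shows the arithmetic, with hypothetical numbers:

```python
def arpdau(revenue, dau):
    """Average revenue per daily active user."""
    return revenue / dau

# Hypothetical test results: group A is the existing waterfall,
# group B adds the 5 higher instances.
a = arpdau(revenue=4200.0, dau=50000)   # $0.084 per user
b = arpdau(revenue=4650.0, dau=50000)   # $0.093 per user
uplift = (b - a) / a
print(f"ARPDAU A=${a:.3f}, B=${b:.3f}, uplift={uplift:.1%}")
```

If the uplift holds across the test period, roll out waterfall B; otherwise keep A and try filling a different gap.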
  • Why Designers Get Stuck In The Details And How To Stop

    You’ve drawn fifty versions of the same screen — and you still hate every one of them. Begrudgingly, you pick three, show them to your product manager, and hear: “Looks cool, but the idea doesn’t work.” Sound familiar?
    In this article, I’ll unpack why designers fall into detail work at the wrong moment, examining both process pitfalls and the underlying psychological reasons, as understanding these traps is the first step to overcoming them. I’ll also share tactics I use to climb out of that trap.
    Reason #1: You’re Afraid To Show Rough Work
    We designers worship detail. We’re taught that true craft equals razor‑sharp typography, perfect grids, and pixel precision. So the minute a task arrives, we pop open Figma and start polishing long before polish is needed.
    I’ve skipped the sketch phase more times than I care to admit. I told myself it would be faster, yet I always ended up spending hours producing a tidy mock‑up when a scribbled thumbnail would have sparked a five‑minute chat with my product manager. Rough sketches felt “unprofessional,” so I hid them.
    The cost? Lost time, wasted energy — and, by the third redo, teammates were quietly wondering if I even understood the brief.
    The real problem here is the habit: we open Figma and start perfecting the UI before we’ve even solved the problem.
    So why do we hide these rough sketches? It’s not just a bad habit or plain silliness. There are solid psychological reasons behind it. We often just call it perfectionism, but it’s deeper than wanting things neat. Digging into the psychology (like the research by Hewitt and Flett) shows there are a couple of flavors driving this:

    Socially prescribed perfectionism: It’s that nagging feeling that everyone else expects perfect work from you, which makes showing anything rough feel like walking into the lion’s den.
    Self-oriented perfectionism: You’re the one setting impossibly high standards for yourself, leading to brutal self-criticism if anything looks slightly off.

    Either way, the result’s the same: showing unfinished work feels wrong, and you miss out on that vital early feedback.
    Back to the design side, remember that clients rarely see architects’ first pencil sketches, but these sketches still exist; they guide structural choices before the 3D render. Treat your thumbnails the same way — artifacts meant to collapse uncertainty, not portfolio pieces. Once stakeholders see the upside, roughness becomes a badge of speed, not sloppiness. So, the key is to consciously make that shift:
    Treat early sketches as disposable tools for thinking and actively share them to get feedback faster.

    Reason #2: You Fix The Symptom, Not The Cause
    Before tackling any task, we need to understand what business outcome we’re aiming for. Product managers might come to us asking to enlarge the payment button in the shopping cart because users aren’t noticing it. The suggested solution itself isn’t necessarily bad, but before redesigning the button, we should ask, “What data suggests they aren’t noticing it?” Don’t get me wrong, I’m not saying you shouldn’t trust your product manager. On the contrary, these questions help ensure you’re on the same page and working with the same data.
    From my experience, here are several reasons why users might not be clicking that coveted button:

    Users don’t understand that this step is for payment.
    They understand it’s about payment but expect order confirmation first.
    Due to incorrect translation, users don’t understand what the button means.
    Lack of trust signals (no security icons, unclear seller information).
    Unexpected additional costs (hidden fees, shipping) that appear at this stage.
    Technical issues (inactive button, page freezing).

    Now, imagine you simply did what the manager suggested. Would you have solved the problem? Hardly.
    Moreover, the responsibility for the unresolved issue would fall on you, as the interface solution lies within the design domain. The product manager actually did their job correctly by identifying a problem: suspiciously few users are clicking the button.
    Psychologically, taking on this bigger role isn’t easy. It means overcoming the fear of making mistakes and the discomfort of exploring unclear problems rather than just doing tasks. This shift means seeing ourselves as partners who create value — even if it means fighting a hesitation to question product managers (which might come from a fear of speaking up or a desire to avoid challenging authority) — and understanding that using our product logic expertise proactively is crucial for modern designers.
    There’s another critical reason why we, designers, need to be a bit like product managers: the rise of AI. I deliberately used a simple example about enlarging a button, but I’m confident that in the near future, AI will easily handle routine design tasks. This worries me, but at the same time, I’m already gladly stepping into the product manager’s territory: understanding product and business metrics, formulating hypotheses, conducting research, and so on. It might sound like I’m taking work away from PMs, but believe me, they undoubtedly have enough on their plates and are usually more than happy to delegate some responsibilities to designers.
    Reason #3: You’re Solving The Wrong Problem
    Before solving anything, ask whether the problem even deserves your attention.
    During a major home‑screen redesign, our goal was to drive more users into paid services. The initial hypothesis — making service buttons bigger and brighter might help returning users — seemed reasonable enough to test. However, even when A/B tests (a method of comparing two versions of a design to determine which performs better) showed minimal impact, we continued to tweak those buttons.
    Only later did it click: the home screen isn’t the place to sell; visitors open the app to start, not to buy. We removed that promo block, and nothing broke. Contextual entry points deeper into the journey performed brilliantly. Lesson learned:
    Without the right context, any visual tweak is lipstick on a pig.

    Why did we get stuck polishing buttons instead of stopping sooner? It’s easy to get tunnel vision. Psychologically, it’s likely the good old sunk cost fallacy kicking in: we’d already invested time in the buttons, so stopping felt like wasting that effort, even though the data wasn’t promising.
    It’s just easier to keep fiddling with something familiar than to admit we need a new plan. Perhaps the simple question I should have asked myself when results stalled was: “Are we optimizing the right thing or just polishing something that fundamentally doesn’t fit the user’s primary goal here?” That alone might have saved hours.
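As an aside, the “minimal impact” verdict in an A/B test like the one above is just a difference-in-proportions check. A hypothetical sketch with made-up numbers (not our actual data), using a standard two-proportion z statistic:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Made-up numbers: 5.0% vs 5.3% conversion, 5,000 users per variant
z = two_proportion_z(250, 5000, 265, 5000)
print(f"z = {z:.2f}")  # well below 1.96, so the lift is not significant at 95%
```

When z stays this low, more button polishing is unlikely to help; the hypothesis itself is the thing to revisit.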
    Reason #4: You’re Drowning In Unactionable Feedback
    We all discuss our work with colleagues. But here’s a crucial point: what kind of question do you pose to kick off that discussion? If your go-to is “What do you think?” well, that question might lead you down a rabbit hole of personal opinions rather than actionable insights. While experienced colleagues will cut through the noise, others, unsure what to evaluate, might comment on anything and everything — fonts, button colors, even when you desperately need to discuss a user flow.
    What matters here are two things:

    The question you ask,
    The context you give.

    That means clearly stating the problem, what you’ve learned, and how your idea aims to fix it.
    For instance:
    “The problem is our payment conversion rate has dropped by X%. I’ve interviewed users and found they abandon payment because they don’t understand how the total amount is calculated. My solution is to show a detailed cost breakdown. Do you think this actually solves the problem for them?”

    Here, you’ve stated the problem, shared your insight, explained your solution, and asked a direct question. It’s even better if you prepare a list of specific sub-questions. For instance: “Are all items in the cost breakdown clear?” or “Does the placement of this breakdown feel intuitive within the payment flow?”
    Another good habit is to keep your rough sketches and previous iterations handy. Some of your colleagues’ suggestions might be things you’ve already tried. It’s great if you can discuss them immediately to either revisit those ideas or definitively set them aside.
    I’m not a psychologist, but experience tells me that, psychologically, the reluctance to be this specific often stems from a fear of our solution being rejected. We tend to internalize feedback: a seemingly innocent comment like, “Have you considered other ways to organize this section?” or “Perhaps explore a different structure for this part?” can instantly morph in our minds into “You completely messed up the structure. You’re a bad designer.” Imposter syndrome, in all its glory.
    So, to wrap up this point, here are two recommendations:

    Prepare for every design discussion. A couple of focused questions will yield far more valuable input than a vague “So, what do you think?”
    Actively work on separating feedback on your design from your self-worth. If a mistake is pointed out, acknowledge it, learn from it, and you’ll be less likely to repeat it. This is often easier said than done. For me, it took years of working with a psychotherapist. If you struggle with this, I sincerely wish you strength in overcoming it.

    Reason #5: You’re Just Tired
    Sometimes, the issue isn’t strategic at all — it’s fatigue. Fussing over icon corners can feel like a cozy bunker when your brain is fried. There’s a name for this: decision fatigue. Basically, your brain’s battery for hard thinking is low, so it hides out in the easy, comfy zone of pixel-pushing.
    A striking example comes from a New York Times article titled “Do You Suffer From Decision Fatigue?” It described how judges deciding on release requests were far more likely to grant release early in the day than late in the day, simply because their decision-making energy was depleted. Luckily, designers rarely hold someone’s freedom in their hands, but the example dramatically shows how fatigue can impact our judgment and productivity.
    What helps here:

    Swap tasks. Trade tickets with another designer; novelty resets your focus.
    Talk to another designer. If NDA permits, ask peers outside the team for a sanity check.
    Step away. Even a ten‑minute walk can do more than a double‑shot espresso.

    By the way, I came up with these ideas while walking around my office. I was lucky to work near a river, and those short walks quickly turned into a helpful habit.

    And one more trick that helps me snap out of detail mode early: if I catch myself making around 20 little tweaks — changing font weight, color, border radius — I just stop. Over time, it turned into a habit. I have a similar one with Instagram: by the third reel, my brain quietly asks, “Wait, weren’t we working?” Funny how that kind of nudge saves a ton of time.
    Four Steps I Use to Avoid Drowning In Detail
    Knowing these potential traps, here’s the practical process I use to stay on track:
    1. Define the Core Problem & Business Goal
    Before anything, dig deep: what’s the actual problem we’re solving, not just the requested task or a surface-level symptom? Ask ‘why’ repeatedly. What user pain or business need are we addressing? Then, state the clear business goal: “What metric am I moving, and do we have data to prove this is the right lever?” If retention is the goal, decide whether push reminders, gamification, or personalised content is the best route. The wrong lever, or tackling a symptom instead of the cause, dooms everything downstream.
    2. Choose the Mechanic
    Once the core problem and goal are clear, lock the solution principle or “mechanic” first. Going with a game layer? Decide if it’s leaderboards, streaks, or badges. Write it down. Then move on. No UI yet. This keeps the focus high-level before diving into pixels.
    3. Wireframe the Flow & Get Focused Feedback
    Now open Figma. Map screens, layout, and transitions. Boxes and arrows are enough. Keep the fidelity low so the discussion stays on the flow, not colour. Crucially, when you share these early wires, ask specific questions and provide clear context to get actionable feedback, not just vague opinions.
    4. Polish the Visuals
    I only let myself tweak grids, type scales, and shadows after the flow is validated. If progress stalls, or before a major polish effort, I surface the work in a design critique — again using targeted questions and clear context — instead of hiding in version 47. This ensures detailing serves the now-validated solution.
    Even for something as small as a single button, running these four checkpoints takes about ten minutes and saves hours of decorative dithering.
    Wrapping Up
    Next time you feel the pull to vanish into mock‑ups before the problem is nailed down, pause and ask what you might be avoiding — the fuzzy core problem, perhaps, or a round of tough feedback. Yes, that can expose an uncomfortable truth, but it also gives you the power to face the real issue head-on, and it keeps the project focused on solving the right problem, not just perfecting a flawed solution.
    Attention to detail is a superpower when used at the right moment. Obsessing over pixels too soon, though, is a bad habit and a warning light telling us the process needs a rethink.
    SMASHINGMAGAZINE.COM
    Why Designers Get Stuck In The Details And How To Stop
    You’ve drawn fifty versions of the same screen — and you still hate every one of them. Begrudgingly, you pick three, show them to your product manager, and hear: “Looks cool, but the idea doesn’t work.” Sound familiar? In this article, I’ll unpack why designers fall into detail work at the wrong moment, examining both process pitfalls and the underlying psychological reasons, as understanding these traps is the first step to overcoming them. I’ll also share tactics I use to climb out of that trap. Reason #1 You’re Afraid To Show Rough Work We designers worship detail. We’re taught that true craft equals razor‑sharp typography, perfect grids, and pixel precision. So the minute a task arrives, we pop open Figma and start polishing long before polish is needed. I’ve skipped the sketch phase more times than I care to admit. I told myself it would be faster, yet I always ended up spending hours producing a tidy mock‑up when a scribbled thumbnail would have sparked a five‑minute chat with my product manager. Rough sketches felt “unprofessional,” so I hid them. The cost? Lost time, wasted energy — and, by the third redo, teammates were quietly wondering if I even understood the brief. The real problem here is the habit: we open Figma and start perfecting the UI before we’ve even solved the problem. So why do we hide these rough sketches? It’s not just a bad habit or plain silly. There are solid psychological reasons behind it. We often just call it perfectionism, but it’s deeper than wanting things neat. Digging into the psychology (like the research by Hewitt and Flett) shows there are a couple of flavors driving this: Socially prescribed perfectionismIt’s that nagging feeling that everyone else expects perfect work from you, which makes showing anything rough feel like walking into the lion’s den. Self-oriented perfectionismWhere you’re the one setting impossibly high standards for yourself, leading to brutal self-criticism if anything looks slightly off. 
    Either way, the result’s the same: showing unfinished work feels wrong, and you miss out on that vital early feedback. Back on the design side, remember that clients rarely see architects’ first pencil sketches, but those sketches still exist; they guide structural choices before the 3D render. Treat your thumbnails the same way — artifacts meant to collapse uncertainty, not portfolio pieces. Once stakeholders see the upside, roughness becomes a badge of speed, not sloppiness. So the key is to consciously make that shift: treat early sketches as disposable tools for thinking, and actively share them to get feedback faster.

    Reason #2: You Fix The Symptom, Not The Cause

    Before tackling any task, we need to understand what business outcome we’re aiming for. A product manager might come to us asking to enlarge the payment button in the shopping cart because users aren’t noticing it. The suggested solution isn’t necessarily bad, but before redesigning the button, we should ask, “What data suggests they aren’t noticing it?” Don’t get me wrong, I’m not saying you shouldn’t trust your product manager. On the contrary, these questions help ensure you’re on the same page and working from the same data.

    From my experience, here are several reasons why users might not be clicking that coveted button:

    Users don’t understand that this step is for payment.
    They understand it’s about payment but expect an order confirmation first.
    Due to an incorrect translation, users don’t understand what the button means.
    Lack of trust signals (no security icons, unclear seller information).
    Unexpected additional costs (hidden fees, shipping) that appear at this stage.
    Technical issues (an inactive button, the page freezing).

    Now, imagine you simply did what the manager suggested. Would you have solved the problem? Hardly. Moreover, the responsibility for the unresolved issue would fall on you, since the interface solution lies within the design domain.
    The product manager actually did their job correctly by identifying a problem: suspiciously few users are clicking the button. Psychologically, taking on this bigger role isn’t easy. It means overcoming the fear of making mistakes and the discomfort of exploring unclear problems rather than just executing tasks. This shift means seeing ourselves as partners who create value — even if it means fighting a hesitation to question product managers (which might come from a fear of speaking up or a desire to avoid challenging authority) — and understanding that proactively using our product-logic expertise is crucial for modern designers.

    There’s another critical reason why we designers need to be a bit like product managers: the rise of AI. I deliberately used a simple example about enlarging a button, but I’m confident that in the near future AI will easily handle routine design tasks. This worries me, but at the same time I’m already gladly stepping into the product manager’s territory: understanding product and business metrics, formulating hypotheses, conducting research, and so on. It might sound like I’m taking work away from PMs, but believe me, they have more than enough on their plates and are usually happy to delegate some responsibilities to designers.

    Reason #3: You’re Solving The Wrong Problem

    Before solving anything, ask whether the problem even deserves your attention. During a major home-screen redesign, our goal was to drive more users into paid services. The initial hypothesis — making service buttons bigger and brighter might help returning users — seemed reasonable enough to test. However, even when A/B tests (comparing two versions of a design to determine which performs better) showed minimal impact, we kept tweaking those buttons. Only later did it click: the home screen isn’t the place to sell; visitors open the app to start, not to buy. We removed that promo block, and nothing broke.
    Contextual entry points deeper in the journey performed brilliantly. Lesson learned: without the right context, any visual tweak is lipstick on a pig.

    Why did we get stuck polishing buttons instead of stopping sooner? It’s easy to get tunnel vision. Psychologically, it’s likely the good old sunk cost fallacy kicking in: we’d already invested time in the buttons, so stopping felt like wasting that effort, even though the data wasn’t promising. It’s simply easier to keep fiddling with something familiar than to admit you need a new plan. The question I should have asked myself when results stalled was: “Are we optimizing the right thing, or just polishing something that fundamentally doesn’t fit the user’s primary goal here?” That alone might have saved hours.

    Reason #4: You’re Drowning In Unactionable Feedback

    We all discuss our work with colleagues. But here’s a crucial point: what kind of question do you pose to kick off that discussion? If your go-to is “What do you think?”, that question can lead you down a rabbit hole of personal opinions rather than actionable insights. Experienced colleagues will cut through the noise, but others, unsure what to evaluate, might comment on anything and everything — fonts, button colors — even when you desperately need to discuss a user flow.

    Two things matter here: the question you ask, and the context you give. That means clearly stating the problem, what you’ve learned, and how your idea aims to fix it. For instance: “The problem is our payment conversion rate has dropped by X%. I’ve interviewed users and found they abandon payment because they don’t understand how the total amount is calculated. My solution is to show a detailed cost breakdown. Do you think this actually solves the problem for them?” Here, you’ve stated the problem (the conversion drop), shared your insight (user confusion), explained your solution (a cost breakdown), and asked a direct question.
    It’s even better if you prepare a list of specific sub-questions. For instance: “Are all items in the cost breakdown clear?” or “Does the placement of this breakdown feel intuitive within the payment flow?” Another good habit is to keep your rough sketches and previous iterations handy. Some of your colleagues’ suggestions may be things you’ve already tried, and it’s great if you can discuss them immediately — either to revisit those ideas or to definitively set them aside.

    I’m not a psychologist, but experience tells me the reluctance to be this specific often stems from a fear of our solution being rejected. We tend to internalize feedback: a seemingly innocent comment like “Have you considered other ways to organize this section?” or “Perhaps explore a different structure for this part?” can instantly morph in our minds into “You completely messed up the structure. You’re a bad designer.” Imposter syndrome, in all its glory. So, to wrap up this point, here are two recommendations:

    Prepare for every design discussion. A couple of focused questions will yield far more valuable input than a vague “So, what do you think?”

    Actively work on separating feedback on your design from your self-worth. If a mistake is pointed out, acknowledge it, learn from it, and you’ll be less likely to repeat it. This is often easier said than done; for me, it took years of working with a psychotherapist. If you struggle with this, I sincerely wish you strength in overcoming it.

    Reason #5: You’re Just Tired

    Sometimes the issue isn’t strategic at all — it’s fatigue. Fussing over icon corners can feel like a cozy bunker when your brain is fried. There’s a name for this: decision fatigue. Your brain’s battery for hard thinking is low, so it hides out in the easy, comfy zone of pixel-pushing.
    A striking example comes from a New York Times article titled “Do You Suffer From Decision Fatigue?” It described how judges deciding on release requests were far more likely to grant release early in the day (about 70% of cases) than late in the day (less than 10%), simply because their decision-making energy was depleted. Designers rarely hold someone’s freedom in their hands, but the example dramatically shows how fatigue can impact judgment and productivity. What helps here:

    Swap tasks. Trade tickets with another designer; novelty resets your focus.

    Talk to another designer. If your NDA permits, ask peers outside the team for a sanity check.

    Step away. Even a ten-minute walk can do more than a double-shot espresso.

    By the way, I came up with these ideas while walking around my office. I was lucky to work near a river, and those short walks quickly turned into a helpful habit. One more trick that helps me snap out of detail mode early: if I catch myself making around twenty little tweaks — changing font weight, color, border radius — I just stop. Over time, it turned into a habit. I have a similar one with Instagram: by the third reel, my brain quietly asks, “Wait, weren’t we working?” Funny how that kind of nudge saves a ton of time.

    Four Steps I Use To Avoid Drowning In Detail

    Knowing these potential traps, here’s the practical process I use to stay on track:

    1. Define the Core Problem & Business Goal
    Before anything, dig deep: what’s the actual problem we’re solving, not just the requested task or a surface-level symptom? Ask “why” repeatedly. What user pain or business need are we addressing? Then state the clear business goal: “What metric am I moving, and do we have data to prove this is the right lever?” If retention is the goal, decide whether push reminders, gamification, or personalised content is the best route. The wrong lever, or tackling a symptom instead of the cause, dooms everything downstream.

    2. Choose the Mechanic (Solution Principle)
    Once the core problem and goal are clear, lock in the solution principle, or “mechanic,” first. Going with a game layer? Decide whether it’s leaderboards, streaks, or badges. Write it down. Then move on. No UI yet. This keeps the focus high-level before diving into pixels.

    3. Wireframe the Flow & Get Focused Feedback
    Now open Figma. Map screens, layout, and transitions. Boxes and arrows are enough. Keep the fidelity low so the discussion stays on the flow, not colour. Crucially, when you share these early wires, ask specific questions and provide clear context (as discussed in Reason #4) to get actionable feedback, not just vague opinions.

    4. Polish the Visuals (Mindfully)
    I only let myself tweak grids, type scales, and shadows after the flow is validated. If progress stalls, or before a major polish effort, I surface the work in a design critique — again with targeted questions and clear context — instead of hiding in version 47. This ensures the detailing serves the now-validated solution.

    Even for something as small as a single button, running these four checkpoints takes about ten minutes and saves hours of decorative dithering.

    Wrapping Up

    Next time you feel the pull to vanish into mock-ups before the problem is nailed down, pause and ask what you might be avoiding — maybe the fuzzy core problem, or just asking for tough feedback. Yes, that can expose an uncomfortable truth, but it gives you the power to face the real issue head-on. It keeps the project focused on solving the right problem, not just perfecting a flawed solution. Attention to detail is a superpower when used at the right moment. Obsessing over pixels too soon, though, is a bad habit and a warning light telling us the process needs a rethink.
  • How to choose a programmatic video advertising platform: 8 considerations

    Whether you’re an advertiser or a publisher, partnering up with the right programmatic video advertising platform is one of the most important business decisions you can make. More than half of U.S. marketing budgets are now devoted to programmatically purchased media, and there’s no indication that trend will reverse any time soon.

    Everybody wants to find the solution that’s best for their bottom line. However, the specific considerations that should go into choosing the right programmatic video advertising solution differ depending on whether you have supply to sell or are looking for an audience for your advertisements. This article will break down key factors for both mobile advertisers and mobile publishers to keep in mind as they search for a programmatic video advertising platform. Before we get into the specifics on either end, let’s recap the basic concepts.

    What is a programmatic video advertising platform?

    A programmatic video advertising platform combines tools, processes, and marketplaces to place video ads from advertising partners in ad placements furnished by publishing partners. The “programmatic” part of the term means that it’s all done procedurally via automated tools, integrating with demand-side platforms and supply-side platforms to allow ad placements to be bid upon, selected, and displayed in fractions of a second. If a mobile game has ever offered you extra rewards for watching a video and you found yourself watching an ad for a related game a split second later, you’ve likely been on the user side of a programmatic advertising transaction. Now let’s take a look at what considerations make for the ideal programmatic video advertising platform for the other two main parties involved.

    4 points to help advertisers choose the best programmatic platform

    Looking for the best way to leverage your video demand-side platform?
    These are four key points for advertisers to consider when trying to find the right programmatic video advertising platform.

    A large, engaged audience

    One of the most important things a programmatic video advertising platform can do for advertisers is put their creative content in front of as many people as possible. However, it’s not enough to just pass your content in front of the most eyeballs. It’s equally important for the platform to give you access to engaged audiences who are more likely to convert, so you can make the most of your advertising dollar.

    Full-screen videos to grab attention

    You need every advantage you can get when you’re grappling for the attention of a busy mobile user. Your video demand-side platform should prioritize full-screen takeovers when and where they make sense, making sure your content isn’t just playing unnoticed on the far side of the screen.

    A range of ad options that are easy to test

    Your programmatic video advertising partner should offer a broad variety of creative and placement options, including interstitial and rewarded ads. It should also enable you to test, iterate, and optimize ads as soon as they’re put into rotation, ensuring your ad spend is meeting your targets and allowing for fast, flexible changes if needed.

    Simple access to supply

    Even the most powerful programmatic video advertising platform is no good if it’s impractical to get running. Look for partners that allow instant access to supply through tried-and-true platforms like Google Display & Video 360, Magnite, and others. On top of that, seek out a private exchange to ensure access to premium inventory.

    4 points for publishers in search of the best programmatic platform

    You work hard to make the best apps for your users, and you deserve to partner with a programmatic video advertising platform that works hard too.
    Serving video ads that keep users engaged and your profits rising can be a tricky needle to thread, but the right platform should make your part of the process simple and effective.

    A large selection of advertisers

    Encountering the same ads over and over again gets old fast — and diminishes engagement. On top of that, a small selection of advertisers means fewer chances for your users to connect with an ad and convert — which means less revenue, too. The ideal programmatic video advertising platform will partner with thousands of advertisers to fill your placements with fresh, engaging content.

    Rewarded videos and offerwalls

    Interstitial video ads aren’t likely to disappear any time soon, but players strongly prefer other forms of advertising. In fact, 76% of US mobile gamers say they prefer rewarded videos over interstitial ads. Giving players the choice of when to watch ads, with the inducement of in-game rewards, can be very powerful — and an offerwall is another strong way to put the ball in your player’s court.

    Easy supply-side SDK integration

    The time your developers spend integrating a new programmatic video advertising solution into your apps is time they could have spent making those apps more engaging for users. While any backend adjustment naturally takes some time to implement, your new programmatic partner should offer a powerful, industry-standard SDK that makes the process fast and non-disruptive.

    Support for programmatic mediation

    Mediators such as LevelPlay by ironSource automatically prioritize ad demand from multiple third-party networks, optimizing your cash flow and reducing work on your end. Your programmatic video advertising platform should integrate seamlessly with mediators to make the most of each ad placement, every time.

    Pick a powerful programmatic partner

    Thankfully, advertisers and publishers alike can choose one solution that checks all the above boxes and more.
For advertisers, the ironSource Programmatic Marketplace will connect you with targeted audiences in thousands of apps that gel with your brand. For publishers, ironSource’s marketplace means a massive selection of ads that your users and your bottom line will love.
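    The bidding flow described above — placements bid upon, selected, and displayed in fractions of a second — can be illustrated with a toy sketch. This is a simplified, hypothetical model: the bidder names, the floor price, and the second-price clearing rule are illustrative assumptions, not the behavior of any specific platform.

```python
from dataclasses import dataclass

@dataclass
class Bid:
    bidder: str   # demand-side platform submitting the bid
    cpm: float    # bid price per thousand impressions

def run_auction(bids, floor_cpm):
    """Pick the winning bid for one ad placement.

    Uses a second-price rule: the highest bidder wins but pays
    the greater of the runner-up's bid and the publisher's floor.
    """
    eligible = [b for b in bids if b.cpm >= floor_cpm]
    if not eligible:
        return None  # placement goes unfilled
    ranked = sorted(eligible, key=lambda b: b.cpm, reverse=True)
    winner = ranked[0]
    clearing = ranked[1].cpm if len(ranked) > 1 else floor_cpm
    return winner.bidder, max(clearing, floor_cpm)

# One rewarded-video placement, three hypothetical DSP bids.
bids = [Bid("dsp_a", 4.20), Bid("dsp_b", 6.50), Bid("dsp_c", 5.10)]
print(run_auction(bids, floor_cpm=3.00))  # ('dsp_b', 5.1)
```

    Real platforms layer targeting, pacing, and fraud checks on top of a loop like this, and mediation tools run a similar prioritization across multiple demand sources for every placement.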
  • Unity Technical VFX Artist at No Brakes Games

    Unity Technical VFX Artist
    No Brakes Games
    Vilnius, Lithuania or Remote (full-time)

    We are No Brakes Games, the creators of Human Fall Flat. We are looking for a Unity Technical VFX Artist to join our team to work on Human Fall Flat 2.

    Role Overview:
    As a Unity Technical Artist specializing in VFX and rendering, you will develop and optimize real-time visual effects, create advanced shaders and materials, and ensure a balance between visual fidelity and performance. You will solve complex technical challenges, optimizing effects for real-time execution while collaborating with artists, designers, and engineers to push Unity’s rendering capabilities to the next level.

    Responsibilities:
    Develop and optimize real-time VFX solutions that are both visually striking and performant.
    Create scalable shaders and materials for water, fog, atmospheric effects, and dynamic lighting.
    Debug and resolve VFX performance bottlenecks using Unity Profiler, RenderDoc, and other tools.
    Optimize particle systems, volumetric effects, and GPU simulations for multi-platform performance.
    Document best practices and educate the team on efficient asset and shader workflows.
    Collaborate with engineers to develop and implement custom rendering solutions.
    Stay updated on Unity’s latest advancements in rendering, HDRP, and the Visual Effect Graph.

    Requirements:
    2+ years of experience as a Technical Artist in game development, with a focus on Unity.
    Strong understanding of Unity’s rendering pipeline and shader development.
    Experience developing performance-conscious visual effects, including particle systems, volumetric lighting, and dynamic environmental effects.
    Proficiency in GPU/CPU optimization techniques and LODs for VFX.
    Hands-on experience with real-time lighting and atmospheric effects.
    Ability to debug and profile complex rendering issues effectively.
    Excellent communication skills and the ability to work collaboratively within a multi-disciplinary team.
    A flexible, R&D-driven mindset, able to iterate quickly in both prototyping and production environments.

    Nice-to-Have:
    Experience working on at least one released game project.
    Experience with Unity HDRP and SRP.
    Experience with multi-platform development.
    Knowledge of C#, Python, or C++ for extending Unity’s capabilities.
    Experience developing custom node-based tools or extending Unity’s Visual Effect Graph.

    Apply today by sending your Portfolio & CV to jobs@nobrakesgames.com
  • Game Dev Digest Issue #286 - Design Tricks, Deep Dives, and more

    This article was originally published on GameDevDigest.com. Enjoy!

    What was Radiant AI, anyway? - A ridiculously deep dive into Oblivion's controversial AI system and its legacy. [blog.paavo.me]
    Consider The Horse Game - No, I don't think every dev should make a horse game. But I do think every developer should at least look at them, maybe even play one, because it is very important that you understand the importance of genre, fandom, and how visibility works. Even if you are not making a horse game, the lessons you can learn by looking at this subgenre apply to other genres too, just not as blatantly clearly as they do with horse games. [howtomarketagame.com]
    Making a killing: The playful 2D terror of Psycasso® - I sat down with lead developer Benjamin Lavender and Omni, designer and producer, to talk about this playfully gory game that gives a classic retro style a fresh (if gruesome) twist. [Unity]
    Introduction to Asset Manager transfer methods in Unity - Unity's Asset Manager is a user-friendly digital asset management platform supporting over 70 file formats to help teams centralize, organize, discover, and use assets seamlessly across projects. It reduces redundant work by design, making cross-team collaboration smoother and accelerating production workflows. [Unity]

    Videos

    Rules of the Game: Five Tricks of Highly Effective Designers - Every working designer has them: unique techniques or "tricks" that they use when crafting gameplay. Sure, there's the general game design wisdom that everyone agrees on and can be found in many a game design book, but experienced game designers often have very specific rules that are personal to them, techniques that not everyone knows about or even agrees with. In this GDC 2015 session, five experienced game designers join the stage for 10 minutes each to share one game design "trick" that they use. [Game Developers Conference]
    Binding of Isaac Style Room Generator in Unity [Full Tutorial] - Our third part in the series: making the rooms! [Game Dev Garnet]
    Introduction to Unity Behavior | Unity Tutorial - In this video you'll become familiar with the core concepts of Unity Behavior, including a live example. [LlamAcademy]
    How I got my demo ready for Steam Next Fest - It's Steam Next Fest, and I've got a game in the showcase. So here are 7 tips for making the most of this demo-sharing festival. [Game Maker's Toolkit]
    Optimizing lighting in Projekt Z: Beyond Order - 314 Arts studio lead and founder Justin Miersch discusses how the team used the Screen Space Global Illumination feature in Unity's High Definition Render Pipeline (HDRP), along with the Unity Profiler and Timeline, to overcome the lighting challenges they faced in building Projekt Z: Beyond Order. [Unity]
    Memory Arenas in Unity: Heap Allocation Without the GC - In this video, we explore how to build a custom memory arena in Unity using unsafe code and manual heap allocation. You'll learn how to allocate raw memory for temporary graph-like structures, such as crafting trees or decision planners, without triggering the garbage collector. We'll walk through the concept of stack frames, translate that to heap-based arena allocation, and implement a fast, disposable system that gives you full control over memory layout and lifetime. Perfect for performance-critical systems where GC spikes aren't acceptable. [git-amend]
    Cloth Animation Using The Compute Shader - In this video, we dive into cloth simulation using OpenGL compute shaders. By applying simple mathematical equations, we'll achieve smooth, dynamic movement. We'll explore particle-based simulation, tackle synchronization challenges with double buffering, and optimize rendering using triangle strips for efficient memory usage. Whether you're familiar with compute shaders or just getting started, this is the perfect way to step up your real-time graphics skills! [OGLDEV]
    How we're designing games for a broader audience - Our games are too hard. [BiteMe Games]

    Assets

    Learn Game Dev - Unity, Godot, Unreal, GameMaker, Blender & C# - Make games like a pro. Passionate about video games? Then start making your own! Our latest bundle will help you learn vital game development skills. Master the most popular creation platforms like Unity, Godot, Unreal, GameMaker, Blender, and C#. Build a 2.5D farming RPG with Unreal Engine, create a micro turn-based RPG in Godot, explore game optimization, and so much more.
    Big Bang Unreal & Unity Asset Packs Bundle - 5000+ unrivaled assets in one bundle. Calling all game devs: build your worlds with this gigantic bundle of over 5000 assets, including realistic and stylized environments, SFX packs, and powerful tools. Perfect for hobbyists, beginners, and professional developers alike, you'll gain access to essential resources, tutorials, and beta-testing-ready content to start building immediately. The experts at Leartes Studios have curated an amazing library packed with value, featuring environments, VFX packs, and tutorial courses on Unreal Engine, Blender, Substance Painter, and ZBrush. Get the assets you need to bring your game to life, and help support One Tree Planted with your purchase! This bundle provides Unity Asset Store keys directly with your purchase, and FAB keys via redemption through Cosmos, if the product is available on those platforms. [Humble Bundle Affiliate]
    Gameplay Tools 50% Off - Core systems, half the price. Get pro-grade tools to power your gameplay: combat, cutscenes, UI, and more. Including HTrace: World Space Global Illumination; VFX Graph - Ultra Mega Pack - Vol.1; Magic Animation Blend; Utility Intelligence: Utility AI Framework for Unity 6; and Build for iOS/macOS on Windows. [Unity Affiliate]
    Hi guys, I created a website about 6 years ago on which I host all my field recordings and foley sounds, all free to download and use under CC0. There are currently 50+ packs with thousands of sounds and hours of field recordings, all perfect for game SFX and UI. - I think game designers can benefit from a wide range of sounds on the site, especially those that enhance immersion and atmosphere. [signaturesounds.org]
    SmartAddresser - Automate addressing, labeling, and version control for Unity's Addressable Asset System. [CyberAgentGameEntertainment, Open Source]
    EasyCS - EasyCS is an easy-to-use and flexible framework for Unity, adopting a Data-Driven Entity & Actor-Component approach. It bridges Unity's classic OOP with powerful data-oriented patterns, without forcing a complete ECS paradigm shift or a mindset change. Build smarter, not harder. [Watcher3056, Open Source]
    Binding-Of-Isaac_Map-Generator - Binding of Isaac map generator for Unity 2D. [GarnetKane99, Open Source]
    Helion - A modern fast-paced Doom FPS engine. [Helion-Engine, Open Source]
    PixelationFx - Pixelation post effect for Unity URP. [NullTale, Open Source]
    Extreme Add-Ons Bundle For Blender & ZBrush - Extraordinary quality, extreme add-ons. Get quality add-ons for Blender and ZBrush with our latest bundle! We've teamed up with the pros at FlippedNormals to deliver a gigantic library of powerful tools for your next game development project. Add new life to your creative work with standout assets like Real-time Hair ZBrush Plugin, Physical Starlight and Atmosphere, Easy Mesh ZBrush Plugin, and more. Get the add-ons you need to bring color and individuality to your next project, and help support Extra Life with your purchase! [Humble Bundle Affiliate]
    Shop up to 50% off Gabriel Aguiar Prod - Publisher Sale - Gabriel Aguiar Prod. is best known for his extensive VFX assets that help many developers prototype and ship games with special effects. His support and educational material are also invaluable resources for the game dev community. PLUS get VFX Graph - Stylized Fire - Vol. 1 for FREE with code GAP2025. [Unity Affiliate]

    Spotlight

    Dream Garden - Dream Garden is a simulation game about building tiny cute garden dioramas. A large selection of tools, plants, decorations and customization awaits you. Try all of them and create your dream garden. [Campfire Studio]
    My game, Call Of Dookie - Demo available on Steam.

    You can subscribe to the free weekly newsletter on GameDevDigest.com. This post includes affiliate links; I may receive compensation if you purchase products or services from the different links provided in this article.
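    The "Memory Arenas in Unity" entry above describes bump-style arena allocation: grab one block of memory up front, hand out pieces by advancing an offset, and release everything at once instead of freeing (or garbage-collecting) objects individually. The video implements this with unsafe C# inside Unity; as a rough, language-neutral sketch of the same idea, here is a minimal arena in C (the names `Arena`, `arena_alloc`, etc. are illustrative, not from the video):

    ```c
    #include <stddef.h>
    #include <stdint.h>
    #include <stdlib.h>

    /* Bump-style arena: one allocation up front, pointer-bump sub-allocations,
     * and a single reset/free for everything at once. `align` must be a power
     * of two. Sketch only; the video's version uses unsafe C# in Unity. */
    typedef struct {
        uint8_t *base;     /* start of the backing block */
        size_t   capacity; /* total bytes available      */
        size_t   offset;   /* bytes handed out so far    */
    } Arena;

    int arena_init(Arena *a, size_t capacity) {
        a->base = malloc(capacity);
        a->capacity = capacity;
        a->offset = 0;
        return a->base != NULL;
    }

    void *arena_alloc(Arena *a, size_t size, size_t align) {
        size_t p = (a->offset + (align - 1)) & ~(align - 1); /* round up  */
        if (p + size > a->capacity) return NULL;             /* exhausted */
        a->offset = p + size;
        return a->base + p;
    }

    void arena_reset(Arena *a)   { a->offset = 0; }   /* recycle in O(1) */
    void arena_destroy(Arena *a) { free(a->base); a->base = NULL; }
    ```

    Resetting the offset recycles the whole arena in constant time, which is what makes this attractive for short-lived graph structures: no per-object frees, and in the Unity case no GC pressure.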
    GAMEDEV.NET
  • CD Projekt RED: TW4 has console first development with a 60fps target; 60fps on Series S will be "extremely challenging"

    DriftingSpirit
    Member

    Oct 25, 2017

    18,563

    They note how they usually start with PC and scale down, but they will be doing it the other way around this time to avoid issues with the console versions.

    4:15 for console focus and 60fps
    38:50 for the Series S comment 

    bsigg
    Member

    Oct 25, 2017

    25,153

    Inside The Witcher 4 Unreal Engine 5 Tech Demo: CD Projekt RED + Epic Deep Dive Interview



    www.resetera.com

     

    Skot
    Member

    Oct 30, 2017

    645

    720p on Series S incoming
     

    Bulby
    Prophet of Truth
    Member

    Oct 29, 2017

    6,006

    Berlin

    I think any Series S user will be happy with a beautiful 900p 30fps
     

    Chronos
    Member

    Oct 27, 2017

    1,249

    This better not be a Cyberpunk situation all over again. If they can't get it to work on S, then they may just need to abandon that console. Work out a deal with MS or wait for their next generation.
     

    HellofaMouse
    Member

    Oct 27, 2017

    8,551

    I wonder if this'll come out before the gen is over?

    Good chance it'll be a 2077 situation: a cross-gen release with a broken PS6 version. 

    logash
    Member

    Oct 27, 2017

    6,526

    This makes sense since they want to have good performance on lower end machines and they mentioned that it was easier to scale up than to scale down. They also mentioned their legacy on PC and how they plan on scaling it up high like they usually do on PC.
     

    KRT
    Member

    Aug 7, 2020

    247

    Series S was a mistake
     

    chris 1515
    Member

    Oct 27, 2017

    7,116

    Barcelona Spain

    The game has ray-traced GI and reflections; it will probably be 30 fps at 600p-720p on Xbox Series S.
     

    bitcloudrzr
    Member

    May 31, 2018

    21,044

    Bulby said:

    I think any series s user will be happy with a beautiful 900p 30fps

     

    Yuuber
    Member

    Oct 28, 2017

    4,540

    KRT said:

    Series S was a mistake


    Can we stop with these stupid takes? For all we know it sold as much as Series X, helped several games have better optimization on bigger consoles and it will definitely help optimizing newer games to the Nintendo Switch 2. 

    MANTRA
    Member

    Feb 21, 2024

    1,198

    No one who cares about 60fps should be buying a Series S, just make it 30fps.
     

    Roytheone
    Member

    Oct 25, 2017

    6,185

    Chronos said:

    This better not be a Cyberpunk situation all over again. If they can't get it to work on S, then they may just need to abandon that console. Work out a deal with MS or wait for their next generation.


    They can just go for 30 fps instead on the Series S. No need for a special deal for that, that's allowed. 

    Matterhorn
    Member

    Feb 6, 2019

    254

    United States

    Hoping for a very nice looking 30fps Switch 2 version.
     

    Universal Acclaim
    Member

    Oct 5, 2024

    2,617

    Maybe off topic, but is a 30fps target not so important anymore for 2027 industry-leading graphics? GTA is mainly doing it for design/physics/etc., which is why the game can't be scaled down to 720-900p/60fps?
     

    chris 1515
    Member

    Oct 27, 2017

    7,116

    Barcelona Spain

    Matterhorn said:

    Hoping for a very nice looking 30fps Switch 2 version.


    It will be a full port a few years later, like The Witcher 3; they don't use software Lumen here. I doubt the Switch 2's ray-tracing capability is high enough to use the same pipeline to produce the Switch 2 version.

    EDIT: And they probably need to redo all the assets.

    https://www.reddit.com/r/FortNiteBR/comments/1l4a1o4/fortnite_on_the_switch_2_looks_great_these_low/

    Fortnite doesn't use Nanite and Lumen on Switch 2. 

    Last edited: Yesterday at 4:18 PM

    bitcloudrzr
    Member

    May 31, 2018

    21,044

    Universal Acclaim said:

    Maybe off topic, but is a 30fps target not so important anymore for 2027 industry-leading graphics? GTA is mainly doing it for design/physics/etc., which is why the graphics can't be scaled down to 720p/60fps?

    Graphics are the part of the game that can be scaled, it is CPU load that is the more difficult part, although devs have actually made cuts in the latter to increase performance mode fps viability. Even with this focus on 60fps performance modes, they are always going to have room to make a higher fidelity 30fps mode. Specifically with UE5 though, performance has been such a disaster all around and Epic seems to be taking it seriously now.
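The scaling argument above can be sketched as a toy frame-time model. Everything here is illustrative (the millisecond costs and the quadratic pixel-cost assumption are made up for the sketch, not measured from any real title): GPU cost falls with resolution, CPU cost does not, so a CPU-bound frame can't be rescued by dropping resolution.

```python
# Toy frame-time model: GPU cost scales with pixel count, CPU cost does not.
# All millisecond figures are illustrative assumptions, not measurements.

def frame_time_ms(cpu_ms, gpu_ms_at_native, res_scale):
    """Frame time when GPU work scales with the square of the resolution scale."""
    gpu_ms = gpu_ms_at_native * res_scale ** 2
    return max(cpu_ms, gpu_ms)  # the frame is limited by the slower processor

BUDGET_60FPS = 1000 / 60  # ~16.7 ms per frame
BUDGET_30FPS = 1000 / 30  # ~33.3 ms per frame

# GPU-bound case: 25 ms GPU at native, 10 ms CPU.
# Dropping to ~70% resolution scale gets under the 60 fps budget.
print(frame_time_ms(10, 25, 0.7))  # 12.25 ms

# CPU-bound case: a 20 ms CPU frame stays at 20 ms at any resolution,
# so 60 fps is out of reach and only a 30 fps mode fits the budget.
print(frame_time_ms(20, 25, 0.5))  # 20 ms
```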
     

    Greywaren
    Member

    Jul 16, 2019

    13,530

    Spain

    60 fps target is fantastic, I wish it was the norm.
     

    julia crawford
    Took the red AND the blue pills
    Member

    Oct 27, 2017

    40,709

    i am very ok with lower fps on the series s, it is far more palatable than severe resolution drops with upscaling artifacts.
     

    Spoit
    Member

    Oct 28, 2017

    5,599

    Chronos said:

    This better not be a Cyberpunk situation all over again. If they can't get it to work on S, then they may just need to abandon that console. Work out a deal with MS or wait for their next generation.


    And yet people keep talking about somehow getting PS6 games to work on the sony portable, which is probably going to be like half as powerful as a PS5, like that won't hold games back
     

    PLASTICA-MAN
    Member

    Oct 26, 2017

    29,563

    chris 1515 said:

    The game has ray-traced GI and reflections; it will probably be 30 fps at 600p-720p on Xbox Series S.

    There is kind of a misconception about how Lumen and hybrid RT are handled in UE5 titles. AO is also part of the ray-traced pipeline through HW Lumen too.
    Just shadows are handled separately from the RT system by using VSM, which in the final look behaves much like RT shadows in shape, the same way FF16 handled shadows that look ray traced while they aren't.
    UE5 can still trace shadows if they want to push things even further. 
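For reference, the split described above (Lumen GI, reflections and AO on the hardware ray-tracing path, direct shadows via Virtual Shadow Maps) maps onto a handful of UE5 console variables. A minimal `DefaultEngine.ini` sketch; cvar names are from UE5's renderer settings, values are illustrative and should be checked against your engine version:

```ini
; Hybrid pipeline sketch: Lumen handles GI, reflections and AO via hardware RT;
; direct shadows come from Virtual Shadow Maps rather than the RT system.
[/Script/Engine.RendererSettings]
r.DynamicGlobalIlluminationMethod=1   ; 1 = Lumen GI (AO folded into Lumen)
r.ReflectionMethod=1                  ; 1 = Lumen reflections
r.Lumen.HardwareRayTracing=1          ; prefer the hardware ray-tracing path
r.Shadow.Virtual.Enable=1             ; direct shadows via VSM, not ray traced
```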

    overthewaves
    Member

    Sep 30, 2020

    1,203

    What about the PS5 handheld?
     

    nullpotential
    Member

    Jun 24, 2024

    87

    KRT said:

    Series S was a mistake


    Consoles were a mistake. 

    GPU
    Member

    Oct 10, 2024

    1,075

    I really dont think Series S/X will be much of a factor by the time this game comes out.
     

    Lashley
    <<Tag Here>>
    Member

    Oct 25, 2017

    65,679

    Just make series s 480p 30fps
     

    pappacone
    Member

    Jan 10, 2020

    4,076

    Greywaren said:

    60 fps target is fantastic, I wish it was the norm.


    It pretty much is
     

    Super
    Studied the Buster Sword
    Member

    Jan 29, 2022

    13,601

    I hope they can pull 60 FPS off in the full game.
     

    Theorry
    Member

    Oct 27, 2017

    69,045

    "target"

    Uh huh. We know how that is gonna go. 

    Jakartalado
    Member

    Oct 27, 2017

    2,818

    São Paulo, Brazil

    Skot said:

    720p on Series S incoming


    If the PS5 is internally at 720p up to 900p, I seriously doubt that. 
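The back-of-envelope math behind these resolution guesses can be made explicit. A hedged sketch (it assumes pixel cost scales linearly with GPU compute and uses the widely quoted figures of roughly 10.28 TFLOPS for PS5 and 4.0 TFLOPS for Series S; real scaling is messier):

```python
# Back-of-envelope: what PS5 internal resolutions imply for Series S
# if the per-pixel GPU budget were equal. Linear pixel/compute scaling
# is an assumption for the sketch, not how real renderers behave.
import math

PS5_TFLOPS = 10.28
SERIES_S_TFLOPS = 4.0

def scaled_height(ps5_height, aspect=16 / 9):
    """Estimate Series S internal height from a PS5 internal height."""
    ps5_pixels = ps5_height * ps5_height * aspect          # width = height * aspect
    s_pixels = ps5_pixels * SERIES_S_TFLOPS / PS5_TFLOPS   # same per-pixel budget
    return math.sqrt(s_pixels / aspect)

print(round(scaled_height(900)))  # a 900p PS5 image maps to roughly 561p
print(round(scaled_height(720)))  # a 720p PS5 image maps to roughly 449p
```

On this naive model, the 600-720p guesses upthread would actually be slightly optimistic at matched frame rate, which is why a 30 fps Series S mode keeps coming up.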

    Revoltoftheunique
    Member

    Jan 23, 2022

    2,312

    It will be unstable 60fps with lots of stuttering.
     

    defaltoption
    Plug in a controller and enter the Konami code
    The Fallen

    Oct 27, 2017

    12,485

    Austin

    KRT said:

    Series S was a mistake


    With that same attitude in this case you could say consoles are the mistake. You on your Series X or PS5 Pro are holding my 5090 back. Not so fun of a take anymore. Thats why its stupid.
     

    Horns
    Member

    Dec 7, 2018

    3,423

    I hope Microsoft drops the requirement for Series S by the time this comes out.
     

    chris 1515
    Member

    Oct 27, 2017

    7,116

    Barcelona Spain

    PLASTICA-MAN said:

    There is kind of a misconception about how Lumen and hybrid RT are handled in UE5 titles. AO is also part of the ray-traced pipeline through HW Lumen too.

    Just shadows are handled separately from the RT system by using VSM, which in the final look behaves much like RT shadows in shape, the same way FF16 handled shadows that look ray traced while they aren't.
    UE5 can still trace shadows if they want to push things even further.

    Yes, indirect shadows are handled by hardware Lumen. But in the end it doesn't change my comment: I think the game will be 600-720p at 30 fps on Series S. 

    bitcloudrzr
    Member

    May 31, 2018

    21,044

    Spoit said:

    And yet people keep talking about somehow getting PS6 games to work on the sony portable, which is probably going to be like half as powerful as a PS5, like that won't hold games back


    Has it been confirmed that Sony is going to have release requirements like the XS?
     

    Commander Shepherd
    Member

    Jan 27, 2023

    173

    Anyone remember when no load screens was talked about for Witcher 3?
     

    chris 1515
    Member

    Oct 27, 2017

    7,116

    Barcelona Spain

    No, this is probably different from what most games are doing: here the main focus is the 60 fps mode, and afterwards they can create balanced and 30 fps modes.

    This is not the other way around. 

    stanman
    Member

    Feb 13, 2025

    235

    defaltoption said:

    With that same attitude in this case you could say consoles are the mistake. You on your Series X or PS5 Pro are holding my 5090 back. Not so fun of a take anymore. Thats why its stupid.


    And your mistake is comparing a PC graphics card to a console. 

    PLASTICA-MAN
    Member

    Oct 26, 2017

    29,563

    chris 1515 said:

    Yes, indirect shadows are handled by hardware Lumen. But in the end it doesn't change my comment: I think the game will be 600-720p at 30 fps on Series S.

    Yes. I am sure Series S will have the HW solution, but probably at 30 FPS. It would be a miracle if they achieve 60 FPS. 

    ArchedThunder
    Uncle Beerus
    Member

    Oct 25, 2017

    21,278

    chris 1515 said:

    It will be a full port a few years later, like The Witcher 3; they don't use software Lumen here. I doubt the Switch 2's ray-tracing capability is high enough to use the same pipeline to produce the Switch 2 version.

    EDIT: And they probably need to redo all the assets.

    https://www.reddit.com/r/FortNiteBR/comments/1l4a1o4/fortnite_on_the_switch_2_looks_great_these_low/

    Fortnite doesn't use Nanite and Lumen on Switch 2.

    Fortnite not using Lumen or Nanite at launch doesn't mean they can't run well on Switch 2. It's a launch port and they prioritized clean IQ and 60fps. I wouldn't be surprised to see them added later. Also, it's not like the ray tracing in a Witcher 3 port has to match PS5; there's a lot of scaling back that can be done with ray tracing without ripping out the kitchen sink. Software Lumen is also likely to be an option on PC.
     

    jroc74
    Member

    Oct 27, 2017

    34,465

    Interesting times ahead....

    bitcloudrzr said:

    Has it been confirmed that Sony is going to have release requirements like the XS?

    You know good n well everything about this rumor has been confirmed.

    /S 

    Derbel McDillet
    ▲ Legend ▲
    Member

    Nov 23, 2022

    25,250

    Chronos said:

    This better not be a Cyberpunk situation all over again. If they can't get it to work on S, then they may just need to abandon that console. Work out a deal with MS or wait for their next generation.


    How does this sound like a Cyberpunk issue? They didn't say they can't get it to work on the S.
     

    defaltoption
    Plug in a controller and enter the Konami code
    The Fallen

    Oct 27, 2017

    12,485

    Austin

    stanman said:

    And your mistake is comparing a PC graphics card to a console.


     

    reksveks
    Member

    May 17, 2022

    7,628

    Horns said:

    I hope Microsoft drops the requirement for Series S by the time this comes out.


    why? dev can make it 30 fps on series s and 60 fps on series x if needed.

    if they aren't or don't have to drop it for gta vi, they probably ain't dropping it for tw4. 

    chris 1515
    Member

    Oct 27, 2017

    7,116

    Barcelona Spain

    defaltoption said:

    With that same attitude in this case you could say consoles are the mistake. You on your Series X or PS5 Pro are holding my 5090 back. Not so fun of a take anymore. Thats why its stupid.


    No, the consoles won't hold back your 5090 because the game is created with hardware Lumen, RT reflections, virtual shadow maps and Nanite plus Nanite vegetation in mind. Maybe Nanite characters too in the final version?

    If the game had been made with software Lumen as the base, it would have held back your 5090...

    Your PC will have much better IQ, framerate and better ray tracing, with MegaLights and better ray-tracing settings in general. 

    bitcloudrzr
    Member

    May 31, 2018

    21,044

    jroc74 said:

    Interesting times ahead....

    You know good n well everything about this rumor has been confirmed.

    /S

    Sony is like the opposite of a platform holder "forcing" adoption, for better or worse.
     

    defaltoption
    Plug in a controller and enter the Konami code
    The Fallen

    Oct 27, 2017

    12,485

    Austin

    chris 1515 said:

    No, the consoles won't hold back your 5090 because the game is created with hardware Lumen, RT reflections, virtual shadow maps and Nanite plus Nanite vegetation in mind. Maybe Nanite characters too in the final version?

    If the game had been made with software Lumen as the base, it would have held back your 5090...

    Your PC will have much better IQ, framerate and better ray tracing, with MegaLights and better ray-tracing settings in general.

    Exactly, the series s is not a "mistake" or holding any version of the game on console or even PC back, that's what I'm saying to the person I replied to, its stupid to say that.
     

    cursed beef
    Member

    Jan 3, 2021

    998

    Have to imagine MS will lift the Series S parity clause when the next consoles launch. Which will be before/around the time W4 hits, right?
     

    Alvis
    Saw the truth behind the copied door
    Member

    Oct 25, 2017

    12,270

    EU

    Chronos said:

    This better not be a Cyberpunk situation all over again. If they can't get it to work on S, then they may just need to abandon that console. Work out a deal with MS or wait for their next generation.


    ? They said that 60 FPS on Series S is challenging, not releasing the game there at all. The game can simply run at 30 FPS on Series S if they can't pull off 60 FPS, or have a 40 FPS mode in lieu of 60 FPS.

    The CPU and storage speed differences between last gen and current gen were gigantic. This isn't even remotely close to a comparable situation. 

    defaltoption
    Plug in a controller and enter the Konami code
    The Fallen

    Oct 27, 2017

    12,485

    Austin

    Misquoted post.
     

    jroc74
    Member

    Oct 27, 2017

    34,465

    defaltoption said:

    With that same attitude in this case you could say consoles are the mistake. You on your Series X or PS5 Pro are holding my 5090 back. Not so fun of a take anymore. Thats why its stupid.


    Ah yes, clearly 5090 cards are the vast majority of the minimum requirements for PC games.

    How can anyone say this with a straight face anymore when there are now PC games running on a Steam Deck.

    At least ppl saying that about the Series S are comparing it to other consoles.

    That said, it is interesting they are focusing on consoles first, then PC. 
    #projekt #red #tw4 #has #console
    CD Projekt RED: TW4 has console first development with a 60fps target; 60fps on Series S will be "extremely challenging"
    DriftingSpirit Member Oct 25, 2017 18,563 They note how they usually start with PC and scale down, but they will be doing it the other way around this time to avoid issues with the console versions. 4:15 for console focus and 60fps 38:50 for the Series S comment  bsigg Member Oct 25, 2017 25,153Inside The Witcher 4 Unreal Engine 5 Tech Demo: CD Projekt RED + Epic Deep Dive Interview www.resetera.com   Skot Member Oct 30, 2017 645 720p on Series S incoming   Bulby Prophet of Truth Member Oct 29, 2017 6,006 Berlin I think think any series s user will be happy with a beautiful 900p 30fps   Chronos Member Oct 27, 2017 1,249 This better not be a Cyberpunk situation all over again. If they can't get it to work on S, then they may just need to abandon that console. Work out a deal with MS or wait for their next generation.   HellofaMouse Member Oct 27, 2017 8,551 i wonder if this'll come out before the gen is over? good chance itll be a 2077 situation, cross-gen release with a broken ps6 version  logash Member Oct 27, 2017 6,526 This makes sense since they want to have good performance on lower end machines and they mentioned that it was easier to scale up than to scale down. They also mentioned their legacy on PC and how they plan on scaling it up high like they usually do on PC.   KRT Member Aug 7, 2020 247 Series S was a mistake   chris 1515 Member Oct 27, 2017 7,116 Barcelona Spain The game have raytracing GI and reflection it will probably be 30 fps 600p-720p on Xbox Series S.   bitcloudrzr Member May 31, 2018 21,044 Bulby said: I think think any series s user will be happy with a beautiful 900p 30fps Click to expand... Click to shrink...   Yuuber Member Oct 28, 2017 4,540 KRT said: Series S was a mistake Click to expand... Click to shrink... Can we stop with these stupid takes? For all we know it sold as much as Series X, helped several games have better optimization on bigger consoles and it will definitely help optimizing newer games to the Nintendo Switch 2.  
MANTRA Member Feb 21, 2024 1,198 No one who cares about 60fps should be buying a Series S, just make it 30fps.   Roytheone Member Oct 25, 2017 6,185 Chronos said: This better not be a Cyberpunk situation all over again. If they can't get it to work on S, then they may just need to abandon that console. Work out a deal with MS or wait for their next generation. Click to expand... Click to shrink... They can just go for 30 fps instead on the Series S. No need for a special deal for that, that's allowed.  Matterhorn Member Feb 6, 2019 254 United States Hoping for a very nice looking 30fps Switch 2 version.   Universal Acclaim Member Oct 5, 2024 2,617 Maybe off topic, but is 30fps target not so important anymore for 2027 industry-leading graphics? GTA is mainly doing it for design/physics/etc. whch is why the game can't be scaled down to 720-900p/60fps?   chris 1515 Member Oct 27, 2017 7,116 Barcelona Spain Matterhorn said: Hoping for a very nice looking 30fps Switch 2 version. Click to expand... Click to shrink... It will be a full port a few years after like The Witcher 3., they don't use software lumen here. I doubt the Switch 2 Raytracing capaclity is high enough to use the same pipeline to produce the Switch 2 version. EDIT: And they probably need to redo all the assets. / Fortnite doesn't use Nanite and Lumen on Switch 2.  Last edited: Yesterday at 4:18 PM bitcloudrzr Member May 31, 2018 21,044 Universal Acclaim said: Maybe off topic, but is 30fps target not so important anymore for 2027 industry-leading graphics? GTA is mainly doing it for design/physics/etc. whch is why the graphics can't be scaled down to 720p/60fps? Click to expand... Click to shrink... Graphics are the part of the game that can be scaled, it is CPU load that is the more difficult part, although devs have actually made cuts in the latter to increase performance mode fps viability. 
Even with this focus on 60fps performance modes, they are always going to have room to make a higher fidelity 30fps mode. Specifically with UE5 though, performance has been such a disaster all around and Epic seems to be taking it seriously now.   Greywaren Member Jul 16, 2019 13,530 Spain 60 fps target is fantastic, I wish it was the norm.   julia crawford Took the red AND the blue pills Member Oct 27, 2017 40,709 i am very ok with lower fps on the series s, it is far more palatable than severe resolution drops with upscaling artifacts.   Spoit Member Oct 28, 2017 5,599 Chronos said: This better not be a Cyberpunk situation all over again. If they can't get it to work on S, then they may just need to abandon that console. Work out a deal with MS or wait for their next generation. Click to expand... Click to shrink... And yet people keep talking about somehow getting PS6 games to work on the sony portable, which is probably going to be like half as powerful as a PS5, like that won't hold games back   PLASTICA-MAN Member Oct 26, 2017 29,563 chris 1515 said: The game have raytracing GI and reflection it will probably be 30 fps 600p-720p on Xbox Series S. Click to expand... Click to shrink... There is kinda a misconception of how Lumen and the hybrid RT is handled in UE5 titles. AO is also part of the ray traced pipeline through the HW Lumen too. Just shadows are handled separately from the RT system by using VSM which in final look behvae quite like RT shadows in shape, same how FF16 handled the shadows looking like RT ones while it isn't traced. UE5 can still trace shadows if they want to push things even further.  overthewaves Member Sep 30, 2020 1,203 What about the PS5 handheld?   nullpotential Member Jun 24, 2024 87 KRT said: Series S was a mistake Click to expand... Click to shrink... Consoles were a mistake.  GPU Member Oct 10, 2024 1,075 I really dont think Series S/X will be much of a factor by the time this game comes out.   
Lashley <<Tag Here>> Member Oct 25, 2017 65,679 Just make series s 480p 30fps   pappacone Member Jan 10, 2020 4,076 Greywaren said: 60 fps target is fantastic, I wish it was the norm. Click to expand... Click to shrink... It pretty much is   Super Studied the Buster Sword Member Jan 29, 2022 13,601 I hope they can pull 60 FPS off in the full game.   Theorry Member Oct 27, 2017 69,045 "target" Uh huh. We know how that is gonna go.  Jakartalado Member Oct 27, 2017 2,818 São Paulo, Brazil Skot said: 720p on Series S incoming Click to expand... Click to shrink... If the PS5 is internally at 720p up to 900p, I seriously doubt that.  Revoltoftheunique Member Jan 23, 2022 2,312 It will be unstable 60fps with lots of stuttering.   defaltoption Plug in a controller and enter the Konami code The Fallen Oct 27, 2017 12,485 Austin KRT said: Series S was a mistake Click to expand... Click to shrink... With that same attitude in this case you could say consoles are the mistake. You on your Series X or PS5 Pro are holding my 5090 back. Not so fun of a take anymore. Thats why its stupid.   Horns Member Dec 7, 2018 3,423 I hope Microsoft drops the requirement for Series S by the time this comes out.   chris 1515 Member Oct 27, 2017 7,116 Barcelona Spain PLASTICA-MAN said: There is kinda a misconception of how Lumen and the hybrid RT is handled in UE5 titles. AO is also part of the ray traced pipeline through the HW Lumen too. Just shadows are handled separately from the RT system by using VSM which in final look behvae quite like RT shadows in shape, same how FF16 handled the shadows looking like RT ones while it isn't traced. UE5 can still trace shadows if they want to push things even further. Click to expand... Click to shrink... Yes indirect shadows are handled by hardware lumen. But at the end ot doesn¡t change my comment. i think the game will be 600´720p at 30 fps on Series S.  
bitcloudrzr Member May 31, 2018 21,044 Spoit said: And yet people keep talking about somehow getting PS6 games to work on the sony portable, which is probably going to be like half as powerful as a PS5, like that won't hold games back Click to expand... Click to shrink... Has it been confirmed that Sony is going to have release requirements like the XS?   Commander Shepherd Member Jan 27, 2023 173 Anyone remember when no load screens was talked about for Witcher 3?   chris 1515 Member Oct 27, 2017 7,116 Barcelona Spain No this is probably different than most game are doing it here the main focus is the 60 fps mode and after they can create a balancedand 30 fps mode. This is not the other way around.  stanman Member Feb 13, 2025 235 defaltoption said: With that same attitude in this case you could say consoles are the mistake. You on your Series X or PS5 Pro are holding my 5090 back. Not so fun of a take anymore. Thats why its stupid. Click to expand... Click to shrink... And your mistake is comparing a PC graphics card to a console.  PLASTICA-MAN Member Oct 26, 2017 29,563 chris 1515 said: Yes indirect shadows are handled by hardware lumen. But at the end ot doesn¡t change my comment. i think the game will be 600´720p at 30 fps on Series S. Click to expand... Click to shrink... Yes. I am sure Series S will have HW solution but probably at 30 FPS. that would be a miracle if they achieve 60 FPS.  ArchedThunder Uncle Beerus Member Oct 25, 2017 21,278 chris 1515 said: It will be a full port a few years after like The Witcher 3., they don't use software lumen here. I doubt the Switch 2 Raytracing capaclity is high enough to use the same pipeline to produce the Switch 2 version. EDIT: And they probably need to redo all the assets. / Fortnite doesn't use Nanite and Lumen on Switch 2. Click to expand... Click to shrink... Fortnite not using Lumen or Nanite at launch doesn't mean they can't run well on Switch 2. It's a launch port and they prioritized clean IQ and 60fps. 
I wouldn't be surprised to see them added later. Also it's not like the ray tracing in a Witcher 3 port has to match PS5, there's a lot of scaling back that can be done with ray tracing without ripping out the kitchen sink. Software lumen is also likely to be an option on P.   jroc74 Member Oct 27, 2017 34,465 Interesting times ahead.... bitcloudrzr said: Has it been confirmed that Sony is going to have release requirements like the XS? Click to expand... Click to shrink... Your know good n well everything about this rumor has been confirmed. /S  Derbel McDillet ▲ Legend ▲ Member Nov 23, 2022 25,250 Chronos said: This better not be a Cyberpunk situation all over again. If they can't get it to work on S, then they may just need to abandon that console. Work out a deal with MS or wait for their next generation. Click to expand... Click to shrink... How does this sound like a Cyberpunk issue? They didn't say they can't get it to work on the S.   defaltoption Plug in a controller and enter the Konami code The Fallen Oct 27, 2017 12,485 Austin stanman said: And your mistake is comparing a PC graphics card to a console. Click to expand... Click to shrink...   reksveks Member May 17, 2022 7,628 Horns said: I hope Microsoft drops the requirement for Series S by the time this comes out. Click to expand... Click to shrink... why? dev can make it 30 fps on series s and 60 fps on series x if needed. if they aren't or don't have to drop it for gta vi, they probably ain't dropping it for tw4.  chris 1515 Member Oct 27, 2017 7,116 Barcelona Spain defaltoption said: With that same attitude in this case you could say consoles are the mistake. You on your Series X or PS5 Pro are holding my 5090 back. Not so fun of a take anymore. Thats why its stupid. Click to expand... Click to shrink... No the consoles won't hold back your 5090 because the game is created with hardware lumen, RT reflection, virtual shadows maps and Nanite plus Nanite vegetation in minds. 
Maybe Nanite character too in final version? If the game was made with software lumen as the base it would have holding back your 5090... Your PC will have much better IQ, framerate and better raytracing with Megalightand better raytracing settings in general.  bitcloudrzr Member May 31, 2018 21,044 jroc74 said: Interesting times ahead.... Your know good n well everything about this rumor has been confirmed. /S Click to expand... Click to shrink... Sony is like the opposite of a platform holder "forcing" adoption, for better or worse.   defaltoption Plug in a controller and enter the Konami code The Fallen Oct 27, 2017 12,485 Austin chris 1515 said: No the consoles won't hold back yout 5090 because the game is created with hardware lumen, RT reflection, virtual shadows maps and Nanite plus Nanite vegetation in minds. Maybe Nanite character too in final version? If the game was made with software lumen as the base it would have holding back your 5090... Your PC will have much better IQ, framerate and better raytracing with Megalightand better raytracing settings in general. Click to expand... Click to shrink... Exactly, the series s is not a "mistake" or holding any version of the game on console or even PC back, that's what I'm saying to the person I replied to, its stupid to say that.   cursed beef Member Jan 3, 2021 998 Have to imagine MS will lift the Series S parity clause when the next consoles launch. Which will be before/around the time W4 hits, right?   Alvis Saw the truth behind the copied door Member Oct 25, 2017 12,270 EU Chronos said: This better not be a Cyberpunk situation all over again. If they can't get it to work on S, then they may just need to abandon that console. Work out a deal with MS or wait for their next generation. Click to expand... Click to shrink... ? they said that 60 FPS on Series S is challenging, not the act of releasing the game there at all. The game can simply run at 30 FPS on Series S if they can't pull off 60 FPS. 
Or have a 40 FPS mode in lieu of 60 FPS. The CPU and storage speed differences between last gen and current gen were gigantic. This isn't even remotely close to a comparable situation.  defaltoption Plug in a controller and enter the Konami code The Fallen Oct 27, 2017 12,485 Austin misqoute post   jroc74 Member Oct 27, 2017 34,465 defaltoption said: With that same attitude in this case you could say consoles are the mistake. You on your Series X or PS5 Pro are holding my 5090 back. Not so fun of a take anymore. Thats why its stupid. Click to expand... Click to shrink... Ah yes, clearly 5090 cards are the vast majority of the minimum requirements for PC games. How can anyone say this with a straight face anymore when there are now PC games running on a Steam Deck. At least ppl saying that about the Series S are comparing it to other consoles. That said, it is interesting they are focusing on consoles first, then PC.  #projekt #red #tw4 #has #console
    WWW.RESETERA.COM
    CD Projekt RED: TW4 has console first development with a 60fps target; 60fps on Series S will be "extremely challenging"
    DriftingSpirit: They note how they usually start with PC and scale down, but they will be doing it the other way around this time to avoid issues with the console versions. 4:15 for the console focus and 60fps; 38:50 for the Series S comment.

    bsigg: [DF] Inside The Witcher 4 Unreal Engine 5 Tech Demo: CD Projekt RED + Epic Deep Dive Interview https://www.youtube.com/watch?v=OplYN2MMI4Q

    Skot: 720p on Series S incoming.

    Bulby: I think any Series S user will be happy with a beautiful 900p 30fps.

    Chronos: This better not be a Cyberpunk situation all over again. If they can't get it to work on the S, then they may just need to abandon that console. Work out a deal with MS or wait for their next generation.

    HellofaMouse: I wonder if this'll come out before the gen is over? Good chance it'll be a 2077 situation: a cross-gen release with a broken PS6 version.

    logash: This makes sense, since they want good performance on lower-end machines and they mentioned it was easier to scale up than to scale down. They also mentioned their legacy on PC and how they plan on scaling it up high like they usually do on PC.

    KRT: Series S was a mistake.

    chris 1515: The game has ray-traced GI and reflections; it will probably be 30 fps at 600p-720p on Xbox Series S.

    bitcloudrzr: (quotes Bulby's post without further comment)

    Yuuber (replying to KRT): Can we stop with these stupid takes? For all we know it sold as much as the Series X, helped several games get better optimization on the bigger consoles, and it will definitely help with optimizing newer games for the Nintendo Switch 2.

    MANTRA: No one who cares about 60fps should be buying a Series S; just make it 30fps.

    Roytheone (replying to Chronos): They can just go for 30 fps instead on the Series S. No need for a special deal for that; that's allowed.

    Matterhorn: Hoping for a very nice-looking 30fps Switch 2 version.

    Universal Acclaim: Maybe off topic, but is a 30fps target not so important anymore for 2027 industry-leading graphics? GTA is mainly doing it for design/physics/etc., which is why the game can't be scaled down to 720-900p/60fps?

    chris 1515 (replying to Matterhorn): It will be a full port a few years later, like The Witcher 3; they don't use software Lumen here. I doubt the Switch 2's ray tracing capability is high enough to use the same pipeline to produce the Switch 2 version. EDIT: And they probably need to redo all the assets. https://www.reddit.com/r/FortNiteBR/comments/1l4a1o4/fortnite_on_the_switch_2_looks_great_these_low/ Fortnite doesn't use Nanite and Lumen on Switch 2.

    bitcloudrzr (replying to Universal Acclaim): Graphics are the part of the game that can be scaled; it is CPU load that is the more difficult part, although devs have actually made cuts in the latter to increase performance-mode fps viability. Even with this focus on 60fps performance modes, they are always going to have room to make a higher-fidelity 30fps mode. Specifically with UE5, though, performance has been such a disaster all around, and Epic seems to be taking it seriously now.

    Greywaren: A 60 fps target is fantastic; I wish it was the norm.

    julia crawford: I am very OK with lower fps on the Series S; it is far more palatable than severe resolution drops with upscaling artifacts.

    Spoit (replying to Chronos): And yet people keep talking about somehow getting PS6 games to work on the Sony portable, which is probably going to be like half as powerful as a PS5, like that won't hold games back.

    PLASTICA-MAN (replying to chris 1515): There is kind of a misconception about how Lumen and hybrid RT are handled in UE5 titles. AO is also part of the ray-traced pipeline through hardware Lumen. Only shadows are handled separately from the RT system, using VSM, which in the final look behaves quite like RT shadows in shape, much as FF16's shadows look ray traced without actually being traced. UE5 can still trace shadows if they want to push things even further.

    overthewaves: What about the PS5 handheld?

    nullpotential (replying to KRT): Consoles were a mistake.

    GPU: I really don't think Series S/X will be much of a factor by the time this game comes out.

    Lashley: Just make the Series S 480p 30fps.

    pappacone (replying to Greywaren): It pretty much is.

    Super: I hope they can pull 60 FPS off in the full game.

    Theorry: "Target." Uh huh. We know how that is gonna go.

    Jakartalado (replying to Skot): If the PS5 is internally at 720p up to 900p, I seriously doubt that.

    Revoltoftheunique: It will be an unstable 60fps with lots of stuttering.

    defaltoption (replying to KRT): With that same attitude, in this case you could say consoles are the mistake. You on your Series X or PS5 Pro are holding my 5090 back. Not so fun of a take anymore. That's why it's stupid.

    Horns: I hope Microsoft drops the requirement for Series S by the time this comes out.

    chris 1515 (replying to PLASTICA-MAN): Yes, indirect shadows are handled by hardware Lumen. But in the end it doesn't change my comment; I think the game will be 600-720p at 30 fps on Series S.

    bitcloudrzr (replying to Spoit): Has it been confirmed that Sony is going to have release requirements like the XS?

    Commander Shepherd: Anyone remember when no load screens were talked about for Witcher 3?

    chris 1515: No, this is probably different from how most games do it; here the main focus is the 60 fps mode, and afterwards they can create balanced (40 fps) and 30 fps modes, not the other way around.

    stanman (replying to defaltoption): And your mistake is comparing a PC graphics card to a console.

    PLASTICA-MAN (replying to chris 1515): Yes. I am sure the Series S will have a hardware solution, but probably at 30 FPS. It would be a miracle if they achieve 60 FPS.

    ArchedThunder (replying to chris 1515): Fortnite not using Lumen or Nanite at launch doesn't mean they can't run well on Switch 2. It's a launch port, and they prioritized clean IQ and 60fps. I wouldn't be surprised to see them added later. Also, it's not like the ray tracing in a Witcher 3 port has to match the PS5; there's a lot of scaling back that can be done with ray tracing without ripping out the kitchen sink. Software Lumen is also likely to be an option on PC.

    jroc74: Interesting times ahead... (replying to bitcloudrzr) You know good and well everything about this rumor has been confirmed. /S

    Derbel McDillet (replying to Chronos): How does this sound like a Cyberpunk issue? They didn't say they can't get it to work on the S.

    defaltoption: (quotes stanman's post without further comment)

    reksveks (replying to Horns): Why? The dev can make it 30 fps on Series S and 60 fps on Series X if needed. If they aren't dropping it, or don't have to, for GTA VI, they probably ain't dropping it for TW4.

    chris 1515 (replying to defaltoption): No, the consoles won't hold back your 5090, because the game is created with hardware Lumen, RT reflections, virtual shadow maps, and Nanite plus Nanite vegetation in mind. Maybe Nanite characters too in the final version? If the game had been made with software Lumen as the base, it would have held back your 5090. Your PC will have much better IQ, framerate, and better ray tracing with MegaLights (direct ray-traced shadows with tons of light sources) and better ray tracing settings in general.

    bitcloudrzr (replying to jroc74): Sony is like the opposite of a platform holder "forcing" adoption, for better or worse.

    defaltoption (replying to chris 1515): Exactly. The Series S is not a "mistake" and isn't holding any version of the game on console or even PC back; that's what I'm saying to the person I replied to. It's stupid to say that.

    cursed beef: Have to imagine MS will lift the Series S parity clause when the next consoles launch. Which will be before/around the time W4 hits, right?

    Alvis (replying to Chronos): ? They said that 60 FPS on Series S is challenging, not releasing the game there at all. The game can simply run at 30 FPS on Series S if they can't pull off 60 FPS, or have a 40 FPS mode in lieu of 60 FPS. The CPU and storage speed differences between last gen and current gen were gigantic. This isn't even remotely close to a comparable situation.

    defaltoption: Misquoted post.

    jroc74 (replying to defaltoption): Ah yes, clearly 5090 cards are the vast majority of the minimum requirements for PC games. How can anyone say this with a straight face anymore when there are now PC games running on a Steam Deck? At least people saying that about the Series S are comparing it to other consoles. That said, it is interesting they are focusing on consoles first, then PC.
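    The 30/40/60 fps modes debated in the thread come down to per-frame time budgets and display pacing. A minimal sketch of the arithmetic (illustrative Python, not from the thread; the 120 Hz figure is the standard refresh rate 40 fps modes target):

    ```python
    # Frame-time budgets for common console performance modes.
    # Illustrative arithmetic only.

    def frame_budget_ms(fps: int) -> float:
        """Milliseconds available to produce one frame at a given frame rate."""
        return 1000.0 / fps

    def refreshes_per_frame(fps: int, display_hz: int = 120) -> float:
        """How many display refreshes each game frame spans."""
        return display_hz / fps

    for fps in (30, 40, 60):
        print(f"{fps} fps: {frame_budget_ms(fps):.2f} ms per frame, "
              f"{refreshes_per_frame(fps):.1f} refreshes on a 120 Hz panel")
    ```

    This is why 40 fps "balanced" modes are typically gated behind 120 Hz displays: 40 divides 120 evenly (exactly 3 refreshes per frame), giving even pacing, while granting 25 ms of frame budget instead of the 16.67 ms available at 60 fps.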
  • From Networks to Business Models, AI Is Rewiring Telecom

    Artificial intelligence is already rewriting the rules of wireless and telecom — powering predictive maintenance, streamlining network operations, and enabling more innovative services.
    As AI scales, the disruption will be faster, deeper, and harder to reverse than any prior shift in the industry.
    Compared to the sweeping changes AI is set to unleash, past telecom innovations look incremental.
    AI is redefining how networks operate, services are delivered, and data is secured — across every device and digital touchpoint.
    AI Is Reshaping Wireless Networks Already
    Artificial intelligence is already transforming wireless through smarter private networks, fixed wireless access, and intelligent automation across the stack.
    AI detects and resolves network issues before they impact service, improving uptime and customer satisfaction. It’s also opening the door to entirely new revenue streams and business models.
    Each wireless generation brought new capabilities. AI, however, marks a more profound shift — networks that think, respond, and evolve in real time.
    AI Acceleration Will Outpace Past Tech Shifts
    Many may underestimate the speed and magnitude of AI-driven change.
    The shift from traditional voice and data systems to AI-driven network intelligence is already underway.
    Although predictions abound, the true scope remains unclear.
    It’s tempting to assume we understand AI’s trajectory, but history suggests otherwise.

    Today, AI is already automating maintenance and optimizing performance without user disruption. The technologies we’ll rely on in the near future may still be on the drawing board.
    Few predicted that smartphones would emerge from analog beginnings—a reminder of how quickly foundational technologies can be reimagined.
    History shows that disruptive technologies rarely follow predictable paths — and AI is no exception. It’s already upending business models across industries.
    Technological shifts bring both new opportunities and complex trade-offs.
    AI Disruption Will Move Faster Than Ever
    The same cycle of reinvention is happening now — but with AI, it’s moving at unprecedented speed.
    Despite all the discussion, many still treat AI as a future concern — yet the shift is already well underway.
    As with every major technological leap, there will be gains and losses. The AI transition brings clear trade-offs: efficiency and innovation on one side, job displacement and privacy erosion on the other.
    Unlike past tech waves, which unfolded over decades, the AI shift will reshape industries in just a few years — and the pace shows no sign of slowing.
    AI Will Reshape All Sectors and Companies
    This shift will unfold faster than most organizations or individuals are prepared to handle.
    Today’s industries will likely look very different tomorrow. Entirely new sectors will emerge as legacy models become obsolete — redefining market leadership across industries.
    Telecom’s past holds a clear warning: market dominance can vanish quickly when companies ignore disruption.
    After the 1984 breakup of the Bell System, the Baby Bells eventually moved into long-distance service, while AT&T remained barred from selling local access — undermining its advantage.
    As the market shifted and competitors gained ground, AT&T lost its dominance and became vulnerable enough that SBC, a former regional Bell, acquired it and took on its name.

    It’s a case study of how incumbents fall when they fail to adapt — precisely the kind of pressure AI is now exerting across industries.
    SBC’s acquisition of AT&T flipped the power dynamic — proof that size doesn’t protect against disruption.
    The once-crowded telecom field has consolidated into just a few dominant players — each facing new threats from AI-native challengers.
    Legacy telecom models are being steadily displaced by faster, more flexible wireless, broadband, and streaming alternatives.
    No Industry Is Immune From AI Disruption
    AI will accelerate the next wave of industrial evolution — bringing innovations and consequences we’re only beginning to grasp.
    New winners will emerge as past leaders struggle to hang on — a shift that will also reshape the investment landscape. Startups leveraging AI will likely redefine leadership in sectors where incumbents have grown complacent.
    Nvidia’s rise is part of a broader trend: the next market leaders will emerge wherever AI creates a clear competitive advantage — whether in chips, code, or entirely new markets.
    The AI-driven future is arriving faster than most organizations are ready for. Adapting to this accelerating wave of change is no longer optional — it’s essential. Companies that act decisively today will define the winners of tomorrow.