
The end of design certainty
Why AI forces us to embrace emergence instead of clinging to control and understanding

[Image made by Patrick Morgan with Midjourney]

Anthropic CEO Dario Amodei's recent podcast with Lex Fridman caught my attention with an observation that keeps replaying in my mind. Even as his team works to understand and interpret AI models, he acknowledged a surprising truth:

"There's no reason why [AI models] should be designed for us to understand them, right? They're designed to operate, they're designed to work. Just like the human brain or human biochemistry. They're not designed for a human to open up the hatch, look inside and understand them."

This statement gave me pause. It challenges something deeply ingrained from the last generation of software design: the idea that we must fully understand how something works in order to consider it a valid solution. For years, we've doubled down on the belief that data-driven design can eliminate uncertainty. The idea is that to make something effective, we first have to deconstruct it, grasp every nuance, and then carefully engineer it into existence.

But there's an irony here: as we've become more obsessed with data-driven certainty, we've invented systems that operate beyond our ability to fully analyze or predict. AI models demonstrate that sometimes the most powerful solutions emerge from patterns we can observe but not fully explain.

AI flips the script. It wasn't designed for human comprehension; it was designed to work. Understanding, if it comes at all, is often after the fact, something we piece together after we see that the system works. And that realization has forced me to rethink my assumptions about design and invention for this new era.

Rethinking solutions in search of problems

Traditional design wisdom views "solutions in search of problems" as a criticism. It runs counter to everything we're taught about being problem-focused and user-centered. But as Anthropic's head of product design Joel Lewenstein observed in a recent Dive Club podcast with Michael Riddering: "I've come to see 'solutions in search of a problem' as not a dirty word at all, as long as you just lean into it and state your assumptions, saying: look, there's the germ of something here and we're going to explore it."

This isn't about abandoning user-centered design; it's about recognizing that with AI, understanding often emerges through exploration. The technology's capabilities are so novel that even those working on the frontier don't know what's possible until they see it.

This shift isn't limited to Anthropic; it's happening across leading AI companies. Inspired by a recent conversation with Perplexity's head of design Henry Modisett, Linear CEO Karri Saarinen commented: "At Perplexity they start projects by exploring LLM capabilities with very simple prototypes, even with just a command-line implementation. Only once there's proof that the idea can work consistently, and that they can bend it to do what they want, do they start designing the experience. Normally, you start with design to explore possibilities, and the tech follows. But in this domain, or this new era of software, LLM/AI is the tool for exploration, and design comes after."
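To make that "command line first" workflow concrete, here is a minimal sketch of what such a throwaway probe might look like. It assumes the Anthropic Python SDK and an ANTHROPIC_API_KEY environment variable; the model name, prompt, and script name are placeholders of mine, not anything Perplexity has described. The point is only that a couple of dozen lines in a terminal are enough to feel out whether a model can do the job consistently, long before any interface exists.

```python
# cli_probe.py - throwaway command-line probe for an LLM-backed idea.
# Assumes the Anthropic Python SDK (`pip install anthropic`) and an
# ANTHROPIC_API_KEY environment variable; any provider would do.
import sys

import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment


def probe(task: str) -> str:
    """Send one task to the model and return its raw text response."""
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # placeholder model name
        max_tokens=512,
        messages=[{"role": "user", "content": task}],
    )
    return response.content[0].text


if __name__ == "__main__":
    # Run the same prompt a few times to see how consistent the behavior is
    # before investing anything in interface design.
    task = " ".join(sys.argv[1:]) or "Summarize this article in three bullets."
    for attempt in range(3):
        print(f"--- attempt {attempt + 1} ---")
        print(probe(task))
```

Only once a loop like this proves the idea "can work consistently," in Saarinen's phrasing, does it make sense to start designing the experience around it.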
Invention comes before understanding

Historically, software design has followed a structured, step-by-step approach, one where every phase is carefully planned to produce a predictable outcome:

1. Understand the problem deeply.
2. Define a precise solution.
3. Craft an experience that is intentional and predictable.
4. Ship a finished product that behaves exactly as expected.

But this isn't how many of the most transformative inventions have come about. If you look at breakthroughs across history, the process is always messy and often reversed:

1. Have an intent, an idea of what you're trying to achieve.
2. Experiment, iterate, and push forward without much clarity.
3. Uncover an unexpected breakthrough: it works, but not how you thought.
4. Study the breakthrough, refine it, and later figure out why it works.

This pattern of discovery before understanding runs deep. Alexander Fleming didn't intend to discover penicillin; he noticed something unexpected in his experiment and followed the thread. The steam engine was a product of tinkering; Newcomen and Watt refined working models decades before scientists understood the laws of thermodynamics that made them possible. Early radio pioneers transmitted signals across great distances without fully understanding the physics of electromagnetic waves. And the use of anesthesia in surgery revolutionized medicine long before scientists figured out its precise mechanism of action.

AI amplifies this historical pattern

AI doesn't just follow this pattern; it speeds it up.

Traditional software is deterministic; AI is probabilistic. It doesn't follow rigid rules; it generates outputs based on patterns and likelihoods we can observe but not fully predict. The technology itself resists complete upfront understanding.

The challenge is that many designers, engineers, and product teams are still trying to apply legacy design methodologies to a technology that simply doesn't work that way. AI doesn't respect our craving for certainty. It doesn't wait for us to fully understand it before showing results. And the more we try to force it into rigid, explainable, deterministic workflows, the more we suffocate its potential.

Why design struggles to let go, but why it should

This has forced me to confront my own biases. I've spent a decade designing software, and the instinct to make things fully understood before they exist is deeply ingrained. It's particularly rooted in the culture of UX design: this idea that we can't build effectively unless we first have a full grasp of what we're making.

AI challenges this instinct and asks us to revise our beliefs. It requires us to lean into the ambiguity, to design before we fully understand, and to shape the raw materials of generative outputs as useful options emerge. As Lewenstein notes: "You can talk about AI and you can write about AI, but there's something just so powerful about seeing a working prototype and feeling the dynamic, stochastic nature of it: seeing a website get rendered in real time, iterating on it and seeing it change in front of you. It's just magical." Understanding comes through doing, through making something tangible that we can respond to and refine.

Confronting this tension reminds me of the Daoist concept of Wu Wei, often translated as "effortless action" or "without force." It's the idea that instead of rigidly trying to control every element of a process, we should move with the natural flow of things, guiding and shaping rather than imposing.
Wu Wei isn't passivity; it's about working with forces rather than against them. In AI design, this means crafting interactions where users guide and shape outcomes, rather than micromanaging every detail. It's like how a surfer harnesses a wave's energy rather than trying to control the ocean.

Design must guide, not control

So what does it look like to design without force? Instead of suppressing the unknown, we embrace it as part of the process.

- We create affordances, not strict controls: building interfaces that guide behavior rather than dictate it. Instead of trying to expose every parameter, we need interfaces that let users navigate AI while embracing its variability.
- We prioritize steerability over explainability: giving users meaningful, intuitive ways to shape AI's behavior without needing to understand its internals. The goal isn't to make the black box transparent, but to make it controllable at the right level of abstraction.
- We embrace emergence: designing systems that adapt and evolve, rather than ones that are locked into rigid, pre-defined behaviors. This means creating spaces where unexpected capabilities can surface and be refined through use.

This doesn't mean that understanding is unimportant. But it does mean we should be wary of overprioritizing upfront understanding at the cost of progress. AI is teaching us that function can, and often must, precede full comprehension. And as designers, builders, and creative thinkers, we need to get more comfortable working in that space.

If this makes you uncomfortable, good. It means you're seeing the shift. But if it excites you, well, my friend, you're right where you need to be.

What might become possible if you embrace emergence instead of clinging to control?

Embracing the unknown

This shift is much bigger than a throwaway line on a podcast. It's a reframe for how we approach design and invention in this new age. For decades, the software business has trained us to believe that predictability, explainability, and control are the highest ideals. But many of the most powerful things in the world, our brains, ecosystems, markets, and now AI models, don't operate that way.

If you're designing with AI, start experimenting before you demand clarity. Try tinkering with the raw materials first, then layering on design afterward, like Perplexity does. Embrace the unknown as a creative tool.

As we build in this new era, we need to ask ourselves: what happens when we stop forcing things to fit our desire for immediate understanding? What becomes possible when we embrace discovery as a design principle? And how do we shape these new, emergent systems in ways that are powerful, safe, and genuinely creative?

We may not fully understand AI yet. But if history tells us anything, that might be exactly where we need to be.

Patrick Morgan is the founder of Unknown Arts. If you enjoyed this post, subscribe to his newsletter or follow him on social media: X, LinkedIn, Bluesky.

The end of design certainty was originally published in UX Collective on Medium.