
When users change their behavior to game the algorithm
How our awareness is breaking the social media algorithm.

It is said that the eyes of the Mona Lisa (Leonardo da Vinci) follow you around the room, an illusion now known as the 'Mona Lisa Effect'.

There was a time when social media was simple. You followed people, liked posts, and as a result you got shown more of the same. But now the feeds we scroll through are less about who we follow and more about how we behave.

If you watch a reel for longer than three seconds, you can expect more like it. If you linger on a photo, pause mid-scroll, replay a TikTok – that's the new like. Platforms today care far less about your explicit choices and more about your passive ones. And honestly, it was revolutionary – at first.

This behavioural model promised a more authentic insight into our preferences. After all, what we do is often far more telling than what we say. It's clever, subtle, and even somewhat intimate. But as we've come to understand how these systems work, we've also begun to perform for them, whether consciously or not.

Knowing you're being watched

There's a well-documented psychological phenomenon known as the Hawthorne effect, named after a series of productivity experiments in the 1920s at the Hawthorne Works factory. Researchers found that workers altered their behaviour simply because they knew they were being observed. More broadly, this aligns with the observer effect in behavioural science: our awareness of surveillance alters our actions.

The Hawthorne Works factory where productivity experiments took place (circa 1925). Source: Western Electric Company Photograph Album

Now apply that to your online behaviour.

We've started to understand how the algorithm thinks. We might clear our watch history to reset our feed. We might even find ourselves clicking on content not because we necessarily want to watch it, but because we want to train the algorithm. We avoid pausing too long on something we don't want to see more of, or to be associated with going forward. The system still watches us, but we're no longer behaving naturally. We're gaming it.

This feedback loop becomes flawed not because the algorithm isn't smart, but because the data it collects is no longer clean. We've turned from subjects to strategists. And once that happens, how effective can behavioural-based content delivery really be?
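To make that concrete, here is a deliberately toy sketch of how passive signals might be folded into an interest score. The signal names, thresholds and weights are illustrative assumptions, not any platform's real ranking code; the point is simply that a model like this cannot distinguish genuine preference from performance.

```typescript
// Toy model of implicit-feedback scoring. The signal names, thresholds and
// weights below are illustrative assumptions, not any platform's real logic.
interface WatchEvent {
  topic: string;
  watchSeconds: number; // how long the item stayed on screen
  replayed: boolean;    // a replay reads as strong interest
  pausedOnIt: boolean;  // a mid-scroll pause reads as a soft "like"
}

// Accumulate an interest score per topic from passive behaviour alone.
function scoreInterests(events: WatchEvent[]): Map<string, number> {
  const interest = new Map<string, number>();
  for (const e of events) {
    let signal = 0;
    if (e.watchSeconds > 3) signal += 1; // the "three second" threshold
    if (e.pausedOnIt) signal += 0.5;
    if (e.replayed) signal += 2;
    interest.set(e.topic, (interest.get(e.topic) ?? 0) + signal);
  }
  return interest;
}

// The flaw described above: the model cannot tell preference from performance.
// A user who deliberately lingers on "gardening" to steer their feed produces
// exactly the same events as one who genuinely cares about gardening.
const performedEvents: WatchEvent[] = [
  { topic: "gardening", watchSeconds: 12, replayed: true, pausedOnIt: true },
];
console.log(scoreInterests(performedEvents)); // Map(1) { "gardening" => 3.5 }
```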
The tension at the heart of the algorithm

There's actually a deeper problem here, and it's not just technical.

Modern recommendation systems rely heavily on intent inferred from passive signals. But when these signals are harvested without the user's full understanding, that challenges core principles of ethical design, especially informed consent and autonomy.

A 2020 report by the Ada Lovelace Institute highlighted how opaque algorithmic systems undermine user agency, particularly when platforms fail to explain how recommendations are made or to allow users to meaningfully contest them.

Do we really understand and consent to how our social media feeds are being populated? Source: Cottonbro Studio

It raises some uncomfortable questions:

- Is it ethical to personalise content based on signals users aren't aware they're giving?
- Are users being manipulated into feeding the system, rather than served by it?
- Do we have a duty to design for agency, and not just engagement?

"People cannot be empowered in an environment where they do not understand how decisions about them are made."
— Ada Lovelace Institute, Rethinking Data

When algorithms adapt based on our most subtle behaviours, especially without transparency, we edge into the territory of surveillance design, as explored by Tristan Harris and the Center for Humane Technology. And that should give all of us pause for thought.

Tristan Harris, Center for Humane Technology. Source: Center for Humane Technology

So what should designers do?

We play a key role in shaping how these systems feel, function, and inform. If we acknowledge that the behavioural model is faltering under the weight of its own manipulation, we need to take responsibility for evolving it ethically.

Here are five design principles to consider:

1. Design for agency, not just efficiency. Make it easy for users to understand why they're seeing something and to change it if they want to. The Mozilla Foundation recommends practices like explanation panels ("You're seeing this because…") and controllable filters; a rough sketch of what such a panel might surface follows below.
2. Use behavioural data responsibly. Yes, it can be useful. But we must ask whether passive signals are fair or representative. The UK Information Commissioner's Office suggests distinguishing between observed and provided data when determining consent boundaries.
3. Make the invisible visible. Help users understand what's being tracked and why. Surfacing insights builds trust – something tech desperately needs. Look to platforms like Spotify, which offers limited explanations in its 'Discover Weekly' playlists.
4. Prioritise consent beyond checkboxes. True consent is ongoing and contextual. UX researcher Cennydd Bowles argues for 'consentful design', where interactions are continuously negotiated rather than locked behind an initial 'agree to all'.
5. Question the metric. Engagement is not a proxy for wellbeing. Facebook whistleblower Frances Haugen revealed how internal teams struggled with this exact issue – knowing that what keeps users hooked isn't always what's good for them.

Spotify's Discover Weekly playlist gives the user limited context about how the playlist has been created. Source: Spotify
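By way of illustration, here is a minimal sketch of the kind of data such an explanation panel could surface. The type names, reason categories and copy are hypothetical, not Mozilla's guidance or any platform's actual API; the point is that the inference driving a recommendation can be made legible and contestable.

```typescript
// Illustrative data shape for a "You're seeing this because..." panel.
// Type names, reason categories and copy are hypothetical, not a real API.
type RecommendationReason =
  | { kind: "followed-account"; account: string }            // explicitly provided signal
  | { kind: "watched-similar"; topic: string }               // passively observed signal
  | { kind: "popular-with-contacts"; contactCount: number }; // passively observed signal

interface ExplainedRecommendation {
  itemId: string;
  reasons: RecommendationReason[]; // surfaced to the user, not hidden
  canDismissReason: boolean;       // lets the user contest the inference
}

// Turn the machine's inference into plain language the user can read and act on.
function explain(rec: ExplainedRecommendation): string[] {
  return rec.reasons.map((r) => {
    if (r.kind === "followed-account") return `You follow ${r.account}.`;
    if (r.kind === "watched-similar") return `You recently watched posts about ${r.topic}.`;
    return `${r.contactCount} people you know engaged with this.`;
  });
}

console.log(
  explain({
    itemId: "reel_123",
    reasons: [{ kind: "watched-similar", topic: "home workouts" }],
    canDismissReason: true,
  })
);
// -> [ "You recently watched posts about home workouts." ]
```

Keeping explicitly provided signals (a follow) separate from passively observed ones (a lingering watch) in the data model also maps onto the ICO's provided-versus-observed distinction mentioned above.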
Looking to an ethical future of content delivery

If the behavioural model is beginning to crack under the weight of our awareness, where do we go next?

Do platforms double down, trying to outsmart the user? Do they return to less accurate but more honest and explicit signals? Or is the future something else entirely – something slower, more intentional, and more ethical?

We're entering an era where the assumptions behind content delivery need to be revisited. If we change our behaviour when watched, and we now all know we're being watched, then we're often feeding the machine a performance rather than our preference.

And any system built on performance, rather than authenticity, eventually loses its grip on reality. Is the user still even enjoying the experience?

It's time we designed systems that respect the user not just as a data point, but as a conscious, consensual ally.

Because the future of UX can't just be personalised; it also needs to be principled.