This month's Frame: Why people are losing control over technology and what product teams can do about it
A framework for rethinking the relationship between people and digital tools.
The last decade has brought a steady wave of commercial applications of artificial intelligence for personal use. From recommendations on film and music platforms to health trackers and chatbots, digital products built using AI promise to make our lives easier, better, more seamless. However, there is a flip side to this shiny coin. Complex computation has significantly accelerated the “blackboxing” of technologies, i.e. the obfuscation of the internal mechanisms of technological systems and devices.
Not knowing what happens under the technological hood is disconcerting. It raises questions about how decisions are made and what kinds of data they draw on, especially when technologies are accelerating faster than we can comprehend. Current developments in generative AI have surfaced a desire to rethink what we want technologies to be, and how they should shape our lives.
Langdon Winner, the American political theorist and Science and Technology Studies scholar, offers an approach that helps us understand human interactions with complex digital tools. He reminds us that a perceived lack of agency in these interactions isn’t novel and provides a framework that can inspire the design of digital products and platforms that help users feel in control.
Winner argues that our fundamental relationship with technology has changed. Previously, tools were simple to understand and obvious in their mechanism. A hammer's functionality, for example, is both easily comprehensible and visible. But digital tools, like algorithmic recommendation systems, have complex, invisible "inner logics": we use them without necessarily understanding why or how they work. We are forced simply to trust that they do what we want them to.
For the most part, not understanding the technical intricacies of the tools we use doesn’t concern us, and generally doesn’t impair our ability to use them. It’s in the moments when things go wrong, or when devices behave in unexpected or “suspicious” ways, that our lack of knowledge is exposed and becomes a frustrating obstacle.
Complex technologies also require us to adapt to our tools’ requirements and logics, rather than the other way around. For example, we have taught ourselves to train “the algorithm” when we use social media by liking content we want more of, whilst actively ignoring other content for fear that our attention will be falsely registered as interest. We are, knowingly, redirecting our actions to adapt to the machine’s logic.
This is, as Winner suggests, because complex digital tools enrol users into their functioning, and consequently move us down the “operational chain”: from being the central executor to one of many actants in a larger system. For instance, whilst the energy required to wield a hammer comes directly from its operator, algorithms require other technological systems and devices to function (e.g. mobile phones, networks, data, and other algorithms), and so our role becomes one of many.
This, together with their often incomprehensible mechanisms, creates a hierarchical dependency that ultimately privileges digital products, and can leave us feeling we no longer wield control over our own tools.
Using the framework
How then, do we put digital tools “back in their place”? How do we move from a “blackboxed” technology-first state to a “human-first” approach in which digital products are built to adapt to people's everyday lives, and not the other way round?
There are three ways to achieve this:
Designing opportunities for users to actively shape their experience is essential for redressing the power imbalance between people and tools. Customisation is an obvious answer, letting users tailor a product so it works for them. Similarly, control features that enable or disable certain actions can allow users to establish the terms on which they want to use tools. To go one step further, we might ask: how might we not only create ways to react to a predetermined agenda, but set the agenda ourselves?
Agency can also be more subtle. Whilst many of us are willing to “help” recommendation systems learn about the content we enjoy, our willingness to participate in the system is predicated on feeling that our actions have impact. Product teams should design cues within the experience that signal to users that their actions are having an effect.
For instance, Instagram’s “Why are you seeing this post?” function allows users to see how their previous actions, like sharing content, have shaped their recommendations. Similarly, Netflix’s “Because you watched” rows draw a connection between viewers’ previous behaviour and the platform's suggestions.
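For product teams implementing cues of this kind, one lightweight pattern is to carry the triggering user action alongside each recommendation, so the interface can surface it as an explanation. A minimal sketch in Python (the `Recommendation` structure and `explain` function are hypothetical, not any platform's actual API):

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    item: str
    trigger: str  # the prior user action that led to this suggestion

def explain(rec: Recommendation) -> str:
    """Render a feedback cue in the spirit of 'Because you watched ...'."""
    return f"{rec.item} (because you {rec.trigger})"

rec = Recommendation(item="The Crown", trigger="watched The Queen's Gambit")
print(explain(rec))  # The Crown (because you watched The Queen's Gambit)
```

The design point is that the explanation travels with the recommendation rather than being reconstructed after the fact, which keeps the cue honest about what actually drove the suggestion.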
Finally, product teams should consider how they can help users gain a better grasp of digital products and platforms by learning the operating language of the system, when they want to read it. Misplaced or enforced legibility can cause undesirable friction and noise. Cues can play a double role here. In the interfaces of algorithmically driven platforms, they not only reassure users when recommendations seem odd, but also have an educational side-effect: they seed the logic of the operating model that sits behind these systems, helping users better understand how they work.
With AI tools developing at dizzying speed and playing an ever-greater role in our lives, it’s more important than ever to design digital experiences that are useful and meaningful. We have seen some efforts by products and platforms to provide control, feedback and legibility. Now there is an urgent need to make these principles central facets of generative AI tools as they’re being developed today, rather than playing catch-up years later.
Frames is a monthly newsletter that sheds light on the most important issues concerning business and technology.
Stripe Partners helps technology-led businesses invent better futures. To discuss how we can help you, get in touch.