This month's Frame: when reality augments our device
A framework for understanding how “4E” cognition reveals that it is not the device that augments reality, but reality that augments the device.
“I don’t know what I’d do without my fluffy carpet”
The arrival of Apple’s Vision Pro has reignited interest in the capabilities and constraints of Augmented and Mixed Reality (AR / MR). The slew of clips people have posted of themselves using their Vision Pro “in the wild”, from New York subway cars and cafes to military tours and first dates, shows the dizzying array of contexts and objects they rely upon to enhance their experience.
It is easy to forget what is at stake for the user who is asked to partially or fully surrender their primary sense to the virtual environment, irrespective of the life that moves around them. The promise of AR and MR is precisely the ability for users to remain in their world instead of suspending it. The intermingling of spaces, from family kitchen to zombie apocalypse and back again, drives user delight as the theatres of daily life become newly enlivened.
Nevertheless, users still leverage haptic feedback from objects and surfaces to increase their sense of physical safety and comfort. A wingback chair, a thick pile rug, toe socks, or the cool lip of glass pressing into the bare calves of a gamer all exemplify strategies of user augmentation that cannot be discounted. For tech companies to assume that only their devices augment our realities, and never our realities their devices, is to vastly underestimate the power and pleasures of objects and their role in cognition.
Should we optimise everything or just learn to see what’s already there?
The constraints of the built environment have long presented a challenge for UX researchers. Small or cluttered rooms invariably restrict users’ ability to engage with a range of AR/MR apps. Savvy users learn to identify the affordances of everyday objects to enrich their augmented or mixed reality experiences. However, casual users expecting to “plug in and play”, instead of learning to adapt, can flounder without the knowledge of how to manage their environment.
Instead of users “optimising out of” these challenges, or simply giving up on their devices altogether, UX designers can assist them in identifying the affordances of what surrounds them. The move toward embracing rather than removing objects to enhance users’ experience parallels our recent explorations of “Bazaar behaviours”, albeit on a different scale. Just as the bazaar challenges the assumption that friction should be eradicated, we contend that AR and MR design should view the frictions in users’ relationships with objects as vital and meaningful.
The framework
The field of “4E” cognition has emerged in response to traditional theories of sensory processing, which claim that perception arises solely from cortical activity in the brain.
“4E” scientists and philosophers emphasise how cognition doesn’t just occur in the head, but is embodied (relying on physiological signals), embedded and enacted in a context, and extended via devices and objects.
For instance, studies show that injecting Botox into the face reduces the brain’s ability to interpret emotional stimuli, because the brain relies on facial muscles to recognise and respond to others. We rely on a tray or hook to relocate our keys and become disoriented when they are moved. While seemingly trivial, we only have to observe people living with dementia to appreciate how moving objects can disrupt a person’s cognitive map. Design practices for dementia care facilities are undergoing significant changes thanks to the growing recognition of the environment’s role in aiding wayfinding and reducing distress.
James Carney describes cognition as “physiologically expensive,” leaving us to exploit anything around us that might help us “perform computational or inferential work.” The mind, body and environment, in turn, become a “coupled system,” mutually interdependent. Let us now examine how UX and product designers can work with the “4Es” to help users work with, not optimise out of, the objects that suffuse their lives.
Using the framework
The growing market for third-party haptic vests, coupled with a user base seeking “electronic role-play” (ERP) of everyday acts of intimacy, demonstrates an appetite for immersive experiences that extend and embody encounters in virtual reality. We could also say that AR is built to maximise the cognitive extension of devices, overlaying screens and panels across our fields of vision, enacted by a pinch, swipe or grab of the air. Yet it is cognition’s embeddedness that presents the greatest challenge for UX designers tasked with mapping pain points in the user’s environment, which are more often managed than solved. Contextual insights of this sort are also harder to land within product teams, for the simple reason that bugs and glitches are immediately actionable and within the remit of software engineers.
Regenerative designer Daniel Christian Wahl notes that one of the key insights of complexity science “is the shift in intention away from prediction and control toward appropriate participation”. To support the “4Es” of AR users, UX designers must identify moments to participate in the user’s journey, playing within their existing constraints rather than seeking to control them. Let us imagine how we might use AI-powered image scanning of a user’s room (a version of which is already employed by architects and designers) to reveal the hidden affordances of the objects that surround them.
Any floor covering, piece of clothing or item of furniture is an opportunity to augment the device or to indicate possible risks. Users could select the degree of alteration, from unrolling that dusty yoga mat in the corner (setting the bounds of a movement-intensive game) to rearranging the furniture to maximise the haptic feedback of an entire room. Thermal scanning could reveal how surface temperatures, like the cool glass of a coffee table, might serve as a feedback mechanism for safety, or even a source of “thermal delight” that heightens the senses and protects users.
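To make this concrete, consider a minimal sketch of such an affordance-mapping pass on Apple platforms. RoomPlan and its captured object categories are real APIs, but the AffordanceHint taxonomy and the category-to-hint mapping below are illustrative assumptions of our own; thermal sensing is left out, as no consumer API currently exposes it.

```swift
import RoomPlan

// Hypothetical affordance hints derived from a RoomPlan scan.
// The taxonomy is illustrative, not a shipping feature.
enum AffordanceHint {
    case hapticAnchor(String)   // a stable surface the body can register
    case movementZone(String)   // a clear or soft area suited to motion
    case risk(String)           // an obstacle to flag in the play boundary
}

func affordances(in room: CapturedRoom) -> [AffordanceHint] {
    room.objects.map { (object) -> AffordanceHint in
        switch object.category {
        case .sofa, .chair:
            return .hapticAnchor("Seated anchor, ~\(object.dimensions.y) m high")
        case .table:
            return .hapticAnchor("Flat surface; candidate for cool-touch feedback")
        case .bed:
            return .movementZone("Soft boundary for low-intensity play")
        default:
            return .risk("Unclassified object; include in the play boundary")
        }
    }
}
```

The specific mapping matters less than the stance it encodes: the scan surfaces what the room already offers, rather than prescribing what the user should remove.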
Spatial scanning could also help users develop “pre-flight checks” for working within the constraints of their home environment. Teaching users to formalise their preparation leverages our natural tendency to exploit the affordances of the environment in the ways Carney describes. Participating in users’ spatial design contests both the claim that AR and MR take us away from our lives and the cultural imperative for IRL encounters. AR could, in fact, re-sensitise people to the four dimensions of their cognition and illustrate the extent to which our reliance on objects, textures and temperatures is an immutable feature of our visceral lives.
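As a sketch of how such a “pre-flight check” might formalise that preparation, the routine below again leans on RoomPlan; the PreflightReport structure, the two checks and the 0.5 m trip-hazard threshold are all hypothetical assumptions, not an established setup flow.

```swift
import RoomPlan

// A hypothetical "pre-flight check" built on a RoomPlan capture.
// The checks and thresholds are illustrative, not a shipping routine.
struct PreflightReport {
    var passed: [String] = []
    var warnings: [String] = []
}

func preflight(room: CapturedRoom) -> PreflightReport {
    var report = PreflightReport()

    // Check 1: is there a stable, seated fallback if the user tires?
    if room.objects.contains(where: { $0.category == .sofa || $0.category == .chair }) {
        report.passed.append("Seated anchor available")
    } else {
        report.warnings.append("No chair or sofa detected; suggest a seated-mode session")
    }

    // Check 2: flag low objects (below an assumed ~0.5 m threshold) that a
    // player absorbed in passthrough overlays might trip over.
    let lowObjects = room.objects.filter { $0.dimensions.y < 0.5 }
    if lowObjects.isEmpty {
        report.passed.append("No low-lying obstacles")
    } else {
        report.warnings.append("\(lowObjects.count) low object(s) to clear or mark")
    }

    return report
}
```

The design choice worth noting is that the routine only advises; the user decides whether to unroll the mat or shift the table, keeping them the author of their own environment.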
Frames is a monthly newsletter that sheds light on the most important issues concerning business and technology.
Stripe Partners helps technology-led businesses invent better futures. To discuss how we can help you, get in touch.