In the past few months, a curious set of buttons has appeared at the base of my Gmail messages. They are canned replies to whatever text the software has detected in the preceding email, clickable responses like “I’m interested” or “sounds good!”
I laughed when I first saw them because they seemed so absurd: typing “sounds good!” takes all of five seconds, and I can’t fathom feeling inconvenienced by it. Google has since gone even further with its “Smart Compose” feature, which autocompletes your sentences within the email itself. Why do technologists consider this such a high priority?
Humans are creatures of habit. Last June, the Economist noted a study from the Technical University of Denmark that showed that, at any given time, people tend to have no more than two dozen regular haunts. I’m sure Google Maps knows every one of mine and autopopulates them when I type. I’ve already outsourced my navigation skills, my calendar-remembering, and my phototagging to Google, and now it is encouraging me to outsource my words.
Google’s new text feature may not appeal to me, but other sorts of algorithmically determined services do. We have to remind ourselves, however, that these digital tools bring more than convenience. Every time we use a button that says “sounds good!” we are also feeding the machine-learning beast, increasing the likelihood that the software will present us, and others, with more of the same in the future. As we narrow our expression and give way to predictability, the small ways in which we seek convenience become death by a thousand paper cuts to other qualities we crave as humans: individualism, serendipity, discovery.
We need to ask ourselves: What are we exchanging for convenience?
Stuck in a Loop
At the moment, we’re caught in a feedback loop: algorithms and machine learning each play a role in consolidating our experiences and expression, and our personal spaces are converging into a milquetoast mass. As Kyle Chayka, a writer for the Verge, pointed out in “Welcome to Airspace,” our physical spaces have succumbed to the Airbnb aesthetic: “Minimalist furniture. Craft beer and avocado toast. Reclaimed wood. Industrial lighting. Cortados. Fast internet. The homogeneity of these spaces means that traveling between them is frictionless, a value that Silicon Valley prizes.” Just as Google reads our email and determines the likeliest way that we’ll respond, the Airbnb platform algorithmically surfaces the most popular—and least unique—aesthetic, furthering its popularity.
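The dynamic is easy to see in miniature. Here is a toy simulation (purely illustrative; it does not reflect any real platform’s ranking logic, and all names and rates are assumptions) of the rich-get-richer loop described above: the system ranks options by past clicks, most people click whatever is ranked first, and one option quickly crowds out the rest.

```python
# Toy sketch of a popularity feedback loop. The "follow_rate" assumption
# (most users click the top-ranked option) is what collapses diversity.
import random

def simulate_feedback_loop(options, rounds=1000, follow_rate=0.9, seed=42):
    rng = random.Random(seed)
    clicks = {o: 1 for o in options}  # start every option with equal popularity
    for _ in range(rounds):
        # the platform ranks options by how often they've been clicked before
        ranked = sorted(options, key=lambda o: clicks[o], reverse=True)
        # most users take the top suggestion; a few wander off-script
        choice = ranked[0] if rng.random() < follow_rate else rng.choice(options)
        clicks[choice] += 1  # every click feeds the next round's ranking
    return clicks

tallies = simulate_feedback_loop(["sounds good!", "I'm interested", "no thanks"])
# Whichever option takes an early lead ends up with the overwhelming
# share of clicks, even though all three started out equal.
```

The point of the sketch is not the numbers but the shape: nothing about “sounds good!” makes it better, yet the loop guarantees it wins.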
Similarly, the same feedback loop—an aesthetic becomes popular, a platform emphasizes it, people mimic said aesthetic—affects our individual photographic expression, rendering it both predictable and indistinguishable. The wonderful Instagram account @insta_repeat is unsettling with its multiple examples of young, hip, white people sitting in canoes, holding coffee, and staring off into the distance at the mountains.
We are collectively feeding a system that chews up creative input and spits it out in a prioritized fashion, with click rates motivating people to imitate the most popular expression. In an article on SFMoMA’s Open Space, Joe Veix refers to “the YouTube face,” which is an exaggerated facial expression “making everyone look like extras in a Soundgarden music video.” It appears in the thumbnail images for all the most popular videos, creating instantaneous clickbait. The YouTube face is the result of us attempting to game the algorithm for clicks. The irony is that this very expression is actually the algorithm gaming our culture.
Design for Serendipity Rather than Convenience
We are meeting the machines where they are. On the Radiolab episode “More or Less Human,” Robert Krulwich describes this as feeling like “a slow downward slide.” So how do we, as designers, respond? Can we break the feedback loop?
What if we made a concerted effort to deprioritize efficiency and predictability and built sensations of surprise, randomness, and serendipity into our experiences instead? What if we designed to preserve what is unique about places and experiences? What if we made an effort to encourage our individual self-expression and that of people who use our products?
And how do we ourselves stay creative and curious and encourage others to do so?
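One small, concrete version of this idea: reserve a few slots in any ranked list for wildcards the ranking would never surface. The sketch below is a hypothetical design, not a real recommender API; the function name, slot count, and catalog structure are all illustrative assumptions.

```python
# A hedged sketch of designing for serendipity: blend a deliberate dose
# of randomness into an otherwise relevance-ranked list.
import random

def recommend_with_serendipity(ranked, catalog, size=5, surprise_slots=2, seed=None):
    """Return `size` items: the top of the ranked list, plus a few
    wildcards drawn from the wider catalog outside those top picks."""
    rng = random.Random(seed)
    top = ranked[: size - surprise_slots]
    # candidates the user wouldn't otherwise have been shown
    outside = [item for item in catalog if item not in top]
    wildcards = rng.sample(outside, min(surprise_slots, len(outside)))
    return top + wildcards

# e.g., three "safe" picks plus two surprises from the long tail:
picks = recommend_with_serendipity(
    ranked=["canoe photo", "latte art", "mountain vista", "city street", "portrait"],
    catalog=["canoe photo", "latte art", "mountain vista", "city street",
             "portrait", "lino print", "field recording", "zine scan"],
)
```

The design choice worth noticing is that the surprise is budgeted explicitly rather than left to the optimizer, which, left alone, would spend every slot on the predictable.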
We can find inspiration in unlikely places. For example, I love the creativity that emerges from “risky playgrounds,” a concept in which children are given tools (saws! hammers!) and materials in order to build their own play spaces. It’s a celebration of ingenuity, of spontaneity, whereas the algorithms produced by big tech are pushing us toward homogeneity and standardization.
And I take solace in the idea that humans are random beings who can still confound the system. A case in point: The Invisibilia episode “The Pattern Problem” follows Princeton researcher Matt Salganik as he tests whether a computer can analyze patterns in life events to predict a child’s GPA at age 15. Despite having access to a robust data set, the computer fails miserably. It’s evidence that we have this randomness in us—yes, we’re unpredictable!—we just need to actively harness it to fight the sameness that technology platforms are steering us toward.
Shortly before his untimely death, Anthony Bourdain, hero of curiosity and discovery, was interviewed by Maria Bustillos in Popula. Referring to people who retrace the detailed itineraries from his show, he said, “I much prefer people who just showed up in Paris and found their own way without any particular itinerary, who left themselves open to things happening.” The prescriptiveness with which we’ve designed our lives results in a less fearful existence but also a less human one. As humans, we need to learn, to be creative, to connect unlikely things. We need to be uncomfortable.
So let’s stop feeding the machine-learning beast. Take control of your own expression instead of letting the Google machine take over your keyboard. Let’s leave ourselves open to things happening.