Once upon a time, the phrase “cultural anthropologist” conjured a researcher in khaki speaking little-known languages while studying people in remote areas. But Nick Seaver, an assistant professor of anthropology at Tufts, represents a stark departure from this commonly held image. He conducts his fieldwork in the cafeterias of tech companies, in hotel conference rooms, and through social media exchanges.
After graduating from MIT with a master’s in media studies, the young researcher wanted to explore a fundamental tension: while music is a cultural product, it is also bound up with equipment and dependent on technology. The year was 2010, and music recommender systems, such as Pandora and Spotify, were changing the way the listening public consumed music. Seaver wanted to do ethnographic research on the employees who were developing these systems, a focus that was not just timely but uncharted territory. So he accepted a doctoral position in anthropology at UC Irvine and dove in.
The new fieldwork context brought a fresh challenge: tech companies, protective of proprietary information, tend to be wary of outsiders. Yet anthropologists have long studied reluctant groups (think secret societies in native communities, female freemasons in Italy, and stage magicians in France). So while music recommender systems may have been new, the work Seaver was doing was in keeping with what those in his field have long done: map out the cultural world of a subset of people.
Seaver wanted to know how tech developers working on music recommendation algorithms saw their roles, and how their own culture influenced their decisions. To collect data, he reached out online, at conferences, and through academic labs, eventually landing ninety interviews and even an internship at a tech company.
These days Seaver is working on a book, tentatively titled Computing Taste: The Making of Algorithmic Music Recommendation, about his big takeaway: as technologically advanced as algorithms have become, in the end, we can’t separate music-recommendation systems from the people who develop them. That’s because the systems depend on a “cascading set of little human decisions,” he said, including the criteria used to create and change them. “There is no such thing as an algorithmic decision,” Seaver added. “Algorithms are people.” And the way they think has implications.
This truth is evident in the ways the workers described their jobs to Seaver. Some saw themselves as gardeners, weeding out bad songs from the listener’s mix. Others considered themselves more like tour guides, introducing listeners to new songs. And still others spoke of trapping listeners, getting them “hooked.” All of these frames of reference affect decision-making in the creation, modification, and evaluation of algorithms. In other words, they all wind up influencing which songs we hear.
Add it all up and algorithms start to seem a lot less mysterious. “It’s a set of people making a bunch of choices,” Seaver said. “And that’s something anthropology can do really well—hook technology back into the social world that it came from.” Something to ponder with each changing track.