By: Krista Davidson
June 3, 2021
Experts say algorithms may impact our cultural preferences. Digital recommendation systems, like those used by Netflix and Spotify, enhance our online experience by curating the music or movies they think we’d enjoy.
But research shows they may also influence our interests and cultural preferences, and even contribute to inequality. A team of experts from the CIFAR AI & Society workshop AI, Recommendations & the Curation of Culture warns that these algorithms could have consequences for how we access cultural content and what is available to us.
CIFAR interviewed the AI & Society workshop participants to better understand the role of AI and recommendation systems in influencing culture.
CIFAR: How did you become interested in recommendation systems?
Ashton Anderson: I originally became interested in this topic a couple of years ago when I started working with Spotify. The issues are obvious in retrospect, but these algorithms are not something that a lot of people think about or are even aware of.
Fernando Diaz: I’m a researcher at Microsoft Research in Montreal, and my background is in information retrieval, which covers things like web search and text search, but also recommendation systems. I have been doing this for about 20 years in an industrial research context, at Yahoo, Microsoft, and Spotify.
Nicole Klassen: I’ve been in the media and communications industry for nearly two decades. One of my best career experiences, and where I first learned about recommendation systems, was being part of a tech startup for African creatives.
CIFAR: What are some of the implications that recommendation systems may have for culture, specifically music?
Ashton Anderson: A lot of the time, the computer scientists developing recommendation systems and algorithms are not specifically tailoring them for music. One of the challenges for the computer science community is to start thinking about these issues. The most popular recommendation systems are based on an algorithm originally designed for text that has been repurposed wholesale for cultural platforms that distribute music.
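One plausible example of the text-to-music repurposing Anderson describes is word2vec, a word-embedding algorithm that has been widely adapted to music by treating each playlist as a “sentence” and each track as a “word”. A minimal sketch, assuming the gensim library and an invented toy dataset (the track IDs and playlists are hypothetical):

```python
# A minimal sketch of a text algorithm repurposed for music: word2vec learns
# a vector per track from co-occurrence within playlists, exactly as it would
# learn word vectors from co-occurrence within sentences.
from gensim.models import Word2Vec

# Hypothetical playlists; each inner list is one listener's sequence of tracks.
playlists = [
    ["bounce_track_1", "bounce_track_2", "hiphop_track_1"],
    ["bounce_track_2", "bounce_track_3", "hiphop_track_2"],
    ["classical_track_1", "classical_track_2", "classical_track_3"],
]

model = Word2Vec(sentences=playlists, vector_size=16, window=2, min_count=1, seed=0)

# "Similar" tracks are simply those that appear in similar playlist contexts,
# with no notion of genre, community, or meaning.
print(model.wv.most_similar("bounce_track_2", topn=2))
```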
Fernando Diaz: Conventional recommendation systems are crude in terms of how they make recommendations. Many simple models focus on average performance, which means recommendations for people from underrepresented groups are especially prone to error. Systems like Spotify or Netflix work well for people in relatively big groups, but not for those in smaller groups. The effects are not just on the consumers of content, but also the producers, because the incentives for content creation are impacted. For example, if I’m creating music and I want to operate within the context of a recommendation system, I will create with those systems in mind.
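To make the point about average performance concrete, here is a minimal numpy sketch (all numbers invented for illustration): a single prediction chosen to minimize mean squared error over the whole population ends up close to the majority group’s tastes and far from the minority group’s.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical taste scores: 900 majority-group listeners centered at 1.0,
# 100 minority-group listeners centered at -1.0.
majority = rng.normal(loc=1.0, scale=0.1, size=900)
minority = rng.normal(loc=-1.0, scale=0.1, size=100)
everyone = np.concatenate([majority, minority])

# The constant prediction that minimizes mean squared error over the whole
# population is the population mean, which is pulled toward the majority.
prediction = everyone.mean()  # ~0.8

print("mean error, majority group:", np.abs(majority - prediction).mean())  # small
print("mean error, minority group:", np.abs(minority - prediction).mean())  # large
```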
Larisa Mann: People make music to communicate and reinforce a sense of community on their own terms. When that music is made accessible to others on different terms, it doesn’t have the same meaning. The meanings are partly generated by the audiences that engage with it. Once you take music that comes out of a particular ethnic group, immigrant community, or underground queer dance music community, and you make it available to people who have no obligation or connection to that community, it can erase or prevent some of the original understanding of, and connection to, that music.
There’s a genre of music from New Orleans called “bounce”, and some of its big-name musicians are queer and/or trans. When it crossed over into the wider dance music world, some people called it ‘sissy bounce’ [a term with homophobic connotations]. Within New Orleans, there were a lot of people making this music who didn’t identify as queer or trans, and some who did but who found the term demeaning. If your music shows up as a recommendation under a genre that identifies you as having an identity you don’t want, or that could potentially be dangerous for you, that’s a problem.
Nicole Klassen: The recommender systems we generally refer to when speaking about culture are algorithms designed to achieve a specific business objective, and that objective is driven by the stock markets and investors. There’s so much more we can do with the technology if we set different objectives and parameters.
Jeremy Morris: AI and algorithmic recommendation systems are a set of specific technologies and tools that have real effects, such as the kinds of songs produced with AI that in turn generate fans, listens, and plays on platforms like Spotify. They can also lead artists to adjust their songs, sounds, and track metadata in order to appear higher in search results and make their music more “platform ready”. So for me, thinking about the future of AI’s impact on music and culture means thinking about the technology, but also the stories we tell about what that technology can and should do.
CIFAR: Are recommendation systems changing our cultural preferences?
Larisa Mann: There are massive inequalities of power when it comes to culture. Why is classical music piped into train stations and not death metal? The meanings of those genres are part of a power structure that says one of them is acceptable and respectable and the other is threatening. If we bring recommendation systems into the world without accounting for these existing power structures, it’s very likely that they’ll make them worse.
Nicole Klassen: Taylor Swift gets more radio airplay on our continent than most local artists. Unless countries enforce a quota for local content, radio belongs to the record labels or those with money. South African and several other African broadcasters pay royalties on airplay. International labels and artists earn royalties from our broadcasters, so they tend to dominate the space. Local artists and culture are progressively excluded from exposure and royalties because labels and artists with bigger budgets can influence platforms. This is devastating for a culturally rich continent like Africa. Instead of supporting the culture, the algorithms are widening the divide for creative talent.
Fernando Diaz: They’re reflecting a distorted view of what people want. The problem is the algorithms are modeling what people want right now, or have wanted in the past.
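A toy simulation of that feedback loop (my construction, not anything produced by the workshop): a system that only surfaces whatever was popular in the past keeps amplifying it, so historical preferences harden into future ones.

```python
import numpy as np

plays = np.array([100.0, 90.0, 10.0])  # songs A, B, C: historical play counts

for step in range(5):
    top = int(plays.argmax())  # the system surfaces the past leader...
    plays[top] += 50           # ...and that exposure turns into new plays
    print(f"step {step}: plays = {plays}")

# Song A pulls further ahead every round; song C is never heard, regardless
# of whether listeners would have liked it.
```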
Ashton Anderson: Recommender systems are built on an assumption that, in a cultural context, is almost certainly false. A lot of these platforms use systems that treat the similarity between any two songs as fixed for all people.
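As a concrete illustration of that assumption (a sketch with an invented ratings matrix, not any platform’s actual code), classic item-item collaborative filtering computes one global similarity score per pair of songs and applies it identically to every listener:

```python
import numpy as np

# Toy user-item matrix: rows are listeners, columns are songs,
# entries are play counts or ratings (invented numbers).
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [0, 1, 5, 4],
])

# Cosine similarity between song columns.
norms = np.linalg.norm(ratings, axis=0)
similarity = (ratings.T @ ratings) / np.outer(norms, norms)

# similarity[i, j] is a single number: how "close" song i is to song j is
# assumed to be the same for every listener, whatever the songs mean in
# that listener's community.
print(np.round(similarity, 2))
```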
CIFAR: What were some of the key insights and recommendations developed as a result of the workshop?
Larisa Mann: Communities need to be involved, as much as possible, in the design of the systems that circulate their cultures. It’s not just a matter of incorporating content that represents different people; the makers of that content should be involved. And, obviously, more money should go back to communities and music makers.
Nicole Klassen: Collectively, we have so much more information on how people relate to music and to these systems than when the first recommender platforms were launched. How does a cultural or music scene exist on a platform? Should it be closed or open? How does a live DJ know when to select the next track to keep their audience engaged? It would be interesting to attempt a system that does not need to satisfy investors, one with different objectives and parameters.
The AI & Society leadership for AI, Recommendations & the Curation of Culture includes: Ashton Anderson, University of Toronto, Canada; Georgina Born, Oxford University, United Kingdom; Fernando Diaz, Google Research Montreal, Canada; Jeremy Morris, University of Wisconsin-Madison, United States
The AI, Recommendations & the Curation of Culture workshop was part of a series of AI & Society workshops led by CIFAR in partnership with the French National Centre for Scientific Research (CNRS) and UK Research and Innovation (UKRI).
The AI & Society program, one of the objectives of the CIFAR Pan-Canadian AI Strategy, develops global thought leadership on the economic, ethical, political, and legal implications of advances in AI. The AI & Society workshops generate a deep understanding of how AI impacts society and recommend sociotechnical approaches to supporting responsible AI.
The team’s report is published by the Schwartz Reisman Institute for Technology and Society (SRI) at the University of Toronto. SRI is a multidisciplinary research and solutions hub investigating the social effects of powerful emerging technologies like artificial intelligence.