Spotify Technology SA on Monday announced it has formed a Safety Advisory Council to provide third-party input on issues such as hate speech, disinformation, extremism and online abuse.
The group represents another step in Spotify’s efforts to deal with harmful content on its audio streaming service after backlash earlier this year over “The Joe Rogan Experience,” in which the podcaster was accused of spreading misinformation about COVID-19.
The 18 experts, who include representatives from the Center for Democracy & Technology, a Washington, D.C.-based civil rights group, the University of Gothenburg in Sweden and the Institute for Technology and Society in Brazil, will advise Spotify as it develops products and policies and considers emerging issues.
“The idea is to bring in these world-renowned experts, many of whom have been in this space for a number of years, to realize a relationship with them,” said Dustee Jenkins, Spotify’s global head of public affairs. “And to ensure that it’s not talking to them when we’re in the middle of a situation … Instead, we’re meeting with them on a pretty regular basis, so that we can be much more proactive about how we’re thinking about these issues across the company.”
The council is purely advisory in nature, and Spotify can accept or reject its advice. Unlike Facebook's oversight board, which decides which cases it reviews, Spotify will submit issues for its council to consider and provide feedback on.
Sarah Hoyle, Spotify’s head of trust and safety, said the advisory council was not formed in reaction to “any particular creator or situation,” but rather in recognition of the challenges of operating a global service at a time when threats are constantly evolving.
“How do we augment the internal expertise that we already have at Spotify, to tap into these folks whose life’s work has been studying this, and they’re on the ground in markets all around the world, just like our users, just like our creators,” said Hoyle.
(Reporting by Dawn Chmielewski in Los Angeles; Editing by Chris Reese)