UC Berkeley data scientists join Angelique Kidjo in calling for a more social online environment.
Sitting on a stage at the University of California, Berkeley, alongside four computer scientists, award-winning international artist and performer Angelique Kidjo took aim at social media platforms and the algorithms behind them, charging that they inflict harm on people and culture while racking up massive profits in the process.
The computing experts around her joined in, agreeing with Kidjo’s assessment while also offering glimpses of how artificial intelligence (AI) and data science could be used to help alter the current course.
The Oct. 28 discussion on “Place and displacement: Bias in our algorithms and society” was part of Kidjo’s season-long artist residency at Berkeley. The panel included Associate Provost Jennifer Chayes, head of the Division of Computing, Data Science, and Society (CDSS); Berkeley computer science Ph.D. student Devin Guillory, who specializes in computer vision and machine learning; Assistant Professor Nika Haghtalab of the Department of Electrical Engineering and Computer Sciences (EECS), who works on AI and theory with a focus on learning in the presence of social and strategic interactions; and Michael I. Jordan, the Pehong Chen Distinguished Professor in EECS, whose research interests bridge the computational, statistical, cognitive and biological sciences.
Chayes opened the discussion by saying that we are “living in a world mediated by AI and AI platforms that have huge effects on society” and asked the speakers about the effects they are seeing.
Guillory described his upbringing in Baton Rouge, Louisiana, a place he said has a very rich culture but is at the center of the climate crisis; his mother’s small hometown in the area has been essentially abandoned after being decimated by two hurricanes 15 years apart. As AI-driven platforms change how people consume content, music and art, he worries about the effects on regional cultures, which may be swept away or lost. “We don’t understand yet how it’s reshaping the cultural landscape,” Guillory said.
Haghtalab, originally from Iran, has lived in India, Canada and the United States. She said she still feels connected to each country as home because she found an environment where “I feel encouraged and supported in expressing myself,” giving her trust in the community. But as various online platforms erode people’s trust, that is a form of displacement, just as the loss of a physical space is, she said.
Having been exiled from her native Benin, Kidjo agreed that it’s easy to lose your balance when you lose your home, especially if you don’t have the strength to rebuild safety and trust in a new home. She mentioned the hundreds of refugees who have drowned trying to cross the Mediterranean and asked, “How can AI help create a better environment so people can stay where they are, reveal their own reality in their own home?” The problem, she said, is that many leaders are more interested in using AI to control populations rather than give them freedom.
Jordan agreed with Guillory’s comments about AI and culture, saying “it really is what is happening with culture and whether technology is helping or damaging culture.” He noted that he thinks of himself as a drummer more than a professor, though he has spent 30 years at Berkeley. “Music is a big part of our culture,” he said, describing how beats now played around the world originated in Africa. “It definitely unites us.”
That idea of home is also central to building CDSS into an organization where people feel supported to take data science in new directions, cross long-standing academic boundaries and collaborate with people in seemingly unrelated areas, Chayes added.
To help the audience understand what’s at work behind the scenes, Chayes asked how algorithms and data-dependent recommendation systems affect the integrity of the news media, search results, hiring and online marketplaces.
Haghtalab offered a high-level explanation of a recommendation system: given a set of people, books and videos, it suggests what each person should read or watch based on similarities among users, drawing on past reading lists, location, education level and the like. The recommendation is biased rather than random, and that is not necessarily bad. But if the bias is systematic, meaning the results are correlated with an attribute that shouldn’t be used, such as race or gender, then some groups are not served as well.
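As a rough illustration of the kind of system Haghtalab described (not code presented at the panel), a minimal similarity-based recommender might look like the sketch below. The user names, feature vectors and book titles are hypothetical; the point is only that suggestions are driven by whichever users look most alike in the features the system happens to encode, so if those features track an attribute like race or gender, some groups can be systematically under-served.

```python
# A minimal sketch of a similarity-based recommender. All data here is
# hypothetical and purely illustrative: each user is described by a small
# feature vector (e.g., derived from past reading history and location).
import numpy as np

# Hypothetical user feature vectors.
users = {
    "alice": np.array([1.0, 0.0, 3.0]),
    "bob":   np.array([0.9, 0.1, 2.8]),
    "carol": np.array([0.0, 2.0, 0.5]),
}

# Hypothetical record of which books each user has already read.
history = {
    "alice": {"book_a", "book_b"},
    "bob":   {"book_a", "book_c"},
    "carol": {"book_d"},
}

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def recommend(target, k=1):
    """Suggest books read by the k most similar other users,
    excluding books the target user has already read."""
    others = sorted(
        ((cosine(users[target], vec), name)
         for name, vec in users.items() if name != target),
        reverse=True,
    )
    suggestions = set()
    for _, name in others[:k]:
        suggestions |= history[name] - history[target]
    return sorted(suggestions)

print(recommend("alice"))  # e.g., ['book_c'], borrowed from the most similar user
```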
“AI-mediated platforms are not just made for giving you book recommendations out of the goodness of their heart, they are gaining something in the background,” Haghtalab said. “If you think about news recommendations, the platforms are doing that because they want to keep people engaged, and keeping people engaged might be at the cost of happiness to the person.”
And the more links a person clicks, the more likely they are to be steered toward dangerous or polarizing content, Haghtalab said. But the longer users stay engaged, the more money the platforms earn from advertisers.
Kidjo offered her perspective that people are willingly paying money to become addicted to their phones and society is paying the price.
“I think you guys that are here working in AI, you should present something different that preserves our society, our culture,” she said. “Every time you go online, it’s not for free.”
Guillory added that how AI is designed, and what society gets out of the system, are strongly aligned with our economic interests, and that it’s more profitable to hurt people than to help them.
“What are the unintended consequences of how we’re building this, who we’re building it for and what’s our intention and goal?” Guillory said. “Are we asking all of the right questions? I think a lot of times in the creation of this it’s an insular group, the same people creating this through multiple generations, and what’s the overarching impact of not centering a diverse group of people when we’re trying to develop these technologies.”
Midway through the discussion, Chayes turned to Jordan, calling him a leader in bringing together AI and economics and working toward more market-aware AI and more AI-aware markets. She asked if the tables could be turned and AI algorithms could empower artists and give them more opportunities.
As an example, Jordan cited UnitedMasters, an online platform that connects musicians with listeners via the major music streaming services. Not only does the service ensure that the musicians are paid for their work, Jordan said, but they also retain ownership of it, which especially helps emerging artists. Several years ago, Jordan met with music impresario Steve Stoute at a downtown Berkeley cafe to discuss such a platform; Stoute launched it in 2018, and it now represents one million musicians.
“Probably 95 percent of the songs being listened to by someone in the U.S. is music written in the last two years and it was written by someone you’ve never heard of,” said Jordan, who prefers the term data science over AI. “They’re the peers of the people who are listening to the music and a lot of them are kids who are 16 years old who have been empowered to pick up a laptop and make music, and they’re good at it.”
Typically, they would upload their music to platforms like SoundCloud in the hopes people would hear it. But then services like Spotify could grab the tunes for free and make money serving them up to customers. In the process, the creators lost control, power and value, Jordan said.
“Then we listen to it a huge amount, but nothing goes back to the creators,” Jordan said, adding that the expectation that all content should be free goes back to the early days of advertiser-driven broadcast radio and television.
Instead of a commercial model, we can “create this two-way model, this relationship between people and you stand back out of the way and let it be,” Jordan said. “This is to me the way forward.”
Watch the video of the event.