Today’s social media networks are divisive and even dangerous, but they don’t have to be. We can turn them into something positive – and the first step is owning our data.
It’s worryingly easy for any Facebook user to be bombarded by conspiracy theories about everything from election fraud and COVID-19 vaccine plots to the cult that believes the highest echelons of American society are being run by devil-worshiping pedophiles. Facebook’s own research shows that the algorithms designed to select content that keeps users engaged can quickly turn a news feed into a conspiracy “rabbit hole.”
The terrible consequences of social media algorithms were starkly illustrated in the 2022 inquest into the death of London teenager Molly Russell. In the six months before the 14-year-old took her own life in 2017, she had saved, liked or shared more than two thousand pieces of Instagram content related to suicide, self-harm and depression. The coroner linked Molly’s death to “the negative effects of online content.”
Social media has undoubtedly boosted connection, fun and learning. But cases like Molly’s show it also has a dark side — one that often results directly from the business models behind these services. Social media can exacerbate political polarization, drive the spread of misinformation and conspiracies, and destabilize vulnerable people. And as artificial intelligence tools grow in popularity, they too will be able to use our data in potentially harmful ways. We are at a watershed moment, but we are not powerless. Solutions that can be adopted and scaled already exist; what is needed now is action.
Selling us to advertisers
Central to the problem is user data. Facebook, Twitter, YouTube, Pinterest and every other social media site collect data on us: everything from age and gender to location, interests and when we access the service. They can fill in gaps in our data by buying more from data ‘brokers’ and, of course, they sell all the information they have gathered on us.
This is all done to sell advertising. The more complete the picture a social media site has of its users, the better it can target advertising, and the better it can target specific groups, the more it can charge advertisers. A company launching a new range of puppy food, for example, will pay more to reach an audience whose social media activity shows they’ve just got a puppy or are planning to get one.
But targeted advertising is just part of the picture. Social media companies rely on people seeing those ads, and that means getting users onto the service regularly and keeping them there for as long as possible. To achieve this, the companies developed algorithms designed to maximize time spent on the platform.
As sociologist and Columbia University professor Zeynep Tufekci has written, this often involves suggesting content that becomes progressively more extreme: “As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.” In the six months before her death, 2,100 of the 16,300 posts saved by Molly Russell on Instagram were depression, self-harm or suicide-related. Extremism generates clicks.
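To make that incentive concrete, here is a minimal sketch of what an engagement-optimized feed ranker might look like. The field names and weights are hypothetical illustrations, not any platform’s actual system; the point is simply what the scoring function measures and what it leaves out.

```typescript
// A hypothetical, simplified feed ranker that optimizes only for engagement.
// Field names and weights are illustrative, not any platform's real algorithm.

interface Post {
  id: string;
  predictedWatchSeconds: number;     // model's estimate of how long the user will dwell
  predictedClickProbability: number; // estimated probability of a click (0..1)
  predictedShareProbability: number; // estimated probability of a share (0..1)
}

// Score each post purely by expected attention. Notice what is absent:
// nothing here measures accuracy, well-being, or potential harm.
function engagementScore(post: Post): number {
  return (
    post.predictedWatchSeconds +
    post.predictedClickProbability * 30 +
    post.predictedShareProbability * 60
  );
}

// Rank the candidate pool so the feed surfaces whatever is expected to keep
// the user on the platform longest, regardless of what that content is.
function rankFeed(candidates: Post[]): Post[] {
  return [...candidates].sort((a, b) => engagementScore(b) - engagementScore(a));
}
```

A ranker like this has no concept of extremism; it simply rewards whatever provokes the strongest reaction.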
Taking back control
Clearly, what keeps advertisers happy isn’t always what is best for human well-being. A 2020 survey found that two-thirds of Americans believe that social media has a negative effect on society. One key step towards changing that would be for users to control their own data. If companies no longer had the power, or the incentive, to sell personal data, they would instead be pushed to serve their users better.
The open-source Decentralized Social Networking Protocol (DSNP) is intended to provide the foundation for such a world. It creates the infrastructure for a secure and universally accessible social graph that is disconnected from the financial incentives of social media companies. It is managed using open-source protocols that would allow users to interact with people on any application they choose, much as email works today.
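To make the idea of a user-owned, application-independent social graph more concrete, here is a minimal sketch of follow relationships recorded as signed records held by the user rather than by any one platform. The types, helper functions and signing scheme are illustrative assumptions for this article, not DSNP’s actual specification.

```typescript
// Minimal sketch of a user-owned, portable social graph.
// Types and helpers are illustrative assumptions, not DSNP's actual spec.

import { createHash } from "crypto";

// A follow relationship is a record owned by the user, identified by a
// stable public identifier rather than an account on any one platform.
interface GraphEdge {
  fromUserId: string;   // the follower's public identifier
  toUserId: string;     // the account being followed
  createdAt: number;    // Unix timestamp
  signature: string;    // proves the edge was authored by fromUserId
}

// Placeholder signing step; a real protocol would use the user's private key
// and a proper signature scheme rather than a hash.
function sign(payload: string, privateKey: string): string {
  return createHash("sha256").update(privateKey + payload).digest("hex");
}

// The user, not a platform, creates and holds this edge. Any client
// application that can read the shared graph can render the same feed.
function follow(fromUserId: string, toUserId: string, privateKey: string): GraphEdge {
  const createdAt = Date.now();
  const payload = `${fromUserId}->${toUserId}@${createdAt}`;
  return { fromUserId, toUserId, createdAt, signature: sign(payload, privateKey) };
}
```

Because the graph lives with the user rather than inside any one company’s database, switching apps no longer means losing your connections, and no single company’s advertising model sits between you and the people you follow.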
This is just one part of a solution that will require broad consensus. Creating a social media landscape that is a force for good, rather than one that incentivizes misinformation and polarization, will depend on collaboration between many stakeholders. We need a movement that brings together partners from technology, academia, social impact and the arts to transform the internet.
We’re advocating for a people-centric internet, which means getting people involved, including you. Together, we can make social media more open, transparent and safe, but only if more people join us in a push for change. It’s time to stop scrolling and start acting.