It has not been a good few weeks for Facebook. The company has finally agreed to supply Congress with information about political advertisements bought during the US presidential campaign, apparently with Russian money and promoting inauthentic links and stories. Facebook's weak response has been noted and castigated as a threat to democracy. But the issues go much deeper than this story, or indeed than the allegations of fake news and the creation of echo chambers; the key clues, I believe, lie in two academic papers.

The first, with the descriptive title "What Makes Online Content Viral?", analysed all articles published in the New York Times over a three-month period to understand why some articles are shared more often than others. Controlling for as many factors as they could think of, the authors reach a clear main conclusion:
"While more awe-inspiring (a positive emotion) content is more viral and sadness-inducing (a negative emotion) content is less viral, some negative emotions are positively associated with virality. More anxiety- and anger-inducing stories are both more likely to make the most e-mailed list… Consistent with our theorizing, content that evokes high-arousal emotions (i.e. awe, anger and anxiety), regardless of their valence, is more viral."
This conclusion should not surprise anyone who has spent time on social media, but it is useful to have the data. Social media companies have a vested interest in our spending as much time as possible using their tools, and they encourage us to post material that will be liked and shared. The way to achieve this is simple – evoke awe, anger, anxiety or something similar, and the rest will follow. The logical conclusion is that the most popular material on social media is the most emotive; considerations such as whether the material is useful or factually correct come a long way behind. Journalists have always known this, of course, but social media lacks even the most cursory standards of fact-checking, and material can be shared at a speed impossible in the pre-internet age. To use the Silicon Valley terminology, the fact that social media, particularly Facebook, is usually swamped with emotive and misleading material is a feature, not a bug.
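To make the mechanism concrete, here is a toy sketch of keyword-based emotion scoring of the kind such studies rely on. This is not the authors' code; the word lists and the scoring rule are invented for illustration only.

```python
# Toy emotion scoring: count category keywords per word of text.
# Word lists below are invented stand-ins, not the paper's actual lexicon.
HIGH_AROUSAL = {
    "awe": {"astonishing", "breathtaking", "miracle", "vast"},
    "anger": {"outrage", "scandal", "betrayal", "fury"},
    "anxiety": {"threat", "crisis", "risk", "fear"},
}
LOW_AROUSAL = {
    "sadness": {"grief", "loss", "mourning", "sorrow"},
}

def emotion_scores(text):
    """Fraction of words matching each emotion category's keyword list."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    n = max(len(words), 1)
    return {cat: sum(w in vocab for w in words) / n
            for cat, vocab in {**HIGH_AROUSAL, **LOW_AROUSAL}.items()}

def high_arousal_index(text):
    """Aggregate awe + anger + anxiety - the emotions linked to virality."""
    s = emotion_scores(text)
    return s["awe"] + s["anger"] + s["anxiety"]

# An anger/anxiety-laden headline scores higher than a sadness-laden one,
# mirroring the paper's finding about which content spreads.
print(high_arousal_index("Outrage as scandal reveals threat to voters"))
print(high_arousal_index("Grief and sorrow after a quiet loss"))
```

The real study used far richer text coding and regression controls, but the core input is just this kind of per-article emotion score.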
The second paper is scarier still, though its title, "Experimental evidence of massive-scale emotional contagion through social networks", is only mildly sinister. To a certain type of psychologist, Facebook represents a wonderful laboratory for testing theories, and the test here was whether people who read sad or happy posts (identified by the use of certain keywords) are then more likely to produce sad or happy posts themselves (again identified through keywords). How do you test this? You take nearly 700,000 Facebook users and manipulate the newsfeed algorithm (Facebook's top-secret way of determining the order in which posts are seen) for two groups – one "sad", one "happy" – while keeping a third group unchanged as a control. None of these people, of course, were aware of the experiment, which led to accusations that it breached ethical guidelines. But what is even more alarming than Facebook's willingness to experiment on its users without their knowledge is that it worked: the paper's summary diagram tells the story.
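The design described above can be sketched in a few lines. This is emphatically not Facebook's code: the keyword lists are invented stand-ins for the word-count tool the paper actually used, and the filtering is a simplified all-or-nothing version of the paper's probabilistic omission of posts.

```python
# Toy sketch of the experiment's structure: three conditions, two of which
# filter the feed by keyword-detected sentiment, one left alone as control.
# Keyword lists are invented for illustration.
POSITIVE = {"happy", "love", "great", "wonderful"}
NEGATIVE = {"sad", "awful", "lonely", "terrible"}

def sentiment(post):
    """Crude keyword classifier: positive, negative, or neutral."""
    words = set(post.lower().split())
    if words & POSITIVE and not words & NEGATIVE:
        return "positive"
    if words & NEGATIVE and not words & POSITIVE:
        return "negative"
    return "neutral"

def filtered_feed(posts, condition):
    """Suppress one emotional valence, mimicking the treated newsfeeds."""
    if condition == "reduce_positive":
        return [p for p in posts if sentiment(p) != "positive"]
    if condition == "reduce_negative":
        return [p for p in posts if sentiment(p) != "negative"]
    return list(posts)  # control: feed unchanged

posts = ["I feel happy today", "This is awful news", "Meeting at noon"]
print(filtered_feed(posts, "reduce_positive"))
# keeps only the negative and neutral posts
```

The experiment's measurement step then applied the same keyword classifier to what each group of users subsequently posted, which is how the contagion effect was detected.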
So there is a small but clearly measurable impact on the emotional state of Facebook users resulting from manipulation of their newsfeed algorithm. Or, to put it another way, we now have a company – a corporation, remember, committed to maximising profits for its shareholders – with the capability to measurably influence the emotional state of over two billion people, without any of them being aware of it. If that doesn't scare you, I'm not sure you have been paying attention.
The truth is, we have created a monster. The question now is whether it can be reined in before it becomes too powerful to deal with. The most hopeful signs so far have come from the European Commission, where Margrethe Vestager, the formidable Commissioner for Competition, is one of the few officials willing to take on the technology companies – including fining Facebook for making misleading statements about its acquisition of WhatsApp. But what will really make a difference is whether we, the general public, continue to invest our time, energy and attention in the company. A great deal may well depend on this.
Berger, J. & Milkman, K.L. (2012) "What Makes Online Content Viral?", Journal of Marketing Research, 49(2), 192–205.

Kramer, A.D.I., Guillory, J.E. & Hancock, J.T. (2014) "Experimental evidence of massive-scale emotional contagion through social networks", Proceedings of the National Academy of Sciences, 111(24), 8788–8790.