Number of social network users worldwide: 2.34bn*
Percentage of global population using Facebook: 22.9%
Political social media spending for 2016: estimated $1 billion
I think we can all agree that social media has a huge effect on politics, and vice versa. Not only have we seen #ArabSpring, but we’ve also seen the rise of fake news, intense political bubbles, and outrage culture. There’s so much politics on our feeds that 37% of social media users are actually tired of all the political content they see, according to the Pew Research Center.
But what exactly does social media do to politics? As far as I know, there is no rigorous quantitative research. It’s all very anecdotal, and I suspect it’s always going to be hard to map a tweet to a swing vote.
Instead, I looked at the question the other way: what does politics do to social media?
Firstly, a brief look at the underlying nature of social networks.
I see two types of social networks: noisy networks and curated networks.
Noisy networks are essentially open forums. They’re focused on content streaming, usually close to real time. Think Twitter, or a basic chat room.
The problem noisy networks have is relevance at scale. It’s fine when it’s just four people shouting at each other, but as the number of users increases, it becomes increasingly difficult for a user to find what they really need. And thus you get user drop-off. Curated networks try to combat this by either organizing information into strict hierarchies (forums) or delivering personalized feeds (Facebook).
There are real trade-offs here. The moment you impose rules, or have an algorithm sort through the feed for you, you’re taking away some amount of freedom of speech. On the plus side, things become more relevant to users and the network becomes more usable at scale – hence Facebook’s success.
| Noisy networks | Curated networks |
|---|---|
| Freedom of speech | Relevance |
| Closer to real-time | Not as much |
| Grouping ideas together | Grouping people together |
In a curated network:
- Success relies on relevance
- Relevance relies on serving people what they want to see
- Any major divider (like politics or religion) creates camps that the system must cater to
- As you interact with other people in these camps, you become increasingly less likely to see content from the other camp
Confirmation bias kicks in. This creates powerful echo chambers, or bubbles. We’ve all seen this happen.
What you see, then, is a result of the circles / chambers / bubbles you’re part of.
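The feedback loop described above can be sketched in a few lines. This is a deliberately naive toy, not Facebook’s actual algorithm – the camp labels, post structure, and scoring rule are all invented for illustration. It ranks posts by how much the user has already engaged with each camp, so the dominant camp crowds out the rest:

```python
from collections import Counter

def rank_feed(posts, interaction_history, top_n=3):
    """Score each post by how often the user has already engaged
    with content from that post's camp, then serve the top posts."""
    camp_counts = Counter(p["camp"] for p in interaction_history)
    return sorted(posts, key=lambda p: camp_counts[p["camp"]], reverse=True)[:top_n]

posts = [
    {"id": 1, "camp": "red"},
    {"id": 2, "camp": "blue"},
    {"id": 3, "camp": "red"},
    {"id": 4, "camp": "blue"},
]
# A user whose history is mostly "blue" engagement...
history = [{"camp": "blue"}] * 5 + [{"camp": "red"}]
feed = rank_feed(posts, history)
# ...gets a feed led by "blue" posts, and every further interaction
# skews the counts harder: that is the feedback loop.
```

Here `feed` comes back ordered [2, 4, 1] – both “blue” posts first. Because each new click would feed back into the interaction history, the skew is self-reinforcing.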
On Facebook, this translates to the kind of Red Feed / Blue Feed divisions that the Wall Street Journal mapped out during the election.
This is how we end up with bubbles that nourish and sustain the flat-earthers, Brexiteers, anti-vaxxers – and other idiots like that. You’re literally kept in your own bubble because the network relies on your feeling safe amongst your own people.
In a noisy network:
- Success relies on ideas going viral (#tags)
- Can transcend geographical and bubble communities
- Any major divider (like politics or religion) creates disruptions in what people see and consume
- Can act as an open broadcasting tool and an excellent store of data
- Still relies on network effects and is not immune to confirmation bias, but resists it better than curated networks
- Open to botting and spam
Noisy networks generally do a lot better at providing people with differing perspectives – because the people you follow tend to be complete strangers, and that gives them a higher chance of not being in your groupthink circles.
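By contrast, a minimal noisy-network stream ignores the user entirely. In this toy sketch (again with invented camp labels and post structure), ranking is purely by recency, so content from outside your bubble surfaces just as readily as content from inside it:

```python
def noisy_feed(posts, top_n=3):
    """A pure reverse-chronological stream: no personalization,
    so the user's interaction history plays no role at all."""
    return sorted(posts, key=lambda p: p["timestamp"], reverse=True)[:top_n]

posts = [
    {"id": 1, "camp": "red",  "timestamp": 100},
    {"id": 2, "camp": "blue", "timestamp": 200},
    {"id": 3, "camp": "red",  "timestamp": 300},
]
feed = noisy_feed(posts)
# Both camps appear in the feed; ordering depends only on recency.
```

The flip side is visible in the same function: when nothing but time decides ranking, whoever posts most often occupies most of the stream.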
Am I a fan of noisy networks? Yes, sometimes. Because in theory, they let the underdogs have a voice. They let people leak information from warzones, challenge Presidents, and, as I personally observed during the Sri Lankan Election, they let journalists have as much control over the electoral discussion as the two presidential candidates.
As an example, here’s a screenshot of a network map I generated around Sri Lankan political discussions. Here, a journalist and a citizen journalism site are competing – quite well – for control over a Twitter narrative mostly ruled by propaganda from two presidential candidates.
But noisy networks can be broken, and in completely disastrous ways. Noisy networks are like a classroom full of people shouting: whoever shouts loudest and most often gets heard the most. Case in point: Donald Trump.
This is a screenshot from when I was working on the WSO2 Election Monitor. We analyzed close to a million tweets a day around the 2016 US Presidential election. Part of the analysis was network community graphing, with each tweep’s node sized by the frequency and attention their tweets got.
Every day, it looked like this. Trump was larger than Fox News, than the opposition, than every media agency put together. He owned the discussion.
Shout long and hard enough and you win.
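The “loudest voice wins” dynamic is easy to quantify. Here is a toy share-of-voice measure – a crude stand-in for the frequency-and-attention sizing used in the graphs above (the account names are invented, and a real analysis would also weight by retweets, replies, and reach): each account’s share is simply its fraction of total tweet volume.

```python
from collections import Counter

def share_of_voice(tweets):
    """Each account's share of the conversation, measured purely
    as its fraction of the total tweet volume."""
    counts = Counter(t["author"] for t in tweets)
    total = sum(counts.values())
    return {author: n / total for author, n in counts.items()}

stream = [{"author": "loud"}] * 8 + [{"author": "a"}] + [{"author": "b"}]
sov = share_of_voice(stream)
# "loud" owns 80% of the visible discussion just by posting most often.
```

No algorithmic trickery required: in a volume-ranked stream, posting eight times as often simply makes you eight times as visible.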
What does this mean?
- Content, discussion and argument happen on noisy networks – the true open forums
- These stories then spread to the echo chambers of curated media, where the massive audiences are
- Confirmation bias is a feature in both networks but it is easier for a new or different idea to be seen on a noisy network
In the real world
- Dominant social media (ie: Facebook) is more likely to cement political worldviews than to open them up for discussion, and offshoots of this, like fake news, are likely to spread even further unless Facebook’s new checks are effective
- 2+ billion people (and growing) are reachable, and influenceable, through social media
- Noisy networks are the closest thing we have to freedom of speech on the Internet
But this is also a two-way street.
Data scientists like Michal Kosinski (whose research Cambridge Analytica’s methods were built on) do this:
In 2012, Kosinski proved that on the basis of an average of 68 Facebook “likes” by a user, it was possible to predict their skin color (with 95 percent accuracy), their sexual orientation (88 percent accuracy), and their affiliation to the Democratic or Republican party (85 percent). But it didn’t stop there. Intelligence, religious affiliation, as well as alcohol, cigarette and drug use, could all be determined. From the data it was even possible to deduce whether someone’s parents were divorced.
….Before long, he was able to evaluate a person better than the average work colleague, merely on the basis of ten Facebook “likes.” Seventy “likes” were enough to outdo what a person’s friends knew, 150 what their parents knew, and 300 “likes” what their partner knew. More “likes” could even surpass what a person thought they knew about themselves.
– from “The Data That Turned the World Upside Down” by Motherboard.Vice.com
And Cambridge Analytica worked for Donald Trump.
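To make the likes-to-traits idea concrete, here is a toy counting model. It is purely illustrative: the page names and trait labels are invented, and the actual studies used regression models trained on millions of users, not this scheme. The intuition it captures is that each like “votes” for the trait values it has historically co-occurred with:

```python
from collections import defaultdict

def train_like_model(users):
    """For each page-like, count how often it co-occurs with
    each trait value in the labeled training data."""
    weights = defaultdict(lambda: defaultdict(int))
    for u in users:
        for like in u["likes"]:
            weights[like][u["trait"]] += 1
    return weights

def predict(weights, likes):
    """Sum the co-occurrence counts of a new user's likes and
    return the trait value with the highest total."""
    scores = defaultdict(int)
    for like in likes:
        for trait, n in weights[like].items():
            scores[trait] += n
    return max(scores, key=scores.get) if scores else None

train = [
    {"likes": {"page_a", "page_b"}, "trait": "dem"},
    {"likes": {"page_a"},           "trait": "dem"},
    {"likes": {"page_c"},           "trait": "rep"},
    {"likes": {"page_c", "page_d"}, "trait": "rep"},
]
model = train_like_model(train)
prediction = predict(model, {"page_c", "page_d"})
# Likes overlapping the "rep" cluster classify the user as "rep".
```

With enough users and enough likes, even this crude counting starts to separate the camps, which is why the quoted accuracy figures climb so quickly as the number of likes grows.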
What about fake news? We have everyone from Macedonian teens (Wired) to Robert Mercer, the billionaire fake news king with links to Steve Bannon and Nigel Farage. If you know how the networks work, you can game them. And people do know how networks work. We are actively being gamed.
Scout.ai explored the idea of a weaponized propaganda machine. It’s actually already happening. Politics has figured out how to use the echo chamber, and the way things are set up, the people with the most money to throw at the problem win.
So, in the future:
- Polarization and bubble communities on Facebook are going to grow
- Sites where the bubbles can be traversed (eg: Reddit) will always exist, but automated newsfeeds will keep the majority from ever exploring them
- Outrage and clickbait will become viable tools for breaking into bubbles and injecting new narratives
- Public fact checking services will be a bona fide need
- We’ll have fake news empires that hire their services out to politicians and government propaganda units at will – the ‘weaponized propaganda machine’ will be a very real thing
- Bots (which can converse with superhuman frequency) are likely to be the dominant force in shaping political narratives on noisy networks
This presentation was given at the Social Media Influencer conference organized by Internews Sri Lanka in March, 2017.