To recap: Facebook collects data on its users. Lots and lots of data. Likes, shares, what we post about, who our friends are, those personal details we choose to share, etc. etc. It uses this data to micro-target advertising, which is the basis of its business model, allowing it to earn vast advertising revenues: the user does not pay for Facebook, because the user is the product for sale.
This we have known for a long time, and while many of us have maybe been uncomfortable with it, we’ve generally accepted it as the price of staying in touch, sharing news and experiences, and having this endless source of distraction from the boring old ‘real’ world to bury ourselves in.
The story gets more troubling with Facebook’s algorithms for what we do and do not see on our news feed: whose posts, what sort of posts, and which “suggested” posts from around the web that no one has specifically shared to our timeline. This helps create those famous ‘bubbles’ of like-minded people, ensuring that we are rarely troubled by news and opinions that challenge our existing worldview, but it also helps facilitate the spread of ‘fake news’. One study last year found that Facebook was the number one distributor of such fake news stories in the run-up to the 2016 US elections. And it seems that the old adage, that a lie is halfway round the world before the truth has got its boots on, has been supercharged in the social media age—and has now been scientifically verified in the case of Twitter at least.
Diving still deeper into the swamp, we come to ‘third party apps’ – which is where Cambridge Analytica come in. Third party apps (3PAs) are programs and applications developed by third parties that interact with Facebook. Candy Crush is one example of a very popular 3PA. This means you can access these apps from within Facebook; it also means that any website that lets you log in using Facebook is using a 3PA to do so. Those quizzes that appear in your Facebook feed, which you can click on and take via Facebook, and then post to your Timeline about which combination of two Hogwarts houses you are, are also 3PAs.
The problem is, they are nasty little Trojan horses. Whenever you use one, you agree to give the third party developer access to all your Facebook data. What is more, until 2014 when Facebook changed the rules, you agreed to give the developer access to all your Facebook friends’ data. The developer was only supposed to use this Friends’ data for purposes related to the app itself, not, for example, to create psychological profiles of 87 million Facebook users, of whom 71 million were from the US, and use them to micro-target election material and fake news so as to help get Trump elected. Which is exactly what Cambridge Analytica did, using the Friends’ data from 237,000 users who clicked on an innocuous-looking “personality test” on Facebook. (The test was designed by academic Aleksandr Kogan who, through his company Global Science Research, then shared the data with Cambridge Analytica.) This was against Facebook’s rules, but having been given these 87 million profiles, it didn’t require any clever hacking or scheming to go just a little bit beyond what Facebook allowed. Cambridge Analytica CEO Alexander Nix was even videoed boasting about how he helped get Trump elected. (He’s now been sacked, and banned from Facebook—scant consolation.)
Facebook, for their part, were warned about the misuse of their data back in December 2015, but did little or nothing about it.
The problem with writing an article like this is that the story keeps changing as you write it. Now it turns out that ‘malicious actors’ have probably got hold of information on ‘most’ of Facebook’s 2 billion users. They did so by getting a bot to plug a load of email addresses and phone numbers, acquired over the ‘dark web’ from a variety of data breaches, into Facebook’s profile search function, allowing them to connect the numbers and addresses to people’s names and whatever information those people choose to make public on their profiles.
Cambridge Analytica also did work for the Vote Leave campaign in the UK’s Brexit referendum in 2016. One whistleblower, Shahmir Sanni, alleged that Vote Leave used a company closely linked to CA, Aggregate IQ (AIQ), to violate campaign spending limits (AIQ was described by CA whistleblower Christopher Wylie as operating “almost as an internal department of Cambridge Analytica” at the time). A £625,000 donation from Vote Leave, the official umbrella organization of the pro-leave campaign, to another organization, BeLeave, was mostly spent on the services of AIQ, the same agency Vote Leave itself was using. Sanni alleged that the donation was not a genuine donation, but in practice was used directly in conjunction with Vote Leave’s own spending with AIQ. According to the Guardian, “British electoral law prohibits co-ordination between different campaign organisations, which must all comply with spending limits. If they plan tactics or co-ordinate together, they must have a shared cap on spending. Vote Leave strongly denies any such co-ordination.”
Cambridge Analytica also worked for the re-election campaign of Nigeria’s phenomenally corrupt ex-President Goodluck Jonathan—on whose watch the Armsgate scandal took place—using a virulently Islamophobic smear campaign against Jonathan’s ultimately victorious opponent, Muhammadu Buhari, accusing him of intending to impose Sharia Law throughout Nigeria. What possible damage to peace could that cause in a country almost evenly split between Muslims and Christians, generally coexisting very well, but with periodic outbreaks of inter-community violence?
As well as its advanced big-data algorithmic marketing tools, Cambridge Analytica also offered more low-tech services: ex-CEO Alexander Nix was secretly recorded on video for the UK’s Channel 4 News boasting about the use of bribery and sex workers to entrap politicians, as part of the suite of services offered to clients.
The Cambridge Analytica scandal has highlighted the massive privacy issues surrounding the vast amounts of data Facebook collect on all of us who use it, and the uses to which that data may be put by third parties with whom it is shared. Facebook clearly ignored the downsides of this for far too long, though they now seem, reluctantly, to be catching on and changing their practices.
But potentially even worse is the role of Facebook in promoting ‘fake news’. Researchers recently found that mass exposure to false stories about Hillary Clinton (through Facebook and elsewhere) may well have done enough to swing a very tight Presidential election in Trump’s favor in 2016, which as we have noted represents a very grave threat to world peace indeed. But this is far from the only example, or the worst.
Perhaps the most devastating case of the misuse of Facebook to spread false news, in terms of its outcomes, is in Myanmar. Facebook, over the last few years, has rapidly become the number one news source for a large proportion of Myanmar’s population, and indeed for many their sole gateway to the internet. And Facebook has been extensively, and very effectively, used by far-right Burmese nationalist groups to spread hateful propaganda against the country’s Rohingya population, and Burmese Muslims more generally, so that demonization of the Rohingya as subhuman and malevolent has become a mainstream consensus among Myanmar’s majority population. Thus, the army’s campaign of mass ethnic cleansing, rape, and killing against the Rohingya has faced very little opposition within the country.
Facebook, for their part, have done little or nothing to combat this virulent epidemic of violent hatred spread through their platform. While it would be too much to blame Facebook for causing the ethnic cleansing of the Rohingya, they have certainly been one of its key enablers.
Now I’ll just go share this article on Facebook…