The New Normal - #DataWeapons and #WeaponizingData - a cautionary tale of Facebook, Cambridge Analytica

We think we will look back on this period of humanity (we are not sure what historians or chroniclers will call it: "early 2018", "pre-2020", "precipice of artificial intelligence", "dawn of robotic intelligence", "before quantum computing", "birth of blockchain decentralized autonomous organizations (DAOs)"), but we believe it will mark a turning point in cyber warfare, in threats and vulnerabilities, and in the weaponizing of data for diabolical tactics and nefarious schemes. For now, we can call it #DataWeapons #WeaponizingData. --Aubrey Zhang Mikhailov


The new normal - #DataWeapons and #WeaponizingData - is a cautionary tale from Facebook-Cambridge Analytica: evolving threats and vulnerabilities, diabolical tactics, and nefarious schemes that undermine confidentiality, integrity, and inherent goodness, with a growing presence and grave societal danger.



Facebook in recent days has insisted that what Cambridge did was not a data breach, because it routinely allows researchers to have access to user data for academic purposes and users consent to this access when they create a Facebook account.

But Facebook prohibits this kind of data from being sold or transferred "to any ad network, data broker or other advertising or monetization-related service." It says that is exactly what Dr. Kogan did when he provided the information to a political consulting firm.

Dr. Kogan declined to provide The Times with details of what had happened, citing nondisclosure agreements with Facebook and Cambridge Analytica.

The data, a portion of which was viewed by The New York Times, included details on users’ identities, friend networks and “likes.” The idea was to map personality traits based on what people had liked on Facebook, and then use that information to target audiences with digital ads.
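As an illustration only, the like-to-trait idea above can be sketched in a few lines. Everything here is invented for the example: the page names, the trait labels, the weights, and the ad variants are hypothetical placeholders, not anything from the actual data or Cambridge Analytica's real models.

```python
# Toy sketch: map a user's page "likes" to personality-trait scores,
# then pick the ad variant matching the strongest trait.
# All pages, traits, weights, and ad labels below are invented.

LIKE_TRAIT_WEIGHTS = {
    "Skydiving Club":  {"openness": 0.6, "neuroticism": -0.2},
    "Quiet Book Nook": {"openness": 0.3, "extraversion": -0.4},
    "Party Planners":  {"extraversion": 0.7},
}

AD_BY_TRAIT = {
    "openness": "novelty-framed ad",
    "extraversion": "social-proof ad",
    "neuroticism": "fear-framed ad",
}

def score_traits(likes):
    """Sum hypothetical trait weights over a user's likes into a profile."""
    profile = {}
    for page in likes:
        for trait, weight in LIKE_TRAIT_WEIGHTS.get(page, {}).items():
            profile[trait] = profile.get(trait, 0.0) + weight
    return profile

def pick_ad(likes):
    """Choose the ad variant for the user's highest-scoring trait."""
    profile = score_traits(likes)
    if not profile:
        return "generic ad"
    top_trait = max(profile, key=profile.get)
    return AD_BY_TRAIT[top_trait]
```

For example, a user who likes "Skydiving Club" and "Quiet Book Nook" scores highest on "openness" in this toy model and would be shown the "novelty-framed ad". The real systems reportedly trained statistical models on millions of profiles; this sketch only conveys the shape of the pipeline, not its scale or accuracy.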

Researchers in 2014 asked users to take a personality survey and download an app, which scraped some private information from their profiles and those of their friends, activity that Facebook permitted at the time and has since banned.

The technique had been developed at Cambridge University’s Psychometrics Center. The center declined to work with Cambridge Analytica, but Aleksandr Kogan, a Russian-American psychology professor at the university, was willing.

Dr. Kogan built his own app and in June 2014 began harvesting data for Cambridge Analytica.

He ultimately provided over 50 million raw profiles to the firm, said Christopher Wylie, a data expert who oversaw Cambridge Analytica's data harvesting. Only about 270,000 users, those who participated in the survey, had consented to having their data harvested, and even they were told it was being collected for academic use.

Facebook said no passwords or “sensitive pieces of information” had been taken, though information about a user’s location was available to Cambridge.

Cambridge Analytica gathered this information to develop “psychographic” profiling tools, which it claimed could tailor political ads to users’ personality traits. “We exploited Facebook to harvest millions of people’s profiles,” whistleblower Christopher Wylie told The Observer. “And built models to exploit what we knew about them and target their inner demons. That was the basis the entire company was built on.”

It’s a great quote. But this weekend’s reports suggest these methods might not have actually been used in the 2016 US election. (In March 2017, a New York Times article said psychographics weren’t used; recent articles offer a somewhat more muddled picture.) Still, is it even possible to target a person’s inner demons using Facebook data? How afraid should we be of sophisticated psy-ops being deployed at scale, thanks to both the data we willingly give to Facebook and Facebook’s apparent inability to protect the people who use it? We asked Facebook how many other researchers had access to this data and if Facebook was reviewing those projects to see if misuse occurred elsewhere. Facebook hasn’t responded.


Taken together, it seems like Facebook was taken in by a shady firm that misused data and lied about it. When Facebook found out, it did nothing. And making matters worse, we can't even point at Cambridge Analytica's deception as the reason Trump was elected: a closer look at its methods suggests they might not even work.
