Facebook is not only an unmitigated cesspool of data privacy failures; for a time, the tech giant also dabbled in the emotional manipulation of its users. A study, conducted in 2012 and widely reported in 2014, including on NPR, described an experiment in which Facebook researchers manipulated the news feeds of more than 600,000 users.
In true experimental fashion, they ran an A/B split test: some users saw more positive items in their feeds, while others saw more negative ones. Unsurprisingly, a trend emerged: people shown more negative news posted more negatively; people shown more positive news posted more positively.
The New Scientist reported: “The research means ‘emotional contagion’ can happen online, not just face to face.” It went on to add: “The effect was significant, though modest. Ke Xu of Beihang University in Beijing has studied emotional contagion on Chinese social networks. He says [Facebook’s Adam] Kramer’s work shows that we don’t need to interact in person to influence someone’s feelings.” If this comes as a surprise, then you haven’t been paying attention.
Are we really surprised by what Facebook does with our data anymore?