Prioritizing Privacy
How I learned that Facebook failed
Unless you have been living under a rock, you know that the recent woes of tech and social media giant Facebook have dominated news cycles on days when chemical weapons, nuclear proliferation, and North Korean summits might otherwise have captured the American consciousness. The headline:
Facebook Doesn’t Care About Users’ Privacy
(and they probably never did).
In fact, the company misappropriated the personal data of as many as 87 million users, and then took a couple of years to admit it.
The blunder has not gone unnoticed by other tech titans. Speaking at the China Development Forum in Beijing one Saturday afternoon in March, Apple CEO Tim Cook offered his hot take on the situation and on how data affects human lives:
“This certain situation is so dire and has become so large that probably some well-crafted regulation is necessary. The ability of anyone to know what you’ve been browsing about for years, who your contacts are, who their contacts are, things you like and dislike, and every intimate detail of your life; from my own point of view, it shouldn’t exist.”
Needless to say, Zuckerberg has taken a beating; so too have Facebook shares, suffering a precipitous drop of 4.4% since the unveiling of the Cambridge Analytica scandal. While Facebook users are now looking more closely at their app settings, there has also been a noticeable exodus from the platform. Millennials were already less enthralled with Facebook than with newer social media platforms like Instagram (also owned by Facebook) and Snapchat, so a loss of active users over these privacy concerns is a black eye for what is arguably the world’s most visible social media platform. And it is costing the company not only market value but also ad sales.
Zuckerberg’s public comments (outside of his congressional testimony) have been largely tone-deaf and saccharine sweet. They range from Facebook posts professing that “we have a responsibility to protect your data and if we can’t, then we don’t deserve to serve you” to question-dodging that would make any career politician proud. In an interview with The New York Times about the Cambridge Analytica scandal, he offered the following response:
“Are there other Cambridge Analyticas out there? Were there apps which could have gotten access to more information and potentially sold it without us knowing or done something that violated people’s trust? We also need to make sure we get that under control.”
So, in other words: yes, there are more coming. Can we really trust an organization that doesn’t immediately accept responsibility for safeguarding private user information, knowing that information may have been used to undermine our very democracy?
The very fact that more people clicked on “fake news” (planted false stories and claims) than on actual news stories should have set off huge red flags at Facebook. Truth be told, if the company had had an Information Governance (IG) program in place and had truly been trying to maintain the integrity of our public news, and the integrity of the company itself, then this likely wouldn’t have happened.
The Lasting Effect of Fake News
News of this privacy breach comes in the wake of the outing of Cambridge Analytica and the role it may have played in politically oriented ads. To hear Facebook tell it publicly, the company has been trying to combat the rise of fake news (despite famously denying that it played any role in the election).
Ungoverned, false information lingered because of the clicks it generated, not because the stories had been appropriately sourced and verified; that is the primary cause of the rise of fake news that plagued Facebook and spilled over into casual conversation. Fake news is best understood as giving oxygen to an idea regardless of its veracity: such stories were likely accelerated, and allowed to fester, by Facebook’s ineffective third-party fact-checking efforts. Without access to the results of those third-party editors’ work, we cannot understand the effect in its totality.
You might be wondering why we can’t apply more pressure on Facebook to turn over the data, or at least share the full results. The reason will irritate you: we can’t do much about it because, as a private corporation, Facebook is not obligated to release the data. That is why GDPR-like legislation is being proposed in the U.S. at both the state and national levels. Facebook claims that by not revealing this internal data it avoids the risk of exposing private user data, which is hypocritical at best, given what we now know about the Cambridge Analytica scandal.
Can we genuinely believe that Facebook is really prioritizing privacy, as its new ads suggest?
We say no, but draw your own conclusions, sports fans.