Fixing the terms and conditions of privacy

Backlash following the privacy issues with Facebook has brought into question the freedom corporations have regarding the use of user data. (Olivier Douliery/Abaca Press/TNS)

Staff Writer
In a 2010 interview with TechCrunch, Mark Zuckerberg stated that he had taken an “about face” on privacy and argued that privacy is no longer a “social norm.” More specifically, he said “people have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people . . . [which] evolved over time.” At the time, his statements were not particularly surprising, considering they came soon after a highly controversial decision to change the privacy settings of 350 million Facebook users.
And now, in the aftermath of revelations about the data firm, Cambridge Analytica, Facebook seems to have done another “about face.” Early in his personal response to the crisis, Zuckerberg wrote that “we have a responsibility to protect your data, and if we can’t, then we don’t deserve to serve you,” a statement that signals not only a major change in the company’s talking points, but a tacit admittance that their long-running business model of selling users’ information to advertisers might be a violation of their users’ privacy.
Cambridge Analytica is a data firm largely funded by Robert Mercer, a wealthy Republican donor, and Stephen Bannon, former advisor to President Donald Trump.
From 2014 to 2015, it worked with Cambridge University professor Aleksandr Kogan to collect data on Facebook users. To that end, Professor Kogan created a personality test app for Facebook. Harmless enough, except it had a secret “feature”: by taking the test, you were giving the app access to your friends’ profiles as well. This could be turned off, but it’s hard to turn off a function that you didn’t know existed. Facebook put a stop to these practices and even rejected the second version of Kogan’s app, but by that point, he had collected data from about 87 million profiles, which he sold to Cambridge Analytica.
It should be noted that Kogan has taken assignments from the Russian government. And that at least one employee of Palantir, a data firm founded by major Trump supporter Peter Thiel, was involved in the decisions that led to the data purchases. And that the purchased data was used by several political campaigns, including the Trump campaign and the pro-Brexit campaign.
Thus, this issue is different from Facebook’s previous privacy scandals. It is clear that Facebook’s massive platform has made it possible to gather information on people on an unprecedented scale. Our nation’s institutions finally seem to be realizing this, as Mark Zuckerberg appeared before Congress twice this week. Those appearances were marked by inept questioning, ducked answers and vague platitudes, yet they may still represent a meaningful shift in how our government approaches tech corporations. The government absolutely needs to do more to investigate and regulate Facebook, but it shouldn’t be focusing exclusively on Facebook. The Internet has created a new age of technology giants, and society needs to be willing to do more to rein these corporations in to prevent abuses of power and privacy.
At the moment, there are plenty of people clamoring for greater regulation of Facebook and how it uses its users’ data, though there is just as much disagreement on what those changes should be. The company and its defenders are asking for time and faith that they can still fix the system they have created.
In all fairness to the company, Facebook has made significant strides in fixing its privacy issues. After all, by the time the second version of Professor Kogan’s app was finished, the gaps in their system had been closed. Not quickly enough, obviously, but the company was at least responsible enough to make that effort. And with the current crisis, isn’t it possible that these corrections might start coming faster?
On the other hand, even if the Cambridge Analytica scandal itself is now behind us, it still draws attention to the unique position that the company is in. By its sheer scale, it influences the nation to a point that many find disturbing when considering that this is still a private enterprise.
By that same token, however, we also need to recall that the standards and rules that allowed data to be breached were standards and rules the Facebook users agreed to follow in their End User License Agreement. This would seem to imply that this is a disagreement that needs to be resolved by the company and its users. “It’s hard to read nutrition labels, right? It’s very difficult. But we have them for a reason, because there’s certain information that should be accessible to the consumer . . .” senior Neah Lekan noted.
The social media giant that Facebook has become needs to grow out of its past as a dorm room project. At his testimony, Zuckerberg said that “Facebook is an idealistic and optimistic company. For most of our existence, we focused on all of the good that connecting people can do.” But, he conceded, “it’s clear now that we didn’t do enough to prevent these tools from being used for harm as well.”
Perhaps regulations, whether imposed by the government or by the company itself, will threaten Facebook’s bottom line. That doesn’t mean that they shouldn’t be imposed. Environmental, worker-protection and antitrust regulations all show that the United States understands that there are points where the profit motive needs to be subordinated to the social good. If companies started walking into our houses and taking notes on how we lived to figure out how to write advertisements, we would call that an invasion of privacy. That doesn’t change because it’s online.