The data scandal explained

April 25, 2018

On March 17, The New York Times and The Observer reported on Cambridge Analytica’s use of personal data harvested by a Cambridge University researcher, purportedly for academic purposes, without the consent of the users involved.
As a result, Facebook banned Cambridge Analytica, a British political consulting firm, from advertising on its website. The Guardian also reported Facebook had been aware of this issue for two years, but did nothing to inform or protect the affected users.
The Guardian later reported that Joseph Chancellor, co-director of Global Science Research, the company that acquired the data of tens of millions of Facebook users and sold it to Cambridge Analytica, has been working for Facebook as a corporate quantitative social psychologist since around November 2015.
On April 4, Facebook CEO Mark Zuckerberg told reporters the personal information of up to 87 million people, most of them Americans, was improperly shared with Cambridge Analytica during the 2016 U.S. presidential election.
Consequently, Facebook announced sweeping changes to many of its Application Programming Interfaces (APIs) — the software tools that allow third parties, such as advertisers, to collect and use data directly from Facebook.
Dwayne Whitten, information systems clinical associate professor, said Facebook users should be more cautious about their personal data, even at maximum privacy settings.
“Users should never expect that their data won’t be shared or accessed by others,” Whitten said. “It’s been shown repeatedly that data has been sold, shared or accessed by companies without their permission. As users, we shouldn’t feel any comfort with our personal data in databases of any type.”
Paula deWitte, computer science associate professor of practice, said there are steps Facebook users can take to protect their personal information.
“Turn off apps on Facebook, which many will find as an inconvenience. If apps are left on, be suspicious of apps that ask certain information,” deWitte said. “Review privacy settings regularly. Who sees your posts? Friends? Public? Friends of friends? Do not post or store information that you do not want public — i.e., your phone number.”
Whitten said Facebook shares data with third parties, a fact many of its users are unaware of.
“Users should limit the amount of data they provide online and also what they store on their computers,” Whitten said. “It may be more convenient sometimes to take the options to ‘save password’ or to not use multi-factor authentication, but convenience isn’t worth the increased risk.”
DeWitte said she believes the privacy of social media users will become a tipping point in cybersecurity matters.
“We as Americans have a different philosophy of privacy than Europeans,” deWitte said. “The Europeans have just passed the General Data Protection Regulation, GDPR, that applies to citizens or residents of an EU country regardless of where they live. GDPR would have required explicit consent before Facebook and/or Cambridge Analytica used that data. Further, GDPR allows the data subject to remove the consent at any time and also institutes ‘the right to be forgotten.’ In the U.S., we do not adopt that philosophy yet — but as the public becomes more aware that their data is a commodity bought and sold, I believe there will be a public demand for change.”
Whitten also weighed in on the current state of internet consumer privacy and information security regulation.
“More government regulations are required to further increase consumer privacy,” Whitten said. “The privacy laws in the U.S. are better than they were a decade ago, but there is much more work to be done in this area.”
DeWitte said additional regulation of companies such as Facebook is on the way.
“Additional regulations are coming,” deWitte said. “Right now, Facebook has broken no U.S. laws. They violated an FTC consent decree they signed in 2011, essentially a contract with the U.S. government that said Facebook would adhere to certain terms and conditions in maintaining privacy data. They did not. The fines are $40,000 per violation, which could reach one trillion dollars. That is highly unlikely. Facebook must comply with GDPR for its European operations. It will be easier for Facebook to implement GDPR for all of its users than to try and segment them into GDPR or no-[GDPR] compliance.”
In 2011, the Federal Trade Commission accused Facebook of violating consumer privacy on its platform. Ultimately, the case was settled out of court, but the terms agreed upon in the 2011 consent decree did not prevent data abuse in the Cambridge Analytica case.
“Although there are many regulations in place, we can’t assume they will be followed,” Whitten said. “Stiffer penalties can help, as businesses should be less inclined to misuse data if the resulting penalties aren’t worth it. But as we see over and over, misuse of data will happen and we as users need to protect ourselves. Unfortunately, we can’t assume everyone in the world is an ethical Aggie.”
Asked whether the FTC is under-enforcing the 2011 decree in light of the current scandal, deWitte said the agency’s process simply takes time.
“No, it is not under-enforcing,” deWitte said. “The FTC process takes time — just as lawsuits and courts take time. They have to have an investigation which they are opening. The FTC is limited by the number of investigators they have. It is a slow, but thorough process.”