Faculty Viewpoints: Facebook, data and ethics

Tufts Gordon Institute faculty say that after Facebook’s data scandal, tech companies are going to face much tougher scrutiny around data privacy and ethics.

When the news surfaced in March that Cambridge Analytica, a political consulting firm, had gained access to the private data of millions of Facebook users without their knowledge, the social media platform was thrust into a global conversation around data use, privacy, and corporate ethics. Already faced with controversy over its role in the spread of fake news during the 2016 presidential election, Facebook apologized for mishandling user data, calling it a “breach of trust.”

Since the scandal broke, tech companies have had to face new questions about their own policies around data collection and privacy. Tufts Gordon Institute recently spoke to two faculty members, Gavin Finn and Sam Liggero, both Professors of the Practice, about what lessons tech leaders can learn and how companies need to be thinking about data and ethics moving forward.

Talk about Facebook’s business model. How does the company bring in revenue? What is the relationship between the company and its customers?

There are three constituents here: Facebook, users, and the customers. In Facebook’s case, the users are not the customers; the advertisers are the customers. Essentially, Facebook’s platform is delivering content so that they can collect data about the users and then sell those data to advertisers – that’s how they make money. – Gavin Finn

Facebook needs our information to make money. This has become a dilemma for companies whose main revenue is based on ads. If you’re an app developer and you want your app to be on Google or Facebook, Facebook has to share its information (your data) with you. So, if app developers don’t behave themselves, like Cambridge Analytica, then you have a problem. – Sam Liggero

What did Facebook do wrong? What was the reason for so much backlash? 

People were unhappy to learn that Facebook uses their data to begin with, and then they learned that Facebook didn’t secure it properly – their trust had been violated. Users are now seeing that Facebook is allowing third parties, sometimes pretty bad actors including potential foreign agents, to use information they shouldn’t have access to in the first place. People were already aware of financial vulnerabilities, credit card theft, and those kinds of threats online, but now they are starting to worry about this new threat. These are consequences we haven’t really contemplated yet. People are realizing that society as a whole has a vulnerability. – Gavin Finn

Facebook doesn’t seem to have a clear answer about whether it’s a tech company, a media company, a data platform, or something else. Is that a problem?

At the moment, what this means is that they’re not really regulated – that’s an issue that should be on the table in the next few years. They [Facebook] claim they are a tech company and are therefore exempt from regulations that media companies are subject to. My personal opinion is that they are a tech company, but they are also a media company, and therefore should be subject to the same guidelines as the New York Times or the Washington Post. – Sam Liggero

From the user’s perspective, what should they be aware of when interacting with these platforms?

People need to take more responsibility for their own actions online. They have been operating under the false assumption that things they do online are private. They need to recognize that those actions and data have a very significant chance of being used in ways they didn’t intend. – Gavin Finn

Mark Zuckerberg, Facebook’s CEO and Founder, recently had to testify in front of Congress. What role can public policy play in protecting privacy?

It was very clear that there’s a tremendous lack of understanding on the part of lawmakers and policymakers about the basic fundamentals of how the digital economy works. That’s the thing that scares me the most, because the technology is advancing faster than they can catch up. Without relevant and consistent public policy, companies are going to continue to experience a tremendous amount of stress when it comes to this kind of data collection, because there isn’t a specific framework that they know they need to adhere to. – Gavin Finn

In this new age of data-driven business models, how will the issue of privacy continue to be a factor?

They say there are 2.4 billion people on Facebook worldwide. That’s a massive reach. And for so many people in the world, it’s one of their sole sources of information. It has catalyzed so many movements, good and bad, given that it can reach so many people. Whether it’s the uprising in Egypt, or the #MeToo movement, it spreads like wildfire. Like any major technological advancement, however, it can be both good and bad. It’s not unlike splitting the atom – it can be a wonderful thing when used properly, but you can also harness that energy into a bomb. When you design something, you have to be cognizant of the fact that someone is going to try and beat the system. Facebook didn’t think this through enough – they were caught by surprise. – Sam Liggero

How should companies be thinking about ethics moving forward?

The people who manage the technology have an obligation to establish the safeguards to minimize the downsides – the role that Facebook is playing in society now is different than it was even a few years ago. Take Apple for example – they do a good job at preserving people’s privacy. It’s part of their values. – Sam Liggero

Companies themselves are now having to rethink what responsibilities they have if they are behaving in a way that positively impacts their shareholders financially, but in the long-term creates vulnerabilities and exposes society to significant threats. When it comes to data, I think everybody is in a very similar place – a lot of companies are struggling with this. Some companies are saying, yes, we do have a responsibility as corporate citizens to think about what the impacts are. – Gavin Finn

In terms of management and leadership, what lessons are there for other tech companies?

In Silicon Valley, you need people who are technically astute and people who are business astute, plus people who are looking at the company’s effect on society. And in any company, you need a balance of leadership styles. In my class, when we talk about innovation, we talk about having people who are ‘discoverers’ and ‘deliverers.’ Discoverers are innovating and coming up with the ideas. Deliverers are equally important – they have to make sure you can scale things and have operations all lined up. You need that balance. – Sam Liggero

What can companies like Facebook do moving forward?

Every company, from Microsoft to Adobe to the newspapers to the banks, has had issues with security breaches and unauthorized data access. I think it is important that a whole new series of security regimes become the de facto model for companies. Companies have to take security a lot more seriously than they ever have before. For now, the primary burden is on the individual. But over time, these companies are all going to have to be much, much better about security. That’s going to require a sort of industry revolution in the data security world. – Gavin Finn

In the Faculty Viewpoints series, Tufts Gordon Institute faculty share thoughts on the latest news and trends in leadership, business, technology, and entrepreneurship.