Facebook COO Sheryl Sandberg and CTO Mike Schroepfer took the stage at the Code Conference in Rancho Palos Verdes last night to discuss changes the company has made in the wake of the Cambridge Analytica scandal.
Interviewed by conference moderators Kara Swisher and Peter Kafka, Sandberg said the company now understands it was too late in responding to the privacy concerns raised by Cambridge Analytica. “We definitely know we were late. We said we are sorry, but sorry isn’t the point,” she said.
Instead, she said, it's important for the company to think about its responsibility in a different way. For the past 10 to 12 years, Facebook focused on building and enabling social experiences, but not enough on the bad that could be done on the platform. "Now we are understanding the responsibility we have and trying to act accordingly," she said.
There is a "fundamental tension" between tools that allow for easy, free expression and keeping people safe, Schroepfer added. Facebook wants to facilitate discussions, but also make sure the platform doesn't host hate speech or posts designed to manipulate elections.
The Cambridge Analytica Problem
The Cambridge Analytica issue dates back at least 10 years, when people were talking about wanting to "take data with them," so Facebook developed APIs to help them do so. In those days, Schroepfer said, Facebook was optimistic and focused on the fact that entrepreneurs could use its data to develop new applications. It also thought people who used those apps understood what was happening.
By 2014, Facebook decided to restrict access to such data, and started a more proactive review of applications. In December 2015, the company heard via media reports that Cambridge Analytica had gotten Facebook data and resold it. Why did Facebook learn about this from the press? Once the data left Facebook, the company could no longer observe what happened to it, Schroepfer said.
Facebook immediately disabled the app that scraped the data, and tried to figure out who had accessed it. When Facebook zeroed in on Cambridge Analytica, the firm insisted it had deleted the data, but that might not have been the case, Schroepfer acknowledged.
Now the company is more focused on theoretical ways people could get the data, he said, and has made investments in security, content review, and development.
Looking back, “we wish we had more controls in place,” Sandberg said. She noted that despite legal assurances from Cambridge Analytica that it had deleted the data, “we should have audited them.” She noted that in recent months, the company has made moves to do that, though this is on hold pending a UK government review that takes priority.
In the run-up to the 2016 election, people were mostly worried about spamming and phishing emails, Sandberg noted. The company took steps to avoid those problems, but didn't see these different, "more insidious threats" coming. Now it understands, and Facebook has taken very aggressive steps, she said.
Sandberg pointed to the deletion of fake accounts and its work with governments to help prevent similar occurrences around other elections, citing work in Alabama, Germany, and France. “We are showing that we are taking steps to make it better,” she said.
She also mentioned that while Facebook "always had" ways for users to control how they share data with applications, those controls now appear at the top of the News Feed instead of being hard to find. The company is also building new tools on top of them.
Didn’t See it Coming
Swisher asked what was wrong with the culture that it didn't understand the potential for misuse, pointing to Facebook Live missteps. Sandberg pushed back, saying "Live is a great example" of how the company fixes things. She noted that when Live launched, there was "a lot of good," but also things that were wrong. So now the company has human review of anything live within minutes. As a result, there have been posts taken down right away, and times when the company intervened and helped people.
Facebook has an open platform, and knows it will never prevent all the bad things. But Sandberg said the company could be more transparent and put more resources into making a safe community. The company has deleted 1.3 billion fake accounts; published the internal guidelines it uses to judge whether content should be taken down; and removes 99 percent of terrorist content and 96 percent of adult photos and sexual content, but only 38 percent of hate speech, before users report it to the company.
"We won't get it all," Sandberg admitted, but Schroepfer said Facebook has made more progress on this than he expected.
On the problem of fake news, Sandberg said much of it comes from fake accounts, so taking those down reduces the problem. Another big source is economically motivated, so the company is moving to kick bad actors out of its ad networks. She also said the company is working on being more transparent, so you can see the people behind any political or issue posts, which helps people spot and report more of what's wrong.
Asked about regulation, Sandberg said the company is already regulated with things like GDPR. “The question isn’t if there will be more regulation, but what kind of regulation,” she argued.
Sandberg said Facebook has spent a lot of money and put in a lot of complex systems to handle GDPR, and acknowledged that regulation can entrench big companies. She also worried about unintended consequences, noting that things like Caller ID were originally considered an invasion of privacy, so there was regulation preventing them.
Asked if Facebook is a monopoly and should be broken up, Schroepfer said there’s competition in the market, citing YouTube for video sharing, Twitter for posting public comments, and Snapchat, WeChat, and iMessage for messaging. “Consumers use the products they want,” he said, noting Facebook is “a very small part” of the overall advertising market.
Apple vs. Facebook
Asked about Apple CEO Tim Cook’s criticisms of the company, Sandberg said, “We strongly disagree with their characterization of our products and business model,” noting that as a free service, Facebook is available to people all over the world.
"We've looked at subscriptions and will continue to do so," Sandberg said, but the heart of the product is a free service.
Hearing about the terrible things that happen on the platform has made the company focus on new priorities, Schroepfer said. “It’s not fun, but it’s really important work.” He also said the focus on safety and security is the “biggest cultural shift” he’s seen at the company.
Facebook is focused on the need to provide safety, security, and integrity on the platform, but "we understand it will be an arms race," and there will be risks it did not see, Sandberg said. Facebook is "making huge investments that will hit our bottom line, but it's worth doing."
This article originally appeared on PCMag.com.