Facebook must accept some form of state regulation, acknowledging its status as a content provider somewhere between a newspaper and a telephone company, its co-founder Mark Zuckerberg has said.
He also claimed that an era of clean democratic elections, free of interference by foreign governments, is closer because Facebook now employs 35,000 staff to monitor content and security.
He admitted Facebook had been slow to understand the scale of the problem of foreign interference. He also defended his company from claims that it is leading to political polarisation, saying its purpose is to bring communities together.
Speaking at the Munich Security Conference, an annual high-level gathering of politicians, diplomats and security specialists, Zuckerberg sought to dispel the notion that his company had undermined democracy, weakened the social fabric or contributed to the weakening of the west through spreading distrust.
He said he supported state regulation in four fields: elections, political discourse, privacy and data portability. He said: “We don’t want private companies making so many decisions balancing social equities without democratic processes.”
Zuckerberg, who is due to hold fresh discussions with EU commission regulators on Monday, said that so long as “enough people have weighed in to come up with an answer” on regulation, the answer will not necessarily be right, but the process by which the decision is taken will itself help build greater trust in the internet.
By contrast, he said authoritarian states were introducing highly controlled forms of internet that limited free expression. “I do think that there should be regulation in the west on harmful content … there’s a question about which framework you use for this,” Zuckerberg said during a question-and-answer session at the event.
“Right now there are two frameworks that I think people have for existing industries – there’s newspapers and existing media, and then there’s the telco-type model, which is ‘the data just flows through you’, but you’re not going to hold a telco responsible if someone says something harmful on a phone line. I actually think where we should be is somewhere in between,” he said.
He pointed out Facebook publishes 100bn pieces of content every day, adding: “It is simply not possible to have some kind of human editor responsible to check each one.”
Facebook’s responsibility for its content was not analogous to that of a newspaper editor, he said. Without elaborating, he said some kind of third regulatory structure was required, sitting somewhere between newspapers and telephone companies.
Denying that Facebook’s selection of content led to confirmation bias by showing subscribers only information with which they agree, he said: “We try to show some balance of views.”
The average Facebook subscriber has about 200 friends, most of whom share similar views. “It is not a technology problem, it is a social affirmation problem,” he argued. What a user sees is determined by the balance of what their friends share, rather than by the user choosing it. “If your cousin has had a baby we had better make sure that is near the top,” he said.
He said his firm had been slow to see how foreign powers were interfering in elections, but Facebook was now spending an amount on security and content equivalent to the total value of the company in 2012, and he claimed this “massive effort” was producing a greater understanding of how to protect the integrity of elections. Nearly 1m accounts had been taken down, he said.
But he warned that new domestic actors, as well as foreign powers, were seeking to disrupt elections, and that these outside forces were becoming more sophisticated at covering their tracks by making their messages appear to come from a variety of IP addresses in different countries.
Facebook was also offering election campaigns a new free service in which a campaign provides the online account details of its staff; if one or more staff members are hacked, the whole campaign’s security can be raised to a higher level of protection.
He said the firm had shifted from a reactive to a proactive model, so much so that 99% of terrorist content is taken down before any external complaint is made. In the case of hate speech, 80% of content is removed without notification, but Facebook’s artificial intelligence was still struggling to distinguish the “small nuances” between content that was hate speech and content that was condemning hate speech, he said.
Asked by Ronen Bergman of the New York Times about Facebook and WhatsApp’s lawsuit against the Israeli spyware company NSO Group, Zuckerberg shrugged off the idea that the case could damage governments’ ability to work against terrorism. “They can defend themselves in court if they think what they did is legal,” he said, “but our view is that people should not be trying to hack into software that billions of people around the world use to try to communicate securely.”