As U.S. voters prepare to head to the polls Tuesday, the election will also be a referendum on Facebook.
In recent months, the social networking giant has beefed up scrutiny of what is posted on its site, looking for fake accounts, misinformation and hate speech, while encouraging people to go on Facebook to express their views.
"A lot of the work of content moderation for us begins with our company mission, which is to build community and bring the world closer together," Peter Stern, who works on product policy stakeholder engagement at Facebook, said at a recent event at St. John's University in New York City.
Facebook wants people to feel safe when they visit the site, Stern said. To that end, it is on track to hire 20,000 people to tackle safety and security on the platform.
As part of its stepped-up effort, Facebook works with third-party fact-checkers and takes down misinformation that contributes to violence, according to a blog post by Mark Zuckerberg, Facebook's CEO.
But the most popular content, often dubbed "viral," is frequently the most extreme. Facebook devalues posts it deems incorrect, reducing their virality, or future views, by 80 percent, Zuckerberg said.
Disinformation campaigns
Facebook recently removed accounts followed by more than 1 million people that it said were linked to Iran but made to look as if they had been created by people in the U.S. Some focused on the upcoming midterm elections.
The firm also removed hundreds of American accounts that it said were spamming political misinformation.
Still, Facebook has been criticized for what at times appear to be flaws in its processes.
Vice News recently posed as all 100 U.S. senators and bought fake political ads on the site. Facebook approved them all, then said it had made a mistake.
Politicians in Britain and Canada have asked Zuckerberg to testify on Facebook's role in spreading disinformation.
"I think they are really struggling and that's not surprising, because it's a very hard problem," said Daphne Keller, who used to be on Google's legal team and is now with Stanford University.
"If you think about it, they get millions, billions of new posts a day, most of them some factual claim or sentiment that nobody has ever posted before, so to go through these and figure out which are misinformation, which are false, which are intending to affect an electoral outcome, that is a huge challenge," Keller said. "There isn't a human team that can do that in the world, there isn't a machine that can do that in the world."
Transparency
While it has been purging its site of accounts that violate its policies, the company has also revealed more about how decisions are made in removing posts. In a 27-page document, Facebook described in detail what content it removes and why, and updated its appeals process.
Stern supports the company's efforts at transparency.
"Having a system that people view as legitimate and basically fair even when they don't agree with any individual decision that we've made is extremely important," he said.
The stepped-up efforts to give users more clarity about the rules and the steps to challenge decisions are signs Facebook is moving in the right direction, Stanford's Keller said.
"We need to understand that it is built into the system that there will be a fair amount of failure and there needs to be appeals process and transparency to address that," she said. (VOA)