Facebook is now rating its users according to how “trustworthy” it thinks they are as part of its effort to combat misinformation.
A user’s trustworthiness is scored on a scale from zero to one.
Tessa Lyons, the Facebook product manager in charge of fighting misinformation, said the company developed an assessment system that measures a user’s reputation, The Washington Post reported.
It was built to counter users who report content as fake news even when it isn’t.
Lyons said it is “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher.”
The trustworthiness score won’t be the only indicator of a user’s credibility. Lyons said it is just one among thousands of signals Facebook uses to understand which users flag content as problematic and which publishers users consider trustworthy.
Claire Wardle, director of First Draft, a research lab at the Harvard Kennedy School that fact-checks on behalf of Facebook, said the unknown criteria the platform uses to judge trustworthiness make people uncomfortable, but that more transparency around those criteria would allow people to game the system.
It is not publicly known what other criteria Facebook uses to determine who is creating fake news or spreading misinformation.
Samidh Chakrabarti wrote in a Facebook Newsroom post that the public originally wanted Facebook itself to determine what content counts as misinformation.
He added that Facebook doesn’t want to be the judge of what is and isn’t true, and he doubts the world would ever want it to be.
Instead, Facebook has opted for two solutions: hiring third-party fact-checkers and educating social media users so they don’t fall victim to fake news.
‘Wisdom of the Crowds’
Reporters Without Borders (RSF) secretary-general Christophe Deloire criticized Facebook for its survey, the Press Gazette reported.
“You cannot use polling and belief as the basis for establishing facts,” Deloire said.
BuzzFeed shared the survey and reported that a Facebook spokesperson said it was the only survey being used.
Below is the complete survey:
Do you recognize the following websites?
How much do you trust each of these domains?
- A lot
- Not at all
Zuckerberg believes the community is best placed to determine which sources are “most objective” and “broadly trusted.”
Deloire stated that it is dangerous to allow the public to decide what news sources are trustworthy.
He is doubtful that Facebook users are capable of being objective in judging what is and isn’t reliable.
It also leaves the system open to manipulation by bad actors.
The question of trust also needs to be examined, since different people define it differently in different contexts, Deloire said.
“Do audiences trust outlets to be accurate – or do they trust them to be on their side?”
Fake News or Wrong News?
Mark Zuckerberg told Recode in an interview that “it’s hard to impugn intent and to understand the intent.”
He drew a distinction between someone who intends to misinform people and a publisher that spreads information that is factually wrong but that it believes is right.
“I’m Jewish, and there’s a set of people who deny that the Holocaust happened,” Zuckerberg said, “… but at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong.”
In addition, he admitted that everyone gets things wrong when they speak publicly, himself included.
Zuckerberg doesn’t believe people should be taken off the platform just because they get things wrong or even get things wrong multiple times.
As long as someone’s content doesn’t organize harm against others, it will remain on the platform, even if people find it offensive.
Instead of removing such content, Zuckerberg said, Facebook has been reducing its reach in the News Feed.
“I think that we have a responsibility to be doing more there,” he said.