Last week, a story claiming that Ford Motor Co. was moving truck production from Mexico to Ohio went viral on Facebook. “The Trump Effect: It’s Happening Already!!” the Facebook user Right Wing News wrote. That story was actually based on a CNN report from 2015, before Donald Trump was even the Republican nominee for president.
The post was neither entirely true nor completely false. It fell into a gray area in the nuanced world of fact-checking, highlighting the thorny challenge of cracking down on fake news. While some articles are obviously fake, like one about the Pope endorsing Trump, many others are misleading, exaggerated or distorted, but contain a kernel of truth. They require judgment calls, and it can be hard to tell where to draw the line, professional fact-checkers say.
“It is a very slippery slope,” said Eugene Kiely, the director of FactCheck.org, a nonprofit that aims to reduce the level of deception and confusion in U.S. politics. “There’s bad information out there that’s not necessarily fake. It’s never as clear-cut as you think.”
Facebook is taking steps to address its role in spreading fake news, such as enlisting the help of third-party fact-checkers, Chief Executive Officer Mark Zuckerberg said Friday in a post. The social network was widely criticized for allowing false stories to circulate in the run-up to the U.S. presidential election, potentially influencing its outcome. Zuckerberg underscored the delicate balance his company must strike, saying “we need to be careful not to discourage sharing of opinions or mistakenly restricting accurate content.”
“We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties,” he wrote.
Yet professional fact-checkers say Facebook must not punish articles that are partially accurate. They say their jobs, like the truth, can be complicated, which is why they grade stories on a scale. For example, Snopes.com rated the Ford Ohio story “mostly false,” and labels others “unproven” or a “mixture” of true and false. The fact-checking website PolitiFact uses labels like “true,” “half true,” or “pants on fire.” Facebook’s algorithm may not understand the various shades of falsehood.
“It’s easy to see how an algorithm-only solution to fake news could result in blocking stuff that’s not false or is misleading for reasons that are partisan but not inaccurate,” said Alexios Mantzarlis, who leads Poynter’s International Fact-Checking Network.
One article that circulated widely on Facebook before the election claimed Bill and Hillary Clinton had stolen White House furniture. The allegation actually dates back to when the Clintons left the White House. They returned many pieces of furniture and paid the government back for some gifts, according to PolitiFact, which concluded a version of the story contained “several inaccuracies” and was “mostly false,” but added “there is a grain of truth.”
“Did they steal furniture from the White House?” FactCheck.org’s Kiely said. “That’s a judgment call.”
Tweaking the Algorithm