Facebook’s fake news problem persists, CEO Mark Zuckerberg acknowledged last night.
He’d been dismissive about the reach of misinformation on Facebook, saying that false news accounted for less than one percent of all the posts on the social media network. But a slew of media reports this week has demonstrated that, although fake posts may not make up the bulk of the content on Facebook, they spread like wildfire, and that Facebook has a responsibility to address the problem.
“We’ve made significant progress, but there is more work to be done,” Zuckerberg wrote, outlining several ways to address what he called a technically and philosophically complicated problem. He proposed stronger machine learning to detect misinformation, easier user reporting, and content warnings for fake stories, while noting that Facebook has already taken action to eliminate fake news sites from its ad program.
The firestorm over misinformation on Facebook began with a particularly outrageous headline: “FBI Agent Suspected in Hillary Email Leaks Found Dead.”
The false story led to accusations that Facebook had tipped the election in Donald Trump’s favor by turning a blind eye to the flood of fake stories trending on its platform. The story, which ran just days before the election on a site for a made-up publication called the Denver Guardian, suggested that Clinton plotted the murders of an imaginary agent and his imaginary wife, then tried to cover them up as an act of domestic violence. It was shared more than 568,000 times.
The Denver Guardian story caused a crisis at Facebook, and it hasn’t gone away. Last night, the story appeared yet again in a friend’s newsfeed. “BREAKING,” the post blared. “FBI AGENT & HIS WIFE FOUND DEAD After Being ACCUSED of LEAKING HILLARY’s EMAILS.” This time, the story was hosted by a site called Viral Liberty. Beneath the headline is a button encouraging Facebook users to share the story, and according to Facebook’s own data, it’s been shared 127,680 times.
Facebook isn’t alone. Google and Twitter grapple with similar problems and have mistakenly allowed fake stories to rise to prominence as well. And although stories about the rise of fake news online have focused primarily on pro-Trump propaganda, the sharing-without-reading epidemic exists in liberal circles too — several of my Facebook friends recently shared an article by the New Yorker’s satirist Andy Borowitz titled “Trump Confirms That He Just Googled Obamacare” as if it were fact, celebrating in their posts that Trump might not dismantle the Affordable Care Act despite his campaign promises to the contrary.
But, as the hub where 44 percent of Americans get their news, Facebook bears a unique responsibility to address the problem. According to former Facebook employees and contractors, the company struggles with fake news because its culture prioritizes engineering over everything else and because it failed to build its news apparatus to recognize and prioritize reliable sources.
Facebook’s media troubles began this spring, when a contractor on its Trending Topics team told Gizmodo that the site was biased against conservative media outlets. To escape allegations of bias, Facebook fired the team of journalists who vetted and wrote Trending Topics blurbs and turned the feature over to an algorithm, which quickly began promoting fake stories from sites designed to churn out incendiary election stories and convert them into quick cash.
It’s not a surprise that Trending Topics went so wrong, so quickly — according to Adam Schrader, a former writer for Trending Topics, the tool pulled its hashtagged titles from Wikipedia, a source with its own struggles with the truth.
“The topics would pop up into the review tool by name, with no description. It was generated from a Wikipedia topic ID, essentially. If a Wikipedia topic was frequently discussed in the news or Facebook, it would pop up into the review tool,” Schrader explained.
From there, he and the other Trending Topics writers would scan through news stories and Facebook posts to determine why the topic was trending. Part of the job was to determine whether the story was true — in Facebook’s jargon, to determine whether a “real world event” had occurred. If the story was real, the writer would then draft a short description and choose an article to feature. If the topic didn’t have a Wikipedia page yet, the writers had the ability to override the tool and write their own title for the post.
Human intervention was necessary at several steps of the process — and it’s easy to see how Trending Topics broke down when humans were removed from the system. Without a journalist to determine whether a “real world event” had occurred and to choose a reputable news story to feature in the Topic, Facebook’s algorithm is barely more than a Wikipedia-scraping bot, susceptible to exploitation by fake news sites.
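To make that breakdown concrete, here is a minimal sketch in Python of the flow Schrader describes. Every name, threshold, and data shape in it is a hypothetical stand-in; the reporting only establishes that topics surfaced from Wikipedia topic IDs by mention frequency and that a human reviewer gated publication. Read it as an illustration, not Facebook’s code.

```python
from collections import Counter

TREND_THRESHOLD = 1000  # hypothetical mention count; Facebook's real trending signal is not public


def surface_trending(posts, news_articles):
    """Surface Wikipedia topic IDs that are frequently discussed.

    `posts` and `news_articles` are assumed to be lists of dicts, each
    tagged with the Wikipedia topic IDs it mentions -- a stand-in for
    whatever entity linking Facebook actually ran.
    """
    mentions = Counter()
    for item in posts + news_articles:
        for topic_id in item["wikipedia_topic_ids"]:
            mentions[topic_id] += 1
    # Topics above the threshold "pop up into the review tool" by name
    # only, with no description, as Schrader describes.
    return [topic for topic, count in mentions.items() if count >= TREND_THRESHOLD]


def publish_with_review(topic_id, candidate_articles, reviewer):
    """The pre-layoff flow: a human gates publication."""
    if not reviewer.real_world_event_occurred(topic_id, candidate_articles):
        return None  # no "real world event" means the story never trends
    article = reviewer.pick_reputable_article(candidate_articles)
    return {"topic": topic_id, "blurb": reviewer.write_blurb(article), "link": article}


def publish_without_review(topic_id, candidate_articles):
    """The post-layoff flow: no truth check, so the most-shared link wins."""
    article = max(candidate_articles, key=lambda a: a["share_count"])
    return {"topic": topic_id, "blurb": topic_id, "link": article}
```

The difference between the two publish paths is the whole story: with the reviewer removed, whatever is shared most gets promoted, whether or not the event ever happened.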
But the idea of using editorial judgment made Facebook executives uncomfortable, and ultimately Schrader and his co-workers lost their jobs.
“[Facebook] and Google and everyone else have been hiding behind mathematics. They’re allergic to becoming a media company. They don’t want to deal with it,” former Facebook product manager and author of Chaos Monkeys Antonio Garcia-Martinez told TechCrunch. “An engineering-first culture is completely antithetical to a media company.”
Of course, Facebook doesn’t want to be a media company. Facebook would say it’s a technology company, with no editorial voice. Now that the Trending editors are gone, the only content Facebook produces is code.
But Facebook is a media company, Garcia-Martinez and Schrader argue.
“Facebook, whether it says it is or it isn’t, is a media company. They have an obligation to provide legit information,” Schrader told me. “They should take actions that make their product cleaner and better for people who use Facebook as a news consumption tool.”
Garcia-Martinez agreed. “The New York Times has a front page editor, who arranges the front page. That’s what New York Times readers read every day — what the front page editor chooses for them. Now Mark Zuckerberg is the front page editor of every newspaper in the world. He has the job but he doesn’t want it,” he said.
Zuckerberg is resistant to this role, writing last night that he preferred to leave complex decisions about the accuracy of Facebook content in the hands of his users. “We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties,” he wrote. “We have relied on our community to help us understand what is fake and what is not. Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others — like people sharing links to myth-busting sites such as Snopes — to understand which stories we can confidently classify as misinformation.”
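As a rough illustration of the kind of signal combination Zuckerberg describes, here is a hypothetical sketch. Snopes is the only myth-busting site his post names; the domain check, the weights, and the threshold below are all invented for illustration and say nothing about Facebook’s actual classifier.

```python
from urllib.parse import urlparse

# Snopes is the only myth-busting site named in Zuckerberg's post;
# modeling this as a set that could hold others is an assumption.
MYTH_BUSTING_DOMAINS = {"snopes.com"}


def is_debunk_link(url: str) -> bool:
    """Return True if a shared URL points at a known myth-busting site."""
    host = urlparse(url).netloc.removeprefix("www.")
    return host in MYTH_BUSTING_DOMAINS


def misinformation_score(false_reports: int, shares: int, debunk_links: int) -> float:
    """Blend the two crowd signals Zuckerberg describes into one score.

    The 0.6/0.4 weighting is arbitrary -- an illustration of combining
    signals, not Facebook's model.
    """
    if shares == 0:
        return 0.0
    return 0.6 * (false_reports / shares) + 0.4 * (debunk_links / shares)


def confidently_misinformation(false_reports: int, shares: int,
                               debunk_links: int, threshold: float = 0.05) -> bool:
    """Flag a link once the combined signal clears an (invented) threshold."""
    return misinformation_score(false_reports, shares, debunk_links) >= threshold
```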
However, Facebook’s reliance on crowd-sourced truth from its users and from sites like Wikipedia will only take the company halfway to the truth. Zuckerberg also acknowledges that Facebook can and should do more.