Facebook Mounts Effort to Limit Tide of Fake News




Photo
The new fake news feature on Facebook, part of the site’s effort to flag articles that are not true.

For weeks, Facebook has been questioned about its role in spreading fake news. Now the company has mounted its most concerted effort to combat the problem.
Facebook said on Thursday that it had begun a series of experiments to limit misinformation on its site. The tests include making it easier for its 1.8 billion members to report fake news, and creating partnerships with outside fact-checking organizations to help it indicate when articles are false. The company is also changing some advertising practices to stop purveyors of fake news from profiting from it.
Facebook, the social network, is in a tricky position with these tests. It has long regarded itself as a neutral place where people can freely post, read and view content, and it has said it does not want to be an arbiter of truth. But as its reach and influence have grown, it has had to confront questions about its moral obligations and ethical standards regarding what appears on the network.
Its experiments on curtailing fake news show that Facebook recognizes it has a deepening responsibility for what is on its site. But Facebook also must tread cautiously in making changes, because it is wary of exposing itself to claims of censorship.

“We really value giving people a voice, but we also believe we need to take responsibility for the spread of fake news on our platform,” said Adam Mosseri, a Facebook vice president who is in charge of its news feed, the company’s method of distributing information to its global audience.
He said the changes — which, if successful, may be available to a wide audience — resulted from many months of internal discussion about how to handle false news articles shared on the network.
What impact Facebook’s moves will have on fake news is unclear. The issue is not confined to the social network, with a vast ecosystem of false news creators who thrive on online advertising and who can use other social media and search engines to propagate their work. Google, Twitter and message boards like 4chan and Reddit have all been criticized for being part of that chain.
Still, Facebook has taken the most heat over fake news. The company has been under that spotlight since Nov. 8, when Donald J. Trump was elected the 45th president. Mr. Trump’s unexpected victory almost immediately led people to focus on whether Facebook had influenced the electorate, especially given the rise of hyperpartisan sites on the network and many examples of misinformation, such as a false article claiming Pope Francis had endorsed Mr. Trump for president, which was shared nearly a million times across the site.


Photo
The site is trying to combat phony news, but says “the magnitude of fake news across Facebook is one fraction of a percent of the content across the network.”

Mark Zuckerberg, Facebook’s chief executive, has said he did not believe that the social network had influenced the election result, calling it “a pretty crazy idea.” Yet the intense scrutiny of the company on the issue has caused internal divisions and has pushed Mr. Zuckerberg to say he was trying to find ways to reduce the problem.
In an interview, Mr. Mosseri said Facebook did not think its news feed had directly caused people to vote for a particular candidate, given that “the magnitude of fake news across Facebook is one fraction of a percent of the content across the network.”
Facebook has changed the way its news feed works before. In August, the company announced changes to marginalize what it considered “clickbait,” the sensational headlines that rarely live up to their promise. This year, Facebook also gave priority to content shared by friends and family, a move that shook some publishers that rely on the social network for much of their traffic. The company is also constantly fine-tuning its algorithms to serve what its users most want to see, an effort to keep its audience returning regularly.
This time, Facebook is making it easier to flag content that may be fake. Users can already report a post they dislike in their feed, but when Facebook asks for a reason, the site presents a list of limited and vague options, including the cryptic “I don’t think it should be on Facebook.” In Facebook’s new experiment, users will have the option to flag a post as fake news and to message the friend who originally shared it, telling him or her that the article is false.
If an article receives enough flags as fake, it can be directed to a coalition of groups that will fact-check it. The groups include Snopes, PolitiFact, The Associated Press, FactCheck.org and ABC News. They will check the article and can mark it as a “disputed” piece, a designation that will be seen on Facebook.
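The article describes this flow only at a high level. As a minimal sketch of how a flag threshold and hand-off to the coalition might work, here is some hypothetical Python; the threshold value, the routing step and every name below are assumptions, since Facebook has not published the mechanics:

    from dataclasses import dataclass

    # All names and numbers here are hypothetical: the article does not
    # say how many flags trigger a review or how the hand-off works.
    FLAG_THRESHOLD = 100
    FACT_CHECKERS = ["Snopes", "PolitiFact", "The Associated Press",
                     "FactCheck.org", "ABC News"]

    @dataclass
    class Article:
        url: str
        flags: int = 0
        disputed: bool = False

    def report_as_fake(article: Article) -> None:
        """Record one user flag; route for review once enough accumulate."""
        article.flags += 1
        if article.flags >= FLAG_THRESHOLD and not article.disputed:
            route_to_fact_checkers(article)

    def route_to_fact_checkers(article: Article) -> None:
        """Hand the article off to the fact-checking coalition."""
        print(f"Routing {article.url} for review by: {', '.join(FACT_CHECKERS)}")

    def mark_disputed(article: Article) -> None:
        """Called if the coalition judges the article false."""
        article.disputed = True  # shown as "disputed" and ranked lower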
Partner organizations will not be paid, the companies said. Some characterized the fact-checking as an extension of their journalistic efforts.
“We actually regard this as a big part of our core mission,” James Goldston, the president of ABC News, said in an interview. “If that core mission isn’t helping people tell the real from the fake news, I don’t know what our mission is.”
Disputed articles will ultimately appear lower in the news feed. If users still decide to share such an article, they will receive a pop-up reminding them that the accuracy of the piece is in question.
Facebook said it was casting a wide net to add more partners to its fact-checking coalition and may move outside of the United States with the initiative if early experiments go well. The company is also part of the First Draft Coalition, an effort with other technology and media companies including Twitter, Google, The New York Times and CNN, to combat the spread of fake news online.

Photo
An article that has been flagged as “disputed.”

In another change in how the news feed works, articles that many users read but do not share will be ranked lower on people’s feeds. Mr. Mosseri said a low ratio of sharing an article after it has been read could be perceived as a negative signal, one that might reflect that the article was misleading or of poor quality.
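As a rough illustration of that signal, here is a hypothetical scoring function that demotes articles with a low share-to-read ratio; the ratio cutoff, penalty and minimum-read count are all invented for the sketch, not values Facebook has disclosed:

    def share_to_read_ratio(reads: int, shares: int) -> float:
        """Fraction of readers who went on to share the article."""
        return shares / reads if reads else 0.0

    def adjusted_score(base_score: float, reads: int, shares: int,
                       min_reads: int = 1000, low_ratio: float = 0.01,
                       penalty: float = 0.5) -> float:
        """Demote articles that many people read but few choose to share."""
        if reads >= min_reads and share_to_read_ratio(reads, shares) < low_ratio:
            return base_score * penalty  # pushed lower in the news feed
        return base_score

    # Example: widely read but almost never shared, so the score is halved.
    print(adjusted_score(10.0, reads=50_000, shares=120))  # -> 5.0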
“Facebook was inevitably going to have to curate the platform much more carefully, and this seems like a reasonably transparent method of intervention,” said Emily Bell, director at the Tow Center for Digital Journalism at Columbia University.
“But the fake cat is already out of the imaginary bag,” Ms. Bell added. “If they didn’t try and do something about it, next time around it could have far worse consequences.”
Facebook also plans to impede the economics of spreading fake articles across the network. Fake news purveyors generally make money when people click on the false articles and are directed to third-party websites, the majority of which are filled with dozens of low-cost ads.
Facebook will review those third-party links, checking whether a page is filled mostly with advertising content (a dead giveaway for a spam site) and whether a link masquerades as a different site, such as a fake version of The New York Times. Sites that fail those checks would not be eligible to display Facebook advertising on their pages.
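Purely as a sketch of what such a review might look for, here are hypothetical Python heuristics; the ad-density threshold and the lookalike patterns are invented, as the article gives no detail on Facebook’s actual checks:

    from urllib.parse import urlparse

    # Hypothetical heuristics: the threshold and lookalike hints below
    # are illustrative assumptions, not Facebook's published rules.
    KNOWN_DOMAINS = {"nytimes.com"}
    LOOKALIKE_HINTS = ("nytimes", "nytirnes")

    def is_mostly_ads(ad_blocks: int, content_blocks: int) -> bool:
        """Flag pages whose layout is dominated by ad units."""
        total = ad_blocks + content_blocks
        return total > 0 and ad_blocks / total > 0.7

    def masquerades_as_known_site(url: str) -> bool:
        """Flag hosts that imitate a real publisher without being it."""
        host = urlparse(url).netloc.lower().removeprefix("www.")
        if any(host == d or host.endswith("." + d) for d in KNOWN_DOMAINS):
            return False  # the genuine site, or one of its subdomains
        return any(hint in host for hint in LOOKALIKE_HINTS)

    def eligible_for_facebook_ads(url: str, ad_blocks: int,
                                  content_blocks: int) -> bool:
        return not (is_mostly_ads(ad_blocks, content_blocks)
                    or masquerades_as_known_site(url))

    # Example: a lookalike domain stuffed with ads fails both checks.
    print(eligible_for_facebook_ads("http://nytimes-breaking.news",
                                    ad_blocks=9, content_blocks=1))  # False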
Articles disputed by the fact-checking coalition will also not be eligible to be inserted into Facebook ads, a tactic viral spammers have used to spread fake news quickly and gain more clicks on their websites.
Facebook said that in these early experiments it would deal only with clearly fake news content; it does not plan to flag opinion posts or other content that cannot be easily classified. The changes will not affect satirical sites like The Onion, which often jabs at political subjects through tongue-in-cheek humor.

Facebook must take something else into consideration: its profit. Any action taken to reduce popular content, even if it is fake news, could hurt the company’s priority of keeping its users engaged on the platform. People spend an average of more than 50 minutes a day on Facebook, and the company wants that number to grow.
Executives at Facebook stressed that right now the overriding factor is not engagement alone.
“I think of Facebook as a technology company, but I recognize we have a greater responsibility than just building technology that information flows through,” Mr. Zuckerberg wrote in a post on Thursday. “We have a responsibility to make sure Facebook has the greatest positive impact on the world.”
Source: The New York Times
