Concern About Fake Election News

[Eric M. Appleman, Nov. 20, 2016]  In July 2016 WTOE 5 News reported, “Pope Francis shocks world, endorses Donald Trump for President, releases statement.”  The story was false, and WTOE 5 News was a fake news site.  This was one of many fake news stories that made their way into election discourse in the 2016 campaign, and the majority of them favored Trump.  Fake news is nothing new--one need only think of the National Enquirer--but with social media such stories can easily spread to a wide audience.

A post-election analysis by BuzzFeed News found that "Fake Election News Stories Outperformed Real News On Facebook."  The analysis found that user engagement with fake election news on Facebook increased markedly in the final three months of the campaign and that the overwhelming majority of those stories were either pro-Trump or anti-Clinton.

See: Craig Silverman.  "This Analysis Shows How Fake Election News Stories Outperformed Real News On Facebook."  BuzzFeed News, Nov. 16, 2016.


Posts by Facebook founder Mark Zuckerberg
November 12, 2016 at 10:15 pm

I want to share some thoughts on Facebook and the election.

Our goal is to give every person a voice. We believe deeply in people. Assuming that people understand what is important in their lives and that they can express those views has driven not only our community, but democracy overall. Sometimes when people use their voice though, they say things that seem wrong and they support people you disagree with.

After the election, many people are asking whether fake news contributed to the result, and what our responsibility is to prevent fake news from spreading. These are very important questions and I care deeply about getting them right. I want to do my best to explain what we know here.

Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.

That said, we don't want any hoaxes on Facebook. Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here. We have made progress, and we will continue to work on this to improve further.

This is an area where I believe we must proceed very carefully though. Identifying the "truth" is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.

As we continue our research, we are committed to always updating you on how News Feed evolves. We hope to have more to share soon, although this work often takes longer than we'd like in order to confirm changes we make won't introduce unintended side effects or bias into the system. If you're interested in following our updates, I encourage you to follow our News Feed FYI here: http://bit.ly/2frNWo2.

Overall, I am proud of our role giving people a voice in this election. We helped more than 2 million people register to vote, and based on our estimates we got a similar number of people to vote who might have stayed home otherwise. We helped millions of people connect with candidates so they could hear from them directly and be better informed. Most importantly, we gave tens of millions of people tools to share billions of posts and reactions about this election. A lot of that dialog may not have happened without Facebook.

This has been a historic election and it has been very painful for many people. Still, I think it's important to try to understand the perspective of people on the other side. In my experience, people are good, and even if you may not feel that way today, believing in people leads to better results over the long term.

______________________________________

Mark Zuckerberg
November 19, 2016 at 12:15 am

A lot of you have asked what we're doing about misinformation, so I wanted to give an update.

The bottom line is: we take misinformation seriously. Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information. We've been working on this problem for a long time and we take this responsibility seriously. We've made significant progress, but there is more work to be done.

Historically, we have relied on our community to help us understand what is fake and what is not. Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others -- like people sharing links to myth-busting sites such as Snopes -- to understand which stories we can confidently classify as misinformation. Similar to clickbait, spam and scams, we penalize this content in News Feed so it's much less likely to spread.

The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.

While the percentage of misinformation is relatively small, we have much more work ahead on our roadmap. Normally we wouldn't share specifics about our work in progress, but given the importance of these issues and the amount of interest in this topic, I want to outline some of the projects we already have underway:

- Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.

- Easy reporting. Making it much easier for people to report stories as fake will help us catch more misinformation faster.

- Third party verification. There are many respected fact checking organizations and, while we have reached out to some, we plan to learn from many more.

- Warnings. We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.

- Related articles quality. We are raising the bar for stories that appear in related articles under links in News Feed.

- Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We're looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.

- Listening. We will continue to work with journalists and others in the news industry to get their input, in particular, to better understand their fact checking systems and learn from them.

Some of these ideas will work well, and some will not. But I want you to know that we have always taken this seriously, we understand how important the issue is for our community and we are committed to getting this right.

______________________________________
Facebook
By Adam Mosseri, VP of News Feed
December 15, 2016

News Feed FYI: Addressing Hoaxes and Fake News

A few weeks ago we previewed some of the things we’re working on to address the issue of fake news and hoaxes. We’re committed to doing our part and today we’d like to share some updates we’re testing and starting to roll out.

We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully. We’ve focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organizations.

The work falls into the following four areas. These are just some of the first steps we’re taking to improve the experience for people on Facebook. We’ll learn from these tests, and iterate and extend them over time.

Easier Reporting
We’re testing several ways to make it easier to report a hoax if you see one on Facebook, which you can do by clicking the upper right hand corner of a post. We’ve relied heavily on our community for help on this issue, and this can help us detect more fake news.


Flagging Stories as Disputed

We believe providing more context can help people decide for themselves what to trust and what to share. We’ve started a program to work with third-party fact checking organizations that are signatories of Poynter’s International Fact Checking Code of Principles. We’ll use the reports from our community, along with other signals, to send stories to these organizations. If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why. Stories that have been disputed may also appear lower in News Feed.

It will still be possible to share these stories, but you will see a warning that the story has been disputed as you share.

Once a story is flagged, it can’t be made into an ad and promoted, either.

Informed Sharing

We’re always looking to improve News Feed by listening to what the community is telling us. We’ve found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way. We’re going to test incorporating this signal into ranking, specifically for articles that are outliers, where people who read the article are significantly less likely to share it.

Disrupting Financial Incentives for Spammers
We’ve found that a lot of fake news is financially motivated. Spammers make money by masquerading as well-known news organizations, and posting hoaxes that get people to visit their sites, which are often mostly ads. So we’re doing several things to reduce the financial incentives. On the buying side we’ve eliminated the ability to spoof domains, which will reduce the prevalence of sites that pretend to be real publications. On the publisher side, we are analyzing publisher sites to detect where policy enforcement actions might be necessary.

It’s important to us that the stories you see on Facebook are authentic and meaningful. We’re excited about this progress, but we know there’s more to be done. We’re going to keep working on this problem for as long as it takes to get it right.