By any reasonable standard, Facebook counts among the most powerful companies in the world. According to its own statistics, 1.23 billion people log in a day—that’s something like 16 percent of the global population—and those visitors earn the company tens of billions of dollars a year. Why, then, does Facebook sometimes seem so helpless?
If you’ve checked your own profile in the last week or so, you might have noticed an unusual invitation at the top of your feed. Spare enough that it’s easy to overlook—especially in contrast to the flashier and more personalized announcements that often adorn the site—the text, sometimes positioned beneath an animation of a magnifying glass scanning a newspaper, reads, “How to spot false news.” Below that, a brief message invites users to click through to “check out a few ways to identify whether a story is genuine.”
Accept that invitation and you’ll be redirected to the company’s cumbersome help site, and presented with a document that features 10 brief tips, seemingly designed to encourage critical reading and media literacy. Those tips, reportedly created in collaboration with the nonprofit First Draft, are both simple and reasonable. They variously invite users to “Be skeptical of headlines,” “Look closely at the URL,” “Watch for unusual formatting,” “Look at other reports,” and so on. While the presentation feels a little perfunctory (why not embed the tips in the news feed itself?), these are the sort of things that savvy media consumers do as a matter of course, and encouraging others to take up the practice can only help.
Somehow, though, Facebook still gives the impression of a hiker flapping his arms before a bear, struggling to scare off a monster many times his size. As TechCrunch and others have noted, the strangeness starts with the company’s insistence that we’re not dealing with “fake news” but with “false news.” Its reasons for playing terminological hopscotch are clear enough: Embraced by Donald Trump and others, “fake news” has become a flag of partisan discontent, not an indication of truth or falsehood. By changing its own vocabulary accordingly, however, Facebook is tacitly acknowledging that it is subject to forces beyond its control.
(One indication that it’s a losing fight? Try searching for the tips page and Google will ask you whether you were actually trying to find “Tips to spot fake news” instead. What’s more, virtually every article about the tips still uses the term “fake” in place of Facebook’s chosen alternative.)
This might be less significant were Facebook not so self-aggrandizing in its presentation of these tips. An official blog post from Adam Mosseri, Facebook’s news feed VP, repeatedly uses the phrase “educational tool” to describe the list of tips, which is a peculiar way to characterize what’s essentially a classroom handout from a first-year journalism seminar. Clearly “tool” is just self-congratulatory corporate speak here, but it’s still telling, since it suggests that the company has put real effort into these guidelines—and that it expects them to accomplish something.
In Facebook’s defense, the company suggests that its tip sheet is little more than a stopgap as it works to develop other systems to limit the spread of misinformation. In another blog post, Mosseri lays out some of the company’s other plans, including its efforts to better detect fraudulent accounts and otherwise limit the economic incentives of spammers. Ultimately, though, it’s all but admitting that it relies most on its users, counting on them to report misinformation when they see it and to otherwise resist its spread.
Facebook’s new tips page, in other words, isn’t so much a “tool” as it is a cry for help, a desperate attempt to leverage the source of its power in a war that it’s currently losing.