Opinion: Protecting us from 'fake news' is the digital version of a safe space

Most people think the problem with fake news is how one defines "fake." Is inaccurate reporting from reputable news sources fake? Barely hidden bias? Straight-up lies for clicks?

What constitutes fake?

But the real problem — because it gets to the heart of the transactional costs of fake news — is not how "fake" is defined, but how "news" is defined. When people plunk their money or time or whatever other currency down in exchange for news, are they getting what they're buying? Or are they getting marketing material disguised as news?

There's an interesting (hopefully not fake) story circulating right now claiming that a large portion of the traffic to fake news sites is generated by Facebook. Jumpshot, a data analytics firm, has reportedly crunched the numbers, and this is what it found:

Overall, we found that Facebook referrals accounted for 50 percent of total traffic to fake news sites and 20 percent of total traffic to reputable news sites.

The domain with the highest portion of Facebook referrals is Occupy Democrats, which has more than 4.6 million likes on Facebook and 79 percent of its total traffic coming through Facebook. American News closely follows in second with 78 percent of its total traffic coming through Facebook. The fake news site has about 5.5 million likes on Facebook, and recently made headlines when a story about Megyn Kelly made its way to Facebook’s Trending section several weeks after it was exposed as false. In comparison, 29 percent of The Huffington Post’s traffic comes from Facebook, while 20 percent of The New York Times’ traffic and 11 percent of CNN’s traffic comes from Facebook as well.

All of this comes at a time when Facebook and Google, according to Yahoo News, are vowing "to do a better job of curating the content that populates their sites. Which is all very comforting, if you really want software engineers assuming the role of civic arbiter that has traditionally fallen to journalists."

The problem with this newfound civic consciousness is that Facebook and Google, which both started out as profit-making tech companies, have morphed into something else entirely. That transformation makes their pledge to start deciding what news is fit to read a pretty serious conflict of interest.

In short: they're no longer just tech companies. They've become hybrid tech and media companies, in the business of marketing and selling content to consumers. So can they be trusted to give us unvarnished, unbiased, "just the facts" news?

Forbes thinks not:

Advertising revenue is like heroin. Once you try it for a while, the highs become addictive.

Zuckerberg became addicted to advertising revenue, but because he was a new player in the game, he thought he could deny being in the media business and being addicted to ad revenue. He talked a good game, a game that would inspire the troops at Facebook, appease its users and distract investors. He said Facebook’s mission was “to give people the power to share and make the world more open and connected,” which is a lot more noble than admitting its mission is to maximize advertising revenue and profit. Wealth is even more addictive than heroin.

Once companies that profit from how widely content is shared begin telling us what to look at and read, and stop trusting that we can make our own decisions about what's real and what's not, we may be in for some mind-numbing head games. Twitter, for example, is starting to suggest banning Donald Trump's Twitter feed because he shares "fake" news. Well, how sure are we that he does? And isn't it his right to do that (if he does)? And isn't it our responsibility to hold him accountable? How do we do that if we're protected from seeing what he shares, or even knowing about it?

It's true that it's easy to share some of the more incendiary, funny, absurd stuff that floats around. It piques our humanity, for good or bad, when something is over the top. But, as an anecdote, here's something that happened to me recently that I think illustrates why we have to let people learn from their mistakes.

A gentleman wrote an entirely fictitious anonymous op-ed, and the Guardian, a reputable if left-leaning news outlet, ran it. I shared it on Twitter, and I was quickly corrected and giggled at by someone who knew the truth.

Brilliant joke. And I, as a result, will be ever more vigilant in the future not to make the same mistake and look like a fool. And that's the only way fake news stops — when people get tired of looking like fools. Google, Facebook and Twitter shouldn't try to protect us from that. It's the digital version of a safe space.
