
Internet Society Meetup Explores Fake News

I attended the Internet Society’s “Content Rules?!” session the other week. The panel drilled down on what we now call The Fake News problem (I couch it like this because, as you’ll see, it’s not a new one), defining it and exploring causes and solutions.

There’s been a lot already written about fake news. It’s turned into a real meme and hot button, but there’s been lots of noise and confusion. That’s not surprising because it is a complex topic, one that only recently hit our radars in the wake of the election.

Giving it a name gave it legs, a thing to blame (in some cases just because someone doesn’t like an article), and evoked lots of teeth gnashing. The session gave me the opportunity to hear from some very smart people from different sides, better understand the issues and crystallize my thoughts about how we might address the problem.

Not a New Problem

Journalist and American University Professor Chuck Lewis started by explaining that fake news has been around for years in various forms, e.g. government disinformation and propaganda. Toni Muzi Falcone’s DigiDig wrap (in Italian) also discussed this.

“The irony of us feeling victimized by fake news is pretty thick,” he said. “We’ve gone from truth to truthiness to a post-truth society, and now it’s fake news,” Chuck added, “but it’s been going on for centuries.”

He blamed the growing number of people “spinning information” rather than reporting it, and the ratio of PR people to journalists (which has grown to 5:1), calling it a crisis for journalism. The big questions are: who decides what is true, and how do you set standards for 200+ countries? We’ve traditionally relied on the press to be content mediation experts.

“We are at a critical, disturbing crossroad,” Lewis said, as “No one wants the government to be the mediators.”

A Systemic Problem

Compounding the problem are the changing ways we get info, and the growing influence of social networks. Gilad Lotan, head of data science at Buzzfeed, discussed this.

He’s studied political polarization in Israel. Gilad showed some fancy social graphs that tracked the spreading of stories in the wake of the IDF’s bombing of a Palestinian school. Two different story lines emerged. Neither was “fake,” Gilad explained; “They just chose to leave certain pieces of info out in order to drive home points of a narrative.”

Gilad further discussed how your network position defines the stories you see; this leads to polarization and homophily (a fancy way of saying echo chamber). He also explained the role of algorithmic ranking systems. “You’re much more likely to see content which aligns with your viewpoints,” he said. This spawns “personalized propaganda spaces.”

It gives bad actors a way to game the system. Gilad illustrated this via what had been the elephant in the room – the 2016 US presidential election. He shared images that showed phantom groups manipulating the spread of information.

“The awareness of how algorithms work gave them a huge advantage. To this day, if you search for ‘Hillary’s Health’ on YouTube or Google, you see conspiracy theories at the top.”

Moderator Aram Sinnreich, associate professor at American University added: “My impression as a media scholar and critic… is that there’s been a lot of finger-pointing… everyone feels that there’s been a hollowing out of the Democratic process… undermining of the traditional role that the media has played as the gatekeeper of the shared narrative and shared truths; people want to hold the platforms accountable.”

Flavors of Fake News

Andrew Bridges, a lawyer who represents tech platforms, said that it is important to define the problem before considering solutions. The knee-jerk reaction has been to try to turn social networks into enforcement agencies, but that would be a mistake, according to Bridges. That’s because there are seven different things being called fake news, each of which could have different solutions (I list them with the examples he cited):

  1. Research and reporting with a pretense of being objective (e.g., major newspapers)
  2. Research and reporting in service of a cause (National Review, Nation, New Republic)
  3. Pretend journalism – claims to be a news source but is actually a curator (Daily Kos)
  4. Lies – the ones that Politifact and others give Pinocchio noses or “pants on fire” awards
  5. Propaganda – the systematic pattern of lying for political gain
  6. Make-believe news, like Macedonian sites. They make up news from whole cloth.
  7. Counterfeit sites – they make you think you are at ABC News.com, for example

Then, he dramatically challenged the panel and audience to label certain big-ticket topics as fake news or not: evolution, global warming, the importance of low-fat diets, the importance of low-carb diets.

Bridges said that there’s not necessarily a quick fix or tech solution to the problem. “These things have been out there in society, in front of our eyes for years.” He likened the problem to gerrymandering, gated communities and questions about Hillary’s health.

Some have proposed algorithmic transparency (not surprisingly, Bridges thinks it is an awful idea; “Opening them up just makes it easier to game the system”).

What could work, according to the lawyer? “I think we should look to algorithmic opacity, and brand values of the organizations applying the algorithm.” What about content moderation? He said “Do we turn it over to a third party, like a Politifact? Who moderates the moderator? We know what moderation is – it’s censorship.”

In Bridges’ view, education is important. We should teach the importance of research and fact checking, and keep each other honest: “Friends don’t let friends spread fake news.”

Other Challenges

Jessa Lingel, Ph.D. and assistant professor at the Annenberg School for Communication, seemed to be the youngest on the panel and spoke up for millennials:

“You can’t promise a generation of Internet-loving people a sense of control and agency over content and not expect this breakdown in trust.” She talked about the growth of citizen-driven journalism and the shift from content generation to interpretation. Jessa bemoaned the digital natives’ loss of innocence:

“We were promised a Democratic web and got a populist one; a web that connects us to different people, instead we got silos. Geography wasn’t supposed to matter… anyone with an Internet connection is the same… instead, geography matters a lot.”

Jessa said that algorithmic transparency is important, but not enough on its own. “Opacity? I do want to pay attention to the man behind the curtain. We need more than that tiny little button that explains ‘why did I get this ad?’”

Up Next: More on Solutions

As you have hopefully seen from this post, it’s a complex topic, and there are many opinions on the situation.

What do you think? In my next post, I’ll share my thoughts on fake news problems and solutions.
