Research in Progress: The priming of ‘fake news’ for automated media

October 15, 2018

By Christopher M. Cox, Christopher Newport University

When considering ways to identify and discourage the spread of fake news online, the conversation inevitably turns toward the role of social media algorithms, bots and other automated technologies that obscure the distinctions between verifiable journalism and questionable information.

As we see with the ongoing dilemma of fake news on Facebook, one of the central quandaries is how (or whether) Facebook can tweak its algorithms to ensure users receive reliable, journalistically sourced content.

This situation is indicative of a common approach to addressing fake news, as policymakers, journalists, scholars and proprietors of social media platforms focus on what constitutes fake news and what can be done to minimize its spread. In this vein, research on “algorithmic gatekeeping” and “filter bubbles” provides valuable insights into what enables fake news to gain traction through automated media, particularly social media platforms.

While this research is invaluable to understanding the dynamics at play once fake news enters social media ecosystems, my research examines a critical yet underexplored component of fake news.

Rather than focus on the algorithmic proliferation of fake news on social media, my project, “Internet governance and the priming of ‘fake news’ for automated media,” examines the technological and economic factors that influence the creation of fake news prior to automated dispersion. I’m particularly interested in the technological and economic values informing the creation of fake news and the ways these values persist as fake news content is pulled into automated dispersion.

To better understand these complications, my study will:

  1. Use, as a case study, a website described by its creator as explicitly intended to create “fake” news to generate advertising revenue through Google AdSense
  2. Analyze the Terms and Conditions of this website’s domain registry service, web-hosting service, and Google AdSense
  3. Assess the extent to which these services conceive of themselves as governing bodies, and the economic factors motivating this disposition
  4. Examine relationships between how these services designate technological standards (such as assigning URLs) and the ways such designations ultimately influence the automated proliferation and consumption of fake news content

Through this approach, I seek to identify the online value chain for fake news and stress the importance of online governance beyond social media automation. A particularly critical aim is to understand how this value chain primes certain types of content for automated circulation.

Ultimately, by adding insights into the nature of governance undertaken by online intermediaries (web-hosting and domain registry platforms, Google AdSense) critical to the creation and accessibility of all online content (fake news or otherwise), my research aims to help policymakers, journalists, and media companies more precisely identify, delineate, and typify fake news.

For further information on this study, email Christopher M. Cox at . Results from the study will be available when the project is completed later this year. This project is supported by a Page/Johnson Legacy Scholar Grant from the Arthur W. Page Center.