Algorithmic Transparency in the News Media
Assistant Professor, Philip Merrill College of Journalism
University of Maryland
News organizations are increasingly employing algorithms in the production of news, for everything from gathering, organizing, and making sense of information to creating and disseminating stories. Beats like crime reporting and company earnings are being encroached upon by automated writing software and news bots. Simulations and data-driven predictions are finding their way into news reports on topics such as politics, finance, and public health. Such technology enables scale and an ostensibly objective, factual approach to editorial decision making based on the quantification and modeling of audiences and of the world itself. But such systems routinely embed human values and exhibit biases transmitted from their human designers or emerging from data-driven machine learning. These biases are consequential: they can systematically alter our perceptions and attention, with implications for the formation of publics and the fair and uniform provision of information. While the facts are still sacred, the reality we perceive in the news media is heavily mediated by these algorithms. In this talk I will problematize the use of algorithms in the media, but also proffer a path forward premised on developing standards for algorithmic transparency—the disclosure of information about algorithms to enable monitoring, checking, criticism, or intervention by interested parties. I’ll argue that transparency is “the new objectivity” as the news media shifts to adopt new ethical guidelines that institutionalize ways to maintain trust and legitimacy with the public it serves.