
Can We Make Algorithms More Accountable?

Editor’s Note: This story is based on the University of Maryland’s Nicholas Diakopoulos’ keynote address at Dataharvest, the European Investigative Journalism Conference, held in May in Mechelen, Belgium, and sponsored by GIJN member Journalismfund.eu.


Most Americans these days get their main news from Google or Facebook, two tools that rely heavily on algorithms. A 2015 study showed that the way a search engine like Google selects and prioritizes search results on political candidates can influence voters’ preferences.

Similarly, research has shown that tweaking the algorithms behind Facebook’s News Feed can influence voter turnout in American elections. If Mark Zuckerberg were ever to run for president, he would theoretically have an enormously powerful tool at his disposal. (Note: A recent article in The Guardian investigated the misuse of big data and social media in the context of the Brexit referendum.)

Algorithms are everywhere in our everyday life and are exerting a lot of power in our society. They prioritize, classify, connect and filter information, automatically making decisions on our behalf all the time. But as long as the algorithms remain a “black box,” we don’t know exactly how these decisions are made.

Are these algorithms always fair? Examples of possible racial bias in algorithms include the risk analysis scores calculated for prisoners who are up for parole or release (white people appear to get favorable scores more often) and the service quality of Uber in Washington, DC (waiting times are shorter in predominantly white neighborhoods). There is also the report that looked at Why Google Search Results Favor Democrats. Such unfair results may not be due to the algorithms alone, but the lack of transparency remains a concern.
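To make the question concrete, here is a minimal sketch of how a reporter might quantify a disparity like the parole-score example above, in the spirit of ProPublica’s analysis: compare the algorithm’s error rates across groups. The file and column names here are hypothetical, not from any specific dataset.

```python
import pandas as pd

# Hypothetical audit data: one row per person, with a group label,
# the algorithm's decision (0/1) and the real-world outcome (0/1).
df = pd.read_csv("risk_score_audit.csv")  # columns: group, flagged_high_risk, reoffended

# Among people who did NOT reoffend, how often was each group still
# flagged as high risk? Unequal rates here point to unequal error rates.
false_positive_rate = (
    df[df["reoffended"] == 0]
    .groupby("group")["flagged_high_risk"]
    .mean()
)
print(false_positive_rate)
```

A large gap between groups in this kind of table is what turns a hunch about unfairness into a finding a newsroom can report and defend.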

So what is going on in these algorithms, and how can we make them more accountable?

Image: ProPublica’s May 2016 report “Machine Bias.”

A lot of interesting investigative journalism can still be done in this field. Generally, by “poking” at the algorithms and seeing how they respond – correlating the output to the input – we can try to figure out how they work. Investigative journalists can play this game: collect and analyze the data, and determine whether the results are unfair or discriminatory, or whether they lead to other negative or undesirable consequences – censorship, lawbreaking, violations of privacy or false predictions.
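A minimal sketch of that input-output game, assuming a hypothetical web service that returns an estimate; the endpoint, parameter names and neighborhoods are placeholders for whatever system is actually being audited.

```python
import itertools
import requests

# Hypothetical black-box probe: systematically vary the inputs,
# record the outputs, then look for patterns.
ENDPOINT = "https://example.com/api/estimate"

results = []
for neighborhood, hour in itertools.product(["Anacostia", "Georgetown"], range(24)):
    resp = requests.get(ENDPOINT, params={"neighborhood": neighborhood, "hour": hour})
    results.append({
        "neighborhood": neighborhood,
        "hour": hour,
        "estimate": resp.json()["estimate"],
    })

# Correlating output with input: compare average estimates across
# neighborhoods to see which factors drive the algorithm's response.
```

The analysis step is then the same as in the earlier sketch: group the recorded outputs by the input you varied and look for systematic differences.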

There are plenty of methodological challenges to deal with, and you can only really understand why you’re seeing the results you’re getting if you have deep technical knowledge of how a system was built. There are feedback loops between the algorithms and the people who design them. Algorithms are unstable, dynamic systems; results can change from day to day, so tracking over time may be needed. The appropriate sample size and dimensions need to be decided, and confounding variables need to be considered. Then there are plenty of legal and regulatory aspects to look into.
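Because the system under study can change from day to day, one common approach is to repeat the same probes on a schedule and keep timestamped logs. A rough sketch, where query_algorithm stands in for whatever probe is being used:

```python
import csv
from datetime import datetime, timezone

def snapshot(probe_inputs, query_algorithm, path="tracking_log.csv"):
    """Run every probe once and append timestamped results to a CSV log."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for inp in probe_inputs:
            writer.writerow([
                datetime.now(timezone.utc).isoformat(),
                inp,
                query_algorithm(inp),
            ])

# Scheduled daily (e.g., via cron), the log lets later analysis separate
# genuine algorithm changes from ordinary sampling noise.
```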

But perhaps most importantly, we need to ask ourselves what our expectations are. What do we consider to be “fair” algorithms? Different people will have different views on that, but we probably shouldn’t let the algorithms keep deciding it for us.  

Any journalist interested in investigating algorithmic accountability can visit algorithmtips.org for help getting started.


This story first appeared on Journalismfund.eu’s website and is reproduced here with permission.

Nicholas Diakopoulos is an assistant professor at the University of Maryland, College Park’s Philip Merrill College of Journalism. He is director of the Computational Journalism Lab at UMD, a member of the Human-Computer Interaction Lab at UMD, and a Tow Fellow at the Columbia University School of Journalism.

