List of Algorithm Audits

A continually-updated list of studies from the CSCW 2021 paper, "Problematic Machine Behavior: A Systematic Literature Review of Algorithm Audits"

This repository is a work in progress and is open to community edits.

Major update in February 2026 to add web interface.

Add a Study

To contribute a study, add an entry to audit-data.js using this template:

{
  "Year": "2024",
  "Organization": "Google",
  "Behavior": "Discrimination",
  "Specific Behavior": "",
  "Method": "Direct scrape",
  "Domain": "Search",
  "Language": "English",
  "Country Studied": "US",
  "Country of Researchers": "US",
  "DOI": "https://doi.org/...",
  "Title": "Title of the study",
  "Authors": "Last et al.",
  "Source": "2026 Review (Your name)"
}

Guide:

  • Year: Publication year
  • Organization: Platform or system audited (e.g., Google, Facebook, Amazon)
  • Behavior: One of Discrimination, Distortion, Exploitation, or Misjudgment
  • Specific Behavior: Optional subcategory (e.g., Personalization, News distribution)
  • Method: One of Direct scrape, Sock puppets, Crowdsourcing, Carrier puppet, Persona scrape, or Code
  • Domain: e.g., Search, Recommendation, Advertising, Vision, Language Processing, Pricing, Criminal Justice
  • Language: Language of the content studied
  • Country Studied / Country of Researchers: Use ISO-style short names (e.g., US, UK, Germany)
  • DOI: Full URL to the paper (DOI link preferred)
  • Title: Paper title
  • Authors: Abbreviated (e.g., "Smith et al." or "Smith and Jones")
  • Source: The name of the human reviewer
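Before submitting, it can help to sanity-check a new entry against the allowed values above. The following is a minimal sketch, not part of the repository: the `validateEntry` helper and its error messages are hypothetical, and the field names and allowed values are taken from the template and guide in this README.

```javascript
// Hypothetical validation helper for new audit-data.js entries.
// Allowed values are copied from the guide above.
const BEHAVIORS = ["Discrimination", "Distortion", "Exploitation", "Misjudgment"];
const METHODS = [
  "Direct scrape", "Sock puppets", "Crowdsourcing",
  "Carrier puppet", "Persona scrape", "Code",
];

// Returns a list of problems; an empty list means the entry looks well-formed.
function validateEntry(entry) {
  const errors = [];
  if (!/^\d{4}$/.test(entry["Year"])) {
    errors.push("Year should be a 4-digit string, e.g. \"2024\"");
  }
  if (!BEHAVIORS.includes(entry["Behavior"])) {
    errors.push("Behavior must be one of: " + BEHAVIORS.join(", "));
  }
  if (!METHODS.includes(entry["Method"])) {
    errors.push("Method must be one of: " + METHODS.join(", "));
  }
  if (!entry["DOI"] || !entry["DOI"].startsWith("https://")) {
    errors.push("DOI should be a full https:// URL (DOI link preferred)");
  }
  return errors;
}
```

A study passing all checks returns an empty array; otherwise the messages point at which fields to fix before opening a pull request.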

Definitions

Algorithm Audit: an empirical study investigating a public algorithmic system for potential problematic behavior.

  • empirical study: an experiment or analysis (quantitative or qualitative) that generates evidence-based claims with well-defined outcome metrics. Purely opinion or position papers were excluded, although position papers with substantial empirical components were included.
  • algorithmic system: any socio-technical system influenced by at least one algorithm. This includes systems that rely on human judgement and/or other non-algorithmic components, as long as they include at least one algorithm.
  • public: the algorithmic system is used in a commercial context or another public setting, such as law enforcement, education, criminal punishment, or public transportation.
  • problematic behavior: in this study refers to discrimination, distortion, exploitation, or misjudgment, as well as the various behaviors within each of these categories. A behavior is problematic when it causes harm (or potential harm). The ACM Code of Ethics gives examples of harm including "unjustified physical or mental injury, unjustified destruction or disclosure of information, and unjustified damage to property, reputation, and the environment."

Studies

Legacy section. Please use the web interface for a more up-to-date and explorable list.

Discrimination

The algorithm disparately treats or disparately impacts people on the basis of their race, age, gender, location, socioeconomic status, and/or intersectional identity. For example, an algorithm implicated in discrimination may systematically favor people who identify as males, or reinforce harmful stereotypes about elderly people.

Pricing

Advertising

Search

Recommendation

Computer Vision

Criminal Punishment

Language Processing

Distortion

The algorithm presents media that distorts or obscures an underlying reality. For example, an algorithm implicated in distortion may favor content from a given political perspective, hyper-personalize output for different users, change its output frequently and without good reason, or provide misleading information to users.

Search

Mapping

Recommendation

Advertising

Language Processing

Exploitation

The algorithm inappropriately uses content from other sources and/or sensitive personal information from people. For example, an algorithm implicated in exploitation may infer sensitive personal information from users without proper consent, or feature content from an outside source without attribution.

Advertising

Search

Misjudgment

The algorithm makes incorrect predictions or classifications. Notably, misjudgment can often lead to discrimination, distortion, and/or exploitation, but some studies in the review focused on this initial error of misjudgment without exploring second-order problematic effects. An algorithm implicated in misjudgment may incorrectly classify a user’s employment status or mislabel a piece of political news as being primarily about sports, for example.

Criminal Punishment

Advertising
