The Algorithm That Denied a Loan


Region: North America | Issue: Algorithmic Bias | DLL Focus: 9 → 11 (Ethical Visualization & Bias Detection)

A bank's new AI credit-scoring tool promised fairness and speed. Within six months, thousands of qualified minority applicants were denied loans. The model, it turned out, had learned from decades of discriminatory approval data.


Human Impact

Families lost the chance to buy homes; small businesses never opened. The numbers "worked," but the people disappeared behind them. Communities already under-served were quietly excluded again — this time by code.

What Went Wrong

Understanding the root causes helps us prevent similar failures in the future.

No one on the design team examined the training dataset for historical bias

The model optimized for prediction accuracy, not for justice

Data scientists saw the output curve, not the human cost beneath it
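The first two failures above, an unaudited training set and a model tuned only for accuracy, can often be surfaced with a simple fairness check before deployment. The sketch below applies the widely used "four-fifths rule": if one group's approval rate falls below 80% of another's, the data warrants scrutiny. The dataset and helper names here are hypothetical, for illustration only.

```python
# Minimal bias-audit sketch: compare approval rates across applicant groups
# and apply the four-fifths rule (disparate impact ratio below 0.8 flags concern).

def approval_rate(records, group):
    """Fraction of applicants in `group` whose loans were approved."""
    subset = [r for r in records if r["group"] == group]
    return sum(r["approved"] for r in subset) / len(subset)

def disparate_impact(records, protected, reference):
    """Ratio of the protected group's approval rate to the reference group's."""
    return approval_rate(records, protected) / approval_rate(records, reference)

# Hypothetical historical approval data of the kind the design team never audited
records = (
    [{"group": "A", "approved": True}] * 70 + [{"group": "A", "approved": False}] * 30 +
    [{"group": "B", "approved": True}] * 40 + [{"group": "B", "approved": False}] * 60
)

ratio = disparate_impact(records, protected="B", reference="A")
print(f"Disparate impact ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Warning: potential disparate impact (four-fifths rule).")
```

A check like this takes minutes to run, yet it directly answers the question the team never asked: does the historical data treat groups differently before the model ever sees it?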

Ethical Reflection

Without empathy as a design parameter, efficiency became injustice. True data literacy means asking who benefits, who is missing, and what history hides inside the dataset.

Chart-Ed Connection

This failure sits at the intersection of DLL 9 (Detect bias in representations) and DLL 11 (Communicate findings with ethical clarity). A data-literate society audits its algorithms with compassion as well as logic.

Teaching Prompt

Discuss how this outcome could have changed if the development team had applied the DLL "Ethical Visualization" strand during model training and reporting.

Build Better Data Practices

The Chart-Ed Initiative for Global Data Literacy provides standards and frameworks to prevent these failures.
