Incident Reports

An algorithm funded by the World Bank to determine which families should get financial assistance in Jordan likely excludes people who should qualify, according to an investigation published this morning by Human Rights Watch. The algorithm…

Summary
Governments worldwide are turning to automation to help them deliver essential public services, such as food, housing, and cash assistance. But some forms of automation are excluding people from services and singling them out for in…

The World Bank is increasingly incentivizing countries to develop technologies that can find and rank people in poverty so they can be provided with cash transfer and social assistance programs, according to a report by Human Rights Watch…

Similar Incidents

- Northpointe Risk Models
- Gender Biases in Google Translate
