Description: Google's AI Overview feature mistakenly advised parents to use human feces in a potty training exercise, misinterpreting a method that uses shaving cream or peanut butter as a substitute. The incident is another example of an AI system failing to grasp contextual nuance, producing a potentially harmful and, in this case, unsanitary recommendation. Google has acknowledged the error.
Entities
Alleged: Google developed an AI system deployed by Google and AI Overview, which harmed Google, Parents, and Google users.
Incident Stats
Incident ID
791
Report Count
1
Incident Date
2024-09-09
Editors
Incident Reports
Reports Timeline
futurism.com · 2024
Should parents smear human feces on balloons to teach their kids a lesson? Google's AI Overview says yes.
Google searches regarding toilet training tactics for children repeatedly returned suggestions from the company's "AI Overview" search…
Variants
A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.
Similar Incidents
Did our AI mess up? Flag the unrelated incidents
Inappropriate Gmail Smart Reply Suggestions
· 22 reports
Alexa Plays Pornography Instead of Kids Song
· 16 reports