“As we learned in chapter 3, the autocomplete function in smartphones is usually powered by a kind of machine learning called a Markov chain. But companies have a tough time stopping the AI from blithely making depressing or offensive suggestions.
As Daan van Esch, project manager for the Android system’s autocorrect app, called GBoard, told internet linguist Gretchen McCulloch: *“For a while, when you typed ‘I’m going to my Grandma’s,’ GBoard would actually suggest ‘funeral.’ It’s not wrong, per se. Maybe this is more common than ‘my Grandma’s rave party.’ But at the same time, it’s not something that you want to be reminded about. So it’s better to be a bit careful.”*
The AIs don’t know that this perfectly accurate prediction is nonetheless not the right answer, so human engineers have to step in and teach them not to supply that word.”
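The mechanism described above can be sketched in a few lines. This is a minimal toy, not GBoard’s actual implementation: a first-order Markov chain counts which word follows which in a (made-up) typing corpus, and a hand-maintained blocklist stands in for the engineers stepping in to suppress an accurate but unwelcome suggestion.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for typing history; the counts are invented
# so that "funeral" is the most likely word after "grandma's",
# mirroring the anecdote.
corpus = (
    "i'm going to my grandma's funeral . "
    "i'm going to my grandma's funeral . "
    "i'm going to my grandma's house ."
).split()

# First-order Markov chain: for each word, count what follows it.
transitions = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev][nxt] += 1

# Words the engineers have decided never to suggest, however probable.
BLOCKLIST = {"funeral"}

def suggest(word):
    """Return the most frequent *allowed* next word, or None."""
    for candidate, _ in transitions[word].most_common():
        if candidate not in BLOCKLIST:
            return candidate
    return None
```

Without the blocklist, `suggest("grandma's")` would return the statistically dominant `"funeral"`; with it, the model falls back to the next-most-frequent continuation, `"house"`.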
---
**Tags** — [[quotes]], [[artificial-intelligence]], [[ai-problems]], [[teaching-anecdotes]]
**Source** — [[202307171347 — B — You Look Like a Thing and I Love You]]