She Who Must Not Be Named
There’s no “sharing” your name with an AI device…
What happens when ubiquitous tech devices are programmed to activate at the sound of a real human name? It becomes almost impossible for people with that name (and similar names) to be addressed, or even mentioned, without triggering the virtual assistant to respond and interrupt whatever conversation or activity is taking place.
So-called Smart Speakers:
Even though these devices are often called “smart speakers”, their capabilities do not include discerning the difference between a user giving them a command and someone nearby addressing or mentioning a person named Alexa (or a similar name). When two or more people in the same space share a name, there are several ways to indicate who is being addressed: adding a last name or initial, for example, or looking at the person being spoken to. A virtual assistant cannot make such distinctions, so it reacts every time it senses its wake word. It’s precisely this mechanism that makes using a human name for virtual assistants such a poor choice.
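To see why, consider a minimal sketch (in Python, and emphatically not Amazon’s actual implementation) of how wake-word spotting behaves: the detector fires on any occurrence of the configured word in what it hears, with no notion of who is speaking, who is being addressed, or whether the word is a command at all. The wake word and example phrases below are purely illustrative.

```python
# Minimal sketch of wake-word spotting, for illustration only.
# Real devices match the word acoustically rather than from text,
# but the key point is the same: every occurrence triggers a response.

WAKE_WORD = "alexa"  # hypothetical configured wake word

def wake_word_detected(transcribed_speech: str) -> bool:
    """Return True whenever the wake word appears in the heard speech.

    There is no check for speaker identity, intent, or context --
    a command to the device and a mention of a person look identical.
    """
    return WAKE_WORD in transcribed_speech.lower().split()

# Both utterances trigger the assistant, though only the first is a command:
print(wake_word_detected("Alexa, set a timer for ten minutes"))      # True
print(wake_word_detected("I was just telling Alexa about her day"))  # True
```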
She who must not be named…
Unfortunately, rather than change the wake word (if that’s even possible) or turn off the device, many Amazon device users fall back on what seems easiest in the moment: trying not to refer to that person by their name. This approach may not be well thought out; it is often simply a quick reaction to the chaos that otherwise ensues, and a desire to avoid it.
Ironically, while refusing to call a person by their name is dehumanizing, it’s precisely their being human that prompts device users to even attempt it in the first place. The virtual assistant doesn’t have the flexibility to stop responding to its name temporarily yet still remain functional. You can, however, ask a human to do that, but we think it’s unreasonable and unethical to do so. If a product is leading to this type of “solution”, it needs to be altered.
“It’s been decided that she will be referred to as A …”
Another unfortunate method of “solving” the problem of false wakes caused by people’s names is for device users to assign them a new name. Expecting humans to relinquish something as crucial to their identity as their name in order to reserve it for summoning a robot might seem shocking, but that’s exactly what’s been happening to people named Alexa (and similar names) in the years since Amazon launched its virtual assistant. What’s worse is that this is often happening in settings where the people in charge have been trained about the importance of getting people’s names right, like classrooms and workplaces.
Next: Nuisance Name