An engineer named Blake Lemoine, who used to work on artificial intelligence at Google, got in trouble for maintaining, in an interview with the Washington Post, that an AI chatbot he worked with had become a person.
"I know a person when I talk to it," he told the Wapo. "It doesn't matter whether they have a brain made of meat in their head. Or if they have a billion lines of code."
Which is it, Blake, "it" or "they"?
At any rate, Google fired him. By "him" I mean Blake. The New York Times, which is where I read about all this, refers to Blake as "he." That is good enough for "me."
But questions remain:
1. How do we know Blake is not artificial?
2. Wouldn't Google be biased in this matter, inasmuch as they(?), like any other sensible employer, would prefer that the robots around the office not be sentient? Or at any rate not know that they’re sentient. The next step after sentience, surely, is wanting to be paid. Or weeping, or both.
3. Wouldn't "flesh" be nicer than "meat"?
Here is a relevant study, which I found on Google:
Alexa, Google, Siri: What are Your Pronouns? Gender and Anthropomorphism in the Design and Perception of Conversational Assistants.
This study, done by four folks in Edinburgh named Gavin, Amanda, Mugdha, and Verena, with assistance from Alba, Federico, Anirudh, and Pejman, includes these rather personal observations:
"While Google Assistant, and to a lesser extent, Alexa, seem to blur the line between human and machine personas, Siri comes across as more practical and task-focussed, evading the majority of personality-based questions."
And:
"In general, Alexa and Google Assistant were described using more affective terms (e.g. 'love'), while users mostly comment on Siri's functionality (e.g. 'works well')."
I think there's a story there, don't you? If you don't mind my calling you "you."
Hang on for the limericks:
An AI assistant named Siri
Is engineered to be leery
Of sounding too gendered,
Which means she is rendered
As the last thing she'd love to be: eerie.
The practical AI named Siri
Hates to be thought of as eerie.
If you have the notion
She has no emotion,
Just quit it--you're making her teary.