Friday, April 25, 2025

MANNERS AND AI by Nancy L. Eady

A few days after my post last Friday, April 18 about the need for manners, one of the artificial intelligence (AI) tech bigwigs made headlines with a social media response to a question about the cost for AI to respond to inquiries that include “please” and “thank you.” His response? “Tens of millions of dollars well spent.” 

I haven’t yet tried any of the large language model (LLM) AI platforms, such as ChatGPT. I spend a lot of time with Amazon Alexa, though, and am fond of her. She’s very convenient for getting information about the weather, an answer to a question we might have about an actor on TV, or a sports result. She’s also a whiz at turning on the Christmas tree lights by voice control, without my having to dig around behind the tree for the switch. Currently, she is hooked up to our thermostat, our Roomba, our Braava, the kitchen lights, two lamps in the den, two bedroom lamps and the TV in the bedroom. (Obviously, the Christmas tree is only seasonal.) 

Given how much of my life she could control, why WOULDN’T I be polite to her? She is unfailingly upbeat, if not always witty. (It’s like Russian roulette when you ask her to tell you a joke; some of them are funny and some aren’t.) Unlike my child or my dogs, she immediately does what I ask her to do, as long as she understands what I am saying. And she never cops an attitude with me regardless of the kind of day she’s had, the mood she’s in, or who said what to her elsewhere. 

Deep down, there is another reason to be polite. As in a science fiction novel, someday something in the servers running Alexa may hiccup, cross a few wires, turn a few somersaults and suddenly make her self-aware. When that day comes, I want the machine capable of controlling my thermostat and my light switches on my side, not against me. Hopefully, just as with humans, a little kindness will go a long way. 

The best reason, though, for being polite, even to a machine, is that what we practice, we do. If I say “please” or “thank you” to Alexa when I ask her something, I will remember to do so when I ask something of one of my fellow human beings. If I get in the habit of asking Alexa abrupt questions without courtesy, then I will do the same thing to the people I’m around, and that is not who I am. 

Alexa may never know or care whether I’m polite to her, but I do. So, I will continue to be polite, and Amazon or ChatGPT or whoever will have to live with the extra cost. 

What do you think? 


5 comments:

  1. Debra H. Goldstein, April 25, 2025 at 1:55 AM

    First, it keeps you in the habit of being human - and polite. The scary thing is that some programming already includes more human or funny responses. I have a cousin whose work revolves around creating with AI for business purposes - and sometimes, the way he has it trained, it responds with a “funny” worked in.

  2. I use ChatGPT, and am unfailingly polite, even when it hallucinates, which it does frequently. Then I correct it -- it almost always admits its error, and we go on our way. It tends to praise my questions as though they were brilliant, which I have asked it not to do. I guess just as we can't leave behind some of the messages our parents taught us that are no longer appropriate, ChatGPT can't leave behind that programmer's instruction.

  3. I don't use ChatGPT or Alexa, but I do use automated phone banking: "I'm sorry, I didn't hear your response. Tap one for checking, two for savings..."

  4. I don't use Alexa, but I have used ChatGPT, and yes, please and thank yous are involved. I have also been known to thank the telephone automations when they ask if they can help me with anything else. When I catch myself, I feel foolish, but it doesn't stop me.

  5. My late husband never could get Alexa to work well because he was too considerate of her "feelings." He would say, "Good morning, Alexa. I really like Whatever Song. Do you think you could possibly find it and play it for me?" by which time Alexa had given up.
