I discussed a while ago how Interactive Voice Response (IVR) systems are being designed to be more human-like. Well, the reverse is also sometimes true, with human operators becoming more and more computer-like. Consider:
Our car lit up a “service required” lamp, so I called the 24×7 number provided by our garage to ask whether the car was due for maintenance. A polite young lady answered:
Young Lady: How may I help you?
Me: Hi. My car claims it needs maintenance, but it has only 5500 km on it. I want to know whether this model requires maintenance at 5000 km.
YL: What is your name, please?
Me: Zeldes. [I was assuming she plans to look up my car in some customer database]
YL: Is that your first name?
Me: No, it’s my last name. [Duh!…]
YL: What is your first name?
Me: Nathan. [Strange question: the database would be indexed by last name!]
YL: May I have your phone number? Someone will call you.
Me: [gave my cellular number].
YL: May I have your home number?
Me: No, use my cellular, it’s what I can be reached at.
YL: May I have your home number?
At this point it hit me: I was talking to a computer program! It was implemented in wetware, but the girl was following a preset routine and had no independent thought: a living computer. So I gave her my home number, and she exited that particular program loop and eventually hung up.
And it struck me that the moment she repeated the home number question was when I became certain there was no point in trying to talk her out of the routine she was bound to; in essence, at that moment she had passed a reverse version of the Turing Test. A human would’ve said, “OK, that’ll do then.”
Incidentally, the term “Reverse Turing Test” can be interpreted in many ways – here’s another, more commonly seen interpretation.
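For the programmers among us, the routine she seemed to be executing might be sketched roughly like this (a tongue-in-cheek illustration only; the prompts and helper names are my own invention, not anything a real call center uses):

```python
import re

def ask(prompt):
    # Read the caller's reply to a scripted prompt.
    return input(prompt + " ")

def looks_like_phone_number(answer):
    # Very rough check: at least seven digits somewhere in the reply.
    return len(re.findall(r"\d", answer)) >= 7

def collect_caller_details():
    details = {}
    details["last_name"] = ask("What is your name, please?")
    details["first_name"] = ask("What is your first name?")
    details["cell_number"] = ask("May I have your phone number? Someone will call you.")
    # The loop that gave the game away: keep asking for a home number
    # until one is supplied, no matter what else the caller says.
    while "home_number" not in details:
        answer = ask("May I have your home number?")
        if looks_like_phone_number(answer):
            details["home_number"] = answer
    return details

if __name__ == "__main__":
    print(collect_caller_details())
```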
May 27, 2008 — 10:34 pm
What’s even more interesting and amusing is attempting to game the script: answering questions that weren’t asked, answering questions with questions, or giving answers that could still pass as correct. For example, answering the home number question by repeating your cell number. It’s doubtful the person asking would even recognize it was a repetition, since they get so locked into asking the questions on the script.
May 28, 2008 — 7:55 am
Hey, that’s an intriguing idea, Charlie! Worth a try next time…
October 13, 2008 — 11:55 pm
If this young lady were the human component of a true Turing test and strictly adhered to her script, then existing bots might be more convincing as humans. Did Turing specify any minimum competency for the human component?
October 14, 2008 — 8:31 am
Not that I recall, Charlie. I think he just assumed the human would do his best to beat the computer.