Rethink AI Objectives

Using AI to create humanlike computers is a shortsighted goal.

When customers feel like no one is listening, they may be right. Or, they may be unknowingly talking to a machine. Last week, Google announced Duplex, an artificial intelligence (AI) assistant that can handle customer service requests, such as booking an appointment or providing basic information. We’ve had automated phone attendants for years, but the new buzz is that customers can’t tell that this automated attendant isn’t human.

The Duplex announcement quickly brought questions about transparency. Should companies include a notice that “objects on the phone are less human than they appear?”

Of course they should (and Duplex will). Investing massively in technology to be intentionally duplicitous seems like it belongs in the junkyard of bad ideas we shouldn’t even consider, such as earthquake generators or encryption back doors. What would be the benefits of being dishonest with customers? Is the “gee whiz” worth the potential backlash?

Just because we can do something doesn’t mean we should. Consider the benefits of being up-front about interactions with bots:

  • Customer satisfaction can increase. If customers know they are interacting with a computer rather than taking up another person’s time, they may spend more time customizing their selections. I’ve certainly spent far more time online investigating travel options than I would ever subject a human travel agent to, and that deeper exploration can lead to greater satisfaction.
  • Customers can be better informed. When interacting with other humans, people hesitate to ask what they perceive to be dumb or repetitive questions. We can be reluctant to admit that we still don’t understand. But when interacting with machines, we are free to clarify to our heart’s content, without worrying about tying up someone’s time with a long call.
  • Customer interaction can be faster. Social norms dictate that we first lubricate the conversation with bromides and banalities about the weather and other pleasantries. I strongly suspect that most customer service people don’t actually care whether “I’m doing OK for a Monday.” Knowing that we are interacting with a machine, we can get straight to the point, saving everyone time.
  • Customers may be more honest. We care what other people think about us. As a result, we may not be completely honest when what we think differs from what we believe the other person wants to hear — the social desirability response bias. We are less likely to worry about what a machine thinks about us than what another human thinks.

Source: MIT Sloan Management Review
