Brian Pokorny had heard of AI systems for call centers before. But as the IT director of Otsego County, New York, he assumed he couldn’t afford them. Then the pandemic hit, and the state governor ordered a 50% reduction of all government staff, forcing Pokorny to cut most of his call center employees. Meanwhile, inbound calls were rising as more residents began seeking reliable covid-related guidance and medical information.
So Pokorny picked up the first solution that landed on his desk: Watson Assistant for Citizens, which IBM had started offering to governments, health-care organizations, and research institutions in a 90-day free trial. Within days of his signing up, the Watson team helped him deploy a chatbot to address callers’ most common questions, such as how to identify symptoms or how to get tested. The software also made it easy for him to update and expand the chatbot’s responses as queries evolved.
As the coronavirus crisis has dragged on, understaffed government agencies, grocery stores, and financial-services firms have all scrambled to set up similar systems to handle a new influx of calls. IBM saw a 40% increase in traffic to Watson Assistant from February to April of this year. In April, Google also launched the Rapid Response Virtual Agent, a special version of its Contact Center AI, and lowered the price of the service in response to client demand.
While call centers have long been a frontier of workplace automation, the pandemic has accelerated the process. Organizations under pressure are more willing to try new tools. AI firms keen to take advantage are sweetening the incentives. Over the last few years, advances in natural-language processing have also dramatically improved on the clunky automated call systems of the past. The newest generation of chatbots and voice-based agents are easier to build, faster to deploy, and more responsive to user inquiries. Once adopted, in other words, these systems will likely be here to stay, proving their value through their ease of use and affordability.
IBM’s and Google’s platforms work in similar ways. They make it easy for clients to spin up chat or voice-based agents that act a lot like Alexa or Siri but are tailored to different applications. When users text or call in, they are free to speak in open-ended sentences. The system then uses natural-language processing to parse their “intent” and responds with the appropriate scripted answer or reroutes them to a human agent. For queries that can’t be answered automatically, the algorithms group similar ones together to show the most commonly missed intents. “The nice thing about the technology is that it somewhat learns what types of questions are being asked, so we can plug them in; we can program it after the fact,” says Pokorny.
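The flow Pokorny describes can be sketched in a few lines of code. This is not IBM's or Google's implementation; real platforms use trained language models rather than word overlap. The intents, answers, and threshold below are all hypothetical, chosen only to illustrate the match-or-escalate logic and the grouping of missed queries:

```python
# Toy sketch of an intent-matching call-center bot (NOT Watson or
# Contact Center AI). Intents, answers, and threshold are invented.
from collections import Counter

# Each intent pairs an example utterance with a scripted answer.
INTENTS = {
    "symptoms": ("what are the symptoms of covid",
                 "Common symptoms include fever, cough, and fatigue."),
    "testing": ("how do I get tested for covid",
                "Visit your nearest testing site; no referral is needed."),
}

CONFIDENCE_THRESHOLD = 0.5  # below this, reroute to a human agent
unmatched = Counter()       # missed queries, grouped for later review

def similarity(a, b):
    """Jaccard overlap between the word sets of two utterances."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def respond(utterance):
    # Score the caller's open-ended sentence against every intent.
    best_intent, best_score = None, 0.0
    for name, (example, _answer) in INTENTS.items():
        score = similarity(utterance, example)
        if score > best_score:
            best_intent, best_score = name, score
    if best_score >= CONFIDENCE_THRESHOLD:
        return INTENTS[best_intent][1]
    # No confident match: log it so commonly missed intents surface,
    # then hand off -- the "program it after the fact" loop.
    unmatched[utterance.lower()] += 1
    return "Let me transfer you to a human agent."
```

In this sketch, `unmatched.most_common()` plays the role of the platforms' reporting view: an operator reviews the most frequent missed queries and adds new intents for them, which is how the system "somewhat learns" over time.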
The platforms are proving popular among a range of organizations, especially those with limited technical resources. Small and midsize government organizations, including the city of Austin, Texas, and the Czech Ministry of Health, have used Watson to build chatbots that provide information about covid testing, prevention, and treatment. The Oklahoma Employment Security Commission has used Google’s virtual agent to help field over 60,000 daily calls related to unemployment claims. Health providers like the University of Arkansas for Medical Sciences and the University of Pennsylvania’s medical school have worked with both platforms to develop patient triage tools that help them administer timely care.
The goal of the systems is…