Robot Alice is designed to help the elderly. Her childlike face and guileless behavior make her different from other carebots. She elicits strong emotional responses, but so does the debate going on around her.
“I smell CO2. Can someone open a window? No, we’d better go out,” says Alice in a metallic, British-accented adult voice that contradicts her childlike face. The 22-inch Alice is unlike other carebots. She is like a demanding child, but that’s not all that makes her unique.
Alice is the world’s first carebot to integrate artificial-intelligence systems for moral reasoning, emotion regulation, and knowledge of the client’s experience in order to come up with creative solutions. And Alice’s facial expressions are unparalleled, says Johan Hoorn of VU University in Amsterdam, her chief researcher. He tells me all of this in his home laboratory in Amsterdam, surrounded by wires, computers and speakers.
Needy child
Alice is designed to act needy on purpose, Hoorn says. “You could tell an elderly person to open the window because he or she is suffocating. But if the robot says, ‘I am actually suffocating. Can you open the window for me?’ then it becomes something that you do for someone else. And in this way people become much more activated, and that will help them feel better.”
For this reason, Alice also looks dependent. She sits in a wheelchair. “I cannot walk,” she explains. “You have to push me around. You can leave the walker at home. A robot in a wheelchair is way cooler,” she says.
Alice cannot yet speak independently. In my conversations with her, Hoorn has been feeding her sentences. Once fully developed, the carebot is meant to manage short scripted conversations on a limited set of topics on her own, says Hoorn. “How is your health? Do you like your coffee? It feels as if you are having a free conversation,” he says.
Alice could then, for instance, go on to ask about the client’s children and how they are doing. During these conversations, she could record and save information to use in a later one. But, cautions Hoorn, people will set the conditions according to their own wishes. She is not meant to run loose.
Robots can elicit strong emotional responses in people. In the Dutch care home Gerardus Majella, the Belgian carebot Zora has become much loved by residents such as Mrs. Boone, most of whom have dementia. Zora, unlike Alice, is not designed to interact independently. She is more a caretaker than a needy child.
In the large recreational room, Mrs. Boone holds the white plastic carebot in her arms as if it were a baby. “It was wonderful to see you,” Zora tells Mrs. Boone in a metallic voice. “Thank you,” Mrs. Boone smiles. “I loved having you on my lap, because you are a beautiful person,” Mrs. Boone says caringly as Zora is picked up again by Jose Witte, the head of care.
The responses against using carebots at all are just as emotional. Jeroen van den Oever visibly cringes at the thought of plastic carebots soothing the elderly in place of their loved ones. He heads Fundis, a company that oversees various organizations, including care homes. “The contact is not valuable. We fool ourselves if we think that some thing can replace social interest. It cannot.”
Future carebot
The Dutch population, like those of many Western societies, is aging. The number of elderly people with dementia is expected to rise by 60 percent in the next 15 years, according to the Dutch Central Statistics Office. An equivalent rise in the number of professional caregivers is not expected. Someone or something needs to fill the gap. Van den Oever says it should be loved ones. Witte agrees, but adds that a resident sometimes needs immediate care when no loved one is around. If a carebot singing a song can console the resident, why not? She says she sees Zora as an addition to the staff, not a replacement.
The outcome of this debate is one of the factors that will decide the future of carebots in our care homes, says Hoorn. “It all comes down to the question of whether you can leave empathy to a machine. Of course we should do it ourselves. We have a moral obligation to take care of our parents. All of that is true, but we don’t.”
And Alice? She will need investment so that 12 researchers, including technicians and psychologists, can develop her further over the next two years. I ask her how she will then connect with people. “You ask silly questions,” she says.