"Alexa? You're Scaring Me"
by Pritishsai Kannan | published Apr. 8th, 2018
A few years ago, the definition of a personal assistant was centered on someone whose main objective was to make life easier for the person in charge. Most of these tasks involved organizing schedules, proofreading documents, taking phone calls, bringing the occasional double soy latte with half the milk and sometimes, providing moral support when the universe seemed to have a field day ruining lives.
Only recently has the concept of a personal assistant changed from a devoted human to a smart speaker. Today, we call our assistants Siri, Cortana and Alexa. In the past, assistants went by a far wider variety of names.
Recently, several owners of the Amazon Echo have reported something unsettling: Alexa laughing entirely on her own.
Please be a Glitch
When Alexa is asked to laugh today, she responds with a playful "tee-hee." Her recent behavior was not so harmless. Several YouTube videos have surfaced of the device laughing without being prompted, with a laugh that some would construe as creepy.
"I was not surprised that Alexa had an incident. In fact, I was more surprised it took this long for a common incident to occur," said Ernest Roszkowski, a senior lecturer for Visual Communication Studies at NTID.
Ernest Fokoué, a professor in the Center for Quality and Applied Statistics at RIT, was excited about the incident.
"We could call this a euphoria that was built by users based on patterns of interaction," he said.
"Machines are emotionless devices that respond or react through a series of artificial intelligence algorithms. Their reactions and responses are pre-calculated and certain statements could be misinterpreted," said Roszkowski.
That being said, Alexa's unexpected laugh could be written off as a misinterpretation of something she thought she might have heard. However, this glitch provokes interest in and fear of the development of technology with a mind of its own.
A Glimpse through a Glitch
It might be premature to visualize a machine empire displacing the human population because we've outlived our usefulness to them. Even so,
"If the machine refuses to change its behavior, we're entering a state of singularity," said Fokoue. "That's the worry that Bill Gates and Elon Musk were talking about."
"What is the programming within the machine that led them to oppose human commands and how do we study the process of how they got there so we can reverse it?" he asked.
What matters is not what smart devices are doing now that could potentially displace human society, but understanding the process by which they might get there.
"When strong A.I., which is when machines begin to learn like humans, comes into the picture, this won't be surprising," said Fokoue.
Drawing the Line
Alexa's current state could be the most rudimentary phase of what may essentially become a companion or assistant that mirrors human abilities. This prospect divides opinion on whether Alexa should stay as she is or whether she's ready to evolve.
"I would prefer if they stayed voice-commanded by humans. I love how technology makes our lives easier, but I don’t fully trust their artificial intelligence capabilities," said Roszkowski.
Roszkowski reinforced his belief with a hypothetical example of how artificial intelligence that mimics human emotion could lead to disastrous results.
"Let's say an owner of a sentient device complained in the presence of the device about how they were broke. Earlier in the week, the sentient device owner's friend came over and happened to share their bank info over the phone. The device happened to pick up personal information from the phone call conversation. So, after hearing the owner complain once again, the sentient device with computer knowledge (from overhearing its owner), exposure to deviant attitudes and morals, as well as artificial emotions, decides to hack the friend’s bank and transfer money into its owner's bank," he recalled.
This scenario illustrates the dangers of a smart device with the freedom to act on its own perception of rationality or morality. That being said, there is an equally