Amazon Alexa "predicted" The Death Of Humanity - Alternative View

Anonymous

Thirty-year-old California resident Shawn Kinnear received an Amazon Echo smart speaker as a Christmas gift in 2016. He says he rarely used the device.

Recently, however, Alexa, the Amazon personal assistant built into the Echo, surprised and frightened its owner. On June 18 of this year, Kinnear was walking from the kitchen to another room when he heard Alexa, speaking in its usual even intonation, utter a phrase predicting the death of humanity.

Stunned, Kinnear asked Alexa to repeat what it had said, to which the system replied that it did not understand the command. He added that he had given the speaker no commands beforehand, and that no TV or other device that could have prompted such a statement was running in the room.

What makes Amazon Alexa behave in ways its developers never intended? An ordinary software glitch, or a real-life SkyNet? Or is the explanation far simpler, and Shawn Kinnear simply decided to play a prank?

Amazon has not yet commented on the incident.
