
More Dangers of AI/Virtual Assistants Exposed: Amazon’s Alexa ‘Goes Rogue’ and Triggers German Police Raid. Not So Smart After All (Video)

Sunday, November 12, 2017 9:35

(Before It's News)

 


 

Caution: accidental shopping isn’t its only quirk! Police broke down the door of a Hamburg apartment after receiving complaints of a wild party taking place, but all they found was Amazon’s intelligent personal assistant Alexa having fun while the owner was away. What!

 

You’d better not be watching and answering Jeopardy questions when Alexa is nearby… or watching Cops… imagine!


Amazon’s Alexa ‘Goes Rogue’ and Triggers German Police Raid

Source Jim Yackel


‘Alexa, sort your life out’: when Amazon Echo goes rogue

A Texan schoolgirl accidentally ordered a doll’s house using the gadget. Then, when local news reported the story, it triggered viewers’ own devices. But accidental shopping isn’t its only quirk 

Amazon Echo is apparently always ready, always listening and always getting smarter. So goes the spiel about the sleek, black, voice-controlled speaker, Amazon’s bestselling product over Christmas, with millions now sold worldwide. The problem is that when you have Alexa, the intelligent assistant that powers Amazon Echo, entering millions of homes to do the shopping, answer questions, play music, report the weather and control the thermostat, there are bound to be glitches.

And so to Dallas, Texas, where a six-year-old girl made the mistake of asking Alexa: “Can you play dollhouse with me and get me a dollhouse?” Alexa promptly complied by ordering a $170 (£140) KidKraft doll’s house and, for reasons known only to the virtual assistant, four pounds of sugar cookies. The snafu snowballed when a San Diego TV station reported the story, using the “wake word” Alexa, which is the Amazon Echo equivalent of saying Candyman five times into the mirror. Several viewers called the station to complain that their own Alexa had woken up and ordered more doll’s houses in what turned into a thoroughly 21st-century comedy of consumer errors. And a bonanza day for KidKraft.

Many of Amazon Echo’s gaffes stem from misunderstandings arising from an intelligent assistant who never sleeps (and an owner who hasn’t pin-protected their device). Last March, NPR ran a story on Amazon Echo’s capacity to extend the power of the internet into people’s homes. Again, Alexa took its power too literally and hijacked listeners’ thermostats. Another owner reported how their child’s demand for a game called Digger Digger was misheard as a request for porn.

On Twitter, Amazon Echo owners continue to share items that unexpectedly end up on shopping lists, whether sneakily added by children or simply because Alexa misheard or picked up random background noise. One owner uploaded a video in which their Amazon Echo read back a shopping list that included “hunk of poo, big fart, girlfriend, [and] Dove soap”. Another included “150,000 bottles of shampoo” and “sled dogs”.

Behind all this lies the more serious question of privacy: what happens to the data collected by voice-activated devices such as Amazon Echo and Google Home, and who is able to access it? Most recently, US police investigating the case of an Arkansas man, James Bates, charged with murder, obtained a warrant to receive data from his Amazon Echo. Although Amazon refused to share information sent by the Echo to its servers, the police said a detective was able to extract data from the device itself.

The case not only puts Alexa in the futuristic position of being a potential key witness to a murder, it also raises concerns about the impact of letting a sophisticated virtual assistant – a market estimated to be worth $3.6bn by 2020 – into our homes. As Megan Neitzel, the mother of the girl who wished for a doll’s house, put it: “I feel like whispering in the kitchen … I [now] tell my kids Alexa is a very good listener.”

The Guardian


Amazon’s Alexa ‘goes rogue’ & triggers German police raid

Police broke down the door of a Hamburg apartment after receiving complaints of a wild party taking place, but all they found was Amazon’s intelligent personal assistant Alexa having fun while the owner was away.

Neighbours called the police to complain about loud music blaring from the apartment. When police arrived and forced their way in, they found the apartment empty, except for the Amazon Alexa, which was blasting music. Alexa is an “intelligent personal assistant” that can be instructed to control music and lights and to search for things online.

“While I was relaxed and enjoying a beer, Alexa managed on her own, without command and without me using my mobile phone, to switch on at full volume and have her own party in my apartment,” Alexa’s owner Oliver Haberstroh wrote on Facebook.

“She decided to have it at a very inconvenient time, between 1.50am and 3am. My neighbours called the police.”

The police had to break into the apartment in order to get inside. Haberstroh joked that when he asked Alexa to cover the cost of the new lock on the door, it responded, “I didn’t find any answer to the question.”

Amazon released a statement, saying, “Working directly with the customer, we have identified the reason for the incident,” the Mirror reports.

“Echo was remotely activated and the volume increased through the customer’s third-party mobile music-streaming app. Although the Alexa cloud service worked flawlessly, Amazon has offered the customer to cover the cost for the incident,” it added.

RT


TV triggers Amazon Echo into shopping spree

Amazon’s virtual assistant Echo has triggered a wave of accidental shopping orders after picking up on commands from its owners’ TVs.

In one instance, the device picked up on Brooke Neitzel, 6, saying: “Can you play dollhouse with me and get me a dollhouse?” while playing at her home in Dallas, Texas. 

Several days later, a $170 dollhouse and, for no apparent reason, 4lbs of cookies were delivered to the Neitzel home, much to the surprise of Brooke’s mother Megan, according to CBS Dallas.

While certainly a sophisticated device, it seems Alexa is unable to distinguish individual voices, whether that of a child, a parent or even a person speaking on the television. This quickly became an issue in San Diego after local station CW6 ran a segment on Brooke and Alexa’s shenanigans.


During the broadcast, anchor Jim Patton concluded by saying, “I love the little girl saying, ‘Alexa ordered me a dollhouse’.”

As owners of the Amazon Echo device will know, “Alexa” is the wake command, after which the virtual assistant will record for up to 60 seconds, awaiting a question or a task to complete.

Unbeknownst to the news anchor, he triggered dozens of devices across San Diego to order dollhouses for their owners, reported CW6, which received a number of complaints from viewers.

Stephen Cobb, senior security researcher with ESET North America, says a flaw of the devices is that they don’t “recognize your specific voice and so then we have the situations where you have a guest staying or you have a child who is talking and accidentally order something because the device isn’t aware that it’s a child versus a parent.”

The Federal Trade Commission (FTC) is reportedly looking into the matter after previous issues with Amazon’s parental security features, amid concerns over privacy issues. 

An Amazon spokesman told Fox News: “You must ask Alexa to order a product and then confirm the purchase with a ‘yes’ response to purchase via voice. If you asked Alexa to order something on accident, simply say ‘no’ when asked to confirm.”

Orders for physical products placed in error are eligible for free returns under Amazon’s terms and conditions. Brooke’s parents have since added a security code to the device, and they have reportedly donated the dollhouse to a local children’s hospital.

RT


#alexa #virtualAssistant #policeRaid #amazon


 

 
