It was only the other day that I wondered whether it might be fun to have a cuckoo clock in my kitchen. One powered by Amazon Alexa, that is.
I concluded that the idea was crazy, as are most things that are Alexa-activated.
But we all have our prejudices and many Americans are only too happy to see Amazon’s Echoes and Dots scattered around their homes to make their lives easier.
Why, Alexa can even do your shopping for you, if you want.
Still, perhaps Alexa fans should be warned that things may not be quite as delightful as they seem.
Skills? Oh, everyone has skills.
New research from concerned academics at Ruhr University Bochum in Germany, as well as equally concerned colleagues at North Carolina State University, and even one researcher who joined Google during the project, may well make Alexa owners question the true meaning of life.
The researchers looked at 90,194 Alexa skills. What they found was a security Emmental with so many holes that a mouse might wonder whether there was any cheese there at all.
How would you like to shudder, oh happy Alexa owner?
How about these words from Dr. Martin Degeling: “A first problem is that Amazon has partially activated skills automatically since 2017. Previously, users had to agree to the use of each skill. Now they hardly have any overview of where the answer Alexa gives them comes from, or who programmed it in the first place.”
So the first problem is that, every time you wake Alexa from her slumber, you have no idea where your smart response is coming from. Or, indeed, how secure your question might have been.
Ready for another quote from the researchers? Here it is: “When a skill is posted to the skill store, it also displays the developer’s name. We have found that developers can register with any company name when creating their developer account on Amazon. This allows an attacker to easily impersonate any reputable manufacturer or service provider.”
Oh, please. This is the kind of thing that makes us laugh when big companies get hacked and then don’t tell us for months, if not years.
These researchers actually tested the process for themselves. “In one experiment, we were able to post skills on behalf of a large company. Valuable information from users can be tapped here,” they said modestly.
This finding was also encouraging. Yes, Amazon has a certification process for these skills. But “there are no restrictions on modifying the backend code, which can change at any time after the certification process.”
In essence, then, a malicious developer could modify the code and start harvesting sensitive personal data.
Security? Yeah, it’s a priority.
Then, say the researchers, there are the skill developers who post under a false identity.
Perhaps, however, this all sounds too dramatic. Surely all of these skills have privacy policies that govern what they can and cannot do.
Sit down, please. According to the research: “Only 24.2% of skills have a privacy policy.” So three-quarters of the skills, well, no.
Don’t worry, though, it gets worse: “For some categories like ‘kids’ and ‘health and fitness,’ only 13.6% and 42.2% of skills have a privacy policy, respectively. As privacy advocates, we believe both ‘kids’ and ‘health’-related skills should be subject to higher data privacy standards.”
Naturally, I asked Amazon what it thought of these somewhat chilling findings.
An Amazon spokesperson told me: “The security of our devices and services is a top priority. We perform security reviews as part of skill certification and have systems in place to continually monitor live skills for potentially malicious behavior. Any offending skills we identify are blocked during certification or quickly deactivated. We are constantly improving these mechanisms to better protect our customers.”
It’s comforting to know that security is a priority. Keeping customers entertained with as many Alexa skills as possible, so that Amazon can collect as much data as possible, may be a higher priority, though.
Still, the spokesperson added: “We appreciate the work of independent researchers who help bring potential issues to our attention.”
Some might translate it as, “Damn, they’re right. But how do you expect us to watch all these little skills? We’re too busy thinking big.”
Hey, Alexa. Does anyone really care?
Of course, Amazon believes its monitoring systems work well in identifying genuine bad actors. Somehow, though, expecting developers to play by the rules isn’t quite the same as making sure they do.
I also understand that the company believes that children’s skills are often not tied to a privacy policy because they do not collect personal information.
A parent or two might mumble, “Uh-huh?”
Ultimately, like many tech companies, Amazon would prefer that you monitor, and change, your own permissions, an arrangement that suits Amazon very well. But who really has those monitoring skills?
This research, presented last Thursday at the Network and Distributed System Security Symposium, makes for such a bracing read that at least one or two Alexa users might ponder what they’ve done. And with whom.
Then again, does the majority really care? Until something unpleasant happens, most users just want an easy life, having fun talking to a machine when they could quite easily turn off the lights themselves.
After all, this isn’t even the first time researchers have exposed vulnerabilities in Alexa skills. Last year, academics tried to upload 234 policy-breaking Alexa skills. Tell me, Alexa, how many were approved? Yes, all of them.
The latest skills researchers contacted Amazon themselves to offer a sort of “Hey, look at this.”
They say, “Amazon has confirmed some of the problems to the research team and says it is working on countermeasures.”
I wonder what skills Amazon uses to achieve this.