The concept of “dropping in” on someone is already intrusive — and maybe outdated. I mean, when was the last time you showed up at someone’s house unannounced? Or missed the days when you couldn’t see who was calling and just had to answer the phone? Amazon’s new Echo device, the Echo Show, is equipped with a screen allowing for voice-activated video calls, and a new feature called Drop In. Drop In allows anyone you’ve given permission to drop in on your device and see live footage from your home. All someone has to do is say, “Alexa, drop in on Dara,” and the microphone and camera will start the broadcast. The recipient does not have to answer the call for this to work. The Echo Show will also tell you which of your contacts have been active with their Show devices — conceivably, who is awake or at home. Where you otherwise could decline calls and create the appearance that you are not available, your contacts could now see that you are, in fact, always “available.” Ready for a Drop In?
Source: BuzzFeed News
The Echo smart speaker is always listening for its wake word: Alexa. Once the Echo hears the wake word, it records your voice and stores the data indefinitely in the cloud. These recordings were sought by a prosecutor in a murder case involving an Arkansas man accused of killing his friend during a night of drinking at home in November 2015. The men had been using the Echo to stream music. While Amazon initially pushed back against providing the recordings, the suspect in the case voluntarily consented to their release, and Amazon surrendered the data.
Amazon is currently pitching the Echo Look, a device that goes in your bedroom and takes pictures of you. Yes, it’s true, and you can request an invitation to purchase one for $200. It comes with a program called Style Check, which will shine LED lights on you and take 360° photos and video of you in different outfits. Then, it will tell you which outfit you look best in and which are most on trend.
Writers at Motherboard identify some important considerations:
- Style Check is akin to letting a machine decide if you are overweight
- there may be built-in biases as to which outfits look “good”
- who (and which third parties) will have access to the tremendous granularity of detail about your body, bedroom, and clothing?
- such detail could allow Amazon to detect your emotional state or whether you’re pregnant
- images and videos taken by Echo Look will be stored indefinitely
Some major vulnerabilities here.