Creepy new Amazon camera wants to watch you get dressed in your bedroom
Amazon.com has just released a creepy new "Echo Look" camera that you're supposed to install in your bedroom or bathroom. It allows Amazon to take photos or videos of you getting dressed (or undressed, for that matter) which are uploaded to Amazon.com servers. This isn't a hoax. It's the new insanity of the corporate surveillance state that now exists in America.
(Natural News) Not content to merely track your interests in books, movies and products, Amazon.com, predominantly owned by globalist Jeff Bezos — the same owner of the fake news Washington Post —
now wants to watch you get dressed in your own bedroom.
The company has introduced a $199 camera called “Echo Look” that uploads images of you (from your own bedroom) to Amazon servers where, we’re told, Amazon will make “wardrobe recommendations” you might want to buy.
To get these recommendations from Amazon, you are supposed to install an internet-connected Amazon camera in the same place where you’re totally naked, or having sex with your partner, or carrying out hygienic behaviors involving tampons or talc.
Has nobody thought to ask whether this intrusive Amazon camera will be secretly recording photos and videos of you in your own bedroom or bathroom?
Via The Daily Sheeple:
It’s basically a camera that you stick in your bathroom or your bedroom or wherever you dress, and it takes full length images of what you’re wearing. It’s being called a “style assistant.” It has a service that “combines machine learning algorithms with advice from fashion specialists,” presumably to give you feedback on what you’re wearing. I’m guessing that at some point, it’ll make suggestions for clothes you should buy on Amazon.
Amazon wants to record everything you say or do in your own home
This Echo Look camera is, of course, just the latest insane intrusion into personal privacy by a monopolistic corporation with ties to the CIA and NSA.
Does any informed person really think Amazon isn’t recording all this audio and video, then sharing it with government spooks?
Thanks to Edward Snowden, we already know Google and Facebook are little more than NSA spying fronts that surveil their own users, then turn over all their findings to the intelligence agencies without a warrant.
Now, Amazon apparently believes ignorant consumers desire to exist in an Orwellian, totalitarian society where dangerous corporations run cameras that watch them poop, pee, sing horribly, have kinky gay sex, cross dress, snort lines of coke, masturbate or whatever else twisted people do in the privacy of their homes.
Don’t get me wrong on all this, by the way:
As far as I’m concerned, you are free to do whatever you want in your own bathroom…
but if you are stupid enough to install an Amazon.com camera that watches you do all these things, you are just begging to have all your surreptitious photos and videos blasted all across the ‘net following an Amazon.com server hack.
(Related: Read PrivacyWatch.news for more stories on how your privacy is violated.)
Learn more about how Amazon.com spies on you in my mini-documentary
(and don’t be stupid enough to BUY a spy camera that spies on you for the government):
Amazon debuted the "Echo Look" on Wednesday, a $200 device with a built-in camera designed to allow users to take selfies, organize their wardrobes, and solicit style advice from friends.
But as Amazon celebrates the debut, privacy advocates are warning that Amazon Echo Look could potentially peer into users' most intimate moments at home.
Amazon's Echo Look is essentially an updated version of the personal assistant device Echo — which allows users to read the news, set alarms, get traffic and weather reports, control their other smart home devices, etc. — except this time there's a built-in camera.
The Echo Look is being marketed as a fashion assistant. With the words "Alexa, take a picture" or "Alexa, take a video," the Echo Look will take full-length photos and videos. The device's camera comes with LED lighting, depth-sensing technology, and automatic background blur.
Users can then use a companion app to swipe through the photos and video, and send looks to friends. Amazon also rolled out a new app feature called "Style Check," which gives users advice if they're stuck between two outfits. Style Check is powered by both machine learning algorithms and input from style specialists, Amazon says.
Of course, questions about privacy and security abound as Amazon asks users to invite this camera into some of their most private moments: trying on clothes in their own bedrooms and bathrooms.
Speaking to CNET, Jackdaw Research analyst Jan Dawson suggested that it's possible the Echo Look's built-in camera and audio capabilities could potentially be used for other things, such as surveillance and video communication.
"There's lots of stuff you can do once you've got a connected camera in the home," Dawson said.
Forbes writer Curtis Silver cautioned that the future of technology is not always brighter than the past.
"How is spending $200 to put an IoT [Internet of Things] camera in your bedroom the future of technology?" Silver wrote. "It's the future alright, the future of the end of whatever privacy you thought you had left, and you're paying for it to be removed from your life."
Silver painted a hypothetical picture of where the Echo Look might lead consumers eventually.
"Now look around your bathroom, your bedroom," he wrote. "What do you see? Do you see a used toothbrush that Alexa might suggest you replace? Do you see the sex toys you left out last night? Do you see the person who is not your significant other asking what is that thing you are plugging in? How will Amazon use this information? Look at yourself. What do you see? Do you see the cancer under your skin that an Amazon employee might see first? Do you see the gender you identify with or the one the Amazon Echo Look algorithm thinks you are? The tattoo you are hiding from your family? The depression? The anxiety?"
The Echo only records audio when it's enabled by a so-called "wake word" (i.e. "Alexa...") or through a dedicated app, according to Amazon. When it's recording, the device light turns blue. There's also a button that can be used to turn off audio completely.
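The wake-word behavior Amazon describes can be pictured as a simple gate: audio is ignored until the wake word is heard, then the command is captured and the device returns to its idle state. The sketch below is a hypothetical illustration of that gating logic only (the names `WakeWordGate`, `hear`, and the punctuation handling are invented for this example, not Amazon's actual implementation):

```python
# Minimal sketch of wake-word gating: phrases are ignored until the
# wake word is heard, then captured until the command completes.
# All names here are hypothetical illustrations, not Amazon code.

WAKE_WORD = "alexa"

class WakeWordGate:
    def __init__(self):
        self.recording = False   # mirrors the blue indicator light
        self.captured = []       # what would be sent to the cloud

    def hear(self, phrase: str):
        words = [w.strip(",.!?") for w in phrase.lower().split()]
        if not self.recording:
            if WAKE_WORD in words:
                self.recording = True        # "light turns blue"
                self.captured.append(phrase)
        else:
            self.captured.append(phrase)
            self.recording = False           # command done, light off

gate = WakeWordGate()
gate.hear("what nice weather")        # ignored: no wake word
gate.hear("alexa, take a picture")    # wake word heard, capture starts
gate.hear("use style check")          # follow-up command captured
print(gate.captured)
```

Note that even in this toy model, everything after the wake word leaves the device, which is exactly the behavior the privacy questions below are about.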
All audio and video goes to the cloud, where it appears to remain indefinitely.
In an interview with TechCrunch, Amazon addressed the latest wave of security concerns.
"Designated Amazon personnel may view photos and video to provide and improve our services, for example to provide feedback through Style Check," Amazon said. "We have rigorous controls in place to restrict access to these images. We do not provide any personal information to advertisers or to third party sites that display our interest-based ads."
The biggest online apparel retailer in the country, Amazon has become a major force in fashion. It's expected to triple its share of the U.S. apparel market from 2015 to 2020, according to research from Morgan Stanley.
Couple that with the success of Amazon's personal devices — customers are projected to spend $2.1 billion on smart speakers like the Amazon Echo by the year 2020, according to a recent report — and it's no surprise that Amazon is making bold investments at the intersection of fashion and smart home technology.
For now, the Echo Look is available by invitation only to select U.S. customers, CNET reported.
Source: CBS.
But, before signing up ...
1) Do you really want an uncovered camera in your bedroom?
Most people get dressed in their bedrooms. Could the Echo Look camera be activated by hackers when you don't think it is on? In theory, you could place the Echo Look in a room other than your bedroom; realistically, though, think about where you are used to trying on outfits and comparing them - if you are like many other people, you would have to consciously change that habit, and maybe the location of your mirror, once the Echo Look arrives. And, if you share a residence with others, changing your "fashion analysis area" may not even be practical.
2) Will private objects, papers, medications, etc. be seen?
How tidy do you keep your bedroom, and do you have mirrors in your bedroom? While Echo Look is supposed to blur out backgrounds, are you sure that when it photographs you it won't pick up items that you don't want to be seen in photos? Even if you place the Echo Look elsewhere, will it pick up items of which you don't want Amazon to have an image? Think about how many private "items" you have that could show up in photos or videos -- from who is in your home, to developing medical conditions, to medication bottles, to pregnancy tests, to tattoos, to sex-related items, to letters from debt collectors, to rejection letters from jobs, to medical insurance claims, to firearms, to illegal substances -- I could literally dedicate an entire article to listing out what private things can be picked up in a photo or video within one's home. Are you sure Alexa will never see them?
3) Could your kids trigger photos when you don't expect them to?
Do you have children? If so, are you sure that they will never ask Alexa to take a photograph when you don't want one taken?
4) If you cover your camera in your office, wouldn't you want to do so anywhere else in your home?
Covering the camera, however, undermines the ability to activate Alexa without any physical action - just by instructing the Echo Look orally to "Alexa, take a picture" or "Alexa, take a video." If I bought an Echo Look, I would probably cover it when not in use anyway.
5) Do you want Amazon to know what you own?
Amazon already knows a lot about what you buy from it and from vendors selling through its storefront. Do you really want the firm to have a record of what other clothing you own and in what condition it appears to be?
6) Do you want Amazon to know about your health issues?
Could Amazon detect health issues in the images? Could it utilize knowledge of health concerns to market various products and services to you? Unless Amazon commits to permanently not using the photos for any purpose other than providing fashion-related advice, it is likely that, at some point, it will use the information in images to start marketing other kinds of products - perhaps related to addressing weight gain, pregnancy, skin disease, or other health-related matters. On that note...
7) Will Amazon discover that you are pregnant before you have told the world?
If you share an Amazon account with family members, such a phenomenon could have serious repercussions on family relationships and privacy between family members. (Remember when Target informed a father of his teenage daughter's pregnancy?)
8) Could AI applied to photos produce emotional distress?
Besides warnings (explicit as described above, or even implied through fashion recommendation changes) about potential weight gain, there are other serious issues at hand. Could Alexa's artificial intelligence analysis of images, for example, lead to situations where women receive baby product samples and coupons after suffering miscarriages? If you think that such a scenario is unlikely, consider that marketers have made that mistake multiple times in the past.
9) Could images be demanded by the government?
Of course, anything in Amazon's possession could be demanded by government officials with warrants. There may not be anything in the photos that concerns you -- but, do you really want curious government folks snooping around inside your closet -- and your home?
10) Do you trust all of the people who will ever work for Amazon?
While Amazon claims that it has "rigorous controls in place to restrict access to these images," according to various reports, it also states that "Designated Amazon personnel may view photos and video to provide and improve our services, for example to provide feedback through Style Check." Keep this in mind.
11) Will Amazon store the images forever?
If so, what will it do with them? Will it ever change its policies regarding how the images are used? Could it eventually sell them to third parties unless folks opt out at some point in the future? Forever is a long time - it is hard to know what the future brings.
12) What if someone's sense of clothing and modesty changes with time?
Someone who converts to various faiths or adopts a stricter interpretation of a faith in which he or she was raised, for example, may not wish to have photos on Amazon's servers of themselves dressed in various outfits. Will Amazon make it easy to get rid of all such images from all storage locations including from those used by employees as mentioned earlier?
There is little doubt that the Echo Look represents the future of AIs -- hearing and seeing. But, if you decide to get one, be sure to understand the privacy repercussions - and take steps to minimize their impact on your life.