Amazon dressed me for a week. Here's what I learned.



Echo Look "Style Check" screen captures from the device and the Amazon Echo Look app. The device uses artificial intelligence to "help you look your best." (Amazon)

Getting dressed in the morning can be a chore. So, for a week, I outsourced the job to Alexa.

Millions of us use Alexa, the artificial intelligence technology behind Amazon's Echo smart speakers, to get the news, answer questions, and play music. But the Echo Look, a $199 model released last month, adds a new twist: Alexa can "help you look your best."

The Echo Look experience will feel strangely familiar to anyone who remembers the computer program that helped Alicia Silverstone's Cher Horowitz pick the right outfit in the '90s movie Clueless. The six-inch device syncs with your phone and your Amazon account and uses a hands-free camera to take full-body photos or 360-degree videos of your outfits. Start browsing with the Look, and Alexa's artificial intelligence steps in, organizing your daily wardrobe and suggesting new pieces – to buy on Amazon, of course.

A "Style Check" feature compares two outfits side by side. learning algorithms to choose a winning look. You can even have your images reviewed by the team of Amazon fashion specialists or other users via the Instagram social feed of Amazon, Spark.


Amazon has been testing the Echo Look and its companion phone app as one of its many forays into the apparel retail business. The company has more than 65 private-label clothing brands and last month opened its subscription fashion-box service, Prime Wardrobe, to all of its Prime members.

Kenlyn Jones, an assistant professor at the Massachusetts College of Art and Design, said that even though Amazon has revolutionized retail, "they have not been able to make the transition to fashion" as easily. The Echo Look, she said, helps the company break into the industry by collecting data on all the clothing purchases we've made elsewhere.

Intrigued, I asked the company for a review unit of the new gadget. I should note that my style options are somewhat limited at the moment. At seven months pregnant, I wanted to see if the Echo Look was up for a challenge.

Setup is child's play: I download the Echo Look app, sign in to Amazon, and connect the device to my home Wi-Fi. Within seconds, a friendly voice that sounds uncannily like Rachel Zoe's offers to take a picture of my outfit. (The celebrity stylist is a partner in the Echo Look's launch.)

I position the Look's camera so it captures me from head to toe.

"Alexa, take a picture," I say. After a long flash, he captures my image.

Yikes. The picture is terrible: my hair isn't dry, I'm not wearing makeup, and the angle is awful (oh, so many chins). I reach for my hair dryer and realize: the Look has already made me far more self-conscious.

Primped and ready, I make a second attempt. Using Style Check, I try a blue-and-white striped dress, then a red-and-white striped dress (I like stripes). Within seconds I learn that the blue and white is the winner; the "colors work better together" and the "outfit shape works better" for me. A third outfit, a blue-and-pink patterned wrap dress, beats the blue-and-white number.

"Better colors", the Look tells me through the application.

Sure enough, when I arrive at the office later that day, I am immediately greeted by compliments from colleagues.


"Thank you" I replied. "Amazon has dressed me today."

So what does the AI do, exactly? Jones says the images are probably scanned to estimate my measurements and body type, using the same proportion calculations you would use in traditional pattern-making. Combined with the data Amazon has collected on fashion trends, it's "determining what works best statistically for your body," she said.

Throughout the week, as I snap a few outfits each morning, the Look builds a profile of my closet. I start asking it for fashion advice.

"Denim-on-denim is here to stay – try mixing the light and dark washes," the look told me.

I grit my teeth. I am firmly against the concept of the "denim tuxedo."

The next day, it recommends I buy a quilted fanny pack. Um, just no.

Another day, I flip over to the Look app, which suggests that my black stretch pants would pair well with a military-green BCBGeneration wrap shirt. I tap the image, and it sends me straight to my Amazon app. One click, and for $59, it's mine.


Amazon may give us the impression that it is creating personalized offers just for us, but in reality, it is simply collecting data in the hope of getting us all to buy more things.

That is a major goal of the Echo Look, explains John Cheney-Lippold, a professor of American culture at the University of Michigan who specializes in Internet studies.

The profile Amazon has created about me, based on my previous purchases of books, coffee makers, and gardening tools, is easily grouped with those of other women of my age and demographic, whose tastes, Amazon bets, resemble mine. Maybe I won't buy the shirt, but there is a good chance someone else will.

"It's a really dehumanizing way of understanding the world," jokes Cheney-Lippold.

More disturbing: my pregnancy is also probably of interest to Amazon, he adds; its algorithms may well spot that I am a good candidate for its diaper subscription service. Amazon may also be eyeing the modern furniture and floral bedspread in the background of my photos, says Cheney-Lippold, hoping to sell me matching pieces to fill out the room.

I tell Cheney-Lippold that this makes me uncomfortable. He laughs. He is not finished.

The device could read my emotions, discern my sex and ethnicity, and infer my class from the size of the room or the amount of natural light in the photo, Cheney-Lippold tells me.

I raise these issues with a company representative, who assures me that "we do not identify elements of your photos that are not related to your outfit."

But Amazon has not updated its privacy policies. And Cheney-Lippold says that as the technology advances, the company could eventually want to do more with the images than we might expect.

Facebook saw massive pushback from users after the political consulting firm Cambridge Analytica accessed the personal data of millions of the social network's users.

Cheney-Lippold says privacy may be less of a concern with Amazon because the company has a commercial interest in maintaining customer trust. But he says that every time someone uses the Echo Look, they help Amazon improve its machine learning capabilities – and no one really knows how the company will use those capabilities in the future.

So, do I like the device? It's definitely useful and fun to play with. And if I desperately needed a wardrobe update, as I might in a few months, it would be handy for placing clothing orders without much effort.

But do I trust the device to dress me? I don't think so. Style, as I see it, is a combination of personal taste, real-life experience, and emotion. And I don't trust an algorithm to know me that well.

Janelle Nanos can be contacted at [email protected]. Follow her on Twitter @janellenanos.
