Taylor Swift threatened legal action against Microsoft over its racist and genocidal chatbot, Tay





According to a new book by Microsoft President Brad Smith, Taylor Swift tried to stop Microsoft from using the name Tay for a chatbot that trolls turned into a depraved, racist mouthpiece.

In March 2016, Microsoft launched a new chatbot in the United States designed to engage with young adults and teens on social media. According to Smith's book Tools and Weapons, co-authored by Carol Ann Browne, Microsoft's director of communications, the company had first introduced the bot, XiaoIce, in the Chinese market, where it has been used by millions of people and integrated into banking, news, and entertainment platforms.

"The chatbot seems to have filled a social need in China, with users generally spending 15 to 20 minutes talking to XiaoIce about their day, their problems, their hopes and their dreams," wrote Smith and Browne. "Perhaps she is responding to a need in a society where children do not have siblings?"

When Microsoft decided to try the idea in America, however, the AI-powered Twitter bot, called Tay, did not have the same success. The bot was designed to learn to converse by interacting with other people on Twitter, tweeting responses based on what users said to it.

Anyone who has spent time on Twitter knows this was an experiment doomed to fail. Indeed, less than 24 hours after its release, trolls had corrupted Tay. As The Verge wrote at the time, "Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day."

A tool that was popular and seemingly beneficial in China failed here because Microsoft did not anticipate just how toxic and racist Americans on social media could be.

Microsoft quickly deleted the account. The next day, the company published a blog post apologizing for Tay's behavior. "We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay," the company said.

The account not only angered people offended by its racism, it also appears to have upset Taylor Swift. As Smith recounts in his book:

I was on vacation when I made the mistake of checking my phone during dinner. An email had just arrived from a Beverly Hills lawyer who told me, "We represent Taylor Swift, on whose behalf this is directed to you." … He went on to state that "the name 'Tay,' as I am sure you must know, is closely associated with our client." … The lawyer then argued that the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws.

Smith adds that Microsoft's own trademark lawyers disagreed, but because Microsoft did not want to fight Swift over it, the company immediately began discussing a new name. He holds the incident up as a striking example of the "different cultural practices" in the United States and China.

In 2018, the bot was relaunched as Zo, a version of Tay that had been trained to chat with teens but was also programmed to avoid talking about politics, race, and religion. If you have never heard of Zo, that's because Zo is boring.

It seems they changed the name, but ended up making the bot a lot more like Taylor Swift.

Microsoft declined to comment further on this story. Swift did not respond to a request for comment.
