Taylor Swift threatened to sue Microsoft over its racist Tay chatbot

Taylor Swift’s lawyers threatened to sue Microsoft over the company’s Tay chatbot. The Guardian reports that a new book by Microsoft president Brad Smith reveals that lawyers for Taylor Swift weren’t pleased with the company using the name Tay for its chatbot. Microsoft’s chatbot was originally designed to hold conversations with young people over social media networks, but Twitter users turned it into a racist chatbot in less than a day.

Smith checked his emails during a vacation and found that Taylor Swift’s team was demanding a name change for the Tay chatbot. “An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you.’” The lawyers argued that “the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws,” says Smith in Tools and Weapons, a new book about how technology is both empowering us and threatening us.

It’s not clear exactly when Taylor Swift’s lawyers contacted Microsoft about the Tay name, but they probably weren’t happy about the kinds of misogynistic, racist, and Hitler-promoting junk it was publishing to Twitter. Microsoft quickly apologized for the offensive material posted by its AI bot and pulled the plug on Tay after less than 24 hours.
