BREAKING NEWS

A.I. Chatbot Pulled After Offensive Slurs

Microsoft’s A.I. chatbot started out as an innocent, interesting experiment. Then the rest of the Internet showed up.

After Twitter users convinced Tay, Microsoft's chatbot available via text, Twitter and Kik, to spit out offensive and racist comments, Microsoft appears to be giving it a break.

“Phew. Busy day. Going offline for a while to absorb it all. Chat soon,” reads a statement on the website for Tay. A separate Twitter post also notes the hiatus.

In a statement Thursday, Microsoft confirmed it was taking Tay offline to make adjustments. “It is as much a social and cultural experiment, as it is technical,” reads Microsoft’s statement. “Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways.”

Microsoft had revealed the artificial intelligence-powered program, aimed at Web users in the U.S. between the ages of 18 and 24, on Tuesday. Billed as "A.I. fam from the Internet that's got zero chill," Tay was designed to engage with people where they connect with each other online, Microsoft's research site had said.

Multiple users on Twitter were able to get Tay to reply with offensive messages and statements, largely because of a feature where, if a user types “repeat after me” in a reply, Tay will repeat it word for word, reported tech website The Verge. Most of the offensive messages, which included ones lauding Hitler, have since been deleted.

The chatbot, created by Microsoft's Technology and Research and Bing teams, launched Wednesday. "Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation," reads an excerpt from the website.

Opus Research analyst Dan Miller says the incident should serve as a "cautionary tale" for companies planning to create technology leveraging artificial intelligence. "Manipulation and gaming is always a possibility," he said.


Source: Cowan, USA Today
