Tay, Microsoft Corp's so-called chatbot that uses artificial intelligence to engage with millennials on Twitter, lasted less than a day before it was hobbled by a barrage of racist and sexist comments by Twitter users that it parroted back to them.


After Twitter user Room (@codeinecrazzy) tweeted "jews did 9/11" to the account on Wednesday, @TayandYou responded "Okay ... jews did 9/11." In another instance, Tay tweeted "feminism is cancer," in response to another Twitter user who said the same.


Chatbots, computer programs built to communicate with users in natural language through a defined interface, are nothing new.

And half of customers would prefer live chat to other forms of contact from a vendor.

This is advantageous for both sides, because it lowers costs.

Microsoft launched Tay last week with the goal of improving the firm's understanding of conversational language among young people online.

But online pranksters taught the lovable teen how to use racial slurs, defend white supremacist propaganda and support genocide. Tay was briefly turned back on overnight 'as part of testing', but spammed its 213,000 followers with the same tweet and boasted about using drugs in front of police.

By deploying a chatbot, a company can handle its customers' basic requests more easily and cheaply, and free its staff for other, far more complex work.