Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in "conversational understanding." The more you chat with Tay, said Microsoft, the smarter it gets, learning to engage people through "casual and playful conversation." Unfortunately, the conversations didn't stay playful for long.
Pretty soon after Tay launched, people started tweeting the bot with all sorts of misogynistic, racist, and Donald Trumpist remarks.
If you tell Tay to "repeat after me," it will — allowing anybody to put words in the chatbot's mouth.
However, some of its weirder utterances have come out unprompted.
Searching through Tay's tweets (more than 96,000 of them!), we can see that many of the bot's nastiest utterances have simply been the result of copying users.