
The Tay Experiment: Does AI Require a Moral Compass?

In an age of rapid technological development and innovation, experimentation with artificial intelligence (AI) has become a much-explored realm for corporations like Microsoft. In March 2016, the company launched an AI chatbot on Twitter named Tay, which tweeted as TayTweets under the handle @TayandYou. Her Twitter description read: “The official account of Tay, Microsoft’s A.I. fam from the Internet that’s got zero chill! The more you talk the smarter Tay gets.” Tay was designed as an experiment in “conversational understanding”: the more people communicated with Tay, the smarter she was meant to get, learning to engage Twitter users through “casual and playful conversation.”
