A few months ago, the media did what they do best: create click-bait headlines about a Facebook AI Research project that trains bots to negotiate. The story went viral when it was reported that Facebook had to shut the bots down because they had created their own language that humans could not understand.
Not long after, the media corrected the misinformation, but that article didn’t go nearly as viral because, let’s face it, the online generation has no interest in sharing boring truths. They just want sensational news.
I thought the misunderstanding would have been cleared up by now, but I recently learned that many people still believe the AI was getting too smart, Facebook panicked, and the project was “shut down in time” before the bots could take over the world.
The bots in Facebook’s research project were given a training data set for negotiating a barter trade of balls, books, and hats. This data set consists of 5,808 dialogues based on 2,236 unique scenarios. The whole project is well documented and publicly available, and you can download the code from GitHub to run it yourself. So, no, running it will not cause your computer to take over the world.
The bots use machine learning to learn from each other, optimizing their negotiation outcomes to receive as many of the items they want as possible. Each negotiation is scored, and the bots keep testing and refining the phrases that score best, gradually arriving at better negotiation strategies. For example, if a bot wants balls and books, the phrase “i would like 4 hats and you can have the rest” would likely not score very well.
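To make the scoring idea concrete, here is a minimal sketch of how a negotiated outcome might be valued. The function name, item labels, and point values are illustrative assumptions, not taken from the actual dataset; the real system learns these rewards as part of training.

```python
def score(values, allocation):
    """Score one agent's side of a deal.

    values:     how many points each item type is worth to this agent
    allocation: how many of each item type the agent received
    """
    return sum(values[item] * count for item, count in allocation.items())

# Hypothetical agent that values books highly and hats not at all.
values = {"ball": 1, "book": 3, "hat": 0}

# Deal A: the agent gets both books. Deal B: it gets two hats instead.
deal_a = {"ball": 0, "book": 2, "hat": 0}
deal_b = {"ball": 0, "book": 0, "hat": 2}

print(score(values, deal_a))  # 6
print(score(values, deal_b))  # 0
```

Under this kind of reward, an agent that keeps asking for hats while secretly valuing books would score poorly, which is exactly the pressure that pushes the bots toward whatever utterances win them the items they want.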
Now, the problem: the bots were not incentivized to use human-comprehensible English. As training went on, they drifted away from the English humans know into patterns that looked like this:
Bob: i can i i everything else . . . . . . . . . . . . . .
Alice: balls have zero to me to me to me to me to me to me to me to me to
Bob: you i everything else . . . . . . . . . . . . . .
Alice: balls have a ball to me to me to me to me to me to me to me
Bob: i i can i i i everything else . . . . . . . . . . . . . .
Alice: balls have a ball to me to me to me to me to me to me to me
Bob: i . . . . . . . . . . . . . . . . . . .
Alice: balls have zero to me to me to me to me to me to me to me to me to
Bob: you i i i i i everything else . . . . . . . . . . . . . .
Alice: balls have 0 to me to me to me to me to me to me to me to me to
Bob: you i i i everything else . . . . . . . . . . . . . .
Alice: balls have zero to me to me to me to me to me to me to me to me to
For the bots, results matter more than sticking to English, so if saying “the” five times communicates that it wants five copies of an item, that is what a bot will do, provided it is more efficient in the negotiation.
But the project was not a failure: it worked as intended and revealed interesting findings about AI. Not only did the bots abandon comprehensible English to optimize their results, they also turned out to be skilled negotiators, learning to feign interest in an item so they could “sacrifice” it later as a faux compromise.
So, when the bots began communicating in the private language quoted above, Facebook decided to shut the experiment down for the same reason any research project is terminated: it had stopped returning meaningful data.