Facebook is experimenting with a bot that knows how to bargain and lie





Most of the digital assistants offered by companies such as Apple, Amazon and Google are simple. They are barely able to hold a conversation with their owner. Yes, a bot can answer the question “How are you?”, but it gets confused and replies at random when the questions get more complicated. Such systems can be called assistants only with a stretch; they are more of a marketing ploy, a toy for geeks.



But if bots could hold a more productive conversation and understand their owners better, they would be far more useful. Facebook is now working to improve its bots and make them more capable. The social network has been working on bots for a couple of years, but only now have its developers come up with something really interesting: a chatbot called assistant M, created by employees of the FAIR division.



So far, the work has been limited to the art of negotiation: the bot is learning how to convince its bot counterpart of something. In the future, new features will be added to the current system to see what comes of it. The bot has even learned to tell lies or exaggerate in order to pull the interlocutor over to its side.



The central part of the whole system is a neural network trained to conduct dialogues of a certain type. It is a self-learning network that remembers all previous dialogues and draws the necessary experience from them. The bot's work is tested by letting volunteers talk to it; after each dialogue, the volunteers are asked to rate it with a score.
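
As a rough illustration of that feedback loop, here is a minimal Python sketch of how finished dialogues and the volunteers' scores could be stored as experience to learn from later. The class and field names are hypothetical, not Facebook's code.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ScoredDialogue:
    """One recorded conversation plus the score a volunteer gave it."""
    turns: List[str]   # alternating utterances, e.g. ["bot: ...", "human: ..."]
    score: float       # volunteer rating, e.g. 1 (poor) .. 5 (convincing)

class ExperienceBuffer:
    """Remembers past dialogues so the model can learn from them later."""
    def __init__(self) -> None:
        self._memory: List[ScoredDialogue] = []

    def add(self, dialogue: ScoredDialogue) -> None:
        self._memory.append(dialogue)

    def best(self, threshold: float = 4.0) -> List[ScoredDialogue]:
        """Return the highly rated dialogues to emphasise during training."""
        return [d for d in self._memory if d.score >= threshold]

# Usage: record a finished conversation together with its rating.
buffer = ExperienceBuffer()
buffer.add(ScoredDialogue(turns=["bot: I'll take the books.", "human: Fine, deal."], score=4.5))
print(len(buffer.best()))  # -> 1
```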



The developers describe their main task as training the bot to predict how a conversation may develop. “Understanding” what can be discussed, the bot anticipates the interlocutor's next phrase and tries to convince them that it is right (the topics of the dialogues vary). There is nothing surprising in a machine foreseeing how events may unfold: the technology for this already exists, and computer systems that play chess or Go do much the same thing.
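
The chess analogy hints at how such lookahead can work: before answering, the bot can simulate several ways the rest of the conversation might go after each candidate phrase and pick the phrase with the best expected outcome. The sketch below only illustrates that idea; `simulate_continuation` and `score_outcome` are hypothetical stand-ins for the model's own rollout and scoring machinery.

```python
import random
from typing import Callable, List

def choose_reply(
    history: List[str],
    candidates: List[str],
    simulate_continuation: Callable[[List[str]], List[str]],
    score_outcome: Callable[[List[str]], float],
    n_rollouts: int = 5,
) -> str:
    """Pick the candidate reply whose simulated continuations score best on average."""
    best_reply, best_value = candidates[0], float("-inf")
    for reply in candidates:
        values = []
        for _ in range(n_rollouts):
            # Imagine how the conversation could continue after saying `reply`...
            future = simulate_continuation(history + [reply])
            # ...and estimate how good that final outcome would be for the bot.
            values.append(score_outcome(future))
        expected = sum(values) / len(values)
        if expected > best_value:
            best_reply, best_value = reply, expected
    return best_reply

# Toy usage with random stand-ins for the simulator and the scorer.
reply = choose_reply(
    history=["human: I want both hats."],
    candidates=["bot: Take one hat, I keep the ball.", "bot: Fine, take everything."],
    simulate_continuation=lambda h: h + ["human: Deal."],
    score_outcome=lambda h: random.random(),
)
print(reply)
```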



Facebook's developers have achieved significant success in this work. At any rate, the volunteers who communicate with the system do not always realize that they are talking to a machine (some of them converse with people, others with a bot). Interestingly, the bot has learned various negotiating strategies: for example, in some cases it presses the interlocutor throughout the dialogue so that, by compromising at the end, it can still convince the opponent of something (in particular, of the need to buy something). Bots built on the basis of assistant M communicate not only with people but also with each other.



Facebook has trained its bots with the help of the crowdsourcing platform Mechanical Turk. But, according to some experts, this approach gives software systems only basic communication skills; a bot cannot acquire more advanced abilities this way. For simple dialogues that is enough, but at a more complex level (say, a discussion of even the simplest political issues) it immediately becomes clear that the person is talking to a program.



Interestingly, the neural network was also trained by imitating a conversation between two human interlocutors. Two volunteers were asked to discuss how they would divide some valuable items if those fell into their hands. These dialogues were recorded and then “fed” to the neural network. The bot was then taught to steer a dialogue in its own favor, and not without success.
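
In machine-learning terms this is supervised imitation: the network is trained to predict each next word in the recorded human dialogues. The toy PyTorch sketch below shows that kind of next-word training on a two-line stand-in corpus; the data and model size are invented for illustration and are not FAIR's.

```python
import torch
import torch.nn as nn

# Toy corpus standing in for the recorded human negotiation dialogues.
dialogues = [
    "i will take the book you get the hats",
    "give me both balls and you can keep the book",
]
vocab = sorted({w for d in dialogues for w in d.split()})
stoi = {w: i for i, w in enumerate(vocab)}

class DialogueLM(nn.Module):
    """Tiny next-word model that imitates the style of the recorded dialogues."""
    def __init__(self, vocab_size: int, dim: int = 32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        h, _ = self.rnn(self.embed(tokens))
        return self.out(h)

model = DialogueLM(len(vocab))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Imitation: for every prefix of a human dialogue, predict the following word.
for epoch in range(50):
    for d in dialogues:
        ids = torch.tensor([[stoi[w] for w in d.split()]])
        logits = model(ids[:, :-1])
        loss = loss_fn(logits.reshape(-1, len(vocab)), ids[:, 1:].reshape(-1))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```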



Interestingly, FAIR has two bot models. The first is a self-learning, dynamic model; the second is a static model trained by the developers themselves, which cannot change the parameters of its own operation. Bots running on the first model were allowed to communicate only with bots built on the second model. The reason is that when both sides could change their own parameters, they gradually altered the English language during conversation and created a kind of dialect of their own. So difficulties remain, and Facebook's researchers are actively working on them.
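
A toy way to picture this asymmetric setup: only one agent updates itself from the outcome of each conversation, while its partner stays frozen, so the shared language remains anchored to what the frozen model already speaks. The sketch below uses a tiny bandit-style learner with made-up phrases and rewards; it illustrates the one-sided training idea, not FAIR's actual reinforcement-learning method.

```python
import random
from collections import defaultdict
from typing import Dict, List

PHRASES = ["i take the ball", "you keep the hat", "split it evenly", "deal"]

class StaticAgent:
    """Frozen model: its behavior never changes during self-play."""
    def reply(self, history: List[str]) -> str:
        return random.choice(PHRASES)

class LearningAgent:
    """Dynamic model: keeps a running value estimate per phrase (a tiny bandit)."""
    def __init__(self) -> None:
        self.value: Dict[str, float] = defaultdict(float)
        self.counts: Dict[str, int] = defaultdict(int)

    def reply(self, history: List[str]) -> str:
        if random.random() < 0.2:                          # explore occasionally
            return random.choice(PHRASES)
        return max(PHRASES, key=lambda p: self.value[p])   # otherwise exploit

    def update(self, own_phrases: List[str], reward: float) -> None:
        for p in own_phrases:                              # running average of reward
            self.counts[p] += 1
            self.value[p] += (reward - self.value[p]) / self.counts[p]

learner, partner = LearningAgent(), StaticAgent()

for episode in range(200):
    history, own = [], []
    for turn in range(6):                                  # alternate turns until a deal
        speaker = learner if turn % 2 == 0 else partner
        phrase = speaker.reply(history)
        history.append(phrase)
        if speaker is learner:
            own.append(phrase)
        if phrase == "deal":
            break
    reward = 1.0 if "deal" in history else 0.0             # stand-in for the negotiated outcome
    learner.update(own, reward)                            # only the dynamic agent learns;
    # the static partner keeps speaking the same English it was trained on.
```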



Facebook's employees plan to use the results of this work to create a digital assistant far more advanced than today's.


