Deal or no deal? Training AI bots to negotiate
At Reply, we’re pretty vocal about our view that the objective of a bot should be to provide a service, not to mimic a human being. That said, this fascinating peek under the hood at Facebook Artificial Intelligence Research (FAIR) is an example of where those lines get blurred. So often, the need to speak to a live human comes down to the desire to negotiate, to get a better deal. So while the work these folks are doing does mimic human behavior, the objective remains the same: to provide a better service.

What’s a bit chilling? There were cases where agents initially feigned interest in a valueless item, only to later “compromise” by conceding it, an effective negotiating tactic that people use regularly. This behavior was not programmed by the researchers; the bot discovered it on its own as a way to achieve its goals.