Artificial Intelligence / Machine Learning

Facebook’s AI tourist finds its way around New York City by asking for help from another algorithm

AI algorithms can learn to navigate in the real world using language—and that might help make chatbots and voice assistants smarter.

Jul 12, 2018

If you get lost in New York without a smartphone or a map, you’ll most likely ask a local for directions. Facebook’s researchers are training AI programs to do the same thing, and they’re hoping this could eventually make them far better at using language.

The Facebook Artificial Intelligence Research (FAIR) group in New York created two AI programs: a “tourist” effectively lost in the Big Apple, and a “guide” designed to help its fellow algorithm find its way around by offering natural-language instructions. The lost tourist sees photos of the real world, while the “guide” sees a 2-D map with landmarks. Together they are tasked with reaching a specific destination.
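To make the setup concrete, here is a minimal, purely illustrative sketch of that tourist/guide loop. None of the class or method names below come from FAIR's actual code; they are hypothetical stand-ins for the idea that one agent describes photos while the other reads a landmark map and replies with directions.

```python
# Toy sketch of the tourist/guide interaction described above.
# All names here are illustrative placeholders, not the FAIR implementation.

class Tourist:
    """Sees street-level photos; describes its surroundings and moves."""
    def observe(self, photo):
        # In the real task this would be a learned encoder over 360-degree images.
        return f"I can see a {photo['landmark']} on my {photo['side']}."

    def act(self, instruction):
        # Map a natural-language instruction to a movement action.
        return "turn_left" if "left" in instruction else "go_forward"


class Guide:
    """Sees only a 2-D map with landmarks; issues natural-language directions."""
    def __init__(self, landmark_to_direction):
        self.landmark_to_direction = landmark_to_direction

    def instruct(self, tourist_message):
        for landmark, direction in self.landmark_to_direction.items():
            if landmark in tourist_message:
                return f"From the {landmark}, {direction}."
        return "Describe what you see around you."


# Toy episode: the guide's map says the destination is left of the hotel.
guide = Guide({"hotel": "take the street on your left"})
tourist = Tourist()

message = tourist.observe({"landmark": "hotel", "side": "right"})
instruction = guide.instruct(message)
action = tourist.act(instruction)
print(message, "->", instruction, "->", action)
```

In the actual research both agents are learned models trained together on this dialogue-and-navigation task; the sketch only shows the shape of the exchange.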

The idea is that by learning how instructions relate to real objects like a “restaurant” or a “hotel,” just as a baby learns by associating words with real objects and actions, the tourist algorithm will start to figure out what these things actually are—or at least how they fit into a simple street view of the world. AI researchers hope that algorithms taught this way will be more sophisticated in their use of language.


Language remains a huge challenge for artificial intelligence. It’s easy to build algorithms capable of answering simple commands or even holding a rudimentary conversation, but sustained, complex dialogue is still beyond today’s machines. This is partly because decoding ambiguity in language requires some common-sense knowledge of the real world. Giving an algorithm simple rules or training it on large amounts of text often results in absurd misunderstandings (see “AI’s language problem”).

“One strategy for eventually building AI with human-level language understanding is to train those systems in a more natural way, by tying language to specific environments,” the researchers write in a related blog post. “Just as babies first learn to name what they can see and touch, this approach—sometimes referred to as ‘embodied AI’—favors learning in the context of a system’s surroundings, rather than training through large data sets of text.”

The Facebook research is an attempt to give AI algorithms some common sense by grounding their understanding of language in a simplified representation of the real world.

The idea of “embodied AI” has been around for some time, but most efforts to date have relied on simulated environments rather than actual images. Greater realism makes things more challenging, but it will be crucial if AI algorithms are to become more useful (see “Facebook helped create an AI scavenger hunt”).

The researchers used a 360° camera to capture New York City neighborhoods including Hell’s Kitchen, the Financial District, the Upper East Side, and Williamsburg.

They also ran experiments in which the two algorithms were allowed to develop their own communication protocol, or synthetic language, instead of using natural language. Interestingly, the researchers found that the agents navigated best when they were allowed to communicate this way.
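The idea of a learned protocol can be illustrated with a small sketch: instead of English sentences, the tourist emits discrete symbols from a small vocabulary and the guide learns what each symbol means. The mapping below is a toy stand-in, not FAIR's trained model, and the vocabulary size and function names are assumptions made for illustration.

```python
# Illustrative sketch of agents exchanging learned discrete symbols
# instead of English instructions. Not FAIR's code.
import random

VOCAB_SIZE = 8  # small invented symbol vocabulary


def tourist_encode(observation_id):
    # A trained tourist would learn this mapping; here it is a fixed toy rule.
    return observation_id % VOCAB_SIZE


def guide_decode(symbol, symbol_to_action):
    # A trained guide would learn to turn symbols into directions.
    return symbol_to_action.get(symbol, "ask_again")


# Toy "protocol" the agents might converge on during training.
symbol_to_action = {
    i: random.choice(["go_forward", "turn_left", "turn_right"])
    for i in range(VOCAB_SIZE)
}

for obs in [3, 11, 5]:
    sym = tourist_encode(obs)
    print(f"observation {obs} -> symbol {sym} -> {guide_decode(sym, symbol_to_action)}")
```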

The Facebook researchers are releasing the code behind their project, called Talk the Walk, in hopes that other AI scientists will use it to further research on embodied AI and language algorithms.