By using insights from one job to help it do another, a successful new artificial intelligence hints at a more versatile future for machine learning.

Backstory: Most algorithms can be trained in only one domain and can’t apply what they’ve learned on one task to a new one. A big hope for AI is to have systems take insights from one setting and apply them elsewhere, an ability known as transfer learning.

What’s new: DeepMind built a new AI system called IMPALA that simultaneously performs multiple tasks—in this case, playing 57 Atari games—and attempts to share learning between them. It showed signs of transferring what was learned from one game to another.
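To make “sharing learning between tasks” concrete, here is a minimal toy sketch of the underlying idea: a single network with one set of weights is updated with experience from several games, so an update driven by one game also shapes behavior on the others. This is an illustrative, assumption-laden fragment in PyTorch, not DeepMind’s code; the real IMPALA (Importance Weighted Actor-Learner Architecture) trains a shared network at scale with many parallel actors and an off-policy correction called V-trace, and every name, size, and the random training batch below are placeholders.

```python
import torch
import torch.nn as nn

N_ACTIONS = 18            # full Atari joystick action set
OBS_SHAPE = (4, 84, 84)   # four stacked 84x84 grayscale frames, a common Atari preprocessing

class SharedAgent(nn.Module):
    """One policy/value network whose weights are reused across every game."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(                      # shared visual features
            nn.Conv2d(4, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 9 * 9, 256), nn.ReLU(),
        )
        self.policy = nn.Linear(256, N_ACTIONS)            # action logits
        self.value = nn.Linear(256, 1)                     # state-value estimate

    def forward(self, obs):
        h = self.encoder(obs)
        return self.policy(h), self.value(h)

agent = SharedAgent()
optimizer = torch.optim.RMSprop(agent.parameters(), lr=6e-4)

# Toy update: a mixed batch of frames drawn from two different games adjusts
# the same shared weights, which is the mechanism that lets learning carry over.
# A real IMPALA learner computes V-trace targets from actor trajectories; the
# random tensors here are stand-ins just to make the example runnable.
frames = torch.randn(8, *OBS_SHAPE)            # e.g. 4 frames from game A, 4 from game B
actions = torch.randint(0, N_ACTIONS, (8,))
returns = torch.randn(8)

logits, values = agent(frames)
log_probs = torch.log_softmax(logits, dim=-1)
advantage = returns - values.squeeze(-1)
policy_loss = -(log_probs[torch.arange(8), actions] * advantage.detach()).mean()
value_loss = advantage.pow(2).mean()

optimizer.zero_grad()
(policy_loss + 0.5 * value_loss).backward()
optimizer.step()
```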

Why it matters: IMPALA was 10 times more data-efficient than a similar AI and achieved double the final score. That’s a promising sign that transfer learning is within reach. Plus, a system like this that learns using less processing power could help speed up the training of different types of AI.