The chairman of the National Transportation Safety Board said Tuesday that "system safeguards were lacking" in the Tesla Model S whose driver was killed when the car struck a truck in Florida in May 2016.

According to Reuters, the new statement from the NTSB suggests that Tesla not only fielded a car with limited autonomous capabilities—something the company has previously acknowledged—but one that was also incapable of ensuring its human driver was paying attention to the road when its Autopilot system was engaged.

“Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention,” Robert Sumwalt, the agency's chairman, said.

The news comes less than a month after the Wall Street Journal published a detailed account of the Autopilot program at Tesla, including accounts of engineers who voiced deep concerns about the system's safety. The system was only meant to provide partial autonomy requiring continuous driver attention, but the company, and CEO Elon Musk in particular, publicly suggested that Autopilot was capable of fully autonomous driving.

At the time of the crash, Tesla vehicles were outfitted to detect whether a driver's hands were on the wheel and to emit warning sounds if they were removed for more than a few seconds. In its statement, the NTSB said that such measures were insufficient for monitoring whether a driver was paying attention to the road.

Tesla has since tightened how its cars monitor driver engagement and upgraded the sensor packages in new cars. But the NTSB's statement is a necessary reminder that we still have a long way to go in solving the human-vehicle interaction problem that lies at the heart of designing semi-autonomous cars.