On April 19th, another “self-driving” Tesla killed someone. Carl Hunter of Snohomish, Washington, had engaged Autopilot in his Tesla Model S; the car failed to see the motorcyclist ahead of it and, in the resulting collision, which saw the car come to rest atop the motorcycle, the 28-year-old rider, Jeffrey Nissen, succumbed to his injuries. According to numerous reports, Hunter admitted he had activated “self-driving” and was busy looking at his cellphone, evidently expecting that his attention was no longer needed to monitor the car.

To those of us familiar with Advanced Driver-Assistance Systems (ADAS), the official name for semi-autonomous driving systems, none of this comes as a surprise. In another life, I test cars, and the one thing that is an absolute certainty is that there are no truly go-anywhere, anytime, fully autonomous vehicles on the market. And yes, all you Elon Musk devotees, that includes “Full Self-Driving” Teslas.

More frightening is that this is more than just a Tesla issue. Yes, Elon Musk is alone among major automakers in implying that his cars can literally drive themselves. And Tesla is seemingly the only company whose owners are so devoted that they are willing to put this false premise to the test. But the fact remains that almost all automakers offer some form of semi-autonomous driving. And, according to the Insurance Institute for Highway Safety, most of those systems fail to see the two-wheelers in front of them.

The week after Mr. Nissen died, the IIHS released the results of a study in which 10 compact sport utilities, each with its maker’s semi-autonomous ADAS engaged, were tested as they approached a stationary motorcycle. The tests were performed at three closing speeds (50, 60 and 70 kilometres per hour) and two criteria were evaluated: did the forward collision warning system alert the driver that there was a motorcycle ahead; and, if the driver failed to heed the warning, did the Automatic Emergency Braking (AEB) system stop the car before it struck the motorcycle?

Only one of the sport ‘utes, a Subaru Forester with the company’s camera-based EyeSight system, garnered the IIHS’s “good” rating. It warned the driver of the stopped motorcycle in all cases, automatically brought the car to a complete stop in the 50 and 60 km/h tests, and reduced the speed of impact by some 50 km/h in the 70 km/h test. At the other end of the spectrum, Chevrolet’s Equinox and the Volkswagen Taos were both rated “poor”: they barely slowed at all before plowing into the motorcycle “target.” In between were vehicles rated “acceptable,” like the Honda CR-V, which stopped in the 50 and 60 km/h tests but “failed to slow consistently in the 70 km/h trials,” and “marginal,” like Ford’s Escape, which stopped before striking the target at 50 km/h but slowed “only modestly” at higher speeds. It’s probably also worth noting that, in two other cases Mojo has looked into of Teslas “self-driving” into motorcycles, the bikes were struck from the rear, just like the failures in the IIHS’s testing.

The problem is twofold. The first issue is technology. Pretty much everyone in the automotive business, save error-prone Tesla, seems to agree that, for cars to be truly self-driving, they will need Lidar (light detection and ranging) sensors. Current vehicles, Teslas and all the SUVs tested by the IIHS alike, use more rudimentary camera and radar sensors to detect objects in front of them. Sometimes those sensors see the motorcycle ahead; sometimes they don’t. This is why they’re called “driver-assistance” systems, and why the rules of the road insist the human behind the wheel is ultimately responsible for where the car goes and, unfortunately, for whom it might hit.

The public’s understanding of what actually constitutes self-driving is perhaps the bigger issue. Officially, the American National Highway Traffic Safety Administration (NHTSA) recognizes six levels of vehicular automation. Level 0 is no automation whatsoever, while Level 1 offers assistance with either acceleration and braking or steering. Level 2 is the same, except that “or” becomes an “and.” Level 3 is called “Conditional Automation”: the car handles all aspects of driving, but a human driver must be ready to take over whenever the system can no longer operate safely. Levels 4 and 5 are the only systems that can do without a driver entirely, the main difference being that Level 4 is restricted to “limited service areas” while Level 5 is good for “all conditions and all roadways.”

For the record, there are no Level 5 vehicles currently being tested, Level 4 (High Automation) being the pinnacle of autonomous driving for the foreseeable future (our snow-covered roads are a real limitation to “Full Automation”). More importantly, none of the major automakers offers a Level 3 machine. All of them, Tesla’s supposed Full Self-Driving system as well as the sport ‘utes tested by the IIHS, require that the driver, to quote Tesla’s own Autopilot website, “keep your hands on the steering wheel at all times and maintain control of your vehicle.”

The problem, of course, is that drivers have come to rely on this (semi) automation, and our reliance on such safety systems promotes the very distraction they are supposed to relieve. Until these driver-assistance systems can reliably recognize the motorcycles ahead of them, it would seem we remain vulnerable to both man and machine.