Analysis of the recent self-driving Tesla crash

Oct. 10, 2016
A debate is in progress concerning where the legal responsibilities of the manufacturers end and the responsibilities of the 'brain' of the car begin.

This article discusses the state of the art of controlling self-driving cars, including the control loops and their limitations as they exist today. Part 2 will suggest improvements to their sensors and operating algorithms, using the experience and knowledge we’ve accumulated in the process industries.

The evolution of industrial process control had three distinct periods:

  • Phase 1—Manual: In the first decades of the industrial age, the operator (assisted by some sensors) manually controlled the process. During this period, most of the avoidable accidents were caused by operator error.
  • Phase 2—Semi-Automatic: Starting around World War II, automatic PID control loops were introduced, but the operators were still in charge—they were still free to change setpoints or switch the loop to manual control. Therefore, control quality and safety were improved, but operator errors could still cause accidents.
  • Phase 3—Full-Automatic: During the past couple of decades, multivariable envelope controls, self-diagnosing smart sensors, voting systems and the use of Override Safety Control (OSC) eliminated the accidents that in the past were caused by operator errors (a simplified sketch of such a loop follows this list).
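
As a concrete illustration, the sketch below combines a PID loop whose setpoint the operator can change or switch to manual (Phase 2) with an override selector that lets a safety controller take over when a constraint is approached (Phase 3). It is a minimal sketch only; the variable names, gains and limits are illustrative assumptions, not taken from any particular plant.

```python
# Minimal sketch: a PID loop with an operator auto/manual switch (Phase 2)
# and override safety control via a low selector (Phase 3).
# All names, gains and limits below are illustrative assumptions.

class PID:
    """Textbook positional PID controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def loop_scan(normal_pid, override_pid, setpoint, measurement,
              safety_setpoint, safety_measurement,
              auto_mode=True, manual_output=0.0):
    """One scan of the loop; returns the signal sent to the valve (0-100%)."""
    if not auto_mode:
        # Phase 2: the operator has switched to manual and drives the valve.
        return manual_output

    normal_out = normal_pid.update(setpoint, measurement)
    # Phase 3: a second controller watches the safety (constraint) variable,
    # and a low selector passes whichever output is safer.
    override_out = override_pid.update(safety_setpoint, safety_measurement)
    return max(0.0, min(100.0, min(normal_out, override_out)))
```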

In the transportation industry, the use of automation evolved more slowly. As of today, trains still run mostly under manual control (Phase 1); airplanes are usually controlled in semi-automatic mode (Phase 2), as autopilots are available but can be overruled by the pilot; and it is only in space travel and military applications that fully automatic control is used (Phase 3).

The use of automatic process control in automobiles and trucks started later than in other industries, and picked up speed only in the past decade. The goals of this technology are to save lives and reduce costs. It is also expected to bring mobility to millions who today can’t afford to own cars. This development also had three phases:

Phase 1—Manual: Until the 21st century, most cars were controlled completely by the driver, who was assisted by only a few sensors (mirrors, speedometer, GPS). In 2015, about 250 million cars were on U.S. roads (1 billion worldwide), and in the U.S. alone, 35,200 people died in car accidents—about one per 70 million miles of driving.

Phase 2—Semi-Automatic (autopilot): These control systems were developed during the past decade and, as yet, are on only about 70,000 (Tesla) cars on the road. Their control loops have only a few sensors (radar, camera, GPS, self-diagnostics), and they basically have only two controlled variables: speed and direction. They can park and maintain lane position, legal speed and distance to other cars, but they operate in a “hands on the wheel” mode, so the driver can take over control at any time. To date, one fatal accident has occurred in about 140 million miles of driving in these vehicles.
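
To illustrate what such a two-variable loop might look like, here is a hedged sketch in which the speed setpoint is the lower of the legal limit and a gap-keeping target, and steering simply corrects the measured lane offset. The sensor inputs, gains and control law are assumptions made for illustration; they are not Tesla’s actual algorithm.

```python
# Hedged sketch of a Phase 2 (semi-automatic) loop with two controlled
# variables, speed and direction. Inputs, gains and the control law are
# illustrative assumptions, not Tesla's actual algorithm.

def autopilot_scan(speed_mps, speed_limit_mps, gap_m, desired_gap_m,
                   lane_offset_m, driver_override=False):
    """Return (acceleration_cmd, steering_cmd) for one scan, or None if the
    driver has taken over ("hands on the wheel" mode)."""
    if driver_override:
        return None                                   # driver is in control

    # Speed: track the lower of the legal limit and a gap-keeping target.
    gap_error = gap_m - desired_gap_m                 # > 0 means we can close in
    gap_target = speed_mps + 0.5 * gap_error          # simple proportional rule
    target_speed = min(speed_limit_mps, gap_target)
    acceleration_cmd = 0.3 * (target_speed - speed_mps)

    # Direction: proportional correction toward the lane center.
    steering_cmd = -0.1 * lane_offset_m
    return acceleration_cmd, steering_cmd
```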

Phase 3—Automatic (driverless, autonomous): These control systems are mostly under development (Figure 1), although those designed by Google have already logged 1.5 million miles in California; the Chevrolet Bolt is “running around” in San Francisco; and Ford, Apple and others are planning to do the same soon. Testing of these cars is done largely by the manufacturers, which should change. I would prefer that they be government-tested in standardized testing centers that include simulated streets, cars, traffic lights, road signs, tunnels, and robotic moving and stationary objects, including people and animals.

The first markets for autonomous robotic cars are expected to be business applications with commercial fleets, such as ride-hailing, ride-sharing (Uber, Lyft) and delivery. Besides increasing safety, these fleets would reduce the cost of operation both through full-time usage and by eliminating drivers’ salaries. To serve these markets, Fiat Chrysler is working with Google, General Motors has partnered with Lyft, Volvo has teamed with Uber, and so on.

The Tesla accident

The fatal Tesla accident on May 7 in Williston, Fla., shows the roles played by the three loop components (sensors, control algorithms and final control elements), the contributions they made to the accident, and how these early design errors can be corrected. I should also mention that an unresolved debate is in progress concerning where the legal responsibilities of the car manufacturers end (the hardware, such as sensors and final control elements like the brakes) and where the responsibilities of the “brain” of the car begin (the control software). I believe the outcome of the legal process concerning this accident will represent a big step toward resolving this debate.

The crash in Florida occurred on a divided highway when a white tractor-trailer in the westbound lane made a left turn, crossing in front of the oncoming traffic in the eastbound lane, in which the Tesla was traveling. It is unresolved whether neither the driver nor the autopilot attempted to brake, or whether the brake actuators themselves were defective. All we know is that the Tesla did not brake, and that it would not have braked whether the autopilot was engaged or not.

The failure to “see” the white tractor-trailer (Figure 2) was probably because neither the autopilot’s camera nor the driver could make it out against the brightly lit sky. The radar probably would have detected it if it was properly focused, but because the object appeared to be stationary and above the road, the control algorithm could have assumed it was not a vehicle but an overpass or a road sign. As a result, the Tesla passed under the center of the trailer, was hit at windshield height, and came to rest at the side of the road after hitting a fence and a power pole (Figure 3).
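
The sketch below illustrates the kind of filtering logic described above, in which a radar return that looks stationary and elevated is dismissed as an overhead structure unless the camera confirms a vehicle. The class, field names and thresholds are assumptions for the sake of illustration, not Tesla’s code.

```python
# Illustrative sketch of a "discard overhead objects" rule; the class,
# field names and thresholds are assumptions, not Tesla's implementation.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the object
    height_m: float       # estimated height above the road surface
    ground_speed: float   # object's own speed over the ground, m/s

OVERHEAD_HEIGHT_M = 2.5   # assumed clearance threshold for overpasses/signs

def is_braking_target(r: RadarReturn, camera_confirms_vehicle: bool) -> bool:
    # A return that is essentially stationary and well above the road looks
    # like an overpass or road sign, so this naive filter ignores it unless
    # the camera also reports a vehicle there.
    if r.ground_speed < 0.5 and r.height_m > OVERHEAD_HEIGHT_M:
        return camera_confirms_vehicle
    return True

# In the scenario described above, the side of the trailer could present as a
# stationary, elevated return while the camera saw only bright sky, so this
# rule would return False and no braking would be commanded.
```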

As we went to press, Tesla announced that it is switching to metal-sensitive radar as the primary scanner, supplementing it only with the camera, and forcing drivers to keep their hands on the steering wheel. These software changes are being transmitted wirelessly to the Tesla fleet because no hardware changes are involved.
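
A minimal sketch of this reported change in priorities might look like the following: the radar becomes the primary basis for braking, the camera only supplements it, and repeated hands-off-the-wheel warnings take the autopilot out of service. The function, thresholds and return values are illustrative assumptions, not the content of the actual over-the-air update.

```python
# Illustrative sketch only: the decision logic, thresholds and return values
# are assumptions, not the actual contents of Tesla's software update.

def autopilot_decision(radar_sees_obstacle: bool, camera_sees_obstacle: bool,
                       hands_on_wheel: bool, ignored_warnings: int) -> str:
    if not hands_on_wheel and ignored_warnings >= 3:
        return "disengage"          # force the driver to take back control
    if radar_sees_obstacle:         # radar is now the primary scanner
        return "brake"
    if camera_sees_obstacle:        # the camera only supplements the radar
        return "warn_driver"
    return "continue"
```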

About the Author

Béla Lipták | Columnist and Control Consultant

Béla Lipták is an automation and safety consultant and editor of the Instrument and Automation Engineers’ Handbook (IAEH).