Cars these days are getting “smarter” than ever. New safety features allow a car to apply its own brakes if the driver fails to respond to warnings of an impending crash. Many vehicles can also automate part of the driving experience, with features like “adaptive cruise control,” which keeps the vehicle a consistent distance behind the car in front of it.
While new car features are convenient and are intended to make driving safer, they sometimes have the opposite effect because drivers do not understand their proper use and limits. When automakers fail to instruct car buyers on proper use or to warn against misuse, they can sometimes be held liable for the auto accidents that result.
The Insurance Institute for Highway Safety recently released a study examining the behaviors and attitudes of drivers who had one of three partial-automation systems in their vehicles: Cadillac’s Super Cruise, Tesla’s Autopilot and Nissan/Infiniti’s ProPILOT Assist. These systems are called “partial automation” for good reason: They are designed only to assist drivers and cannot be considered fully self-driving.
Unfortunately, many drivers who have these systems in their vehicles admit they are more likely to text, eat and engage in other distractions while the systems are in use. A shocking percentage of respondents even said they were comfortable treating their vehicles as fully self-driving when the systems were engaged.
Skeptics might be quick to say that drivers who behave this way have only themselves to blame if they get into a crash. However, the study showed clear differences in attitudes and behaviors among users of the three systems. Cadillac and Tesla drivers were much more likely than Nissan/Infiniti drivers to assume their cars were fully autonomous. This suggests that how these vehicles are marketed, and whether they carry adequate warnings, may be influencing consumers to make dangerous choices.
There is a principle in product liability law known as “foreseeable misuse”: Manufacturers must anticipate and warn against the reasonably predictable ways consumers might misuse a product. In the case of partial-automation systems, it is easy to predict that drivers would want to mentally “check out” and let the vehicle drive itself as much as possible.
Automakers may also be liable if they mislead consumers about what their products can and cannot do. Certainly, Tesla’s decision to name its system “Autopilot” seems more than a little misleading, and the company has faced numerous high-profile lawsuits over Autopilot-involved crashes.
Until and unless cars become fully self-driving, automakers must make good-faith efforts to teach drivers how partial-automation systems can and cannot be used. When they don’t take this responsibility seriously, people can get hurt or killed.