In 2016 in the United States, a Model S P100D traveling with Autopilot, Tesla’s driver-assistance system, crashed into a truck at 120 km/h, killing its driver. The NTSB (National Transportation Safety Board) investigation concluded that the driver had not been vigilant, having failed to react to the car’s numerous warnings to retake control. But the US agency also found that Autopilot had been activated on a road for which it was not designed (the system is intended for limited-access roads such as highways, with few intersections and no stop signs) and that the driver had been able to let the system drive alone for long periods without keeping his hands on the wheel.
In 2017, the NTSB published a series of recommendations to limit the risk of such accidents. These include adopting a system that restricts the use of autonomous driving to the roads for which it was designed, as well as a means of more effectively monitoring the attention level of the driver, who must always be able to regain control in case of a problem. For its part, the American manufacturer has so far limited itself to saying that it is the driver’s responsibility to know when and how Autopilot may be activated.
Elon Musk taken to task
In a letter to Elon Musk, NTSB Chair Jennifer Homendy said she was very concerned about Tesla’s inaction in following these recommendations, especially since the same advice was also sent to five other manufacturers (not named in the letter), all of which reportedly responded to the NTSB detailing the measures they intended to implement to limit the risks. This formal request from Ms. Homendy comes as Tesla is testing highly advanced “Full Self-Driving” features with a panel of drivers, before making them available to anyone who has purchased the option. Still in beta, the program has drawn heavy criticism because it allows people with no special training to try autonomous driving on open roads, with all the dangers that implies for themselves and for others. The name is also misleading: the system is not fully autonomous, since the driver must always be ready to take back control from the electronics.
NEW: A letter sent today by @NTSB Chair Jennifer Homendy to Elon Musk notes Tesla’s lack of response to NTSB safety recommendations on 1) driver monitoring systems and 2) limiting Autopilot use to its intended road environment. pic.twitter.com/6NmIXNeDoN
– David Zipper (@DavidZipper) October 25, 2021
Last September, Reuters reported that the NHTSA (National Highway Traffic Safety Administration) had also asked Tesla to answer specific questions about Autopilot following another crash, the latest in a series of 12 involving a Tesla operating in semi-autonomous mode and an emergency vehicle (fire truck, police car, etc.).
Who is to blame?
These various accidents raise the question of the manufacturer’s responsibility and moral role. Should it limit at all costs the situations in which its system can be used or, on the contrary, push development to the maximum and let drivers take responsibility for how they use it? Many NTSB reports question the driver’s attentiveness, but few survivors can testify. In one of the most recent accidents, this summer (a Model S collided with a parked car, killing a 22-year-old student), the driver, who survived, said he had dropped his phone and bent over to retrieve it. Should we blame Tesla for allowing Autopilot to be activated on a road not intended for its use, or the driver for using his phone while driving, given that the laws of Florida, where the tragedy occurred, allow making phone calls while driving with the device in hand?
While waiting for an answer that would set a precedent for all semi-autonomous driving systems, it is worth remembering that these functions should be considered driving aids, not substitute drivers. According to a recent MIT study, the more a vehicle automates, the less attention the person behind the wheel pays to the road. A clear legal framework and safety measures are therefore needed today to avoid as many incidents as possible in the future.