Experts Believe Autonomous Cars Should Meet Specific Standards After Fatal Accidents
Last month was a difficult one for autonomous vehicles. Multiple incidents occurred, including a fatal accident in which a driverless Uber struck a pedestrian, a Tesla Model X crashing into a barrier, and one of General Motors' autonomous Bolts receiving a ticket for not giving a pedestrian enough space to cross. These incidents reveal that driverless vehicles still have numerous faults that need to be worked out. And for experts, last month's woes show that automakers need to step up their game when it comes to autonomous cars.
It's Time For Driverless Cars To Have Standards
In a lengthy piece, Reuters outlined what needs to change in the autonomous scene after last month's incidents. According to the outlet, safety and tech experts believe that all driverless cars should meet basic standards on how well they can detect a potential hazard on the road. In addition to that, automakers and technology companies need to find a better way for the human operators who sit behind the wheel of an autonomous vehicle to take control of the car in an emergency.
Some states have started to allow automakers and technology companies to test fully-autonomous vehicles on public roads. Some companies have jumped at the idea, grabbing the future by the horns and putting self-driving vehicles on the road. The vehicles, though, still have a human operator behind the wheel, ready to take over in case of an emergency. But, as Reuters points out, neither of the human drivers behind the wheel in last month's fatal incidents took action before the crashes.
To handle driving on their own, autonomous vehicles are fitted with all sorts of tech. In addition to the software and computing power lurking underneath the bodywork, the vehicles use LiDAR, sensors, and cameras to watch the road. But, as Reuters reports, these systems are not standardized. Companies use varying combinations of hardware, and sometimes entirely different components, resulting in some self-driving cars that are more capable than others.
"Humans don't have the ability to take over the vehicle as quickly as may be expected," said self-driving expert and investor Evangelos Simoudis in regard to an emergency situation.
Humans Appear To Be The Weak Link
While I won't go over the particulars of the fatal accidents, Simoudis brings up a point that's evident in both incidents. The video footage of the Uber crash shows that the operator behind the wheel appeared shocked to see the pedestrian on the road. Notably, she was looking down in the seconds before the accident took place.
In the Tesla Model X incident, the company's findings revealed that the driver received warnings to take control of the vehicle before the accident occurred, but never did. The electric vehicle was in its Autopilot mode when the incident happened.
According to Duke University mechanical engineering professor Missy Cummings, the fatal accidents involving Tesla's and Uber's cars show that the "technology they are using is immature." The companies, though, don't seem to agree.
In a response to Reuters, Tesla stated that drivers have a "responsibility to maintain control of the car" when Autopilot mode is engaged and need to be prepared to respond to "audible and visual cues." An Uber spokesperson told the outlet that, "safety is our primary concern every step of the way."
How The Government Can Get A Handle On Autonomous Cars
While the UK government and governments of other countries have attempted to get out in front of autonomous cars, the United States is lagging behind. A recent bill would have allowed autonomous vehicles to flourish by expanding testing of fully self-driving cars on America's roads. Advocates for Highway and Auto Safety, a consumer group, claims that the bill, which is now going nowhere in the U.S. Senate, should be used to improve safety instead of pushing autonomous vehicles onto public roads.
The group, according to the report, has proposed amending the bill, which is referred to as the AV Start Act, to set standards for driverless cars. For instance, the group says the bill should require a "vision test" to verify how well a vehicle's sensors, cameras, and LiDAR can see the road.
In addition to requiring autonomous vehicles to clear the vision test, the Advocates for Highway and Auto Safety also want the bill to cover semi-autonomous systems, like the one Tesla offers on its electric cars. Those systems aren't covered under the current bill.
The Advocates for Highway and Auto Safety aren't alone in wanting driverless vehicles to meet some sort of performance targets. As Reuters states, other groups have come forth, urging automakers, technology companies, and the government to require all types of semi-autonomous and autonomous vehicles to meet certain standards. They have also called for greater transparency from companies, increased regulatory oversight, and better monitoring of and engagement with human operators.
As other reports have indicated, humans continue to be a major problem when it comes to self-driving cars. Automakers aren't taking enough time to educate consumers and drivers on their autonomous systems, and drivers aren't bothering to learn those systems' limitations.
Consumer Reports' Jake Fisher, head of automotive testing for the outlet, claimed that human drivers "are bad at paying attention to automation and this technology is not capable of reacting to all types of emergencies." He went on to say, "It's like being a passenger with a toddler driving the car."
Knowing this, the Massachusetts Institute of Technology (MIT) is currently conducting tests on how drivers use semi-autonomous technology. Some of the automakers that MIT is currently looking at include Tesla, Jaguar, Volvo, and GM. "We just don't know enough about how drivers use any of these systems in the wild," said MIT research scientist Bryan Reimer.
Getting the balance right, where automakers and companies can test their autonomous technology without being hampered and where people are still safe on the road, is not going to be easy. But it is necessary. Timothy Carone, an autonomous systems expert and professor at the University of Notre Dame's Mendoza College of Business, stated that autonomous technology's proponents must "find the right balance so the technology is tested right, but isn't hampered or halted."
Doing so, at least in Carone's eyes, will save lives in the long run.
via: Reuters, Consumer Reports