Will driverless cars REALLY make our roads safer? Tesla and GM under investigation after ‘self-driving crashes’
Two major car manufacturers are under fire after collisions involving their motors while using hi-tech features. So will the new technology really make UK motorways safer?
DRIVERLESS cars are due to appear on UK roads by 2021, with the promise of making our motorways safer and more efficient.
Experts have claimed the introduction of self-driving motors will reduce road deaths by removing human error from the equation.
But as manufacturers ramp up testing in the race to be first to produce a foolproof autonomous car, instances of the technology going wrong are starting to raise questions about its safety.
Back in 2016, Tesla CEO Elon Musk famously stated "the probability of having an accident is 50 per cent lower" using the manufacturer's Autopilot feature compared to full human control, but recent incidents have seen the technology come under fire.
Following a collision between a Tesla Model S electric car and a fire engine in California on Monday, the driver allegedly told investigators he was using Autopilot at the time.
The system, which keeps a vehicle centred in its lane at a set distance from cars in front of it and also can change lanes and brake automatically, is now under investigation in the US for the second time.
Tesla wouldn't say if Autopilot was working at the time of the crash, but said in a statement on Monday that drivers must stay attentive when it's in use.
The Model S Autopilot is "Level 2" on the self-driving scale - where "Level 5" motors can operate autonomously in nearly all circumstances.
Level 2 automation systems are generally limited to use on interstate highways in the US, and drivers of these models are supposed to continuously monitor vehicle performance and be ready to take control if necessary.
Monday's collision is not an isolated incident - there have been several other high-profile road accidents involving self-driving cars from different manufacturers in recent years.
The levels of autonomous driving
Level 1: The first level of autonomy means the driver remains in control of the car the entire time, while steering and acceleration can be assisted by the car. These systems are already on sale and include self-parking and lane assist
Level 2: These require drivers to pay attention to their surroundings and be prepared to "take control of the vehicle in specific situations". Drivers have to keep their hands on the wheel just in case, too. Tesla Autopilot currently operates at this level.
Level 3: These cars can make decisions for themselves without the need for driver input in certain situations. The Audi A8 offers this tech - regulation permitting - with the ability to drive itself at up to 37mph, and it goes on sale next year.
Level 4: These are true "driverless cars" that can navigate without any driver help and can independently indicate, brake and steer. These won't be on sale until 2021 at the earliest.
Level 5: The end goal is a car that doesn't need a driver at all - there might not even be pedals or a steering wheel. Google is currently developing an example of this with its Waymo project.
General Motors is currently facing a lawsuit over a San Francisco collision which saw a motorcyclist knocked off his bike by a Chevrolet Bolt performing a lane change while in self-driving mode.
Oscar Nilsson is suing the motor manufacturer over the December incident, which he claims was the fault of the semi-autonomous technology.
One of Google's driverless cars was involved in a shunt when it pulled into the path of an oncoming bus back in February 2016.
Uber was forced to pull its self-driving cars from the road in March last year, too, after a test vehicle ended up on its side while attempting to make a turn.
Volvo also delayed its DriveMe driverless trials until 2021 over fears the tech wasn't ready for public use.
The first fatal road accident involving a self-driving car occurred in May 2016, when 40-year-old Joshua Brown was killed after his Tesla Model S collided with a truck in Florida.
Brown was allegedly watching a Harry Potter film while operating his car in the Autopilot mode.
In September, the US National Transportation Safety Board determined that design limitations of the Model S Autopilot played a major role in that fatal Florida crash - but it blamed the collision on the Tesla driver's inattention and overreliance on the technology, and on a truck driver who made a left turn in front of the car.
According to the investigation, the driver had his hands on the sedan's steering wheel for only 25 seconds of the 37.5 minutes the vehicle's cruise control and lane-keeping systems were in use prior to the crash.
Tesla has taken steps to prevent drivers from using Autopilot improperly, including measuring the amount of torque applied to the steering wheel and sending visual and audio warnings.
If the warnings are ignored, drivers would be prevented from using Autopilot, the company has said.