Car makers and technology giants are all racing to produce the best self-driving cars, a venture largely backed by the argument that computers are safer drivers than humans, since they eliminate the most common cause of car accidents: human error.
However, that notion may have just taken a big hit after one Tesla self-driving car was involved in a fatal crash. Tesla has been at the forefront of development in this area for some time, and is even at the point of testing these autonomous vehicles out in the real world.
How Did the Car Accident Happen?
Investigators say the self-driving car was travelling down the highway when a semi-truck made a left turn in front of it. According to the reports, neither the car nor the driver could see the white truck and trailer against the brightly lit sky, and therefore the brakes were never applied to avoid a collision.
Federal regulators have since become involved in the investigation, seeking more information about the cause, since they are currently drafting a set of regulations for self-driving cars. The hope is that something can be learned from this fatal accident.
As of now, investigators have not indicated that there was any malfunction with the car or its software which may have resulted in the tragic accident.
Is Tesla Liable for Fatal Car Accidents Involving Their Self-Driving Cars?
Tesla’s self-driving cars are still in testing. Development seems to be moving along quickly, but they are not ready yet. Should Tesla be allowed to test their vehicles in public like this? What if the car had killed another person?
Even if the cars become fully autonomous, should the person in the car be allowed to watch videos? Or sleep? Airplanes have autopilot, but you wouldn't want your pilot to take a nap mid-flight, would you?