The dangers of driverless cars


The world of driverless cars is, we are told, on the horizon, but no one really knows what it will look like. The technology is already in development, yet it will take a long time before it becomes the standard. When it does, much will change. Entire cities and ways of life are designed around driving a car; when that need is removed, what will replace it?


Think about every city you know: the one thing they all have in common (perhaps excluding Venice) is the streets. They all have roads wide enough to fit the vehicles that rule them today. If cars no longer need drivers, they won't need as much space; robotic drivers need less room for error and less simplicity of design. This could change how we design entire cities.

Distance will also change as a deciding factor. Car speeds today are limited by safety, not by ability. On most roads you may not exceed 30, 50, or 100 km/h, depending on the type of street, because that is what is deemed safe for human drivers. Cars can go much faster and stop more quickly, but we can't risk it. In a world of driverless cars, higher speeds become possible. People will likely move further and further away from cities, as their commute distance grows while their commute time does not. This will have a huge impact on house prices in years to come.

Cars themselves will change. They are designed with drivers in mind, but if we never drive, the design will change too. We may reach a time when people don't need to own a car at all. The Uber model could be just the beginning: instead of ordering a car, you could order a mobile gym, office, cinema, or restaurant that picks you up and takes you to your destination while you complete some tasks.

While all of this is exciting, it is still a long way off. First, we must overcome the dangers of driverless cars. Today there are huge debates over how these cars should be programmed and who is liable when they crash: the passenger or the manufacturer? How they are programmed has become a philosophical debate as well.

If some external factor is about to cause a crash and the car must react, on what should it base its decision? Should it act to save its owner, the passenger? Or should it act to cause the least harm overall? If the car must choose between crashing into a tree and killing its passenger or crashing into a school and killing children, what should it do? Most of us would say save the schoolchildren, but car manufacturers have already come out and said they believe sales would increase if the car protected its occupant at all costs.
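The two philosophies in this dilemma can be sketched as competing cost functions. This is purely an illustrative toy, not how any real vehicle is programmed: the outcome names, harm scores, and policy functions below are all invented for the example.

```python
# Toy sketch of the two programming philosophies: "protect the occupant
# at all costs" versus "minimise total harm". All values are invented.

from dataclasses import dataclass

@dataclass
class Outcome:
    name: str
    harm_to_occupant: int  # 0 = unharmed, 10 = fatal
    harm_to_others: int    # aggregate harm to people outside the car

def protect_occupant(outcomes):
    """Policy A: minimise harm to the car's occupant first."""
    return min(outcomes, key=lambda o: (o.harm_to_occupant, o.harm_to_others))

def minimise_total_harm(outcomes):
    """Policy B: minimise total harm to everyone involved."""
    return min(outcomes, key=lambda o: o.harm_to_occupant + o.harm_to_others)

options = [
    Outcome("swerve into tree", harm_to_occupant=10, harm_to_others=0),
    Outcome("continue ahead", harm_to_occupant=0, harm_to_others=30),
]

print(protect_occupant(options).name)      # -> continue ahead
print(minimise_total_harm(options).name)   # -> swerve into tree
```

The point of the sketch is that the same situation yields opposite choices depending on which objective the manufacturer encodes, which is exactly why this has become a debate about ethics rather than engineering.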

What if two cars are forced to collide and can negotiate which should take the most damage? Should cars have access to both drivers' personal data and share it to determine who is better to keep alive for the sake of society? How would they choose? Should they choose the convict over the businessman?

The technology of driverless cars is perhaps the simplest piece of the jigsaw. The real challenge will be how we bring the technology into society, what role we choose for it, and how we govern the rules. Letting artificial intelligence recommend your next song is fairly safe (although it sometimes gets it very wrong), but can we really trust it with our lives on the road?