Fully autonomous (Level 5) cars are undergoing testing in several pockets of the world, but none are yet available to the general public. We’re still years away from that. The challenges range from the technological and legislative to the environmental and philosophical. Here are just some of the unknowns.
Lidar and Radar
Lidar remains expensive, and sensor designers are still working to strike the right balance between range and resolution. If multiple autonomous cars drove on the same road, would their lidar signals interfere with one another? And even with multiple radio frequencies available for radar, will that frequency range be enough to support mass production of autonomous cars?
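One way to see the range-versus-resolution tension: at a fixed angular resolution, the spacing between adjacent lidar returns grows linearly with distance, so a beam pattern dense enough to resolve a pedestrian up close may land only a point or two on one at highway range. A rough back-of-the-envelope sketch (the 0.1° resolution and 0.5 m target width are illustrative assumptions, not the specs of any particular sensor):

```python
import math

def point_spacing(range_m: float, angular_res_deg: float) -> float:
    """Lateral distance between adjacent lidar returns at a given range."""
    return range_m * math.radians(angular_res_deg)

def returns_on_target(target_width_m: float, range_m: float,
                      angular_res_deg: float) -> int:
    """Approximate number of horizontal returns landing on a target."""
    return int(target_width_m // point_spacing(range_m, angular_res_deg))

# Illustrative 0.1-degree horizontal resolution, 0.5 m-wide pedestrian.
for r in (10, 50, 200):
    spacing_cm = point_spacing(r, 0.1) * 100
    hits = returns_on_target(0.5, r, 0.1)
    print(f"{r:>3} m: {spacing_cm:.1f} cm between points, "
          f"~{hits} returns on a pedestrian")
```

At 10 m the hypothetical sensor puts dozens of points on the pedestrian; at 200 m it gets roughly one, which is why longer range demands finer (and costlier) angular resolution.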
What happens when an autonomous car drives in heavy precipitation? If there’s a layer of snow on the road, lane dividers disappear. How will the cameras and sensors track lane markings if the markings are obscured by water, oil, ice, or debris?
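A toy illustration of why obscured markings are hard: many lane-keeping pipelines begin by searching for high-contrast intensity edges where paint meets asphalt. In the sketch below (pure Python over synthetic pixel rows; the pixel values and the threshold of 50 are arbitrary assumptions for illustration), the same crude edge detector that finds a clean white stripe misses one whose contrast has been washed out by snow:

```python
def find_marking_edges(row, threshold=50):
    """Return indices where adjacent pixel brightness jumps by more than
    threshold -- a crude stand-in for the gradient step in lane detection."""
    return [i for i in range(1, len(row))
            if abs(row[i] - row[i - 1]) > threshold]

# Synthetic scanline: dark asphalt (30) with a bright painted stripe (200).
clear_road = [30] * 10 + [200] * 4 + [30] * 10

# Same geometry after a dusting of snow: everything brightens and flattens,
# so the stripe-to-road contrast drops below the detector's threshold.
snowy_road = [180] * 10 + [210] * 4 + [180] * 10

print(find_marking_edges(clear_road))  # stripe boundaries found: [10, 14]
print(find_marking_edges(snowy_road))  # no edges found: []
```

Real perception stacks are far more sophisticated, but the failure mode is the same: when weather erases the contrast the algorithm depends on, the lane simply disappears from the sensor's point of view.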
Traffic Conditions and Laws
Will autonomous cars have trouble in tunnels or on bridges? How will they do in bumper-to-bumper traffic? Will autonomous cars be relegated to a specific lane? Will they be granted carpool lane access? And what about the fleet of legacy cars still sharing the roadways for the next 20 or 30 years?
State vs. Federal Regulation
The regulatory process in the U.S. has recently shifted from federal guidance to state-by-state mandates for autonomous cars. Some states have even proposed a per-mile tax on autonomous vehicles to prevent the rise of “zombie cars” driving around without passengers. Lawmakers have also written bills proposing that all autonomous cars must be zero-emission vehicles and have a panic button installed. But are the laws going to be different from state to state? Will you be able to cross state lines with an autonomous car?
Who is liable for accidents caused by an autonomous car? The manufacturer? The human passenger? The latest blueprints suggest that a fully autonomous Level 5 car will not have a dashboard or a steering wheel, so a human passenger would not even have the option to take control of the vehicle in an emergency.
Artificial vs. Emotional Intelligence
Human drivers rely on subtle cues and non-verbal communication—like making eye contact with pedestrians or reading the facial expressions and body language of other drivers—to make split-second judgment calls and predict behaviors. Will autonomous cars be able to replicate this connection? Will they have the same life-saving instincts as human drivers?
What are the benefits of autonomous cars?
The scenarios for convenience and quality-of-life improvements are limitless. The elderly and the physically disabled would have independence. If your kids were at summer camp and forgot their bathing suits and toothbrushes, the car could bring them the missing items. You could even send your dog to a veterinary appointment.
But the real promise of autonomous cars is the potential for dramatically lowering CO2 emissions. In a recent study, experts identified three trends that, if adopted concurrently, would unleash the full potential of autonomous cars: vehicle automation, vehicle electrification, and ridesharing. By 2050, these “three revolutions in urban transportation” could:
- Reduce traffic congestion (30% fewer vehicles on the road)
- Cut transportation costs by 40% (in terms of vehicles, fuel, and infrastructure)
- Improve walkability and livability
- Free up parking lots for other uses (schools, parks, community centers)
- Reduce urban CO2 emissions by 80% worldwide
What solutions does Synopsys have for autonomous cars?
Today’s cars have 100 million lines of code. Tomorrow’s autonomous cars will have more than 300 million lines of code, so cybersecurity is a growing concern. Synopsys is the leader in application security testing and software composition analysis, helping automotive customers build security into their software throughout the development lifecycle and across the supply chain.
Synopsys also offers a broad portfolio of automotive-grade IP, certified for ISO 26262 ASIL B and ASIL D readiness, to help customers build the best chips for applications such as ADAS, infotainment, and mainstream MCUs. Synopsys embedded vision processor solutions help customers integrate capabilities like object and facial recognition, night vision, and adaptive cruise control.