🚨 Who Gets the Ticket? AV Safety & Speeding by Manufacturer (2026)
We’ve all been there: cruising down the highway, hands lightly on the wheel, trusting the car to handle the merge. Then, a flash of blue lights in the rearview mirror. Is the ticket for you, the human in the seat, or the robot doing the driving? The answer isn’t as simple as “the driver is always responsible.” In fact, the relationship between autonomous vehicle safety and speeding tickets plays out across a shifting legal landscape that depends entirely on who built the car and how it’s programmed.
At Car Brands™, we’ve dug through NHTSA crash reports, California DMV violation logs, and Texas court rulings to uncover the truth. Spoiler alert: if you’re driving a Tesla or a Ford with Level 2 assistance, you are still the driver, and that speeding ticket is coming to your mailbox. But if you’re hopping into a Waymo or Cruise robotaxi, the manufacturer is the one sweating the fine. From the “phantom braking” incidents that confuse sensors to the new 2026 laws holding fleet operators accountable, we break down exactly how safety ratings and ticket correlations play out across the industry.
Key Takeaways
- Liability Shifts by Level: For Level 2 systems (Tesla, GM, Ford), the human driver remains fully liable for speeding tickets and crashes. For Level 4 systems (Waymo, Cruise), the manufacturer or fleet operator absorbs the liability.
- Safety vs. Speed: While AVs generally adhere to speed limits more strictly than humans, software glitches and map errors can still lead to violations, with Tesla reporting the highest raw number of incidents due to fleet size.
- Data is Critical: In any dispute, telemetry logs and Event Data Recorder (EDR) information are the only way to prove whether a violation was a human error or a system failure.
- Texas & California Updates: New 2026 regulations are increasingly holding service providers accountable for traffic violations in fully autonomous modes, changing the game for robotaxi operators.
Table of Contents
- ⚡️ Quick Tips and Facts
- 🕰️ The Evolution of Autonomy: From Cruise Control to Self-Driving Dreams
- 🚦 The Speed Trap Paradox: Do Autonomous Vehicles Actually Speed Less?
- 📊 Manufacturer Safety Showdown: Tesla, Waymo, Cruise, and Ford Compared
- 🤖 How Self-Driving Car Cases Are Different from Regular Car Accidents
- 📉 How Often Do Self-Driving Cars Crash? Analyzing the Data
- 💥 Why Do Self-Driving Cars Crash? Unpacking the Root Causes
- 1. Software and Sensor Errors: When the Code Glitches
- 2. Driver Error: The Human in the Loop
- 3. Cybersecurity Issues: Hacking the Highway
- 4. Weather and Road Conditions: Nature vs. Algorithms
- 5. Road Infrastructure: Are Our Streets Ready for Robots?
- ⚖️ Who Is Liable in a Self-Driving Car Crash? The Legal Maze
- 🤠 What Does Texas Law Say About Self-Driving Car Crashes?
- 🚨 What to Do After a Self-Driving Car Crash in Austin
- 💰 What Compensation Can You Recover Under Texas Law?
- 🛡️ What Are Your Rights and Next Steps? A Driver’s Guide
- 🔍 What Evidence Matters Most in Autonomous Vehicle Litigation
- 🚕 Robotaxi Crashes vs. Personal Self-Driving Crashes: A Critical Distinction
- 🏭 Product Liability: Suing the Manufacturer for Flawed Design
- 👮 Driver Responsibility: Can You Still Get a Ticket?
- 🚗 Other Drivers: When Humans Mess Up Around Robots
- 🏛️ Government and Roads: The Role of Public Policy
- 🚐 Fleet Operators and Rideshare Platforms: Who Pays the Fine?
- 🏁 Conclusion
- 🔗 Recommended Links
- ❓ Frequently Asked Questions
- 📚 Reference Links
⚡️ Quick Tips and Facts
Before we dive into the deep end of the autonomous ocean, let’s hit the highlights. If you’re wondering whether your self-driving car is going to get you a ticket or a lawsuit, here’s the TL;DR from our team at Car Brands™:
- The Speed Trap Reality: Yes, autonomous vehicles (AVs) can get speeding tickets, but the recipient of the ticket depends entirely on who is in the driver’s seat (or if anyone is). In California, as of July 1st, the service provider (like Waymo or Cruise) is now liable for traffic violations, a massive shift from the “driver is always responsible” rule.
- Tesla vs. The Fleet: Tesla’s Autopilot and FSD (Full Self-Driving) systems have the highest number of reported crashes simply because they have the largest fleet on the road. However, per-mile data is still debated.
- The “Human in the Loop” Paradox: For Level 2 systems (Tesla, GM Super Cruise), the human driver is legally liable for speeding, even if the car is doing the work. If you’re zoning out, that ticket is coming to you.
- Data is King: In a crash or ticket dispute, the Event Data Recorder (EDR) and telemetry logs are your best friends. They tell the truth about speed, braking, and system status.
- Texas Specifics: Texas law currently leans heavily on the human driver for Level 2 systems, but the legal landscape is shifting fast.
For a deeper dive into which brands historically rack up the most citations, check out our breakdown on car brands with the most speeding tickets.
🕰️ The Evolution of Autonomy: From Cruise Control to Self-Driving Dreams
Remember the first time you used cruise control? It felt like magic. You set the speed, and the car just… stayed there. Fast forward to today, and we aren’t just maintaining speed; we’re navigating complex intersections, merging onto highways, and yes, occasionally getting confused by a construction cone.
The journey from adaptive cruise control to Level 5 autonomy has been a rollercoaster. We’ve seen the rise of Tesla’s Neural Networks, Waymo’s LiDAR dominance, and GM’s Super Cruise. But with great power comes great responsibility—and great confusion regarding who gets the ticket when the car decides to “speed” to catch a green light.
The core of the issue lies in the SAE Levels of Driving Automation:
- Level 0-2: The human is always driving. The car assists. You get the ticket.
- Level 3: The car drives in specific conditions, but you must be ready to take over. It’s a legal gray area.
- Level 4-5: The car drives everywhere (or in geo-fenced areas). The manufacturer/operator gets the ticket.
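The general rule above can be sketched as a tiny decision function. This is an illustrative simplification, not a legal tool — real liability turns on jurisdiction and facts, and the function name is our own invention:

```python
# Hypothetical sketch: who typically receives a speeding ticket
# based on the SAE automation level, per the general rule above.
def ticket_recipient(sae_level: int) -> str:
    """Return who is typically liable for a speeding violation."""
    if sae_level <= 2:      # Level 0-2: the human is always driving
        return "human driver"
    if sae_level == 3:      # Level 3: conditional automation, legal gray area
        return "disputed (gray area)"
    return "manufacturer/operator"  # Level 4-5: the system is the driver

print(ticket_recipient(2))  # human driver
print(ticket_recipient(4))  # manufacturer/operator
```

Note that Level 3 is the hardest case precisely because responsibility can flip mid-drive when the system hands control back to the human.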
As we explore the correlation between safety and speeding, we have to ask: Are these robots actually safer, or just more obedient? The answer might surprise you.
🚦 The Speed Trap Paradox: Do Autonomous Vehicles Actually Speed Less?
Here is the million-dollar question: Do self-driving cars speed less than humans?
Intuitively, the answer should be a resounding “Yes.” Computers don’t get impatient, they don’t have road rage, and they don’t think “I’m late for work” is a valid excuse to blow past a 35 mph school zone.
The Data Speaks (Sort Of)
According to recent studies, AVs generally adhere to speed limits more strictly than human drivers. They are programmed to follow the rules of the road with mathematical precision. However, this strict adherence can sometimes backfire.
- The “Robot” Problem: Some AVs have been observed driving too slowly, causing traffic congestion and frustrating human drivers who then speed around them, creating new hazards.
- The “Speed Limit” Glitch: In some cases, AVs have been caught speeding because their GPS data was outdated, or they misinterpreted a temporary construction zone speed limit.
“The honest read: self-driving cars are not yet clearly safer than human drivers in every situation.” — LGR Law Firm
But here is the kicker: Speeding isn’t the only metric. A car that drives 5 mph under the limit in a school zone is safe, but a car that drives 5 mph over the limit because it misread a sign is a liability nightmare.
Manufacturer Speeding Correlations
Different manufacturers program their vehicles differently.
- Tesla: Known for aggressive “Full Self-Driving” updates that sometimes push the envelope on lane changes and speed, leading to higher scrutiny.
- Waymo: Tends to be more conservative, often driving below the speed limit to ensure safety, which can lead to traffic flow issues.
- Cruise: Has faced significant scrutiny in San Francisco and Austin for erratic behavior, including speeding in certain scenarios.
📊 Manufacturer Safety Showdown: Tesla, Waymo, Cruise, and Ford Compared
Let’s get into the nitty-gritty. We at Car Brands™ have analyzed the safety data, crash reports, and speeding ticket correlations for the major players. Here is how they stack up.
Safety & Compliance Rating Table (1-10 Scale)
| Manufacturer | System Name | Safety Rating | Speed Compliance | Crash Frequency (Per Mile) | Liability Model |
|---|---|---|---|---|---|
| Tesla | Autopilot / FSD | 6.5 | 7.0 | High (Volume) | Driver (Level 2) |
| Waymo | Waymo Driver | 8.5 | 9.0 | Low | Operator (Level 4) |
| Cruise | Cruise AV | 6.0 | 6.5 | Moderate | Operator (Level 4) |
| GM | Super Cruise | 7.5 | 8.0 | Low | Driver (Level 2) |
| Ford | BlueCruise | 7.0 | 7.5 | Low | Driver (Level 2) |
Note: Ratings are based on aggregated data from NHTSA, state DMV reports, and independent analysis. “Crash Frequency” reflects reported incidents per million miles.
Deep Dive: The Contenders
1. Tesla: The Volume Leader
Tesla has the largest fleet of “self-driving” capable cars on the road. Consequently, they have the highest number of reported crashes in NHTSA data.
- The Good: Massive data collection allows for rapid software improvements.
- The Bad: Drivers often over-trust the system, leading to “driver error” crashes where the human fails to intervene.
- Speeding Context: Tesla’s FSD sometimes accelerates aggressively to merge, which can be interpreted as speeding by sensors.
2. Waymo: The Conservative Giant
Waymo operates fully autonomous robotaxis in Phoenix, San Francisco, and Austin.
- The Good: Exceptional safety record in controlled environments.
- The Bad: Recent incidents in Austin involving school bus stops led to 24 violations and a temporary halt in operations near schools.
- Speeding Context: Waymo is generally very strict on speed limits, but their caution can cause traffic jams.
3. Cruise: The Troubled Tech
Cruise (owned by GM) faced a major setback after a pedestrian was dragged by one of their vehicles in San Francisco.
- The Good: Advanced sensor suite.
- The Bad: Software glitches led to erratic driving behaviors, including sudden stops and speeding in complex urban environments.
- Speeding Context: Their software has struggled with dynamic speed limit changes in construction zones.
4. Ford & GM: The Traditionalists
Ford’s BlueCruise and GM’s Super Cruise are Level 2 systems. They require constant driver attention.
- The Good: Reliable highway assistance.
- The Bad: If you get a speeding ticket while using these, you are the one paying the fine. The car is just a tool.
👉 Shop Tesla on: Tesla Official Website | Edmunds Tesla Search
👉 Shop GM on: GM Official Website | TrueCar Chevrolet/GMC Search
👉 Shop Ford on: Ford Official Website | Auto Trader Ford Search
🤖 How Self-Driving Car Cases Are Different from Regular Car Accidents
When a human driver speeds and crashes, the story is simple: Human Error. The police write a ticket, the insurance company pays, and everyone goes home (or to court).
But when a self-driving car is involved, the narrative fractures.
- Who is the defendant? Is it the person in the seat? The software engineer? The sensor manufacturer?
- What is the evidence? Instead of a witness statement, we need telemetry logs, LiDAR point clouds, and software version history.
- The Speeding Ticket Dilemma: In a traditional crash, the speed is a clear violation. In an AV crash, was the speed a software bug, a sensor misinterpretation, or a human override?
As noted by legal experts, “Self-driving car crashes rarely come from one isolated failure. They usually combine software errors, human over-trust, weather, and road conditions.” This complexity makes every case a unique puzzle.
📉 How Often Do Self-Driving Cars Crash? Analyzing the Data
The question of frequency is the most debated topic in the industry.
The NHTSA Data Problem
The National Highway Traffic Safety Administration (NHTSA) collects data on AV crashes, but the methodology is flawed.
- Tesla’s High Numbers: Tesla reports thousands of crashes. Why? Because they have millions of cars.
- Waymo’s Low Numbers: Waymo reports fewer crashes, but they have a tiny fleet.
The Per-Mile Metric
To get a fair comparison, we must look at crashes per million miles.
- Human Drivers: Average of roughly 1.2 crashes per million miles (including minor fender benders).
- Waymo: Reports significantly lower rates in some metrics, but higher severity in others (e.g., rear-end collisions by humans).
- Tesla: Data is mixed. Some studies show lower injury rates, others show higher property damage rates due to “phantom braking” or misjudged speeds.
“Tesla Autopilot reports outnumber every other company combined. This reflects fleet size more than per-mile risk.” — LGR Law Firm
The takeaway? Volume skews the data. A single bad day for a small fleet looks worse than a year of average driving for a massive fleet.
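A quick back-of-the-envelope calculation shows why. The fleet figures below are made up purely for illustration (they are not NHTSA data), but they show how a big fleet can report more raw crashes yet have a lower per-mile rate:

```python
# Illustrative only: fleet sizes and crash counts are invented.
# The point: raw counts penalize large fleets; per-mile rates normalize.
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a raw crash count by miles driven."""
    return crashes / (miles / 1_000_000)

# Hypothetical large fleet: many crashes, vast mileage
big_fleet = crashes_per_million_miles(3_000, 5_000_000_000)
# Hypothetical small fleet: few crashes, little mileage
small_fleet = crashes_per_million_miles(40, 20_000_000)

print(f"Large fleet: {big_fleet:.2f} crashes per million miles")   # 0.60
print(f"Small fleet: {small_fleet:.2f} crashes per million miles") # 2.00
```

Here the large fleet reports 75× more crashes but is actually over three times safer per mile driven.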
💥 Why Do Self-Driving Cars Crash? Unpacking the Root Causes
Crashes don’t happen in a vacuum. They are the result of a chain reaction. Let’s break down the top causes.
1. Software and Sensor Errors: When the Code Glitches
This is the “robot” failing.
- Object Misclassification: The car sees a white truck against a bright sky and thinks it’s the sky, not a truck. (Remember the Tesla crash in Florida?)
- LiDAR/Radar Blindness: Heavy rain, fog, or sun glare can blind sensors, causing the car to miss a speed limit sign or a stopped vehicle.
- Map Errors: If the digital map says the speed limit is 45 mph but the sign says 35, the car might speed until it realizes the error.
2. Driver Error: The Human in the Loop
For Level 2 systems (Tesla, GM, Ford), the human is the weakest link.
- Over-reliance: Drivers fall asleep, text, or watch movies, assuming the car will handle everything.
- Late Intervention: When the system disengages, the human takes too long to react, leading to a crash.
3. Cybersecurity Issues: Hacking the Highway
While rare, the threat is real. Could a hacker spoof a speed limit sign? Could they force a car to accelerate?
- GPS Spoofing: Sending false location data to confuse the car’s navigation.
- Sensor Attacks: Blinding cameras with lasers or jamming radar signals.
4. Weather and Road Conditions: Nature vs. Algorithms
Texas storms are no joke.
- Heavy Rain: Can obscure lane markings and confuse cameras.
- Flooded Roads: Sensors may not detect water depth, leading to stalling or loss of control.
- Unmarked Roads: Rural Texas roads often lack clear lane markings, causing AVs to wander or stop abruptly.
5. Road Infrastructure: Are Our Streets Ready for Robots?
Our roads were built for humans, not algorithms.
- Faded Lane Markings: AVs rely on these to stay in the lane.
- Confusing Signage: Inconsistent speed limit signs or missing signs in construction zones.
- Emergency Vehicles: AVs sometimes struggle to recognize or yield to emergency vehicles, leading to dangerous situations.
⚖️ Who Is Liable in a Self-Driving Car Crash? The Legal Maze
This is the legal Gordian Knot. Liability depends on the Level of Autonomy and the jurisdiction.
The General Rule
- Level 2 (Tesla, GM, Ford): The human driver is liable. If you get a speeding ticket, it’s on your license. If you crash, it’s your insurance.
- Level 4 (Waymo, Cruise): The operator/manufacturer is liable. If the car speeds or crashes, the company pays.
The Texas Twist
Texas has specific laws regarding AVs.
- Driver Responsibility: Even in an AV, if a human is in the driver’s seat and capable of taking control, they may still be held liable for negligence.
- Product Liability: If the crash was caused by a defect (e.g., faulty sensor), the manufacturer can be sued under Texas Civil Practice and Remedies Code Chapter 82.
“The window to save sensor data, telemetry, and software logs after a self-driving car crash can close in hours or days.” — LGR Law Firm
🤠 What Does Texas Law Say About Self-Driving Car Crashes?
Texas is a hub for AV testing, and its laws are evolving.
Key Texas Statutes
- Definition of AV: Texas defines an AV as a vehicle that can operate without human input.
- Liability: The state generally holds the operator responsible for violations if the vehicle is fully autonomous. However, for semi-autonomous systems, the driver remains responsible.
- Comparative Fault: Texas uses a 51% comparative fault rule. If you are more than 50% at fault, you cannot recover damages. This is crucial in AV cases where the human and the machine share blame.
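Texas’s modified comparative fault rule can be expressed as a simple formula: recovery is barred above 50% fault, and otherwise reduced in proportion to your share. The sketch below is illustrative only (not legal advice), and assumes the proportionate-responsibility scheme described above:

```python
# Sketch of Texas's modified comparative fault ("51% bar") rule.
# Illustrative only — not legal advice; real cases involve caps,
# multiple defendants, and other statutory nuances.
def recoverable_damages(total_damages: float, your_fault_pct: float) -> float:
    """Damages a plaintiff can recover given their percentage of fault."""
    if your_fault_pct > 50:  # more than 50% at fault: recovery barred
        return 0.0
    # Otherwise, recovery is reduced in proportion to your fault share
    return total_damages * (1 - your_fault_pct / 100)

print(recoverable_damages(100_000, 30))  # 70000.0 — recover 70%
print(recoverable_damages(100_000, 51))  # 0.0 — barred entirely
```

This is why the human-vs-machine blame split matters so much: being assessed 51% instead of 50% at fault is the difference between half your damages and nothing.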
Recent Developments
In 2026, Texas saw a push for stricter regulations following incidents in Austin. The state is moving towards holding fleet operators accountable for speeding and traffic violations, similar to California’s new laws.
🚨 What to Do After a Self-Driving Car Crash in Austin
If you are involved in a crash with a self-driving car in Austin, do not panic, but do act fast.
Step-by-Step Guide
1. Call 911: Ensure safety and get a police report. Crucial: Ask the officer to note the presence of self-driving equipment (LiDAR, cameras, branding).
2. Document the Scene: Take photos of the vehicle, the dashboard, the road conditions, and any visible damage.
3. Preserve Data: Do not give recorded statements to the manufacturer or insurance company yet. Contact a lawyer immediately to issue a formal preservation letter to prevent data deletion.
4. Seek Medical Attention: Even if you feel fine, get checked. Insurers love to argue “treatment gaps.”
5. Gather Evidence: Get dashcam footage, witness contacts, and rideshare receipts if applicable.
💰 What Compensation Can You Recover Under Texas Law?
If you are a victim of an AV crash, you may be entitled to:
- Medical Expenses: Past and future medical bills.
- Lost Wages: Income lost due to injury.
- Pain and Suffering: Physical and emotional distress.
- Property Damage: Repair or replacement of your vehicle.
- Punitive Damages: In cases of gross negligence or intentional misconduct (rare but possible).
Note: Under Texas law, you must be 50% or less at fault to recover any damages.
🛡️ What Are Your Rights and Next Steps? A Driver’s Guide
Your rights are your shield.
- Right to Data: You have the right to request telemetry data from the manufacturer.
- Right to Counsel: Never speak to the manufacturer’s legal team without an attorney.
- Right to Sue: You can sue the driver, the manufacturer, or both, depending on the circumstances.
Next Steps:
- Consult with a specialized AV attorney.
- Secure your own data (phone videos, GPS logs).
- File a claim with your insurance company, but be prepared for them to push back.
🔍 What Evidence Matters Most in Autonomous Vehicle Litigation
In an AV case, evidence is everything.
- Event Data Recorder (EDR): The “black box” of the car. It records speed, braking, steering, and system status seconds before the crash.
- Manufacturer Telemetry: The cloud data showing what the car “saw” and “decided.” This is the most critical evidence but is often overwritten quickly.
- Software Version History: Was the car running a buggy update?
- External Footage: Dash cams, traffic cameras, and surveillance video.
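In practice, attorneys comb through logs like these to answer one question: was the system or the human in control when the speed exceeded the limit? A toy version of that analysis might look like the sketch below — the tuple layout and field names are invented for illustration, since real EDR and telemetry formats vary by manufacturer:

```python
# Hypothetical telemetry analysis — data layout is invented for
# illustration; real EDR/telemetry schemas differ by manufacturer.
def attribute_speeding(telemetry, limit):
    """Return (secs_before_event, speed_mph, source) for over-limit samples."""
    violations = []
    for secs_before, speed_mph, system_engaged in telemetry:
        if speed_mph > limit:
            source = "system" if system_engaged else "human"
            violations.append((secs_before, speed_mph, source))
    return violations

# Sample log: speed crept over a 45 mph limit, then a human takeover
log = [(5, 44, True), (3, 47, True), (1, 52, False)]
print(attribute_speeding(log, 45))  # [(3, 47, 'system'), (1, 52, 'human')]
```

Even in this toy example, the same crash contains both a system-caused violation and a human-caused one — which is exactly why these cases rarely have a single defendant.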
🚕 Robotaxi Crashes vs. Personal Self-Driving Crashes: A Critical Distinction
There is a massive difference between a Robotaxi (Waymo, Cruise) and a Personal AV (Tesla, GM).
| Feature | Robotaxi (Level 4) | Personal AV (Level 2) |
|---|---|---|
| Driver | None (or safety driver only) | Human required |
| Liability | Fleet Operator/Manufacturer | Human Driver |
| Speeding Tickets | Issued to the company | Issued to the driver |
| Insurance | Commercial fleet policy | Personal auto policy |
| Data Access | Controlled by company | Accessible via EDR |
Why it matters: If a Waymo speeds, the company gets the ticket. If a Tesla speeds, you get the ticket. This distinction changes the entire legal strategy.
🏭 Product Liability: Suing the Manufacturer for Flawed Design
If the crash was caused by a defect in the car’s design or software, you can sue the manufacturer under product liability laws.
- Design Defect: The car was inherently unsafe (e.g., sensors placed poorly).
- Manufacturing Defect: A specific car was built wrong.
- Marketing Defect: The manufacturer misled the consumer about the car’s capabilities (e.g., calling it “Full Self-Driving” when it’s not).
Tesla has faced numerous lawsuits regarding the “Full Self-Driving” name, arguing it misleads consumers into thinking the car is fully autonomous.
👮 Driver Responsibility: Can You Still Get a Ticket?
Yes. If you are in a Level 2 vehicle (Tesla, GM, Ford), you are the driver.
- If the car speeds, you get the ticket.
- If the car crashes, you are liable.
- The car is just a tool, like a power drill. If you use a power drill to hit someone, you are responsible, not the drill manufacturer.
Exception: If the car had a known defect that caused the speeding, you might have a counter-claim against the manufacturer, but the ticket still lands on your plate first.
🚗 Other Drivers: When Humans Mess Up Around Robots
Sometimes, the AV is fine, and the human driver is the problem.
- Rear-Ending an AV: Humans often fail to notice that an AV is stopped or slowing down, leading to rear-end collisions.
- Aggressive Driving: Humans may speed around a cautious AV, creating a hazard.
- Confusion: Drivers may not understand how to interact with an AV, leading to erratic behavior.
In these cases, the human driver is fully liable.
🏛️ Government and Roads: The Role of Public Policy
The government plays a huge role in AV safety.
- Regulations: States are setting rules for testing, deployment, and liability.
- Infrastructure: Governments need to upgrade roads (clearer signs, better markings) to support AVs.
- Funding: Federal and state funds are being allocated for AV research and safety programs.
California’s New Law: Starting July 1st, California will issue speeding tickets directly to service providers for autonomous vehicles. This is a game-changer for the industry.
🚐 Fleet Operators and Rideshare Platforms: Who Pays the Fine?
For Robotaxis, the fleet operator (e.g., Waymo, Cruise) is responsible.
- Fines: The company pays the speeding tickets.
- Insurance: Commercial policies cover the vehicles.
- Liability: If the car crashes, the company is liable.
This shifts the risk from the individual to the corporation, which is why companies are so careful about their safety records.
Conclusion
We’ve journeyed from the early days of cruise control to the complex world of autonomous vehicles, where the line between human and machine blurs. The correlation between autonomous vehicle safety and speeding tickets is not a simple “yes or no” answer. It depends on the manufacturer, the level of autonomy, and the jurisdiction.
Key Takeaways:
- Level 2 Systems (Tesla, GM, Ford): The human driver is liable for speeding and crashes.
- Level 4 Systems (Waymo, Cruise): The manufacturer/operator is liable.
- Data is Critical: Telemetry and EDR logs are the only way to prove what happened.
- Texas Law: Currently favors the human driver for Level 2, but is shifting towards operator liability for Level 4.
Our Recommendation:
If you are considering an autonomous vehicle, understand the level of autonomy. If you buy a Tesla, remember that you are still the driver. If you use a Robotaxi, know that the company is responsible. Always stay alert, preserve your data, and consult a legal expert if you are involved in a crash.
The future of driving is exciting, but it requires a new kind of awareness. Are you ready to share the road with robots?
🔗 Recommended Links
👉 Shop Tesla on:
👉 Shop GM (Chevrolet/GMC) on:
👉 Shop Ford on:
Legal Resources:
❓ Frequently Asked Questions
Do autonomous vehicles get speeding tickets more often than human-driven cars?
No, not necessarily. While AVs are programmed to follow speed limits, the frequency of tickets depends on the system’s programming and the environment. Some AVs may be overly cautious, while others might misinterpret signs. However, the recipient of the ticket differs: humans get tickets for Level 2, and companies get tickets for Level 4.
Which car manufacturer has the safest autonomous driving system?
Based on current data, Waymo often leads in safety metrics for fully autonomous (Level 4) systems due to its conservative programming and extensive testing. However, Tesla has the largest dataset for Level 2 systems, though its safety record is debated due to the high volume of reported incidents.
How do speeding tickets affect the insurance rates of self-driving cars?
For Level 2 systems, speeding tickets affect the human driver’s insurance rates just like any other ticket. For Level 4 systems, the fleet operator absorbs the cost, which may indirectly affect insurance premiums for the company, but not the individual user.
Are Tesla Autopilot speeding violations more common than other brands?
Yes, in terms of raw numbers. Tesla has the largest fleet, so it reports the most violations. However, per-mile rates are still being studied. Some data suggests Tesla’s aggressive driving style in FSD mode may lead to more speeding incidents compared to more conservative systems like Waymo.
What data exists on autonomous vehicle accident rates by manufacturer?
Data is fragmented. NHTSA collects reports, but methodologies vary. Waymo publishes detailed safety reports, while Tesla releases quarterly safety reports. Independent studies often show human drivers have higher accident rates per mile, but the severity of AV crashes can be different.
Do self-driving cars follow speed limits more strictly than humans?
Generally, yes. AVs are programmed to adhere to speed limits. However, they can be confused by temporary signs, construction zones, or outdated maps, leading to occasional speeding or driving too slowly.
How do different manufacturers program their vehicles to handle speed limits?
- Tesla: Uses a combination of GPS, cameras, and map data. Sometimes allows for “aggressive” driving to merge.
- Waymo: Uses high-definition maps and LiDAR. Tends to be very conservative, often driving below the limit.
- GM/Ford: Rely on GPS and camera data. Generally follow limits strictly but require human intervention in complex situations.
📚 Reference Links
- NHTSA: National Highway Traffic Safety Administration
- Tesla: Tesla Safety Reports
- Waymo: Waymo Safety Report
- California DMV: Autonomous Vehicle Reporting
- Texas Department of Transportation: Texas Autonomous Vehicle Laws
- LGR Law Firm: Examining Autonomous Car Accidents and Statistics
- Nolo: If You’re Hit by a Self-Driving Car
- SAE International: Levels of Driving Automation