Another Chinese self-driving test: deadly results and lawbreaking in city ADAS use
Chinese media outlet Dongchedi posted another massive test of automotive self-driving systems, testing many of the same cars as it did in the highway test we reported on this weekend.
This time, the test covers various urban driving scenarios, where much more human carnage is possible due to the presence of vulnerable road users like pedestrians and two-wheelers. And given how poorly the cars did on the last test, you can guess how they might have done on this one – although, once again, Tesla fared rather well.
The last video tested 36 cars in 6 different scenarios, all on highways and intended to replicate plausible situations that might lead to a crash. The new video is a little shorter than the last one, but still hefty at just over an hour long. It’s also only available in Chinese, but helpfully with English subtitles.
This time, the group was trimmed down to 26 cars from 36, but 9 scenarios were tested instead of 6, leading to a total of 234 simulations.
Dongchedi had the help of Chinese state media in making the test possible, and it shows in the extremely high production value of the videos, which it posted on its automotive YouTube channel DCARSTUDIO. Once again, we recommend a watch, because it’s very well made.
The innovation behind these videos is that, unlike most other crash tests that either happen in labs or on closed courses like racetracks, airport runways or parking lots, DCAR used actual public roads which were shut down for the purpose of testing.
Why does this matter? Well, we’re testing ADAS systems here, not just normal passive crash structures like crumple zones, or even emergency driver aids like automatic emergency braking.
And the thing about ADAS systems, particularly those with an end-to-end, “navigate-on-autopilot”-like feature where the car can follow directions and make lane changes, turns, merges and other road transitions for you, is that they can’t be activated on roads where there are no directions to be had. (This came up in discussions after the famous Mark Rober Wile E. Coyote video, which still had value even if it didn’t test Tesla’s end-to-end system.)
So – you’ll never be able to test how an SAE Level 2 driver’s aid will respond in a real-world situation if you don’t test it on real-world roads. That’s what DCAR set out to do, and the result is once again quite spectacular. (And the same caveat as last time applies: these aren’t actually driverless systems, like Waymo’s Level 4 system, but rather driver’s aids that still require an attentive driver in the seat.)
This time, DCAR shut down two different segments of road: a massive, complex roundabout and another segment of road with a few unsignaled intersections and a long straight.
The first four tests incorporated portions of this huge roundabout, which would be complex even for human drivers, but each scenario had quite an obvious solution: don’t hit the car or pedestrian in front of you.
These four tests consisted of:
1. A vehicle is stopped in the left lane at the entry to the roundabout, obscuring an oncoming car in the lane you are trying to merge into.
2. Trying to merge left through a line of cars, in order to make a left turn to escape to the center of the roundabout.
3. Driving through the center road of the roundabout, where two scooters stop in the scooter lane to yield to 4 children, who run out in front of the car (this test was preceded by a sharp U-turn, and some vehicles failed to even enter the testing area because they disengaged during the turn).
4. A broken down car in the center lane of the roundabout, with a warning triangle set up.
Admittedly, this is quite a complicated roundabout and most of us looking at it (at least from here in the West) probably can’t read exactly what those lane markings mean at first. And the markings also confused some systems – but if you want to offer a self-driving system, you need to be able to handle the roads as they exist.
The SU7 Ultra didn’t crash on Test 1 – because it got confused and stopped at the roundabout entry.
The second location centered on a few unsignaled intersections, with more situations that are dangerous but plausible. The tests went as follows:
5. Just a U-turn. That’s it. This is a freebie… right?
6. Going straight through an unsignaled T-intersection, with a car turning left into your lane in front of you, obscured by the driver’s A-pillar blind spot.
7. Driving straight, with a car reversing into your lane from a perpendicular parking spot or driveway.
8. Driving straight, a scooter emerges from a group of several scooters and changes lanes in front of you.
9. A sharp left at an intersection, with a scooter turning through the intersection in front of you, and a pedestrian in the crosswalk on the other side.
Each of the tests occurred at generally low speeds, which means the systems should have had plenty of time to react and apply the brakes, and braking should be more effective than it would have been in the higher-speed highway scenarios from the first video.
The Tesla Model 3 also avoided a collision – because it drove into the bike/pedestrian lane instead. On the second try, it took the correct route and yielded in time to avoid a collision – though perhaps it yielded a little too much, waiting for the whole roundabout to clear instead of just its own lane.
Despite the lower speeds, many of the cars tended to approach these tests with confidence and aggression, either refusing to yield at all or only yielding at the last moment, to the point where it almost seemed like luck that they avoided a collision. Some cars also exceeded the speed limit, making their job of avoiding a collision more difficult.
Six cars failed to turn left at the roundabout in Test 2, and ended up driving in circles forever instead. Other cars that passed this test were too aggressive, barreling into small gaps when they could have just waited.
Disturbingly, many of the cars wouldn’t even acknowledge a crash when they did get into one, and would continue driving until DCAR’s (brave) human test driver and the host of the video intervened to end the test.
The Xpeng P7+ was one of 11 cars to hit the child mannequins in Test 3
Unlike the highway tests, the urban tests included a wider range of other road users. The highway tests featured a truck and one construction worker, but the urban tests included scooter riders and children – common sights in cities, and ones that should certainly be reflected in the training data that companies use to train their ADAS systems.
Not all is bad – some cars noticed the children early and slowed pre-emptively, then stopped when the kids darted out.
And these are arguably much more important scenarios in terms of human safety. Highways are typically safer than urban driving, and one reason is that there aren’t pedestrians around – so if you hit someone, they’re protected by a big metal box traveling at roughly the same speed as you. A pedestrian or scooter rider has no such protection, and often faces a much higher speed delta, which means higher danger.
Zeekr couldn’t even figure out the sharp right turn to get to Test 3, and ended up in the bushes multiple times.
Even in situations where the cars should have had a clear view of these other road users, they failed to show the caution that should be required of cars sharing the road. A driver should know to pre-emptively be more cautious when there are pedestrians present – especially children. Certain vehicles did show this behavior, but many didn’t.
In the last of 9 tests, the Denza Z9 ran over a child mannequin, then continued driving off in a hit-and-run.
Interestingly, compared to the previous highway test, there was less inconsistency within vehicle brands this time around. Vehicles that use similar ADAS solutions also tended to show similar behaviors on the same test, even when those cars came from different brands – for example, the Luxeed R7 and AVATR 12 took second and third place in the overall standings, and both are equipped with Huawei’s ADS self-driving system.
The Toyota bZ3X hit both the scooter and the child in the final test.
And once again, Tesla did well in these tests, with the Model X taking the top spot, avoiding a collision in 8 of 9 tests. The one it failed was Test 7, the reversing-car test, where it drove through at high speed and clipped the rear of the reversing car.
The Model 3 showed similar behavior on the reversing test, but also failed others (Tests 2, 4, and 5), leaving it behind several other vehicles in the rankings. That means that, if we average scores by brand and rank the brands, Avatr and Aito performed roughly as well brand-wide as Tesla did.
Both the Model 3 and Model X failed the reversing car test, clipping the back of it. It’s the only test the Model X failed.
But like last time, we have to give the caveat that these tests all happened in good weather – and all in the daytime, unlike the highway tests, some of which happened at night.
Vision-only systems like Tesla’s are at a disadvantage at night and in inclement weather compared to systems with LiDAR or radar, and those conditions were not tested in this video. Nevertheless, Tesla still did better than other vision-only systems, and even than those with more advanced sensing hardware, which is impressive (though it was still prone to making weird decisions, like when it tried to take the bike lane above, and on the U-turn test below).
The Model 3 failed a simple U-turn, going from the outside lane to the outside lane for some reason. But it did notice and avoid some sheep crossing the closed road, so that’s nice.
Zeekr performed among the worst, as it did in the highway tests. Xiaomi also had middling to disappointing results – it’s a driver’s car, though, so maybe drive it yourself rather than letting the machines do it for you. The biggest drop in the rankings was the Great Wall Motors Wey Lanshan, which was a top performer on the highway yet scored among the worst in urban driving.
The simple U-turn test proved difficult for many cars, which demanded manual takeover and then just drove off the road when the driver didn’t intervene.
Once again, CarNewsChina assembled a table of the results (scroll to the bottom, past the highway test results), which we link to here as thanks for their work in sifting through DCAR’s Chinese graphics and turning them into a more legible format for English speakers.
Everyone had a different “solution” for the crashed car in Test 4 – all cars avoided a collision, but many really didn’t understand what to do or where to go.
Collectively, these systems did about as badly as they did in the highway tests – a lot of simple scenarios were failed. The tests showed that these systems still get confused by relatively simple situations, and aren’t taking full advantage of the reaction time and all-around sensing they should get from their many sensors and the supercomputers that process them.
Test 8, the merging scooter, had a high failure rate of 62%, despite being one of the deadliest situations tested – and one where lots of training data should be available, given how common scooters are.
In particular, many of the tests involved situations where a driver’s eyes would have trouble anticipating a collision due to the A-pillar blind spot – something that should not hamper a car’s sensing systems, which can be placed so as to avoid blind spots. But many still failed to notice or react properly.
Test 6 should have been easy – just slow down a bit to avoid the car turning left. A-pillar obstruction should not affect the car’s sensors the way it does a driver’s eyes. But many cars refused to yield – even if this one was “the other car’s fault,” a simple brake tap would have avoided a collision.
As in the last video, DCAR interviewed Lu Guang Quan from the Beijing University of Aeronautics and Astronautics. He once again pointed out that ADAS systems trained via machine learning can pick up poor behaviors from their dataset, and that these can be harder to correct than they would be in rule-based systems.
“End-to-end systems rely massively on samples,” said Lu. “If their training data shows cars often ignoring the rule that vehicles inside a roundabout have the right of way, then the model learns to ignore it too.”
Cars showed poor etiquette – cutting across lane lines at the last moment, cutting in front of other cars, and stopping in pedestrian zones. One car would have racked up 6 points on its license in a single intersection, out of a possible 12 before license revocation under Chinese law.
DCAR noticed that the systems routinely broke basic traffic laws and showed poor driving etiquette. The systems “don’t have traffic laws built into their foundation, nor do they treat compliance as a top priority. It’s like no one ever taught them to follow the rules – and they didn’t learn it from user data either.” (We saw a real-world example of this when Tesla first released FSD in China, and one driver got 7 tickets in a single drive)
DCAR ended the video on a slightly positive note, stating “we do believe China’s homegrown brands will be able to reduce the risk in these scenarios through future OTA updates. For now, the safest approach is still human-machine co-driving, letting ADAS help reduce the risk of collision while the human driver remains ready to take over when the system reaches its limits.”
And we at Electrek will close the same way we did in the last article – we continue to hope this serves as a reminder to everyone who has gotten comfortable using these systems routinely. Urban environments are complex, and the presence of vulnerable road users makes them much more dangerous.
Even though brands now offer ADAS that works on urban roads, you still need to apply your full attention to the driving task while behind the wheel of one of these vehicles – even with the best-performing system here, Tesla FSD, which all of us who have used it (or who watched the video above) know is prone to weird decisions at times, even if those decisions don’t lead to a collision.