Compare that to a driver (of any skill level): they are trained to know the rules, and if they get stuck they will figure out what to do in that situation, acting on instinct, training, common sense or a self-preservation instinct. The computers in the car have none of those capabilities.
On the other hand, drivers may misread the situation ahead of them, having only one set of eyes that work only given a sufficient level of visible light, and figuring out what to do may take more time than is available to them (better known as "getting in over their head"). Common sense often turns out not to be sufficiently common either. And while I grant you the self-preservation instinct, it's not rare to see it overridden by target fixation: one drives into what one is looking at even after consciously noticing that it is an obstacle to avoid, like a tree, a large rock or a fire engine. Consider it a kind of brain lock-up.
Driver-assist technologies have been around long enough now that they reliably work as they should; in the 14 years that the Volvo XC90 has been on the market in the UK, it has not been involved in the type of accident its collision avoidance system was designed to prevent. It's the driver-replacement stuff that's not up to scratch, especially when the driver (wrongly) interprets it as being something it evidently is not ("Autopilot").
I wonder if they can even determine whether a sensor is returning questionable or inaccurate data.
The traffic authority's report into the crash where that Tesla went under a truck crossing the lane it was in included parts of the vehicle logs. These show that the system is aware, at least, of broken and missing sensors and actuators.
Unfortunately, to the point that people have come to rely on them in a way that erodes their traffic awareness.
Yes, that same system that Uber saw fit to disable, as they considered it might interfere with their own AV setup.