That sounds more like an issue of not correctly communicating the detected problem to the humans.
In any event, a situation where humans misinterpret the data and act illogically as a result will always produce failures, at least until we stop allowing humans to be pilots/operators at all.