In animals, sensory inputs are so tightly integrated with brain function that it can be difficult to discern where the sensory apparatus ends and processing neurons begin.
The advantage of that kind of integration became painfully apparent to us in the DARPA Grand Challenge. When a tire blew out, the log files later showed, the steering calibration was suddenly useless as well. Because the calibration was not adaptive, there was no longer any way to avoid hitting other obstacles (and flattening more of those sticky tires in the process). The vehicle finally struck a several-hundred-year-old Joshua tree, at which point DARPA race officials had to hit the kill switch. Until then, more than 80 miles into the race, we had been posting the fastest split times. Just a year before, no one had finished, and Carnegie Mellon had gone the furthest, about 7.4 miles into the race.
If a predator gouges out one of your eyes, you still have a chance to evade or otherwise mitigate the situation, just as a well-engineered mobile AI can compensate with its other sensors should a primary one fail.
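The failover idea is simple enough to sketch in a few lines. The following is a minimal, hypothetical Python sketch of that pattern, not our race code; the sensor names, the read-function interface, and the timeout values are all illustrative assumptions.

```python
# Hypothetical sketch of sensor failover: prefer a primary range sensor,
# but fall back to secondaries when the primary stops reporting healthy data.
import time
from typing import Callable, List, Optional


class Sensor:
    """Wraps a read function and tracks when it last returned valid data."""

    def __init__(self, name: str, read_fn: Callable[[], Optional[float]],
                 timeout_s: float = 0.2):
        self.name = name
        self.read_fn = read_fn        # returns None on a fault or dropout
        self.timeout_s = timeout_s
        self.last_ok = 0.0

    def read(self) -> Optional[float]:
        value = self.read_fn()
        if value is not None:
            self.last_ok = time.monotonic()
        return value

    def healthy(self) -> bool:
        return (time.monotonic() - self.last_ok) < self.timeout_s


class RangeEstimator:
    """Returns range-to-obstacle from the first healthy sensor, in priority order."""

    def __init__(self, sensors: List[Sensor]):
        self.sensors = sensors        # e.g. [lidar, radar, stereo_camera]

    def range_to_obstacle(self) -> Optional[float]:
        for sensor in self.sensors:
            value = sensor.read()
            if value is not None and sensor.healthy():
                return value          # degrade gracefully to the next-best sensor
        return None                   # nothing healthy: the caller should stop the vehicle
```

The design choice worth noticing is that the fallback order and the health check live outside any single sensor driver, so losing one "eye" degrades the estimate rather than ending the run.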
It was a rich experience. After the race, Anthony Levandowski (Google driverless car, Uber) briefly joined our team for the following year's Urban Challenge. He showed us the advantages of the rotating LIDAR. I was recently relieved to read that Anthony, at least, managed to leave Uber without being the subject of a trade secrets lawsuit from Google. Anthony is a bright young engineer and did more than anyone else to make safe driverless cars a reality.