Simply because it’s not the job of neuroscience. We cannot understand how software works by mapping the regions and electrical circuitry of a PC’s components. We need the language of the mind, and its symbolic structures, to understand the workings behind thoughts and emotions; we need software people. Everything we learn, hear, see, feel, or experience is translated into neural activity: new connections, new logic gates, new switches. (Our computers may not physically rewire themselves as they run their software, but that is beside the point.)

In one sense, it is impossible to draw a sharp line between software and hardware, since each requires the other to exist; the same is true of the relation between mind and brain. On the other hand, we also know that each operates according to its own logic. We cannot figure out how an HTML page works by observing transistor activity, just as we cannot tell “how” we learn things by counting neural pathways. That would be like trying to understand how a city works by counting the passing cars, measuring the energy it consumes, or mapping its buildings and roads.

Neuroscience has the full capacity to replicate the brain artificially, and it might eventually solve many problems such as Alzheimer’s or stroke, but this kind of knowledge hints at nothing about the “why” question. I imagine a software project that aims to mimic the mind would provide better insight into how the brain works: a totally different approach, free from the hardware architecture, focusing on the data processing rather than the processor.