Does religion even have a role anymore? By the time most kids graduate high school, they are educated enough about science and history to see no need for the standard theological explanations of yesteryear. And morality is no longer viewed as unconditional servitude to a higher will. What good, then, does religion do for our society? If religion ceased to exist tomorrow, what changes could we expect to see in our world?