Why is being in the nude so "wrong" in the eyes of many Americans? I see no problem with it. It's who we are. Many other countries see no problem with it either. Yet many Americans regard being nude as "shameful" and "insulting". Funny how we are the only species on this entire planet that insists on covering ourselves.