My modest knowledge of history suggests that in the 19th and early 20th centuries, both the U.S. populace and the upper social strata called this country Christian, founded on Christian values, and so on. Today, however, both fundamentalists and less extreme folks (not to mention the media and politicians) use the phrase "Judeo-Christian" as a rule. I am wondering when this phrase became the norm in American society. My blind guess: the late 1960s to early 1970s. What happened then?