In my mind it is more logical for environmental measures to be taken by a right-wing authority rather than a liberal one. Environmentalism was, in fact, rooted in the right-wing totalitarian politics of 19th-century Europe. The logic was that the masses were too preoccupied with human matters to actually promote change. Here in America, environmental concern is usually associated with leftist thought, and I fear that this is the biggest error. Liberalism has proven incompetent at making the changes necessary for a healthier world because of its obsessive preoccupation with human equality and individualism. By removing more restrictions on man, it assures greater degradation of the earth. I find it increasingly frustrating that no modern conservatives have ever pushed for positive environmental changes. Why?