Here in Australia we get some American news. Katie Couric, the news anchor, just said "The American flag is the symbol of freedom around the world..." and went on to talk about terrorists and such. I've heard several US presidents talk about how the USA is the world's champion of freedom and democracy; I've heard them say the world looks to them for leadership in such matters.
Does anyone actually believe that? Or is it just rhetoric to make the American public feel better?
Personally, I see the USA as the leader in joint military actions that benefit it and its allies (including Australia). These actions often have positive results, but often negative ones too.
I'm not posting this as a simple USA-bash. I'm really curious whether any Americans in particular believe such stories.