posted on Jun, 3 2004 @ 03:55 AM
I've been reading a lot lately about how Bush's war has "damaged" America.
I could accept this if people meant economically or psychologically, but most people seem to be suggesting that the war has damaged how other countries (specifically Western and European countries) perceive the US.
I'd like to offer a little hope.
Just as the Brits are seen as stuck-up but cultured, the French as arrogant but cowardly, and the Germans as rude but ruthlessly efficient, the US has always been perceived as isolationist and greedy. But to be honest, it never really bothered anyone. In Britain, Americans were perceived as fat consumers who regarded the rest of the world as their own private Disneyland: somewhere to visit once and then back to the real world, US soil.
This may sound harsh, but really no one takes it too seriously. The French aren't all arrogant and cowardly, the Brits aren't all cultured, and Americans aren't all fat. It's just a global perception.
However, along came Bush and put many of America's perceived faults into practice, incurring massive casualties on all sides, which made physical our perception of US behaviour and policy.
The good news, however, is that I don't think any country truly believes that the US is at fault. They perceive Bush as taking the US stereotype to the extreme.
As such, once Bush is removed, much of the bad feeling he has incurred will simply disappear. Indeed, most of the rest of the world's opinion of the US will probably improve, simply because the "war on terror" will be seen as the actions of one man whose citizens were sensible enough to remove him (however inaccurate that is).
This is, however, simply my opinion. I don't blame the US for the war on terror; I blame Bush. When Bush is gone, the US won't be "damaged", it will get a round of applause from me and others like me.
What are other people's opinions on the US? Has its reputation really been damaged, or just the reputation of its current administration?