posted on Oct, 12 2006 @ 08:32 AM
I'm asking this as a question to those who live there. After being on the board now for a while, we hear a lot of country bashing and it is pretty
clear that the American people often get tarred with the same brush as the government.
So teach me! My personal opinion of Americans has always been very positive. During my time in the British forces we were always accepted with open
arms, both by the American forces and by any Americans we met. In fact they were more than welcoming.
When my kids are a bit older, I'd love to do more travelling, but for some reason, whenever anyone has mentioned going to the US in the past few years, I've
always said it's one of the last places I'd want to go. Why have I said this though?
Probably because all we hear about nowadays is the cities, the terror threats against them, and the government that runs the country. That's enough to put me
off a place, or so it would seem.
I know there are some beautiful places over there to see, even in the urban areas. What I'd like is for people to tell me the great points about
where you live: the scenery, the people, that sort of thing.
What would you recommend for a good trip to the US? Forgetting about the world's troubles for the moment, what parts of American culture have to be
experienced, in your eyes?
[edit on 12/10/06 by CX]