posted on Apr, 21 2007 @ 02:40 PM
I have always been very anti-gun, but have entered into discussion with people who are pro-gun.
Several times the issue of culture has been raised, and I decided that I needed to be more aware of what US culture is about.
I'm not talking about the TV, Hollywood, European propaganda view of US culture, but what it really is.
I originally posted this question on the "guns don't kill?" thread, but I think that's probably the wrong place for a discussion of this nature.
I'd like any US citizen to tell me what it means to them personally, to be an American, in the hope that I can gain a greater understanding and
perspective of the cultural whole.
Ask me any questions that you like.
What I really want are personal views, thoughts and perspectives — what an individual thinks about their country and their area (city, country, north, south, prairie, mountains, etc.) — rather than soundbites, but all replies are appreciated.