posted on Jun, 17 2006 @ 05:20 PM
When I was growing up, my parents, especially my Dad, stressed the importance of honesty and integrity. Nothing was more important. At 18, when I
went off to school and even more so afterwards, I embraced this concept, and besides maybe the ladies and the beer, it was my guiding light.
Up until about 5 years ago, this seemed to work very well for me and it seemed that most of my peers had the same idea. Since then, I'm not so sure.
I'm hoping to find out if others see this or that most of the rest of the world has avoided this situation.
My frame of reference is a little unusual because 6 years after I started my first real job, I left, only to return 5 years later. I've now been
back for 4, and the changes are clear as day. Now it's as if the truth only applies if it benefits the individual or the small group the individual
is a part of.
Finger-pointing is the norm when problems appear. I'm discouraged from accepting responsibility for my/our screw-ups and openly discussing solutions
with those outside our little group. Looking like we're working hard and knowing what we're doing has become more important than actually doing
so. Those who do nothing get rewarded just as much as those who do their own work and pick up the slack left by those who don't. And I've become
the troublemaker for trying to point out these things.
Please assure me that my situation is an exception to the old rule of honesty and integrity.