Why is it that human beings (Americans more than most other societies) seem to struggle with taking responsibility for their own faults and actions? I know there are people out there with the integrity to stand up and say, "Yeah, I messed up and I'm sorry"; but they are so few and far between that they get profiled in "hero" stories on 20/20 or other shows. These qualities seem to have all but vanished from the workplace, and it is wrong that we as a society seem okay with this behavior.
What happened to standing up for what is right? What happened to saying "I'm sorry"? What happened to learning that responsibility, integrity, honesty, and hard work will pay off over lying, manipulation, conniving, and back-stabbing?
Oh, that's right...I remember...the 80's, Reaganomics, cocaine, Wall Street...that's what happened.
I'm bitter and pissed right now: my husband was used as a scapegoat for the "#2" at his job on Friday and is now out of work. My husband is one of the good ones, and anyone who has worked with him knows it. He should NOT have been the fall guy; but he was, and now we have to deal with it.
I utterly loathe rich, lazy liars who don't care about loyalty or honesty and have no integrity. I wish we could zap them all out of existence.