I have resigned myself to the decline of America. I don’t question this, not even a little bit. Most times, I want the whole damned thing to burn to the ground. I’m fed up with everything that is our society. I hate that we have no cultural values. I hate that we abhor social norms. Progressives have turned us into a nation with no anchor and, damn it, we allowed it to happen.
If I didn’t care, I wouldn’t get so worked up over it. Maybe I don’t care in the same way a woman says “Do whatever you want. I don’t care.”
(Any man with half a brain knows this is a landmine and avoids it at all costs. Once you step on it, no matter which way you go, you’re losing a leg.)
So, maybe I just mean that I’m so infuriatingly disgusted with my country that I want to see it suffer. Perhaps some tough love would force us to alter our course? I doubt even that would make things better. I’m not the only one, either. My wife wants – quite badly – to move out into the middle of nowhere and forget this society even exists. She is a devout Christian, and the erosion of our social and cultural mores has affected her in a big way.
I imagine there are many like us around the country. Honest, working Americans who know a better world is possible but feel utterly helpless to do anything to make it happen. So, what do you think? Am I just fed up and ready to see a cultural backlash force the progressives back under the rocks they slithered out from? Perhaps just angry that people don’t seem to have any shame anymore? That society is broken and will only get worse as we spiral out of control?
What do you think?