(Before It's News)
It seems they forgot to tell us, as children, that life hurts.
Life will hold painful moments for us.
Life is sad sometimes.
I get it.
Adults don't want to raise us on all these horribly depressing truths.
But I wonder if it's worse to find out on our own than to be warned…
Would someone take a puff of their first cigarette if they knew what it does to their lungs, slowly, over time, and that smoking a pack a day would result in a shorter life?
Would someone trust a friend with a secret if they knew not to trust so easily?
Would someone give themselves, their heart, their body, to someone else if they knew that not every first love chooses to stick around?
How differently would this life be lived if we were warned about people, places and things? We learn about them as nouns in school, but why aren't we introduced to their schemes and wily ways too? Why aren't we warned that life will hurt us even when we have done nothing to invite the pain, nothing to encourage it, nothing to make us want the sharp sting of betrayal, of lost love, of anxiety and depression, of self-consciousness, of fear?
What if we had been warned?