August 6, 2009

Thoughts on Birth and Disease

Recently, it seems like everyone around me is pregnant. Because I have an extremely large family, all of the stories and details are readily available to me.

I've never been pregnant or given birth, so that puts me on the outside of this issue looking in. But it seems to me that we have completely lost touch with the natural process of birth.

For years now I've been contemplating this issue and have been extremely interested in alternatives: midwifery, water births, and so on. Recently, I watched a documentary called "The Business of Being Born," which renewed my interest tenfold. I myself will probably never reproduce, but I feel it's important to spread the message to others who will.

"The Business of Being Born" shows how our society has taken a natural process and authentic rite of passage, and turned it into a medical emergency that the female body cannot handle alone.

How, then, do these people suppose the human race, nay, life itself, has evolved this far?

I am not saying that the process of natural birth is always easy and pain-free, but I feel it's an authentic experience that most women are missing out on. We've numbed ourselves to the point of not wanting to feel anything. If we feel the slightest discomfort, we take a pill. If we experience emotional lows, we take more pills, or we try to drown them out with drugs and alcohol. How can we expect to be well-rounded human beings without being aware of and working through our own processes?

It's sad that the most popular type of birth in the U.S. is the "designer birth": a scheduled cesarean section immediately followed by a tummy tuck. We're literally ripping our young from our bodies, whether they are ready or not. And it's been studied and documented that women who have cesareans are not as emotionally attached to their children as those who go through vaginal birth.

I implore you women who plan to have a child: take back your power, stay out of the damn hospital, and trust your own body. You will know what to do. All of your ancestors did.

If you trust hospitals to heal and save you, you have not been given all the information. I have refused medical care and health insurance for years now. If you can't understand why, look into it for yourself. The medical industry in America is big business, with money to be made. There are people getting fat and rich off of your misery and loss.

If you haven't seen Michael Moore's movie "Sicko," I highly recommend you do. I also just watched a very informative documentary called "The Gerson Miracle," which shows how horrible diseases, including cancer, are cured every single day through natural diet and detoxification. It's interesting to note that this method, though completely natural and harmless, is illegal in this country!

When will we turn ourselves around and realize we do more harm than good to ourselves and each other? The medical industry should NOT be a money-making conglomerate. When did the focus shift away from healing? I wouldn't even go to a hospital to die, never mind to get well. Granted, some good can come from their discoveries and practices, certain surgeries and the like, but the system still needs to be redressed and reworked.