I’ve thought long and hard about a post I want to write to all you women out there. So bear with me, this might get rough.
I come from a very small, conservative town - one that possesses many qualities I’ve come to love, yet one with many dark shadows lurking behind southern charm and forced smiles. I grew up in a slightly less conservative family where women, especially my mother, were (and still are) strong, independent, determined individuals who refused to let just any man take care of them. I was always taught that men and women were equal and that all people, regardless of skin color, were equal (granted that they were good, hardworking citizens). Of course we were taught about slavery, abolition, segregation, and the Civil Rights Movement in history class, along with the fight for women to vote (which is all I can remember learning about women). I was content knowing just these few things.
But there were so many things I didn’t know, at least not until I got out of town and got into college.
You see, I had no idea about the history of the fight for women’s right to birth control. I hadn’t even heard of Planned Parenthood. I mean, I knew that women could get an abortion, but I couldn’t fathom the idea of women being involuntarily sterilized at the same types of clinics just because of the color of their skin or their social class. The sex-trafficking, the domestic abuse (in all forms of relationships, mind you), the perilous image we as women allow to be placed on ourselves as a beauty “standard,” all started piling up in my shocked face.
"How could they do this to us?" I would say to myself.
How could a woman be denied her right to her own body? It just didn’t make sense to me.
Where was the logic in taking away a woman’s ability to bear children? And even then, where was the logic in taking away women’s ability to control the birth of children - the very thing they were trying to get rid of?
Where is the justice for those ugly souls who drag women unwillingly into the night streets, trading them into sexual slavery for money? And better yet, for those who choose to beat their women into submission?
Why is there such an unhealthy emphasis on beauty in this culture? Don’t these women know that they’re beautiful just the way they are? That they don’t have to fight hunger, and that they don’t have to purge their way to beauty?
I used to go through life naive and ignorant. I used to believe in the saying, “Ignorance is bliss.” But how can anyone live knowing that these wretched things are happening to the numerically dominant gender of the nation? If I’m to be completely honest, I can’t live like that anymore. I know so much now, and I want to DO something about it!
Here, take a look at a few womanly facts:
How about some videos? These are centered more around racial issues with women, another topic I’m quite passionate about (that will be addressed later).
http://www.youtube.com/watch?v=RwA8p3CJZWY (only a snippet, but do more research)
Maybe I’m just blindly raging. But my eyes have been opened, and I am pissed off.
Dammit, I am a WOMAN!
For all of you struggling with the same issues I’ve mentioned, I’m with you - wholeheartedly, I am with you. For those of you who have been blinded by the same societal light as I was for so long, I’m with you. Together, only together, lovely women, can we change this world from how it IS to how it SHOULD BE - a place where men and women are equal, where we do not have to hide, a place where we can be free.
I’d like to know what you all think of this. So please, feel free to respond, message, whatever. I’m here for anything.
Yours in the blasted struggle,