Yet this brings me to my point. Images of what? America? Surely this is a country where all women stand up for what they believe in and where every woman is a strong, courageous feminist.
Wrong.
We are still living in a society where women stand for (even long for) the bad boy: the chauvinist cowboy who'll slap her ass and call her "sweetie" when she gets mad instead of taking responsibility. We need to be the change we wish to see, and yet why are we still the biggest consumers of beauty products? Why are there still women who oppose the right to decide whether or not to give birth? Why do we still buy products sold to us with this:
Until we stop supporting this, I can't really claim we're much better than the (other) patriarchal societies of the world.