So, I went to college, took all the traditional courses, and read all the books that feminists talk about. And I completely believe every word of it. The sticking point for me is that I realized I don't care about any of it. In fact, I realized I actually like being oppressed and objectified. I like that most people, mostly men but even some women, treat me like I'm less than a man. I like knowing that if I were ever raped and it became a public issue, a large part of the public and the news media would act as though it were my fault. Hell, I wouldn't even bother reporting it, and if it were someone I knew, I'd treat them exactly the same as before and act like it never happened. By this point you might think I'm being sarcastic, but I'm completely serious. I love being a woman in a patriarchal society, and I might even have been happier back in, like, the Victorian era or something, when things like clitoridectomy were still considered a cure for masturbation and promiscuity.

And to clarify: I don't think women are inferior, I don't think it would be my fault if I were raped, and I don't think masturbation or promiscuity are bad things. I mean, if other people want to fight for equality, go right ahead. I'm just saying I'm perfectly happy to continue being valued as nothing more than a life support system for a vagina.