09 June 2014

Rape Culture? What The Hell Does That Even Mean?

From what I gather, somebody is pissed off about something Miss Nevada said about women learning self-defense to avoid or prevent rape. I can't imagine where the fuck the logic comes from, but somehow there are a bunch of pissed off feminists claiming that in somefuckingway that statement embraces the "rape culture."

I was unaware of such a culture.

These in-no-way-in-touch-with-reality feminists say something about not teaching women self-defense to prevent rape, but instead teaching men not to rape. Which is fucking weird. Does anyone teach men to rape? I was under the impression it was a pretty frowned-upon practice.

And while we're on that subject... when did the definition of rape change so much? Rape is a horrible, violent crime, and I am in no way lessening the severity of it. A friend and I had a conversation about this. He said that in some class or another, he was told "nothing is sexier than consent..." Is that what they mean by teaching men not to rape?

Feminists are fucking weird.

The friend also mentioned something about informed consent: "How do they expect me to know if a girl is too drunk to know she's consenting?" Valid point, but furthermore, why does he have to worry about informed consent when I don't? I can go pick up a drunk dude and bang him anytime, and if he is too drunk to remember giving consent, or to remember me at all, is that also "rape"?

I'm not saying that it's okay. Most of us have seen the douche in the bar who picks up the drunk chick. Her friends may or may not try to talk her out of it, but drunk or not, she makes the choice to go home with the douche. Yes, that makes the guy an asshole, but not a rapist, for fuck's sake.

Obviously, I am not talking about the dickheads who roofie people in bars. They should all be shot.

Also, if a woman can be too drunk to give informed consent, why isn't being "too drunk" a legal defense?

People are fucking weird.