Sorry to any feminists reading (I know I have at least three of you on here ;), but...
99% of feminism consists of blaming anything and everything on men. There is a "feminist theory" of the emergence of paraphilias that basically says men develop fetishes to excuse the domination of, subjugation of, and aggression against women. It entirely ignores that women develop fetishes too, and... argh... feminism just makes such broad generalizations about men.
I realize that a lot of it comes from radical feminism, but still. I even have issues with the word feminism. ^_^; I mean, if language is a mirror of society, and we change language to create the society we want to live in [e.g. we want equality in the workplace, so we eliminate "chairman" and replace it with "chair", etc.], why are liberal feminists, at least, still calling it feminism? Shouldn't it be egalitarianism?
*bites things*