Who to Blame

Is it just me, or has there been an uptick in a certain, er, variety of social-conservative thought (and I am using the word "thought" somewhat broadly) that seeks to lay all the blame for America's ills on feminists and gays? Oh, and for those awkward moments when bashing homosexuals isn't socially acceptable, there's always some other group of queers you can pick on.

Listen, I don't mind sitting through all the sexist, homophobic drivel about "feminized society" and "feminized men," but don't expect me to get on the program with you.