Right-wingers have had a lot of rhetorical and political successes over the past 50 years, particularly on economic issues. But it's a tremendous achievement that now everybody at least pays lip service to the general idea of racial and sexual equality. For all the awful things mainstream Republicans do, they can't actually come out and say "Obama isn't fit to be president because he's black, and black people are intellectually inferior." And they can't run against female politicians and say "Women aren't capable of handling jobs like being a Senator." They can try sly ways to insinuate this, and they can use all kinds of racial and gender prejudices to support policies that harm the poor or restrict abortion. But things have progressed to where everybody in the political mainstream understands, or at least pretends to understand, that racism and sexism are bad things.
I'm focusing on the political side of this, but the way it appears in the basic nature of social relations between people of different races and genders is (as Amanda discusses) the most important thing. The idea that I'm supposed to regard women around me as owing me some kind of deference just because I'm a guy, or think less of them for not being deferential to men, strikes me as alien and monstrous.
Obviously, it's not that everything is okay now. We've got major problems, among them a dismal economy in the intermediate term and the threat of climate change in the long term. But I don't see that these problems are more grand and terrible than things we've faced and triumphed over in the past. I mean, this country had slavery 150 years ago. The progressive movement has made things better in some pretty tremendous ways over the last fifty years (and really, the last several centuries), and it's going to take a lot more than what's happened to make me pessimistic about the long-term direction of things.