January 6, 2010

Skin Deep Thought Of The Moment

If the entire framework of political discussion in America has been dragged rightward by an increasingly extremist and influential right wing (and it most certainly has), that necessarily means self-described "centrists" have all become Republicans.

This is also a byproduct of a Democratic Party in thrall to corporate considerations. As it drifts ever further rightward, it pushes the center along before it.

No?