If the entire framework of political discussion in America has been dragged rightward by an increasingly extremist and influential right wing (and it most certainly has), then it follows that self-described "centrists" have all, in effect, become Republicans.
This is also a byproduct of a Democratic Party enthralled by corporate considerations. As it drifts ever further rightward, it pushes the center along ahead of it.
No?