People keep saying we are a "center-right" nation, and yet polls consistently show voters support Democrats on nearly every issue. So I don't see the evidence for this claim, unless you believe the Democrats are the centrist party and the Republicans the far-right party, in which case narrow preferences for Democrats would actually count as right of center.
I would agree that we are a center-right nation compared to the rest of the world, but certainly not in a framing where Democrats are considered the left and Republicans the right.
Two words: political environment