It seems that since the Revolutionary War, the Southern states have had an undue influence on the way American government is run. A major factor in the buildup to the Civil War was the advantage that the South's agricultural, slave-based economy gave slaveowners in terms of representation and political dominance. The South nearly destroyed the Union by forming the Confederacy, and the Dixiecrats nearly derailed civil rights legislation. Even in recent political events, the South's influence is still felt. We rarely elect a president who is not Southern or who doesn't offer a Southern running mate. The whole red state/blue state divide is largely a Southern split. Efforts to end affirmative action or roll back advances gained by minorities mostly come from Southern politicians. The South is, for the most part, the poorest, least educated, least tech-savvy, least progressive part of the nation, yet it has had a stranglehold on American politics for over 200 years. What's up with that? Can it be changed? Should it be changed?