Unlike any other region in the US, the South seems united as a whole. Especially from what I can tell on the forum, there is a really strong sense of Dixie Pride that unites people from Florida to Mississippi to Arkansas. Why is that so prevalent in the South, and nowhere else in the country? (Now that I think about it more, perhaps the only other place in the country that inspires similar pride is NYC.)
I would have to say it's because of the Civil War. At the time they were almost like another country.
Not almost. The South was part of another country.
Okay, good point! The South did successfully separate from the Union, even though it didn't last. My example, of course, wasn't as good as some others I could have used. I know people in the South feel like their rights and their independence were taken from them.
A better example than immigrants would be territories that now belong to the United States. The Hawaiians and Native Americans have a bit in common with Southerners. All three were brought (back) into the Union by force, and the people who live(d) in those areas are very proud of who they are and of what their respective territories, countries, and/or cultures were.