Why does the US Deep South have such a bad reputation? What is it actually like to live there? Are there any metropolitan cities?
For example, I've always considered Alabama to be synonymous with countryside white trash, trailer parks, etc. But now that I've googled a little bit, they actually seem to have some pretty nice cities too.
I know that, according to the statistics, these states are poorer and have lost people to internal migration within the US. But on the other hand, many developed areas of the world that are considered good places to live are a lot poorer. Also, just looking at GDP, unemployment, education, etc. doesn't tell you everything. Cost of living and the general atmosphere matter a lot too.
I've also read, dunno if this is true, that race relations are nowadays actually better in the South than in the big cities of the North and West.