When Alabama alone is portrayed as a racially divided, pro-segregation state, it's unfair. If Hollywood wants to cast judgment on the South as a whole, there are plenty of other states to choose from, or it could lump us all together; but it's only Alabama that keeps being shown this way.
No offense taken, seriously. Hell, I'd rather Hollywood and the media focus on Alabama being full of corruption (because unfortunately it is). Anything is better than presenting the state as if we're still the same as we were 50 years ago.
And since I wasn't around in those times, I think that 50 years later I should be able to watch a TV show about my state without seeing a bunch of racist rhetoric bandied about.