Is it really the case that the movies have portrayed the South darkly? Not ever since Gone with the Wind and Rhett Butler's caustic patriotism. He knew and predicted the South would lose (remember the scene at the Twelve Oaks party at the outbreak of the war, where he says as much to all the other guests, who are in an alcohol-fueled mood of exuberant confidence), yet he still fought for the Confederacy, then quickly made friends with the occupying Union officers, only to use those connections to save that dimwit Ashley from a KKK raid broken up by Union soldiers. He wasn't a negative figure. Neither was Yancy Derringer, or whoever Patrick Swayze played in the North and South miniseries: a Confederate officer, a plantation owner, a proud man, but no whip-cracking monster.
Even in the Spaghetti Westerns, most of the lonely, restless heroes were former Confederate soldiers coming to grips with a lost war. Lee Van Cleef in The Good, the Bad and the Ugly, by contrast, was a villainous and opportunistic Union officer.
It has been a long time since I read it, but there is a good book that addresses this issue: "Media-Made Dixie: The South in the American Imagination" by Jack Temple Kirby. I think he is more optimistic about it than I am, but all of this is subjective anyway. This kind of thing does come and go in waves, although it seems to me that in more recent times things have been more negative than positive. I really don't see anything comparable to "Gone With The Wind" as a movie or Yancy Derringer as a TV series; those were exceptional and positive.

Even in literature (where the South seems to have received a lot of recognition), as often as not you have someone writing about how screwed up things were in the South, the way William Faulkner did. Not that there is anything wrong with Faulkner, of course. I'm just saying there seems to be some kind of craving in popular culture to hear about the dark side of the South. When it's something light, you get something really stupid like "The Beverly Hillbillies" or "The Dukes of Hazzard." Even "The Andy Griffith Show" was lacking in many ways, in my opinion.

In general, when I see something about the South in the movies or on TV, I tend not to like it very much, mostly because it's so unrealistic. Yet people see that and inevitably form an impression from it; what else could they be expected to do? The Dutch, for instance, get really weary of people talking about windmills, tulips, pot, wooden shoes, coffee shops, and so on, and who can blame them? If someone thinks that is what defines the Netherlands, they're not even trying to understand the place. So maybe they get offended when things like this happen, and I don't blame them. I also get offended when I keep seeing caricatures of the South in popular culture.

One positive exception, however, might be the image of the Southern woman, the Southern belle, so to speak. Sometimes the image you see in the movies or on TV is not too far from the truth, and that's a good thing. It's exaggerated sometimes, of course, but there is a lot of truth to the mystique of the Southern woman, in my opinion. I've always been pretty impressed, and I think I'm not alone in feeling that way. There is something about this place that seems to bring out the best in women, and I have to say I honestly don't understand it myself.
Edit: In response to what you said about "The Birth of a Nation" and the reaction to the Civil Rights movement, I do agree. Jack Temple Kirby goes into some detail on "The Birth of a Nation," too.