The Southern states that seceded from the Union were merely attempting to secure the rights guaranteed to them by the Constitution. With the election of Lincoln and a wave of Radical Republicans, they no longer believed those rights would be protected, and so they seceded. It wasn't long before the Lincoln administration lived up to its advance billing and began circumventing provisions of the Constitution.
However, Lincoln and the Radical Republicans who came to the fore by the end of the war were the ones who won it. Is it really any surprise that their view of the war's history (i.e., that it was fought to free the slaves) has become the predominant view in mainstream academia? It is vital that the war remain an important event in the lives of the Southern people, because that is the only way some historical balance can be achieved.