If you heard someone today claim primary allegiance to a state or region over the United States, how would you feel about that?
Have we completely given up any higher allegiance to our states and regions in favor of Federal authority?
I believe that the ACW/WBTS had such far-reaching effects that the pre-1860 and post-1865 Americas are almost unrecognizable one to the other.
Let's try to list some of the pre-war views versus post-war views of America.
First of all, I believe the idea of state sovereignty, which was held sacrosanct prior to the war, was thrown out like "the baby with the bath water" due to its use by the Confederate States to defend the institution of slavery. Today, no one can invoke "States' Rights" without hearing terms like racist, bigot, or Jim Crow. Had the Civil War not been connected with the defense of slavery, would we honor state sovereignty more today?
Let me hear what you have to say about this one point, and suggest some other changes that were by-products of the war.
Jim