The Civil War and Reconstruction are widely considered transformative moments in American history. Do you feel that is really the case?
Did the Civil War and Reconstruction actually solve anything?
How were things different after the Civil War and Reconstruction?
Be sure to consider race relations and politics in both sections, North and South.