The belief that the South is the only region of the United States with racist ties creates a false narrative of American history, according to Karen Cox.