There is a well-documented and widely known history of racism in the United States.
It should be taught, though I don't think it should be the focus of a curriculum. It makes me uneasy that we've reduced this idea to an ideological buzzword.