As a product of American education, I can say resolutely that no, that was absolutely not taught.
Of course, this is partially because American education sucks and partially because we never HAD common land here: everything was privately owned after it was stolen from the people who already lived here, and then much of it was worked by enslaved people, who had no say in the matter, for the benefit of the people who stole the land.
Of course, this is ALSO not really taught, because it'd make people feel sad and make the US look kinda bad, so while it gets mentioned, you get maybe a week of coverage on both subjects, at most.
It’s all but against the law in Florida (maybe other states as well?) to teach that aspect of history. Wouldn’t want the white kids to feel guilty for being white… because they know about things that happened in the past.