butterflygirl24 2 months ago
It was white men who stole Native American land, divided tribes, ripped apart families, brainwashed Native children out of their heritage, raped women and children, and rewrote their shameful, evil history so it seemed like America was white man's land. They can lie about history, but many know the Truth!