Slavery, the genocide of native populations, proxy wars, and the funneling of hard drugs into Black communities to destabilize them have also all been part of American culture at various points in its history. Is something good simply because it is part of American culture?