Why The Left Needs Us All To Believe The United States Is Racist Forever
That America is a racist country is the great self-evident truth of the left and of the ruling class whose moral opinions are shaped by it. This truth is self-evident in the sense of being readily apparent to them, as…