Who started racism in the United States? Racism is the belief that one's race, skin color, or, more broadly, one's group — whether defined by religious, national, or ethnic identity — is superior to others. The history of racism in the American landscape traces primarily to the European colonization of North America, beginning in the 17th century.