Racism is a glaring problem in the United States. It has been interwoven into our nation's history and foundation through slavery and the oppression of minorities, though this history in no way justifies or excuses it. I believe that, given the direction our country is currently headed, racism is being fed and even encouraged.
Racism, as defined by dictionary.com, is the belief that race accounts for differences in human character or ability and that a particular race is superior to others. Several forms of racism exist in the United States today. The most obvious, and perhaps one of the oldest, is racism directed by whites toward African Americans. Many white people have believed that they are the "pure" race, that African Americans come from the devil, and that God "intended" for African Americans to be slaves to whites because they are a less evolved form of human being.
Other types of white supremacy racism have been felt by Asians, Native Americans, Hispanics and Latinos, and Middle Easterners (especially those of Islamic or Jewish faith).
Some would venture to say that the prejudice many whites hold toward these groups is deserved. After all, the Japanese wanted to see us dead, Muslim extremists killed thousands of Americans, and Hispanics are taking jobs that "belong" to the American people. Even where such claims contain a kernel of historical fact, they do not reasonably explain or justify racism. They are petty excuses people use in order to avoid being thought of as bad or evil. Furthermore, there will never be an excuse or any logical justification for why whites slaughtered and massacred the Native Americans and then proceeded to steal their land.
We see historical racism both in slavery as a practice and in the laws surrounding slavery and the oppression of blacks. Slavery itself was obvious racism...