Over the past week I mentioned how we are starting to be racist toward white people. I also think we are becoming sexist toward men.
First, racism toward white people. In this society we are taught that white people are the oppressors and minorities are the poor, helpless victims. If you find this racist, then you are a logical person. Yet many people still believe that white people have to atone for their ancestors' sins, and that white people, as a whole, are evil and cruel. None of this is considered racist, but the moment someone points out that a person is black, people jump up and yell "Racist."
In school, we learn about how men have screwed over society. But that is okay, because men are inherently destructive and cruel (sarcasm). Teachers in school clearly favour girls over boys, so girls are better, right? Not really, but this is what we are taught. If you disagree, please comment below.
Overall, our society is reversing itself. We are now racist toward white people and sexist toward men. This is an especially unfortunate change for me, since I am both male and white. Oh well, I guess I have to pay for the actions of other people's ancestors (my family is an immigrant family, so we never screwed over black people or women).