More and more, we're seeing big media figures openly demonizing whites and blaming white people for everything in life they don't like.
I find it almost unbelievable that the people doing this are the same people lecturing everyone else about racism. It seems they think white people are something other than human, and thus anything goes. I've been warning people for years that this WILL get out of hand and will cause violence.
Based on what Democrats have said the last few days, they just need to rid the country of white people and it will be a utopia.
EDIT: They also get into "implicit bias", something that isn't even a real thing. They aren't far off from the Salem Witch Trials at this point.