African Americans of all economic levels think the rest of America is racist, but isn't it really about reputation? The news media show Black Americans as drug dealers and addicts, fatherless families, and gangsters killing each other. Black leaders ignore it, but the statistics confirm it.
It's not racism; it's the news, the TV shows, and the movies, including those made by Black producers, and the music videos that present themselves as telling the story of Black America.
For most Americans, it's about living in a neighborhood where you can safely walk the streets at night. Can you really blame the rest of America if we don't agree with, or don't want to be part of, the Black culture the media portrays?