Black Men in America

We all know what happened to Bill Cosby, but another actor, Morgan Freeman, has just been accused of sexual harassment. The woman accusing Freeman said she was on the set of a movie Freeman was filming in 2015 called "Going in Style." She said that Freeman would rub her back and make comments. But how do we know for sure that he did this? What made her speak out now, after so many years? Maybe it was because of all the attention Bill Cosby was getting, and she thought she could get her share of it too.

But white women in America are known to bring down African American men of high status. They don't see a person, just a talented object. We see this with NFL players all the time: they sleep with white women, and the women cry rape because they didn't get what they wanted or things simply didn't go their way. Morgan Freeman could be in the wrong, and if he is, I hope he suffers the consequences of his actions. But it just seems odd to me that this pattern keeps happening and that, all of a sudden, all these wealthy Black men are being questioned.