So… I was talking to my friend the other day about female bosses and women in the military… I’ve often heard that “the women are even worse than the men.” To me, this proves that women are constantly questioned about whether they are suitable for powerful or authoritative jobs, and that they feel they have to prove themselves to their male colleagues just to be respected and taken seriously….
So we either don’t make it to the top, or we have to be “ice queen bitches”? Are those our only options?
It seems I have reached a problematic subject in my academic career. I am taking “African-American perspectives on crime,” a class taught by a black male professor. The whole class is about black masculinity and how black men are unfairly tried in the judicial system (which is of course a problem). HOWEVER, black women are almost never talked about, and when women are even INCLUDED in the discussions or the films we watch, they are sexual objects, in domestic positions, or called a bitch or a hoe.
My professor even said that if a woman lets a man call her a “bitch” or a “ho” then there is nothing that can be done about that.
He also glorifies male athletes…. which, considering the gender gap in sports, leaves women out ENTIRELY.
Any feminist voices with other opinions on this subject? I am also looking for a feminist analysis of the popular TV show “The Boondocks.” I would prefer a black feminist perspective, simply because both race and gender are present in this issue.
As a feminist, and one who openly claims to be, I have faced discrimination. I have been:
Assumed to be gay
Assumed to hate men
Assumed to be radically sexual or radically unsexual
Sexualized and hit on when discussing women’s issues
Not taken seriously
Called a slut/whore
Asked “when are you going to stop with all this feminism talk?”
And finally… told that there is no need for feminism and that we are all equal. What do you guys think?