Gender, by definition, is the range of characteristics pertaining to, and differentiating between, masculinity and femininity. Nowhere in that definition does it say anything about the type of genitalia you have. Gender is a social construct forced onto people by a society attempting to make everyone fit into one of two categories, masculine or feminine. Unfortunately, American society has decided that to be masculine means to be a boy and to be feminine means to be a girl. However, that’s not what it means all over the world. In some countries, the women are seen as more masculine, building homes and finding food, the breadwinners if you will, while the men take care of the children and stay at home, taking the feminine traits for themselves.
Gender is the idea that certain characteristics belong to specific genitalia: feminine with vaginas and masculine with penises. Who decided that? I understand that there is biology involved, with men being taller, having a higher center of gravity, and having less body fat, which makes it easier to gain muscle mass. However, women can be strong too. Most of the ideas surrounding what it means to be a man and what it means to be a woman are created by what society wants and expects from us.
Society has told us that men can’t cry, can’t be scared, can’t be vulnerable, because those emotions are deemed weak, feminine, meant for women and women only. They say that men can be angry, and scary even, but it doesn’t matter because they’re showing how strong and tough they are. They’re allowed to be scary because it’s what is expected of them. Women, however, are expected to be soft-spoken, weak, submissive, kind, and every other word you could think of that could also describe a daffodil. Women aren’t allowed to be angry, to be tough, to be strong, because if we are, we’re called bitches or rude. We’re told we don’t smile enough, that we’re too rough and need to lighten up. Men would never be told to lighten up.
Gender is an idea, a box, that society wants to put us into so that all the jobs required to keep a society going are dealt with. Women take care of the children, cook, and clean, while men work with other men, bringing home the money and continuing the manly man’s world. What would happen if we stopped? Stopped being what is expected and turned society on its head? Do you think society would end? Or would it become something worth fighting for?