Lately on Facebook we've been talking a lot about breastfeeding in public. Many people in American society find it to be "disgusting," "gross," "nudity," or "insensitive":
Kim: Not going to lie, even as a female I seriously don’t want to see anyone nursing. Can’t they just put in a nice little room in the malls, with some carpet and a few sofas for moms?
Yvette: Some of the biggest complainers about women publicly feeding a child are young women who have never chosen to breastfeed or don't have children. What kind of society do we have when we teach young women to hate one of their own natural abilities? Only in America.
Dana: Yes, nursing is natural. You need to be tasteful though. I wouldn't want my ten-year-old son gawking at it.
Wendy: Honestly, I don't want to see either one in front of my face when I am trying to eat. I'm sorry, but as a non-child-bearing woman with absolutely NO DESIRE to have children, I am tired of self-righteous mothers throwing their breasts in my face everywhere I turn.
These were just a few of the many comments on the images above, but what I'm wondering is how our society got to this point. Why did boobs become so sexualized in America that women can't feed their children in public without risk of being harassed, chastised, or even told to go feed their child in the bathroom? It's not just men saying no to public breastfeeding; it's also a lot of women.
[Image: East Indian, 1950]
[Image: painting by Mary Cassatt]
Why do you think American women’s breasts are so sexualized? When did this happen? Why do you think we have become a society in which this is even a topic of debate, instead of a socially accepted norm for all people?