So man oh man. I came across an article from the New York Times about how much breast implants have increased in the last ten years. A lot. Up 39 percent since 2000, to be exact. Maybe spread over a decade that doesn't sound like a huge jump, but it still seems like a huge number to me. It begs the question: why? Plastic surgery carries serious risks, and it's not like there's any medical reason (that I know of) to get breast implants the way there is to get a breast reduction.
Why do women feel the need to make their breasts bigger? I feel like this represents an underlying trend in society not to be happy with our body image. You know? There are always bigger breasts, longer legs, better hair, a thinner figure to chase. But why? Why do some women feel such pressure to look a certain way that they're willing to undergo surgery, and risky surgery at that? What is it about our society that makes women feel the need to do this? To risk possible death just to have bigger boobs?
What would it take to end the hate about our body image? What would it take for women to love themselves? That's the question I want answered.