What it actually means, how it’s evolved over time, and what’s at stake without it.
Andrew Richard / BuzzFeed
Body positivity is unlearning the idea that only certain bodies are worth acceptance and praise, and instead recognizing that all bodies are equally valuable. It's deciding what feels good and healthy for you personally, and letting other people do so for themselves. It's understanding that you deserve to live in your body without receiving the prejudice of others (whether that means rude comments, reduced economic opportunity, inadequate health care, or something else), and working toward a world where no one's body is the target of such bias.
Listen: We're all humans, and we've all been affected by the world around us. It's completely normal and okay to have bad body-image days, and beating yourself up over it will only make you feel worse.
All you can do is try to be kind to yourself (maybe some of this stuff will help), and always stand up for other people who are being treated unfairly because of social attitudes toward their bodies. And if loud, proud body love feels too awkward or performative an act for you personally, you may want to investigate body neutrality.
from BuzzFeed - Style http://ift.tt/2dZb5Sl