Need some Inspiration

Hey Guys,

I would like to write an essay on how society puts women under pressure regarding their bodies. I'm asking myself: are trends like #bodypositivity and #bodyneutrality really helpful, or do they just create a new norm of having to appear self-confident no matter how you actually feel? What do you think? Can't wait to read your comments :) Thank you!!
