How do you stop hating your body when nothing seems to help?
I have struggled with body image my entire adult life. I have tried losing weight, working out, and dressing better, but I still look in the mirror and feel disgust. No matter what I change externally, the feeling stays the same. How do you actually make peace with your body?