Yeah, the healthier you are, the better you feel physically and emotionally. I used to deal with a lot of body image issues too, between my scars, eczema, and being overweight, but the healthier I get, the more I learn to love my body, even as I keep working on getting healthier still. (Especially since I'm into calisthenics, learning to pull off awesome feats makes me feel proud of the body I'm living in.)