Snow, and winter in general, can be very polarizing. Some people love it; others hate the cold and the logistical problems snow brings. But snow may have a hidden benefit: it could help you feel better about your body.
Being unhappy with your body is more common than you might think. Surveys have found that about one-third of teenagers and adults report feeling anxious about their body image, and in the US the proportion is even higher.
Previous research has shown that nature can strongly influence how we feel about ourselves, and that spending time in nature can lead to greater body positivity. Snow, it seems, can have a similar effect.
Kamila Czepczor-Bernat of the Medical University of Silesia in Poland, lead author of the new study, says that most research of this kind has focused on green spaces, while snow is often overlooked:
“A body of evidence now exists showing that nature exposure – living close to, frequenting, or engaging with environments such as forests and parks – is associated with a range of physical and psychological well-being benefits.
“However, in contrast to previous studies which have focused on the impact of blue and green natural environments on body image outcomes, ours is the first to show the positive impact on body appreciation from spending time in snow-covered environments.”
The research was fairly small, involving 87 women with an average age of 24. The participants took part in small groups, completing a measure of their body appreciation before and after a walk in a snow-covered woodland in the Silesia region of Poland. They also completed questionnaires gauging their connectedness to nature and their self-compassion.
The results show that even a little time in nature (in this case, 40 minutes) can lead to greater body appreciation; the improvement was particularly pronounced among participants who scored high on the self-compassion measure. Overall, the observed effect size was 0.56, a medium-sized effect by Cohen's conventional benchmarks, which treat 0.5 as "medium" and 0.8 as "large".
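For readers curious where a number like 0.56 comes from, the sketch below shows one conventional way to compute a pre/post effect size (Cohen's d for paired scores). The scores here are made up purely for illustration and are not from the study; note too that some analyses instead divide by the pooled standard deviation of the pre and post scores, which gives a slightly different value.

```python
# Illustrative sketch only: the data are hypothetical, not from the paper.
# Shows how a pre/post effect size (Cohen's d for paired scores) is computed.
import numpy as np

# Hypothetical body-appreciation scores (1-5 scale) before and after the walk
pre = np.array([3.1, 3.4, 2.8, 3.0, 3.6, 2.5])
post = np.array([4.3, 3.1, 3.6, 3.1, 3.1, 3.6])

diff = post - pre
# Cohen's d for paired samples: mean change divided by the SD of the changes
d = diff.mean() / diff.std(ddof=1)
print(f"Cohen's d = {d:.2f}")  # ~0.55, close to the study's reported 0.56
```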
Co-author Viren Swami, Professor of Social Psychology at Anglia Ruskin University, said:
“Natural environments help to restrict negative appearance-related thoughts and shift attention away from an aesthetic view of the body and toward greater appreciation of the body’s functionality. Positive body image is important not only in its own right, but has other beneficial effects, including more positive psychological well-being.
“Our findings demonstrate the importance of ensuring that everyone can access restorative natural environments, which may be a cost-effective way of promoting healthier body image, and highlight that there are significant benefits of being outside in nature, whatever the weather.”
So if you’re feeling bad about your body, you may want to spend more time in nature. Whether it’s a snowy landscape, a forest, or just a nearby park, regular walks (or even the occasional one) can make a real difference to your physical health, mental health, and body positivity.
The study was published in the International Journal of Environmental Research and Public Health.