For a couple of years I tried various barefoot-style shoes and sandals for everyday life, except I'd often find myself still heel-striking as if I were wearing a regular structured shoe, which meant pain in new places
But lately I've just ditched the footwear altogether, hobbit style
(Not when in public, on sidewalks, or in stores)
Now my feet generally seem to feel better, and I'm becoming a bit tougher about stepping on gravel and other painful stuff underfoot
But I especially like the primal, ancient sense of the whole thing
can you tell i was raised by kind-of hippies? anyways, whenever i'm barefoot, especially outside in the rain, on the beach, or at the park, i feel better. i believe deeply in decorum, so of course i'm going to wear shoes everywhere, but i think people would feel better if there were more opportunities to be barefoot. i mean this genuinely
Why is this such a lost art? Bring together those friends who would never have met otherwise. Make the people you love, love each other. I recently had a small birthday wine n dine with my friends and anticipated awkwardness between my different groups of friends, but it was the complete opposite. God, I love my friends, and I love making a space for them to mingle and laugh in eeeee. At a time when my friends and I are transitioning to adulthood and focusing on careers, this gives us something to look forward to and a way to create a world outside of work and study.