For a couple of years I tried various barefoot-style shoes and sandals for everyday life, except I'd often find myself still striking the ground as if I were wearing a regular structured shoe, which meant pain in new places
But lately I've just ditched the footwear altogether, hobbit style
(Not when in public/sidewalks/stores)
Now my feet seem to feel better in general, and I'm getting a bit tougher about stepping on gravel and other painful stuff underfoot
But what I especially like is the primal, ancient sense of the whole thing
Can you tell I was raised by kind-of hippies? Anyway, whenever I'm barefoot, especially outside in the rain, on the beach, or at the park, I feel better. I believe deeply in decorum, so of course I'm going to wear shoes everywhere, but I think people would feel better if there were more opportunities to be barefoot. I mean this genuinely