More or less every country is open to nudity to some extent (though all have their limits). Europe in particular is very open to it. But America is something else.
Everybody has opinions, and we respect that. Everybody makes personal choices, and we respect that too. But it seems that in America, particularly among far-right Christians, nudity is politicized and treated as a work of evil. Apparently some have even demanded that the Statue of David (in Italy) be covered up. It's like they disregard the Adam and Eve story, and the fact that we all came into this world in the buff.
You can go to a beach somewhere in Europe and nobody flinches at a fully naked person, man or woman. On a beach in America, though, a young woman whose swimsuit bottom accidentally slips down could get reported by an "offended" Karen and arrested by the local police.
Now, deliberately being nude in a public, family-friendly place and masturbating or having sex there, THAT'S a different story.
Anyway, what makes so many Americans opposed to nudity, even when it's legal and harmless? Is it a throwback to the Puritan days?