Why is women's nudity more understood and seen as natural? I mean, men and women are both human beings, both animals of the same species, so why is it that when an actress takes all her clothes off as part of the story, people don't see it as wrong and accept it as natural, but when a man does the exact same thing he is seen as bad? I'm going to quote an actor who went full frontal in a movie he did back in 2016, and this is what he said about it.
He said "In this movie, there is so much beheading and violence and yet people want to talk about my penis, and I think taht says something about society where people get disemboweled but itis the man`s junk that is of interest.. the idea that underneath every President, Prime Minister and ruling leader that we give power it is jsut an animal and if you stripped all their clothes they would be hunting and gathering and fornicating and pooping. We are just based animals, I like being objectified, I believe there is a double standard in Hollywood regarding this issue, why is ok for a woman to do it and not a man to do it?" Is it because we expect that a woman is the one who has to be naked but noone expects a man to do the same?"
I mean, we are already used to seeing men topless and in shorts; that is normal and usual at the beach or while exercising, and nobody thinks twice about seeing guys like that. But take just the shorts off and people are appalled to see the man's junk or his butt, because it is "gross and disgusting and nothing interesting to see." Those are the comments I have heard from women and even from men.
So is nudity a bad thing when it is part of telling a story? Is nudity not a normal thing then? I'm not talking here about porn or adult movies; that is a different thing. I'm talking about regular movies.