
From my personal experience, from high school to college, women do encourage and cover for each other when sexually harassing or assaulting a man. They cheer when a friend slaps a man's or waiter's ass in a club, and they laugh at the personal and intimate details about their boyfriends that they share with each other. This has become a part of that "female empowerment" mindset. For the moment I am mostly friends with guys, since I work in a male-dominated field (and have cut off the female friends who acted like that), but I find it extremely worrying for my future children.
The worst things I heard from women were in college, when stories about boys raped by their high school teachers were published. From "they should be happy" (as if men were animals in heat) to "at least they know what it is like to be a woman" or "men have done it for thousands of years," not to mention "it is okay when it is a female teacher, because it is love," I heard a ton of horrible takes.
Not only should women hold each other accountable, but it is time society did too. Female sexual offenders are given a pass by police, courts, and their social circles. Plenty of people will (rightfully) point out the men commenting "where was she when I was in high school" (though I believe most of them are simply joking). But I see more people shitting on men, using those handful of comments as if they were serious proof of how men think, than people talking about the real issue: women raping boys. I also see the female comments saying it is not the same, that it is getting back at men for oppression, or other stupid takes like that, go ignored.