Should we make efforts to return Men back to their roots?
What I mean is that today's men are, on average, not as manly, masculine and dominating OVERALL as before.
Today's men fall into porn, submission and oppression; they are lost in their lives, their ways and their dreams. They take pursuing women too far, some would even crawl for women, and never, even in their bravest dreams, have they seen themselves as the ones women would want to go after.
Of course, I am not talking about all men.
But the number of men I am describing is not a minority.
Men also tend to be less educated about women than vice versa. I have seen it numerous times, and I sometimes think it's being done to them purposefully in school. Men are less encouraged than women; we see all those efforts at 'female empowerment'. And yet, despite all this, it reflects and backfires on us women, because many polls tend to show we have been growing more and more unhappy lately.
Perhaps neglecting one gender for the other has led us nowhere.
Should we once again start teaching boys about their masculinity and encouraging it, instead of calling it toxic?
Make them embrace it and improve it on their own. Help them know who they are, identify what they want, and encourage them along the way.
Teach them to be dominant, and to realise that most women actually like that. To be protectors, not just cash providers.
Because when people say things like 'that old man is from the time when men were MEN', it does speak volumes about the path we are walking now, and the direction we are headed.