Given all the micro- and macro-aggressions women have to face on a regular basis, being told to smile may not seem like a big deal. But it is. Here is why.
MORE: 7 Behaviors That Betray A Toxic Person
Men feel entitled to things women would never dream of demanding. Telling women to smile is one of them.
Let’s be clear from the start: men telling women to smile is a power move. What they are actually saying is that a woman should ignore her real emotional state and smile simply because it’s aesthetically pleasing to them.
MORE: 3 Sneaky Things Liars Do To Mess With Your Head
MORE: 7 Signs of A Narcissistic Mind
When they hear this, women may feel threatened or even harassed. They feel as if their worth is measured only by their capacity to brighten a man’s day. It doesn’t matter to these men if the woman in question has had an awful day, or if she is sad or tired. She is expected to smile so that the man feels better.
MORE: 7 Clues Someone Is Trying to Gaslight You
Telling women to smile is a sign of male privilege. It’s an order masquerading as courtesy. The subconscious intention is to turn a human being into an aesthetic object that can be adjusted to accommodate a man’s every whim. It tells women they are not entitled to their own feelings, anger being one of the emotions women supposedly shouldn’t display if they want to remain attractive.
MORE: 5 Reasons Men Should Never Tell Women To Calm Down
How much control do men need to feel that they are entitled to “edit” women’s facial expressions? Please share this!