Lindsay McMahon
"The English Adventurer"

Do you feel pressure to smile a lot?

Do you think women are expected to smile more than men?

We spoke about this recently, and we then decided it would be a great topic to dive into today.

We’re looking at how women tend to get asked to smile more than men, how this whole trend has evolved, and why this is such an important topic of conversation.

Get Your Transcripts Today!

Make sure you understand every word you hear on All Ears English.

Bring your English to the advanced level with new vocabulary and natural expressions.

Subscribe and get the transcripts delivered by email.

Learn to speak naturally with the American accent.

Click here to subscribe and save 50%

Looking At A Common Trend

The reality is that women tend to be told to smile more often than men.

There may be a variety of reasons for this, but it’s a trend worth talking about.

This may come across as demeaning or disrespectful to women, as it’s not up to somebody else when or if they smile.

Though smiling is such a positive thing, you don’t have to smile just because somebody tells you to.

There is a lot of thought and history behind the idea that women are told to smile more than men.

You can see this represented in this survey that was taken about this very subject.

Ninety-eight percent of the women in the survey had been told to smile at least once.

Generally, the more senior women were told to smile more than less senior women—but the trend is still a very real one for women of all ages.

There are a lot of great points within this survey that are worth checking out, but the idea here is that women are told to smile quite frequently.

There is a lot of negativity around this, as it shouldn't be up to a man to tell a woman to smile, and the reverse likely wouldn't happen very often either.

There Is A Certain History To This Idea

You may ask yourself: what's the big deal?

Why does it matter that so many women have been told to smile by men?

Is this really a big problem?

It matters because of how it makes women feel, and there is also a history behind this trend.

You'll want to check out the article "Why Do People Expect Women To Smile?" by JR Thorpe (July 6, 2017).

In the article, Thorpe explains that women's smiling and laughing were once considered mischievous, even sinful, and could cause problems for them.

Later in the 18th century, women were seen smiling in more works of art, and so this began to become mainstream and accepted.

Eventually, women's smiles were used more and more in ads, where they were considered pleasant.

The word “subservient” is even used, which is quite interesting.

This shows how the idea evolved: a woman's smile went from being seen as potentially threatening to being an outright expectation.

It has however turned into something bigger and a sign of inequality, even in today’s world.

Women may feel they have to smile at work, so as not to come off as “threatening” or other related reasons, both personally and professionally.

The choice to smile or not should be made on your own, and yet many women feel the pressure to smile even when they don’t feel like it.

This is a gender issue within the culture, but it’s also something that can be racially charged as well.

“Women of color are subject to more than just sexist smiling-related baggage in U.S. culture; they are also subject to a long history of institutionalized racism that demanded smiles from people of color.”

This is a huge topic of conversation, and one you can be part of, as many native speakers discuss it now that equality is such a focus within the culture.

Many Women Share This Negative Experience

This may be something that you have had some experience with, as it’s so very common.

We at AEE have had experiences with this ourselves, and it's important to talk about them and share with others.

You may find that other people you talk to, particularly women, have had similar experiences and so it’s an important topic of conversation.

Lindsay remembers being in London and being told to smile when she was out one night.

It was weird, and it made her uncomfortable.

The experience stuck with her, and she still remembers it all these years later.

Are you the type of person who naturally smiles a lot or do you feel like you have to do that because of society?

Is this different in other cultures?

Is this push for women to smile universal?

Do you think this will ever change, or that people will stop asking women to smile?

What do you think is the appropriate response if someone tells you to smile?

This is certainly something to think about as it may happen to you at some point in time, and you want to be prepared for your reaction.

Please avoid telling women to smile as a general rule!

If you are worried that someone is upset, you can ask "Is everything OK?" or "How are you?" But don't single out women, and don't focus on the smile.

There is so much more to discuss here, and it may very well warrant a follow up.


Attitudes toward women smiling have changed greatly over time.

It is not a good idea to tell a woman to smile, because it can be taken the wrong way, and a smile given on command comes across as forced.

Though smiling is a great positive thing, it’s also something that should come naturally from someone.

You never want to make somebody, especially a woman, feel uncomfortable or disrespected by trying to get them to smile when they may not feel like it.

Tell us how this relates to your own culture, as it may be interesting to see if this is a universal trend that has evolved over time.

If you have any questions, please leave them below in the comments section.

We’ll get back to you as soon as we can.
