Thursday, March 27, 2014

What Feminism Really Means

While doing research for my Junior Theme topic, which explores the stigma that the words "feminist" and "feminism" carry, I came across some disturbing data suggesting that many people misunderstand the word's actual definition. The Huffington Post asked Americans, "Do you consider yourself a feminist, an anti-feminist, or neither?" The response was alarming:



According to the poll results, only 20% of those surveyed considered themselves any kind of feminist. Yet when respondents were also "asked if they believe that 'men and women should be social, political, and economic equals,' 82 percent of the survey respondents said they did, and just 9 percent said they did not". People believe in the idea of feminism, but shy away from the label. Why is that?

Merriam-Webster defines feminism as "the theory of the political, economic, and social equality of the sexes". However, the word seems to carry a different meaning for many Americans: in the same poll, "thirty-seven percent said they consider 'feminist' to be a negative term".

I was initially very shocked by these findings, which prompted me to deepen my research into this misunderstanding. As I continue to explore the topic for my junior theme, I hope to find an answer to the question of why feminism carries such a stigma. In the meantime, what are your opinions on this issue? What is your explanation?
