According to the Merriam-Webster dictionary, feminism is the belief that men and women should have equal rights and opportunities. That's it. That's all it is. Some of you might be saying to yourselves, "But don't feminists want to destroy us all and hate men and eat babies and stuff?" To which I respond: I think you are confusing feminism with Dr. Evil. They aren't really the same. So here to explain what feminism is really about is the brilliant Laci Green.
Posted on: Thu, 05 Jun 2014 01:07:13 +0000
