7 Ways Feminism Is Destroying American Women
1. It makes women think that all they have to do to succeed in life is show up. Today’s woman believes she deserves the fruits of life simply because she is a woman: that since her gender was “oppressed” for so long, it is time to receive reparations through generous societal benefits and advantages in education at the…