As women, we must constantly fight for our health rights, among all our other rights. The state of Alabama is one of those terrible states that apparently doesn't want women to have power over their own health. Here are a few articles from the National Partnership for Women and Families. I recommend subscribing to their updates so that when your state is under attack from women haters, you can take action!
It’s aggravating that something as unavoidable as my gender could determine whether or not I have rights. How in the world did we end up in a world where women are considered less important than men? More importantly, though, how can we end this hateful oppression of women?