Women’s rights is the movement to secure for women the same rights that men hold. Throughout history, this has taken the form of gaining property rights; women’s suffrage, the right of women to vote; reproductive rights; and the right to equal pay for equal work. Without the advancement of women’s rights, America would not be where it is today, for many of its accomplishments have been aided by women significant to American history.