Women and their families deserve a fair shot so that they can get ahead and not just get by.
The role of women in the United States has changed dramatically over the past few decades. For one, more and more women have taken on new responsibilities outside the home by joining the paid workforce. While women made up only about one-third of the workforce in 1969, they make up almost half of all workers in the United States today. Women are also stepping up to lead the country: a record number of women ran for public office in 2012, and a record-high percentage of women now serve in Congress. Beyond economics and leadership, women have made progress on health issues, which affect both their personal well-being and their economic security. Over the past few years, the Affordable Care Act has enabled women to gain free contraception coverage and end gender discrimination by big insurance companies.
Read your state's report here.