10 Industries Where Women Rule

By Selena Dehne, JIST Publishing

Move over, men ... women are becoming a major force in the job market. According to a recent report from the U.S. Department of Education's National Center for Education Statistics, women now earn the majority of degrees in many fields men used to dominate.

Although women have traditionally led the fields of education and psychology, their dominance in areas such as business, history, and the biological and social sciences comes as a surprise in a typically male-driven job market. Even in fields they do not yet dominate, such as math and agriculture, women are making significant strides.