Nursing has long been a field dominated by women, but that's starting to change, according to a new study. Men made up 13 percent of the nursing workforce in 2015, up from 2.2 percent in 1960. Why the upward trend?
First, the demand for nurses is higher. “As the boomers are aging, they’re needing more health care services. They are now the Medicare population. On average, people across all ages have more chronic conditions, like kidney disease, diabetes, conditions that require more health care,” study coauthor Elizabeth Munnich, assistant professor of economics at the University of Louisville, told the Wall Street Journal.
Second, more men are graduating from high school, and a diploma is a prerequisite for nursing certification and college degree programs. At the same time, medical facilities have expanded across the country, further driving up demand for nurses.
Another factor, Munnich said, is the evolution of men’s and women’s responsibilities at home and at work. “As broad perspectives on gender roles become more similar, and more nontraditional roles become more accepted, more men have joined the field.”
Yet over the last three decades, the share of men working as primary or secondary school teachers has decreased, and the growth of men in other female-dominated jobs has not been as pronounced.