Women doctors were still rarely hired by existing hospitals, so they founded their own, often providing much-needed services in poor areas.
In the 1890s some of the leading men’s medical schools, such as Johns Hopkins, began to admit women; partly as a result, all but three of the separate women’s medical colleges had closed by 1903. With those colleges gone, few women doctors could find teaching jobs.
The majority of women in the medical field worked as nurses, and many new nursing schools were established.
In 1897 a national organization (later called the American Nurses’ Association) was formed; black nurses, however, were effectively excluded and had to establish a separate group of their own.