What is the role of the women's college in the 21st century? Women's colleges in the US arose in the 19th century as institutions that recognised the need to educate women who were barred from many of the most prestigious universities. From the 1970s onwards, many of those universities began opening their doors to women, forcing women's colleges to justify and fight for their existence. Now that higher education has become far more accessible, many conversations question what significance and role women's colleges retain in the 21st century. These colleges encouraged the woman's voice: they were among the only safe spaces where it was acceptable to speak as a woman.