In earlier times, stereotypes pushed men out of nursing. It was believed that women were by nature better suited to caring for the sick, being seen as more affectionate and nurturing than men. Today, more men are entering nursing around the world, as there is a major push to dispel the stereotype that nurses must be women. In fact, men are finding roles in every field of nursing. Nursing is no longer a gender; it is a profession.