The media, and Hollywood in particular, represents one avenue through which the general public becomes familiar with the role of nurses. How does the media positively or negatively influence the public's image of nursing? What other avenues might better educate the general public about the role and scope of nursing, as well as the changing health care system?