Should artificial intelligence (AI) be your co-pilot? Or do you consider AI to be a monster out to destroy you? Innovations can seem either exciting or frightening, depending on the amount of control we have over them.
My father, Chester Beaman, was an American diplomat. While my family lived overseas, we had servants. Our maid would do the tasks exactly as my mother decreed. Our cook had to follow her recipes to the letter, or he would be fired. The butler would open the door and announce the person entering. The steward would ensure that my father was properly dressed for fancy events.
I am sure that life has changed for diplomats now. Maids probably are expected to maintain feng shui and create fancy flower arrangements. Chefs, not cooks, must innovate. Instead of butlers admitting all who come, security officers decide who can enter. The steward is replaced with an administrative assistant who briefs the diplomat on proper protocols for diplomatic activities.
Artificial intelligence, likewise, has moved from following simple orders (like looking for patterns) to generating ideas. Instead of breaking codes, completing sentences, or predicting which bills need to be paid, the new generation of AI knows you well enough to suggest gifts, books, and ideas. The report “AI watch: Defining artificial intelligence” describes AI as perceiving, reasoning, processing, and deciding to act.
AI can be very useful in nursing. Some examples include:
- Managing medication
- Charting
- Determining staffing
- Educating patients
- Locating patient information
- Reading ECGs
- Assisting with clinical decision-making
- Getting authorizations
- Synthesizing data
- Predicting outcomes
- Educating staff
- Improving productivity
Educators have expressed some concern about the use of AI. In a recent study by the Chronicle of Higher Education, participants felt that AI would have its greatest positive and negative impacts on teaching and research. The same study found grave concerns about disinformation and misinformation arising from AI. The threats of AI to higher education include privacy, security, copyright, equity, accountability, bias/accuracy, and the replacement of human interaction. The advantages of AI include enhancing critical thinking skills, creating individually tailored learning experiences, and offering help with writing skills. A Loughborough University study found that AI was particularly helpful to students with learning disabilities.
Why do students use AI instead of writing their own work? Certainly, some use it to cheat or to avoid doing their own work. However, many use AI to generate ideas, organize their thoughts, and improve their grammar. Some students also believe that others are using it and that, if they don’t, they will be at a disadvantage. Currently, however, much AI-generated work lacks humanity, does not match the assignment given, does not align with APA style, is not culturally sensitive, and does not sound like a nurse. For example, a nurse would write that they worked “on a unit,” whereas AI would use “in a unit.” AI may improve over time, but in this early form, it does not produce quality work.
Rather than castigating students for using AI, perhaps nurse educators should ask why students are using it. If they need more help with grammar, nurse educators can show them how to use Microsoft Editor, which is built into Word. If they lack confidence in their own writing, then instructors can build that confidence with praise for their progress. If they think that AI can write better than they do, nurse educators should encourage them to become better writers. AI should serve students’ best interests. AI should be the servant, supporting students rather than supplanting them as they build their skills.
Grammarly reports that 97% of employers plan to use AI in the next two years, so students need to be prepared to work in a world that uses it. However, they must remain accountable for writing their own scholarly papers and doing their own original research. Therefore, nurse educators need to set limits on which assignments allow or encourage the use of AI and which forbid it. Students must credit AI when they use it in assignments. AI should always complement their work, not replace it. Students should be empowered to succeed with their own ideas and scholarly writing, and nurse educators should provide the resources and support they need. Collaborations with librarians and IT support will give students additional resources. Educators must also model the responsible use of AI. The classroom and simulation laboratory are good places to discuss the ethics of using AI.
Do you remember the “Miracle on the Hudson”? First officer Jeff Skiles was flying the airplane when it struck a flock of birds and lost both engines. Immediately, the more experienced pilot, Sully Sullenberger, took the controls. Sullenberger safely landed the airplane in the Hudson River, and no passengers or crew were harmed. Skiles, who had just trained on that aircraft, was very familiar with the procedures. However, Sullenberger, a highly experienced pilot, was in charge. I would submit to you that AI should not be a co-pilot or an autopilot. To ensure patient safety, the accountable nurse must be the pilot, and AI can be the first officer, giving information but not making the final decisions.
Dr. Nina Beaman, EdD, MSN, CNE, RN, PMH-BC, RNC-AWHC, CMA (AAMA), is the Graduate Partnership Liaison at Aspen University. She is a member of Sigma’s Alpha Beta Alpha Chapter.