Trying to meet the AI challenge: Part 2

As I noted in this post at the end of December, I was studying ways to teach my students how to use AI (artificial intelligence; in this case, specifically ChatGPT). However, “teaching them how to use” is a misnomer because I quickly discovered that students are already using it. And I was naive to think otherwise.

That discovery meant that my lesson preparation shifted from presenting the opportunity of AI to focusing on the ethical and practical uses of AI in writing. In a course like Essentials of Written Communication (business writing), students need to learn to use AI well before they get into their internships or jobs. In fact, the ability to use AI, write prompts, and still understand AI’s limitations will be a requirement for employers.


But first … let’s understand what AI can and can’t do. I had the students read the article “The Good, the Bad, and the Ugly of AI Writing.” Here’s a synopsis:

AI can be good and helpful because it can save time, thus making employees more productive, and it can help reduce human error. (For instance, even using a program like Grammarly or the Editor in Microsoft Word is using AI to help clean up writing.) I want my students to see that AI can be helpful with time-saving idea generation, editing, and yes, even some first draft preparation.

For my students, it will be important to learn how to write good prompts so that AI can generate some initial ideas for drafts of documents they will need in their future jobs, help them with research, provide guidance in making sense of statistics, and so on. Learning how to use AI well will be a timesaver, freeing them to focus on other important aspects of their jobs.

But …

AI can be bad for myriad reasons. It is only as creative as what is already “out there”; thus, it has no real creativity or originality. It simply draws on what others have done and posted online. (As a writer, this for me is the nonstarter. I refuse to let it do any writing for me. Indeed, I am writing this blog post all on my own!)

Students who use AI to write their papers end up with writing that (ahem) is often easy for teachers to spot. (Not always, but often enough.) One of the key assignments my colleagues and I use is having students write something in class at the beginning of the semester, which gives us a sense of their writing ability and style. That provides us with a benchmark to work from. And we check all quotes and sources to make sure they exist and are correct.

Students must never depend on AI to deliver a final product. It needs their human eyes and human voice. Thus, they need to know how to edit, what to look for, and how to take what AI gives them and polish it.

But …

AI can be ugly because it doesn’t have ethical standards. It’s happy to write pages of uninspired, generic material, make up quotes, make up sources, and make up statistics, all while using everyone else’s ideas that exist in the online world.

My students need to find the uses for AI that are helpful but not unethical, such as brainstorming, clarifying material that is difficult to understand, and even helping with foreign language learning. As a professor, I have had it create games to make the point of a lesson, give me case studies to use, and even advise me on how to simplify a concept for my students.

For instance, one assignment we did in class was to have each student write an email to a prospective student. We discussed audience, format, and structure of good emails. We put the characteristics of our target audience on the board.

I gave them a worksheet on which they did the following three activities — all three of which would be turned in. First, write an email in class, without any kind of AI help. Second, create a prompt, put it into ChatGPT, and then copy and paste both the prompt and the generated email onto the worksheet. Third, create a final email, starting with the original and incorporating anything from the AI version that seemed helpful (and highlight those additions).

I was pleasantly surprised to discover that, in most cases, the first email written on their own was just fine, but they did at times incorporate another point or even a particular phrase that they liked from the AI-generated version. And I did ask them to run their final version through the Microsoft Editor (yes, still an AI) to help them clean up any grammatical or spelling errors.

I’m hoping they learned from this lesson that one helpful way to use ChatGPT is to write a first draft on their own, get a bit of help from AI, and then adjust their final product if the program did indeed give them something useful. And then, of course, to let it help them check their grammar and spelling.

In other words, it’s only supplemental, not the final say.

Stay tuned. I’m still learning and working …

My next teaching challenge: The AI effect

The chair of my department came into my office and said, “Linda, brush up on AI. You’re going to need to teach your students to use it.” He was referring to my Essentials of Written Communication class, a class where I teach the format and strategy of different types of writing that are important to students’ lives both on campus and beyond, in the business realm.

And I must teach my students to use AI platforms (such as ChatGPT) effectively and ethically to truly prepare them for their future careers. The world is heading in that direction, and they need to be ready.

I have to admit, I’m a little worried. I’m a veteran of the 5-1/4-inch floppy disk era. Even before that, I navigated my way from typing class in high school (on typewriters) to computers with various floppy disk sizes and on through the many, many versions of Microsoft Word (remember when “Clippy” would give writing advice?).

Clippy, courtesy of Mental Floss

And then came email (woo hoo! Files could be attached and sent instantaneously) and the Internet. So far I’ve managed to move through these past 50 years of my writing career with a minimum of turmoil.

But I have to admit that the world of Artificial Intelligence is setting me back on my heels a little. While many of my colleagues have embraced and are using it well, I’m setting aside January to catch up. Here’s my reading material, Teaching with AI.

I have my concerns. As a writer, I’m honestly worried about my students deferring to AI and not understanding the creativity needed for every kind of writing — an email, a news release, a report. Having an AI just generate these annoys every part of me.

But I’ve been teaching long enough to know that writing doesn’t come easily to everyone. And teaching my students to use programs like Grammarly or Microsoft Word’s editing tools on their papers means I have already been teaching them to use AI.

So as I prepare for my spring Essentials of Written Communication class, I will be rewriting my curriculum to continue to teach the formatting and structure of various types of writing, while planning for students to use AI. I plan to create assignments for them to write and then edit with AI; I’ll be showing them how their human touch is still vital to anything they use AI to create; I’ll be talking about the ethical use of AI so they understand its creative limitations (and potential for plagiarism).

As the book says, “It is essential that educators start to talk about these issues with students. If we want students to use AI responsibly, both in school and beyond, AI ethics must be baked into curriculum and include AI literacy, an emerging essential skill” (3).

Do you use AI? How has it helped you? What concerns do you have about its use?