Giving learners guidance on using AI

I’m currently enrolled in the first cohort of Building an Inclusive Quality Assurance Rubric offered by Eduflow Academy. It’s my first time being a learner since the public release of LLM AI tools like ChatGPT, so I wanted to highlight how course facilitator Bela Gaytan incorporated these tools into the course.

When I reached our first submission, I saw Bela had included a note about using AI. My first instinct was that this was going to be a warning against plagiarism. I was so wrong.

Image by allreadyserviceprivat from Pixabay.

Image description: Graphic of a robot and person collaborating across two different computers against a dark blue background. The robot sits on a large lightbulb with a laptop in its lap while the person sits at a desk with a desktop computer. Background graphics of sticky notes and lines on a graph represent productivity.

Directing learners’ use of AI through prompts and feedback

Instead of warning us away from AI, Bela suggested that we might:

  1. generate a draft of our submission using ChatGPT (she even provided us with a series of iterative prompts to do so), or

  2. run our own drafts through ChatGPT, using a description of the task as a basis for feedback.

Other ways of guiding learners’ use of AI (according to AI)

I also put the question to ChatGPT: “How do you think instructional designers should give learners guidance on using AI in online courses?”

Its response mirrored Bela’s approach, suggesting that learning designers should “[provide] tutorials, tips, and troubleshooting advice.”

It also recommended grounding learners in the basics of AI:

  • a definition of AI

  • ethical considerations, and

  • AI’s limitations

I could see this kind of foundational instruction around AI and its uses being important, depending upon the audience’s prior knowledge of and experience with AI tools. It could always be offered as an optional resource available to those who need it.

ChatGPT offers recommendations for how learning designers can guide learners to use AI

Image description: Screenshot of conversation with ChatGPT called “Guidance on Using AI.” In response to the prompt, “How do you think instructional designers should give learners guidance on using AI in online courses?” (not pictured), ChatGPT responds in a list, part of which appears here: “3. Provide examples of AI in online courses: Providing concrete examples of how AI is used in online courses can help learners understand its practical applications. Examples could include chatbots, personalized learning, and automated grading. 4. Discuss the limitations of AI: It's important to also discuss the limitations of AI to help learners understand its current capabilities and areas where it falls short. This can include discussing topics such as its inability to replicate human emotions or make moral judgments. 5. Offer guidance on using AI tools: If learners will be using AI tools in the course, it's important to provide clear guidance on how to use them effectively. This can include providing tutorials, tips, and troubleshooting advice. Overall, instructional designers should aim to provide clear and accessible guidance on using AI in online courses that both informs learners and prepares them to use AI tools effectively.”

Final thoughts

Given that many learners are likely to bring AI tools into their learning anyway, it’s smart to provide them with guidance on how to do so productively and effectively.

Showing learners how to, as Gaytan wrote in the course, “harness the power of AI” increases the likelihood that they’ll use AI tools in a way that furthers learning outcomes.
