T&L blog post: Written by a human being – Integrating AI technologies in teaching, learning and assessment


By Katrine Wong

AI literacy

Artificial intelligence (AI) refers to the ‘capacity of computers or other machines to exhibit or simulate intelligent behaviour’ (OED).

‘Literacy’ traditionally means ‘the ability to read and write’ (OED). Nowadays, the notion of literacy has expanded to include a range of literacies, including computer literacy, digital literacy and AI literacy.

‘AI literacy’ describes the competencies necessary in a habitat wherein AI pervades both our private and public spaces and transforms the way that we communicate, live and work with each other, and with machines (Long & Magerko, 2020; Sabouret, 2020). AI literacy is increasingly crucial for effective human-machine collaboration, and it can augment human intellect and capabilities (Akata et al., 2020).

As teachers in this brave new world, are we ready to embrace AI? Are we ready to develop our AI literacy? Are we ready to incorporate AI in our teaching and learning?

We can start by familiarising ourselves with AI technologies and applications. Alongside ChatGPT (Chat Generative Pre-trained Transformer), a generative AI tool that lets users input text prompts and receive human-like responses generated from large training datasets, there are also, to name a few (as of 29 March 2023):

  • Caktus, an AI writer for many school subjects. Paid service.
  • CodeWhisperer, an ML-powered coding bot. Paid service.
  • Google Workspace, whose services can generate emails and texts based on prompts. Free/paid service.
  • Jasper, a writing assistant that creates phrases, paragraphs or documents based on prompts. Paid service.
  • Microsoft’s services and products (e.g., Bing, Edge, Word Online), which can now summarise text, write emails, and more. Free/paid service.
  • Murf, a text-to-speech service for creating professional videos and presentations. Paid service.
  • Poe, an AI chatbot. Paid service.
  • Rytr, a writing assistant. Paid service.
For a while now, I’ve been slowly learning how various tools work (and we don’t know what’s in store tomorrow, next week or next semester!). As I learn more about these tools, I’m starting to view them critically and to examine their context and embedded principles. One thing I’m sure about is that these tools are, to me, powerful search engines. I haven’t used any text generated by such tools in my writing, and I don’t plan to do so in the (near) future. ChatGPT, for example, is known to hallucinate and concoct false information. That said, as of 29 March 2023, it can make things up ‘fluently in more than 50 languages’ (“ChatGPT,” 2023).

Of course, it’s important to know how our students use and study with such tools.

Responsible use of AI in learning and teaching

Colleagues at UM have recently been advised that our students can ‘use ChatGPT or other generative-AI systems to enhance their learning’ and that ‘students should be aware that they must be authors of their own work’ (email ‘Notes on the use of generative-AI systems’, 11 April 2023).

Granted, generative AI tools can benefit education in ways such as providing real-time feedback, facilitating independent and personalised learning, generating summaries and translations, and even creating interactive e-learning material such as flashcards, crosswords and videos. Even so, colleagues are concerned about the impact upon principles of academic integrity. While the initial panic is well understood, instead of fearing that large language models (LLMs) like ChatGPT will kill students’ abilities to write essays and work out answers to concept-checking questions, we can embrace the opportunities afforded by this technological advancement and rethink assessment design.

It is high time that we prioritise training our students to develop reasoning and problem-solving skills. Assessment tasks that involve writing summaries and online quizzes may no longer yield an effective, indicative measure of student achievement. For instance:

  1. We can design tasks that focus on cognitive processes such as analytical, critical and synthetic thinking. Students can demonstrate to us whether or not they have achieved the respective intended learning outcomes through activities such as debates, concept diagrams and mind maps.
  2. We can, if time allows, supplement essay writing with Q&A to evaluate students’ actual learning.
  3. We can, when feasible, adjust the distribution of assessment types in our course and rely on fewer open-book, take-home tasks.

In addition, part of our job as university teachers is to educate our students to become responsible global citizens who will live and work with AI technologies in a reliable, honest and respectable manner. This is no different from living and working as a responsible citizen. While AI ethics is an increasingly important topic (Laupichler et al., 2022; Ng et al., 2021; Shih et al., 2021), this blog post does not attempt to discuss how we can teach AI ethics.

Instead, I would like to suggest a few questions that we can encourage our students to consider actively when it comes to generative AI technology:

  • Why do I, as a student, need to use AI technology to answer this question? Can I use the knowledge and skills that I already have to answer questions on an assignment?
  • How reliable is the information gathered and generated? Is it correct? Have I checked the AI-generated text against reliable sources of information?
  • Is the information adequate to inform the scope of my discussion? What biases might be present in responses generated by AI technology?
  • What value does this AI-generated text/answer bring to my learning?
  • How can I ethically work with this AI-generated text/answer?

These questions and similar ones will help students begin to become responsible users of generative AI tools. At the same time, here are a few things that students should understand:

  • That they are responsible for their own submitted assignments;
  • That they are responsible for any inaccuracies or omissions in AI-generated materials;
  • That AI-generated materials may contain fabricated content (including fake references); and
  • That they are advised to acknowledge the use of AI in written assignments and to cite the sources appropriately.

In my next blog post, I will write more about assessment design in the age of AI.


References:

Akata, Z., Balliet, D., de Rijke, M., Dignum, F., Dignum, V., Eiben, G., Fokkens, A., Grossi, D., Hindriks, K., Hoos, H., Hung, H., Jonker, C., Monz, C., Neerincx, M., Oliehoek, F., Prakken, H., Schlobach, S., van der Gaag, L., van Harmelen, F., … Welling, M. (2020). A Research Agenda for Hybrid Intelligence: Augmenting Human Intellect With Collaborative, Adaptive, Responsible, and Explainable Artificial Intelligence. Computer, 53(8), 18–28. https://doi.org/10.1109/MC.2020.2996587

‘artificial intelligence, n.’ OED Online. www.oed.com/view/Entry/271625. Accessed 29 March 2023.

ChatGPT is a marvel of multilingualism. (2023, March 29). The Economist. https://www.economist.com/culture/2023/03/29/chatgpt-is-a-marvel-of-multilingualism

Laupichler, M. C., Aster, A., Schirch, J., & Raupach, T. (2022). Artificial intelligence literacy in higher and adult education: A scoping literature review. Computers and Education: Artificial Intelligence, 3. https://doi.org/10.1016/j.caeai.2022.100101

‘literacy, n.’ OED Online. www.oed.com/view/Entry/109054. Accessed 29 March 2023.

Long, D., & Magerko, B. (2020). What is AI Literacy? Competencies and Design Considerations. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1–16). https://doi.org/10.1145/3313831.3376727

Ng, D. T. K., Leung, J. C. L., Chu, S. K. W., & Qiao, M. S. (2021). Conceptualizing AI literacy: An exploratory review. Computers and Education: Artificial Intelligence, 2. https://doi.org/10.1016/j.caeai.2021.100041

Sabouret, N. (2020). Understanding artificial intelligence. CRC Press LLC. https://doi.org/10.1201/9781003080626

Shih, P. K., Lin, C. H., Wu, L. Y., & Yu, C. C. (2021). Learning ethics in AI-teaching non-engineering undergraduates through situated learning. Sustainability, 13(7). https://doi.org/10.3390/su13073718