4. Testing your assessment

Mistaken advice

Early advice for addressing the challenges of Generative AI (e.g. higher-order thinking tasks, personalisation, reflection, exploiting the limitations of Generative AI, and progressive assessment) has not proven effective across disciplines. A range of Generative AI tools have performed strongly on many types of assessment, including:

  • reflective tasks 
  • a range of large-scale examinations, from writing through to legal and medical education.

How to test your assessment:

It is critical that we each test our assessment tasks to gauge how resistant they are to Generative AI. You can do this by completing the following steps (a scripted version of these checks is sketched after the list):

  1. copy and paste your assessment into a Generative AI tool
  2. regenerate the response at least once
  3. try adding scaffolds, as you would when guiding a struggling student on where to start. For example:
    1. provide an outline for…
    2. identify key researchers to include in work about…
    3. what key ideas should be considered for...
  4. try adding some of the following prompts:
    1. identify relevant experts in the field, generate responses as if the experts wrote them, and combine the experts’ answers by collaborative decision-making.
    2. explain your approach step-by-step.
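
If you would rather run these checks as a batch than paste prompts into a chat interface, the sketch below shows one way to do so. It assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable; the model name and scaffold wording are illustrative only and should be swapped for whichever tool your institution supports.

```python
# A minimal sketch for batch-testing an assessment task against a Generative AI model.
# Assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY environment
# variable; the model name and scaffold prompts below are illustrative, not prescriptive.
from openai import OpenAI

client = OpenAI()

ASSESSMENT_TASK = """Paste your assessment task description here."""

# Scaffolds mirror the guidance you might give a struggling student (step 3),
# plus the expert-persona and step-by-step prompts from step 4.
SCAFFOLDS = [
    "",  # the bare task, regenerated more than once (steps 1 and 2)
    "Provide an outline for the task above.",
    "Identify key researchers to include in work on the task above.",
    "Identify relevant experts in the field, generate responses as if the experts "
    "wrote them, and combine the experts' answers by collaborative decision-making.",
    "Explain your approach step-by-step.",
]

for scaffold in SCAFFOLDS:
    prompt = ASSESSMENT_TASK if not scaffold else f"{ASSESSMENT_TASK}\n\n{scaffold}"
    # n=2 asks for two generations per prompt, approximating
    # "regenerate the response at least once".
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
        n=2,
    )
    for attempt, choice in enumerate(response.choices, start=1):
        print(f"--- Scaffold: {scaffold or '(none)'} | attempt {attempt} ---")
        print(choice.message.content)
```

Comparing the generated responses against your marking criteria should give you a realistic sense of how the task holds up.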

Further considerations for your assessment 

Associate Professor Jason Lodge has identified six assessment redesign considerations at a course or program level, which you may find valuable to apply to your existing assessment tasks:

| Approach         | Short-term                         | Medium-term       | Long-term         |
|------------------|------------------------------------|-------------------|-------------------|
| 1. Ignore        | Might get away with it momentarily | ⛔️                | ⛔️                |
| 2. Ban           | Problematic                        | Becomes risky     | ⛔️                |
| 3. Invigilate    | Where appropriate                  | Where appropriate | Where appropriate |
| 4. Embrace       | Being mindful of equity issues     | Where appropriate | ❤️                |
| 5. Design around | Risky                              | ⛔️                | ⛔️                |
| 6. Rethink       | Requires time and effort           | ❤️                | ❤️                |

In the medium term, it seems likely that our programs will continue to balance UQ’s three options for assessment with the alternatives of Invigilate, Embrace, and Rethink (webinar recording), as Dr Lodge has suggested, until there is a comprehensive reimagining of how assessment that is both valid and reliable can co-exist with AI.

Further impact

Dr Helen Gniel, Director of the Higher Education Integrity Unit at TEQSA, has predicted that with the increasing pace of change in the capability and availability of Generative AI:

  • our courses may quickly become outdated
  • our methods of assessment will require regular retooling
  • the kinds of data institutions collect and monitor may look different.

Generative AI proofing

Meanwhile, many academics are exploring ways to adapt their assessment to be more effective at preventing students from using Generative AI to produce their responses. You can view some of the strategies our colleagues have been working on.

In summary

As we learn to work with the rapidly changing capabilities of Generative AI tools, it is important to test your assessment, make sensible enhancements, and discuss with your students the purpose and expectations of the tasks you have set. You should encourage them to engage with Generative AI in a manner that maintains integrity and maximises their learning.