How to structure the Inspera conversation
A useful way to structure the conversation, and to ensure all the work of assessment is addressed, is around the assessment lifecycle (see below).
Some schools have used implementation conversations as an opportunity to have bigger discussions about assessment in their school.
Broader discussions about assessment
One school started their conversation by identifying questions related to the ‘pain points’ of assessment:
- what are the current ‘pain points’ of assessment in the School?
- how might some of the features of Inspera be able to assist with this?
- what processes might the School be able to change through Inspera’s online/digital capabilities?
Another school started by identifying the current state of assessment in the School and how they wanted to develop it, so their implementation conversation could be informed by an agreed future vision:
- are we making a shift toward programmatic assessment?
- is there an assessment or curriculum review process happening alongside the implementation of Inspera?
- how does ‘authentic’ assessment apply to [name of school]?
- are we meeting diversity and inclusion needs?
- are we auditing and re-thinking our assessment in light of the challenges presented by Generative AI?
Both approaches elicited important considerations which were incorporated into the implementation conversation.
Questions to guide the planning and creation of assessment, rubrics, marking schemes and feedback
- who authors assessment in your school? A single Course Coordinator? A teaching team including a Learning Designer? Guest/adjunct lecturers who may not have a UQ staff account?
- if applicable, how will the School manage collaborative authoring? This is important to discuss to prevent version control issues.
- who builds assessment in your school? (Although not the recommended approach, in some schools there are professional staff who do this on behalf of Course Coordinators.) If applicable, what will efficient coordination between authors and builders look like?
- with respect to question titles and labels (which provide organisation, searchability and efficiency), is it important to develop a school-wide taxonomy for staff to follow? Some schools have taxonomies organised around content or topic areas, others around keywords or learning verbs. Another suggestion is to label questions with the programmatic learning outcome they align to. All questions should be labelled with the course code.
- is it important to the School that question banks and question sets are well managed (e.g. organised, searchable, filterable and accessible for staff other than the Course Coordinator)? Who will be responsible for reviewing and maintaining question banks and checking that naming conventions have been used consistently?
- does the School have standard practices or a policy regarding the provision of feedback to students? There are several ways of providing feedback, some of which are actioned at the authoring stage (e.g. question titles, feedback forms, predefined feedback on questions and rubrics)
- in courses where multiple staff are marking assessment, does the School have a policy about providing marking schemes to enhance consistency and moderation?
Questions to guide the development of assessment review practices
- who, other than the Course Coordinator, needs access to the assessment to conduct peer review? What does peer review include in your school? Does it cover grammar and spelling, correct mark allocation and so on, as well as content?
- are there any other kinds of reviewing that you routinely do on an assessment (e.g. checking alignment to learning outcomes, or academic integrity considerations)? Does your assessment provide an equitable opportunity for all students? Are there staff who may need access to the assessment for these purposes, or is this typically done before the assessment is created in the platform?
- do you need to re-think elements of your review(s) in light of the possibilities digital platforms provide? What about reviewing students’ digital assessment experience (e.g. size of images, layout of information, amount of scrolling required, ease of question completion if a mouse is not permitted, clarity of audio, links to work, sufficient instructions)? How could the School design a sustainable means of regularly doing this?
Questions to guide the development of student preparation practices
- does the school have (or want to develop) guidelines and standards for student preparation? Note that for school staff this is limited to communication (e.g. ECP text, course site announcements) and practice opportunities; school staff are not expected to, and should not, provide technological support.
- what are your current processes regarding adjustments and students with access plans? In most instances, a better outcome for students can be achieved with early notice.
Questions to guide the development of assessment administration practices
Note: questions in this section are exclusively for school-based assessment, not centrally run exams.
- who schedules and sets up assessment (including the application of AEAs) in your school? Are the process and the people involved different for different types of assessment? Where the Inspera assessment is set up via an LTI link in Learn.UQ, consider who has (or is allowed) access to course sites.
- who approves extensions, and how? Who should apply the extension in Inspera?
- on what assessment types and under what conditions would you reopen an assessment that has closed to allow students to submit (this is different to late submission functionality)?
- for timed assessment completed remotely (an exam, or other timed assessment that does not meet the UQ PPL description of an exam), AskUs needs to be notified. Who will be responsible for doing this in your school?
- there are multiple operational requirements when running an on-campus exam (e.g. venue, power, wifi), especially if it is conducted in a lockdown browser (technological support for students is required). Is the School sufficiently resourced to organise School-based on-campus exams?
- if the School is going to run on-campus exams, who will supervise/invigilate these exams?
- would the School ever run an on-campus assessment in a computer lab?
- when there are communications from Examinations, eLearning or AskUs regarding Inspera, is there a key person in the School you would like to nominate as the contact who will disseminate this information?
Questions to guide the development of assessment marking and finalisation practices
- when a new tutor or casual academic joins UQ, who will direct them to apply for Grader access in Inspera? Will the School provide a budget for tutors and casual academics to receive one hour of marker training? Would the School consider a coordinated approach to this for new tutors to learn together in the same one-hour session (and therefore conserve costs)?
- there are multiple sources of feedback in Inspera that can be provided to students. There are also different options for how this feedback can be released to students. Is it important for the School to have consistent processes about releasing feedback?
- for centrally run examinations only (which are set up directly in Inspera), who will download marks from Inspera and upload them to Grade Centre?
- does the School have any mandatory reporting requirements that need to be considered?
Documentation of assessment practices, actions and responsibilities
Using a digital assessment platform with functionality across the assessment lifecycle will probably require revisions to existing practices or the development of new ones. An important outcome of an implementation conversation is to have these practices and responsibilities documented. The following template can be used for this purpose.