Article by SA Mathieson, Guardian Labs (Transforming the student experience series, paid for by Jisc), 20 February 2020
Throughout their studies, students today routinely use virtual learning environments, online resources and other digital services. But at the end of their courses, many will be assessed using an old-school method: answers written with pen and paper in a few high-stakes hours.
“We’ve got quite an old-fashioned assessment system that hasn’t kept pace with student expectations,” says Chris Cobb, pro vice-chancellor of the University of London. “I don’t think I could write a start-to-finish essay with a pen nowadays. I don’t write like that any more.”
Some universities have digitised their exams; Newcastle University first tested the concept 13 years ago. In the 2018/19 academic year it ran 132 exams through its online assessment and feedback service, including some where students could use their own laptops and others that included long written answers. But this remains unusual.
The University of London’s member institutions use digital assessment extensively during courses, but Cobb says that changing final exams is not done lightly. This is partly due to concerns from the public, academics and employers over maintaining the quality of degrees, including the risks of grade inflation and identity fraud. He adds that students, too, express concerns about modernising exams, worrying that changes are just about saving money or that they might involve biased machine-learning systems.
Cobb says that these are reasons to modernise assessment cautiously and transparently. But he believes that employers increasingly want employees to be able to collaborate remotely, ask for and deal with feedback, and work creatively and in groups. “I think we are teaching students those things,” he says, but they are hard to assess with written exams.
A report by Jisc, an education and technology not-for-profit and membership organisation that provides digital solutions for UK education and research, calls for an overhaul at colleges and universities, suggesting five ways to improve assessment for all by 2025. Making assessment more “authentic” – closer in style to the situations students will face after graduation – is one of the main options for modernisation, according to Andy McGregor, Jisc’s director of edtech. For example, there is potential to extend the concept of flight simulators, through which aircrew are taught and assessed, more widely. This can include using virtual reality environments for medical education or, in the case of City of Glasgow College’s ship simulator, training in ship handling and navigation for nautical professionals.
“I think ‘authentic’ is about not just the retention of knowledge but about application in situations you will experience again and again,” says McGregor. This doesn’t mean scrapping all essay-style assessments, which can work better in assessing whether students have grasped theory rather than practice: “What you’d expect to see in the future is a variety of assessment types used throughout.”
Written exams are heading for modernisation, too. The University of London is assessing products that would allow it to digitise exams, such as those taken at the end of distance-learning courses in more than 180 countries; students currently have to travel to an accredited centre, often a British Council building, to sit the exam. Cobb says paper scripts sometimes get held up by customs officials or otherwise delayed, and a digital system will allow faster marking and feedback.
Another way to speed up feedback is to automate marking. This is already possible with the likes of multiple-choice questions, but software that analyses language opens up new possibilities.
Since 2018, Bolton College has been developing a system that provides automated feedback on the work placement evaluations all students have to write. Trained on a library of previous evaluations, the software identifies which of the standard areas a student has covered, including how they developed communication skills, problem-solving and time management in their placement. It generates a graph of how much they have written on each area and advises them if an area has been missed or is too brief.
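Bolton College has not published its implementation, but the behaviour described, attributing text to required areas and flagging anything missing or thin, can be illustrated with a minimal sketch. Everything below, the keyword lexicon, the 30-word threshold and the function names, is invented for illustration; the real system was trained on a library of past evaluations rather than a hand-written word list.

```python
import re
from collections import defaultdict

# Hypothetical topic lexicon: a stand-in for a model trained on past
# evaluations. The words and topics here are invented for illustration.
TOPICS = {
    "communication skills": {"communicate", "communicated", "communication",
                             "explained", "presented", "listening"},
    "problem-solving": {"problem", "problems", "solve", "solved", "solution",
                        "troubleshoot"},
    "time management": {"time", "deadline", "deadlines", "schedule",
                        "prioritise", "punctual"},
}

MIN_WORDS = 30  # below this word count a covered topic is flagged as too brief


def coverage(text: str) -> dict[str, int]:
    """Attribute each sentence to any topic whose keywords it mentions,
    and total the words written on each topic."""
    totals = defaultdict(int)
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        tokens = [t.lower() for t in re.findall(r"[a-z'-]+", sentence, re.I)]
        for topic, keywords in TOPICS.items():
            if keywords & set(tokens):
                totals[topic] += len(tokens)
    return {topic: totals.get(topic, 0) for topic in TOPICS}


def feedback(text: str) -> list[str]:
    """Mirror the advice the article describes: flag missed or thin areas."""
    notes = []
    for topic, n_words in coverage(text).items():
        if n_words == 0:
            notes.append(f"'{topic}' appears to have been missed.")
        elif n_words < MIN_WORDS:
            notes.append(f"'{topic}' looks too brief ({n_words} words).")
    return notes or ["All required areas appear to be covered."]


if __name__ == "__main__":
    draft = ("On my placement I explained our test results to clients, which "
             "built my communication skills. I also solved a tricky data "
             "problem by writing a small script.")
    for note in feedback(draft):
        print(note)
```

Run on the sample draft, this flags time management as missed and the other two areas as too brief, the same style of advice students receive before submitting.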
Aftab Hussain, the college’s learning technology manager, says such systems can work at a large scale, are cheap to use after they have been set up and are unbiased, whereas human markers can vary in generosity depending on the individual and the time of day. Students like the new system’s immediacy and the fact that it helps them meet requirements before they submit the work.
However, the system can be fooled, as it just looks for coverage of the required topics, such as communication skills, problem-solving and time management. This means someone could write a report on a football match rather than a work placement, and as long as the piece mentioned the required topics, the system wouldn’t notice. “That’s why it’s so important to keep the teacher in the loop,” says Hussain. As well as checking that the software has got it right, teachers can look for signs of emotional problems, such as a student appearing anxious. They are also needed to train and adjust the software.
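The weakness Hussain describes falls naturally out of any coverage-only check. Continuing the hypothetical sketch above, an off-topic passage that happens to mention the right words sails through:

```python
# An off-topic passage that name-checks the required areas still passes the
# coverage check above (hypothetical example of the weakness Hussain describes).
football_report = (
    "The manager's communication with the back four solved every problem the "
    "opposition posed, and his substitutions showed superb time management "
    "against the deadline of the final whistle, a solution that explained "
    "the comfortable win."
)
print(feedback(football_report))
# -> ['All required areas appear to be covered.']
```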
Bolton College currently uses the system for “formative” assessment – feedback to help students improve – rather than the “summative” kind used for final marks. But Hussain thinks similar software will eventually be able to do summative marking, such as for a GCSE English question on “Why is Macbeth a tragedy?”, with humans setting it up, moderating and checking a sample of answers.
McGregor predicts that a mix of software-based and human assessment will become the norm over the next decade, with both playing to their strengths. He says students should ask their institutions to get on with modernising assessment: “What you’d expect to see in the education of the future is a variety of assessment types used throughout.”