Some Practical Reflections on Remote (Exam) Marking
The COVID-19 pandemic has forced universities to rethink the way in which they assess students, with the advent of remote assessments and remote exams. Understandably, a lot of attention during the pandemic has been paid to the challenges of designing such remote assessments and their associated marking criteria.
However, far less attention has been given to the actual practicalities of marking remote assessments. This seems unfortunate because marking loads are high for many academics, and remote (online) marking can be even tougher and more time-consuming than normal. For instance, it’s more difficult to concentrate on a screen for long spells, markers can’t flick through a submission as quickly, and it’s often harder to tell which exam questions students have chosen. Moreover, as we expect remote assessments (including potentially remote exams) to remain in parts of some degree programmes after the pandemic, it’s important for us to learn how to mark remote assessments more efficiently and effectively.
To help address this gap, this article offers some practical reflections. While we focus on remote exam marking, much of what we discuss can also be applied more widely to remote coursework marking or coursework with online submissions. Section 1 starts by discussing the decision of whether to mark directly within a virtual learning environment (VLE) or to mark by downloading (and re-uploading) the submissions as PDF files. Section 2 then reflects on a key issue – whether to mark ‘script-by-script’ (where the marker marks all the questions of a script before moving on to the next one) or ‘question-by-question’ (where the marker marks the answers to a given question across all scripts, before moving on to the next question). Finally, Section 3 discusses some further issues that are relevant when remote marking in a team.
1. Marking directly in a VLE vs downloading the scripts as PDFs?
In many universities, academics have the choice to mark directly in a VLE or to download the scripts to mark as PDF files (before then re-uploading them back into the VLE). In other universities, only one of these options is allowed. Here, we discuss some of the pros and cons of each method.
Marking directly in a VLE saves having to deal with lots of files and allows a marker to search for a given script easily. It also provides an easy way to record marks and comments. However, it has several disadvantages. First, VLEs can be slow and temperamental at busy periods and vulnerable to wi-fi issues, which can hinder productivity. In particular, opening a script and turning pages can be frustratingly slow. Second, annotating directly in a VLE with a drawing tablet or stylus can be troublesome in some cases.
Downloading the scripts to mark as PDF files has several advantages. It avoids relying on the internet while marking, so markers can proceed reliably offline. Further, by using the comment tools in a PDF reader, markers can often annotate with a drawing tablet or pen more easily. The comments sidebar also gives markers an easily accessible place to record allocated marks and written feedback. Under this approach, markers can also keep track of marking progress by checking when each script’s file was last modified. However, there can be a few minor drawbacks. First, markers may have a lot of files to manage, so they need to be organised. Second, markers may have to upload the marked scripts back into the VLE for storage after they have finished, which can take time.
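As a small illustration of the progress-tracking point, the sketch below splits a folder of downloaded script PDFs into ‘marked’ and ‘unmarked’ piles by comparing each file’s last-modified time against a cutoff. It is a minimal sketch only: the folder layout and the idea of using a cutoff time are our assumptions about a marker’s setup, not a feature of any particular VLE or PDF reader.

```python
from datetime import datetime
from pathlib import Path

def marking_progress(folder, marked_since):
    """Split the script PDFs in `folder` into marked/unmarked piles,
    treating any file modified after `marked_since` as already marked."""
    cutoff = marked_since.timestamp()
    marked, unmarked = [], []
    for pdf in sorted(Path(folder).glob("*.pdf")):
        # A script annotated and saved after the cutoff counts as marked.
        (marked if pdf.stat().st_mtime > cutoff else unmarked).append(pdf.name)
    return marked, unmarked
```

For example, running `marking_progress("scripts/", datetime(2021, 5, 10, 9, 0))` at the end of a marking day lists which scripts still need attention (the folder name and date here are purely illustrative).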
2. Marking question-by-question or script-by-script?
We now move on to a key issue – whether to mark question-by-question or script-by-script. Many of us will have a preferred method for normal exam marking, but should we take a different approach when remote marking?
In normal circumstances, a question-by-question marking approach aims to increase efficiency and consistency by enabling the marker to focus only on the information required for one question at any point in time. For remote marking, when markers have to spend long hours on a screen, this reduction in required brainpower can be especially beneficial. However, the approach does have some downsides. First, if students have a choice of questions, markers may have to invest a significant amount of time at the start of the process to log which students opted for which questions, because this information is typically not provided in remote exams. Second, compared to normal exams, a question-by-question approach requires the marker to move between scripts more frequently. Under remote marking, this can be especially cumbersome, but downloading the scripts as PDFs can help ease the issue. We found opening a batch of 20 PDFs at a time in different tabs helped speed things up, but it can still be time-consuming. Overall, we think this approach can help concentration and reduce having to think too hard, but it took longer overall when remote marking.
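To give a concrete sketch of the batching idea, the snippet below groups a folder of script PDFs into fixed-size batches and opens one batch at a time in the system’s default viewer. This is only one hypothetical way to automate the step: the folder name, batch size of 20 and the opener commands (`open` on macOS, `xdg-open` on Linux) are assumptions about the marker’s machine, not a prescribed workflow.

```python
import subprocess
import sys
from pathlib import Path

def script_batches(folder, batch_size=20):
    """Group the script PDFs in `folder` into fixed-size batches."""
    pdfs = sorted(Path(folder).glob("*.pdf"))
    return [pdfs[i:i + batch_size] for i in range(0, len(pdfs), batch_size)]

def open_batch(batch):
    """Open one batch of PDFs in the default viewer (macOS/Linux)."""
    opener = "open" if sys.platform == "darwin" else "xdg-open"
    for pdf in batch:
        subprocess.Popen([opener, str(pdf)])
```

A marker could then call `open_batch(script_batches("scripts/")[0])`, mark that batch question-by-question, close the windows, and move on to the next batch.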
To try to reduce marking time, one could try a script-by-script marking approach instead. For this to be effective, the marker must be ready to mark a range of different questions within the same sitting in order to complete an individual script. This may be difficult for exams with many different questions or if the subject matter is particularly voluminous or complicated. In these cases, this approach could lead to frustration, fatigue and inconsistency. However, after marking a first batch of scripts, we found it was possible to develop a rhythm for some types of exams if we had good notes next to our computers. Rationally or not, we also thought this method helped motivation because it was satisfying to completely finish a batch of scripts in the same marking session.
As a third possibility, one can use a hybrid marking approach by marking some of an exam’s questions using a question-by-question approach but marking others in a more aggregated script-by-script approach. For instance, for easier/shorter questions within an exam, markers may save transition times by marking them script-by-script, but for harder/longer questions, markers may opt for a question-by-question approach to help reduce necessary brainpower. Alternatively, if an exam has a section with lots of short questions, a marker may find it too hard to hold all the answers and criteria in their head, and so prefer to mark this section question-by-question, while marking other sections in a more aggregate fashion, script-by-script.
The format of the exam and the chosen marking approach may also influence whether the marker prefers to mark within the VLE or via the PDF files. For example, if the exam format and preferred marking approach lead to a lot of switching between different scripts, the marker may prefer whichever of the VLE or PDF route handles that switching better in their circumstances.
3. Further issues when marking in a team
When marking in a team, some further issues need to be considered. To avoid the coordination problems of markers wanting to access the same script at the same time, or of duplicate files, it may be most practical to ask team members to mark whole scripts. Consistent with a more script-by-script approach, one can then simply divide the cohort’s scripts between the different markers. However, this method can exacerbate potential inconsistencies in marking between the different markers and require further moderation by the module leader.
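As a minimal sketch of dividing whole scripts between markers, a simple round-robin allocation keeps each marker’s load within one script of the others. The marker and script identifiers below are invented for illustration; in practice these would come from the VLE’s submission list.

```python
def allocate_scripts(script_ids, markers):
    """Divide whole scripts round-robin among markers, so that
    marking loads differ by at most one script."""
    allocation = {m: [] for m in markers}
    for i, sid in enumerate(sorted(script_ids)):
        # Cycle through the markers in turn.
        allocation[markers[i % len(markers)]].append(sid)
    return allocation
```

Circulating such an allocation before marking starts avoids two markers opening, annotating and re-uploading the same script.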
Instead, to help ensure better consistency and reduce the need for moderation, team members can be assigned to mark the answers to a specific exam question across all scripts. However, this requires markers to switch between scripts more frequently and can create logistical issues, so some pre-planning is required.
In contrast to the design of remote assessments, there has been little discussion about the actual practicalities of marking remote assessments in higher education. This seems unfortunate because marking loads are high for many academics, and remote marking can be even tougher and more time-consuming than normal. In this article, we have offered some initial discussions to help others reflect on how they could become more efficient and effective in their marking approach.
For instance, see Burnett and Paredes Fuentes (2020), and the resources on Future-Proof Assessment and Adaptable Assessment from the Economics Network’s 2021 and 2020 Online Symposia. For wider reviews of assessment in the pandemic, see OECD (2020) and QAA (2021). For a more general review on assessment, see Guest (2019).
Burnett, T. and S. Paredes Fuentes (2020) “Assessment in the Time of Pandemic: A Panic-Free Guide”, The Economics Network.
Guest, J. (2019) “Assessment and Feedback”, in the Economics Network Handbook for Economics Lecturers.
OECD (2020) “Remote Online Exams in Higher Education During the COVID-19 Crisis”, OECD Education Policy Perspectives, No. 6, OECD Publishing, Paris, https://doi.org/10.1787/f53e2177-en
QAA (2021) “How Good Practice in Digital Delivery and Assessment has Affected Student Engagement and Success – an Early Exploration”.