Signposting in a Virtual Learning Environment to boost attendance and engagement
Michael McCann
Department of Economics, Nottingham Trent University
michael.mccann at ntu.ac.uk
Published October 2023
Introduction
This case study reviews the use of signposting in the learning design of a Virtual Learning Environment (VLE) to address issues of student engagement and learning on a campus-based module delivered at a UK university. VLEs are web-based systems which support learning and are playing an increasingly prominent role in learning across higher education. This trend was intensified during the Covid pandemic as many universities delivered their modules through online channels and VLEs became the primary means of guiding student learning. Many of the changes introduced have had a lasting impact on campus-based courses across higher education, involving a greater use of digital resources to complement face-to-face classes (Beetham and MacNeill, 2023).
Great attention has been paid in the literature to identifying the VLE resources which students find most valuable and are associated with better learning outcomes (see Brown and Foster, 2023 for a comprehensive review). Less attention has been placed on the learning design of VLEs. Learning Design is a “…methodology for enabling teachers to make informed choices in how they go about designing learning activities and interventions, which is pedagogically informed and makes effective use of appropriate resources and technologies.” (Conole, 2012, p.121). Much analysis of VLE Learning Design tends to focus on the type of on-demand learning activities provided. Rienties and Toetenel (2016) conducted a large cross-sectional analysis of the VLE learning designs of 151 modules delivered by the Open University in the UK and found that students spent more time on VLEs with learning designs which emphasised active learning than VLEs with a learning design with more passive learning. However, in modules using blended learning, the learning design of a VLE can play an important role in guiding student learning, using not only the on-demand learning activities but also engagement in campus-based classes, for instance workshops or seminars.
Guidance is important since research shows that students are selective in their engagement with on-demand and campus-based learning activities. Barile et al. (2022) employed a revealed preference approach to analyse patterns of student engagement with different complementary learning activities involving on-demand VLE activities and ‘live’ seminar activities in the context of cognitive load theory (Mayer and Moreno, 2002; Paas et al., 2003; Sweller, 2004). The authors suggest that, in limiting cognitive load, students were selective in their engagement with learning activities. Analysis revealed that the selection was determined by how valuable activities were perceived to be for completing assessment tasks. This was indicative of instrumentalism, with student effort on a module being determined by the assessment (Ramsden, 1992). The danger is that when students select learning activities they do not choose wisely, excluding valuable ones and limiting their learning and potential attainment. In particular, they may view asynchronous on-demand digital resources as a substitute for seminars and not attend these latter valuable learning opportunities.
In this context, a VLE can play a valuable role in guiding students in their learning, helping them to manage their use of the complementary learning activities. Signposting (sometimes referred to as “signalling” or “cueing”) has been found to help learners know where to focus so they do not expend effort either searching for ideas or connecting them together (Mayer, 2009). Brown and Foster (2023) highlight that students want signposting to session-specific content within a VLE. However, there has been little analysis of the influence of such signposts on learner behaviour. Therefore, based on the findings of my previous work (see Barile et al., 2022) and the literature, I employed textual signposts in the titles of VLE pages for seminar activities which signalled a link between the campus-based seminar activities and assessment tasks.
In this case study, I explain the evolving use of the signposts across three iterations of module delivery and report on an analysis of their effectiveness in influencing student learning behaviour through attendance at seminars and use of related VLE on-demand learning activities.
An assumption underpinning the analysis is that data on usage will reveal student preferences. Therefore, a strength of the analysis is that, rather than relying on students’ self-reported preferences, we can demonstrate how the signposts employed in the VLE Learning Design influenced student behaviour.
Context
The Global Financial Markets module which is the focus of this case study is taken by full-time economics with finance students in their second year at Nottingham Trent University. It is a core module on the degree. In 2020/2021, a blended delivery model was introduced due to the Covid-19 pandemic. This involved a mix of synchronous learning activities and asynchronous on-demand digital learning activities. There were synchronous ‘live’ classes – a weekly 2-hour seminar and a 1-hour topic review session delivered through an online platform. Complementing these were asynchronous on-demand features provided through the VLE – bite-sized lecture recordings, copies of lecture slides, as well as web pages detailing the ‘live’ seminar activities, on-demand quizzes and additional reading.
Guidance on learning took the form of a study guide published as a news item on the VLE at the start of a topic. The study guide explained the different learning activities and resources, detailing the sequence in which they were to be used to deepen knowledge and understanding. For each topic, students watched a series of web recordings (approximately six 10-minute recordings) which were available on-demand. These provided threshold knowledge and understanding as well as demonstrations of relevant tools of the finance software used on the module. Slides were provided to complement the recordings. Students then attended a "live" seminar (online in 2020/2021 and in-person in 2021/2022 and 2022/2023). Further, an on-demand quiz with feedback was provided on the VLE to encourage deeper learning. Finally, students attended an online topic review session. In 2022/2023, this was replaced with a 1-hour in-person lecture.
The Global Financial Markets module was assessed using an individual written structured assignment (100%) comprising four tasks. These involved students completing tasks which required sourcing and analysing data using finance software as well as a critical application of theory. Table 1 illustrates the learning activities for each topic in the three academic years in the sequence they were expected to be completed.
Table 1: Dimensions and Sequence of Learning Resources and related Assessment in each year of delivery
1. On-demand lecture recordings – six 10-minute recordings per topic (VLE)
2. Lecture slides to complement the recordings (VLE)
3. 2-hour live seminar (seminar tasks posted on the VLE in advance)
4. On-demand quizzes with feedback (VLE)
5. 2020/2021 and 2021/2022: 1-hour live online topic review session; 2022/2023: 1-hour campus-based face-to-face lecture

Assessment: 4000-word individually written structured report (100%)
Textual Signposts
In preparing the module for blended delivery, I recognised that students would manage their engagement with learning activities to limit their cognitive load (Paas et al., 2003). They would not use all resources available but a subset which they perceived to be the most valuable for learning. Further, I acknowledged that the subset would be determined by the resources' perceived value in completing the assessment tasks (Ramsden, 1992). In such circumstances, students would look for signals and cues to identify a subset of learning activities which were most valuable for completing the assessment tasks.
For each topic, there are two types of seminars. Firstly, seminars which develop knowledge and understanding through a critical analysis of theory and, secondly, seminars using finance software which involve practice in sourcing and analysing financial data using relevant theory. Since students use the software for sourcing data for their assignment tasks, the second type of seminar is clearly aligned with assignment tasks. Previous experience demonstrated that students who did not attend the latter type of seminars struggled later in completing the module’s assessment. Therefore, I decided to incorporate textual signposts into the titles of certain seminar activities pages on the VLE to signal their value in completing a specific assessment task. The choice of seminar titles was driven by a hope that this would nudge students towards attending the seminars so that they would not struggle later.
Consequently, in preparation for delivery in 2020/2021, I crafted the titles of these seminars on the VLE so that they signposted alignment with relevant assessment tasks. In 2020/2021, three of the 10 seminars used the finance software and had titles with signposts illustrating that these enabled practice for completing specific assignment tasks. Over the subsequent years of delivery, I gradually extended the use of signposting. For 2021/2022, I added another seminar with activities using the software. The title of this seminar’s VLE webpage was changed to signpost this practice. Therefore, a total of four seminars were signposted in that year. In 2022/2023, I employed textual signposts in the titles of the VLE pages for every weekly seminar activity. These included the seminars signposting practice using the finance software, as in previous years, but also the remaining seminars, whose titles now signalled alignment with specific assessment tasks through developing knowledge and understanding of relevant theory. Table 2 shows the titles for seminars on the VLE in each year of delivery, illustrating their iterative changes.
Table 2: VLE Seminar Page Titles in each year of delivery
2020/21 Titles | 2021/22 Titles | 2022/23 Titles |
---|---|---|
Seminar Activities | Seminar Activities | Seminar Activities – Aligned with Task 1 of Assignment
Seminar Activities – Practise for Task 1 of Assignment | Seminar Activities – Practise for Task 1 of Assignment | Seminar Activities – Practise for Task 1 of Assignment |
Seminar Activities | Seminar Activities | Seminar Activities – Aligned with Task 2 of Assignment |
Seminar Activities – Practise for Task 2 of Assignment | Seminar Activities – Practise for Task 2 of Assignment | Seminar Activities – Practise for Task 2 of Assignment |
Seminar Activities | Seminar Activities | Seminar Activities – Aligned with Task 3 of Assignment
Seminar Activities – Practise for Task 3 of Assignment | Seminar Activities – Practise for Task 3 of Assignment | Seminar Activities – Practise for Task 3 of Assignment |
Seminar Activities | Seminar Activities | Seminar Activities – Aligned with Task 3 of Assignment
Seminar Activities | Seminar Activities – Practise for Task 4 of Assignment | Seminar Activities – Practise for Task 4 of Assignment |
Seminar Activities | Seminar Activities | Seminar Activities – Aligned with Task 4 of Assignment |
Seminar Activities | Seminar Activities | Seminar Activities – Aligned with Task 4 of Assignment |
Empirical Analysis of the Impact on Student Engagement
The empirical analysis investigates the impact of the signposts on students’ engagement on the module in the context of instrumental students using signals and cues to manage their learning on the module. In doing so, I analysed engagement, adopting a revealed preference framework whereby attendance at seminars and use of different VLE resources provide a good indication of the impact on students’ learning behaviour of the evolving textual signposts employed across the three iterations of delivery.
Data
At Nottingham Trent University, the Virtual Learning Environment (VLE) pages for modules record students’ use of online learning features. These data are routinely collected by Nottingham Trent University to monitor the engagement of students. Reports are available for lecturers to monitor students’ access of features on modules. We used these reports to measure use of the module-level VLE learning activities. We source data on student attendance at live classes from the University’s student records[1].
We measure attendance rates at the relevant live seminars for each topic (online in 2020/2021 and face-to-face in 2021/2022 and 2022/2023). Use of VLE seminar activities webpages is measured by (i) the number of visits to the VLE page and (ii) the time spent on the VLE pages. Use of on-demand lecture recordings for each weekly topic is measured by the average percentage of the bite-sized on-demand lecture recordings watched. Students’ use of lecture slides as a learning resource for each weekly topic is measured by the number of times the PDF files available are viewed. I did not measure time online using the slide files since students were likely to download these. Use of VLE on-demand quizzes for each topic is measured by (i) the number of visits to the VLE quiz pages and (ii) the time spent on the quiz pages. The time spent on VLE seminar activities pages and VLE quizzes was standardised in terms of minutes for the statistical analysis. Table 3 shows the titles and definitions of the complementary learning activities.
Table 3: Variable titles and Definitions
Variable Title | Definition |
---|---|
VLE Lecture Recordings | % of each VLE Lecture Recording watched |
VLE Lecture Slides | Number of visits to VLE pages with Lecture Slides |
‘Live’ Seminars | % of Seminars Attended |
Visits to VLE Seminar Activities Pages | Number of Visits to VLE Seminar Activities pages |
Time spent on VLE Seminar Activities | Mean time spent on the VLE Seminar Activities pages |
Number of Visits to VLE On-Demand Quizzes | Mean number of visits to VLE on-demand quizzes |
Time spent on VLE On-Demand Quizzes | Mean time spent on VLE on-demand quizzes |
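The standardisation of time-on-page into minutes described above can be sketched as follows. This is a hypothetical illustration, not the actual processing pipeline; the helper name and the example durations are invented for demonstration.

```python
# Hypothetical sketch: converting VLE time-on-page values recorded as
# "mm:ss" strings into decimal minutes for the statistical analysis.
# The function name and example values are illustrative only.

def to_minutes(mm_ss: str) -> float:
    """Convert an 'mm:ss' duration string to decimal minutes."""
    minutes, seconds = mm_ss.split(":")
    return int(minutes) + int(seconds) / 60

# Example: times spent on a VLE quiz page by three (hypothetical) students
raw_times = ["1:42", "0:30", "15:34"]
standardised = [round(to_minutes(t), 2) for t in raw_times]
print(standardised)
```

Standardising to a single decimal unit allows times of very different magnitudes (seconds on a quiz, many minutes on a seminar page) to be compared in one t-test.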
We appreciate that quantitative measures of studying effort have limitations. For example, they cannot capture the active or qualitative dimensions of studying effort, as discussed in Duncan et al. (2012) and Hu and Li (2017), respectively. Further, the literature lacks consensus on the measurement of engagement. For example, student engagement may be considered as the number of views of a particular digital resource versus the time spent using that resource. Boulton et al. (2018) provides a discussion of the issues around measurement. In addition, since many studies focus on specific institutions or modules, the results are context-dependent and rely on the different types of VLE resources considered in the analyses. We appreciate this is the case here.
Impact of Ex-Ante Signalling on Attendance
The study guide signposting learning for a topic was published in the week prior to delivery. Therefore, the signposts regarding the link between seminar activities and assessment tasks could be identified by students then. Students were also able to access the content of the webpages at that stage. I analysed whether the ex-ante textual signposts provided in the VLE influenced students’ pattern of attendance at seminars. This involved comparing 1) mean attendance rates at seminars with titles signposting links with assessment tasks, against 2) mean attendance rates at seminars with no signposts in their titles. I anticipated that attendance at seminars with titles indicating alignment with assessment tasks would be significantly higher. The mean attendance and t-tests of differences in means for the three cohorts are shown in table 4.
Table 4: mean attendance and t-tests of differences in means
Overall Mean seminar attendance | Mean attendance at seminars with signposts of practice | Mean attendance at seminars with no signposts | T-Statistics for differences in means | Obs. (df) | |
---|---|---|---|---|---|
2020/21 | 55% | 57.7% | 54.0% | 0.5998 | 126 (121) |
2021/22 | 48.1% | 54.5% | 41.7% | 2.337** | 138 (133) |
Overall attendance | Mean attendance at seminars with signposts of practice | Mean attendance at seminars with signposts of alignment | T-Statistics for differences in means | Obs. (df) | |
---|---|---|---|---|---|
2022/23 | 58.7% | 65.4% | 53.7% | 2.642*** | 120 (118) |
***indicates significance at 0.01 level
**indicates significance at 0.05 level
In both 2020/2021 and 2021/2022, there was a clear delineation between seminar activities with signposts and those without signposts. In those years, mean attendance at signposted seminars was higher than mean attendance at non-signposted seminars. This suggests that students used the ex-ante signals provided in the VLE in deciding whether a seminar was worth attending. Seminars with titles which signposted activities which offered practice in completing assessment tasks were deemed more valuable and were considered more important to attend. However, in analysing the results for each cohort, the difference in mean attendance rates was only statistically significant in 2021/2022. Attendance at the face-to-face campus-based seminars in that year required greater commitment than the seminars delivered online for the 2020/2021 cohort. This commitment may have meant that students in 2021/2022 were more selective in seminar attendance and used the ex-ante signposts provided in the VLE to decide which seminars were more valuable to attend.
In 2022/2023, when all the seminar activities posted on the VLE had textual signposts in their titles, overall mean attendance was higher than for the 2020/2021 and 2021/2022 cohorts. Indeed, it was higher than the mean for the signposted seminars for these previous cohorts. This is positive since it suggests that the increased signposting is associated with higher levels of attendance, though it must be appreciated that this is subject to the caveat that differences in student attributes and module dynamics (e.g. timing of seminars) could also influence the pattern of behaviour.
Even in 2022/2023 there was still evidence of selectivity. Attendance rates at seminars which involved practical activities using the finance software were significantly higher than the seminars involving collaborative activities developing knowledge and understanding of relevant theory. Indeed, 24 of the 60 students on the module attended all of the seminars involving practice using the software. Only 7 students attended all of the seminars developing knowledge and understanding of relevant theory. This suggests that students were still being selective and were using the information provided on the VLE to distinguish that seminars involving practice using the finance software were more valuable.
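The differences-in-means tests reported in Table 4 can be illustrated with a short sketch. The attendance figures below are invented for demonstration; the case study uses the actual cohort records. This implements Welch's two-sample t-statistic (unequal variances), which is consistent with the non-integer degrees-of-freedom adjustments implied by the reported df, though the exact test variant used is an assumption.

```python
# Illustrative sketch of a differences-in-means test: Welch's two-sample
# t-statistic comparing mean attendance rates at signposted versus
# non-signposted seminars. All data below are hypothetical.
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples (unequal variances)."""
    va, vb = variance(a), variance(b)  # sample variances
    se = math.sqrt(va / len(a) + vb / len(b))  # standard error of the difference
    return (mean(a) - mean(b)) / se

# Hypothetical per-student attendance rates (share of seminars attended)
signposted = [0.75, 0.50, 1.00, 0.25, 0.75, 0.50, 1.00, 0.75]
non_signposted = [0.33, 0.50, 0.17, 0.67, 0.33, 0.50, 0.17, 0.33]

t = welch_t(signposted, non_signposted)
print(f"t = {t:.3f}")  # a positive t indicates higher mean attendance at signposted seminars
```

In practice the per-student attendance rates would be drawn from the university's records, and the t-statistic compared against critical values at the 0.05 and 0.01 levels as in Table 4.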
Impact on the use of VLE Seminar Activities Webpages
We assess the influence of the signposts on the value attached to VLE seminar activities webpages by comparing use of signposted VLE seminar activities pages with that of non-signposted pages. It would be anticipated that students would visit the signposted VLE seminar activities pages more than non-signposted pages because they have been signalled as valuable for completing assessment tasks. This would involve repeat visits after the seminar session was delivered to use those resources. They should also spend more time on those pages. Table 5 shows the results.
Table 5: Use of VLE Seminar Activities Pages
Year | Overall | Signposted VLE Seminar Activities | Non-signposted VLE Seminar Activities | T-Statistics for differences in means | Obs. (df) | |
---|---|---|---|---|---|---|
Mean Number of Visits to VLE pages | 2020/21 | 1.55 | 2.51 | 1.08 | 6.272*** | 126 (93) |
2021/22 | 2.45 | 3.15 | 1.76 | 5.433*** | 138 (124) | |
Mean Time Spent on VLE pages (min:sec) | 2020/21 | 7:34 | 10:56 | 5:53 | 2.102** | 126 (104) |
2021/22 | 10:46 | 15:34 | 5:57 | 5.04*** | 138 (96) |
2022/23 | Overall | Seminars with signposts of practice | Seminars with signposts of alignment | T-Statistics for differences in means | Obs. (df) | |
---|---|---|---|---|---|---|
Mean Number of Visits to VLE pages | 2.77 | 3.38 | 2.28 | 5.545*** | 120 (94) | |
Mean Time Spent on VLE pages (min:sec) | 13:29 | 19:05 | 8:59 | 4.367*** | 120 (79) |
***indicates significance at 0.01 level
**indicates significance at 0.05 level
Revealed usage indicates that VLE seminar activities signalled as valuable for completing the assessment were used significantly more than non-signalled seminar activities in both 2020/2021 and 2021/2022. The mean time spent on the signposted VLE seminar pages was nearly double the mean for non-signposted seminar pages in 2020/2021 and nearly three times in 2021/2022. The data indicate multiple visits, which means use was not confined to the seminar itself but included revisits afterwards to use the resources, a sign of their perceived value for completing the assessment. In 2022/2023, when textual signposts were used in the titles of all seminar activities posted on the VLE, mean usage across all the seminar activities webpages was higher than not only the overall mean usage for the previous two cohorts but also the mean usage for signposted webpages. This suggests that the signals in the titles of seminar activities were successful in nudging students in the way anticipated, raising engagement with these resources.
Again, the selectivity in attendance is also evident in ongoing use of the seminar activities VLE pages. Students make more visits to — and spend much more time on — VLE seminar activities pages which involve practice in the use of the finance software compared to VLE seminar activities which signposted alignment with assessment tasks.
Use of Related VLE Learning Activities
In addition to the signposted seminar activities, each weekly topic had complementary VLE learning activities to be used in learning. These included on-demand lecture recordings, lecture slides and on-demand quizzes. The titles of these did not contain overt signposts to assessment tasks. The literature on signalling suggests that if these uncued learning activities are related to signposted activities, they will be perceived to have greater value in completing assessment tasks than similar resources related to seminar activities without textual signposts to assessment tasks. I compared use of uncued VLE learning activities in topics where signposts were used in the titles of VLE seminar activities pages with use of the same type of VLE activities in topics where signposts were not employed. I anticipated that the VLE learning activities in the topics with signposted seminar activities would be used significantly more than VLE learning resources in topics without signposted seminar activities. Table 6 illustrates the results of the analysis.
Table 6: Does signalling lead to greater use of uncued learning activities in the same topic?
Year | Related to Signalled Seminar Activities | Related to Non-signalled Seminar Activities | T-Statistic | Obs. (df) | |
---|---|---|---|---|---|
Mean % of VLE Lecture Recordings watched | 2020/21 | 66.7 | 50.6 | 2.858*** | 126 (122) |
2021/22 | 52.2 | 36.6 | 3.404*** | 138 (136) | |
Mean number of visits to VLE pages with Lecture Slides | 2020/21 | 2.39 | 1.32 | 4.013*** | 126 (86) |
2021/22 | 2.31 | 1.46 | 2.99*** | 138 (121) | |
Mean number of visits to VLE On-demand quizzes | 2020/21 | 0.87 | 0.82 | 0.426 | 126 (119) |
2021/22 | 0.70 | 0.65 | 0.42 | 138 (135) | |
Mean time spent on VLE On-demand quizzes | 2020/21 | 1:42 | 1:34 | 0.224 | 126 (114) |
2021/22 | 1:10 | 0:30 | 1.1 | 138 (72) |
2022/23 | Overall | Uncued activities in topics with seminars signposting practice | Uncued activities in topics with seminars signposting alignment | T-Statistic | Obs. (df) |
---|---|---|---|---|---|
Mean % of VLE Lecture Recordings watched | 38.3 | 42.1 | 35.7 | 0.937 | 120 (118) |
Mean number of visits to VLE pages with Lecture Slides | 1.92 | 2.65 | 1.34 | 4.758*** | 120 (84) |
Mean number of visits to VLE On-demand quizzes | 0.48 | 0.35 | 0.58 | -2.007** | 120 (111) |
Mean time spent on VLE On-demand quizzes | 01:02 | 00:56 | 01:08 | -0.46 | 120 (118) |
***indicates significance at 0.01 level
**indicates significance at 0.05 level
In both 2020/2021 and 2021/2022, when there was a clear distinction in the signposts for the weekly topic seminars, the results show evidence that some uncued VLE learning activities in the same topics as the signposted seminar activities enjoyed significantly greater use. There is a significantly higher mean percentage of lecture recordings viewed in topics where signposts were used. This is also the case for the mean number of visits to the related VLE Lecture Slides pages for topics where signposts were used. The textual signals did not produce any significant difference in the pattern of use for VLE on-demand quizzes. On average, these resources were utilised much less than the other uncued resources. Further, access rates were not very different whether these were in the same topic as signposted seminar activities or not.
In 2022/2023, when signposts demonstrating either practice for or alignment with assignment tasks were employed, there is no discernible pattern of usage. While the mean number of visits to lecture slides in topics with seminars signposting practice was significantly higher, there was no significant difference in the propensity to watch lecture recordings related to seminar titles signposting practice and those signposting alignment. Further, use of on-demand quizzes in topics signposting alignment was greater than those in topics signposting practice.
Discussion
Our results suggest that students are selective in their engagement across different learning activities. The distinctive signposts provided in the first two years did seem to influence their pattern of attendance at "live" seminars and their use of the VLE seminar pages throughout module delivery. In 2022/2023, when all VLE seminar page titles employed signposts, the cohort exhibited higher average attendance at all seminars than previous years, with more visits to and time spent on VLE seminar activity pages too. Still, there was evidence of selectivity, with significantly higher attendance rates at seminars with direct practical use of finance software compared to those which had learning activities developing knowledge and understanding of underlying theory.
These findings suggest that students did not just use titles when selecting learning activities to engage with. They also observed the type of activities in seminars described on the VLE webpage to assess their value. The conclusion I draw is that students perceived that the seminars with practice using the finance software were more valuable when done in class. There was less value in trying to do the activities independently, without tutor support. Providing students with information ex-ante helps them make this assessment and manage their attendance and engagement. But the pattern of attendance suggests that the signposted seminar activities must be expressed in a way which suggests that the greatest value is gained by doing them in class. It is this which encourages greater attendance and subsequent use as a reference when completing the related tasks in the module assessment.
Conclusion
This case study analyses the implementation of a VLE learning design incorporating textual signals to help students manage engagement with module learning activities. Drawing on cognitive load theory and instrumentalism we used signposts in the titles of VLE seminar activities pages to highlight those aligned with assessment tasks. We used data on patterns of usage to analyse whether these signals had an impact on patterns of learner behaviour.
While this case study is limited to the signposting incorporated in one module across three iterations of delivery, the experience has lessons for practitioners regarding their VLE learning design in a campus-based module. Our results indicate that the signposts provided significantly influenced learners’ behaviour in a way which supports both cognitive load theory and instrumentalism. Students were selective in their engagement with the complementary learning activities, managing it in a way they perceived to be most valuable for learning. The most valuable subset of activities were those perceived to be most useful for completing assessment tasks. In such circumstances, VLEs can complement in-person teaching by incorporating signals which provide information about the value of attendance and engagement. Signposting seminar activities on a VLE in a way which makes them of greatest value if completed in class should nudge students to attend and engage more.
References
Barile, L., Elliott, C. and McCann, M., 2022. Which online learning resources do undergraduate economics students’ value and does their use improve academic attainment? A comparison and revealed preferences from before and during the Covid pandemic. International Review of Economics Education, 41, 100253. https://doi.org/10.1016/j.iree.2022.100253
Beetham, H. and MacNeill, S., 2023. Beyond blended: Post-pandemic curriculum and learning design: lessons from the higher education (HE) sector. JISC report.
Boulton, C.A., Kent, C. and Williams, H.T., 2018. Virtual learning environment engagement and learning outcomes at a ‘bricks-and-mortar’ university. Computers & Education, 126, pp.129-142. https://doi.org/10.1016/j.compedu.2018.06.031
Brown, G. and Foster, C., 2023. The Use of Virtual Learning Environments in Higher Education—Content, Community and Connectivism—Learning from Student Users. In AI, Blockchain and Self-Sovereign Identity in Higher Education (pp. 125-142). Cham: Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-33627-0_6
Conole, G., 2012. Designing for learning in an open world (Vol. 4). Springer Science & Business Media.
Duncan, K., Kenworthy, A. and McNamara, R. (2012) The effect of synchronous and asynchronous participation on students' performance in online accounting courses. Accounting Education, 21(4), 431-449. https://doi.org/10.1080/09639284.2012.673387
Hu, M. and H. Li (2017) Student engagement in online learning: A review. International Symposium on Educational Technology (ISET), 39-43. https://doi.org/10.1109/ISET.2017.17
Mayer, R. and Moreno, R. (2002) Aids to computer-based multimedia learning, Learning and Instruction, 2(1), 107–119. https://doi.org/10.1016/S0959-4752(01)00018-4
Mayer, R. E., 2009. Multimedia Learning. Cambridge University Press.
Paas, F., Renkl, A. and Sweller, J., 2003. Cognitive load theory and instructional design: Recent developments. Educational Psychologist, 38, 1-4. https://doi.org/10.1207/S15326985EP3801_1
Ramsden, P. 1992 Learning to Teach in Higher Education, London: Routledge.
Rienties, B. and Toetenel, L., 2016. The impact of learning design on student behaviour, satisfaction and performance: A cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341. https://doi.org/10.1016/j.chb.2016.02.074
Sweller, J. (2004) Instructional design consequences of an analogy between evolution by natural selection and human cognitive architecture, Instructional Science, 32, 9–31. https://doi.org/10.1023/B:TRUC.0000021808.72598.4d
Note
[1] We gained ethical approval from Nottingham Trent University for the use of this data.