
Extended Case Study: Computer Aided Assessment and Independent Learning in Macroeconomics

This is an updated, self-contained version of the extended case study in the Computer-Aided Assessment chapter of the Handbook for Economics Lecturers.

Introduction

The purpose of this extended case study is to describe the implementation and learning outcomes of the computer-aided assessment and independent learning projects introduced into second-year macroeconomics teaching at the University of Aberdeen in 1999. Evaluation is benchmarked against the learning objectives specified by the course, the university and the discipline, and compared with previous practice using traditional methods.

Overview

It is important when considering the introduction of any new and innovative practice to be aware of the nature of the target clientele and the institutional environment you operate in. A project that works well in one location and one set of circumstances may need to be intelligently modified to work well in another. Thus, in the first section (Where and Who?) we describe the environment within which the project was devised and how this informed the choice of learning and teaching methods developed. In the second section (How and Why?) we discuss how we determined our intended aims and learning outcomes: what did we want to achieve, and how could we do it given the environment we operated in and the resource constraints we faced? In the third section (What and When?) we describe the practical implementation of our project: what did we do and when did we do it? Finally (Evaluating the experience), we summarise the student experience of the project and evaluate it from the point of view of the teaching team and the wider community. We assess whether the project fulfilled its ambitions, what improvements could be made, and the prospects for the future.

Where and Who?

The University of Aberdeen in the North-East of Scotland was founded in 1495 and today has over 12,000 students, 79% of whom are enrolled on undergraduate courses. Some 90% of undergraduate students are from Scotland and, in contrast to the rest of the UK, many enter higher education at the age of 17, having completed only one year of post-compulsory school education. An undergraduate MA degree programme therefore takes four years of full-time study, compared to the three years of the equivalent BA or BSc in England.

Arts and Social Science students are not admitted to particular disciplines, but to a general MA programme. In September 2001, over 1000 students were admitted to the MA programme. For their first two years they typically follow a modular programme of study, consisting of courses from a number of disciplines. At the end of their second year they finally choose their honours disciplines, which lead to a single or joint honours degree after a further two years of study.

Over 400 students take a course in economics in their first year, often as part of a programme leading to finance or management degrees as well as to economics per se. Given the low level of prior exposure to economics and the diverse nature of the clientele, first-year courses are relatively general in nature and do not focus heavily on the technical training often found in English degree courses in the discipline. In the second year, approximately 100 students take second-year macroeconomics, most of whom will go on to take degrees in economics, either as single honours or joint honours with other disciplines. Second-year courses must therefore bridge the gap between a very general first year and the demands of honours-level study, providing a sufficiently rigorous training to enable students who wish to proceed to cope with the rigours of honours study. For many students there is a substantial step up from first to second year, particularly in the transition from the mainly descriptive and diagrammatic analysis used in first year to the more technical algebraic treatments required in second year. One challenge of developing teaching methods in this environment, therefore, is to make this transition easier for students.

How and Why?

In this section we explain what our objectives were and how this particular implementation of computer aided assessment and independent learning helped us to achieve them.

As many of the relationships in macroeconomics are complex and interrelated, a vital skill for any macroeconomist is proficiency in model building and policy analysis. In the UK this centres on a modelling structure devised in the early 1960s by Mundell (who subsequently won the Nobel prize for work in this area) and Fleming. This powerful model enables practitioners to conduct, relatively easily, policy analysis appropriate to low-inflation open economies such as the UK, EU and USA. However, doing so effectively requires repeated calculations of an algebraic nature. Students thus face a double hurdle: not only must they master the intuition behind how government policy or external shocks affect the macroeconomy, they must also develop their algebraic and quantitative skills. The nature of these skills is such that they require repeated exposure and practice to master, which cannot be provided effectively in a traditional lecture setting; nor would the provision of sufficient small-group tutorials be viable given resource constraints. An innovative use of IT was therefore called for: one that facilitated the use of independent learning strategies by students, repeatedly exposing them to model-building techniques, but taking some of the "grind" out of the calculations so that they could focus more clearly on the underlying theory and policy analysis.
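
To give a flavour of the kind of repeated algebraic calculation involved, the sketch below solves a minimal linear Mundell-Fleming model under floating exchange rates and perfect capital mobility. The functional forms, parameter names and values are illustrative assumptions for exposition only, not the specification used in the course.

    # Illustrative sketch only: a linear Mundell-Fleming model under floating
    # exchange rates and perfect capital mobility (r = r*). Parameter names
    # and values are assumed for exposition.
    #
    # Consumption:  C  = a + b*(Y - T)
    # Investment:   I  = c - d*r
    # Net exports:  NX = x0 - m*Y + v*e   (a higher e is a depreciation)
    # IS: Y = C + I + G + NX    LM: M_P = k*Y - h*r    BP: r = r_star

    def floating_equilibrium(a, b, c, d, x0, m, v, k, h, G, T, M_P, r_star):
        """Return equilibrium income Y and exchange rate e."""
        Y = (M_P + h * r_star) / k              # LM with r = r* pins down income
        autonomous = a - b * T + c - d * r_star + G + x0
        e = (Y * (1 - b + m) - autonomous) / v  # IS solved for the exchange rate
        return Y, e

    base = dict(a=50, b=0.8, c=100, d=20, x0=40, m=0.2, v=10,
                k=0.5, h=40, G=200, T=100, M_P=300, r_star=5)
    print(floating_equilibrium(**base))                  # (1000.0, 19.0)
    print(floating_equilibrium(**{**base, "G": 250}))    # fiscal expansion: Y unchanged, e falls (appreciation)
    print(floating_equilibrium(**{**base, "M_P": 350}))  # monetary expansion: Y rises, e rises (depreciation)

Re-running such calculations under different policy settings, and tracing why output responds to monetary but not fiscal policy under floating rates, is exactly the kind of exercise the spreadsheet work described below automates.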

Furthermore, the QAA Economics benchmark document states that degree programmes in Economics should (among other things) enable students to "apply the knowledge and skills they have acquired to the solution of theoretical and applied problems", equip them with "appropriate tools of analysis to tackle issues and problems of economic policy", and give them "an ability to develop simplifying frameworks (and) appreciate what would be an appropriate level of abstraction". The model-building techniques and policy analysis developed in our project very clearly satisfy these key objectives. Moreover, the project clearly "develops a range of transferable skills of value in employment", including the use of computing skills specified in the benchmark document. Thus, the project satisfies discipline benchmarks as well as course and university requirements.

Given the resource constraints we faced in the current funding environment, any potential solution had to be very parsimonious in terms of both development and running costs. The first of our requirements was for powerful but user-friendly computational software. The University of Aberdeen had site licences for Microsoft Office 97 and planned to migrate to Office 2000 in due course, so there was assurance of software continuity in this respect. This suite includes Microsoft Excel, which provides substantial computational power and the ability to visualise results graphically in an easy-to-use Windows-based environment. It is also a package in use by the vast majority of small and large corporations around the world, and competence in its application was highly regarded as a transferable skill by our Employers Liaison Group (a panel of local entrepreneurs that advises the department on programme development and alumni relations, and currently includes the Director of the local chamber of commerce and the CEO of the Royal Bank of Scotland). Excel therefore stood out as the first choice in terms of maximum computational and graphics power at zero marginal cost.

Our second requirement was for software to deliver web-based material to students (including questionnaires to evaluate teaching and learning) and to provide information to them (including course information and the results of assessments). Our final requirement was a testing environment: software that would deliver the assessment, manage the on-line testing of students, and provide the results to the examiner. Today, packages such as WebCT and Blackboard could satisfy both of these requirements, in what we now term a Virtual Learning Environment or VLE. However, in 1999 the university had only just begun to trial WebCT, which involved a steep learning curve in comparison to the more beginner-friendly Blackboard. On the other hand, the department had substantial experience in using traditional web-based media in conjunction with the university's Oracle-based student management information system (HEMIS). A number of departments had also begun to trial Question Mark as their testing environment, with some degree of success. The university had acquired a site licence for the networked use of the testing environment, requiring departments only to purchase individual Designer and Reporter components (the Question Mark components used to design the assessments and to manage the marking and reporting of results, respectively).

The decision was therefore taken, following extensive trials and consultation, to use Question Mark as the testing environment and traditional web-based media for information dissemination and student management. The net cost of this was the Designer and Reporter modules of Question Mark. At the time, the university's Learning Technology Unit (LTU), an arm of the library and computer services conglomerate DISS, was operating a small grants scheme to promote the innovative use of IT in teaching, learning and assessment. Projects involved bidding for support time, the idea being that this support would enable innovative projects to be established, on the understanding that departments would maintain them in subsequent years. The department's bid was successful, and the project was helped greatly by the LTU experts in designing the CAA, conducting useful evaluations (both web-based and via focus groups), and writing scripts to automate the web integration.

What and When?

In this section we detail what we did in practical terms and the timescale we did it in. The second-year macroeconomics course takes place in the second semester of the academic year in Aberdeen. Teaching takes place over a twelve-week period commencing at the beginning of February, interrupted by the three-week spring holiday, which divides the twelve-week teaching block into two blocks of eight and four weeks respectively. In the first two weeks, students are entitled to 'shop around' and audit different courses, so it is important in this fluid environment that no material upon which assessment may be based is delivered in that period. We also wished to retain an essay-based piece of work as part of our overall assessment of the course, as we felt that this developed important academic skills. The topic that lent itself best pedagogically to this mode of assessment was growth theory, which was taught in the final four weeks of the course. These timetabling constraints thus required the independent learning and computer-aided assessment to be contained in weeks 3 through 8.

We used weeks 3 and 4 (see the Gantt chart) to begin the delivery of the web-based material, which provided the background study material that we expected students to use in preparation for attempting the computer-based work. Students could download this material in PDF format (we used Microsoft Word to edit the documents and equations, and Jaws PDF Creator to convert the documents to PDF). We were careful to use the same notation in the lectures (two two-hour lectures per week), tutorials (one hour, fortnightly), computer workshops (see below) and web material as in the textbook (Gordon's "Macroeconomics", now in its 8th edition).

In weeks 5, 6 and 7 we ran a series of three supervised computer workshops. The first workshop hour was concerned exclusively with Excel. Students were given the task of setting up the basic worksheet that would eventually enable them to complete the assessment. Written documentation in the course guidebook detailed a suggested structure for the worksheet and how to enter formulae (which was revision for them; see the box below). The formulae were obtained from the web material and supporting lectures, but needed to be rearranged appropriately for the tasks at hand. Entering the formulae was thus not a purely mechanical exercise and required students to think about what they were doing. The task was deliberately too large for the average student to complete in the supervised hour, thus requiring private study; it was explicitly stated to students that they would need to take advantage of the 24-hour computer suites available.
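
As a hypothetical illustration of the kind of rearrangement involved (the actual worksheet structure was set out in the course guidebook): the notes might state the LM curve as M/P = kY - hr, while a worksheet column plotting the curve needs the interest rate as the subject. In Python terms, the equivalent of such a cell formula is:

    # Hypothetical illustration: rearranging the LM curve M/P = k*Y - h*r so
    # that r is the subject, as a spreadsheet cell formula would require
    # (e.g. a formula like =(k*A2 - MP)/h copied down a column in Excel).

    def r_on_lm(Y, k, h, M_P):
        """Interest rate on the LM curve at income level Y."""
        return (k * Y - M_P) / h

    # Tabulate r over a range of incomes, as a worksheet column would.
    for Y in range(800, 1201, 100):
        print(Y, round(r_on_lm(Y, k=0.5, h=40, M_P=300), 2))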

Box: Information skills at Aberdeen

All students entering the Aberdeen MA programme are required to attend an information skills course in the first week of their first year. This provides students with benchmark skills in Microsoft Word, Eudora (email) and Netscape (web), as well as library skills (via WebCT). They are evaluated both before and after the course, and once the required skills are achieved they are passed off. Those who already have extensive IT skills are often passed off within the first day; most students succeed within the first week, and extra classes are run in the following weeks for those who do not. The senate grid states that further skills, such as Excel and PowerPoint, may be embedded in courses or acquired through dedicated courses run by DISS, which now lead to the ECDL qualification. In Economics, Excel skills are embedded within the year 1 microeconomics course (which has also been the subject of a CAA project; see http://www.abdn.ac.uk/diss/ltu/projects/econproj.hti). The second-year macroeconomics course therefore builds upon this provision.

In the second workshop hour they were tasked with completing the trial assessment - a dummy run for the real thing, giving them experience of the testing environment, Question Mark. This trial assessment gave them feedback after each question so that they were able to assess how well they were doing, and how to correct any mistakes in their spreadsheet. This second supervised workshop also gave them the opportunity to consult their tutor on any questions arising from their private study in the preceding week.

The real assessment then had to be completed over the next week and a half in their own time. A catch-up lab was held the following week for any students who had been ill or otherwise absent, or just required extra help and advice.

A further supervised workshop was held using the package LiveEcon, an interactive suite of independent learning software under development by The Enterprise Library Limited (TELL), which was being trialled simultaneously at Aberdeen and the Judge Institute, Cambridge. Students were introduced to the modules relevant to the Mundell-Fleming model (developed by the present author) to complement their independent learning experiences thus far. Many students enjoyed having direct input into the development of this package, since the program designer, Aberdeenshire entrepreneur Charles Jordan, regularly attended the workshops to get feedback directly from the students!

Students completed the real assessment on-line in their own time. It was closely based on the trial assessment, but used different coefficient values and some question variation. Each question a student was asked was drawn randomly from a library of similar questions, a security feature that made it very unlikely that any two students would be asked the same sequence of questions. For additional security, the logon time and terminal details of each student were recorded.
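
Question Mark handled this selection internally; purely as an illustration of the idea, a random draw from banks of similar questions might look like the following (the question texts are invented):

    # Illustration only (not Question Mark's actual mechanism): draw each
    # question at random from a bank of similar variants, so that two
    # students are very unlikely to see the same sequence.
    import random

    def draw_paper(question_banks):
        return [random.choice(bank) for bank in question_banks]

    banks = [
        ["Raise G by 10 and report the new Y.", "Raise G by 20 and report the new Y."],
        ["Raise M/P by 5 and report the new e.", "Cut M/P by 5 and report the new e."],
    ]
    print(draw_paper(banks))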

Once the deadline had been reached, the Reporter software was used to analyse the results. These were then processed and uploaded to the web, where students could access them together with a suggested solution, but only once they had completed an on-line evaluation questionnaire! We endeavoured to have this mounted a few days before the spring break so that students could go on vacation knowing how they had done. In subsequent years we managed to get the turnaround time down to as little as one day, meaning that we could push the deadline back and give students more time to complete the assessment.

[Gantt chart (2002): a November-to-May timeline over teaching weeks 1-12 (with the three-week spring vacation after week 8), showing the scheduling of question design, mounting of the independent learning web material, the workshops, the trial and real assessments, results and evaluation, and the traditional assessed essay.]

The questions are written during November and December, to ensure that whatever may happen in January, the system is ready for use in February. In the first year of implementation, the independent learning material was also produced in this period. The time required decreased in subsequent years, as only minor corrections were needed to the web material and the previous year's test could be used as a 'template' for the current year.

Evaluating the experience

From a purely pedagogic viewpoint, the project has been a resounding success. We feel that students' understanding of the model used and of the transmission mechanism of policy in the system has vastly improved compared with previous practice. This is borne out even in the final examination questions that cover these topics.

The mean CAA marks tend to be higher than for traditional essay-based coursework (which in turn are higher than for closed-book exams). There is also evidence of some degree of team working among some groups of students. On balance, however, we regard this positively. After all, even if the input of each team member were unequal, a student doing most of the explaining would benefit from the enhanced knowledge of the subject that we as lecturers know we gain from teaching, whereas those being explained to perhaps benefit from an alternative exposition and positive peer role models.

Focus group evaluation revealed that students found the set-up costs high in terms of work, effort and understanding, but the assessment relatively easy by comparison: understanding how the model works, and how to implement it in Excel, proved harder than answering the questions about the system. This too is positive, since it demonstrates that students were encouraged to do some harder 'deep learning', which in the long run they find more valuable. Comments like "the whole programme was a big boost in understanding the workings of macroeconomics" and "everything was well organised" sum up the general mood of the class.

We encountered a small number of technical problems, as is inevitable, but all were resolved. Because the icons for the trial assessment (which students can take as many times as they like) and the real assessment (which they can take only once) were similar in appearance, some students mistakenly began the real assessment and had to abort it against the rules. This was addressed in subsequent years by restricting the mount time and inserting alert screens. We also responded to a desire for more Excel guidance by greatly increasing this content in the course booklet. A summary of the problems and evaluations can be found at http://www.abdn.ac.uk/diss/ltu/projects/econ2proj.hti.

Conclusion

Finally, we may offer a few words of wisdom to those considering implementing a similar project. First, design the aims and objectives of the project for specific pedagogic reasons, appropriate to the conditions in your department and university. Ask how it will enhance student learning, rather than how much time or resources it will save (computer-aided assessment, though it may cut the marking load overall, rarely saves, and often adds to, the developer or coordinator's time!). Secondly, make sure that your aims are achievable given the resource constraints and support available: it is better to do it well on a small scale at first than badly on a large scale all at once. Thirdly, conduct meaningful evaluations to measure how well the objectives have been achieved in the appropriate timescale. Verbal feedback, for example through focus groups, can be tremendously useful, and be prepared to be flexible enough to respond to feedback both during and after the implementation. And don't forget to explain to your students why you are doing it, and to involve them in it. This way they will share in your objectives, and will help enormously with the smooth running and enjoyment of new experiences.
