With a warm front pushing north from the Gulf of Mexico, the weather in Chicago was unseasonably mild in early January as the delegates gathered for this year's ASSA Convention.
Included among the many sessions organised by the various associations coming under the ASSA umbrella (by the way, it did rain heavily a couple of times!) were several which will be of interest to CHEER readers.
Betty Blecha and Tod Porter organised two sessions for the American Economic Association covering applications of CAL in Economics (in the classroom and on the Internet). Bill Goffe and Bob Parks gave the latest instalment of their series of talks on what's on the Internet for Economists (three showings to ensure that as many people as possible could see the show) and they also organised a roundtable discussion on the implications of the current and future state of the Internet for academic journal publishing in economics.
The first CAL session had three presentations which focussed on the use of the Internet in teaching and learning economics: a presentation prepared by Ros O'Leary (but delivered in her absence by me) entitled Advances in Co-operative Teaching Applications of the Internet in the UK, one by Joe Daniel from the University of Delaware with the title Computer Aided Instruction on the World Wide Web: The Third Generation which featured a demo of his oo_Micro! site, and then me again with a joint paper written with Barry Murphy on Interactive Quantitative Economics on the Web.
Ros's paper looked mainly at three projects, CTI Economics, Biz/ed and SOSIG, all based at the University of Bristol in the Institute for Learning and Research Technology, but each with inputs from people elsewhere and with links to resources at other sites. A major theme of the paper was that there can be substantial benefits from cooperation and coordination of effort. The examples illustrate that when such projects are suitably structured they can produce high quality material, available to all via the Internet.
CHEER readers shouldn't really need to be told about the CTI Economics project. One of 24 subject-based centres funded by the government to encourage the use of learning technologies in UK Higher Education, CTI Economics has had a Web service since 1994.
Here you can read CHEER on-line, search the Economics software catalogue, get the latest news on conferences, workshops and other events, connect to other resources such as Biz/ed, SOSIG and the WinEcon Web site, or view specially prepared reports on the latest technologies (see for example the latest report on Java and Streaming Audio & Video).
Biz/ed is a gateway site for business and economics students, teachers and lecturers. It contains selected data sets and mirrors for other data providers (Office for National Statistics, Penn World Data Tables, US Census Bureau etc.), company facts (The Body Shop, BMW etc.), tutor support and study skills material and links to other key national and international Web sites. It has an on-line searchable Internet resource database, also accessible from the CTI Economics pages, which can be used to search for quality resources by keyword.
SOSIG is a Web based service for social scientists, providing a gateway to thousands of high quality resources on the Internet. Again there is a searchable on-line catalogue which delivers a description of the contents and quality of each item and a link to the resource itself. SOSIG exercises a high degree of quality control and every resource in the catalogue has been carefully selected and described by a librarian or subject specialist.
Searching the SOSIG catalogue for the phrase "economic growth" found only six resources, but they are all high-quality resources. You will notice on the screen that there is a Thesaurus which can be used to provide alternative terms to try. The Thesaurus is derived from the HASSET Thesaurus, which was developed by the Data Archive at the University of Essex (another example of the cooperation and collaboration which is a feature of these projects).
Funding from the European Union's DESIRE project has enabled SOSIG to develop a system to support collaborative cataloguing of resources across Europe. A pilot scheme involving the National Library of the Netherlands is currently underway.
Ros's talk ended with some brief comments about other potential future developments, including the possibility of a link-up with the SCOUT project at the University of Wisconsin.
Joe Daniel has developed a Web site called oo_Micro! which provides an on-line multimedia world for teaching, learning and applying microeconomic theory. The oo in the title stands for Object-Oriented; Joe has used object-oriented programming techniques in Java to represent economic models as graphical objects that can be drawn, manipulated and animated directly on the Web page by pointing and clicking with the mouse.
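To give a flavour of the technique (a minimal hypothetical sketch in the spirit of Joe's applets, not his actual code, and with invented names throughout), an economic curve can be wrapped up as an object which carries its own parameters and can report the price at any quantity; a market object then holds two such curves and recomputes their intersection whenever a parameter changes, which is all the animation step needs to do:

    // MarketModel.java - hypothetical sketch, not code from oo_Micro!
    public class MarketModel {

        // A curve is an object: it carries its parameters and can
        // report the price at any quantity.
        static class LinearCurve {
            double intercept, slope;              // p = intercept + slope*q
            LinearCurve(double a, double b) { intercept = a; slope = b; }
            double priceAt(double q) { return intercept + slope * q; }
        }

        // A market holds two curve objects and solves for their intersection.
        static class Market {
            LinearCurve demand, supply;
            Market(LinearCurve d, LinearCurve s) { demand = d; supply = s; }
            double equilibriumQuantity() {
                // a_d + b_d*q = a_s + b_s*q  =>  q* = (a_d - a_s)/(b_s - b_d)
                return (demand.intercept - supply.intercept)
                     / (supply.slope - demand.slope);
            }
        }

        public static void main(String[] args) {
            LinearCurve demand = new LinearCurve(10.0, -1.0);  // p = 10 - q
            LinearCurve supply = new LinearCurve(2.0, 1.0);    // p = 2 + q
            Market market = new Market(demand, supply);
            // "Animate" an outward demand shift: update the curve object and
            // recompute the equilibrium (on a Web page, redraw the graph).
            for (int step = 0; step <= 3; step++) {
                double q = market.equilibriumQuantity();
                System.out.println("demand intercept " + demand.intercept
                    + "  ->  q* = " + q + ", p* = " + demand.priceAt(q));
                demand.intercept += 1.0;
            }
        }
    }

In the applet proper the redraw of course happens on the page rather than at the console, but the object structure - curves which know how to evaluate themselves, a market which knows how to solve itself - is the essence of the approach.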
The project has its origins in a Hypercard stack produced by Joe, but in early 1996 he began work on preparing a Web version using Java.
There are four parts to oo_Micro!
Joe gave a brief demo starting from the Home Page (see Figure 3) and taking us through some of the pages in oo_Micro!, including the section covering General Equilibrium and the use of the Edgeworth Box (see Figure 4).
Figure 3: oo_Micro! Home Page. Click on the image to visit the page.
Figure 4: oo_Micro! animated graphics for General Equilibrium with Production. Click on the image to visit the page.
oo_Micro! has many attractive features. The graphs that you see are real graphs of actual functions, not drawings. You can change the parameters, click the animate button and watch the curves move to their new equilibrium position. Having the instructions and mini lectures in audio form provides a convenient alternative to the text (although the text can still be read on screen or printed/saved for reference later). Because the audio clips operate through RealAudio a user has complete control over them; you can start, stop, rewind and replay as required. oo_Micro! is pretty well exhaustive as far as intermediate micro goes - there is a full table of contents which you can use to take you to the models you want to work with.
Joe briefly discussed some of the problems he had faced in constructing his web pages. The Java code is supposed to be cross-platform but Joe had found particular problems with the IBM ThinkPad. It should also work with different browser software but Joe had found that new versions of the software would often break something that was OK before. At present Java doesn't allow you to specify the width of output (the number of decimal places) and because of this the results could occasionally be messy. Joe expects this to be addressed in later versions of Java.
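For what it is worth, output width can already be controlled by hand in early Java using plain integer arithmetic, and from JDK 1.1 the java.text.DecimalFormat class fixes the displayed precision directly; a minimal sketch (round2 is my own hypothetical helper, not anything from oo_Micro!):

    public class RoundDemo {
        // Round x to two decimal places without any formatting library:
        // scale up, round to the nearest whole number, scale back down.
        static double round2(double x) {
            return Math.round(x * 100.0) / 100.0;
        }

        public static void main(String[] args) {
            double messy = 2.0 / 3.0;             // 0.6666666666666666
            System.out.println(round2(messy));    // prints 0.67
            // From JDK 1.1 onwards one can instead write, for example,
            // new java.text.DecimalFormat("0.00").format(messy)
        }
    }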
Those of us at the session were very impressed by the results that Joe has achieved so far. Joe apologised for the use of his own voice on the RealAudio clips - he said his funds didn't stretch to the use of a professional Irish tenor. He hasn't yet decided how to market the product - getting users to pay in some way would generate funds for improvements and extensions.
In the final paper of the session I described and illustrated another approach to putting interactive microeconomics on the Web. As part of the ongoing CALECO Group research programme on computer-based learning, Barry Murphy has been investigating the use of symbolic algebra packages such as Mathcad, Mathematica and Scientific Notebook for creating interactive quantitative economics material.
In assessing various versions of the symbolic algebra packages it became clear that this type of software offers a potentially straightforward route for authoring powerful computer-based learning resources in economics. It enables the author to produce fully interactive quantitative economic models using standard mathematical notation, as much additional text as is deemed suitable, and live-linked graphs which are redrawn whenever parameter settings are altered. One can either build new worksheets from scratch within the software, or load in new or existing text from whatever word-processor the author is used to working with. Using the specialist software one can enter mathematical expressions and equations into the document in a natural form. Any required hypertext links can also be identified by the use of tagging or mark-up functions. The software is also able to undertake any necessary mathematical processing (both numerical and symbolic) of the equations provided to it and can produce graphs of relationships stated in algebraic form. It provides an excellent environment both for generating and using material with guaranteed mathematical integrity, without the need for advanced mathematical skills on the part of the author or user.

The latest versions of these programs have begun to harmonize themselves with the World-Wide Web, both by incorporating menu commands to give access to the Web and by operating as helper applications to standard Web browsers such as Netscape or Internet Explorer. This offers an alternative to the HTML/JavaScript/Java applet approach to the authoring of Web documents which contain interactive mathematical and graphical elements. It is more powerful (it can handle symbolic algebra as well as ordinary numerical equations) and, we believe, it is easier to program.
Initially using Mathcad 6.0 (because this was the first of this group of programs to offer a Web browser facility), Barry has produced a full set of interactive resources for Intermediate Microeconomics which he has been using on the second-year undergraduate course of that name which he teaches.
The Mathcad Resource Site contains a set of approximately 90 interactive linked Mathcad worksheet files written by Barry Murphy containing lecture material and exercises for the entire course. The files covering the individual topics consist of a combination of explanatory text and separate regions containing equations and parameter settings and associated graphs. Each is quite short with an interactive exploratory experiment at its heart where the user is invited to alter the settings of one or more key parameter values (highlighted in yellow) and to observe the effect on the graph. Typically the user should respond to a message such as
Make (small, sensible) changes in the parameter highlighted above and observe the results in the diagram. Make a note of your results.
The material builds up gradually with users first visually exploring what happens when parameter settings are changed and then learning more about the theory which underlies the models. There are hotlinks to files containing associated material (underlined text indicates a hypertext link) and each section ends with a summary of the points covered and suggested further reading.
Users are not required to work through the mathematical manipulations themselves (although they should attempt to understand the essential structure of each model); the Maple engine which is built into Mathcad handles that. However, students with above-average mathematical ability are encouraged to try to work things through themselves by the use of messages such as
If you are feeling adventurous you can confirm your results algebraically.
Because the files are accessible via the Web students can work through the material as many times as they wish, experimenting with different parameter values until they have understood the properties of each of the models and the lessons to be learned from them. The aim is that students should understand the underlying logic of the models, not just be able to reproduce sketched diagrams.
Navigating through the files, a student can select a particular topic and work through the material it contains. For example, if the user selects Consumer Theory and then Consumer Preferences, he or she will open a worksheet discussing preference relations and indifference classes. Various cases are discussed and illustrated, including Cobb-Douglas preferences (see Figure 5). The student is invited to explore the way in which the properties of the preferences are shaped by the key parameter c, and to note the results. Here the student is asked to establish that the indifference curves are steeper, the larger is the parameter c.
Figure 5: The interactive Mathcad worksheet for exploring Cobb-Douglas preferences
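To see why this is so, suppose (a sketch on my part - the exact parameterisation in Barry's worksheet may differ) that the Cobb-Douglas preferences take the form U(x,y) = x^c y^(1-c) with 0 < c < 1. The slope of an indifference curve is given by the marginal rate of substitution:

\[
MRS(x,y) \;=\; \frac{\partial U/\partial x}{\partial U/\partial y}
\;=\; \frac{c\,x^{c-1}y^{1-c}}{(1-c)\,x^{c}y^{-c}}
\;=\; \frac{c}{1-c}\cdot\frac{y}{x}
\]

At any fixed bundle (x,y) the factor c/(1-c) is increasing in c, so a larger c does indeed make the indifference curves steeper - exactly what the worksheet asks the student to discover by experiment.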
The worksheets are carefully designed for the staged development of understanding. When the students move on to a later worksheet on utility functions they again meet the Cobb-Douglas example. This time they can see how the parameter c enters into the equation for the utility function (see Figure 6).
Figure 6: The interactive Mathcad worksheet for exploring the Cobb-Douglas utility function
We call the approach that the student must take guided exploration. At the end of each section a summary is provided of the concepts introduced and the lessons which should have been learned, together with references to texts such as Varian and the workbook by Bergstrom and Varian.
The experiments this year have verified that the approach is practicable. Barry estimates that about 90% of the time spent producing the material would have been taken up in planning and writing lectures even if they had been delivered in the traditional way. Only 10% of the time was used in translating the material into Mathcad form and installing it on the Web. Having course material available in this way obviates many of the problems which a lecturer faces with a large student group of heterogeneous backgrounds and skills. Although Barry has not dispensed with the regular weekly lecture, he can use the software to illustrate his points and students can then go to the labs and work through the material again (and again if necessary) at their own pace. Students who miss the lecture don't hassle him for handouts, since they know it can all be found on the Web.
Further examination is continuing with alternative mathematical helper software. Versions of the files for some of the topics have already been produced using both Mathematica 3 and Scientific Notebook, and both these programs provide a feasible alternative to Mathcad. A decision to choose one program rather than another rests partly on the functionality offered in terms of the power and ease of use of each package, partly on the cost and availability of the software on university and student machines, and partly on whether other economists elsewhere are producing material of this sort and making it available on the Web. For example, if a lot of good material were being produced using Mathematica and placed on the Web, we would consider accommodating that fact.

Mathcad 6 is available for the Mac, as is Mathematica. Scientific Notebook is not (yet, and no promises) but Scientific Word and Scientific Workplace are, so a Mac version seems likely if Scientific Notebook has a big take-up. With Mathcad one can import text via the clipboard or paste text from an OLE server as a linked or embedded object using the Windows Packager utility. Files can be exported as Rich Text Format (RTF) files readable by most word-processors. Scientific Notebook can open LaTeX, text and RTF files. Mathematica provides the option of saving notebooks in HTML or TeX format. The detailed exploitation of these facilities will be studied at a later stage of this project.
In addition to experimenting with different helper applications packages we would like to extend the approach to some other courses. Already people teaching courses in econometrics and corporate finance have expressed an interest in extending the pilot into their units.
A panel of discussants then commented on the three papers. Richard Wood remarked that the speed of change in the use of computers in economics was as fast as ever. Fascinating new things were being done with these new software tools, but he personally still found that spreadsheets provided the ideal computing environment. Students and (most) staff are familiar with them and you can get students both to work on previously authored spreadsheet models and to produce their own from scratch. They are very flexible in how they can be used, and it might be better to stick to one familiar type of software rather than introducing lots of other new types.

Roger McCain prefaced his remarks by saying that there were many good things he could say about the presentations, but with limited time it would be more useful to concentrate on the criticisms. Any economist must be pleased to see new interactive economics software tools, but he noticed that nothing had been said about monitoring student use of the software. When using the Web, cookie technology might be of some assistance here. In commenting on the issue of whether it was easier to program in Mathcad or Java, Roger emphasized that there is a difference between the Java language itself, JavaScript and Java applets. Whilst it might be time-consuming to program from scratch in Java, one could reuse prewritten templates (emphasizing the Object-Oriented nature of the tools) and this could reduce the production time substantially. On the question of the helper applications called up by Web browsers, Roger said this brought up the usual standards question: would people agree whether it was better to work with Mathcad, Mathematica or some other helper application? With all its foibles, at least JavaScript is standard, and it was for this reason that he preferred Joe's approach.

Tod Porter talked about the learning curves faced by students and faculty with new software. Most people now know how to use a browser, and for this reason he welcomed the new generation of CAL material that was being developed for the Web.
The second session began with a presentation by Willem Bier of the IMF Institute describing and demonstrating a CD-ROM produced by the Institute (which is the training department of the IMF) on Exchange Rate Analysis. Willem emphasized that their customers are mid-career professionals with either a good academic background, a lot of work experience in economics, or both. The CD-ROM is a full multimedia introduction to Exchange Rate theory and practice and includes a Case Study on Poland. It is designed so that it can be used wherever you are, so long as you have access to a computer with CD-ROM and multimedia capabilities. The package has three parts: (i) a tutorial which outlines the key concepts and issues of Exchange Rate Analysis and Policy, (ii) a case study on Poland and (iii) a supplementary resources section which contains text material on all of the topics covered on the CD-ROM, including a full glossary. The first two parts are fully multimedia in form, with an audio commentary, video clips and animated diagrams and charts to put over the ideas.
As he talked, Willem demonstrated the program. Figure 7 shows the opening screen and Figure 8 shows a typical screen from the program, in this case part of the case study section.
Figure 7: Exchange Rate Analysis opening screen. Click to download the full-size picture.
Figure 8: Part of the case study section of the Exchange Rate Analysis CD-ROM
As you can see from Figure 8, the program has easy-to-use controls, both within the screen (for example, to pause the video clip) and in the bar at the bottom of the screen for navigation and access to other facilities such as the calculator or notepad (where the student may add their own comments). Transcripts of the text of all the video clips are available in the resources section of the program, and you can have the transcript and video clip running together in different windows if you wish.
The case study takes the form of a practical exercise: you have been called in as an adviser, and you are given access to all the papers and the chance to hear from all the main players before you submit your final report (which is how users are assessed on their understanding of the material).

Having now had a chance to work through the CD-ROM I can say that it is a very impressive teaching and learning tool, one that I would certainly want to use if I had to teach a course in this subject. It also provides an excellent medium for economists who are not exchange rate specialists but who would like to have an up-to-date knowledge of the area.
The next talk was on Classroom Multimedia: The Paper Component by Jim Clark of Wichita State University. Jim said that the purpose of the paper was to look at a way to improve student learning in multimedia-based classes, based on both personal experience and research findings. As background he said that he has been using computer and media technology in the classroom since 1982. A problem for students has always been that material is often presented at a speed which is too fast for note-taking. A typical solution (and one that he tried) is to give out printed copies of all the screen displays. Handouts of this type clearly have some advantages (students like them and at least they have accurate notes to take away with them) but, as many people who try this approach discover, some students think "Why go to the class? I will just get a friend to pick up a copy of the handout for me." Attendances drop, and of course some students miss a lot because they can't always understand the notes if they were not at the session.
Jim decided to look at the research on note-taking, looking particularly at how note-taking can affect learning. He hoped to find an answer to the question "Is there a useful way to structure note-taking in multimedia classes?" What he found is that there are believed to be three stages in getting information into memory: perception, short-term memory and long-term memory.
Concentrating on the first stage, we find that data from the environment hits all our receptors and only a little is processed further. The objective is to get students to give their full attention to what you are trying to show them. Here good presentation and the use of animations can help. Moving on to the next stage, selected data are held in memory for only a short period of time and there is only a very limited memory capacity. Research at AT&T suggests that only 7 ± 2 chunks of data can be remembered (which is why phone numbers are never longer than this and why 800 numbers are easiest to remember). To get the data into the long-term memory it must be encoded or structured in a way that people can make sense of. The long-term memory capacity is virtually unlimited (for students anyway - some of us may get disk full symptoms as we get older!). To be remembered, data must be such that people can make connections to things they already know. We also know that repeated exposure to information helps people to build these connections.
Turning now to what this means for note-taking, research suggests that notes have two functions: (i) the encoding function (the act of writing helps get the data into the long term memory) and (ii) the external storage function (the notes are there for you to refer back to).
Research on the encoding function gives mixed results. Note-takers usually (but not always) learn more than people who just listen (it only works if the notes are accurate!). But - and this is the key point - students who start with some pre-structured notes learn more. This gives us a policy suggestion: provide some partial notes which will be recognised as incomplete (and therefore students will have to turn up to complete them) but organise them in such a way as to help students structure the notes that they make.
The note-taking system should:
Needless to say, Jim's talk followed his own guidelines. The key points of his talk were projected on the screen in a series of colourful, nicely designed computer-generated slides, and a handout with all the information from the slides was distributed to all those present. But (i) certain keywords (such as those underlined above) were missing and had to be filled in by the audience, and (ii) the right-hand side of each sheet of the handout was blank, giving people plenty of space to write in additional comments if they wished.
Jim said that students appear to appreciate the structured notes approach. They are even willing to pay for them!
Note: Jim provided a list of useful references (including Cohn et al., Journal of Economic Education, 1995, pp. 291-307 and Kiewra et al., Journal of Educational Psychology, 1991, pp. 240-245). You can contact him at jeclark@twsuvm.uc.twsu.edu if you want the full details.
The third presentation in this session was by the husband-and-wife team of Jean-Pierre and Catherine Langlois, on Teaching Game Theory with Software, in which they demonstrated the beta version of their program GamePlan. Jean-Pierre is a mathematician and Catherine is an economist, and the project began because they were trying to produce a program to solve a stochastic game that Catherine was working on in her research. (It involves two competing firms making investment decisions under conditions of uncertainty.) As Jean-Pierre worked on the problem he realised that it might be worth extending the range of the program (covering more Game Theory cases) and making it more user-friendly so that other people could use it in their teaching and research. The result is GamePlan, which should be ready for full release later this year.
Jean-Pierre commented that beyond the standard basic problems of Game Theory it could be a frustrating subject to teach. Good results might be available on existence for some of these problems, but it was difficult to give practical problems to students as assignments because they would take too long or be too difficult to solve. A good software package for the PC would offer a way forward; students would still need to do intelligent work in understanding the solutions and be creative in the way that they used the program, but if it was properly done it could put the focus on the games and their structures rather than on computational matters. The major problem in creating the program was not so much ensuring that the computer algorithms worked correctly as designing it so that it was user-friendly and easy to use.
GamePlan can be controlled either by clicking on keywords in the menu (File, Player, Node, Move, Solution, Display, Help, etc.) or via icons. Games can be created, displayed in different ways, solved and saved for later use. Initially you have a large blank white screen area, which is where you work. Jean-Pierre illustrated the look and feel of the program by showing us how it deals with the simple Battle of the Sexes game. The software allows you to create players, put them where you want on the screen, edit them, and then display the information in a variety of different ways before you examine the solution(s). Jean-Pierre showed the information relating to each player's options and payoffs, using a coloured stereotype picture to distinguish them: a little blue matchstick man for the man and a little red matchstick woman for the woman.
The software allows three types of solution concept: classic Nash equilibrium, perfect (perfect Bayesian, which combines strategy with a system of beliefs) and sequential. GamePlan offers three views of the solutions: Pure, Explore (which seeks all possible solutions) and Sample (useful in large games with many solutions). Jean-Pierre showed us how the solutions are displayed on screen.
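As a toy illustration of what the Pure view must compute (a sketch using the textbook Battle of the Sexes payoffs, not GamePlan's own code or numbers), a pure-strategy Nash equilibrium is simply a cell of the payoff matrix from which neither player gains by a unilateral deviation:

    public class BattleOfTheSexes {
        public static void main(String[] args) {
            // payoff[row][col][player]: the man chooses the row, the
            // woman the column; 0 = Opera, 1 = Football.
            int[][][] payoff = {
                { {1, 2}, {0, 0} },   // man plays Opera
                { {0, 0}, {2, 1} }    // man plays Football
            };
            String[] name = { "Opera", "Football" };
            for (int r = 0; r < 2; r++) {
                for (int c = 0; c < 2; c++) {
                    // The man must not gain by switching rows...
                    boolean rowBest = payoff[r][c][0] >= payoff[1 - r][c][0];
                    // ...and the woman must not gain by switching columns.
                    boolean colBest = payoff[r][c][1] >= payoff[r][1 - c][1];
                    if (rowBest && colBest)
                        System.out.println("Pure Nash equilibrium: ("
                            + name[r] + ", " + name[c] + ")");
                }
            }
        }
    }

This finds the two pure equilibria, (Opera, Opera) and (Football, Football); the game's third, mixed-strategy equilibrium takes rather more work, which is precisely where a solver like GamePlan earns its keep.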
Next we were shown the Pay-Raise game, where three lawmakers must vote on whether or not to raise their own pay. At least two yes votes are needed to secure the raise, which the players would obviously like, but it looks bad to constituents if you vote in favour of the proposal. In this game the software distinguished each of the players by using different colours (red, blue and green). Jean-Pierre showed us how GamePlan can display the game, examine the strategies and produce all the different solutions. He showed how the game is transformed if secret ballots are introduced: the number of solutions is reduced to four, and thus you can illustrate the importance of the rules of the game.
Jean-Pierre showed us several other games to illustrate the different game structures that GamePlan can handle (including Catherine's stochastic game, which led to the software being developed in the first place). It was hard to understand the game, watch the software and take notes which would be meaningful a couple of months later as I write my report for CHEER (another illustration of the points Jim Clark was making!). I have two more pages of scribblings, but nothing that I can make out. Suffice it to say that the demonstration of GamePlan was extremely stimulating, dealing with complex games in a powerful yet flexible and visually appealing way. As Betty Blecha said, "It was a real treat." I can't wait to see the finished version, which I understand should be available by May and will sell for under $100. In the meantime, if you are interested and want to make contact with Jean-Pierre you can e-mail him at langlois@math.sfsu.edu.
The final presentation in the session, Why Multimedia is Better, was by a team from the Georgia Institute of Technology: Richard Cebula, Willie Belton and John McLeod. They have produced two multimedia CD-ROMs called Microeconomics Alive and Macroeconomics Alive to help them teach the subject in a way which makes it, well ... come alive.
Richard began by saying that the project was motivated by the need to address a number of key issues - larger classes, fewer teaching assistants and a growing perception that the subject is both boring and difficult. Students today are more visually oriented (MTV etc.) and less inclined to read books. In any case the rising price of books means that fewer of them will buy the textbook in the first place.
What is needed for these students is an approach which is engaging and entertaining, which encourages them to play and investigate but at the same time exposes them to the core ideas of the subject and enables them to see their relevance. Microeconomics Alive attempts to do this by using cartoon characters in a fictitious but realistic story about a fast food restaurant (selling Ostrichburgers!). John gave a brief demo to illustrate how it works. As the story unfolds, the owner of the restaurant, Marge, faces a variety of economic problems which can then be linked to the underlying economic concepts. For example, as demand for the Ostrichburgers increases the owner takes on more staff, but is surprised to find that profits don't rise in proportion. Tables and charts illustrate what happens to revenues, costs and profits, and the user can then consult a lesson on the subject which explains the difference between average and marginal costs, short- and long-run cost curves and the relationship between production and costs.
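The underlying lesson is easy to state in symbols. A minimal illustrative cost function (my own example, not the one in the software) is

\[
C(q) = 2 + q^2, \qquad AC(q) = \frac{2}{q} + q, \qquad MC(q) = C'(q) = 2q
\]

At a fixed price of 7 per Ostrichburger, doubling output from q = 2 to q = 4 doubles revenue (from 14 to 28), but rising marginal cost means total cost triples (from 6 to 18), so profit creeps up only from 8 to 10: Marge's puzzle in miniature.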
The program covers the core areas of microeconomics (demand and supply, elasticities, production and costs, perfect competition, monopoly and oligopoly, etc.) and in each section users can choose to look at the simulation, examine data using the spreadsheet and charting tool or look at the lesson. The hope is that the student will become sufficiently interested to take the software home and play with it.
The CD-ROM is not linked to any particular textbook, and at the Georgia Institute of Technology it is assigned in addition to a textbook and the regular lectures. This year 225 students were registered for the course and the results so far are very encouraging. Lecture attendance has gone up, not down. Students have become more interested in the subject and have recognised the relevance of the material. Office-hour calls have gone down, but the questions raised have become more sophisticated and interesting. Feedback has been almost universally positive and the results have improved. The speakers emphasized that this is not a testing system - there are no quizzes. The sole object of the material is to encourage the student to learn about economics.
Microeconomics Alive is published by South-Western (ISBN 0-538-84650-X).
Contact Richard Cebula at richard.cebula@econ.gatech.edu for further information.
Teresa Riley and Betty Blecha were discussants for this session.
Teresa observed that the software
However, we need more formal analysis of the effectiveness of the approach (not just anecdotal evidence).
Betty addressed the navigation issue. Students must be able to go back up the route they have taken, and this wasn't always possible in multimedia software. The IMF program was good here, however. Turning to the issue of taking notes, she said that students take notes for a reason - it forces you to organise your thoughts. There is still too much passivity in some CD-ROM products.
Those of us who attended these sessions, particularly the presenters, are grateful to Betty and Tod for their work in organising the program and ensuring that everything ran smoothly on the day. You can find information about the plans for next year at http://userwww.sfsu.edu/~bjblecha/cai.htm
My report on the Bill Goffe/Bob Parks What's on the Internet for Economists? Demo and Update will be brief. Not because there wasn't much covered - far from it. As usual it was packed full of information, and even the most well-informed member of the audience was sure to have discovered something they didn't already know during the session. No, the reason I can be brief is that the dynamic duo have set up a web site at http://wuecon.wustl.edu/~goffe/ASSA.98.html which contains the handout for the talk in three different formats (text, WordPerfect and MS Word), the PowerPoint display which they used to hold together the presentation (also available in HTML format) and a web page with the bookmarks for all the sites visited in the session (this can also be viewed as a regular HTML file). Superb, gentlemen!
I will just pick out a few points to highlight. We looked in at some very impressive material at http://www.fuqua.duke.edu/programs/gemba/techdemo/techdemo.htm#1 [this is no longer available publicly - web editor], prepared for the Global Executive MBA Program at Duke University. This includes slides, sound and video clips (using RealPlayer Plus) in a highly structured multimedia World Wide Web environment. We then went to a site specially prepared for a conference on New Institutional Economics, which included a video clip of Ronald Coase talking at the conference. We also saw live TV pictures from France, illustrating the way in which the Internet is becoming a viable broadcast medium. We saw the AltaVista language translation service and looked at some of the increasing number of journals with Web sites and on-line viewing of papers (and some new journals which are exclusively on-line, although not in economics).
The section on class material rightly identified the pathbreaking work done by Joe Daniel with oo_Micro! (not just his use of Java applets but also the use of the RealPlayer software to add sound to enhance the delivery of material). Other sites mentioned in this section of the talk were Nouriel Roubini's Understanding the World Macroeconomy page at http://www.stern.nyu.edu/~nroubini/MACRO3.HTM and Campbell Harvey's Financial Toolbox at http://www.duke.edu/~charvey/.
There was much, much more covered in this fast-moving presentation. Don't forget also to pay regular visits to Bob's Economics Working Paper Archive and Bill's Resources for Economists on the Internet.
At the special roundtable session on the Economics Profession and the Internet, organized by Bill Goffe and Bob Parks, the panel was asked to consider a number of questions concerning academic publishing and the Internet.
Bill Goffe chaired the session and the panel was made up of Bob Parks, Malcolm Getz and John Siegfried from Vanderbilt University, Tim Taylor of the University of Minnesota (also Editor of the Journal of Economic Perspectives) and Zachary Rolnik of Kluwer Academic Publishers.
On the first question most of the speakers agreed that a change in the technology which allows different forms of delivery of papers (so you can read the paper on screen or download it to print out) didn't affect the crucial issue of quality control. It was argued that the editorial process adds significant value to papers and for this reason editorial review and publication are inextricably linked. Taylor said that the biggest problem in academic publishing is that there is too much of it - the quality is too low. Publishing on the Internet could make this worse. Enthusiasts were kidding themselves. On the Internet people can write anything they like (almost!) but there would be no guarantee that anyone would read it. He felt we should read more and write less. In this way costly publication can be beneficial in helping to ensure good quality. Bob Parks reminded us that in other disciplines, such as Maths and High Energy Physics, publishing via the Internet didn't mean the abandonment of the refereeing of papers. He also pointed out that publication via the Internet could give better certification of which papers were being accessed and read.
Following on from this it was argued that the quality of a journal is directly related to the quality of its editorial board and associated referees. Siegfried said that if the costs of publication could be reduced, perhaps it was time that the funds released were used to pay the referees. Other panelists agreed that it was time to look again at the current process of two (or three) referees and the editor.
It was suggested that good journals can create economic rents. The price of a journal reflects not just the costs but also the value! The intriguing question was posed: what would be the current market value of the AER and the Journal of Economic Perspectives? Someone suggested $10 million, which caused a moment's stunned silence on the part of the Editor!
Some panel members doubted that electronic submission and publication of papers would substantially reduce costs. Siegfried reckoned that about half of the AER's costs go on distribution and thirty per cent cover certification. There is no agreed common electronic format for the submission of papers, and publishers say it is still cheaper to typeset from scratch than to try to edit documents submitted in electronic form.
It was suggested that putting the costs of publication onto the author might provide suitable incentives, for example in improving the quality of the first draft. This is an area where the electronic format plays a role in making it easier to revise documents. If there was an additional charge for submission in a non-standard format, this could provide an incentive to ensure that authors met the common standards. The discussion looked at related issues, such as whether in practice the university would pay the submission fee on your behalf (some journals already have submission fees) and whether authors could then be paid royalties for papers that have been accessed via the Internet. Siegfried said that authors having to pay for submission and publication could result in more inequality in journal publishing, with a "winner takes all" or star effect developing.
Parks emphasized that with publication on the Internet it would be possible to track access to individual articles rather than the whole journal. If royalties were based on access over time then authors like Black and Scholes (who had problems getting published to begin with) would get a revenue stream over a long period of time. This would give authors an incentive to spend more time on improving the quality of their work. The aim would be not just to get published but to have many readers. Another advantage of Internet publishing is that hypertext links would mean that readers could immediately refer to cited papers.
Tim Taylor also acknowledged that libraries are facing a storage problem with journals and that electronic publishing would help here. Reference was made to the JSTOR project (see http://www.jstor.org/). Perhaps individual libraries could also set up their own local stores (in electronic form on the Intranet) of key articles on reading lists? Siegfried reminded us that many members of the AEA still don't have Internet access.
On the last question there was general agreement that it is helpful for readers to have access to the data used in papers, as well as special algorithms and other information which couldn't be included in the paper due to lack of space. It was pointed out that many publishers already offer this, but the take-up was less than 10%. Taylor said he was against making it mandatory to supply data - and there was no guarantee that just because a data set was supplied it would be the correct one. There was some discussion about the role of replication in science, and it was agreed that there would be an incentive to get things right if you knew that a thousand students would be checking the results and contacting you by e-mail if they found a discrepancy.
Questions were then invited from the audience. One contributor said that with electronic publishing responses to articles could become interactive - comments about errors or refinements could be linked to the original paper, as is already happening in Maths. It was also suggested that some younger people might prefer electronic publication; they are more used to searching the Internet than going to libraries. Another said that he didn't understand the generally skeptical tone coming from the panel: publishers and editors should think more about the users' point of view than their own vested interests.
Overall, with the exception of Bob Parks, I found the panel's answers rather negative. Someone commented that it was no surprise that the innovations were mainly in the areas of maths, computing and high energy physics and made a disparaging remark about techno-geeks. It was a pity there was no representative of a university library on the panel, as what libraries decide to do about electronic journal subscriptions is likely to be an important determinant of how quickly things change. Still, it was an interesting discussion which, I think, was at its best when it focussed on objectives and incentives rather than just on costs.