Using PollEverywhere instead of clickers

Jennifer Imazeki, San Diego State University
Published on this site November 2011

This case study is reproduced from Prof. Imazeki's "Economics for Teachers" blog post of 24 June 2011, under the terms of its Creative Commons BY-NC-ND licence. Visit the original post for comments and related posts.

Months ago, I mentioned that I was part of an ITS pilot of PollEverywhere this past spring. Quick reminder: PollEverywhere is a web-based service where anyone can create a multiple-choice or open-ended question and people can respond via text, Twitter or website. I first used PollEverywhere in the fall, when I wanted a way for my teams to submit open-ended responses. The free version only allows up to 30 responses per poll, which was fine for 13 team responses but wouldn't work for individual responses (since I have 75 students in each of my sections), so I used clickers for any individual responses. In the spring, the University bought a PE account subscription so there could be unlimited responses. It also meant that students could register and have their responses recorded, so I could use PollEverywhere as a replacement for clickers. In this post, I'll explain the mechanics of how I used PollEverywhere and some of the associated pluses and minuses. In my next couple of posts, I'll talk about what the students thought and my overall impressions.

Low-stakes assessments: I mostly used PE to have students submit their individual responses to multiple-choice application problems before they discussed those same problems in their teams. My main concern was making sure that students had to think about the problem individually, at least a little, and commit to an answer before discussion. I didn't care so much which specific answer they chose, so students received credit just for answering anything (i.e., participating); there were only a couple of times when I made credit dependent on selecting the 'right' answer. PE allows you to embed polls in PowerPoint, and that is what I did, rather than switching over to the website each time. One downside of PE, relative to clickers, is that there is no timer, so I created one using animation in PowerPoint. It's a clunky workaround, but if you want to give students a visual indication of how much time they have to answer a question, I'm not sure what the alternatives are.

Grouping questions: One thing I had to decide was whether I wanted to 'group' my questions together or not. The way PE normally works, each answer choice has a randomly-generated unique keyword; for example, if you want to choose answer A, you send in '70101', and if you want to choose answer B, you send in '70103', etc. With a paid account, you can also create your own keywords to replace the random numbers (but they still have to be unique, since the keyword identifies both the question and the specific answer choice). An alternative is grouping multiple questions together and assigning a keyword to the group. Once you do that, respondents send in the group keyword before any questions are asked; they get a response saying they are enrolled in that 'event'. Then the codes for individual answer choices within the group are numbered 1-9 and then alphabetically. That is, say the first question in the group has five answer choices; they would be numbered 1, 2, 3, 4 and 5, so students who wanted to select the fifth answer would only have to text in the number '5'. If the next question in the group also has five answer choices, they would be numbered 6, 7, 8, 9 and A, so students who wanted to select the fifth answer there would text in the letter 'A'. This can get a little confusing since most of us are used to talking about answer choices as A through E, but I got used to it. For me, grouping questions together made it easier to keep my poll questions organized. I created one group for each class session, for each section, and named the group with the date and section time; for example, May11AM for my 11am class and May11PM for my 2pm class. As soon as students got to class, they knew they should text in the day's keyword so they would be ready to go when the first question came up. Here's what the first slide of every class looked like (this was up as students walked in):
[Image: the opening slide, showing the day's group keyword for students to text in]
And here's what a typical question looked like (note that the strip along the side was my 'timer'):
[Image: a typical poll question slide, with the timer strip along the side]
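To make the group numbering scheme concrete, here's a small Python sketch of how the codes fall out (the function name and question sizes are my own illustration, not anything from PE itself): choices are numbered 1-9 across the whole group, then continue alphabetically.

```python
import string

# '1'..'9' followed by 'A'..'Z', matching the numbering described above
CODES = string.digits[1:] + string.ascii_uppercase

def group_codes(num_choices_per_question):
    """Assign a one-character code to every answer choice in a group."""
    codes, i = [], 0
    for n in num_choices_per_question:
        codes.append(CODES[i:i + n])
        i += n
    return codes

# Two five-choice questions: the fifth answer of the second question is 'A'
print(group_codes([5, 5]))  # → ['12345', '6789A']
```

So a student answering the fifth choice on question two texts 'A', even though on paper we'd call that choice 'E'.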
No integration with Blackboard: In order to give students credit for their PE responses, they first have to create accounts in PE and, if they want, register their cell phones (so any response sent from that phone number is automatically connected to their account). If they don't register their phones, they have to log in each time and submit responses using a browser. Another drawback of PE, relative to clickers, is that it is not integrated with Blackboard, the course management system. This means there are extra steps for students (registering on a separate site) and extra steps for me. To get their daily points into Blackboard, I had to create a 'report' in PE, download that to Excel, make any necessary adjustments in Excel (such as giving credit for right answers versus just participation, or summing up the points for the day), then upload to Blackboard. A colleague in the business school who also piloted PE this spring has apparently developed an Excel macro that takes care of some of the Excel manipulations, but I just did things manually. For me, the extra work wasn't a huge issue, but one thing that was frustrating was that in order to upload to Blackboard easily, I asked students to change their identifier to their University ID number (the default when they create their accounts is their email address). By the third week (and after multiple reminders), almost all students had done this, but I had two students (one in each section) who never made the change; since I stopped making the adjustment for them after Week 3, this meant that their PE points were zero for every single class and they STILL didn't figure it out! [Note: if PE were used a lot more across campus, so this happened in all their classes, I have to assume they would eventually fix it, but I'm still amazed...]
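If you'd rather script the Excel step than do it by hand, the adjustment is simple enough to sketch in a few lines of Python. This is purely illustrative: the column names ('User ID', 'Response') are my own stand-ins, and an actual PE report download may label its columns differently.

```python
import csv
from collections import defaultdict

def daily_points(report_path, correct_answer=None):
    """Sum a day's points per student from a downloaded report.

    By default, one point per response (credit for participating);
    pass correct_answer to give credit only for the 'right' answer.
    """
    points = defaultdict(int)
    with open(report_path, newline='') as f:
        for row in csv.DictReader(f):
            if correct_answer is None or row['Response'] == correct_answer:
                points[row['User ID']] += 1
    return dict(points)
```

The resulting ID-to-points mapping can then be written out in whatever column layout the Blackboard grade upload expects.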

Dealing with multiple responses: Another issue I had to consider was how to handle multiple responses. With most clicker systems, students can change their responses as long as the question is open, and the system will simply retain the last answer submitted. With PE, you can set an option to allow only up to X responses per person, or unlimited responses; if you choose to allow multiple responses, PE records every response separately (every response is time-stamped). PE can also send a confirmation text so students can verify their response was received (this is an option you can turn on or off). In my case, since it usually didn't matter which specific answer a student selected, I set things up so they could only submit one response; on the few occasions where their specific choice 'mattered', I made sure to tell students that they needed to be extra careful before sending in their responses, since they would only get one shot at it. My colleague in the business school allowed multiple responses and then used his Excel macro to count only the last submission for each student.
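Because every PE response is time-stamped, the "count only the last submission" rule is easy to reproduce. Here's a minimal sketch (not my colleague's actual macro, just the same idea): sort by timestamp and let later answers overwrite earlier ones, which mimics how clickers retain the last answer.

```python
def last_response_per_student(responses):
    """Keep only each student's latest answer.

    responses: iterable of (student_id, timestamp, answer) tuples,
    mirroring PE's time-stamped records.
    """
    latest = {}
    for student, ts, answer in sorted(responses, key=lambda r: r[1]):
        latest[student] = answer  # later timestamps overwrite earlier ones
    return latest

responses = [
    ('alice', '10:01', 'B'),
    ('alice', '10:03', 'C'),  # Alice changed her mind
    ('bob',   '10:02', 'A'),
]
print(last_response_per_student(responses))  # → {'alice': 'C', 'bob': 'A'}
```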

Next time, I'll share some of the feedback I got from students...

p.s. While I was working on writing this post, InsideHigherEd had an interesting article on standardization of clickers that mentions cell phones replacing clickers. And if you're more old-school, ProfHacker just posted an article about low-tech alternatives to clickers.

 
