Electronic response systems. A good resource, or a waste of time?
In a recent post, Stephen Downes says:
I read from time to time about in-class voting response systems, and while I think it's a great pub game, I don't see the merit in education.
He isn’t the only one. When I introduced a trial voting system into a training/awareness-raising session for education colleagues in a Local Authority several years ago, they were unimpressed, and could not see the educational potential. At first.
I don’t see voting systems as a panacea. I do not even see them as an end point, so much as a place from which to start. Nevertheless, with some thought, they can be very useful indeed. Here are 10 ways in which they might be employed.
- For a quick mental arithmetic workout at the start or end of a lesson. Project ten questions on the whiteboard, and set up the voting system to allow only X seconds per question. The activity is a good one in itself, but it will also give the teacher a very good idea of widespread problems in a particular area. For example, if almost none of the pupils obtain the correct answers for the multiplication questions, that’s an indicator that perhaps this area needs to be revisited. It may require further checking out – perhaps the pupils were having a bad day – but it’s a quick way of getting that sort of overview.
- You could do the same sort of thing for literacy, of course. For example, a set of cloze questions like “They got into there/their/they’re car” will give the teacher a quick picture of whether work needs to be done in this area.
- It does not take a great leap of imagination to see how the previous two examples might be extended to education technology or any other area of the curriculum. For example, quick tests on terminology may not be the most exciting thing in the world, but it is important to ensure that pupils use the correct terms for spreadsheet cells, database fields, photographic resolution and so on.
- The idea can be extended to before and after tests. Starting a new topic? Run a quick test to see how much the pupils already know, or think they know. Keep a record of the scores, and run the same test again at the end of the project. If there has been an overall improvement, keep a record of that and bring it, and similar results, to the attention of any visiting inspector. If there has not been an overall improvement, it needs looking into. Now, I emphasise again that the idea of this sort of test is to give a quick impression. I don’t think that, without a great deal of thought and preparation, they can be regarded as valid or reliable in the technical sense of those words (see Assessing ICT Understanding for more information).
- As I hinted in the foregoing point, the technology can be used to check out pupils’ misconceptions. For example, in economics there is a popular misconception that cutting public (i.e. state) spending and raising taxes by the same amount would both take the same amount of money out of circulation. In fact, raising tax is the less dramatic of the two, because it falls partly on money that would have leaked out of domestic circulation anyway. If the government cancels a one million dollar project, that takes one million dollars out of circulation. If it instead raises one million dollars in taxation, it takes out less than that, because not all of it would have been spent within the economy in the first place: some would have gone on imports, some into savings, and some would have been taken in sales taxation (see the sketch after this list for the arithmetic). A teacher could use a voting system to discover whether pupils were aware of such subtleties.
- As hinted at in some of the points above, you can set up a student response system (to use the proper term) with students’ actual names. Although there’s a slight overhead in terms of setting up time, it’s worth doing because then you can see, once the hurly burly of the lesson is over, who “got” it and who didn’t. That will help you to give a more personalised level of instruction next time and/or to do some remedial work with a group of youngsters or even the whole class.
- A great development in the last couple of years has been the facility to construct surveys on the fly. Let’s say you decide, because of an item on the news that day, to have a discussion with the students on the advantages and disadvantages of making your location public when using services like Twitter or Facebook. When it’s time to find out how many people are for or against, simply write the options on the whiteboard, assign a number or letter to each, and ask everyone to vote (assuming you have the right hardware and software, of course). You’ll get an instant graph, and that can lead to further discussion.
- Even without that facility, you can use the system to assist in decision-making. When I was working at a senior level in a Local Authority, I introduced an electronic voting system into some of the sessions at our staff “away days”. Voting systems are not a substitute for discussion by any means, but they can help make the time spent on it more productive.
For example, let’s say you have been given 20 minutes in a staff meeting to discuss educational technology training for the whole staff. You might want to begin by asking colleagues to vote on one of three options: no formal training, but a drop-in “surgery” instead; no face-to-face training at all, but a repository of videos and podcasts; or weekly one-hour after-school training sessions on different applications. Obviously, the answer may be some combination of all three, but having a vote right at the start may indicate, for argument’s sake, that nobody at all wants to rely solely on videos and podcasts – in which case there is no point in wasting your remaining time on it: you’d be better off talking about how the other two options might work.
- You can also use the response system to gauge before-and-after responses. Take the preceding example. Let’s say you had an hour with the staff, or at least a bit longer than just 20 minutes. You could start off by asking colleagues to vote for their preferred mode of in-service training, talk for 10 minutes or so on each one, answer questions, and then run the vote again to see whether there has been a general change in preference. Again, like the classroom tests, the aim of this is not to give you or anyone else the authority to make the final decision, but to help you get a feel for where people stand on the matter. In other words, used judiciously, it can become a very valuable part of the decision-making process.
- It should not take too much effort to see how this very same approach could be used to help students make their own informed choices about what topic to do their project on, what contribution to make in the local community, which charity to support or what activities to run on Open Day.
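To make the arithmetic behind the economics example above concrete, here is a minimal sketch in Python. The leakage shares for imports, savings and sales tax are invented purely for illustration, not taken from any real economy; the point is simply that a tax rise withdraws less from domestic circulation than a spending cut of the same size.

```python
# Illustrative arithmetic only: the leakage shares below are assumed figures,
# chosen to show the principle rather than describe any real economy.

def withdrawal_from_tax(amount, import_share=0.15, savings_share=0.10, sales_tax_share=0.05):
    """How much domestic spending disappears if `amount` is raised in tax.

    Part of the taxed money would have leaked out of domestic circulation
    anyway (imports, savings, sales tax), so only the remainder counts.
    """
    leakage = import_share + savings_share + sales_tax_share
    return amount * (1 - leakage)

spending_cut = 1_000_000                      # cancelling a $1m project removes the full amount
tax_rise = withdrawal_from_tax(1_000_000)     # raising $1m in tax removes only part of it

print(f"Spending cut removes: ${spending_cut:,.0f}")
print(f"Tax rise removes:     ${tax_rise:,.0f}")
```

With those assumed leakages, cancelling the project removes the full $1,000,000 from circulation, whereas the tax rise removes only $700,000 – which is the subtlety a quick misconception quiz could probe.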
As with any other kind of technology, the strengths of electronic voting systems become apparent when they are used in conjunction with other technologies (whiteboards, visualisers/document cameras) and plain old-fashioned good practice.
The photo is © Ann Oro at http://www.flickr.com/photos/njtechteacher/