In my post last week, I gave an account of the student feedback on the use of clickers in the College of Science initiative at NUIG. On the whole it was very positive, and it encouraged the College to expand the use of clickers to include 1st and 2nd year undergraduates. But what about the staff issues?
Three group meetings were arranged during the academic year, at which all staff involved in the clicker project were invited to come along and discuss progress, issues, problems and successes, and to give suggestions. Initially well attended, the meetings saw numbers drop significantly in the second semester. The issues that came up during the year reflected some of the findings of Roger, Angel and Jennifer.
Initially there was concern among some staff members that they were required to use clickers in their teaching. The strong message that came from the Dean's office was that, while nobody was forced to use the system, it was strongly recommended that staff come to training and consider how clickers might be used. Ultimately, uptake varied across different disciplines.
Consistency of Use
There was a concern that, if clicker questions were not regularly used across all subjects, students might stop bringing them to class, thereby lessening the effect in those subjects that did use them. Although the use of clickers was not consistent across subjects, the survey of students at the end of the year indicated that 66% of respondents brought their clickers to every class, and a further 26% brought them to most classes.
There were a number of discussions and concerns around the use of clickers solely for the purpose of monitoring attendance. Everything in the literature indicates that this is not a good use of clickers, and that it is likely to encourage students to bring along five or six devices on behalf of their friends. That said, over time the data could be used to monitor trends in student participation and to identify students at risk.
Through discussion, it was decided that clickers do not provide a reliable method of tracking attendance and that they are best used as a learning tool. At the same time, where students find the activity valuable, clicker use can itself have the effect of increasing attendance.
At the end of the first semester, the Physics lecturers polled students (using clickers) about the possibility of awarding marks for participation, based on clicker use in class. The response was overwhelmingly in favour of using clickers and getting marks for participation. As a result, Physics students were awarded 5% of their overall mark for participation in the second semester.
This corresponds to what Roger Freedman described as low-stakes clicker credit, in comparison to high-stakes credit, where marks are awarded for a correct answer. Roger suggests that the choice of low- or high-stakes credit for participation can change the dynamics in a class. While there is no difference in the learning gain, high-stakes credit can stifle student discussion.
There was some initial concern that the time used in lectures for clicker questions would result in less time to cover content, and ultimately this did prove to be the case. However, this also raises questions about the responsibility of the student in the learning process, and how much they can be expected to read outside of class. This has the effect of starting to transform the underlying pedagogy to accommodate increased interaction and participation in class.
Use of clicker questions can give very valuable feedback to the lecturer who is concerned about content. A well-designed question can indicate whether a class is following a lecture, for example. As Jennifer Kaplan stated, you may be covering the material, but are they getting it?
Another, related, concern was that the flow of a lecture is broken when a clicker question is asked. Students may begin to chat, and it can be difficult to bring the focus back to the lecture. In fact, a clicker question should be disruptive, encouraging a student to think about what is being asked. In a peer instruction situation, students are encouraged to discuss the question in pairs or groups.
Roger Freedman suggested that the best clicker questions are challenging, with multiple plausible answers, that reveal student confusion. Conflict leads to drama and gets the discussion going.
In particular, we found that the adoption of clicker technology is unique to the culture and context of teaching within each discipline, and that this observation had to be factored into the training needs of different groups. The experience of peers is particularly persuasive, and we were lucky to have two academic staff members within the College who already had extensive experience of using clickers in their teaching. They both gave practitioner workshops at the start of the year, and their encouragement carried real weight.
Relating the NUIG experience to that of the experts in the webinar series, many of the issues (particularly around attendance monitoring, credit for participation, and the desire to cover content) are common. Some good tips I've picked up from the speakers are:
- Clickers are best used as a teaching tool, but even the best tools can be misused. (Roger Freedman)
- Integrate clicker questions into your lecture, don't treat them as an add-on! You lecture less, and the students think more. (RF)
- Use clickers regularly, and tell students why you are using them. (RF)
- Explain regularly and often why you are using clickers and how the students will benefit. (AH)
- Experiment! See what works best for you and your students. (RF)