In other words, yes, I give tests.
But here too, I try to match the kind of tests I write to the demands of the content being assessed. I do use objective questions (e.g., multiple choice) sometimes, and I use free response questions (e.g., essays) sometimes as well. Typically I give tests in class, but I have been known to do "take-home" tests as well if it seems like a better fit.
But I think I've stumbled onto an in-between of these two approaches, one my students responded to very favorably: a tech-mediated middle ground between an in-class test and a take-home.
So I looked at the quiz feature in Canvas, our course management system. After some thinking and tinkering, I settled on an approach that I thought might work well. I wanted students to spend time in preparation going through a review of the topics they had studied. For a traditional take-home test, I don't always get the sense that they do this. I also wanted them to have to synthesize on the fly, as they would in a traditional in-class test. But because I value getting students to think geographically (rather than memorizing a million facts), the structure of a take-home exam made sense to me. So here's what I did...
- I created a review guide to help them organize their thinking about some of the big themes we've been thinking about through the semester in light of the regions of the world we have been considering lately. I gave prompts for review along the lines of, "Be prepared to describe the different types of governments you would be likely to find in different regions of the world. For example, what would you find in North America? How about Central America? Western Europe? Have some examples ready to explain each region politically." I had similar prompts about the economics, population dynamics, levels of development, international relations, and unique features of each of the regions we have been exploring.
- In Canvas, I prepared a question bank with about 20 different questions included. Some of these I bundled together in sub-groups.
- I then created a "quiz" (Canvas calls them all "quizzes," even if they are tests...) composed of five short essay questions. Then the fancy part: I set up the quiz to pull questions randomly from the question bank. For some, I used the designated sub-groups to ensure that every student got one question about economic geography, one about political geography, and one about population dynamics. Thus, each student only had to answer five questions, but each of them would have a unique set of questions to answer.
- For the actual test, I set the quiz to have a fixed time limit (which I told students ahead of time). This was to help replicate a bit of the face-to-face test-taking experience, with a finite amount of time to write, demanding some synthesis on the spot. But to level the playing field, I specifically told students I expected them to have their text, notes, and other resources at hand as they answered the questions, as they would for a take-home test. The idea here is that they had to prepare well in advance, but they could also leverage their resources just-in-time as they were crafting their responses to the questions.
- For the test time, I set limits on when they could take it. The availability of the test was limited to a window of a few days, but students could take the test anytime that worked well for them within that window. Since I can check on when they take the test, I found it interesting to see that some did it in the morning, some in the afternoon, and some in the evening. (And...of course...there was that one student who did it at 1:00 in the morning...) But I thought this might be a real benefit for students as well: they can choose the time to write the test when they feel they are at their best for it.
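For the technically inclined: everything I clicked through in the Canvas interface can also be expressed through Canvas's REST API, which has endpoints for creating quizzes and for creating question groups that pull randomly from a question bank. Here's a minimal Python sketch of the payloads involved. To be clear, this is an illustration, not my actual setup; the course IDs, bank IDs, dates, and point values below are made-up placeholders.

```python
def build_quiz_payload(title, time_limit_minutes, unlock_at, lock_at):
    """Payload for POST /api/v1/courses/:course_id/quizzes."""
    return {
        "quiz[title]": title,
        "quiz[quiz_type]": "assignment",         # a graded quiz, i.e. a "test"
        "quiz[time_limit]": time_limit_minutes,  # fixed writing time once begun
        "quiz[unlock_at]": unlock_at,            # start of the multi-day window
        "quiz[lock_at]": lock_at,                # end of the window
    }

def build_group_payload(name, bank_id, pick_count, points_per_question):
    """Payload for POST /api/v1/courses/:course_id/quizzes/:quiz_id/groups.

    Each group draws `pick_count` questions at random from a question bank,
    which is how every student gets, say, one economic-geography question
    while no two students see an identical set.
    """
    return {
        "quiz_groups[][name]": name,
        "quiz_groups[][assessment_question_bank_id]": bank_id,
        "quiz_groups[][pick_count]": pick_count,
        "quiz_groups[][question_points]": points_per_question,
    }

# Placeholder values purely for illustration:
quiz = build_quiz_payload("Regional Geography Test", 60,
                          "2023-11-13T08:00:00Z", "2023-11-16T23:59:00Z")
groups = [
    build_group_payload("Economic geography", 101, 1, 10),
    build_group_payload("Political geography", 102, 1, 10),
    build_group_payload("Population dynamics", 103, 1, 10),
]
```

Each payload would then be POSTed to the corresponding endpoint with an instructor's API token. The point of the sketch is just to show that the three design decisions above (random draw per sub-group, fixed time limit, bounded availability window) each map to a single setting.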
How did they do? Overall, outstandingly well! They didn't have a choice in the exact questions they would answer, but they had freedom of support (through their resources) in responding to the questions assigned to them.
I surveyed students about their experience of taking the test this way, and here's what I learned:
- 74% thought the test was about as difficult as they expected. (17.5% thought it was harder than expected, and 8.5% thought it was easier.)
- 91% reported that they thought this test was an appropriate way to demonstrate what they had been learning about these regions of the world.
- 96% reported that they reviewed substantially before they began the test.
I asked them if they would like their final exam to also be offered in this format, and overwhelmingly they agreed. (All but two responses were a simple "Yes"; the other two were "Yes, but..." and offered suggestions for changes, which I'll be considering for the future.)
It was a good learning opportunity for me to try something new in terms of a tech-mediated assessment option. I'm so thankful for such good-natured students who are willing to try new things along with me! Based on this experience, I think I'll try this approach again in the future.