Thursday, November 16, 2017

Learning to Teach Again: Testing Trials

I am a big believer in closely matching my assessment vehicles to what I want students to know, understand, and do. I think that the way we assess students matters, and I try to use a variety of different kinds of assessments to help me understand what my students understand. This means I use some very informal in-class assessments like quick-writes, Padlet boards to capture their questions, and even monitoring the conversations in small group discussions. But this also means I use a variety of formal, summative assessments that require students to synthesize their learning.

In other words, yes, I give tests.

But here too, I try to match the kind of tests I write to the demands of the content being assessed. I do use objective questions (e.g., multiple choice) sometimes, and I use free response questions (e.g., essays) sometimes as well. Typically I give tests in class, but I have been known to do "take-home" tests as well if it seems like a better fit.

But I think I've stumbled onto a middle ground between these two approaches, one my students responded to very favorably: a tech-mediated hybrid of an in-class test and a take-home.

A conference in Florida in November? Sounds like a plan!
I tried this out pragmatically in World Regional Geography last week, because I had scheduled an exam on the syllabus for a week I was going to be out of town at a conference. I debated having one of my colleagues just proctor the exam in class, as I've done before. I also debated just giving them a take-home exam, which I have sometimes done before, but haven't always been happy with, because it feels more like another paper to grade. (Not that this is all bad, mind, but if I advertise it as a test on the syllabus, some students feel like being assigned a paper to write is a bit of a bait-and-switch.)

So I looked at the quiz feature in Canvas, our course management system. After some thinking and tinkering, I settled on an approach that I thought might work well. I wanted students to spend the time in preparation going through a review of the topics that they had studied. For a traditional take-home test, I don't always get the sense that they do this. I also wanted them to have to synthesize on the fly, as they would in a traditional in-class test. But because I value getting students to think geographically (rather than memorizing a million facts), the structure of a take-home exam made sense to me. So here's what I did...

  • I created a review guide to help them organize their thinking about some of the big themes we've been thinking about through the semester in light of the regions of the world we have been considering lately. I gave prompts for review along the lines of, "Be prepared to describe the different types of governments you would be likely to find in different regions of the world. For example, what would you find in North America? How about Central America? Western Europe? Have some examples ready to explain each region politically." I had similar prompts about the economics, population dynamics, levels of development, international relations, and unique features of each of the regions we have been exploring.
  • In Canvas, I prepared a question bank with about 20 different questions included. Some of these I bundled together in sub-groups.
  • I then created a "quiz" (Canvas calls them all "quizzes," even if they are tests...) composed of five short essay questions. Then the fancy part: I set up the quiz to pull questions randomly from the question bank. For some, I used the designated sub-groups to ensure that every student got one question about economic geography, one about political geography, and one about population dynamics. Thus, each student only had to answer five questions, but each of them would have a unique set of questions to answer.
  • For the actual test, I set the quiz to have a fixed time limit (which I told students ahead of time). This was to help replicate a bit of the face-to-face test-taking experience, with a finite amount of time to write, demanding some synthesis on the spot. But to level the playing field, I specifically told students I expected that they should have their text, notes, and other resources at hand as they answered the questions, as they would for a take-home test. The idea here is that they had to prepare well in advance, but they could also leverage their resources just-in-time as they were crafting their responses to the questions.
  • For the test time, I set limits on when they could take it. The availability of the test was limited to a window of a few days, but students could take the test anytime that worked well for them within that window. Since I can check on when they take the test, I found it interesting to see that some did it in the morning, some in the afternoon, and some in the evening. (And...of course...there was that one student who did it at 1:00 in the morning...) But I thought this might be a real benefit for students as well: they can choose the time to write the test when they feel they are at their best for it.
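For readers curious about the mechanics, the random-draw logic described above can be sketched in a few lines of Python. This is only an illustration of the idea (one question drawn from each themed sub-group, plus two from the rest of the bank), not Canvas's actual implementation; the question labels and group names are invented for the example.

```python
import random

# Hypothetical question bank, grouped by theme as in the setup
# described above. Question labels are placeholders, not real items.
QUESTION_BANK = {
    "economic": ["econ Q1", "econ Q2", "econ Q3", "econ Q4"],
    "political": ["pol Q1", "pol Q2", "pol Q3", "pol Q4"],
    "population": ["pop Q1", "pop Q2", "pop Q3", "pop Q4"],
    "general": ["gen Q1", "gen Q2", "gen Q3", "gen Q4",
                "gen Q5", "gen Q6", "gen Q7", "gen Q8"],
}

def draw_exam(bank, rng=random):
    """Draw a five-question exam: one question from each themed
    sub-group, plus two distinct questions from the general pool."""
    exam = [rng.choice(bank["economic"]),
            rng.choice(bank["political"]),
            rng.choice(bank["population"])]
    exam += rng.sample(bank["general"], 2)  # two distinct picks
    return exam

# Each call produces a different five-question set, so every
# student sees a unique (but structurally parallel) exam.
print(draw_exam(QUESTION_BANK))
```

Because every draw is independent, two students are unlikely to see the same five questions, yet every exam covers the same three required themes.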
How did they do? Overall, outstandingly well! They didn't have choice in the exact questions they would answer, but they had freedom of support (through their resources) for responding to the questions that were assigned to them. 

I surveyed students about their experience of taking the test this way, and here's what I learned:
  • 74% thought the test was about as difficult as they expected. (17.5% thought it was harder than expected, and 8.5% thought it was easier.)
  • 91% reported that they thought this test was an appropriate way to demonstrate what they had been learning about these regions of the world.
  • 96% reported that they reviewed substantially before they began the test.
I asked them if they would like their final exam to also be offered in this format, and overwhelmingly they agreed (all but 2 responses were "Yes," and these two were "Yes, but..." and offered suggestions for changes, which I'll be considering for the future).

It was a good learning opportunity for me to try something new in terms of a tech-mediated assessment option. I'm so thankful for such good-natured students who are willing to try new things along with me! Based on this experience, I think I'll try this approach again in the future.


  1. Another fantastic post Dave! Love the mix you suggest. Also like how you are creating tests that go beyond DOK 1 and 2. Gotta keep higher level learning the goal of what we do as educators.

    1. Thanks, Shannon! Agreed--I'm all about HOTS. There's a time and place to memorize...but it should always be in the service of deeper learning.