Evolving Assessment (1/15/2015)

posted Jan 15, 2015, 1:28 PM by Ellen Crews   [ updated Jan 15, 2015, 1:32 PM ]


When I started teaching, it was at a school that struggled with long-term “Program Improvement” status. As such, there was a strong emphasis on continual assessment. “Assessment drives instruction” was the foundation for much of our planning. This phrase, however, had developed an alternative interpretation which could best be summarized as, “The assessment drives instruction.” I worked with a team to write benchmark exams for the district which closely mirrored the content and format of the state’s standardized assessment, with the goal of accomplishing the dual purpose of predicting student performance and preparing students to be successful. Additional chapter tests were created using the same process. The format was multiple choice, and we carefully crafted questions that would predict future performance as well as wrong answers that reflected students’ foreseeable mistakes. After each test, our assessment software would generate a detailed report on the outcome, and we could look at trends for specific questions or standards. In addition, each student was assigned a color which reflected their overall performance.
A few years ago I surveyed my students and, among other things, they reported a high level of test anxiety.  They had been through years of tests like mine, with teachers continually stressing the extreme importance of successful performance.  To combat their fears, I made some immediate changes. I stopped giving chapter tests and replaced them with smaller, more frequent assessments.  I also stopped using the words “test” and “quiz,” calling my mini-assessments an opportunity to “Show What You Know.”  The multiple choice questions were gone, and students could earn partial credit for showing appropriate procedures.  I also stopped preparing students for the district’s benchmark exams. I administered them without warning, telling students, “Just do the best you can and don’t worry about it.” Surprisingly, scores did not suffer from this lack of preparation and emphasis.  Student anxiety, as measured by subsequent surveys, decreased and students started enjoying math class.
The next year, my team decided to institute a weekly assessment, and made the shift from grading with points to using a rubric. We were looking at mastery of the content standards, while emphasizing the Eight Mathematical Practices found in the Common Core State Standards. This was very different from the tests our students had taken in the past, and our students became increasingly frustrated. They did not know how to demonstrate their thinking, and all of our suggestions and examples seemed to go over their heads. We shifted to a new rubric which emphasized process over mastery. For the next few months we focused on the Mathematical Practices and on increasing students’ willingness to tackle complex problems. We changed the names of our assessments to “Show What You Can Do” to emphasize the importance of process over outcome. Gradually, as students became more comfortable with this new type of assessment, we moved back to grading on content.
This year, I started at a new magnet school and worked with a team to develop a general rubric to guide grading across all content areas. I adapted this rubric to fit what I valued in math. The assessments also changed. I have moved away from weekly assessments, and instead throw them in whenever I want information. I try to assess with two questions at a time. One question relates to concepts that students have experience with, and I consider that assessment to be summative. The other question is based on concepts to which students have had little or no exposure. These I count as formative. Students may use their notes as a reference, and may ask questions during the assessment. If I do not have the time to provide individual feedback, I do provide an example of a response that would receive a 4. Students have the opportunity to review the example and reflect on how it differs from their own work before having a second chance at the assessment.
I’m still not sure of the overall impact of these changes.  I have noticed that many students are putting in more effort than before, and really striving to do the best that they can.  Students who tend to struggle have learned to at least try, because they know that they can get a score of 1 or 2 for their efforts.  And I have a frequent source of information about my students’ progress to help drive my instruction.
