Assessment Literacy: What educators need to know and understand about assessments to strengthen teaching

by Dr. Bryan R. Drost

Whether we like it or not, assessment is a hot topic for everyone in education right now: How do we know students have learned? How have we formatively assessed? Are teachers making the right instructional adjustments based on assessment for learning? Are we asking kids the right questions? The list goes on.

What all of these questions bring to mind is the fact that educators, now more than ever, must have a solid understanding of assessment literacy. Chappuis et al. (2011) describe assessment literacy as follows: “Assessment-literate educators…come to any assessment knowing what they are assessing, why they are doing so, how best to assess the achievement of interest, how to generate sound samples of performance, what can go wrong, and how to prevent these problems before they occur.”

As exciting as assessment literacy is to me as a psychometrician, my reality as a curriculum director, principal, professor, and teacher has been that many educators have never learned to write and use assessments in this manner. On the one hand, I don’t hold them personally responsible, as it is something that has never been taught to them. On the other hand, I do hold myself responsible for teaching them, because “how a teacher tests–the way a teacher designs tests and applies test data–can profoundly affect how well that teacher teaches” (Popham, 2003). As research has shown time and time again, the connection between teaching and assessment is critical. When that connection is understood, instruction improves (Drost, 2012).

To help solve the assessment problems in my previous district, I built a professional learning community (PLC) for my teachers over the course of the year. To start, I had teachers list all of the assessments we were giving throughout the district, as I had heard many times that we were giving too many. This helped teachers reflect on what we were actually doing and think carefully about assessments in terms of formative and/or summative purposes. It also encouraged a bit of buy-in to this assessment literacy PLC, as we were home to the opt-out movement in our county and several of my teachers wanted assessments gone, period.

Next, we analyzed three articles to give the group a shared knowledge base related to assessment.

By exploring these articles, the team learned that if we wanted to improve instruction for our students and continue to meet state standards, becoming more data literate needed to be our mission.

Now that minds had been primed, I worked with my teachers to understand the test development process, a process I have personally experienced at the state and national levels several times. I focused on the idea that data is evidence of student mastery and that data is only effective if it ties back to claims: if students can answer a particular type of question correctly, an instructor can infer what those students know or can do. I asked teachers to bring some of the assessments they had been giving, and from a psychometric perspective, we began looking at the questions in terms of validity: Did the assessment questions give evidence of student learning tied back to the standard?

To help support teachers in answering this rather lofty question, I utilized a template for a Table of Specifications found in Guskey’s (1997) text Implementing Mastery Learning. After analyzing our own assessments (resource-specific or teacher-created) and finding that we had significant gaps in relation to student learning, we began to analyze Ohio’s released items in science and social studies against performance level descriptors and blueprints. Ah-hahs quickly showed up: staff members were surprised to see that state-written questions that were once “just too difficult for students to answer” were in fact aligned to the standards and their expectations.
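
To make the tool concrete, here is a simplified sketch of what a table of specifications can look like; the sub-targets and item counts are invented for illustration and are not from Guskey’s template or from our district’s actual assessments. Each row maps a learning sub-target to the number of items planned at each Depth of Knowledge (DOK) level:

  Learning sub-target                              DOK 1   DOK 2   DOK 3   Total
  Identify primary and secondary sources             2       1       0       3
  Compare accounts of the same historical event      0       2       1       3
  Evaluate the credibility of a source               0       1       2       3
  Column totals                                      2       4       3       9

Reading across a row shows whether a sub-target is assessed at the intended level of complexity; reading down a column shows whether the assessment as a whole leans too heavily on recall.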

Following these ah-hahs, we as an instructional staff were ready to design assessment tasks that met the levels of understanding students needed to demonstrate. Utilizing Scalise’s chart on assessment question types, we back-mapped assessments using a table of specifications and developed the following filters as things to look for as we wrote new questions (a brief worked example follows the list). Each filter is designed to get at the overarching question: What evidence of student learning are you going to look for?

Filter 1 – Identify skills in learning sub-targets to assess

  • Have you identified all of the skills in the learning sub-targets that the assessment should represent for all students?
  • Have you included questions/answer choices that allow students at each level to show you where they are in their mastery?

Filter 2 – Identify the level of rigor and/or Depth of Knowledge of the learning sub-targets

  • Is the degree of rigor for each skill identified by its approximate level according to Bloom’s Taxonomy and/or Depth of Knowledge?
  • Have you accounted for the content literacy standards?
  • Do you have items in place that allow students to show what they know at the right level of complexity?

Filter 3 – Determine types of assessment questions

  • Have you determined the appropriate types of assessment questions to assess the skills?

Filter 4 – Additional considerations

  • Is the assessment concise, with no more items than necessary?
  • Will the scoring of the assessment measure the level of mastery for each learning target?
  • Is there vocabulary in the question that would impede student understanding of what is being assessed?
  • Do you have differentiated materials for students to communicate their responses?
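
As a brief worked example (the item is hypothetical, not one of our district’s actual questions): suppose a fourth-grade reading question asks students to select the sentence from a passage that best supports the author’s claim. Under Filter 1, it targets the sub-target of citing textual evidence; under Filter 2, it sits at roughly Depth of Knowledge level 3, since students must evaluate support rather than simply locate information; under Filter 3, an evidence-selection format fits that skill better than straight recall would; and under Filter 4, we would still check whether a word such as “claim” blocks students from showing what they know and whether the scoring distinguishes partial from full mastery.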

At the end of the year, I was proud to say that we had solid assessments in language arts and math at many grade levels that aligned to the standards. Were they perfect? No! Did tweaks need to happen to some of them? Definitely. Did kids misinterpret questions? Yup! Did some teachers need to adjust instruction in relation to their data? Of course! Does the staff still have room to grow? Completely! Did we strengthen our teaching as a result of this process? Absolutely!