What if… All Educators Were Assessment Literate?

– April 16, 2020

by Douglas G. Wren, Ed.D.

Doug Wren is a consultant and author of the book Assessing Deeper Learning: Developing, Implementing, and Scoring Performance Tasks. He recently retired from his position as Educational Measurement & Assessment Specialist with Virginia Beach City Public Schools. Before coming to Virginia Beach, Doug was Director of Research & Evaluation for the DeKalb County School District in Georgia. To learn more, go to https://wrenedconsult.com or check out his free course in the Edjacent Teachable School, Introduction to Assessment Literacy at https://edjacent.teachable.com/.

Like many educators, I felt teaching was a calling. Like some, I came to teaching as a second career. Returning to grad school after earning my certification was inevitable—I couldn’t get enough education about education.

A Little Learning Is a Dangerous Thing

The lines that follow this oft-misquoted phrase are “Drink deep, or taste not the Pierian spring: There shallow draughts intoxicate the brain.” What English poet Alexander Pope meant was that a little learning is exhilarating, but it’s better to drink your fill at the Pierian spring, a sacred source of knowledge in Greek mythology. My first taste of the Pierian spring of assessment literacy came during EPY 700: Educational Tests and Measurement, a class I found both intoxicating and relevant to my classroom. Drinking my fill of assessment literacy involved a doctoral program of study rich in ed psych coursework; however, educators don’t need to go to such lengths to become assessment literate.

“Assessment literacy consists of an individual’s understanding of the fundamental assessment concepts and procedures deemed likely to influence educational decisions,” wrote assessment icon W. James Popham. Two of these concepts are test reliability and validity. For a full explanation of the what, why, and how of assessment literacy, read Chapter 1 of Popham’s 2018 book, Assessment Literacy for Educators in a Hurry.
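For readers who like to see the numbers behind the concepts, reliability is something you can actually compute. One widely used index of a test’s internal consistency is Cronbach’s alpha; the short sketch below calculates it from scratch. The student scores are invented for illustration only—they are not from any real assessment:

```python
# Illustration of Cronbach's alpha, a common index of test reliability
# (internal consistency). The scores below are made up for this example.
def cronbach_alpha(scores):
    """scores: one list per student, one score per test item."""
    k = len(scores[0])  # number of test items

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Variance of each item's scores across students
    item_vars = [variance([s[i] for s in scores]) for i in range(k)]
    # Variance of students' total scores
    total_var = variance([sum(s) for s in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Five students, four items, each item scored 0-5 (hypothetical data)
scores = [
    [4, 5, 3, 4],
    [2, 3, 2, 3],
    [5, 5, 4, 5],
    [1, 2, 1, 2],
    [3, 4, 3, 3],
]
print(round(cronbach_alpha(scores), 2))  # close to 1.0 = highly consistent
```

An alpha near 1.0 means the items hang together well; a low alpha warns that the items may not be measuring the same thing—one of the red flags an assessment-literate educator knows to look for.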

What We’ve Got Here Is Failure to Educate

A few years before No Child Left Behind became the law of the land, Jim Popham’s buddy Rick Stiggins made this prediction: “Teachers will be expected to be far more assessment literate in the future than they are today or have been in the past.” Whether teachers are more assessment literate than they were in 1998 is debatable. A 2012 study of 180 teacher preparation programs in 30 states indicated that only one-fifth of the programs adequately covered assessment literacy topics. Fewer than 2% of the programs appropriately covered the topics of analyzing assessment data and making instructional decisions based on the data.

Last fall I asked a friend of mine—a former K-12 teacher and administrator who teaches full-time at the university level—about preservice teachers being deficient in assessment knowledge when they graduate. He told me that typically, colleges and universities “expect” content area instructors to incorporate assessment topics into their courses. The (false) assumption is that these instructors are assessment literate and know what to cover.

Exacerbating the problem is that the majority of states and school districts either ignore the importance of assessment literacy or move it to the back burner. Not one state includes assessment literacy standards in its teacher certification requirements. Few districts offer effective professional development on the basics of tests and measurement.

So… What if All Educators Were Assessment Literate?

Before I describe my ideal scenario, here are three real-life examples of tests and test results being misused, along with the unintended consequences:

Example #1 – Elementary and middle school students spend a significant amount of class time taking and retaking reading tests that mimic the big state-mandated test. Some educators believe these benchmark tests—as unreliable and lacking in validity evidence as many are—should be lengthy so children will build up stamina for the big test. There is no research to support this; test prep isn’t the same as preparing for a track meet. The worst part about making kids read test passages ad nauseam is the message that reading is laborious and not something to be enjoyed.

Example #2 – Some building administrators use physical or digital data walls with benchmark test scores to track the perceived progress of students and classes toward big test readiness. Data walls can be counterproductive. I know of a school where teachers are so afraid of looking bad on the wall of shame that they give their students inappropriate help on benchmark tests. When every student has artificially high scores, reading and math specialists are unable to identify the students who need real help.

Example #3 – A group of district administrators saw that the percentage of test takers who met the College and Career Readiness Benchmark on the SAT Math section was well below the percentage of test takers who met the benchmark in Evidence-Based Reading and Writing (ERW). To increase the percentage of students meeting the Math benchmark, the administrators quickly decided that more high school kids should be encouraged to take advanced math courses. There was no discussion about whether it was in the best interests of every HS student to take additional math courses beyond Algebra I and Geometry, nor was there any mention of improving the teaching effectiveness of all HS math instructors. If these administrators had been well versed in assessment literacy, they would have recognized that:

  • Considerably fewer male and female SAT test takers in every racial/ethnic group meet the College & Career Readiness Benchmark in math than meet the ERW benchmark. This is because college-level math courses are generally harder than college-level social science and English courses. Comparing the percent of SAT test takers who meet the Math benchmark with the percent who meet the ERW benchmark is as futile as comparing apples and oranges.
  • Increasing the number of students who take advanced math courses doesn’t mean that the percentage of SAT test takers meeting the Math benchmark score will automatically increase. There are many variables at play here, such as how well students were taught and their ability to apply specific math concepts on a college entrance exam.

Now here’s my best-case scenario if all educators were assessment literate. First, superintendents would do their best to convey important assessment knowledge to school board members and other politicians. These decision makers would then realize that one-shot standardized tests are not the best way to gauge teaching and learning. Next, after giving teachers long overdue raises, districts and states would invest in high-quality assessments that measure deeper learning. They would use the results formatively (i.e., to improve instruction), instead of summatively (i.e., to judge students, teachers, schools, and districts). Time spent on meaningful learning tasks would increase and time spent testing would decrease. If every teacher were assessment literate, students would understand why they are taking tests, and each test would be reliable, valid, and engaging.

Becoming Assessment Literate

The National Board for Professional Teaching Standards (NBPTS) recognizes the importance of assessment literacy. The most recent changes to the National Board Certification process include a component that evaluates teachers’ assessment and data literacy. If you’re considering NBPTS certification, I strongly recommend it.

If you’d like to learn more about assessment literacy, here are a few other suggestions:


College Board. (2020). SAT Suite of Assessments: Benchmarks. Retrieved from https://collegereadiness.collegeboard.org/about/scores/benchmarks

Estren, M. J. (1994, March 19). ‘Drink deep, or taste not.’ Washington Post. Retrieved from https://www.washingtonpost.com/archive/opinions/1994/03/19/drink-deep-or-taste-not/c9653c0f-dde5-4436-87ce-91044014fbb2

Ezzelle, C. (2019). Assessment practices for certification of accomplished teachers. Frontiers in Education, 10. Retrieved from https://www.frontiersin.org/articles/10.3389/feduc.2019.00090/full

Fabry, D. (2016, September 20). Why we need assessment literacy as part of teacher preparation [Blog post]. Retrieved from https://www.nwea.org/blog/2016/need-assessment-literacy-part-teacher-preparation

Greenberg, J., & Walsh, K. (2012). What teacher preparation programs teach about K-12 assessment: A review. Washington, DC: National Council on Teacher Quality. Retrieved from https://files.eric.ed.gov/fulltext/ED532766.pdf

Homeschool Resources Group. (n.d.). Drink deep, or taste not the Pierian spring: What’s in a name? Retrieved from http://www.homeschoolresourcesgroup.org/pierianspring/about-pierian-spring

Popham, W.J. (2018). Assessment literacy for educators in a hurry. Alexandria, VA: Association for Supervision and Curriculum Development. Retrieved from http://www.ascd.org/Publications/Books/Overview/Assessment-Literacy-for-Educators-in-a-Hurry.aspx

Stiggins, R.J. (1998, February). Auditing the quality of classroom assessment training in teacher education programs. Paper presented at the Annual Meeting of the American Association of Colleges for Teacher Education, New Orleans, LA. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=


