Leisa Michelle

Do tests accurately measure mastery?

This post is a part of my Personal Development Project for May 2016.

This is actually a difficult question to answer with a sharp yes or no. It depends on what kind of test is given and what the point of the test is. In this article, we’re going to talk about multiple choice tests, fill in the blank tests, free response and essay-style tests, oral presentations, and portfolio-style projects, and see what each of these styles of testing is actually good for.

Multiple choice tests

Multiple choice tests make up the majority of tests that we take in formal education. They’re incredibly quick to grade and comparatively quick to take.

There are two main problems with multiple choice tests though. The first is that they’re easy to “crack”. You don’t necessarily have to know the answer to a question if you can recognize what isn’t the answer. The second issue is that multiple choice tests only determine whether you know a set of facts. There’s no room for interpretation or concession; every question demands a single, clear, black and white answer.

It seems contrary to the very purpose of education, though, that we test whether or not our students know exclusively facts. We want critical thinkers, right? We want people who ask questions and give thoughtful answers. The world is full of shades of gray that we want everyone to be able to observe and respect.

So the only thing that multiple choice tests can assess mastery of is clear, black and white facts. And yet because you can “crack” multiple choice tests, they don’t even do this very well.

Multiple choice tests might take a lot less time than other tests (both for the student and the teacher), but I think we can agree that they’re terrible assessments of mastery.

Fill in the blank tests

Fill in the blank tests are more difficult to complete and take longer to grade, but they have some improvements when compared to multiple choice tests.

Fill in the blank tests get rid of most of the problems that multiple choice tests have regarding “cracking” them. That is, you don’t automatically have an X% chance of getting the question right just by randomly selecting an answer. There’s no bank of answers to fall back on in a fill in the blank test. You have to come up with an answer on your own.

You could argue that this is a huge improvement because the student has to recall the solution and not simply recognize it. But I’m going to disagree because while it certainly is an improvement, going from eating Chips Ahoy to Famous Amos cookies isn’t much of an improvement in my opinion. We want some of Grandma’s homemade cookies, not this processed garbage.

The reality is that fill in the blank tests still share a major flaw with multiple choice tests: they assume that there is a single, clear, black and white answer to every question. The only thing you can really assess is whether or not someone knows straight facts.

Free response and essay-style tests

I bet you can sense a pattern developing now. Free response tests are much better than multiple choice and fill in the blank tests, but they also take much longer to complete and much longer to grade. This is also where grading becomes murky because free response tests allow for gray answers to gray questions about our gray world.

Free response and essay-style tests are good for assessing someone’s ability to analyze, synthesize, explain, and discuss in the written form. But the interesting thing is that sometimes writing isn’t the best form of communication.

I manage the publication Mozart For Muggles on Medium, and it’s been a lot of fun. But I learned something after writing an article on the introduction to Mahler’s 5th Symphony. Describing in writing what’s going on at various points in the music is clunky. It’s hard for the reader to follow along and understand when I link a video and say “the first ten seconds this” and “the next five seconds that”. Some of my articles work great, but some (like this one on Mahler) would be better as audio voice-overs.

Sometimes we need props to convey a point. Sometimes we need auditory or visual cues. And so essay-style tests have their limits. They’re pretty good for assessing mastery of certain skills and ideas for certain people, but they can’t be the “be all, end all” solution to testing.

Oral presentations

I rank oral presentations on the same level as free response and essay-style tests. Like essays, they take a long time to prepare and give, and they’re also difficult to grade. But they’re comparatively effective measures of assessment.

Oral presentations and speeches can have a similar limitation as essay-style tests though. Sometimes we need props or written cues to explain something more clearly. Sometimes the audience needs more time to digest a point, or maybe someone needs to cross-reference a point or look up a word. Speech is an instantaneous form of communication, so there are naturally pros and cons to it.

Oral presentations are good for assessing whether or not someone understands both single facts and broad concepts. Like essays, they’re good for analyzing, synthesizing, explaining, and discussing.

The few times that oral presentations are given in school, the emphasis (and anxiety) is on the actual presentation material. We spend so much time honing our reading and writing that it’s easy to forget that speaking skills are just as important to develop as writing skills. In fact, I would argue that speaking well is more important than writing well. Most of my connections begin with a face-to-face interaction and, you know, actually talking. I only text my friends and email people I’ve already met.

But regardless of whether you agree or disagree with me on which skill is more important, we can all agree that both writing and speaking are essential skills to master. In order to get better at speaking, you have to practice. If you want to be good at arguing or explaining things, practice. You don’t even have to formally give a speech or presentation to improve your speaking. Just go strike up a conversation with a stranger, or be more conscious and intentional about your speaking while you’re hanging out with friends and talking about Donald Trump’s ridiculous antics.

Summing up, oral presentations are just about as effective as essay-style tests in assessing mastery of analysis, synthesis, explanation, and description skills. Speaking requires a different strategy than writing, though; it’s an independent skill in its own right.

Portfolio-style projects

The most valuable type of assessment is the kind that requires creating something tangible. Even better is when you make a habit of creating things or maintain a collection of creations. I’m talking about portfolios.

When you have a collection of paintings or blog articles or video productions, that establishes presence, predictability, and a pattern of mastery. It’s the most reliable assessment of mastery. It’s also the most difficult for the student and teacher to keep up with. It takes a lot of time and effort.

The portfolio is great for assessing the same skills as essay-style tests and oral presentations (analysis, synthesis, explanation, and description) as well as skills like persistence and dedication.

Learning and action

Learning requires doing. Learning requires action. And the best way to determine someone’s mastery of a skill or idea is to see them actually do something. You want to prove you can think critically? Create something that exhibits your critical thinking prowess. You want to prove you understand how or why something works? Create something that exhibits your understanding. Write an article, design a product, build a website. Do something. Make something.

Most often, the synthesis of ideas and skills is what’s most important. Knowing the capital of every US state isn’t really impressive unless you can do something with it. Knowing how to solve complicated math problems doesn’t mean anything unless you can ask the questions that call for their use. Raw facts won’t help you too much in “the real world”. You have to be able to make sense of the facts. You have to analyze trends and be able to make predictions. You have to understand how to apply your knowledge in a meaningful way.

The importance of asking questions

I wondered a lot in my university calculus class (that I failed) why it was so important that I learn how to find the derivative of an equation, but not that I learn when and why I should find the derivative of an equation. I remember thinking, “They’re sure training me to answer lots of questions. But they’re not training me to ask any…”

I still have no idea what to do with all the years of math I learned. I don’t know what calculus is for. I don’t get it. (I’ve figured that if I want to make sense of math, I’m going to have to rigorously relearn it all on my own. I’m going to have to find real mathematicians to chat with. I’m going to have to pave a way to make math more relevant and more applicable. But that’s a project for another time.)

Ignorance is shamed

The current paradigm of assessment is actually detrimental to innovation and critical thinking, I would argue. We’re told it’s better to answer a question than leave it blank. Low test scores mean we’re stupid. High test scores mean we’ve got it all figured out. You’re never allowed to not know the answer.

And that’s the killer, right there. We’re never allowed to not know the answer.

This makes me wonder, what is the point of assessment in the first place? “Can you do this job?” “Do you know this thing?” Well, that’s simple enough. But there’s another question that I think we should be asking that’s just as important:

“Are you ready to learn?”

Someone’s willingness to admit ignorance is invaluable. It shows intelligence and maturity and integrity to be able to say “I don’t know”. And the current assessment paradigm overlooks this completely.

So are tests in school good for assessing mastery?

Mainstream assessment purports to care only about “facts” and “results”, but as we’ve seen, it does a really poor job even of that. Multiple choice tests are the most common and least effective for assessing mastery. The essay-style test and oral presentation are much more effective, yet much less common. And portfolios, the most effective form of assessment, are the least common form of all.

So I think that even though schools sometimes use effective forms of assessment, by and large they ineffectively assess mastery. Mainstream testing doesn’t do a good job of telling whether or not I understand what the American Civil War was all about. It doesn’t do a good job of telling whether or not I understand why we should eat a balanced diet. It just has me memorize random pieces of information and regurgitate them on command.

And worse yet, testing as it’s practiced in school keeps us from asking questions and admitting ignorance.

Perhaps the real problem isn’t the testing itself though. Maybe the real issue is the fact that we grade tests and assignments in general, and then compare students to one another. Maybe we should be asking the question “is grading really necessary in school?” instead.

But that’s another article for another time. For now, I think it’s safe to just say that most tests given in school are bad at assessing mastery. Mastery is exhibited through tireless, consistent, tangible results. Essays and oral presentations are effective to a certain degree, but portfolios are by far the best form of assessment. They assess not just analysis, synthesis, explanatory, and descriptive skills, but also a person’s dedication and ability to overcome obstacles. They establish a pattern of competence and greatness.

Want to prove you know your stuff? Go create a portfolio.