But before I tackle Assessed Curriculum, I hope you don't mind if I spend just a little more time on why I love Winter so much. I promise, it will greatly enhance your understanding and appreciation for curriculum alignment and assessment. Or, at least, it will be loosely related.
So...Winter, why do I love you so? It's not that I really enjoy the bitingly-cold winds of a mostly-flat Iowa. But here are some things that I do enjoy about Winter: falling snow, playing in big snow piles with my dog Buddy, being warm and cozy inside watching TV with my wife Suzy and, once again, my dog Buddy, drinking a nice hot cup of coffee, and seeing my breath when I walk outside. For some strange reason, I just feel 7% cooler when I can see my breath outside. And believe me, I can use all of the cool points I can get. Though I'm pretty sure that my wife would say believing this actually makes me not 7% more cool but 27% less cool.
You want to know what else I really love about Winter? I love wearing hats and gloves. I really only wear the same four pairs of shoes. But I do have a fairly extensive glove collection. My favorite is a pair of wool convertible glove-mittens. Ah yes, here they are:
Now these are cool gloves. Or should I say, really warm gloves. And so versatile, too. On cold, but not too-cold days, I can let my fingers breathe in the crisp cold winter air. If it is just bitterly freezing, or if I'm going to be outside for a long time, I just flip the tops of the gloves over my frigid fingers and rock them mitten-style. Anyway, I feel like these gloves can do it all! Maybe you have a favorite pair of gloves that you love. Or maybe it's something else, like a blanket, a screwdriver, a dress, or a remote control. Whatever it is, we all have our favorite things. Now hold that thought; I'll come back to it in a little bit.
Let's look back a month, shall we? Last month, I spent some time digging into the question "What is Enacted Curriculum?" I get excited about the Enacted Curriculum for a number of reasons. First, its role in impacting student learning is huge, according to a review of research literature. But it also really interests and excites me because it's an area of education that is really, really hard to capture accurately. Think about it for a second, as a classroom teacher: do you have time to record everything that you actually taught each student? Most teachers I know do not, which is why survey methods are both interesting and promising.
Those are the sorts of things I talked about last month. This month, I am digging into the area of Assessed Curriculum. Recall that I use a multi-dimensional framework based on the work of Andy Porter that includes the intended, enacted, assessed, and learned curricula:
In this framework, the assessed curriculum is:
the knowledge and skills (i.e., the content) that are measured to determine student achievement.
In other words, the assessed curriculum is the "what" that gets measured when we are trying to figure out where student learning stands. I know not many folks like using formulas to understand the world, but I do, so hang in there with me. I think of it like this:
content = stuff + what students do with the stuff
Simple enough, right? We are trying to dig into what students are learning when we assess. I'm guessing that most of you can get behind that idea. What is sometimes harder for folks to wrap their head around is that assessment is a type of curriculum. And I'm not just talking about things like the tests that come at the end of chapters in textbooks. I'll get into that a little later.
The definition I used for the term curriculum in my blog "What is Curriculum?" is based on Andy Porter's work and goes like this:
Curriculum is what students are supposed to learn, what they get the opportunity to learn, what gets assessed, and what is actually learned.
So, as strange as it may seem, from this perspective assessment is part of curriculum. Another perspective to take is a curriculum alignment perspective. We want the assessment practices we use to align with what we teach, right? Curriculum alignment is about the "what" or the "stuff" of curriculum. Assessments have stuff. What teachers teach has stuff. If this logic is getting distracting, don't get bogged down in it. Just take the ideas themselves from this blog and don't worry about it.
Types of assessment
Remember when I said to not let perspectives get in the way of the big ideas in this blog? Like, one sentence ago? Yeah, this is one of those times. Folks spend a lot of ink and hot air...er...normally-temperatured breath, arguing about what to call different types of assessment processes and tools. I don't really want to go down the debate path on the vocabulary here, though I'm happy to have that discussion with anyone who is so inclined. For now, I'd just like to briefly present an assessment framework that I find handy for understanding the different types of assessment decisions that can/should be made.
I provided a general definition of assessed curriculum above. I'd like to get a little more specific now. Check this description out:
Assessment is a system of processes and tools that are used to determine the extent to which students are acquiring or have acquired the knowledge and skills listed in the curriculum and delivered via instruction (Niebling et al., 2008). In general, there are four types of assessment decisions:
- Summative: Summative assessment tends to be comprehensive in nature, provides accountability, and is used to check the level of learning at the end of a unit of study. (RtI Action Network)
- Formative: Formative assessment is a collection of evidence about student learning that is used to inform instructional decisions in an ongoing manner. Progress Monitoring, a type of formative assessment used in RtI systems, is a scientifically-based practice used to assess students’ academic performance and evaluate the effectiveness of instruction. It is the process used to monitor implementation of specific interventions. (RtI Action Network)
- Screening: Screening assessment is a quick check of all students’ current levels of performance in a content or skill area. (RtI Action Network) The purpose is to help identify potential academic and/or behavioral concerns in need of additional assessment. (Midwest Instructional Leadership Council)
- Diagnostic: Diagnostic assessment is used to confirm screening results and to inform intervention by determining a student’s particular academic needs. (RtI Action Network)
Did I say I'd come back to your favorite things? Yes, yes I did. Thanks for reminding me. I want you to think about some of your favorite things again. And, if you have enough brain space left (it is the holiday season, after all), think about what you know about assessment as well. I just laid out different purposes of collecting assessment information. Now let's see, what can my gloves teach us about making assessment decisions? After all, these are really a sweet pair of gloves. They can do all sorts of things. For example, they can train my dog, Buddy.
There's my guy. Pretty handsome, huh? I'm not quite sure why he's so spazzed about having fake antlers on his head. But clearly, he needs some training! Well guess what, my gloves can train Buddy!
Remarkable! As you can see, my gloves can give Buddy a command and sure enough, Buddy follows the command. Well, I'm feeling quite empowered now. I wonder what else my gloves can do?
I don't know about you, but I find driving during the holidays to be really stressful. And I've got so much else to do. Maybe my gloves can drive a car! If they can drive a car, then maybe they could go run errands for me. Let's see...
Ok, the gloves appear a little distracted, but they are still on the road. So far, so good...
Whoa! My gloves almost hit a strangely-calm dog and a very handsome man who were trying to cross the street.
Well, that didn't end very well, but it could have ended much worse. Maybe I expected too much too soon from my gloves. I mean, not only did they almost run over that dog and handsome man, they seem to be developing an attitude problem. Maybe I should scale it back a little bit.
I know! I don't really enjoy cooking dinner at night. I'd rather be reading about curriculum alignment and assessment. Perhaps my gloves could quickly cook some dinner for me from time to time...
Now this seems to be working much better for my gloves. This glove has found the penne pasta and Alfredo sauce I set out. And it looks like my glove even got out some paper and something to write with, perhaps to write down a tasty recipe from a cookbook. I bet my other glove has started boiling some water to put the penne pasta into...
Yep, sure enough, there is my other glove, boiling a pot of water. I'm getting hungry just seeing these pictures. And I'm stoked, because it seems like I've found another amazing thing that my gloves can do! I'll check back in a few minutes to see how my penne pasta is doing........ok I'm back. Let's see how it's going...
Well, that didn't end too well, either. Maybe I've expected too much from my gloves. Maybe, after all of that, they didn't really train my dog Buddy at all. Maybe there was someone out of frame helping the gloves, making the gloves seem like they could do something they weren't really made to do.
If you are still reading this blog, congratulations, and thank you for putting up with my strange sense of humor. Let me get to my point. Like my gloves, like many of your favorite things, we may love them. We may think they can do a lot. But in the end, my gloves and your favorite things became our favorites because they were really good at one or two things, not everything.
Assessment practices and tools are the same way. When we try to make 1-minute oral reading fluency probes into diagnostic tools, they can let us down if that's all we use to try to get diagnostic data. When we try to turn measures that take 30-45 minutes per student into screening tools, we are likely to waste a lot of time and resources when we could have used a reliable and valid curriculum-based measure (CBM), for example.
Furthermore, it's my opinion that when we demonize different assessment processes and tools, we are failing to take responsibility for our own actions or the actions of other educators who are misusing the processes/tools. Are you mad because state accountability assessments get misused and abused? Fine. But don't be mad at the measures. Be mad at the people misusing them. And help them learn more appropriate ways to use them. Because you know what, large-scale standardized assessments can provide helpful screening and/or summative data. They aren't useless, at least not in the hands of those who are properly trained to use them appropriately.
That's a bit of a soapbox, but I find it necessary to share my thoughts on this topic. Let's not just yell and scream about professional malpractice around assessment (yeah, I said it). Let's work together to promote and improve appropriate assessment practices.
What the research says about alignment and assessed curriculum
Students do better on assessments when they've been taught the stuff that's in the assessments.
Genius, right? I know, I know. That is really common sense. But having research data to back that up is handy. Here's what's really important, though: this phenomenon holds for students who come from low socio-economic backgrounds, have low prior achievement, have disabilities, or belong to a minority ethnic group. In other words, giving all students equity in opportunity to learn what they are to be assessed on can really help level the playing field. Check out the articles by Cohen (1987) and Gamoran and his colleagues (1997) to see some of this research.
Now, there are a few nuances to these findings. First, this doesn't mean all students can accomplish this learning by being taught the exact same way. Some kids will need more time so the content can be taught more slowly, or they may need the content taught differently. But we still need to give them an opportunity to learn. The second main nuance here is that it seems that when examining this opportunity to learn (i.e., alignment between the enacted and assessed curriculum), we have to look at both the topical/conceptual knowledge and the cognitive complexity of the content. These are topics for future blog posts, but I wanted to highlight them here.
Practical implications of research findings
Last month, I really hit on the importance of opportunity to learn for all students, which I reiterated with the summary of some research above. Instead of continuing to repeat myself, I'd like to take a slightly different angle here. Let me start with some questions (in no particular order, just numbered to keep track of the conversation):
1. Have you ever heard a phrase that goes something like "this test is aligned to the standards"?
2. Do you know how well aligned the tests you acquire, are given, or create are with what you teach and/or state standards? (sorry, that's a long question)
3. Do you know how well aligned the state tests are with state standards?
I'd be neglecting my professional duties if I let a question like #1 above go unchallenged. Really, "aligned"? Alignment is not a black-and-white issue. It's a matter of degree. There isn't, nor should there be, any test that covers all state standards (i.e., intended curriculum) or everything a teacher teaches (i.e., enacted curriculum). Now, all of the items on a single test may hit one or more state standards, or they may each be something that the teacher taught. But do you see how this question depends on which direction you're looking at alignment from? And we haven't even gotten into the multi-dimensional nature of alignment itself. Soon.
Regardless of the shiny packaging or how big the promises, there has to be some evidence of where things are aligned and misaligned. And, for that matter, how the creator of said shiny packaging made those alignment determinations should be clear as well. Hey, those are two very Tweetable statements. Give me some Twitter love!
As an educator then, how would you respond to questions #2 and #3 above? Can you? Should you be able to? If the test isn't really measuring what is being taught, or at least what is planned to be taught (e.g., screening, pre-testing), is that really fair to the students or teachers? I'll let you decide.
If you hear absolute statements about curriculum alignment, please raise the red warning flag, or your baloney meter, because you're probably about to hear/read something that isn't entirely true. Even if you hear a more conservative statement, like "strong alignment," does the maker of that statement explain how they came up with that conclusion? If not, they should. You deserve better than that, and so do our students.
Yet another revisit of textbooks and related materials
Let's dig into the matter of textbooks and related materials, shall we? It wouldn't be a very good blog post about curriculum and alignment matters if we didn't, methinks. To date, I've made the claim that textbooks and related materials can be considered to be both intended and/or enacted curriculum, depending on how and when they are used. Furthermore, I've argued that textbooks and related materials shouldn't hold the honorable distinction of being exclusively considered "the curriculum." But can textbooks and related materials actually be considered assessed curriculum? You know I like scenarios, so here's one for your consideration:
I would say they are making an assessed-to-intended curriculum alignment comparison. Here is my thinking...
First, the desired examination is looking at things that were already implemented, similar to last month's scenario. And like last month, textbooks and related materials seem to be at the center of the desired examination. The difference in this scenario, however, is that a specific aspect of the textbooks and related materials is being examined: the chapter/unit tests. These are assessments. Second, and like the scenario last month, the desired alignment examination is with the Core Content Standards and Benchmarks (i.e., the intended curriculum). That's why I think that in this case, textbooks and related materials can be used in an assessed-to-intended curriculum alignment comparison. What do you think? Leave a comment here or tweet me.
Check back here next month when I discuss the next topic in the Foundations Series: "What is Learned Curriculum?" and how it is different from the assessed curriculum. Thanks again for taking the time to read my thoughts. I hope all of you have/had a great holiday season. See you here next month!
Cohen, S. A. (1987). Instructional alignment: Searching for a magic bullet. Educational Researcher, 16, 16-20.
Gamoran, A., Porter, A.C., Smithson, J.L., & White, P.A. (1997). Upgrading high school mathematics instruction: Improving learning opportunities for low-achieving, low-income youth. Educational Evaluation and Policy Analysis, 19, 325-338.
Niebling, B. C., Roach, A. T., & Rahn-Blakeslee, A. (2008). Best practices in curriculum, instruction, and assessment alignment. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology, (4)5, 1059-1072. Bethesda, MD: National Association of School Psychologists.
Porter, A.C. (2006). Curriculum assessment. In J.L. Green, G. Camilli, & P.B. Elmore (Eds.), Complementary methods for research in education (3rd edition). Washington, DC: American Educational Research Association.
RtI Action Network. Glossary of Terms. Retrieved from http://www.rtinetwork.org/glossary.