mCLASS. To most people outside the world of teaching, this is just another acronym that goes in one ear and out the other. To those inside the world of teaching, it brings a wide mix of feelings. For those unfamiliar, mCLASS is a literacy assessment tool mandated in all North Carolina schools and administered on an iPad. The TRC component of mCLASS assesses a student's accuracy and comprehension as they read leveled texts.
Teachers have seen many similar assessment tools over the years, but mCLASS goes one step further and assesses students' comprehension not only orally but in written form. Students read a prompt without any assistance from the teacher, find evidence from the book to support their answer, and respond to the question with specific details from the text. This falls right in line with Common Core's initiative to go deeper and challenge students to think critically about their reading.
Through my own experience with mCLASS and discussions with colleagues, I have come to realize that teachers have varying opinions about it. Some see it as just another assessment on the pendulum of change in education. Others see it as a great tool for measuring students' reading abilities. Many are frustrated with the way it sets students back. Here are some of the basic pros and cons as I understand them:
Pros:

- mCLASS provides parent letters that inform families of their child's reading level and offer ways they can support them at home.
- mCLASS gives you helpful information about a student's reading profile and can help target their needs in guided reading.
- When students move schools or districts, their assessment information can be accessed by the new school, helping to streamline information.

Cons:

- Students are assessed far too often. The lowest readers, who need the most instructional time, are assessed the most.
- Some of the written response questions are not written at the student's reading level. If they cannot read the question, they are automatically unable to answer it sufficiently.
- Students' answers are evaluated very harshly. Answers that I would consider acceptable do not pass according to the assessment rubrics.
- Students' writing scores are not averaged. They may knock it out of the park on one question, but if they miss one part of the second, mCLASS will kick them back.
I am currently in a grad school class on Content Area Literacy. We were given a task to write a nonfiction narrative on numbers. Two peers and I decided to take a deeper look at some trends we have noticed in mCLASS results in our classrooms. We each teach 1st, 2nd, or 3rd grade. One common trend we noticed was that many high-level readers drop significantly when assessed on mCLASS. This could be due to several factors: the selection of leveled texts, the student's ability to read the written comprehension questions, or the amount of evidence needed to provide a sufficient answer. We decided to write a parody of "Oops!... I Did It Again" about our students' experiences with mCLASS.
Here are the links to the video, along with blog posts from my peers:
As teachers, we will continue to monitor students' reading progress and help them develop deeper comprehension skills, no matter the instrument. But even more than that, we recognize that students are more than the numbers on a data sheet; seeing their strengths and areas for growth, both within and outside of assessments, is the true mark of great teaching.