Enhancing Student Number Sense


Student generated assessments

on October 6, 2012

Tanya… I have been thinking a lot as well after reading Marzano’s chapter on this very topic. I agree with having students “show” us what they know in their own creative way. This raises a couple of questions:

-Some parents expect the traditional pencil-and-paper form of testing, so we need to re-educate parents on these new ways of demonstrating learning.

-Although we would create and use a rubric for such an activity, is the rubric not somewhat subjective rather than objective?

I would love to see more of this student-generated work come into play. So often I have read or listened to a student explain or write about their strategy and it was genius… yet it didn’t fit into the “mould” of an answer. If we are encouraging kids to “show us” what they know, we need to be fair in how we ALL (provincially) grade them (i.e., provincial testing).


8 responses to “Student generated assessments”

  1. mrssmithnmes says:

    Tricia,

    I agree with you 100% about needing to change parents’ perceptions of assessment and education as a whole. I also think NMES is well on its way to doing that.

    I’m not exactly sure about rubrics being subjective, though. Perhaps I’m not entirely sure of what you mean, so please correct me if I’m way off base here. I think rubrics are designed to be objective in nature. Teachers are expected to use one specific rubric to assess and grade student work in the exact same manner, thus eliminating the subjectivity that grading student work can have without them. I also believe that in order for a rubric to be a true assessment of exactly where on the grid a student lies in terms of what he/she does or does not know, all teachers of that grade level need to use that particular rubric in the same manner.

    This reminds me of a conversation some of us had this week about an upcoming math assessment. The assessment is scripted, and a comment was made that its being scripted was actually a good thing, because then each student would be evaluated in the exact same manner. The teacher wouldn’t be tempted to change the level of language for particular students. You give each student the exact same language and they either know the answer or they do not. Albeit it seems a bit harsh and flies in the face of “individualization,” it sure does tell you where each student lies in what they do or do not know and what they can and cannot do.

    Any other thoughts…anyone?

    Tanya

    • ferntouchie says:

      I agree, Tanya. As teachers we want to help our students succeed, even if it is by unintentionally leading them to the right answer through the language and prompting used during assessment. Assessments that are administered in a very structured and consistent manner ensure that students are assessed fairly, which in turn provides an accurate snapshot of where the student is developmentally in their understanding of particular math concepts. Subjectivity comes into play, in my mind, when rubrics are created in a vague manner. Using terms such as Very Good, Good, Average, Poor, or a 1–5 scale without specific targets indicated creates an opportunity for the teacher to pick and choose where they feel the student might be, taking into account the student as an individual and perhaps being more generous or too hard in their scoring. Having students be part of the rubric development is ideal, as it allows students to get a better understanding of what they need to know and how they will be scored, putting the learning back into the hands of the students.

      Fern

      • mrssmithnmes says:

        Thanks, Fern. What you’ve described makes sense. If rubrics were created in this manner, they could certainly be subjective and not a true representation of where exactly that student is.

      • triciamcgraw says:

        Which brings us back to the fact that good teaching practice includes a variety of assessment methods. Some students may excel in the scripted form while others may not. As far as rubrics go, what Fern described is what I meant. I remember a few teachers developing a writing rubric one year. We used it all year, and when a new teacher joined the team she challenged the rubric of VG, Good, etc. Then, after talking to more teachers, we saw the difficulty in coming to consensus and wondered if it was more subjective. The rubrics in our program, and many out there, are great. Does this clear things up, Tanya? I think NMES is great and evolving with the times, as we should be. The mere fact that we have a committee dedicated to these rich conversations to enhance student learning is wonderful.

  2. mrssmithnmes says:

    Tricia,
    Yes, this does clear things up…thanks! We do need to use a multitude of assessments to ensure we get a good view of what each student can do. What works for one may not work for another. It’s important for us to remember that!
    Tanya

  3. mackay74 says:

    Great conversation, and thank you for making the point about the need for differentiated assessment. If we want to assess the whole learning process, then we must be willing to differentiate our methods of assessing. NMES is surely on the right path with having these types of discussions and providing enriching professional dialogue anywhere and anytime. 🙂

  4. mmereynolds says:

    Indeed, the need for differentiated assessment is a must; certainly, if the philosophy is to reach all learners, we need to create different forms of assessment and rubrics. The other day I had the students show all the ways they knew how to represent a number. I provided materials at multiple levels for them to use, and I made notes while they were performing and communicating. It truly showed what a variety of learners and levels we have in our class. So the question arises again: how do the provincial assessments represent our students’ knowledge and performance?

  5. lorijc says:

    The provincial assessments are a systemic snapshot and gauge of how students, schools, and districts are doing over a period of time. As an indicator of individual student achievement, there are many stones to be cast in the direction of provincial assessments. We need to remember the intention of this data and use our professional voices to educate the public about the value of other forms of assessment.
