# End of year reflections - SBAR analysis

I wrapped up grading final exams today. Feels great, but it was also difficult seeing students coming in to get their final checklists signed before they either graduate or move on to a different school. Lots of bittersweet moments in the classroom today.

After trying my standards based grading (SBG) experiment, I decided I wanted to compare students' overall performance in each grading category to their quiz percentages. In a previous post, I wrote about my experimentation doing my quiz assessments on very specific skill standards that the students were given. As I plan to make my grading more SBG based in the fall, I figured it would be good to have some comparison data to support the many reasons why this is a good idea.

In my geometry and algebra two classes, there are 28 total students. I removed from the data set two students who came in during the last month of school, as well as one outlier in terms of overall performance.

The table below shows the names of each grading category, as well as the overall grade weight for the category used in calculating the grade. The numbers are the correlation coefficient in the data between the variable listed in the row and column. For example, the 0.47 is the correlation between the HW data and the Quizzes/Standards data for each student.

| | Homework (8%) | Quizzes/Standards (12%) | Test Average (48%) | Semester Exam (20%) | Final Grade (100%) |
|---|---|---|---|---|---|
| HW | 1 | 0.47 | 0.41 | 0.36 | 0.48 |
| Quiz/Stndrds | | 1 | 0.89 | 0.91 | 0.95 |
| Tests | | | 1 | 0.88 | 0.97 |
| Sem. Exam | | | | 1 | 0.95 |
| Final Grade | | | | | 1 |

Some notes before we dive in:

- The percentages do not add up to 100% because I am leaving out the portfolio grade (8%) and classwork grades (4%) which are not performance based.
- Homework is graded on turning it in and showing work for answers. I collect and look at homework as a regular way to assess how well they are learning the material.
- The empty cells are to save ink; they are just the mirror image of the results in the upper half of the matrix since correlation is not order dependent.
- I know I only have 25 samples - certainly not ready for a research publication.
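For anyone who wants to reproduce this kind of analysis outside a spreadsheet, a correlation matrix like the one above takes only a few lines of Python with pandas. This is a minimal sketch: the column names and percentages below are made up for illustration, and real data would come from a gradebook export.

```python
import pandas as pd

# Hypothetical per-student category percentages (NOT the real class data).
grades = pd.DataFrame({
    "HW":      [85, 92, 70, 60, 88, 75],
    "Quizzes": [78, 95, 65, 55, 90, 72],
    "Tests":   [74, 93, 60, 52, 91, 70],
    "SemExam": [76, 94, 62, 50, 89, 68],
})

# Weighted final grade using the category weights from the table
# (portfolio and classwork omitted, so rescale by the total weight used).
weights = {"HW": 0.08, "Quizzes": 0.12, "Tests": 0.48, "SemExam": 0.20}
grades["Final"] = sum(grades[c] * w for c, w in weights.items()) / sum(weights.values())

# Pearson correlation between every pair of categories; the upper
# triangle corresponds to the filled-in half of the table above.
corr = grades.corr().round(2)
print(corr)
```

Each cell is the Pearson correlation between the row and column categories, so the diagonal is all 1s and the matrix is symmetric, which is why only one half of the table needs to be printed.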

So what does this mean? I don't know that this is very surprising.

- Students doing HW is not what makes learning happen. I've always said that, and this data continues to support that hypothesis. Homework can help; it is evidence that students are doing **something** with the material between classes, but simply completing it is not enough. I'm fine with this. I get enough information from looking at the homework to create activities that flesh out misunderstandings the next time we meet. The unique thing about homework is that it is often the first time students look at the material on their own rather than with their peers in class.
- My tests include some direct assessments of skills, but they also include a lot of new applications of concepts and questions requiring students to explain or show things to be true. It's very gratifying to see such a strong connection between the quiz scores and the test scores.
- I always wonder about the students that say "I get it in class, but then on the tests I freeze up." If there's any major lesson that SBG has confirmed for me, it's that students' self-awareness of their own proficiency is generally not great without some form of external feedback. If freezing up were really the issue, there would be more students with high quiz scores and low exam scores, and that isn't the case here. My students need real and accurate feedback on how they are doing, and the skills quizzes are a formalized way to provide it.
- I find it really interesting how close the quiz average and the semester exam percentages are. The semester exam was cumulative and covered a lot of ground, but it didn't necessarily hit every single skill that was tested on quizzes. There were also not quizzes for every single skill, though I tried to hit a number of key ones.

This leads me to believe that it is possible to have several key standards to focus on for SBG purposes, and also to dedicate class time to other concepts through project based learning, explorations, or independent work. It's feasible to treat these other concepts as mathematical process standards assessed throughout the semester. This strikes a good balance: developing skills according to the curriculum without making classes a repetitive process of students absorbing procedures for different types of problems. I want to have both. My flipping experiments have gone a long way toward that ideal, but I'm not quite there yet.

I'll have more to say about the details of what I will change as I think about it during the summer. I think a combination of using BlueHarvest for feedback, extending SBG to my Calculus and Physics classes, and less emphasis on grading and collecting homework will be part of it. Stay tuned.

Hi Evan, good thing I just did stats with my grade 10s! I am embarrassed to say that I have never analyzed my own students' results in this way, to see the correlation between the various assessments. Another thing to do next year - did you use Excel to generate these figures? A few questions:

- How often do you collect the homework that you grade? If you're not actually correcting it, just checking for work, I'm guessing it doesn't take you too long to grade. Do they have the answers and check those themselves? I like this idea a lot. It gives you so much rich data, quantitative and qualitative.

- Since your tests include new applications, do you give them any kind of scaffolding for this, or expose them to similarly new problems between tests? My kids often need training or tips just to get started, keep going, check their calculations, and stay organized on big problems, in and out of test situations.

I love how the various assessments move from small ideas to big ones, from specific math skills to life skills, like self-awareness.

Fascinating work! You are using math, actual math, to help your students with their math! Thanks for the inspiration!

Hi Audrey,

Thanks for your comments! I did this in Geogebra (with the spreadsheet view) after getting the raw percentage data from Powerschool.

My homework collection habits always start off pretty strict - I start the year by collecting every assignment for every class. This helps me identify the strugglers, the ones that write down just answers, and the ones that are clearly writing down the work of others right away. It gives me a chance to have conversations with all of them about their work habits and about what they do/don't understand.

We are on a block schedule, so that makes it so there isn't quite so much paperwork to manage every single day. I typically will check three or four key problems to make it manageable, though if a student has everything correct and justified, I'll typically check other problems too just to provide an opportunity to give feedback if I can. I encourage them to check answers themselves using Geogebra or Wolfram Alpha, and will also show them how to do so early on in the year.

As for test questions that are non-routine, I usually pepper my class activities with questions that require them to do things differently than just following procedures. I train them to approach problems they haven't seen before in the same way: write down the given information, then do a bit of brainstorming on paper to see how to connect that information to what they know. It's exactly the set of things you mentioned in your comment - use numbers to check calculations, stay organized, write down ANYTHING in case it might nudge them toward a solution.

A big part of preparing for tests is using technology like Geogebra. Often telling them to solve a problem that way and then work backwards gives them the confidence of knowing the answer, which can be really powerful. The other big thing I tell them is that I will never give them an unsolvable question on a test. I emphasize in most of what I do that knowing the answer is not important - figuring it out is. As fuzzy as that sounds, it has really worked over the past couple of years to give my students the right mindset to succeed in this way.

I appreciate your kind words!

Evan