
Scaling Reassessments, Yet Again (Part 3)

I've been quietly complaining to myself recently about how the reassessment sign-up and quiz distribution tool I created (for myself) isn't meeting my needs. Desmos, Peardeck, and the other online tools I use are impressively responsive to feature requests and bug reports, and that's at least partly because they have teams of expert programmers ready to go at any given moment. When you are an ed-tech company of one, there's nobody else to blame.

This is the last week in which reassessment is permitted, so the groups I've had for reassessments have been pretty large. Knowing this, I worked hard this past Sunday to reorganize the site's inner workings for efficiency.

Now I know what my day looks like in terms of which reassessments students have signed up for, what their current grade is, and when they plan to see me during the day:

[Screenshot: the day's reassessment sign-ups, showing each student's standard, current grade, and meeting time]

With one or two reassessments at a time, I got along just fine with a small HTML select box of roughly sorted names. Clicking those one at a time and then assigning a given question does not scale well at all. Now I can see all of the students who have signed up for a reassessment and easily assign a given question to groups of them at once:

[Screenshot: assigning a question to a group of signed-up students in one step]
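
The underlying change is simple: instead of assigning one question per click, one assignment now gets applied across a whole group of sign-ups. Here's a rough, plain-JavaScript sketch of that idea (the real app stores these records in Meteor collections, and the field names here are just illustrative):

```javascript
// Simplified model of a day's sign-ups; the real records live in a
// Meteor/MongoDB collection, and these field names are illustrative only.
const signups = [
  { student: 'A', standard: 'Quadratic Functions 1', question: null },
  { student: 'B', standard: 'Quadratic Functions 1', question: null },
  { student: 'C', standard: 'Exponent Rules 2',      question: null },
];

// Assign one question to every unassigned sign-up for a given standard in
// one step, instead of clicking through students one at a time.
function assignQuestionToGroup(signups, standard, questionId) {
  signups
    .filter(s => s.standard === standard && s.question === null)
    .forEach(s => { s.question = questionId; });
}

assignQuestionToGroup(signups, 'Quadratic Functions 1', 'QF1-version-3');
console.log(signups);
```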

The past two days have broken the reassessment records I wrote about at the end of the first quarter; today, for example, over sixty-five students are taking quizzes. In the past, in the scramble to get everyone their quizzes, I made concessions by giving simpler questions that may not honestly assess students at either end of my 5 to 10 learning standard scale.

With the user experience smoother now, I have been able to focus this week on making sure that the questions I assign truly assess what students know. I could not do this without the computer helping me out. It feels great to know things are working at this higher scale, and I'm looking forward to having this in place when we get going again in January.

Another WeinbergCloud Update

I decided a full overview of my online WeinbergCloud application was in order, so I recorded a screencast of me going through how it currently works. It's kind of neat that this has been a project under development for nearly two years. I've learned a lot about HTML, Javascript, the Meteor framework, and programming in general in the process, and it has been a lot of fun.

Stay for as long as you like, and then let me know your thoughts in the comments.

On Randomness & Teaching

I really enjoyed this article from Fast Company on the value of randomness in art and design.

As I read, I found many points resonated with what I feel about certain aspects of my teaching practice:

RANDOM VALUES CAN FILL IN THE UNIMPORTANT BLANKS

...The key here is an intelligent decision about what is ordered and configured versus what is appropriately random. In this kind of situation, a random function directly generates some aspect of the work, but that aspect is usually not the focus.

I write often about the power of using programming and computation to take care of tedious tasks. This article reminded me that I also use computation to introduce randomness in situations where I don't care about the specifics. I often use a random group generator to make groups for classes. When the size of the groups matters to me on a given day, I use the generator. When the composition matters, though, I might arrange groups by hand, because I don't have a tool to manage group composition automatically.
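
For the curious, a bare-bones group generator is only a few lines of JavaScript. This is a sketch of the idea, not the exact tool I use: shuffle the roster, then chunk it into groups of a chosen size.

```javascript
// Shuffle a copy of the roster (Fisher-Yates) and split it into groups.
function makeRandomGroups(roster, groupSize) {
  const shuffled = roster.slice();
  for (let i = shuffled.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
  }
  const groups = [];
  for (let i = 0; i < shuffled.length; i += groupSize) {
    groups.push(shuffled.slice(i, i + groupSize));
  }
  return groups;
}

console.log(makeRandomGroups(['Ana', 'Ben', 'Chen', 'Dara', 'Eli', 'Fay'], 3));
```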

Another situation is when I need a nicely factorable quadratic expression: the specific values don't matter, but it must be factorable over the rational numbers. Here, randomness fills in where the details are unimportant. I can make up a pair of binomials and multiply them out mentally, but I'd rather put in the time to make a generator do this for me. This is where my Khan Academy powered reassessment problem generator has been serving me exceptionally well:

[Image: the reassessment problem generator producing questions]
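
The idea behind a generator like this is to pick the factors first and then expand, so the result is guaranteed to factor over the rationals. A stripped-down sketch (plain JavaScript, not the Khan Academy-powered version I actually use):

```javascript
// Random nonzero integer between 1 and 9, with a random sign.
function randNonzero() {
  return (Math.floor(Math.random() * 9) + 1) * (Math.random() < 0.5 ? -1 : 1);
}

// Build (x + p)(x + q) first, then expand to x^2 + bx + c.
// Starting from the factors guarantees the result factors over the rationals.
function randomFactorableQuadratic() {
  const p = randNonzero();
  const q = randNonzero();
  const b = p + q;
  const c = p * q;
  // Format a signed term, dropping a coefficient of 1 on the x term.
  const signed = (n, suffix) => {
    if (n === 0) return '';
    const mag = Math.abs(n) === 1 && suffix ? '' : Math.abs(n);
    return `${n > 0 ? ' + ' : ' - '}${mag}${suffix}`;
  };
  return {
    factored: `(x${signed(p, '')})(x${signed(q, '')})`,
    expanded: `x^2${signed(b, 'x')}${signed(c, '')}`,
  };
}

console.log(randomFactorableQuadratic());
```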

Randomness also helps me identify what matters to me and what doesn't. Sometimes I'll start generating random problems and realize that I want negative coefficients, or a pair of irrational zeroes. Making decisions and recognizing patterns is what I want to spend my time doing, and having the computer insert the randomness helps me focus on those tasks.

Here's more from the article:

RANDOMNESS CAN BE USED TO EMPHASIZE THE ALGORITHM OVER THE RESULTS

Sometimes, the contribution of an artist or designer consists of the rules, logics, and coded relationships rather than the output of that process. Repeatedly running the algorithm with random input values can productively undermine results relative to process.

It is an interesting challenge to design generators of questions that nurture specific types of thinking in students. Sometimes that thinking involves deliberate practice of a skill. Other times it involves figuring out a pattern. If students observe after solving a few equations that the form is the same or that the distributive property always seems to be useful after step three, they are identifying and making use of structure as CCSS.MP7 wishes they would.

[Animation: a set of generated problems]

Having students notice what is the same about sets of different problems requires having a set of problems for them to look at. Textbooks always have these sets, but they are already neatly organized into rows and types. Being able to generate these problems easily and have students do the organizing is a great way to get them to do the pattern-finding themselves.
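
As a simplified example of what I mean, a small routine like the one below can produce a pile of equations that all share the same underlying form, with only the numbers changing. Students get the unsorted pile and do the noticing. This is a plain-JavaScript sketch, not my actual generator:

```javascript
function randInt(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

// Generate equations of the form a(x + b) = c, where distributing is a
// natural first move. The structure stays fixed; only the numbers vary.
function distributePracticeSet(count) {
  const problems = [];
  for (let i = 0; i < count; i++) {
    const a = randInt(2, 9);
    const b = randInt(1, 9);
    const x = randInt(-10, 10);   // choose the solution first...
    const c = a * (x + b);        // ...so the answer comes out an integer
    problems.push({ equation: `${a}(x + ${b}) = ${c}`, solution: x });
  }
  return problems;
}

distributePracticeSet(5).forEach(p => console.log(p.equation));
```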

Credit Expiration & Standards Based Grading

For the background on my reassessment system, check out this previous post.

Here's the rundown of my progression in using standards-based grading over the past couple of years:

  • When students could reassess whenever they wanted, they often did so without preparation. They also rushed to do as many reassessments as possible at the end of a quarter or semester. I also needed a system for knowing who had signed up for a reassessment, which standard they were assessing, and when they were coming in.
    Solution: Students needed to complete a reassessment sign-up form through Google Forms that included reflection on work that was done to review a standard. In general though, the reflection on these forms wasn't strong. I needed more, but didn't get around to clearly defining what I meant by strong reflection.
  • The difficulty of scanning through a form and getting the information I needed prompted me to create an online site, using the Meteor programming framework, that lets students sign up for reassessments. In real time, it sorts the reassessments for a given day and helps me stay organized. The problem was that I still wasn't satisfied with what students needed to do to reassess. They needed to review their mistakes, talk to me, practice and get feedback, and then sign up. Having a way to manage that process was essential.
    Solution: The introduction of credits. Students earned credits for working after school, showing me practice problems, and doing other work to support the deliberate practice and learning needed to get closer to mastery.
  • Many students hoarded their credits until the end of the semester. This prevented the cycle of feedback about learning from continuing, and it meant the end of the semester was still a mad rush to reassess whichever standards were lowest in the gradebook, using a machine-gun approach.

This brings me to what I wrote about in my year-end reflection on SBG last year. When students hoard credits and save them until they feel like using them, they reassess less, and that's not right. I want to nudge students to reassess more often and to take opportunities as often as possible to show what they know. I've threatened to make credit expiration happen since August, and students have been asking when it would start.

No time like the present.

After working on this for a couple of days, I've activated a feature on my reassessment management app that allows credits to expire.

[Screenshot: the credit expiration controls in the reassessment management app]

Right now, I will be expiring credits manually; I need to see how students respond to this change before the system does it automatically. I get a visual indication that a given credit has expired and click the 'fire' button to expire the credit. I can also restore the credit if I change my mind. The asterisk button lets me apply the credit lifetime in the input box to a specific credit and change its expiration.

For old credits, I applied a much longer lifetime, but as students learn to adjust their behavior, I'm starting with a ten-day expiration lifetime for new ones. That seems like the right window to get students assessing soon after doing the work related to a standard. I don't think this changes the freedom to learn something on one's own timeline, which is one of the benefits of SBG. It does add pressure to assess within a given amount of time, which I do want to happen.
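
Under the hood, the expiration check itself isn't complicated. Here's a minimal sketch of the logic in plain JavaScript; the real feature lives in the Meteor app, and the field names here are hypothetical:

```javascript
const DAY_MS = 24 * 60 * 60 * 1000;

// A credit is expired when its age exceeds its lifetime (in days).
// Each credit carries its own lifetime, so old credits can be grandfathered
// in with a longer one while new credits default to ten days.
function isExpired(credit, now = new Date()) {
  return (now - credit.earnedOn) > credit.lifetimeDays * DAY_MS;
}

// The 'fire' button: mark a credit expired. The restore button undoes it.
function expireCredit(credit)  { credit.expired = true;  }
function restoreCredit(credit) { credit.expired = false; }

// The asterisk button: apply a specific lifetime to one credit.
function setLifetime(credit, days) { credit.lifetimeDays = days; }

const credit = { earnedOn: new Date('2015-09-28'), lifetimeDays: 10, expired: false };
if (isExpired(credit)) expireCredit(credit);
console.log(credit);
```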

I'm also adjusting some of the policies that cause the hoarding in the first place. Some of this tendency was a consequence of my system: I haven't let students go from a 5 to a 10 (on a ten-point scale) in one assessment session. Mastery is demonstrated over time. I typically had students go from a 5 to an 8 on a perfect first assessment, and then left the score there until the unit exam, when students could demonstrate mastery of standards in the context of many other problems.

I'm planning to loosen this progression in light of the credit expiration changes. If a student is able to demonstrate the ability to answer questions related to a standard, no matter what I throw at them, that's a pretty good indication of their mastery level. It's up to me to give reassessment questions that measure what I'm looking for, though; that's where the art of good assessment and experience comes in. I reserve the right not to raise the mastery level if I'm not convinced of a student's level, and students know that taking a reassessment does not automatically mean their level will be raised. As long as that understanding continues, I think these changes will lead to better learning.

As always, I'll keep you all updated with how well this works in practice. I can always turn this feature off if it's a disaster.