
Numbas and Randomized Assessment

At the beginning of my summer vacation, I shared the results of a project I had created to fill my own need to generate randomized questions. I subsequently got a link from Andrew Knauft (@aknauft) to another project with similar goals called Numbas. The project is out of Newcastle University, and the team is quite interested in getting more use and feedback on the site.

You can find out more at http://www.numbas.org.uk/. The actual question editor site is at https://numbas.mathcentre.ac.uk/.

[Screenshot: the Numbas question editor]

I've used the site for a couple of weeks now to generate assessments for my students. I feel pretty comfortable saying that you should be using it too, in place of my own QuestionBuilder solution. I've taken my site down and am putting that time into developing questions on Numbas. Why am I so excited about it?

  • It has all of the randomization capabilities of my site, along with robust variable browsing and grouping, conditions for variable constraints, and the error management in the interface that I had put on the back burner for another day. Numbas has these features right now.
  • LaTeX formatting is built in, along with some great simplification functions for cleaning up polynomial expressions.
  • Paper and online versions (including SCORM modules that work with learning management sites like Moodle) are generated right out of the box.
  • It's easy to create, share, and copy questions that others have created and adapt them to your own uses.
  • Visualization libraries, including GeoGebra and Viz.js, are built in and ready to go.
  • The code is open source and available to install locally if you want to do so.

I have never planned to be a one-person software company. I will gladly take the output of a team of creative folks that know what they are doing with code over my own pride, particularly when I am energized and focused on what my classroom activities will look like tomorrow. The site makes it easy to generate assessments that I can use with my students with a minimal amount of friction in the process.

I'll get more into the details of how I've been using Numbas shortly. Check out what they've put together - I'm sure you'll find a way to make it part of your workflow this year.

Credit Expiration & Standards Based Grading

For the background on my reassessment system, check out this previous post.

Here's the rundown of my progression in using standards based grading over the past couple of years:

  • When students could reassess whenever they wanted, they often did so without preparation. They also rushed to do as many reassessments as possible at the end of a quarter or semester. I also needed a system to know who had signed up for a reassessment, which standard they were reassessing, and when they were coming in.
    Solution: Students needed to complete a reassessment sign-up form through Google Forms that included reflection on work that was done to review a standard. In general, though, the reflection on these forms wasn't strong. I needed more, but didn't get around to clearly defining what I meant by strong reflection.
  • The difficulty of scanning through a form and getting the information I needed prompted me to create a site using the Meteor programming framework that lets students sign up for reassessments. In real time, it sorts the reassessments for a given day and helps me stay organized. The problem was that I still wasn't satisfied with what students needed to do to reassess. They needed to review their mistakes, talk to me, practice and get feedback, and then sign up. Having a way to manage that process was essential.
    Solution: The introduction of credits. Students earned credits for working after school, showing me practice problems, and doing other work to support the deliberate practice and learning needed to get closer to mastery.
  • Many students hoarded their credits until the end of the semester. This prevented the cycle of feedback about learning from continuing, and caused the end of the semester to still be a mad rush to reassess whichever standards were lowest in the gradebook using a machine-gun approach.

This brings me to what I wrote in my year-end reflection on SBG at the end of last year. When students hoard credits and save them until they want to use them, less reassessment happens, and that's not right. I want to nudge students to reassess more often and to take every opportunity to show what they know. I've threatened to make credit expiration happen since August, and students have been asking when it would start.

No time like the present.

After working on this for a couple of days, I've activated a feature on my reassessment management app that allows credits to expire.

[Screenshot: the credit expiration interface in the reassessment app]

For now, I will be expiring credits manually; I need to see how students respond to this change before the system does it automatically. I get a visual indication that a given credit is past its lifetime and can click the 'fire' button to expire it. I can also restore the credit if I change my mind. The asterisk button applies the credit lifetime in the input box to a specific credit, changing its expiration.

For old credits, I applied a much longer lifetime to give students a chance to adjust their behavior; new credits start with a ten-day lifetime. That seems to be just the right amount of time to get students assessing reasonably soon after doing work related to a standard. I don't think this changes the lack of time pressure to learn something within a given window, which is one of the benefits of SBG. It does add pressure to assess within a given amount of time, which I do want to happen.
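The mechanics are simple enough to sketch. The app itself is built in Meteor, so this is just a hedged Python sketch of the logic; the Credit class and its fields are hypothetical stand-ins, not the app's actual code:

    from datetime import datetime, timedelta

    class Credit:
        """A reassessment credit with a lifetime in days."""
        def __init__(self, earned_on, lifetime_days=10):
            self.earned_on = earned_on          # when the student earned it
            self.lifetime_days = lifetime_days  # adjustable per credit
            self.expired = False

        def is_past_lifetime(self, today=None):
            """True if the credit is older than its lifetime."""
            today = today or datetime.now()
            return today > self.earned_on + timedelta(days=self.lifetime_days)

        def expire(self):   # the 'fire' button
            self.expired = True

        def restore(self):  # undo, if I change my mind
            self.expired = False

    credit = Credit(earned_on=datetime(2015, 10, 1))
    print(credit.is_past_lifetime(datetime(2015, 10, 15)))  # True: older than ten days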

I'm also adjusting some of the policies that caused the hoarding in the first place. Some of this tendency was a consequence of my system: I haven't let students go from a 5 to a 10 (on a ten-point scale) in one assessment session, since mastery is demonstrated over time. I typically had students go from a 5 to an 8 on a perfect first assessment, and then left the score there until the unit exam, when students could demonstrate mastery of standards in the context of many other problems.

I'm planning to loosen this progression in light of the credit expiration changes here. If a student can demonstrate the ability to answer questions related to a standard, no matter what I throw at them, that's a pretty good hint about their mastery level. It's up to me, though, to give reassessment questions that measure what I'm looking for. That's where the art of good assessment and experience comes in. I reserve the right not to raise the mastery level if I'm not convinced of a student's level; students know that taking a reassessment does not automatically mean their level will be raised. As long as that understanding continues, I think these changes will lead to better learning.

As always, I'll keep you all updated with how well this works in practice. I can always turn this feature off if it's a disaster.

Building (Ev)anAcademy Exercises for Reassessments

I've written multiple times previously about WeinbergCloud, the system I created using the Meteor framework that lets students sign up for reassessments. Over the course of the semester, I've been developing some aspects of the system that I'm excited about, and I'll talk about them all eventually. One in particular has held my focus for the past few days, and it's probably the one feature I've been talking about for the longest: building a reassessment engine within the system.

Part of this is out of necessity: the wireless network settings on our school network have changed, so the Python reassessment engine I've used for the last two years no longer works when hosted from my personal laptop. I've managed reasonably well this year using problems from textbooks and handouts, but it was time to automate this using my new knowledge of JavaScript and Meteor.

Whatever your feelings about Khan Academy, the reality is that the organization has put a lot of energy and resources into developing a pretty comprehensive web application built around assessment. Not only are these resources available for free for teachers and students to use, but the source code is as well. The code for running your own local version of the exercises has been available on GitHub here for a while. The JavaScript libraries that go with these exercises are also pretty impressive and capable: generating random numbers with constraints, simplifying fractions and expressions, and numerous other helper functions are already written by people who code much better than I do. They also wrote a math typesetting library called KaTeX that has some performance advantages over other libraries...or so I hear. I'm sure much of the 'why' there is lost on me.

After two days of tinkering, I've adapted some of their code in my app to generate questions for reassessments. Writing questions and defining variables is all done in HTML, just as in their own local application, which means it's possible to add questions without having to load in a database through FTP or some other method. I had to write the code that renders the questions onto the webpage myself, but I eventually found ways to make it work for me.

I can enter HTML and JavaScript definitions into text areas. Here's an example of a question asking for a simplified fraction for slope:
[Screenshot: the question editor with a slope question and its variable definitions]

A preview appears below the editor so I can make sure the question looks the way I want it to. The variable definitions are strings of JavaScript code that calculate and define the variables using the Khan Academy utility functions. The question text is then rendered using KaTeX. The random values change on each reload of the page, but they could potentially be fixed for an individual student's quiz.
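The real implementation is JavaScript running inside the app, but the template-plus-variables idea is easy to show in a few lines of Python. This is an analogous sketch with made-up frame fields, not the app's code: each definition is a string of code, evaluated in order, with the results substituted into the question text on each load.

    import random

    # A hypothetical question frame: definitions are evaluated top to bottom,
    # so later definitions could reference earlier values if needed.
    frame = {
        "definitions": {
            "RUN": "random.randint(1, 10)",
            "RISE": "random.randint(-10, 10)",
        },
        "text": "Find the slope of the line through (0, 0) and ({RUN}, {RISE}).",
    }

    def render(frame):
        values = {}
        for name, code in frame["definitions"].items():
            values[name] = eval(code, {"random": random}, values)
        return frame["text"].format(**values)

    print(render(frame))  # new random values on every call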

As I create frames for questions, I get a virtually infinite supply of questions for assessing student learning. Here are a bunch that I put together while testing the interface:

[Screenshot: a batch of generated questions used to test the interface]

The next step is to link each question frame to a course standard and build my database of questions. I'm loving the possibilities for building on this further, and will share as they develop. Stay tuned.

Progress on a Python-Powered Randomized Quiz Generator

One of the projects floating around in my head since the end of last year is an easy-to-use tool that automatically generates questions for students to test their skills, either on their own or in class. My first attempt came during a Geometry unit on translations, which was also my first attempt at implementing standards based grading. I was taking a Udacity course on web applications at the time and realized that an online quiz generator would be the easiest way to give students a sense of how they were doing without needing me to be part of the process.

As most people doing reassessments tend to be, I was a bit overwhelmed with the paperwork side of things, especially because many of the students just wanted to know whether they were right or not. I had made some Python programs to generate quiz questions one by one and decided to try to adapt them to a web application so students could answer questions that had different numbers every time. I had tried other options such as PollEverywhere and Socrative to at least collect the data and check it as right/wrong (which would have been good enough for a start in terms of collecting data, but left out the randomization part). The problem with these is that I believe they are hosted in the US, and they are incredibly slow here without a VPN. I needed a solution that was fast, and if I could add the randomization, even better. I decided to adapt my quiz generator to a web application hosted on Google App Engine.

Needless to say (at least for me), this was not an easy task. I had only a loose understanding of how to manage GET and POST requests and how to use cookies to store the random values used in each question. The biggest challenge came from checking answers on the server side. For someone figuring out Python concepts as he goes, it involved a lot of fists on the keyboard at the time. My attempt is posted here. There were tons of bugs (and still are), but I at least got up the nerve to try it in class. The morning I was excited to premiere it, I also found out another interestingly infuriating nugget of info: Google App Engine is blocked in China.
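The cookie idea itself is straightforward; wrestling it into App Engine's request handlers was the hard part. Stripped of all the web plumbing, and with hypothetical function names standing in for the real handlers, the approach amounts to this:

    import random

    def handle_get():
        """Serve a question and remember the random values in a cookie."""
        a, b = random.randint(1, 10), random.randint(1, 10)
        cookie = "a=%d&b=%d" % (a, b)        # echoed back by the browser on POST
        question = "What is %d + %d?" % (a, b)
        return question, cookie

    def handle_post(cookie, submitted):
        """Recover the values from the cookie and grade the submitted answer."""
        values = dict(pair.split("=") for pair in cookie.split("&"))
        correct = int(values["a"]) + int(values["b"])
        return abs(float(submitted) - correct) < 0.001

    question, cookie = handle_get()
    print(question)                   # e.g. What is 4 + 9?
    print(handle_post(cookie, "13"))  # True only if the answer matches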

I gave up at the time, as it was almost summer. I was interested in helping out with the development of the Physics Problem Database project over the summer, but opportunities for sitting down and coding during a whirlwind tour of the US seeing friends and family weren't numerous. It's amazing to see how John, Andy, and others have gotten the database site together and doing functionally cool things in such a short amount of time. I spent some time over the summer learning PHP and MySQL, but was pulled back into Python when I saw what web.py and web2py can do for web applications. I see a lot of features and possibility there, but fitting my ideas to those frameworks is beyond what I know how to do and what I've been able to figure out while prepping for and starting school. That will come later.

I keep coming back to the fact that randomization needs to be built into the program interface from the beginning. I want students who need practice to be able to get it with different problems each time, because that frees them from needing me there either to generate the problems myself or to keep them from creating impossible ones. I want the reassessment process to be as simple as possible, and for the lowest-level skills, students don't necessarily need me testing them in person. That's what in-person interviews and conversations (including those through BlueHarvest) are all about. I won't rely on a tool like this to check proficiency, but it's a start for giving students a tool that gets them thinking along those lines.

I've had the structure for this in my head for a while, and I started sketching it out in a new Python program last week. This morning, after learning a bit more about the newer string formatting options in Python, which offer more control than basic string substitution, I hunkered down and put together what is at least a workable version of what I want to do.
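For anyone who hasn't run into them, the difference is the format mini-language: precision, alignment, and named fields all live in one spec. A quick comparison:

    slope = 2.0 / 3

    # Old-style substitution: works, but limited control
    print("The slope is %f" % slope)               # The slope is 0.666667

    # Newer str.format(): named fields with precision and alignment
    print("The slope is {m:.3f}".format(m=slope))  # The slope is 0.667
    print("|{m:^10.2f}|".format(m=slope))          # |   0.67   |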

Please visit here to see the code, and here to give it a shot on repl.it.

The basic structure is that every question can use random integers, an irrational decimal value, or signed integers in its text. The methods that generate and substitute the random numbers inside the Question class are messy, but with them in place it is fairly easy to generate questions with random values and answers. I admit that the formatting stinks, but the structure is there. I could theoretically make some questions this way that students could use on Monday, but I probably won't just yet. I think a nap is in order.
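The real code is at the links above; as a rough sketch of the structure (with the names simplified from the actual program), the substitution idea looks something like this:

    import random

    class Question:
        """Question text whose placeholder tokens get fresh random values."""
        def __init__(self, text, answer):
            self.text = text      # contains placeholder tokens
            self.answer = answer  # function computing the answer from the values

        def generate(self):
            values = {
                "RANDINT": random.randint(1, 10),            # random integer
                "RANDSIGNED": random.randint(-10, 10),       # signed integer
                "RANDDEC": round(random.uniform(0, 10), 4),  # decimal value
            }
            rendered = self.text
            for token, value in values.items():
                rendered = rendered.replace(token, str(value))
            return rendered, self.answer(values)

    q = Question("What is RANDINT plus RANDSIGNED?",
                 lambda v: v["RANDINT"] + v["RANDSIGNED"])
    text, correct = q.generate()
    print(text, "->", correct)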

Next steps:

  • I need to work on the answer-checking algorithm. At the moment, it just checks whether an entered decimal answer is within a certain tolerance of the calculated answer. My plan is to expand the Question definition to include another input variable for question type. Single numerical answers are one question type, coordinates are another, and symbolic equations or expressions are yet another I'd like to include. Based on the question type, the answer method in the Question class can be adjusted (see the sketch after this list).
  • As an extension of this, I'd like to include sympy for both question generation and answer checking. It can show that two symbolic expressions are equal to each other, among many other really nice capabilities. This will let me generate all sorts of nice calculus and algebraic manipulation questions without too much difficulty.
  • I'd like to be able to format things nicely for algebraic questions, and possibly generate graphical questions as well.
  • The ultimate goal is to then get this nicely embedded as a web application. As I mentioned before, there is too much going on in the web2py framework for me to really get how to do this, but I think this is something I can do with a bit of help from the right sources.
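As a hedged sketch of where the first two items above are headed (the function names are mine, not final, and it assumes sympy is installed):

    import sympy

    def check_numeric(submitted, correct, tolerance=0.001):
        """Numerical question type: a single answer within a tolerance."""
        return abs(float(submitted) - correct) <= tolerance

    def check_symbolic(submitted, correct):
        """Symbolic type: expressions are equal if their difference simplifies to 0."""
        difference = sympy.sympify(submitted) - sympy.sympify(correct)
        return sympy.simplify(difference) == 0

    print(check_numeric("0.667", 2.0 / 3))                 # True
    print(check_symbolic("(x + 1)**2", "x**2 + 2*x + 1"))  # True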

I'm having a ball learning all of this, and knowing that it will eventually make for a nice learning tool that students benefit from using is a nice incentive to keep going.