# SBG and Leveling Up, Part 3: The Machine Thinks!

Read the first two posts in this series here:

...or you can read this quick review of where I've been going with this:

• When a student asks to be reassessed on a learning standard, the most important inputs that contribute to the student's new achievement level are the student's previously assessed level, the difficulty of a given reassessment question, and the nature of any errors made during the reassessment.
• Machine learning offers a convenient way to find patterns that I might not otherwise notice in these grading patterns.

Rather than design a flow chart that arbitrarily figures out the new grade given these inputs, my idea was to simply take different combinations of these inputs and use my experience to determine what new grade I would assign. Any patterns in those decisions (if there are any) would then be found by the machine learning algorithm.

I trained the neural network methodically. These were the general parameters:

• I only did ten or twenty grades at any given time to avoid the effects of fatigue.
• I graded at different times of day: in the morning, in the afternoon, before and after lunch, and some at night.
• I spread this out over a few days to minimize the effects of any one particular day on the training.
• When I noticed there weren't many grades at the upper end of the scale, I changed the program to generate instances of just those grades.
• The permutation-fanatics among you might be interested in the fact that there are 5*3*2*2*2 = 120 possibilities for numerical combinations. I ended up grading just over 200. Why not just grade every single possibility? Simple - I don't pretend that I'm really consistent when I'm doing this. That's part of the problem. I want the algorithm to figure out what, on average, I tend to do in a number of different situations.
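To give a concrete sense of what the training data might look like, here's a sketch in JavaScript of how one graded combination could be encoded for a neural network. The field names, scales, and error categories here are my assumptions for illustration, not the actual code behind the CodePen:

```javascript
// Hypothetical encoding of one graded reassessment instance.
// Scales are assumptions: levels run 0-10, difficulty runs 1-3,
// and each error type is a yes/no flag.
function encodeInstance(instance) {
  return {
    input: {
      previousLevel: instance.previousLevel / 10, // normalize 0-10 to 0-1
      difficulty: instance.difficulty / 3,        // normalize 1-3 toward 0-1
      conceptualError: instance.conceptualError ? 1 : 0,
      algebraicError: instance.algebraicError ? 1 : 0,
      arithmeticError: instance.arithmeticError ? 1 : 0
    },
    output: {
      newLevel: instance.newLevel / 10 // the grade I assigned, normalized
    }
  };
}

// A library such as brain.js can train on an array of these encoded
// instances and then predict a new level for unseen combinations.
const encoded = encodeInstance({
  previousLevel: 8,
  difficulty: 2,
  conceptualError: false,
  algebraicError: true,
  arithmeticError: false,
  newLevel: 8
});
console.log(encoded.input.previousLevel); // 0.8
```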

After training for a while, I was ready to have the network make some predictions. I made a little visualizer to help me see the results:

You can also see this in action by going to the CodePen, clicking on the 'Load Trained Data' button, and playing around with it yourself. There's no limit to the values in the form, so some crazy results can occur.

The thing that makes me happiest is that there's nothing surprising about the results.

• Conceptual errors matter most in limiting students from making progress from one level to the next. This makes sense: once a student has made a conceptual error, I generally don't let them increase their proficiency level.
• Students with low scores who ask for the highest difficulty problems probably shouldn't.
• Students with an 8 can get to a 9 by doing a middle difficulty problem, but can't get to a 10 in one reassessment without doing the highest difficulty problem. On the other hand, a student at a 9 who makes a conceptual error on a middle difficulty problem is brought back to a 7.

When I shared this with students, what they seemed most interested in was using it to decide what sort of problem to request for a given reassessment. Some students with a 6 have come in asking for the simplest level question so they can be guaranteed a rise to a 7 if they answer correctly. A lot of level 8 students want to become a 10 in one go, but often make a conceptual error along the way and are limited to a 9. I clearly have the freedom to classify these different types of errors as I see fit when a student comes to meet with me. When I ask students what they think about having this tool available to them, the response is usually that it's a good way to be fair. I'm pretty happy about that.

I'll continue playing with this. It was an interesting way to analyze my thinking around something that I consider to still be pretty fuzzy, even this long after getting involved with SBG in my classes.

# Notes to the Future with Google Scripts & PearDeck

I wrote previously about my use of PearDeck in an end of semester activity. One of the slides in this deck asked students to write themselves a note containing the things they would want to remember three weeks later, at the beginning of semester two. With vacation over, that day has arrived. I wrote a Google script that automatically sends each student the note they wrote. This let me send the notes out without reading them in detail. I saw some of them, but made an effort not to see who was writing what.

The PearDeck output spreadsheet for this deck looks like this:

Column 3 of the spreadsheet contains the students' email addresses, which made it easy to match each address with the corresponding note to the future in column 4. By selecting 'Script Editor' from the Tools menu, you can create a script that processes this data.

You can delete the code that is there, and then paste in the code below to create the email script.

Save the script, then select 'sendEmails' from the Run menu. You'll need to give the script permission to read the spreadsheet before it can proceed. The emails will all be sent from your Google email address.

Code:

function sendEmails() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var startRow = 2; // First row of data to process
  var numRows = 12; // Number of rows to process

  // Get the block of data starting at (startRow, 1), spanning numRows rows
  // and 5 columns: the first 12 students in this class, and the five columns
  // of data I want to use for the email.
  var dataRange = sheet.getRange(startRow, 1, numRows, 5);
  var data = dataRange.getValues(); // Store the spreadsheet data in an array

  for (var i = 0; i < data.length; i++) { // for each row in the spreadsheet
    var row = data[i];

    var studentEmail = row[2]; // the student email is in element 2, which is the third column

    var subject = "Note To The Future (AKA Now): " + studentEmail; // email subject

    // This text, formatted using HTML tags, appears before each student's note.
    var greeting = "Happy New Year!<br><br>" +
      "Before break, I asked you to write an email to yourself with things " +
      "you would want to remember at the beginning of the semester. Whatever " +
      "you wrote in that text box in Pear Deck is below for you to enjoy.<br><br>" +
      "I'm looking forward to seeing you Tuesday or Wednesday in class.<br><br>" +
      "Be well,<br>EMW<br><br>";

    var message = greeting + row[3]; // Combines the greeting and the student's individual note

    // Sends the email to the student with the subject and message built above.
    // htmlBody means the email will be formatted as HTML, not just text.
    MailApp.sendEmail({to: studentEmail, subject: subject, htmlBody: message});
  }
}

# Holiday Travel and Exporting PearDeck Data to Desmos

One of the unique phenomena of international schools is the reality that, during a vacation, the school population disperses to locations across the world. I had students do an end of semester reflection through PearDeck, and one of the slides asked students to drag a dot to where they were going to spend the vacation.

PearDeck allowed me to see the individual classes and share these with the students one at a time. I wanted to create a composite of all of the classes together in Desmos to share upon our return to classes, which happens tomorrow. You can find the result of this effort below. This is the combined data for draggable slides from five different sessions of the same deck.

The process of creating this image was a bit of work to figure out, but in the end wasn't too hard to pull off. Here's how I did it.

The export function of a completed PearDeck session, among other things, gives the coordinates of each student's dragged dot in a Draggable slide. I could not use these coordinates as is, since graphing them on top of the map image in Desmos did not yield the correct locations. I guessed that these coordinates represent a percentage of the width and height of the image used for the Draggable background, since the images people upload are likely all of different sizes. A brief search in the documentation turned up no official confirmation, but I'm fairly sure this is the case. An additional complication is that the origin is at the upper left hand corner, which is typical in pixel-based graphics programming, but wrong for a Cartesian system as in Desmos.

This means that an exported data point of (40, 70) is at 40% of the width of the image and 70% of its height, measured from the top left corner.

Luckily, Desmos makes it pretty easy to apply a transformation to the data to make it graph correctly. I took all of the data from the PearDeck export, pasted it into a spreadsheet class by class, and then pasted the aggregate data into a Desmos table. Desmos appears to have a 50 point limitation for pasting data this way, which is why the Desmos link below has two separate tables.
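To make the transformation concrete, here's a small JavaScript sketch. The function name is mine, and it assumes the background image sits in Desmos with its bottom-left corner at the origin, with width W and height H:

```javascript
// Convert a PearDeck draggable export point (percent of image width/height,
// measured from the top-left corner) into Desmos coordinates, assuming the
// map image is placed with its bottom-left corner at the origin.
function pearDeckToDesmos(xPercent, yPercent, width, height) {
  return {
    x: (xPercent / 100) * width,            // percent of width from the left
    y: ((100 - yPercent) / 100) * height    // flip: percent measured from the top
  };
}

// The example point from above: (40, 70) on a 360-by-180 image.
const p = pearDeckToDesmos(40, 70, 360, 180);
console.log(p.x, p.y); // 144 54
```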

Click here to see the graph and data on Desmos

If there's an easier way to do this, I'd love to hear your suggestions in the comments.

# Releasing Today: States-n-Plates

I'm excited to share States-n-Plates , a project I built with Dan Meyer.

Dan proposed the idea for this activity a while ago with his typically high level of excitement about activities that provoke interesting and productive classroom conversation. This time, however, it wasn't about mathematics. I was looking for a bigger scale project to help me develop my ReactJS skills, so I took it on. Dan was patient enough to let me hack away at the project in this context. Though I could have certainly done it more quickly using jQuery or another framework, I wanted to try building this project in a particular way.

Specifically:

• I wanted to be able to play the game myself when I was done. Hard coding everything into a series of HTML pages would have likely resulted in my seeing each plate and the answer over the many times I reloaded during development. By abstracting the behavior of the game to be automated for each group of license plates, I saw most of the plates for the first time during testing.
• I wanted to experiment with a drag and drop library for React as an exercise for use in future experiments.
• I also wanted to have a slightly different UI behavior for the desktop and mobile versions. This functionality came from Bootstrap. This led to a bit of wonkiness on small phone displays, but larger tablets work great using touch, and the desktop version works well using drag and drop.
• I also wanted to experiment with modularity of both files and React component JSX files. I used Webpack. I don't understand Webpack.

As in my past collaborations with Dan, I learned to do a number of things I didn't think I could do. For example, I told Dan 'no' on the fading effect at one point, and then subsequently figured out how to make it happen through lots of searches, StackOverflow, and careful reading of the React documentation.

If you want to play with the code, the Github repository is at https://github.com/emwdx/states-n-plates/. You don't need the big node_modules directory for this to work locally, but it is required if you want to change the bundle.js file.

I have more thoughts on the learning process I went through, but that will be shared soon. Have fun and share with your friends.

# Getting Grade Data from PowerSchool Pro (#TeachersCoding)

Given that I use standards based grading with most of my classes, the grades I assign to students change quickly. I'm modifying those scores multiple times a day in some cases in my school's instance of PowerSchool Pro.

What the system currently lacks is an easy way to get that data out. For whatever reason, the only export format is PDF. This makes it difficult to get things into a spreadsheet.

After some hacking around in the console, I was able to put together a script that scrapes a class scoresheet page for the student names and assignment names and stores the result in a variable called exportData. This code is included below, and is also here in a gist. Paste the entire code into the console and run it. Then type in exportData and the scraped data will appear.

You can then copy and paste the resulting string (leaving out the quotes) into Excel, OpenOffice, or Google Sheets and the data will appear there, ready to be spreadsheet-ified.
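For reference, the exported string is just tab-separated columns and newline-separated rows, which is why spreadsheet software accepts it directly. A small sketch (the sample data here is made up) shows how that format maps back to rows and cells:

```javascript
// Parse a tab/newline-delimited export string back into rows of cells.
function parseExport(exportData) {
  return exportData
    .trim()
    .split("\n")                 // one line per student (plus the header)
    .map(function (line) {
      return line.trim().split("\t").map(function (cell) {
        return cell.trim();      // strip the padding spaces around each cell
      });
    });
}

// Illustrative sample, not real student data.
var sample = "Name \tQuiz 1\t Quiz 2\n Alice\t 8\t 9\n Bob\t 7\t 10";
var rows = parseExport(sample);
console.log(rows[1]); // the second row: Alice's name and grades
```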

The only place where this doesn't work perfectly is when there are more students than will fit on the page. As far as I could tell after poking around, the grade data is re-rendered to fit the page as scrolling occurs. I didn't work that hard to see if the data is stored somewhere else on the page, so someone with a bit more insight might be able to improve upon my work.

Here is the full code:

var nameElements = $('.student-name').toArray();
var assignmentElements = $('var').toArray();
var names = [];
var assignments = [];
var assignmentNumber;

// Collect the assignment names from the page.
assignmentElements.forEach(function (element) {
  assignments.push(element.innerHTML);
});

// Each student row has an id containing 'std'. Pull the student's name
// and the grade elements out of each row.
var rows = $("tr[id*='std']").toArray();
rows.forEach(function (row) {
  var currentName = $(row).find('.student-name')[0].innerHTML;
  var gradeElements = $(row).find('var').toArray();
  var grades = [];
  gradeElements.forEach(function (grade) {
    grades.push(grade.innerHTML);
  });
  names.push([currentName, grades]);
});

// The scoresheet renders each student row twice, so keep only the first half.
names = names.slice(0, 0.5 * names.length);

assignmentNumber = names[0][1].length;

// Build a tab-separated header row. The assignment names also appear twice
// on the page, so take every other entry.
var assignmentString = 'Name\t';
for (var i = 0; i < 2 * assignmentNumber - 1; i += 2) {
  assignmentString += assignments[i] + '\t';
}

// Then build one tab-separated row of grades per student.
var gradeString = '';
names.forEach(function (name) {
  var currentString = name[0] + '\t';
  name[1].forEach(function (grade) {
    currentString += grade + '\t';
  });
  gradeString += currentString + '\n';
});

var exportData = assignmentString + '\n' + gradeString;

# Generating Function Library Quizzes (#TeachersCoding)

I've required my IB classes in the past two years to be able to draw some standard functions from memory as part of our function families unit. Creating quizzes for this has been a hassle since I've manually had to build these using Word or LibreOffice. I greatly dislike formatting things using either software package.

I decided this week that creating these quizzes using HTML seemed like a perfect application of my developing React skills. Here's the result:

The order of the functions is randomized on each page load, which makes it easy to generate new versions. I've been able to export these as PDF files and then send them right to the printer.
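The random ordering itself doesn't need anything React-specific - a standard Fisher-Yates shuffle over the list of functions, run once per page load, is enough. Here's a sketch (the function list is illustrative, not the actual one from the quiz):

```javascript
// Shuffle an array in place using the Fisher-Yates algorithm.
function shuffle(items) {
  for (let i = items.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1)); // pick an index from 0..i
    [items[i], items[j]] = [items[j], items[i]];   // swap the two entries
  }
  return items;
}

// Each page load produces a new ordering of the function family list.
const families = ["linear", "quadratic", "cubic", "absolute value",
                  "square root", "exponential", "logarithmic"];
console.log(shuffle(families.slice()));
```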

You can access the code here on CodePen:

See the Pen FunctionLibraryQuiz by Evan Weinberg (@emwdx) on CodePen.

Feel free to use this or modify to fit your needs.

# Two Lane Road and Collaborative Data Collection

I love doing three act problems. This fact should surprise nobody that regularly reads my blog.

In tasks that involve prediction or measurement from a range of sources, I see lots of tables of values made by students that stay in the notebook. I have always wanted to get that data in the hands of the rest of the students to use, or not use, as they see fit. In one of my previous iterations, I pasted the data into a shared Google spreadsheet that students could then paste into a Desmos graph, again, if they felt doing so would be helpful. This was an incredibly rich source of material for conversations between students. Still, that extra step of having to paste from one collaborative document (Google) to a non-collaborative one (Desmos Calculator) was one more step than I felt was needed.

Of course, you're now screaming at the screen. "Calling out Desmos for being non-collaborative is entirely off base, Evan", you say. I agree to an extent. Their own activities share data collected by individual students, and on the teacher side, the Activity Builder does the same thing for letting teachers see student data all in one place. They do this incredibly well. Students also get to see each other's answers when teachers let them. What doesn't happen right now is students seeing each other's graphs, tables, and lists of expressions.

This, along with a desire to play with the Desmos API, is why I created DataTogether (Github repository here), a hacky way to make Desmos data collaborative. The page is written in React, and uses Firebase to do the realtime data connection.

Dan Meyer tweeted shortly after that these changes might be somewhere in the pipeline already:

This is probably why I may not be adding a lot of code comments to my code in the near future.

I did my Two Lane Road 3-act with a small group of students this morning on account of tenth graders being out for the PSAT. After the standard Act 1 conversation, and a really great conversation about agreements between groups on collecting data from the video, the students began collecting data on the red and blue cars.

After my profuse apologies for the limitations of my code, the students were able to efficiently collect data together on separate computers:

I then had each student use the tools within Desmos to construct a linear model from the data. The fact that two computers were looking at the same data, but in different Desmos windows, paid significant dividends when two students on the same team created their models in different ways. One student made a regression. Another created a line that went through one set of points perfectly, but missed another. Math class conversation gold right there.

I exported both of their data through the console (code shown below) and pasted it into Desmos. I then put together a simulation of the red and blue car so that the teams could see what their car looked like in simulation.

This allowed us to make a prediction directly off of their models that looked like the original video.

We ran out of time in the end to do much more than share predictions and watch the third act, but I'm pretty pleased with how things went overall. My paper handouts with three printed color frames of the video went unused.

A big shout-out of thanks to everyone for helping test the data collection tool I shared earlier.

Here's a screenshot of our age vs. teaching years data:

The data can be downloaded from the DataTogether page by loading data set 3DR9, then going to the console and entering the code below:

ptString = "";
myComponent.state.groupData.forEach(function(pt){ptString=ptString+pt.x+" \t "+pt.y+" \n "});

This string can then be pasted right into the Desmos expression list if you want to play with it.

# Numbas and Randomized Assessment

At the beginning of my summer vacation, I shared the results of a project I had created to fill a need of mine to generate randomized questions. I subsequently got a link from Andrew Knauft (@aknauft) about another project called Numbas that had similar goals. The project is out of Newcastle University and the team is quite interested in getting more use and feedback on the site.

You can find out more at http://www.numbas.org.uk/. The actual question editor site is at https://numbas.mathcentre.ac.uk/.

I've used the site for a couple of weeks now for generating assessments for my students. I feel pretty comfortable saying that you should be using it too, and in place of my own QuestionBuilder solution. I've taken the site down and am putting time into developing my own questions on Numbas. Why am I so excited about it?

• It has all of the randomization capabilities of my site, along with robust variable browsing and grouping, conditions for variable constraints, and error management in the interface - features I had put on the back burner for another day. Numbas has them right now.
• LaTeX formatting is built in, along with some great simplification functions for cleaning up polynomial expressions.
• Paper and online versions (including SCORM modules that work with learning management sites like Moodle) are generated right out of the box.
• It's easy to create, share, and copy questions that others have created and adapt them to your own uses.
• Visualization libraries, including GeoGebra and Viz.js, are built in and ready to go.
• The code is open sourced and available to install locally if you want to do so.

I have never planned to be a one-person software company. I will gladly take the output of a team of creative folks that know what they are doing with code over my own pride, particularly when I am energized and focused on what my classroom activities will look like tomorrow. The site makes it easy to generate assessments that I can use with my students with a minimal amount of friction in the process.

I'll get more into the details of how I've been using Numbas shortly. Check out what they've put together - I'm sure you'll find a way to include it in part of your workflow this year.

# QuestionBuilder: Create and Share Randomized Questions

I've written previously about my desire to write randomized questions for the purpose of assessment. The goal was never to make a worksheet generator - those exist on the web already. Instead, I wanted to make it easy to create assessment questions that are similar in form, but different enough from each other that the answers or procedures to solve them are not necessarily identical.

Since January, I've been working on a project called QuestionBuilder. It's a web application that does the following:

• Allows the creation of assessment questions that contain randomized elements, values, and structures.
• Uses regular Javascript, HTML, and the KaTeX math rendering library to create and display the questions.
• Makes it easy to share questions you create with community members and build upon the work of others to make questions that work for you.
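To illustrate the kind of question this enables, here's a stripped-down sketch of the randomization pattern in plain Javascript. The helper names and the specific question are mine for illustration, not taken from QuestionBuilder itself:

```javascript
// Return a random integer between min and max, inclusive.
function randInt(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

// Build one instance of a randomized factoring question.
// The structure stays fixed; the numbers change on every call.
function buildQuestion() {
  const r1 = randInt(1, 9);             // first root
  let r2 = randInt(1, 9);
  while (r2 === r1) r2 = randInt(1, 9); // constraint: the roots must differ
  return {
    // On the site, this string would be rendered with KaTeX.
    prompt: "Solve: x^2 - " + (r1 + r2) + "x + " + (r1 * r2) + " = 0",
    answer: [r1, r2]                    // roots of (x - r1)(x - r2) = 0
  };
}

console.log(buildQuestion().prompt);
```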

Here's a video in which I convert a question from the June 2016 New York State Regents exam for Algebra 2 Common Core into a randomized question. Without all of my talking, this is a quick process.

I've put a number of questions on the site already to demonstrate what I've been using this to do. These range from simple algebra to physics questions. Some other folks I appreciate and respect have also added questions in their spare time.

For now, you'll need to create an account and log in to see these questions in action. Go to http://question-builder.evanweinberg.org, make an account, and check out the project as it exists at this point.

My hope is to use some time this summer to continue working on it to make it more useful for the fall. I'll also be making some other videos to show how to use the features I've added thus far. Feel free to contact me here, through Twitter (@emwdx), or by email (evan at evanweinberg.com) if you have questions or suggestions.

# Problems vs. Exercises

My high school mathematics teacher, Mr. Davis, classified all learning tasks in our classroom into two categories: problems and exercises. The distinction between the two is pretty simple. Problems set up a non-routine mathematical conflict. Once that conflict is resolved once, problems cease to be problems - they become exercises. Exercises tend to develop content skills or application of knowledge - problems serve to develop one's habits of mathematical practice and understanding.

I tend to give a mixture of the two types to my students. The immediate question in an assessment context is whether my students have a particular skill or can apply concepts. Sometimes this can be established by doing several problems of the same or similar type. This is usually the situation when students sign up for a reassessment on a learning standard. In cases where I believe my students have overfit their understanding to a particular question type, I might throw them a problem - a new task that requires higher levels of understanding. I might also give them a task that I know is similar to a question they had wrong last time, with a twist. What I have found over time is that there needs to be a difference between what I gave them the first time and what I give them on a subsequent assessment, or I won't get a good reading on their mastery level.

The difficulty I've established over the past few years learning to use SBG has been curating my own set of problems and exercises for assessment. I have textbooks, both electronic and hard copy, and I've noted down the locations of good problems in analog and digital forms. I've always felt the need to guard these and not share them with students so that they don't become exercises. My sense is that good problems are hard to find. Good exercises, on the other hand, are all over the place. This also means that if I've given Student A a particular problem, I have to find an entirely different one for Student B in case the two pool their resources. In other words, Student A's problem then becomes Student B's exercise. I haven't found that students end up thinking that way, but I still feel weird about using the same problem multiple times.

What I've always wanted was a source of problems that somehow straddled the two categories. I want to be able to give Student A a specific problem that I carefully designed for assessing a particular standard, and student B a different manifestation of that same problem. This might mean different numbers, or a slight variation that still assesses the same thing. I don't want to have to reinvent the problem every single time - there must be a way to avoid repeating that effort. By carefully designing a problem once, and letting, say, a computer make randomized changes to different instances of that problem, I've created a task I can use with different students. Even if I'm in the market for exercises, it would be nice to be able to create those quickly and efficiently too. Being able to share that initial effort with other teachers who also share a need would be a bonus.

I think I've made an initial stab at creating something to fit that need.