# QuestionBuilder: Create and Share Randomized Questions

I've written previously about my desire to write randomized questions for the purpose of assessment. The goal was never to make a worksheet generator - those exist on the web already. Instead, I wanted to make it easy to create assessment questions that are similar in form, but different enough from each other that the answers or procedures to solve them are not necessarily identical.

Since January, I've been working on a project called QuestionBuilder. It's a web application that does the following:

• Allows the creation of assessment questions that contain randomized elements, values, and structures.
• Uses plain JavaScript, HTML, and the KaTeX math rendering library to create and display the questions.
• Makes it easy to share questions you create with community members and build upon the work of others to make questions that work for you.
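To give a flavor of the idea, here is a minimal sketch of a randomized question built with plain JavaScript: pick values, derive the rest, and emit a KaTeX source string for display. The function and field names here are my own illustration, not QuestionBuilder's actual API.

```javascript
// Random integer in [min, max], inclusive
function randInt(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

// Build one randomized two-step equation with a known integer answer
function buildQuestion() {
  const a = randInt(2, 9);   // coefficient
  const x = randInt(-9, 9);  // the intended answer
  const b = randInt(-9, 9);  // constant term
  const c = a * x + b;       // right-hand side follows from the others
  const sign = b < 0 ? '-' : '+';
  return {
    a, b, c,
    answer: x,
    // KaTeX source for the prompt; a browser would render it with katex.render()
    katex: `${a}x ${sign} ${Math.abs(b)} = ${c}`,
  };
}
```

Because the right-hand side is derived from the randomized pieces, every generated question is guaranteed to have a clean integer answer, which is the property that makes these safe to hand to students unreviewed.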

Here's a video in which I convert a question from the June 2016 New York State Regents exam for Algebra 2 Common Core into a randomized question. Without all of my talking, this is a quick process.

I've put a number of questions on the site already to demonstrate what I've been using this to do. These range from simple algebra to physics questions. Some other folks I appreciate and respect have also added questions in their spare time.

For now, you'll need to create an account and log in to see these questions in action. Go to http://question-builder.evanweinberg.org, make an account, and check out the project as it exists at this point.

My hope is to use some time this summer to continue working on it to make it more useful for the fall. I'll also be making some other videos to show how to use the features I've added thus far. Feel free to contact me here, through Twitter (@emwdx), or by email (evan at evanweinberg.com) if you have questions or suggestions.

# Endings and Beginnings

Today, I bid farewell to my home away from home for the past six years.

When I first moved away from New York, I had shed all doubts that the teaching career was for me. I knew that learning and exploring were important elements of a meaningful existence on this planet, both for me and my students. I knew that few things were more satisfying than spending time with good people around plates of food. I knew that not knowing the local language or the location of the nearest supermarket was a cause for excitement, not fear. Purposely putting oneself into situations with unknown outcomes is not a reckless act. It is precisely these challenges that define and refine who we are so that we are better prepared for those events that we do not expect.

I knew these things already. And yet, I leave China today as a changed teacher. I met students from all around the world. I made connections not just with new people in the same building as me, but with teachers in many distributed time zones. People whom I respected and admired for their ideas humbled me by inviting me to join their conversations and explore ideas together. I found opportunities to present at conferences and get to know others that had also fallen in love with the international teaching lifestyle. I started this blog, and surprisingly, had people read it with thoughts of their own to share.

I also learned to accept the reality that life continues in twenty-four time zones. News from home made it seem more foreign and paradoxically more connected to my own experiences here. When opening my eyes and my various devices in the morning to see what had happened while I slept, I again never knew what to expect. I lost family members both suddenly and over stretches of time. Kids grew up. Our parents sold their houses and apartments. Friends put prestigious letters at the end of their names.

Our world changed as well. We added new countries to our passports and got lost in cities that refused to abide by a grid system. We fell in love with our dog and his aggressive sneezing at harmless bystanders. We tried to address the life and work balance through weeknight dinners and mini vacations. We repeatedly overcommitted to traveling during our summers off and time went too quickly. We became parents.

I write this not because anything I'm saying is especially new. The 'time marches on' canon is well established. That does not invalidate the reality that we're all experiencing life and its passage for the first time ourselves. This is the magic that we, as teachers, witness between the end of one year and the beginning of the next. We tweak our lessons from the previous year with the hope that they prompt more questions and productive confusion on the next iteration. Our students do experience some of the ideas we introduce for the first time in our classrooms, and it is unique that we get to design those experiences ourselves.

The best way to understand the rich range of emotions that our students experience while in our care is to live deeply and richly in our own lives. We need to learn to know and love others, explore and make mistakes, and be ready to move forward even when the future is uncertain. My time abroad thus far has given me numerous journeys through these human experiences. I would not give them up for the world, and luckily, I do not have to do so.

I'll write more about my next move in a future post.
Until then, I wish you all a summer full of good times with good people.

# Hacking The 100-Point Scale - Part 1

One highlight of teaching at an international school is the intersection of many different philosophies in one place. As you might expect, the most striking of these emerges when students compare their experiences. It's impressive how quickly experienced students who have moved around learn the system of the school they currently attend and adjust accordingly. What unites these particularly successful students is their awareness that they must understand the system they are in if they are to thrive there.

The same is true of teachers, as we share with each other just as much. We discuss different school systems and structures, traditions, and assessment methods. Identifying the similarities and differences is an engaging exercise in itself. In general, these conversations lead to a better understanding of why we do what we do in the classroom, and they often end with specific ideas for what we might do differently the next time we meet our students.

There is one important exception. No single conversation topic has caused more argument, debate, and unresolved conflict at the end of a staff meeting than the use of the 100-point scale.

The reason it's so prevalent is that it's easy to use: multiply the total points earned by 100, then divide by the total points possible. What could go wrong with a system that has been used for so long by so many?

There are a number of conversation threads that have been particularly troublesome in our international context, and I'd like to share one here.

### "A 75 isn't a bad score."

For a course that is difficult, this might be true. Depending on the Advanced Placement course, you can earn the top score of 5 on the exam by earning anywhere from roughly 65% to 100% of the possible points. The International Baccalaureate exams work the same way. I took a modern physics exam during university on which I earned a 75 right on the nose. The professor said that considering the content, that was excellent, and that I would probably end up with an A in the course.

The difference between these courses and typical school report cards is that the International Baccalaureate Organization (IBO), College Board, and college professor all did some sort of scaling to map their raw percentages to what shows up on the report card. They have specific criteria for setting up the scaling that goes from a raw score to the 1-5 or 1-7 scores for AP or IB grades respectively.
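This kind of scaling amounts to a lookup against a set of cut scores. Here is a minimal sketch; the boundaries below are invented for illustration — the IBO and College Board set the real ones per subject and per exam session.

```javascript
// Hypothetical cut scores mapping a raw percentage to a 1-7 grade.
// Ordered highest band first so we can take the first match.
const cutScores = [
  { min: 80, grade: 7 },
  { min: 70, grade: 6 },
  { min: 60, grade: 5 },
  { min: 50, grade: 4 },
  { min: 40, grade: 3 },
  { min: 25, grade: 2 },
  { min: 0,  grade: 1 },
];

function scaledGrade(rawPercent) {
  // First band whose minimum the raw score meets or exceeds
  return cutScores.find(band => rawPercent >= band.min).grade;
}
```

Under these made-up boundaries, the 75 from my modern physics exam lands in the 6 band — which is exactly the point: the raw number means nothing until someone decides where the bands sit.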

What are these criteria? The IBO, to its credit, has a document that describes what each score indicates about a student with remarkable specificity. Here is their description of a student who receives a score of 3 in mathematics:

> Demonstrates some knowledge and understanding of the subject; a basic sense of structure that is not sustained throughout the answers; a basic use of terminology appropriate to the subject; some ability to establish links between facts or ideas; some ability to comprehend data or to solve problems.

Compare this to their description of a score of 7:

> Demonstrates conceptual awareness, insight, and knowledge and understanding which are evident in the skills of critical thinking; a high level of ability to provide answers which are fully developed, structured in a logical and coherent manner and illustrated with appropriate examples; a precise use of terminology which is specific to the subject; familiarity with the literature of the subject; the ability to analyse and evaluate evidence and to synthesize knowledge and concepts; awareness of alternative points of view and subjective and ideological biases, and the ability to come to reasonable, albeit tentative, conclusions; consistent evidence of critical reflective thinking; a high level of proficiency in analysing and evaluating data or problem solving.

I believe the IBO uses statistical and norm-referenced methods to determine the cut scores between certain score bands. I'm also reasonably sure the College Board has a similar process. The point, however, is that these bands are determined so that a given score matches the published description of what that score means.

The college professor used his professional judgement (or a bell curve, I don't actually know) to make his scaling. This connects the raw score to the 'A' on my report card that indicated I knew what I was doing in physics.

The reason this causes trouble in discussions of grades in our school, and I imagine in other schools as well, is the much more ill-defined definition of what percentage grades mean on the report card. Put quite simply, does a 90% on the report card mean the student has mastered 90% of the material? Completed 90% of the assignments? Behaved appropriately 90% of the time? If there are different weights assigned to categories of assignments in the grade book, what does an average of 90% mean?
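That last question about weights is worth making concrete. In the sketch below, two hypothetical students both come out to "90%" even though the number describes very different learners; the categories and weights are invented for illustration.

```javascript
// Weighted average of category scores.
// categories: array of { score: percent earned, weight: fraction of the grade }
function weightedGrade(categories) {
  return categories.reduce((sum, c) => sum + c.score * c.weight, 0);
}

// Hypothetical student A: perfect on homework completion, weaker on tests
const studentA = weightedGrade([
  { score: 100, weight: 0.5 },  // homework completion
  { score: 80,  weight: 0.5 },  // test mastery
]);

// Hypothetical student B: the reverse profile
const studentB = weightedGrade([
  { score: 80,  weight: 0.5 },  // homework completion
  { score: 100, weight: 0.5 },  // test mastery
]);
// Both report cards read 90, but the two 90s say different things
// about what each student has actually mastered.
```

The arithmetic is identical; the meaning is not, which is why the conversation about what the number represents matters more than the number itself.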

This is obviously an important discussion for a school to have. Understanding the meaning of the individual percentage grades and what they indicate about student learning should be clear to administrators, teachers, parents, and most importantly, the students themselves. This is a tough conversation.

Who decided that 60% is the percentage of the knowledge I need to get credit? On a quiz on tool safety in the maker space, is 60% an appropriate cut score for someone to know enough? I say no. On the report card, I'd indicate that a student has a 50 as their grade until they demonstrate they can get 100% of the safety questions correct. Here, however, I've redefined the grade in the grade book as being different from the percentage of points earned. In other words, I've done the work of relating a performance measure to a grade indicator. These should not be assumed to be the same thing, but being explicit about this requires a conversation defining it to be the case, and communication of this definition to students and to teachers sharing sections of the same course.

Most of the time, I don't think there is time for this conversation to happen, which is the first reason I believe this issue exists. The second is the fact that a percentage calculation is mathematically simple and understood as a concept by students, teachers, and parents alike. Grades have been done this way for so long that a grade on the 100-point scale is generally assumed to be this percentage mastered or completed concept.

This is too important to be left to assumption. I'll share more about the dangers of this assumption in a future post.

# My Journey with Meteor as a Teacher-Coder

Many of you may know about my love for Meteor, the JavaScript framework that I've used for a number of projects in and around the classroom. I received an email this morning alerting me (and the many other users) that the free hosting service they have generously offered since inception would be shutting down over the next month.

To be honest, I'm cool with this decision. I think it's important to explain why and express my appreciation for having access to the tool for as long as I have.

I started writing programs to use in my classroom in earnest in 2012. Most of these tended to be pretty hacky - a simple group generator and a program to randomly generate practice questions on geometric transformations were among these early ones. The real power I saw for these was the ability to collect, store, and filter information that would be useful for teaching so that I could focus my time on using that information to decide on the next steps for my students. I took a Udacity course on programming self-driving cars and on web applications and loved what I learned. I learned to use some Python to take some of the programs I had written early on and run them within web pages. I built some nifty online activities inspired by the style of Dan Meyer and put them out for others across the world to try out. (Links for these Half-Full and Shapes tasks are below.) It was astounding how powerful I felt being able to take something I created and get it out into the internet wilderness for others to see.

It was also astounding how much time it took. I learned JavaScript to manage the interactivity in the web page, and then once that was working, I switched to Python on the server to manage the data coming from users. For those who have never done this sort of switching, it involves a lot of misplaced semicolons, tabs, and error messages. I accepted that this was the way the web worked - JavaScript in front, and Python (or PHP, Rails, Perl, etc.) on the back end. That extra work was what kept someone like me from starting a project on a whim and putting it together. That cost, in the midst of continuing to do my actual job of teaching and assessing students five days a week, was too great.

This was right around the summer of 2013 when a programmer named Dave Major introduced me to Meteor. I did not know the lingo of reactivity or isomorphic JavaScript - I just saw the demonstration video on YouTube and thought it was cool. It made the connection between the web page and the server seamless, eliminating the headaches I mentioned earlier. Dave planned to put together some videos and tutorials to help teachers code tools for the classroom using Meteor, and I was obviously on board. Unfortunately, things got in the way, and the video series didn't end up happening. Still, with Dave's help, I learned a bit about Meteor and was able to see how easy it was to go from an idea to a working application. I was also incredibly impressed that Meteor made it easy to get an application online with one line: `meteor deploy <application-name>`. No FTP, no hostname settings - one line of code in the terminal, and I could share with anybody.

With that server configuration friction eliminated, I had the time to focus on learning to build truly useful tools for myself. I created my standards-based grading system called WeinbergCloud that lets students sign up for reassessments, earn credit for the homework and practice they did outside of class, and see the different learning objectives for my course. I created a system for my colleagues to use to award house points for the great things that students did during the school day. I made a registration and timing system for our school's annual charity 5K run that reduced the paperwork and time required of our all-volunteer staff to manage the hundreds of registrants. I spoke at a Meteor DevShop about this a year and a half ago and have continued to learn more since then.

Most importantly to me, it gave me knowledge to share with a class of web programming students, who have learned to create their own apps. One student from last year's class learned about our library media specialist's plan to hold a read-a-thon, and asked if he could create an interactive website to show the progress of each class using, you guessed it, Meteor. Here's a screenshot of the site he created in his spare time:

And yes, all of these apps have been hosted on the free deploy server at *.meteor.com, and yes, I will have to do the work of moving these sites to a new place. The public stance from Meteor has been that the free site should not really be used for production apps, something I've clearly been doing for over two years now. I re-read that line on the documentation website back in January and asked myself what I would do if I no longer had access to that site. The result: I did what I am paid to do as a master learner, and learned to host a site on my personal server. That learning was not easy. The process definitely had me scratching my head. But it also meant that I had a better understanding of the value that the free site had given me over my time using it.

The reality is that Meteor has clearly and publicly shifted away from just being the framework with a free one-line deployment. The framework has so much going for it, and the ability to create interesting apps is not going away. The shift toward doing what one does best requires hard choices, and the free site clearly did not serve that purpose. It means that those of us who value the free deploy as a teaching tool can seek other options for making it as easy for others to get in the game as it was for us.

Meteor has helped me be better at my job, and I appreciate their work.

As promised, here are those learning task sites I mentioned before:

# Choosing the Next Question

If a student can solve $3x - 1 = 5$ for x, how convinced are we of that student's ability to solve two step equations?

If that same student can also solve $14 = 3x + 2$, how does our assessment of their ability change, if at all?

What about $-2 - 3x = 5$?

Ideally, our class activities push students toward ever increasing levels of generalization and robustness. If a student's method for solving a problem is so algorithmic that it fails when a slight change is made to the original problem, that method is clearly not robust enough. We need sufficiently different problems for assessing students so that we know their method works in all cases we might throw their way.

In solving $3x - 1 = 5$, for example, we might suggest that a student first add the constant to both sides, then divide both sides by the coefficient. If the student is not sure what 'constant' or 'coefficient' mean, he or she might conclude that the constant is the number to the right of the x and the coefficient is the number to the left. This student might do fine with $10 = 2x - 4$, but would run into trouble solving $-2 - 3x = 5$. Each additional question gives more information.

The three equations look different. The operation that is done as a first step to solving all three is the same, though the position of the constant is different in all three. Students that are able to solve all three are obviously proficient. What does it mean that a student can solve the first and last equations, but not the middle one? Or just the first two? If a student answers a given question correctly, what does that reveal about the student's skills related to that question?
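The three structural forms above can be produced from one template, which is the kind of thing that makes generating follow-up questions cheap. The sketch below is my own illustration of the idea, not code from any particular tool.

```javascript
// Render a signed term so negatives read naturally ("- 1" rather than "+ -1")
function term(n) {
  return n < 0 ? `- ${-n}` : `+ ${n}`;
}

// Three structural variants of a two-step equation.
// A method that survives all three positions of the terms is more robust
// than "the constant is the number to the right of the x."
function variants(a, b, x) {
  const c = a * x + b;      // right-hand side for the first two forms
  const d = b - a * x;      // right-hand side for the third form
  return [
    `${a}x ${term(b)} = ${c}`,  // ax + b = c   (like 3x - 1 = 5)
    `${c} = ${a}x ${term(b)}`,  // c = ax + b   (like 14 = 3x + 2)
    `${b} - ${a}x = ${d}`,      // b - ax = d   (like -2 - 3x = 5)
  ];
}
```

For example, `variants(3, -1, 2)` produces `3x - 1 = 5`, `5 = 3x - 1`, and `-1 - 3x = -7`: same surface ingredients, three different places for a brittle procedure to break.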

It's the norm to consider these issues in choosing questions for an assessment. The more interesting question to me these days is this: given what a student has done on one question, what should the next question be? Adaptive learning software tries to answer this using large data sets that map student abilities to right and wrong answers. I'm not sure that it succeeds yet. I still think the human mind has the advantage in this task.

Often this next step involves scanning a textbook or thinking up a new question on the spot. We often know the next question we want when we see it. The key then is having those questions readily available or easy to generate so we can get them in front of students.

In a unit on Meteor applications for my web design class, I wrote a series of applications to help my students see the basic structure of a few Meteor applications so that they could eventually design their own. The students had seen applications that tallied votes from a survey, compiled links, and a simple blog. This one was about being competitive, and the students were understandably into it.

This tutorial was designed to use MeteorPad due to some challenges associated with installing Meteor on student computers. The first challenge involved permissions, since students are not given terminal access by default. The second involved connectivity issues to the Meteor servers from China, which, unfortunately, I never fully resolved. MeteorPad offered an awesome balance between ease of making an application and minimizing the need for the terminal. For anyone looking to get started with Meteor, I recommend trying out MeteorPad, as it doesn't require any knowledge of working in the terminal or any software installation.

I was able to take data on the students clicking away as I created new pieces of gold and put it out into the field. I've written previously about my enthusiasm for collecting data through clicks, so this was another fun source of data.

# Reaction Time & Web Data Collection

If you put out an open call through email to complete a task for nothing in return, it might make sense not to expect much. I tried to make it as simple as possible to gather some reaction time data for my IB Mathematics SL class to analyze. My goal for each class has been to get an interesting data set each time and see what students can make out of it. After several hours of having this open, I had a really nice set of data to give the class.

I know my social networks are connections between some phenomenal people. That said, I didn't know that the interest in trying this out would be so substantial, and in several cases, get people to try multiple times to get their own best time. In less than a week, I've collected more than 1,000 responses to my request to click a button:

I coded this pretty quickly and left out the error correction I would have included given the number of people that did this. I've been told that between phones, tablets, desktops, laptops, and even SmartBoards, there have been many different use cases for times ranging from hundredths of a second to more than five minutes - clearly an indication that this badly needs to be tweaked and fixed. That said, I am eager to share the results with the community that helped me out, along with the rest of the world. A histogram:

There's nothing surprising here to report on a first look. It is clear that my lazy use of jQuery to handle the click event made for a prominent second peak at around 0.75 seconds for those tapping on a screen rather than clicking. Some anecdotal reporting from Facebook confirmed this might have been the explanation. The rest of the data outside the reasonable range is nothing more than the result of my poorly coded user experience. Sorry, folks.
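The error correction I skipped could have been as simple as discarding times outside a plausible human range before summarizing. A sketch of what that might look like; the bounds here are my own guesses, not values from the actual collector.

```javascript
// Plausibility bounds for a click reaction time, in seconds.
// Assumed values: faster than ~0.1 s is not a human reaction,
// slower than 5 s means someone walked away or the page glitched.
const MIN_SECONDS = 0.1;
const MAX_SECONDS = 5;

// Keep only times inside the plausible range
function cleanTimes(times) {
  return times.filter(t => t >= MIN_SECONDS && t <= MAX_SECONDS);
}

// Mean of the surviving times
function mean(times) {
  return times.reduce((a, b) => a + b, 0) / times.length;
}
```

Filtering like this would have removed both the hundredths-of-a-second artifacts and the five-minute outliers before the histogram was drawn, without touching the legitimate second peak from touch devices.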

This isn't the first time I've done a data collection task involving clicking a button - far from it. It's amazing what can be collected with a simple task and little entry cost, even when it's a mathematical one. One of the things I wonder about these days is which tools are needed to make it easy for anyone (including students) to build a collection system like this and investigate something of personal importance. This has become much easier with tools such as Google Docs, but it isn't easy to get a clean interface that strips away the surrounding material to make the content the focus. For all I know, there may already be a solution out there. I'd love to hear about it if you know.

# Students Coding Tilman's Art with HTML5

I'm a big fan of the Geometry Daily Tumblr. Tilman's minimalist geometry images are beautiful in their simplicity. I've always wondered about reproducing art in code as a vehicle for learning to code, and have had it on my list to do this myself using Tilman's work.

In my web programming class, where we are currently playing around with the HTML5 canvas and its drawing capabilities, this concept was a perfect opportunity to let students play around with the art form. They quickly observed the beauty of what can be created using these tools, and the power of doing so by studying someone who is great at it.
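For readers who haven't touched the canvas API, a minimal sketch in the spirit of a Geometry Daily piece looks something like this. The vertex math is pure JavaScript; only `draw()` needs a browser's 2D context. This is my own toy example, not one of the students' sketches.

```javascript
// Vertices of a regular n-gon centered at (cx, cy) with radius r,
// starting from the topmost point
function polygonVertices(n, cx, cy, r) {
  const pts = [];
  for (let i = 0; i < n; i++) {
    const angle = (2 * Math.PI * i) / n - Math.PI / 2;
    pts.push({ x: cx + r * Math.cos(angle), y: cy + r * Math.sin(angle) });
  }
  return pts;
}

// Stroke a hexagon on a canvas; ctx is a CanvasRenderingContext2D,
// e.g. document.querySelector('canvas').getContext('2d')
function draw(ctx) {
  const pts = polygonVertices(6, 150, 150, 100);
  ctx.beginPath();
  ctx.moveTo(pts[0].x, pts[0].y);
  pts.slice(1).forEach(p => ctx.lineTo(p.x, p.y));
  ctx.closePath();
  ctx.strokeStyle = '#444';
  ctx.stroke();
}
```

Layering a few calls like this with different radii, rotations, and transparency is enough to start imitating the minimalist style, which is exactly where the students took it.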

Here are some of the results of their sketches. Some did precise imitations, others did their own interpretations. Click on the image to see their code posted on JSFiddle.

# Formula Sheet - A Toolbox or Takeout Menu?

During the IB Exams, students get a set of equations and constants to use. Part of the motivation behind them is to reduce the amount of memorization required. There's no sense in students memorizing Planck's constant or the Law of Cosines in a context that emphasizes application of these ideas.

That said, I've heard variations on the following from different students just in the past three days:

• I thought I was right, then I looked at the formula sheet, and realized I was wrong. (She was right the first time.)
• I didn't study it because I knew it was on the formula sheet.
• I don't know what formula to use.

If you read my blog, you know that I don't test formula memorization, for all sorts of reasons. You get it. I get it. It has a place, but that place isn't where I want to spend my time.

You might also know that I've experimented with different versions of resources available to students during a test. I've done open note-card, open A4 sheet, open A5 sheet, open computer/closed network, open computer/open network, open notebook, and open people (i.e. a group test) formats.

I believe that the act of students creating their own formula sheets is more effective than handing one to them. The process of seeing how a formula is applied in different contexts and deciding what needs to be remembered is valuable on its own. Identifying that one problem is similar to another for reasons of physics shows understanding. I want to make opportunities for that to happen. Reducing the size of the resource requires students to prioritize. These are all high level skills.

The difficulty is that students see formulas as a direct pathway from problem to solution. Most problems worth solving don't fit that level of simplicity. Formula sheets give you the factual information and rely on the user to know how to connect that information to a problem. The student thinks the answer is staring them in the face, and that they just have to pick the right one. As teachers, we want students to identify the information they need, then look at the reference to get it.

This is part of the reason I like standards based grading, as it justifies assessing students through conversation. A student asks me for a specific piece of information. If it's how to calculate something, I'll tell them if the related learning standard is about applying a concept, not calculating a quantity. If their request directly asks for the answer to the question, I don't tell them. If they ask for a hint, I give them enough to get them moving, and adjust their proficiency level for the related standard according to the amount of help I give them.

In the long run, however, students need to know how to use the resources available to them. This is one of those big picture skills everyone talks about. Students need to know how to use Google to effectively find what they are looking for. They need to know that typing the text of a question into Yahoo Answers is not going to get them the answer they are looking for. I do know that if a student directly says "I can't remember a formula for [ ]", and I give them an equation sheet, they can usually find it. If they use the formula sheet as step one, they are not likely to complete the problem on their own. Having the sheet there in front of them makes it far too easy to start a problem that way. Would having students tally the number of times they looked at their sheet be enough of a feedback mechanism to keep this in check?

I don't know what the answer is right now.

How do you help students treat a formula sheet more like a tool box, and less like a restaurant take-out menu?

# Before a break: Seniors Think 'School'

The seniors completed their final presentations this week. This was a series of TED style talks on subjects ranging from 3D printing and product placement to the connections between meat and cancer and the lack of women in foreign policy. I've been really pleased with how this group has developed their skills in communicating ideas, both through writing last semester, and in visual communication more recently.

We still have a couple of months left in the year, so when the seniors and I got back together for one class before spring break, they wanted to know what we were going to do with the time left. This point of the year for seniors, more so than other times, has a consistent theme of time ticking down in all sorts of ways. They keep an accurate count of the number of days left in school on a small chalkboard in the lounge. They keep track of college acceptances on a big map there as well. Keeping them in the present is much more easily said than done, so I tend to push seniors to think through big picture stuff at this stage.

So when we sat down in class this past week, I had rearranged the tables into a big family-style U-shape to make clear that something would be 'different' from that point forward. I talked to them about my history in education. I described different schools I went to, and how they nudged my personal path one way or another. I then showed them two talks, one from Ken Robinson about the learning revolution, and the other from Shawn Cornally describing the Iowa BIG school.

My questions after both of these were simple:

In what ways are you who you are because of your school experience?

In what ways are you who you are in spite of your school experience?

We had a brief conversation about this, and students had really insightful and revealing comments about it. I didn't want to give a big assignment or written reflection for the break though. This is family vacation time, and I didn't feel the need - plus my plans for the next steps are still in the formative stages. I did want to get seniors at least thinking big picture about the role of school as part of their identity. One senior said on the way out: "pretty deep for the day before spring break, Weinberg." For someone who thinks about education as much as I (and most teachers I know) do, this type of question is the norm.