Taking Time Learning Math: A Student's Perspective

Yesterday was our school's student-led conference day. I've written previously about how proud these days make me as an educator. When students genuinely reflect on their learning and share the ups and downs of their school days, it's hard not to see the value of the exercise.

During one conference, a student shared a fascinating perspective on her learning in math. This is not the level of specificity we usually get from our students, so I am eager to share her thinking. Here's the student's comment during the conference:

“It isn’t that I don’t like math. Learning takes time in math, and I don’t always get the time it takes to really understand it.”

I asked her for further clarification, and this was her response:

...Math is such an interesting subject that can be “explored” in so many different ways, however, in school here I don’t really get to learn it to a point where I say yeah this is what I know, I fully understand it. We move on from topic to topic so quickly that the process of me creating links is interrupted and I practice only for the test in order to get high grades.

It's certainly striking to get this sort of feedback from a student who is doing all the things we ask her to do. The activities this student is doing in class are not day-after-day repetitions of "I do, we do, you do" - we do a range of class activities that involve exploring, questioning, and interacting with other students.

This student's comment is about the limitations of time. She isn't saying that we aren't doing enough of X, Y, or Z - quite the contrary. She's just asking for time to let it all sink in. She doesn't answer the question of what that time looks like, but that's not her job; it's ours.

I know I always feel compelled to nudge a class forward in some way. This doesn't mean I move through material more quickly, but I do push for increased depth, intuition, or quality conversation about the content in every class period. Her comment makes me realize that something still stands to be improved. Great food for thought for the weekend.

My Journey with Meteor as a Teacher-Coder

Many of you may know about my love for Meteor, the Javascript framework that I've used for a number of projects in and around the classroom. I received an email this morning alerting me (and the many other users) that the free hosting service they have generously offered since inception would be shutting down over the next month.

To be honest, I'm cool with this decision. I think it's important to explain why and express my appreciation for having access to the tool for as long as I have.

I started writing programs to use in my classroom in earnest in 2012. Most of these tended to be pretty hacky - a simple group generator and a program to randomly generate practice questions on geometric transformations were among these early ones. The real power I saw for these was the ability to collect, store, and filter information that would be useful for teaching so that I could focus my time on using that information to decide on the next steps for my students. I took a Udacity course on programming self-driving cars and on web applications and loved what I learned. I learned to use some Python to take some of the programs I had written early on and run them within web pages. I built some nifty online activities inspired by the style of Dan Meyer and put them out for others across the world to try out. (Links for these Half-Full and Shapes tasks are below.) It was astounding how powerful I felt being able to take something I created and get it out into the internet wilderness for others to see.

It was also astounding how much time it took. I learned Javascript to manage the interactivity in the web page, and then once that was working, I switched to Python on the server to manage the data coming from users. For those that have never done this sort of switching, it involves a lot of misplaced semicolons, tabs, and error messages. I accepted that this was the way the web worked - Javascript in front, and Python (or PHP, Rails, Perl, etc.) on the back end. That extra work was what kept someone like me from starting a project on a whim and putting it together. That cost, in the midst of continuing to do my actual job of teaching and assessing students five days a week, was too great.

It was right around the summer of 2013 that a programmer named Dave Major introduced me to Meteor. I did not know the lingo of reactivity or isomorphic Javascript - I just saw the demonstration video on YouTube and thought it was cool. It made the connection between the web page and the server seamless, eliminating the headaches I mentioned earlier. Dave planned to put together some videos and tutorials to help teachers code tools for the classroom using Meteor, and I was obviously on board. Unfortunately, things got in the way, and the video series didn't end up happening. Still, with Dave's help, I learned a bit about Meteor and was able to see how easy it was to go from an idea to a working application. I was also incredibly impressed that Meteor made it easy to get an application online with one line: meteor deploy (application-name here). No FTP, no hostname settings - one line of code in the terminal, and I could share with anybody.

With that server configuration friction eliminated, I had the time to focus on learning to build truly useful tools for myself. I created my standards-based grading system, called WeinbergCloud, which lets students sign up for reassessments, earn credit for the homework and practice they did outside of class, and see the different learning objectives for my course. I created a system for my colleagues to use to award house points for the great things that students did during the school day. I made a registration and timing system for our school's annual charity 5K run that reduced the paperwork and time required of our all-volunteer staff to manage the hundreds of registrants. I spoke at a Meteor DevShop about this a year and a half ago and have continued to learn more since then.

Most importantly to me, it gave me knowledge to share with a class of web programming students, who have learned to create their own apps. One student from last year's class learned about our library media specialist's plan to hold a read-a-thon, and asked if he could create an interactive website to show the progress of each class using, you guessed it, Meteor. Here's a screenshot of the site he created in his spare time:
[Screenshot: the read-a-thon progress site the student built]

And yes, all of these apps have been hosted on the free deploy server at *.meteor.com, and yes, I will have to do the work of moving these sites to a new place. The public stance from Meteor has been that the free site should not really be used for production apps, something I've clearly been doing for over two years now. I re-read that line on the documentation website back in January and asked myself what I would do if I no longer had access to that site. The result: I did what I am paid to do as a master learner, and learned to host a site on my personal server. That learning was not easy. The process definitely had me scratching my head. But it also meant that I had a better understanding of the value that the free site had given me over my time using it.

The reality is that Meteor has clearly and publicly shifted away from being just the framework with a free one-line deployment. The framework has so much going for it, and the ability to create interesting apps is not going away. The shift toward doing what one does best requires hard choices, and the free site clearly did not serve that purpose. It means that those of us who value the free deploy as a teaching tool can seek other options for making it as easy for others to get in the game as it was for us.

Meteor has helped me be better at my job, and I appreciate their work.


As promised, here are those learning task sites I mentioned before:

Choosing the Next Question

If a student can solve 3x - 1 = 5 for x, how convinced are we of that student's ability to solve two-step equations?

If that same student can also solve 14 = 3x + 2, how does our assessment of their ability change, if at all?

What about -2 - 3x = 5?

Ideally, our class activities push students toward ever-increasing levels of generalization and robustness. If a student's method for solving a problem is so algorithmic that it fails when a slight change is made to the original problem, that method is clearly not robust enough. We need sufficiently different problems for assessing students so that we know their method works in all the cases we might throw their way.

In solving 3x - 1 = 5, for example, we might suggest that a student first add the constant to both sides, and then divide both sides by the coefficient. If the student is not sure what 'constant' or 'coefficient' mean, he or she might conclude that the constant is the number to the right of the x, and the coefficient is the number to the left. This student might do fine with 10 = 2x - 4, but would run into trouble solving -2 - 3x = 5, where the constant now sits to the left of the x term. Each additional question gives more information.

The three equations look different. The first step in solving all three is the same operation, though the position of the constant is different in each. Students who are able to solve all three are obviously proficient. What does it mean that a student can solve the first and last equations, but not the middle one? Or just the first two? If a student answers a given question correctly, what does that reveal about the student's skills related to that question?

It's the norm to consider these issues in choosing questions for an assessment. The more interesting question to me these days is this: once we've seen what a student does on one question, what should the next question be? Adaptive learning software tries to do this based on having a large data set that maps student abilities to right and wrong answers. I'm not sure that it succeeds yet. I still think the human mind has the advantage in this task.

Often this next step involves scanning a textbook or thinking up a new question on the spot. We often know the next question we want when we see it. The key then is having those questions readily available or easy to generate so we can get them in front of students.
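
Since I tend to reach for a quick program when I want a pile of similar questions, here's a minimal sketch in Python of what generating those variations might look like. This isn't a tool from my classroom - the function name and the number ranges are invented for illustration - but it captures the idea of moving the constant around so that a purely positional solving rule won't survive every version:

import random

def two_step_equation():
    # Pick a coefficient, a constant, and an intended integer solution,
    # then compute the right-hand side so the equation works out.
    a = random.choice([-5, -4, -3, -2, 2, 3, 4, 5])
    b = random.choice([n for n in range(-9, 10) if n != 0])
    x = random.randint(-10, 10)
    c = a * x + b
    # Vary where the constant sits so a rule like "the constant is the
    # number to the right of the x" breaks on some versions.
    layouts = [
        f"{a}x {'+' if b > 0 else '-'} {abs(b)} = {c}",
        f"{c} = {a}x {'+' if b > 0 else '-'} {abs(b)}",
        f"{b} {'+' if a > 0 else '-'} {abs(a)}x = {c}",
    ]
    return random.choice(layouts), x

for _ in range(5):
    equation, solution = two_step_equation()
    print(f"{equation}    (x = {solution})")

Each run spits out a handful of equations with the intended solution attached, which makes it quick to check student answers on the spot.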

Standards Based Grading & Streamlining Assessments

I give quizzes at the beginning of most of my classes. These quizzes are usually on a single standard for the course, and are predictably on whatever we worked on two classes before. I also give unit exams as ways to assess student mastery of the standards all together. Giving grades after exams usually consists of me looking at a single student's exam, going standard by standard through the entire paper, and then adjusting their standards grades accordingly. There's nothing groundbreaking happening here.

The two downsides to this process are that it is (a) tedious and (b) subject to my discretion at a given moment. I'm not confident that I'm consistent between students. While I do go back and check myself when I'm not sure, I decided to try a better way. If you're a frequent reader of my blog, you know that either a spreadsheet or programming is involved. This time, it's the former.

[Screenshot: the standards map sheet]

One sheet contains what I'm calling a standards map, which you can see above. This relates a given question to the different standards on an exam: question 1, for example, addresses only standard 1, while question 4 spans both standards 2 and 3.

The other sheet contains test results, and looks a lot like what I used to do when I was grading on percentages, with one key difference. You can see this below:

[Screenshot: the test results sheet]

Rather than writing in the number of points for each question, I simply rate a student's performance on that question as a 1, 2, or 3. The columns S1 through S5 then tally up those performance levels according to the standards associated with each question, and scale each total to a value from zero to one.
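
For anyone curious about what those columns are actually doing, here is a rough sketch of the same tally in Python. The spreadsheet itself just uses ordinary formulas; the code below only makes the bookkeeping explicit, and the standards map and ratings shown are made-up sample data:

# Which standards each question assesses, in the spirit of the
# standards map above (question 1 is on standard 1 only, question 4
# spans standards 2 and 3, and so on).
standards_map = {1: [1], 2: [1, 2], 3: [2], 4: [2, 3], 5: [4], 6: [5]}

# One student's performance ratings: a 1, 2, or 3 per question.
ratings = {1: 3, 2: 2, 3: 3, 4: 1, 5: 2, 6: 3}

def standard_scores(standards_map, ratings):
    # Gather every rating attached to each standard, then scale the
    # total against the maximum possible (a 3 on each related question).
    collected = {}
    for question, standards in standards_map.items():
        for standard in standards:
            collected.setdefault(standard, []).append(ratings[question])
    return {s: round(sum(r) / (3 * len(r)), 2) for s, r in collected.items()}

print(standard_scores(standards_map, ratings))
# one value from zero to one per standard, e.g. {1: 0.83, 2: 0.67, ...}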


This information was really useful when going through the last exam with my ninth graders. The spreadsheet handles the association between questions and standards through the standards map, so I can spend my time going through each exam and deciding how well a student completed a given question rather than remembering which standard I'm considering. I also found it much easier to make decisions about a student's standard level. Student 2 was an 8 on standard 1 before the exam, so it was easy to justify raising her to a 10 after the exam. Student 12 was a 7 on standard 4, and I left him right where he was.


I realize that there's a subtlety here that needs to be mentioned - a question based on two or three standards might not communicate a student's level on each of them with a single 1, 2, or 3. If a question is on solving systems graphically, a student might graph the lines correctly but completely forget to identify the intersection. This situation is easy to address, though - questions like this can be broken down into multiple entries on the standards map. I could give a student a 3 on the entry for this question on the standard for graphing lines, and a 1 for the entry related to solving systems. Not a big deal.

I spend a lot of time thinking about what information I need in order to justify raising a student's mastery level. Having the sort of information that is generated in this spreadsheet makes it much clearer what my next steps might be.


You can check out the live spreadsheet here:

Standards Assessment - Unit 5 Exam

Boat Race, Revisited

A couple of years ago, I was impressed with Dan Meyer and Dave Major's creation of Boat Race, an activity that involved navigating around buoys with some knowledge of bearings. I hoped to use their creation with my ninth graders two years ago, but Boat Race in its original form was zapped from the interwebs. At the time, I did an analog version, which you can find here in PDF form:

07 - CW - Boat Race


This year, when looking at my materials in the revamped Math 9 course, I felt compelled to take a crack at my own digitization of this activity.  Here's the result:

[Screenshot: my digital version of Boat Race]

You can also visit the live site here and try it out yourself.

Boat Race

The moving circle moves painfully slowly by design. Students will (hopefully) be compelled to do a good job of calculating distances and angles accurately. I plan to give them the analog version on paper for planning purposes. Shortest time by the end of the class wins fame and glory.
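
For reference, the calculation I'm hoping students will do on paper is the usual distance-and-bearing computation between two points. Here's a quick sketch of that arithmetic in Python, assuming coordinates where y increases to the north; the positions below are made up:

from math import atan2, degrees, hypot

def distance_and_bearing(start, target):
    # Bearing is measured clockwise from north, so the x-difference goes
    # first in atan2 (this assumes y grows toward the north).
    dx = target[0] - start[0]
    dy = target[1] - start[1]
    distance = hypot(dx, dy)
    bearing = degrees(atan2(dx, dy)) % 360
    return distance, bearing

# Made-up positions: the boat at the origin, a buoy up and to the right.
print(distance_and_bearing((0, 0), (30, 40)))
# roughly (50.0, 36.9): travel 50 units on a bearing of about 037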

An Easy Transformation: Right Triangle Trigonometry

From Haese and Harris MYP 9:

[Screenshot: the original problem from the textbook]

I was looking for problems to give my students as applications of the right angle trigonometry from our previous class. The problem is essentially equivalent to the basic questions requiring them to find an unknown side or angle - the work is all done for them. One of my standards is all about parsing a word problem for the information needed to answer it, and this question does not require students to do any parsing.

I removed all of the measurements, and this problem became remarkably more demanding:

[Screenshot: the same problem with the measurements removed]

This will certainly prompt more conversation than it would have in its original form.

It's embarrassing how easy it was to make this change - I anticipate a nice payoff in class.

#Teachers Coding - Bingo Cards

When I attended a Calculus AB workshop back in 2003, one of the nice takeaways was a huge binder of materials that could be used immediately with students. I ended up scanning much of that material and taking the digital versions with me when I moved overseas.

One of these activities was called derivative bingo. It was a set of two sheets: one with a list of expressions, and the other a 5 x 5 bingo card filled with the derivatives of those expressions. It was perfect for developing proficiency after introducing derivative rules such as the product and quotient rules.

It also wasn't as fun an activity as it could have been, for two reasons. The first was that I only had the one bingo card provided as part of the activity. Since everyone had the same card, everyone would get five in a row after doing the same set of problems. Yes, I could have made different cards at some point in the past twelve years to resolve this problem, but I never thought about it with enough advance time to do so. The other reason was that the order of the list of derivatives was carefully designed so that you only obtained five in a row after doing most of the problems provided. That's good for getting students to do more practice, but it's an attribute that chips away at the entertainment value even more.

As you might expect, I wrote a computer tool to manage this. You can visit this site and see a sample card. Reload the page, and you'll generate a new one.
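
The idea behind the tool is simple enough to sketch. The real site renders each card in the browser with KaTeX, but the shuffling step looks roughly like the Python below - the answer list here is just a placeholder for the actual derivatives:

import random

# Placeholder answers standing in for the 25 derivatives on a card;
# the real tool pulls these from the list of worked problems.
answers = [f"answer {n}" for n in range(1, 26)]

def new_card(answers):
    # Shuffle a copy of the answer list and lay it out as a 5 x 5 grid,
    # so every card has the same answers in a different arrangement.
    shuffled = random.sample(answers, len(answers))
    return [shuffled[row * 5:(row + 1) * 5] for row in range(5)]

for row in new_card(answers):
    print(row)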

This turned into a nice little competition between groups of students, and I kept a tally of how many total rows each group had matched as the activity went on. The different cards led to some great conversations between students about their results:

[Screenshot: student results from the bingo activity]

I use the KaTeX rendering library to make the mathematical expressions look good. If you would like to edit the files for use with your own class, you can go to the GitHub repository here and download a zip file with all of the files. You'll find instructions there for changing the code to fit your needs. If those instructions don't make sense to you, let me know.

If you would just like a set of cards for the derivative practice activity that is ready for use with a class, that PDF is here: derivative-bingo-class-files