Monthly Archives: March 2015

Before a Break: Seniors Think 'School'

The seniors completed their final presentations this week. This was a series of TED style talks on subjects ranging from 3D printing and product placement to the connections between meat and cancer and the lack of women in foreign policy. I've been really pleased with how this group has developed their skills in communicating ideas, both through writing last semester, and in visual communication more recently.

We still have a couple of months left in the year, so when the seniors and I got back together for one class before spring break, they wanted to know what we were going to do with the time left. This point of the year for seniors, more so than other times, has a consistent theme of time ticking down in all sorts of ways. They keep an accurate count of the number of days left in school on a small chalkboard in the lounge. They keep track of college acceptances on a big map there as well. Keeping them in the present is much more easily said than done, so I tend to push seniors to think through big picture stuff at this stage.

So when we sat down in class this past week, I had rearranged the tables into a big family-style U-shape to make clear that something would be 'different' from that point forward. I talked to them about my history in education. I described the different schools I went to and how they nudged my personal path one way or another. I then showed them two talks, one from Ken Robinson about the learning revolution, and the other from Shawn Cornally describing the Iowa BIG school.

My questions after both of these were simple:

  

In what ways are you who you are because of your school experience?

In what ways are you who you are in spite of your school experience? 

We had a brief conversation about this, and students had really insightful and revealing comments about it. I didn't want to give a big assignment or written reflection for the break though. This is family vacation time, and I didn't feel the need - plus my plans for the next steps are still in the formative stages. I did want to get seniors at least thinking big picture about the role of school as part of their identity. One senior said on the way out: "pretty deep for the day before spring break, Weinberg." For someone who thinks about education as much as I (and most teachers I know) do, this type of question is the norm.

Before a Break: CCSS Math, Bongard Problems, and Peer Feedback

I spent the day in a room full of my colleagues as part of our school's official transition to using the Common Core standards for mathematics. While I've kept up to date with the development of CCSS and the roll-out from here in China, it was helpful to have some in-person explanation of the details from some experts who have been part of it in the US. Our guests were Dr. Patrick Callahan from the Illustrative Mathematics group and Jessica Balli, who is currently teaching and consulting in Northern California.

The presentation covered three key areas. The first focused on modeling and Fermi problems. I've written previously about my experiences with the modeling cycle as part of the mathematical practice standards, so this element of the presentation was mainly review. Needless to say, however, SMP4 (Model with mathematics) is my favorite, so I love anything that generates conversation about it.

That said, one element of Jessica's modeling practice took me by surprise, particularly given my enthusiasm for Dan Meyer's three-act framework. She writes about the details on her blog here, so go there for the long form. When she begins her school year with modeling activities, she leaves out Act 3. Why?

Here's Jessica talking about the end of the modeling task:

Before excusing them for the day, I had a student raise their hand and ask, "So, what's the answer?" With all eyes on me, a quick shrug of my shoulders communicated to them that that was not my priority, and I was sticking to it (and, oh, by the way, I have no idea what time it will be fully charged). Some students left irritated, but overall, I think the students understood that this was not going to be a typical math class.
Mission accomplished.

Her whole goal is to break students of the 'answer-getting' mentality and focus on process. This is something we all try to do, but perhaps we pay it more lip service than we realize by filling that need with Act 3. Something to consider for the future.

The other two elements, also mostly based in Jessica's teaching, went even further in developing other student skills.

I had never heard of Bongard problems before Jessica introduced us to them. Each problem presents a well-defined set of six examples and six non-examples, and the task is to write a rule that distinguishes the two.

Here's an example: Bongard Problem, #1:

You can find the rest of Bongard's original problems here.

In Jessica's class, students share their written rules with classmates, get feedback, and then revise their rules based only on that feedback. Before today's session, if I had run this activity, I would eventually have brought the class together to write a sample rule as a group. I'm probably doing my students a disservice by taking that shortcut, however, because Jessica doesn't do this. She relies on students to do the work of piecing together a solid rule that works in the end. She has a nicely scaffolded template to help students with this process, and spends a solid amount of time helping students understand what good feedback looks like. Though she helps them with vocabulary from time to time, she leaves it to the students to help each other.

Dr. Callahan also pointed out the importance of explicitly requiring students to write down their rules, not just talk about them. In his words, this forces students to focus on clarity to communicate that understanding.

You can check out Jessica's post about how she uses these problems here:
Building Definitions, Bongard Style

The final piece took the idea of peer feedback to the next level with another template for helping students workshop their explanations of process. This should not be a series of sentences about procedure, but instead mathematical reasoning. The full post deserves a read to find out the details, because it sounds engaging and effective:

"Where Do I Put P?" An Introduction to Peer Feedback

I want to focus on one highlight of the post that notes the student-centered nature of this process:

I returned the papers to their original authors to read through the feedback and revise their arguments. Because I only had one paper per pair receive feedback, I had students work as pairs to brainstorm the best way to revise the original argument. Then, as individuals, students filled in the last part of the template on their own paper. Even if their argument did not receive any feedback, I thought that students had seen enough examples that would help them revise what they had originally written.

I've written about this before, but I have trouble staying out of student conversations. Making the feedback written might be an effective way for me to supply mathematical vocabulary when needed (as Jessica said she does periodically) while otherwise keeping the focus on students going through the revision process themselves.

Overall, it was a great set of activities to get us thinking about SMP3 (Construct viable arguments and critique the reasoning of others) and attending to precision of ideas through use of mathematics. I'm glad to have a few days of rest ahead to let this all sink in before planning the last couple of months of the school year.

Theory of Knowledge and the Thinking Machine

tl;dr

I created an interactive lesson called Thinking Machine for use with a talk I gave to the IB theory of knowledge class, which is currently on a unit studying mathematics.

[Screenshot of the Thinking Machine activity]
The lesson made good use of the Meteor Blaze library as well as the Desmos Graphing Calculator API. Big thanks to Eli and Jason from Desmos for helping me put it together.


I was asked by a colleague if I was interested in speaking to the IB theory of knowledge class during the mathematics unit. I barely let him finish his request before I started talking about what I was interested in sharing with them.

If you read this blog, you know that I'm fascinated by the intersection of computers and mathematical thinking. If you don't, now you do. More specifically, I spend a great deal of time contemplating the connections between mathematics and programming. I believe that computers can serve as a stepping stone between students' understanding of arithmetic and the abstract idea of a variable.

The fact that computers do precisely what their programmers make them do is a good thing. We can forget this easily, however, in a world where computers do fairly sophisticated things behind the scenes. The fact that Siri can understand what we say, and then do what we ask, is impressive. The extent to which the computer knows what it is doing is up for debate. It's pretty hard to argue, though, that computers aren't carrying out reasoning processes similar to the ones humans use in going about their day.

Here's what I did with the class:

I began by talking about myself as a mathematical thinker. Contrary to what many of them might think, I don't spend my time going around the world looking for equations to solve. I don't seek out calculations for fun. In fact, I actively dislike making calculations. What I really enjoy is finding interesting problems to solve. I get a great deal of satisfaction and a greater understanding of the world through doing so.

What does this process involve? I make observations of the world. I look for situations, ideas, and images that interest me. I ask questions about what I see, and then use my understanding of the world, including knowledge in the realm of mathematics, to construct possible answers. As a mathematical and scientific thinker, this process of gathering evidence, making predictions using a model, testing them, and then adjusting those models is in my blood.

I then set the students loose to do an activity I created called Thinking Machine. I styled it after the amazing lessons that the Desmos team puts together, and used their tools to create it. More on that later. Check it out, and come back when you're done.

The activity begins by asking students to predict a mathematical rule created by the computer. The rule is never complicated - always a linear function. When the student enters the correct rule, the computer says to move on.
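The rule check works by comparing outputs at a handful of test values, which is also how I used the Desmos API later on (see the technical details below). Here's a minimal sketch of the idea in plain JavaScript; the function and variable names are mine, not the actual activity code:

```javascript
// Sketch only: compare a student's guessed rule against the hidden rule by
// evaluating both at several test inputs. In the real activity the student's
// typed rule is parsed via the Desmos API; here both rules are plain functions.
function rulesMatch(studentRule, hiddenRule, testInputs) {
  var tolerance = 1e-6;
  return testInputs.every(function (x) {
    return Math.abs(studentRule(x) - hiddenRule(x)) < tolerance;
  });
}

// Example: hidden rule y = 3x - 2, student guesses y = 3x - 2
var hidden = function (x) { return 3 * x - 2; };
var guess  = function (x) { return 3 * x - 2; };
console.log(rulesMatch(guess, hidden, [-2, 0, 1, 5, 10])); // true
```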

The next step is to turn the tables on the student - the computer will guess a rule (limited to linear, quadratic, cubic, or exponential functions) based on three sets of inputs and outputs that the student provides. Beyond those three inputs, the student should only answer 'yes' or 'no' to the guesses that the computer provides.

The computer learns by adjusting its model based on the responses. Once the certainty is above a certain level, the computer gives its guess of the rule, and shows the process it went through of using the student's feedback to make its decision. When I did this with the class, the computer correctly determined the rule for more than half of the students. I've since tweaked it to make it more reliable.

After this, we had a discussion about whether or not the computer was thinking. We talked about what it means for a computer to have knowledge of a problem at hand. Where did that knowledge come from? How does it know what is true, and what is not? How does this relate to learning mathematics? What elements of thinking are distinctly human? Creativity came up a couple times as being one of these elements.

This was a perfect segue to this video about the IBM computer Watson learning to be a chef:

Few were able to really explain this away as being uncreative, but they weren't willing to claim that Watson was thinking here.

Another example was this video from the Google DeepMind lab:

I finished by leading a conversation about data collection and what it signifies. We talked about some basic concepts of machine learning, learning sets, and some basic ideas about how this compared to humans learning and thinking. One of my closing points was that one's experience is a data set that the brain uses to make decisions. If computers are able to use data in a similar way, it's hard to argue that they aren't thinking in some way.

Students had some great comments and questions along the way. One asked if I thought we were approaching the singularity. It was a lot of fun to get the students thinking this way, especially in a different context than in my IB Math and Physics classes. Building this also has me thinking about other projects for the future. There is no need to invent a graphing library on your own, especially for an activity used with students - Desmos definitely has it all covered.

Technical Details

I built Thinking Machine using Bootstrap, the Meteor Blaze template engine, jQuery, and the Desmos API. I'm especially thankful to Eli Luberoff and Jason Merrill from Desmos, who helped me with using its features. I used the API to do two things (a rough sketch of the regression piece follows the list):

  • Parse the user's rule and check it against the computer's rule using some test values
  • Graph the user's input and output data, perform regressions, and give the regression parameters
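For the regression piece, the flow looked roughly like this. This is a sketch based on my reading of the Desmos API documentation, not the actual Thinking Machine source; the element id, expression ids, and sample values are placeholders:

```javascript
// Sketch: push the student's input/output pairs into a Desmos table, ask for
// an exponential regression, and read back a fitted parameter.
// Assumes the Desmos API script is already loaded on the page.
var calculator = Desmos.GraphingCalculator(document.getElementById('calculator'));

// Student-provided data as a table with columns x_1 and y_1
calculator.setExpression({
  id: 'data',
  type: 'table',
  columns: [
    { latex: 'x_1', values: ['1', '2', '3'] },
    { latex: 'y_1', values: ['2', '4', '8'] }
  ]
});

// Regression: y_1 ~ a * b^(x_1)
calculator.setExpression({ id: 'fit', latex: 'y_1 \\sim a b^{x_1}' });

// Read a fitted parameter back out with a helper expression
var aHelper = calculator.HelperExpression({ latex: 'a' });
aHelper.observe('numericValue', function () {
  console.log('fitted a:', aHelper.numericValue);
});
```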

The whole process of using Desmos here was pretty smooth, and is just one more reason why they rock.

The learning algorithm is fairly simple. As described (though much more briefly) in the activity, the algorithm first assumes that the four regressions of the data are equally likely in an array called isThisRight. When the user clicks 'yes' for a given input and output, the weighting factor in the associated element of the array is doubled, and then the array is normalized so that the probabilities add to 1.

The selected input/output is replaced by a prediction from a model chosen according to the weights of the four models - higher weights mean a model is more likely to be selected. For example, if the quadratic model's weight is higher than the other three, a prediction from the quadratic model is more likely to be added to the list of four. This is why the guesses from a given model appear more frequently after it has been given a 'yes' response.
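In code, the weighting scheme looks something like the sketch below. The isThisRight name comes from the description above; everything else is my own naming, and the real implementation surely differs in detail:

```javascript
// Four candidate models start out equally likely.
// Order assumed here: linear, quadratic, cubic, exponential.
var isThisRight = [0.25, 0.25, 0.25, 0.25];

// A 'yes' doubles the weight of the model that produced the confirmed
// prediction, then the weights are re-normalized to sum to 1.
function recordYes(modelIndex) {
  isThisRight[modelIndex] *= 2;
  var total = isThisRight.reduce(function (sum, w) { return sum + w; }, 0);
  isThisRight = isThisRight.map(function (w) { return w / total; });
}

// Weighted random choice: models with higher weights are picked more often,
// so their predictions show up more frequently in the list the student sees.
function pickModelIndex() {
  var r = Math.random();
  var cumulative = 0;
  for (var i = 0; i < isThisRight.length; i++) {
    cumulative += isThisRight[i];
    if (r < cumulative) return i;
  }
  return isThisRight.length - 1;
}
```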

Initially I felt that asking the user for three inputs was a bit cheap. It only takes two points to define a line or an exponential regression, and three for a quadratic regression. I could have written a big switch statement to check whether the data was linear or exponential, then quadratic, and then say it had to be cubic. Instead, I wanted to give a learning algorithm a try and see if it could figure out the regression without my programming in that logic directly. In the end, the algorithm works reasonably well, including in cases where you make a mistake or give two repeated inputs. With only two distinct points, the program is eventually able to figure out exponential and quadratic rules, though cubic rules give it trouble. Ultimately, the prediction of the rule is probability based, which is what I was looking for.

The progress bar is obviously fake, but I wanted something in there to make it look like the computer was thinking. I can't find the article now, but I recall reading somewhere that if a computer is able to respond too quickly to a person's query, there's a perception that the results aren't legitimate. Someone help me with this citation, please.

Clicking Useless Buttons and Exponential Models

Last fall, when I was teaching my web design students about jQuery events, I included an example page that counted the number of times a button was clicked and displayed the total. As a clear indicator of their strong engagement in what I asked them to do next, my students competed with each other to see who could rack up the most clicks in a given time period. With the popularity of pointless games like Cookie Clicker, I knew there had to be something there to use toward an end that served my teaching.
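The example page isn't reproduced here, but the core of it is only a few lines of jQuery. A minimal version looks roughly like this; the element ids are placeholders rather than the ones I actually used:

```javascript
// Count clicks on a button and display the running total on the page.
// Assumes jQuery is loaded and the page has <button id="clicker"> and <p id="total">.
var clicks = 0;
$('#clicker').on('click', function () {
  clicks += 1;
  $('#total').text('Total clicks: ' + clicks);
});
```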

Shortly afterwards, I made a three-act video activity that used this concept - you can get it yourself here.

This was how I started a new unit on exponential functions with my Math 10 class this week. The previous unit was about polynomials and had polynomial regression for modeling through Geogebra as a major component. One group went straight to Geogebra to figure out how many clicks there would be. For the rest, the solutions were analog. Here's a sample:

[Three samples of student work]

When we watched the answer video, there was a lot of discouragement about how nobody had the correct answer. I used this as an opportunity to revisit the idea of mathematics as a set of different models. Polynomial models, no matter what we do to them, just don't account for everything out there in the universe. There was a really neat interchange between two students sitting next to each other, one who added 20 each time, and another who multiplied by 20 each time. Without having to push too much, these students reasoned that the multiplication case resulted in a very different-looking set of data.
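To make that contrast concrete: if c_n is the number of clicks after n rounds and c_0 is the starting count, adding 20 each time gives a linear model, while multiplying by 20 each time gives an exponential one (the 20 here just echoes the students' strategies, not the actual numbers from the video):

```latex
\[
c_n = c_0 + 20n \quad \text{(add 20 each time: linear)}
\qquad\text{vs.}\qquad
c_n = c_0 \cdot 20^{\,n} \quad \text{(multiply by 20 each time: exponential)}
\]
```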

This activity was a perfect segue into exponential functions, the least contrived setup I think I've managed in years. It was based, however, on a useless game with no real-world connections or applications aside from other, equally useless games. No multiplying bacteria or rabbits, no schemes of doubling the number of pennies for a month.

I put this down as another example of how relevance and the real world don't necessarily go hand in hand when it comes to classroom engagement.

Speed of Sound Lab - Another Update

Two years ago, I wrote about how I took tuning forks out of the standard resonance tube lab for measuring the speed of sound.

My students all have phones and make a modest effort to keep them put away. I decided to get them to take them out for the lab today.

I found this free function generator app that generates clean sine, square, triangle, and sawtooth waves across a pretty good range. Above 1 kilohertz, harmonics are visible on a software frequency analyzer, so I didn't have students go quite that high. The frequency can be set anywhere in this range by entering it manually or by using preset buttons in the app. By playing the waveform, plugging in earphones, and hanging them over the top of the tube, finding the fundamental frequency is pretty straightforward.

Collecting data in this lab has, in my experience, been a pretty slow process. Today though, my students were able to collect 15-20 frequency and height pairs in less than half an hour. I took all of their data and graphed it together. I'm pretty impressed with how consistently the data sits in a line:

The slope of the best fit of L vs. 1/(4f) forced through the origin is 320 m/s, which is probably the closest result to theoretical that I've ever gotten. The precision of the data is the big winner here. It was a simple task to ask students to cycle back through their range of frequencies and check that their new measurements meshed well with the old.
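For reference, the relationship behind that fit: treating the tube as closed at one end and ignoring the end correction, the fundamental resonance occurs when the tube length is a quarter of a wavelength, so plotting L against 1/(4f) gives a line through the origin whose slope is the speed of sound:

```latex
\[
L = \frac{\lambda}{4} = \frac{v}{4f}
\quad\Longrightarrow\quad
\text{slope of } L \text{ vs. } \frac{1}{4f} \;=\; v \approx 320~\text{m/s}
\]
```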