The seniors completed their final presentations this week: a series of TED-style talks on subjects ranging from 3D printing and product placement to the connection between meat and cancer and the lack of women in foreign policy. I've been really pleased with how this group has developed their skills in communicating ideas, both through writing last semester and through visual communication more recently.
We still have a couple of months left in the year, so when the seniors and I got back together for one class before spring break, they wanted to know what we were going to do with the time left. This point of the year for seniors, more so than other times, has a consistent theme of time ticking down in all sorts of ways. They keep an accurate count of the number of days left in school on a small chalkboard in the lounge. They keep track of college acceptances on a big map there as well. Keeping them in the present is much more easily said than done, so I tend to push seniors to think through big picture stuff at this stage.
So when we sat down in class this past week, I had rearranged the tables into a big, family-style U-shape to make clear that something would be 'different' from that point forward. I talked to them about my history in education. I described the different schools I went to and how they nudged my personal path one way or another. I then showed them two talks, one from Ken Robinson about the learning revolution, and the other from Shawn Cornally describing the Iowa BIG school.
My questions after both of these were simple:
In what ways are you who you are because of your school experience?
In what ways are you who you are in spite of your school experience?
We had a brief conversation about this, and students had really insightful and revealing comments about it. I didn't want to give a big assignment or written reflection for the break though. This is family vacation time, and I didn't feel the need - plus my plans for the next steps are still in the formative stages. I did want to get seniors at least thinking big picture about the role of school as part of their identity. One senior said on the way out: "pretty deep for the day before spring break, Weinberg." For someone who thinks about education as much as I (and most teachers I know) do, this type of question is the norm.
I spent the day in a room full of my colleagues as part of our school's official transition to using the Common Core standards for mathematics. While I've kept up to date with the development of CCSS and the roll-out from here in China, it was helpful to have some in-person explanation of the details from some experts who have been part of it in the US. Our guests were Dr. Patrick Callahan from the Illustrative Mathematics group and Jessica Balli, who is currently teaching and consulting in Northern California.
The presentation covered three key areas. The first focused on modeling and Fermi problems. I've written previously about my experiences with the modeling cycle as part of the mathematical practice standards, so this element of the presentation was mainly review. Needless to say, however, SMP4 (Model with mathematics) is my favorite, so I love anything that generates conversation about it.
That said, one element of Jessica's modeling practice took me by surprise, particularly given my enthusiasm for Dan Meyer's three-act framework. She writes about the details on her blog here, so go there for the long form. When she begins her school year with modeling activities, she leaves out Act 3. Why?
Here's Jessica talking about the end of the modeling task:
Before excusing them for the day, I had a student raise their hand and ask, "So, what's the answer?" With all eyes on me, a quick shrug of my shoulders communicated to them that that was not my priority, and I was sticking to it (and, oh, by the way, I have no idea what time it will be fully charged). Some students left irritated, but overall, I think the students understood that this was not going to be a typical math class.
Her whole goal is to break students of the 'answer-getting' mentality and focus on process. This is something we all try to do, but perhaps we pay it more lip service than we realize by filling that need with Act 3. Something to consider for the future.
The other two elements, also mostly based in Jessica's teaching, went even further in developing other student skills.
I had never heard of Bongard problems before Jessica introduced us to them. They involve looking at well-defined sets of six examples and six non-examples, and then writing a rule that distinguishes the two.
Here's an example, Bongard Problem #1:
You can find the rest of Bongard's original problems here.
In Jessica's class, students share their written rules with classmates, get feedback, and then revise their rules based only on that feedback. Before today's session, if I were to do this, I would eventually have brought the class together to write a sample rule as a whole group. I'm probably doing my students a disservice by taking that shortcut, however, because Jessica doesn't do this. She relies on students to do the work of piecing together a solid rule that works in the end. She has a nicely scaffolded template to help students with this process, and spends a solid amount of time helping students understand what good feedback looks like. Though she helps them with vocabulary from time to time, she leaves it to the students to help each other.
Dr. Callahan also pointed out the importance of explicitly requiring students to write down their rules, not just talk about them. In his words, this forces students to focus on clarity to communicate that understanding.
The final piece took the idea of peer feedback to the next level with another template for helping students workshop their explanations of process. This should not be a series of sentences about procedure, but instead mathematical reasoning. The full post deserves a read to find out the details, because it sounds engaging and effective:
I want to focus on one highlight of the post that notes the student centered nature of this process:
I returned the papers to their original authors to read through the feedback and revise their arguments. Because I only had one paper per pair receive feedback, I had students work as pairs to brainstorm the best way to revise the original argument. Then, as individuals, students filled in the last part of the template on their own paper. Even if their argument did not receive any feedback, I thought that students had seen enough examples that would help them revise what they had originally written.
I've written about this before, but I have trouble staying out of student conversations. Making this feedback written might be an effective way for me to supply mathematical vocabulary when needed (as Jessica said she does periodically) while otherwise keeping the focus on students going through the revision process themselves.
Overall, it was a great set of activities to get us thinking about SMP3 (Construct viable arguments and critique the reasoning of others) and attending to precision of ideas through use of mathematics. I'm glad to have a few days of rest ahead to let this all sink in before planning the last couple of months of the school year.
I created an interactive lesson called Thinking Machine for use with a talk I gave to the IB theory of knowledge class, which is currently on a unit studying mathematics.
The lesson made good use of the Meteor Blaze library as well as the Desmos Graphing Calculator API. Big thanks to Eli and Jason from Desmos for helping me with putting it together.
I was asked by a colleague if I was interested in speaking to the IB theory of knowledge class during the mathematics unit. I barely let him finish his request before I started talking about what I was interested in sharing with them.
If you read this blog, you know that I'm fascinated by the intersection of computers and mathematical thinking. If you don't, now you do. More specifically, I spend a great deal of time contemplating the connections between mathematics and programming. I believe that computers can serve as a stepping stone between students' understanding of arithmetic and the abstract idea of a variable.
The fact that computers do precisely what their programmers make them do is a good thing. We can forget this easily, however, in a world where computers do fairly sophisticated things behind the scenes. The fact that Siri can understand what we say, and then do what we ask, is impressive. The extent to which the computer knows what it is doing is up for debate. It's pretty hard to argue, though, that computers aren't going through reasoning processes similar to the ones humans use in going about their day.
Here's what I did with the class:
I began by talking about myself as a mathematical thinker. Contrary to what many of them might think, I don't spend my time going around the world looking for equations to solve. I don't seek out calculations for fun. In fact, I actively dislike making calculations. What I really enjoy is finding interesting problems to solve. I get a great deal of satisfaction and a greater understanding of the world through doing so.
What does this process involve? I make observations of the world. I look for situations, ideas, and images that interest me. I ask questions about what I see, and then use my understanding of the world, including knowledge in the realm of mathematics, to construct possible answers. As a mathematical and scientific thinker, this process of gathering evidence, making predictions using a model, testing them, and then adjusting those models is in my blood.
I then set the students loose to do an activity I created called Thinking Machine. I styled it after the amazing lessons that the Desmos team puts together, and used their tools to create it. More on that later. Check it out, and come back when you're done.
The activity begins by asking students to predict a mathematical rule created by the computer. The rule is never complicated: always a linear function. When the student enters the correct rule, the computer says to move on.
The next step is to turn the tables on the student - the computer will guess a rule (limited to linear, quadratic, cubic, or exponential functions) based on three sets of inputs and outputs that the student provides. Beyond those three inputs, the student should only answer 'yes' or 'no' to the guesses that the computer provides.
The computer learns by adjusting its model based on the responses. Once the certainty is above a certain level, the computer gives its guess of the rule, and shows the process it went through of using the student's feedback to make its decision. When I did this with the class, more than half of the class had their guesses correctly determined. I've since tweaked this to make it more reliable.
After this, we had a discussion about whether or not the computer was thinking. We talked about what it means for a computer to have knowledge of a problem at hand. Where did that knowledge come from? How does it know what is true, and what is not? How does this relate to learning mathematics? What elements of thinking are distinctly human? Creativity came up a couple times as being one of these elements.
This was a perfect segue to this video about the IBM computer Watson learning to be a chef:
Few were able to really explain this away as being uncreative, but they weren't willing to claim that Watson was thinking here.
Another example was this video from the Google DeepMind lab:
I finished by leading a conversation about data collection and what it signifies. We talked about some basic concepts of machine learning, learning sets, and some basic ideas about how this compared to humans learning and thinking. One of my closing points was that one's experience is a data set that the brain uses to make decisions. If computers are able to use data in a similar way, it's hard to argue that they aren't thinking in some way.
Students had some great comments and questions along the way. One asked if I thought we were approaching the singularity. It was a lot of fun to get the students thinking this way, especially in a different context than in my IB Math and Physics classes. Building this also has me thinking about other projects for the future. There is no need to invent a graphing library on your own, especially for use in an activity used with students - Desmos definitely has it all covered.
I built Thinking Machine using Bootstrap, the Meteor Blaze template engine, jQuery, and the Desmos API. I'm especially thankful to Eli Luberoff and Jason Merrill from Desmos, who helped me with using the features. I used the API to do two things:
Parse the user's rule and check it against the computer's rule using some test values
Graph the user's input and output data, perform regressions, and give the regression parameters
The whole process of using Desmos here was pretty smooth, and is just one more reason why they rock.
The learning algorithm is fairly simple. As described (though much more briefly) in the activity, the algorithm first assumes that the four regressions of the data are equally likely in an array called isThisRight. When the user clicks 'yes' for a given input and output, the weighting factor in the associated element of the array is doubled, and then the array is normalized so that the probabilities add to 1.
The selected input/output is replaced by a prediction from a model that is selected according to the weights of the four models - higher weights mean a model is more likely to be selected. For example, if the quadratic model is higher than the other three, a prediction from the quadratic model is more likely to be added to the list of four. This is why the guesses for a given model appear more frequently when it has been given a 'yes' response.
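A rough sketch of that weighting scheme, written in Python rather than the JavaScript the activity actually runs on; apart from the isThisRight idea, all of the names here are hypothetical:

```python
import random

# The four candidate models the computer chooses between.
MODELS = ["linear", "quadratic", "cubic", "exponential"]

def update_weights(weights, model_index, answer_is_yes):
    """Double the weight of the model whose guess got a 'yes',
    then renormalize so the weights sum to 1."""
    weights = list(weights)
    if answer_is_yes:
        weights[model_index] *= 2
    total = sum(weights)
    return [w / total for w in weights]

def pick_model(weights):
    """Choose a model at random, in proportion to its weight, so guesses
    from a confirmed model show up more frequently."""
    return random.choices(MODELS, weights=weights, k=1)[0]

# Start with the four regressions equally likely, as in isThisRight.
is_this_right = [0.25, 0.25, 0.25, 0.25]
# The user clicks 'yes' on a guess that came from the quadratic model.
is_this_right = update_weights(is_this_right, 1, True)
print(is_this_right)  # [0.2, 0.4, 0.2, 0.2]
```

The doubling-and-normalizing step is what makes a 'yes' response tilt future guesses toward the confirmed model without ruling the others out entirely.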
Initially I felt that asking the user for three inputs was a bit cheap. It only takes two points to define a line or an exponential regression, and three for a quadratic regression. I could have written a big switch statement to check if the data was linear or exponential, then quadratic, and then say it had to be cubic. Instead, I wanted to give a learning algorithm a try and see if it could figure out the regression without my programming in that logic directly. In the end, the algorithm works reasonably well, including in cases where you make a mistake or give two repeated inputs. With only two distinct points, the program is eventually able to figure out the exponential and quadratic cases, though cubic rules give it trouble. In the end, the prediction of the rule is probability-based, which is what I was looking for.
The progress bar is obviously fake, but I wanted something in there to make it look like the computer was thinking. I can't find the article now, but I recall reading somewhere that if a computer is able to respond too quickly to a person's query, there's a perception that the results aren't legitimate. Someone help me with this citation, please.
Last fall, when I was teaching my web design students about jQuery events, I included an example page that counted the number of times a button was clicked and displayed the total. As a clear indicator of their strong engagement in what I asked them to do next, my students competed with each other to see who could rack up the most clicks in a given time period. With the popularity of pointless games like Cookie Clicker, I knew there had to be something there to use toward an end that served my teaching.
Shortly afterwards, I made a three-act video activity that used this concept - you can get it yourself here.
This was how I started a new unit on exponential functions with my Math 10 class this week. The previous unit was about polynomials, and had polynomial regression for modeling through Geogebra as a major component. One group went straight to Geogebra to figure out how many clicks there would be. For the rest, the solutions were analog. Here's a sample:
When we watched the answer video, there was a lot of discouragement about how nobody had the correct answer. I used this as an opportunity to revisit the idea of mathematics as a set of different models. Polynomial models, no matter what we do to them, just don't account for everything out there in the universe. There was a really neat interchange between two students sitting next to each other, one who added 20 each time, and another who multiplied by 20 each time. Without having to push too much, these students reasoned that the multiplication case resulted in a very different looking set of data.
This activity was a perfect segue into exponential functions, the least contrived one I think I've set up in years. It was based, however, on a useless game with no real-world connections or applications aside from other, equally useless games. No multiplying bacteria or rabbits, no schemes of getting double the number of pennies for a month.
I put this down as another example of how relevance and real world don't necessarily go hand in hand when it comes to classroom engagement.
My students all have phones and make a modest effort to keep them put away. I decided to get them to take them out for the lab today.
I found this free function generator app that generates clean sine, square, triangle, and sawtooth waves across a pretty good range. Above 1 kilohertz, harmonics are visible on a software frequency analyzer, so I didn't have students go quite that high. The frequency can be set by entering it manually or by using preset buttons in the app. By playing the waveform, plugging in earphones, and hanging them on top of the tube, finding the fundamental vibration frequency is pretty straightforward.
Collecting data in this lab has, in my experience, been a pretty slow process. Today though, my students were able to collect 15-20 frequency and height pairs in less than half an hour. I took all of their data and graphed it together. I'm pretty impressed with how consistently the data sits in a line:
The slope of the best fit of L vs. 1/(4f) forced through the origin is 320 m/s, which is probably the closest result to theoretical that I've ever gotten. The precision of the data is the big winner here. It was a simple task to ask students to cycle back through their range of frequencies and check that their new measurements meshed well with the old.
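For a tube closed at one end, the fundamental frequency satisfies L = v/(4f), which is why the slope of L vs. 1/(4f) through the origin gives the speed of sound. Here's a minimal sketch of that through-the-origin fit, using made-up data rather than my students' measurements:

```python
# Fit L = v * x through the origin, where x = 1/(4f).
# Least-squares slope through the origin: v = sum(x*L) / sum(x^2).
def speed_of_sound(freqs_hz, lengths_m):
    xs = [1 / (4 * f) for f in freqs_hz]
    return (sum(x * L for x, L in zip(xs, lengths_m))
            / sum(x * x for x in xs))

# Made-up sample data consistent with v = 320 m/s:
freqs = [200, 400, 600, 800]               # fundamental frequencies (Hz)
lengths = [320 / (4 * f) for f in freqs]   # tube lengths L = v/(4f) (m)
print(round(speed_of_sound(freqs, lengths)))  # 320
```

Real student data would scatter around the line, of course; the forced-through-origin slope is what gets compared to the theoretical speed of sound.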
This was supposed to be the shortest part of a warm up activity. It turned into a long discussion that revealed a lot of student misunderstandings.
The question was about whether we could ignore air resistance on a textbook being thrown in the air. We spent most of our time discussing the differences and similarities between the three items here:
There were interesting comments about what factors influence the magnitude of air resistance. I was definitely leading the conversation, but it wasn't until a student mentioned acceleration that anyone was able to precisely explain why one fell differently from another. We eventually settled on making a comparison between gravity force and air resistance force and calculating acceleration to see how close it was to the acceleration of gravity.
I've written about my backwards approach to projectile motion previously here, here, and here.
I had students solving the warm-up problem to that first lesson, which goes like this:
A student is at one end of a basketball court. He wants to throw a basketball into the hoop at the opposite end.
What information do you need to model this situation using the Geogebra model? Write down [______] = on your paper for any values you need to know to solve it using the model, and Mr. Weinberg will give you any information he has.
Find a possible model in Geogebra that works for solving this problem.
At what minimum speed could he throw the ball in order to get it into the hoop?
The students did what they usually do with the Geogebra projectile motion model and solved it with some interesting methods. One student lowered the hoop to the floor. Another started with a 45 degree angle, and then increased the speed successively until the ball made it into the hoop. Good stuff.
A student's comment about making lots of guesses here got me thinking about finding solutions more algorithmically. I've been looking for new ways to play around with genetic algorithms and Monte Carlo methods since they are essentially guess and check procedures made productive by the power of the computer.
I wrote a Python program that does the following:
Get information about the initial characteristics of the projectile and the desired final location.
Make a large number of projectiles (guesses) with random values for angle and initial speed within a specified range.
Calculate the ending position of all of the projectiles. Sort them by how far they end up compared to the desired target.
Take the twenty projectiles with the least error, and use these values to define the initial values for a new, large number of projectiles.
Repeat until the error doesn't change much between runs.
Report the projectile at the end with the least error.
Repeat the entire procedure a number of times to see how consistent the 'best' answer is.
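The steps above can be sketched roughly like this. This is a simplified stand-in with hypothetical target values and names, not the actual program:

```python
import math
import random

TARGET_X, TARGET_Y = 25.0, 3.0   # desired landing point (made up)
LAUNCH_Y = 2.0                   # initial launch height (made up)
G = 9.8

def landing_error(speed, angle_deg):
    """How far the projectile's height is from the target height
    when it crosses x = TARGET_X."""
    angle = math.radians(angle_deg)
    vx = speed * math.cos(angle)
    vy = speed * math.sin(angle)
    if vx <= 0:
        return float("inf")   # never reaches the target
    t = TARGET_X / vx
    y = LAUNCH_Y + vy * t - 0.5 * G * t * t
    return abs(y - TARGET_Y)

def best_guess(generations=20, population=500, keep=20):
    # Make a large number of random guesses within a specified range.
    guesses = [(random.uniform(1, 30), random.uniform(5, 85))
               for _ in range(population)]
    for _ in range(generations):
        # Sort by how far each projectile ends up from the target.
        guesses.sort(key=lambda g: landing_error(*g))
        survivors = guesses[:keep]
        # Use the best guesses to seed a new generation of projectiles.
        guesses = [(random.gauss(s, 0.5), random.gauss(a, 2))
                   for s, a in random.choices(survivors, k=population)]
        guesses += survivors  # always keep the best found so far
    return min(guesses, key=lambda g: landing_error(*g))

speed, angle = best_guess()
```

The key move is the same one the post describes: keep the twenty guesses with the least error and breed the next generation of random guesses around them until the error stops improving.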
As a final step, I have this program outputting commands to graph the resulting projectile paths on Desmos. Pasting the result into the console while a Desmos calculator is open makes a nice graph of each generated projectile path intersecting at the desired target:
This shows that there is a range of possible answers, which is something I told my physics class based on their own solutions to the problem. Having a way to show (rather than tell) is always the better option.
I also like that I can change the nature of the answers I get if I adjust the way answers are sorted: one line in the code ranks the projectile guesses by how small their error is, and changing that criterion changes the character of the solutions.
Over the past few weeks, I've made some changes to my standards based grading system using the Meteor framework. These changes were made to address issues that students have brought up that they say get in the way of making progress. Whether you view these as excuses or valid points, it makes sense to change some of the features to match the students' needs.
I don't know what standard 6.2 means, Mr. Weinberg.
There are many places students could look to get this information. It does make sense, however, to have this information near where students sign up for reassessments.
When students select a standard, a link pops up (if the standard exists) with a description. This has made a big difference in students knowing whether the standard they sign up for is what they actually intend to assess.
I also added the entry for the current mastery level, because this is important in selecting appropriate assessments. The extra step of looking it up in the online gradebook isn't worth it to me, and asking students to look it up makes it their responsibility. That's probably where it belongs.
Can you post example problems for each standard?
The biggest issue students have in searching for online resources for a specific standard is not knowing the vocabulary that will get the best resources. There's lots of stuff out there, but it isn't all great.
I post links to class handouts and notes on a school blog, so the information is already online. Collecting it in one place, and organizing it according to the standards hasn't been something I've put time into.
Students can now see the standards for a given course, listed in order. If students are interested, they can look at other courses, just to see what they are learning. I have no idea if this has actually happened.
Selecting a standard brings a student to see the full text and description of the standard. I can post links to the course notes and handout, along with online resources that meet my standards for being appropriately leveled and well written.
At the moment, I'm the only one that can add resources. I've written much of the structure to ultimately allow students to submit sites, up-vote ones that are useful to them, and give me click data on whether or not students are actually using this, but I'm waiting until I can tweak some UI details to make that work just the way I want it.
Mr. Weinberg, I signed up for an assessment, but it's not showing up.
The already flaky internet in China has gotten even flakier as of late. Students were signing up for reassessments, but because of the way I implemented the database insert, the requests weren't actually making it to the server. I've learned a lot more about Meteor since I wrote this a year ago, so I've been able to make this more robust. The sign-up window doesn't disappear until the server actually responds and says that the insert was successful. Most importantly, students know to look for this helper in the upper-left side of the screen:
If it glows red, students know to reload the page and reconnect. Under normal Meteor usage conditions, this isn't a problem because Meteor takes care of the connection process automatically. China issues being what they are, this feature is a necessity.
I've written before about how good it feels to build tools that benefit my students, so I won't lecture you, dear reader, about that again. In the year since first making this site happen though, I've learned a lot more about how to build a tool like this with Meteor. The ease with which I can take an idea from prototype to production is a really great thing.
The next step is taking a concept like this site and abstracting it into a tool that works for anyone that wants to use it. That is a big scale project for another day.
Given that few of my students have programmed before this class, there are some gaps in knowledge that I'll need to think through. The one thing I didn't want to do in this class was declare that students need to go through a full CS course before being able to touch this material. The value of a framework like Meteor is the ease with which anyone can piece together an application. The most consistent theme in this course has been that getting students working with code and troubleshooting upfront is much more productive than a lecture on for-loops. My model has been to have students take a piece of code, figure out what it does, and then hack or tweak it to do something different. As students get more experience, they become more comfortable writing code from scratch to complete a task.
Despite knowing this about my group, I almost didn't continue the model in exactly this way after the winter break. I was going to do a unit on classes and methods, but while brainstorming how to do this, I realized that the better approach would be to look at database queries and MongoDB. The concept of properties would be obvious in the way Mongo stores information, so it would then be easy to talk about the concept of objects once students were interacting with the database. Again, this became more 'application first, theory second.'
This also meant I had another opportunity to bring up fundamentals of computational thinking. I opened up the lesson by having students look at a screen sized list of all of the ninth grade students and some of the information stored by the school. I asked them questions like this:
How many students in Mr. S's advisory are girls?
How many students in the red or blue house are in Mrs. M's advisory?
Do any students share the same birthday?
They didn't mind this too much, but there were some different answers to the questions that came from counting by hand. It was fairly mindless work and none of them were too bothered by my requests to do it with this data set. Then I told them that we were going to do the same with the list of 160 students in the entire upper school.
They didn't need to ask whether there was a better way; they know me at this point, so I told them that there was, of course, a better way. I taught them some Mongo queries using a sandboxed collection of student information from the school. I then set them loose with a list of questions about a fictional database of people that I generated and posted at http://citizens.meteor.com. (The names for the collection came from the cast list for the most recent Hobbit movie. A fun side project, by the way.) A subsequent lesson was about sorting, limiting, and interacting with an array of returned documents, and students handled it well. We did some quick demonstrations of dot notation, but I didn't make a big deal out of it.
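To give a sense of what those queries do, here's the same filtering in plain Python over a handful of made-up records (all of the names and field values are hypothetical). The comments show roughly equivalent Mongo shell queries:

```python
# Hypothetical records like the ones students queried by hand.
students = [
    {"name": "A", "advisory": "Mr. S",  "gender": "F", "house": "red"},
    {"name": "B", "advisory": "Mr. S",  "gender": "M", "house": "blue"},
    {"name": "C", "advisory": "Mrs. M", "gender": "F", "house": "red"},
    {"name": "D", "advisory": "Mrs. M", "gender": "F", "house": "green"},
]

# "How many students in Mr. S's advisory are girls?"
# Mongo shell: db.students.find({advisory: "Mr. S", gender: "F"}).count()
girls_in_s = sum(1 for s in students
                 if s["advisory"] == "Mr. S" and s["gender"] == "F")

# "How many students in the red or blue house are in Mrs. M's advisory?"
# Mongo shell: db.students.find({advisory: "Mrs. M",
#                                house: {$in: ["red", "blue"]}}).count()
red_blue_m = sum(1 for s in students
                 if s["advisory"] == "Mrs. M"
                 and s["house"] in ("red", "blue"))

print(girls_in_s, red_blue_m)  # 1 1
```

The point the lesson makes holds either way: a query expresses the counting rule once, and the computer applies it to 160 records as easily as to four.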
In the next class, I gave students the following prompt:
Mr. Weinberg wants to survey students on their favorite ice cream flavor. He wants to be able to sort the results by grade, gender, and house. Sketch the layout of a web form he could use to collect this information.
Their sketches were good fun:
I asked them to then work backwards from their database experience: What would a single document in the database containing the results of this survey look like? They were coming up with sample properties and values. I then showed them how these values could be captured from the form elements using jQuery.
Then came the Meteor magic.
I took the web form and pasted it into a template. I took the jQuery calls and put them in an event handler for the template. I added a line to create a collection, made another quick template to show results, and then made a helper to fill that template with the entries in the database. One last thing I put in to prevent rapid submissions – calls to clear out all of the form elements after the database insert.
I typed meteor in the terminal, fixed one error, and then the app was live. I had students go to my IP address and the form popped up. The page started filling with survey results as students realized they could interact with the page. These were initially full submissions, but soon after, lines with empty values showed up as students realized that they could add garbage data and submit them really quickly. I told some of the students that were doing this that people would be doing that with their apps soon, so there would need to be a way to handle it in their apps.
I then set students off in groups to do this same process with different web applications along the lines of the one I used to start class. It was incredibly fun hearing them talk about how they were going to move forward, including a number of new web page sketches. I gave them more concepts to work from, including an after school activities form and a web portal through which students could tell school administration that they were going to be late to school. I asked them to write down potential database queries to help find important information quickly. The really impressive part came when they had ideas for what they wanted to program. One student suggested a database of sports scores. Another, an online store.
The class was abuzz with ideas for what was possible. I knew that I had to show students how to get these into Meteor the next class, in the easiest way possible.
Enter Meteorpad. I made a streamlined form with instructions on how to take the web forms they had designed and get them into a template with as few steps as possible. The students don't currently have terminal access on their Macbooks, so I can't get them to run Meteor locally.
You can check out the MeteorPad template I gave them here...
...and the full set of instructions for adapting their code to it here.
They followed the instructions, and by the end of the class, most had their own versions working. The students then started tweaking them to see what they could do to make it work as they wanted.
Today, my students went from an idea concept to coding their own apps to getting these prototypes online. Yes, the apps were primitive, lacked error handling and styling, and had typos. No, the students didn't have much understanding of the differences between helpers and event handlers. That is just fine. It's only January! I originally thought we'd get to this point by the end of the year, so this is a great place to be right now.
I can now help students take their ideas and turn them into working prototypes. These students know how to look up code to do what they want to do. They've happened upon W3Schools pages and StackOverflow, and while they are generally overwhelmed by what they find there, they know how to ask me the right questions about what they see. This was a great way to end a busy week.
I am now in the second semester of teaching a senior research project course. The first semester consisted of students identifying a research question and thesis, and then putting together a fully developed and referenced research paper. In the past, the second semester was devoted to putting together presentations on the same topic. I've been encouraged to modify this sequence as I see fit this year.
If there's one thing I want students to care about in terms of the presentations, it's that awareness of design principles can help their ideas come across clearly. As a result, I've pieced together some activities that center on learning design principles as a way to communicate meaning.
I started this semester's first class with an exercise from p. 47 of the Design Basics Index. Here's the basic idea:
Draw ten circles of the same size and uniform color on your paper in an arrangement that shows each of the following words:
I then collected their drawings using my submitMe application so that we could see them all together.
The results were really fun to look at and discuss. Here's a selection:
One really nice result was that students pointed out the commonalities between some of the drawings and discussed them without my bringing it up. When does unity cause intimidation? When does unity cause isolation?
This was a blast. Definitely a good way to start a lot of conversation without needing to say too much.