Category Archives: algebra 2

Rethinking my linear function approach in Algebra 2

My treatment of linear functions in the past has been pretty traditional: solve for y, identify y = mx + b, graph using slope-intercept form, then move on to linear inequalities in two variables...it is just dull this way. Most students have seen it before in one form or another, and it wasn't exciting (or that novel) to them the first time they learned it. It doesn't have to be this way, and I committed myself this year to doing things differently.

My approach has been centered on two big ideas:

  1. Linear functions have a constant rate of change. All of the other qualities they have are related to this important fact.
  2. There is an amazing connection between graphs, tables of values, and the equations that generate linear functions. These are not three separate skills, they are three views of the same fundamental mathematical object. Corollary: Teaching them on three separate days or sticking to one view at a time creates an unnecessary pigeon-holing effect that sticks with students for as long as conditions in your class permit.

On day one, we did my Robot Tracking activity posted here at GeogebraTube. The video introduction was reviewed in class and students worked on it for much of the period. This emphasized a fundamental concept around linear functions of distance and time that was pretty intuitive to nearly all of the students that did this activity.

Predicting where something will be, assuming it continues moving at a constant rate, is one of the most common applications of linearity. We do it all the time. Can we cross the street in front of the bus? Mental calculation. Where should I kick the soccer ball to get it right in front of the forward moving toward the goal? Mental calculation. I don't mean actually sitting down and calculating where the object will be, but that the human brain is pretty good at noticing the velocity of objects and making a pretty good guess of future position. The students had a number of methods for coming to an answer, ranging from the geometric (simply drawing a line) to counting grid squares, using the trace function, and proportional reasoning.

We ended the period looking at the Python script I posted here and trying to calculate speed from the information generated by the program. Part of the homework assignment for the next class was to try to answer the question posed by another Python program posted here. The table of values is randomly determined each time, and students could (and often did) try it multiple times to get it right.

The next lesson had a single instance of this program as a warm-up for the whole class - everyone had to agree on what value of position I needed to enter for the given time value. They were pretty good at checking each other and having good conversations about how to go about it. They answered correctly, and we had a good conversation about the different ways of getting there. All of their methods centered on the fact that there was equal spacing between the points. Most students used some variation of finding the distance moved per second, noting whether it was positive or negative, and then counting off intervals. In most cases, it was a bit complicated and required a lot of accounting to get to their answer.

We went over the reason we could do this - the constant rate of change - and verified it using a few different pairs of points. I then threw in the idea of applying the constant rate of change to a general point (x, y). We got to (y - b)/(x - a) = m, and I asked them to write this using the slope we calculated and any point (a, b) they liked from the table of data. I encouraged students seated next to each other to use different points. I then asked them to answer the original question from the Python program using their equation. (Un)surprisingly enough, they all ended up with the correct (and same) answer as before.
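
The reasoning the class converged on can be sketched in a few lines of Python. The table of (time, position) values here is made up for illustration - the point is that the constant rate of change means any anchor point from the table gives the same prediction:

```python
# Hypothetical sketch of the students' method: given a table of
# (time, position) values with a constant rate of change, find the rate
# and predict the position at any other time. Data here is made up.

table = [(0, 12), (1, 9), (2, 6), (3, 3)]  # (seconds, meters)

# Constant rate of change: any two rows of the table give the same slope.
(t1, p1), (t2, p2) = table[0], table[1]
m = (p2 - p1) / (t2 - t1)  # -3 meters per second

# Point-slope reasoning (p - b)/(t - a) = m, solved for p, works with
# ANY anchor point (a, b) from the table.
def position(t, anchor=table[2]):
    a, b = anchor
    return b + m * (t - a)

print(position(10))            # same answer regardless of anchor
print(position(10, table[0]))
```

Running it with different anchor points reproduces what the students discovered on paper: the predictions agree.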

Some of them started distributing and writing in slope-intercept form. The thing I was kind of excited about was that they didn't feel the equation had to be written that way; they just felt like seeing what happened. Many discovered that their answers were the same after doing so, even though they started with different points. We did a couple of examples of solving more basic "Write an equation for a line that..." questions, but did so without making a huge deal out of slope-intercept form or point-slope form and why one might be better than the other in different situations.

Today was the third day going through this concept - the warm up activity had three levels to it:

The goal here was to constantly push the students to go back and forth between the equation and numerical representations of these functions. There were lots of good things students figured out from these. We then made the jump to looking at how the graph is connected to the table and the equation - just one more way of looking at the same mathematical function, one that shares the meaning carried by the other two representations: a constant rate of change. The new idea introduced as part of this was that of an intercept. What does it mean on the graph? What does it mean in the table? We didn't talk explicitly about the intercept's meaning in the equation (again, trying to avoid the "that's just y = mx + b, I know this already...TUNED OUT" reaction), but it came out in the process of identifying intercepts algebraically, from tables, and then graphing.

By the end of the period, we were graphing linear functions. Students were asking excellent questions about when the intercepts alone can be used to graph the line and when they can't (2x + 3y = 6, with its integer intercepts, versus 2x + 3y = 7, without), but they again stuck to the idea of finding a point they know is on the graph and then using the constant rate of change to find others. Instead of spending a boring lesson explicitly telling them my expectations for graphing lines (labeled and scaled axes, the line going all the way across the extent of the axes, arrows on axes and lines), I was able to gently nudge students to do this while they worked.

We'll see how things go as we continue to move forward. The big thing I like about this progression so far is that modeling real phenomena will be a natural extension of what we've already done - not a lesson at the end of contrived examples with clean numbers. My original goal was to get this group comfortable with messy data and with using different tools to make sense of it.

I've kept my students hermetically sealed from this messiness in the past - integer coefficients, integer values, and explicit step-by-step ways of graphing, generating tables, and writing equations. As I mentioned before, it was, well, boring and predictable, and it perpetuated the idea that these skills are all separate from each other. It also continued the pattern that there would be a day in each unit with messy numbers - the real-world word problems day - but that the pain associated with it would last a day and be over soon enough.

I'm hoping to reduce this effect by changing my approach. By seeing the different aspects of linear functions together, students may find it natural to use a graph to figure out something that doesn't make sense algebraically, or to use numerical values to solve an algebraic problem. I especially like this because exploring the three views of functions really is, in my opinion, the primary learning goal of the Algebra 2 course. If I can establish this as an expectation early on, I think the latter parts of the course will run much more smoothly.

Why SBG is blowing my mind right now.

I am buzzing right now about my decision to move to Standards Based Grading for this year. The first unit of Calculus was spent doing a quick review of linear functions and characteristics of other functions, and then explored the ideas of limits, instantaneous rate of change, and the area under curves - some of the big ideas in Calculus. One of my standards reads "I can find the limit of a function in indeterminate form at a point using graphical or numerical methods."

A student had been marked proficient on BlueHarvest on four out of the five standards, but the limit one held her back. After some conversations in class and a couple of assessments on the idea, she still hadn't really shown that she understood the process of figuring out a limit this way. On the quiz, she had shown that she understood the function was undefined at the point, but she wasn't sure how to go about finding the value of the limit.

We have since moved on in class to evaluating limits algebraically using limit rules, and something must have clicked. This is what she sent me this morning:

Getting things like this that have a clear explanation of ideas (on top of production value) is amazing - it's the students choosing a way to demonstrate that they understand something! I love it. I have given students opportunities in the past to show me that they understand things through quiz retakes and one-on-one interviews about concepts, but it never quite took off until this year, when their grades are actually assessed through standards, not through Quiz 1 or Exam 1.

I also asked a student about their proficiency on this standard:

I can determine the perimeter and area of complex figures made up of rectangles, triangles, circles, and sections of circles.

I received this:
...followed by an explanation of how to find the area of the figure. Where did she get this problem? She made it up.

I am in the process right now of grading unit exams that students took earlier in the week, and I've found that the philosophy of these exams under SBG has changed substantially. I no longer have to worry that putting a difficult problem on the exam penalizes students for not making progress on it - as long as the problem assesses the standards in some way, any other work or insight I get into their understanding from what they try is a bonus. I don't have to worry about partial credit - I can give students feedback in words and comments, not points.

One last anecdote - a student had pretty much shown me she was proficient on all of the Algebra 2 standards, and we had a pretty extensive conversation through BlueHarvest discussing the details and her demonstrating her algebraic skills. I was waiting until the exam to mark her proficient since I wanted to see how student performance on the exam was different from performance beforehand. I called time on the exam, and she started tearing up.

I told her this exam wasn't worth the tears - she wanted to do well, and was worried that she hadn't shown what she was capable of doing. I told her this was just another opportunity to show me that she was proficient - a longer opportunity than others - but another one nonetheless. If she messed up a concept on the test from stress, she could demonstrate it again later. She calmed down and left with a smile on her face.

Oh, and I should add that her test is looking fantastic.

I still have students that are struggling. I still have students that haven't gone above and beyond to demonstrate proficiency, and that I have to bug in order to figure out what they know. The fact that SBG has allowed some students to really shine and use their talents, relaxed others in the face of assessment anxiety, and has kept other things constant, convinces me that this is a really good thing, well worth the investment of time. I know I'm just preaching to the SBG crowd as I say this, but it feels good to see the payback coming so quickly after the beginning of the year.

Winning the battle over Python programming

I have two stories to share from this week's programming activities with students. I have posted previously about my interest in making Python a fundamental part of my classes this year, and so I am finding ways to include it when it makes sense to do so.

I have a couple of students that are bridging the gap between Algebra 2 and Precalculus with an independent study that I get to design. The tentative title of the course for their transcript is 'Fundamentals of Mathematical Thinking', and the overall goal is to give these students a chance to develop the fundamental skills they need to be successful in later classes. I see it as an opportunity to really dig in to some cool mathematical ideas and get them to, well, dig into the fundamentals of mathematical thinking. I don't plan to put too much emphasis on algorithms (though we will spend some time working on skills in algebra, polynomial manipulation, functions, and other crucial topics where they are weak). Instead, the emphasis is on looking at a situation, exploring the way different variables might be used to model that situation, and then really digging in to abstract those variables into a model.

We are starting with what I think is the most fundamental application of this: sequences and series. Even simpler, the first task I gave the students was to look at the number of bricks in the rows of a triangular tower and use Python to add up the bricks in each row. This started as a couple of exercises getting to know Python's syntax. They are now taking programs I wrote to model this problem and adjusting them to find other sums, including the sums of even and odd numbers. One student that completed this task was intrigued that the sums of the latter were all perfect squares, but we didn't explore it any further at this point.
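
My original programs aren't reproduced here, but the brick-tower task and its odd-number variation amount to something like this sketch (function names are mine, made up for illustration):

```python
# A minimal version of the brick-tower exercise: add up the bricks in
# each row of a triangular tower, then adapt the same loop to sum odd
# numbers. A sketch of the kind of program described, not the actual file.

def tower_bricks(rows):
    total = 0
    for row in range(1, rows + 1):  # row n of the tower has n bricks
        total += row
    return total

def sum_of_odds(n):
    total = 0
    for k in range(1, n + 1):
        total += 2 * k - 1  # the k-th odd number
    return total

print(tower_bricks(10))  # 55
print(sum_of_odds(5))    # 25 - a perfect square, as the student noticed
```

Printing sum_of_odds for several values of n makes the perfect-square pattern (1, 4, 9, 16, ...) hard to miss.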

I then gave this student a bunch of sequences. His task was simple: model each one in Python and generate the given terms. This is a standard by-hand exercise for Algebra and Precalc students, but I figured that if he could do this with Python, clearly he was able to figure out the pattern. I showed him how to display fractions using string concatenation (e.g. producing '1/3' from str(1) + "/" + str(3)), which enabled him to develop the harmonic series. Today he figured out Fibonacci and a couple of other new ones. It was really fascinating to see him mess around and think deeply about the patterns associated with each one. I did tap him slightly in the right direction with Fibonacci, but I have otherwise been hands off. I am also having him write about his work to give him opportunities to work on his writing too. When he feels comfortable sharing it (and I have already warned him that this is the plan), I will post links to his work here.
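
The student's actual code isn't shown here, but the two sequences might be modeled along these lines (the string-concatenation trick for fractions, plus a running pair of terms for Fibonacci; details are assumptions, not his work):

```python
# Sketched solutions to two of the sequence tasks described above.

def harmonic_terms(n):
    # Display each term as a fraction string via concatenation.
    return [str(1) + "/" + str(k) for k in range(1, n + 1)]

def fibonacci_terms(n):
    terms = [1, 1]
    while len(terms) < n:
        terms.append(terms[-1] + terms[-2])  # sum of the previous two terms
    return terms[:n]

print(harmonic_terms(4))   # ['1/1', '1/2', '1/3', '1/4']
print(fibonacci_terms(6))  # [1, 1, 2, 3, 5, 8]
```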

The other new thing was in Calculus. I have shortened my review of Pre-Calculus concepts substantially, and have made the first unit a survey of limits, rate of change, and definite integrals. Most of this has required technology to explore local linearity and difference quotients. On Thursday, I introduced using rectangular sums to find area - they were otherwise stuck on counting boxes, and I could tell they felt it was like baby math. They really didn't know any other way.

In showing them rectangular sums, we had some pretty good discussions about overestimating and underestimating. The students had conversations about how rough an approximation only 3 - 5 rectangles gave for the area under a parabola. A couple of them figured out how to use more rectangles. I told them I was going to write a program to do this while they were sitting and working. I created this program and talked them through how it works. They thought it was too complicated to be worth the time, but I think they did understand the basic idea. I then changed the value of N and asked them what they thought that meant. They got it right the first time. I then pushed N to higher and higher values, and they immediately saw that the sum was approaching a limit. Game, set, match.
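
The linked program isn't reproduced in this post, but a minimal left-endpoint rectangle sum with an adjustable N looks something like this (the function and interval here are made up for illustration):

```python
# Sketch of a rectangle-sum program: N rectangles under f on [a, b].

def f(x):
    return x ** 2

def rectangle_sum(a, b, n):
    width = (b - a) / n
    total = 0.0
    for i in range(n):
        left = a + i * width      # left endpoint of rectangle i
        total += f(left) * width  # area of one rectangle
    return total

# Pushing N higher approaches the exact area under x^2 on [0, 3] (which is 9).
for n in (3, 5, 100, 10000):
    print(n, rectangle_sum(0, 3, n))
```

Watching the printed sums creep toward 9 as N grows is the whole point: the limit idea appears on screen before anyone says the word "integral."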

Today I had the AP students together working on another definite integral activity focused on the trapezoidal rule. I showed them the code again and pointed out the line that calculates area. It wasn't too much of a stretch for them to work their way to adjusting the program for the trapezoidal rule. We ran out of time to discuss comparisons between the two programs, but they stayed late after class and into their lunch getting it working on their own computers and playing a bit. Here is what we came up with.
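
The students' adjustment amounts to changing the one line that computes each slice's area. A sketch, again assuming f(x) = x**2 for illustration (this is not the code the students produced):

```python
# Trapezoidal rule sketch: each slice's area is the average of the two
# endpoint heights times the slice width.

def f(x):
    return x ** 2

def trapezoid_sum(a, b, n):
    width = (b - a) / n
    total = 0.0
    for i in range(n):
        left = a + i * width
        right = left + width
        total += (f(left) + f(right)) / 2 * width  # average height * width
    return total

print(trapezoid_sum(0, 3, 100))  # already very close to the exact area, 9
```

Comparing this against the rectangle version for the same N shows how much faster the trapezoids converge, which is a nice follow-up conversation.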

The big battle I see is two-fold.

  • Help students not be intimidated by the idea of writing a program to do repetitive calculations.
  • Give students opportunities to see it as necessary and productive to use a computer to solve a problem.

Sometimes these battles are the same, other times they are different. By using the built-in version of Python on their Macs, I have already started seeing them run commands and use text editors to create scripts without too much trouble. That's the first battle. My plan is to give lots of examples supporting the second one in the beginning, and slowly push the burden of writing these programs on to the students as time goes by and they become more comfortable with the idea. So far I am feeling pretty good about it - stay tuned.

Python in Algebra 2 - An Experiment

One of my goals is to include some Python programming as part of all of my courses this year, and to do so in a way that isn't just wedging it in where it doesn't belong. We don't have a formal programming class in our school given our size, but I have heard that students are interested in the broad topic of programming, and know that they could benefit. So, I am finding times to get students playing around with it as a tool.

The perfect opportunity in Algebra 2 today came in evaluating algebraic expressions. I don't like reviewing the topic, or at least haven't in the past, because in most cases the students remember enough of it to think that they know how to do it, but have forgotten all the nasty bits about order of operations, distributing negative signs, and the infamous conclusion that -a^2 = 25 when a = -5. They typically have great interactions reminding each other of the rules, but by the time they get to me in Algebra 2, the idea is no longer fresh. The lesson then ends up being the math-lesson equivalent of an air freshener - temporary and stale.
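
Conveniently, Python's exponent operator follows the same convention as the math notation, so the interpreter can act as a referee for exactly this mistake:

```python
# The exponent binds tighter than the negation, in Python as in algebra:
# -a**2 means -(a**2), not (-a)**2.
a = -5
print(-a ** 2)    # -25, since this parses as -(a**2) = -((-5)**2)
print((-a) ** 2)  # 25, the result students often expect
```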

Following my goal, I figured this would be a perfect opportunity to introduce the topic first as a programming topic, and then use the computer as a resource for the students to check their arithmetic. We started the class with some basic order of operations questions:

This was followed by pasting the following into a Python interpreter as everyone was watching:

print "Answers to the Warm-Up Questions:"
print (8*3 - 2*4)               # 16
print (27 + 18/9 - 3**2 + 1)    # 21
print (40 + 24)/8 - (2**3 + 1)  # -1

This followed a suggestion from Kevin Krenz to demonstrate the fast way to check answers using the Python interpreter. While the students weren't wildly impressed, they did accept that this was an option for checking their work on these types of questions, and they were up for learning how it worked.

I then showed them how to run a Python file on their Macbooks, which all have at least Python 2.6 running on them in the terminal. I talked about working in the terminal as running around in the basement of their computer - lots of power and hidden secrets there to play around with (or mess up if not careful). After learning to do this, they edited a partially completed Python script which I have posted at Github here.

I really liked what happened afterwards, though it did not feel (at all) like a clean, straightforward way of going over algebraic expressions. It was messy. Different people were at different places during the entire 30 minutes we worked on it, which was much longer than I expected. Quite appropriately though, it slowly came together like writing a program often does. Lots of good discoveries and realizations of simple errors that I didn't need to force.

Students realized the difference between 2*x and 2x to the computer. They realized quite cleanly that they needed to tell the computer outright that there is multiplication between a coefficient and a variable. They saw this was not the case for -x, though some thought they might need to write it as -1*x; the interpreter pointed out immediately that the bare negative works. The interpreter wasn't as helpful with 4(3 - x), since it considered that a function call, but with some prodding, most students realized it was the same kind of error.
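
The interpreter session isn't shown in the post, but the experiments went roughly like this sketch:

```python
# What the students discovered at the interpreter (sketched):
# a coefficient needs an explicit *, a bare negative works as-is,
# and 4(3 - x) is read as a function call rather than multiplication.
x = 2
print(2 * x)        # 4 - the multiplication written out explicitly
print(-x)           # -2 - no "-1*" needed
# print(2x)         # SyntaxError: Python won't infer the multiplication
# print(4(3 - x))   # TypeError: 'int' object is not callable
print(4 * (3 - x))  # 4 - the corrected version of 4(3 - x)
```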

There was enough information in the script for them to figure out how to do exponents, so I was happy not to have to go through that separately. The only really big problem was that Python 2.6 doesn't have the true floating-point division that 3.2 performs by default. For the first problem, part (a), the answer is 0.5, but Python 2.6 returns 0, since it assumes integer division when a plain / symbol sits between two integers. I went around to student computers replacing x/y with x*1./y, but this became an opportunity to converse with students about division as multiplication by the multiplicative inverse, or reciprocal. Another unintended complication that then resulted in more review of pure mathematical concepts.
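
For anyone trying this, there is also a cleaner fix than sprinkling *1. through the script: the __future__ import, which makes / behave like Python 3's true division in Python 2:

```python
# One line at the top of a Python 2 script fixes the division behavior;
# the import is a harmless no-op in Python 3.
from __future__ import division

print(1 / 2)   # 0.5, true division (in Python 2.6 without the import: 0)
print(1 // 2)  # 0 - integer division is still available when wanted
```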

With all of this done, the students were then pretty proficient at trying to do the substitution by hand and checking against the answers from the computer. Most caught the serious mistakes without too much input from me - the computer did that work for me.

After finishing problem 1, the students got a big kick out of how I told them to program Problem 2 at the end of the script. They were directly teaching the computer to answer these questions through code. I think they saw that programming really is how you teach a computer to do what you want it to do, and had at least a minimal sense of pride in being able to do so.

One student said this was pretty cool, and compared it to a video game. Another appeared to want to kill me the entire time. They were all pretty patient with the activity though, and trusted that this would make them better at what they needed to learn for my class - probably the most important part to this not leading to a serious case of Thursday afternoon mutiny.

In the grand scheme of technology implementation, this activity was nothing more than using Python to replace a graphing calculator with substitution capability. This type of knowledge, however, is important for doing more substantial applications of computational thinking. I think it's important to get students to see what programming can do before asking them to create something as simple as 'Hello world', which doesn't seem to interest the vast majority of students. While I did most of the programming for this task, it is a gateway to the students doing and seeing more down the line. Now that they know the basics of editing and running a program, we will be more successful in doing more sophisticated things later on.

Experimenting with iBooks Author

I recently took the step of dipping my feet in the Apple pool, much to the surprise of many people that know me and my preferences. There were a few reasons that I decided it would be a good idea, but one of them was the opportunity to experiment on my own time with iBooks Author.

I've tossed around the idea of writing a book. A few ideas for topics have been bouncing around, one of them being a book in which the concepts of mathematical thinking are explored through programming. Given that all Mac computers have Python installed automatically, and that it can be installed on other platforms quite easily, Python is a perfect fit.

Now that I'm set up with my Mac, I've spent the last couple of days playing with it and getting to know its quirks. It does have quirks. I spent a couple of hours today battling a mystery white box that covered anything that slid into it, and that remained even after saves, restarts, and reboots. Eventually I got rid of it (though I'm not totally sure how) and put together an activity I plan to have some independent study students work through this year.

The quiz options are nice ways to make things interactive, but they have all the same downsides of multiple-choice questions. If there were a fill-in-the-blank option, I could very easily see putting together my own self-guided lessons along the lines of Udacity. That's really what I'm looking for. The really powerful thing to have would be an HTML5 Python interpreter, and I haven't yet looked to see if something exists that would work with the interface.

I found out late in the process that images placed in landscape mode only show up in the portrait orientation if they are set to be 'inline' instead of floating or anchored. Backsliding ensued.

On the whole, it's a nice free publishing platform, one that also produces nice PDF files. I didn't have much multimedia material to throw in, and my attempts to do so would have been exercises in using features, not enhancements of the book as a learning opportunity. As many have noted previously, iBooks Author offers quite a bit of horsepower for generating flashy multimedia textbooks, but the extent to which it revolutionizes education isn't quite there. Opportunities for interfacing with others reading the same content through chat, messages, or something like that would be a step in that direction.

For what it's worth, feel free to check out the final product below. While the text is written as if it's a finished book ("More information on this can be found in the Appendix"), it very much isn't. Just an experiment to fill my hours battling jet lag back in China.

Mathematical Reasoning with Python

Results of a unit long experiment in SBG and flipping.

I've been a believer in the concept of standards-based instruction for a while. The idea made a lot of sense when I first learned about it from Grant Wiggins, who visited my school in the Bronx a few years ago to present on Understanding by Design. Dan Meyer explored the idea quite a bit under his term for it, the concept checklist. Shawn Cornally talks on his blog about really pushing the idea to give students the freedom to demonstrate their learning in a way they choose, though he ultimately retains the power to judge whether they have done so. Countless others have been really generous in sharing their standards and their ideas for making standards work for their students. Take a look at my blogroll for more people to read.

For those unaware, here's the basic idea: look at the entire unit and identify the specific skills you want your students to have. Plan your unit to help them develop those skills. Assess and give students feedback on those skills as often as possible until they get it. In standards based grading (SBG), reporting a grade (as most of us are required to do) as a fraction of standards completed or acquired becomes a direct reflection of how much students have learned. Compare this to the more traditional version of grading, an average of various 'snapshots' on assignments, where grades might be as much a reflection of effort or completion as of actual learning. If learning is to be the focus of what we do in the classroom, then SBG is a natural way of connecting that learning to the grades and feedback we give to students.

My model for several years now has been, well, SBG lite. Quizzes are 15% of the total grade and test only a couple of skills at a time. Students can retake quizzes as many times as they want to show that they have the skills in isolation.
On tests (60% of the total grade), students can show that they can correctly apply the set of all of their acquired skills on exercises (questions they have seen before) as well as problems (new questions that test conceptual understanding). As much as I tell students they can all have a grade of 100% for quizzes, and as much as I remind those that don't to retake, it doesn't happen. I'll get a retake here or there. I am still reporting quiz grades as an average of a pool of "points", though, and this might leave enough haziness in the meaning of the grade for a student to be OK with a 60%.

For this unit in Geometry and Algebra 2, I have specifically made the quiz grade a set of standards to be met. The point total is roughly the same as in previous units. It is a binary system - students either have the standard (3/3) or they don't (0/3) - and they need to assess each standard at least twice to convince me they have it. I really like Blue Harvest, but my students didn't respond so well to having two whole websites to use to check progress.

While a truly scientific study would have changed only one variable at a time, I also found that structuring the skill standards this way required me to change the way class itself was structured. This became an experiment not only in reporting grades, but in giving my students the power to work on things in their own way. It also freed me up to spend my time in class assessing, giving feedback, and assessing again. More on this ahead. The details:

Geometry

I started the unit by defining the seven skills I wanted the students to have by the end on this page. The unit was on transformational geometry, so a lot of the skills were pretty straightforward applications of different types of transformations to points, line segments, and polygons. I had digital copies of all of the materials I put together last year for this unit, so I was able to post all of that material on the wiki for students to work through on their own. I adjusted these materials as we moved through the unit and as I saw holes in their understanding. I was also able to make some videos using Jing and Geogebra to explain some concepts related to using vocabulary and symmetry, and these seemed to help some students that needed a bit of direct instruction in addition to what I provided one on one.

I also tried another experiment - programming assignments related to applying transformations to various points. I said that completing these assignments and chatting with me about them would qualify a student for proficiency on a given standard.

Assigning homework was simple: choose a standard or two, and do some of the suggested problems related to those standards. Be prepared to show me your evidence of study when you come into class. Students that said 'I read my notes' or 'I looked it over' were heckled privately - the emphasis was on actively working to understand concepts. Some students did flail a bit with the new freedom, so I made suggestions for which standards to spend a particular day working on, and this helped those students focus. I threw together some concept quizzes for the standards covered by the previous classes, and students could choose to work on the question types they felt they had mastered. Some handed the quiz right back, knowing they weren't ready. I was really pleased with the level of awareness they quickly developed around what they did and didn't understand.
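
One of those transformation programming assignments might look something like this minimal sketch (the actual assignments aren't shown in the post; the function names and figures here are made up):

```python
# Hypothetical transformation exercise: apply a translation and a
# reflection to a list of (x, y) points representing a polygon.

def translate(points, dx, dy):
    return [(x + dx, y + dy) for (x, y) in points]

def reflect_over_x_axis(points):
    return [(x, -y) for (x, y) in points]

triangle = [(0, 0), (4, 0), (0, 3)]
print(translate(triangle, 2, -1))     # [(2, -1), (6, -1), (2, 2)]
print(reflect_over_x_axis(triangle))  # [(0, 0), (4, 0), (0, -3)]
```
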
I quickly ran into the logistical nightmare of managing the paperwork and recording assessment results. Powerschool, Blue Harvest, whatever - this was the most challenging aspect of doing things this way. I often found myself bogged down during the class period recording these things, which got in the way of spending quality face time with students around their understanding. Part of this was that I was recording progress on each standard, whether good or bad, in the comment field for each student. "Understands basic idea of translation, but is confusing the image and pre-image" is the sort of comment I started writing in the beginning. While this was nice, and I think could have led to students reading the comments and getting ideas for what they needed to work on, it was a bit redundant, since I was having actual conversations with students about these facts.

Here is where Blue Harvest shines - I can easily send students a quick message explaining (and showing) what they need to work on. Even more powerful would be recording the conversation when I actually talk to the student, but that would be more practical with an iPad or cell phone app, to avoid lugging my computer from desk to desk. Still, I wanted the feedback to be immediate and recorded, so I knew I had to change my approach.

The compromise was to record only positive progress. If a student's quiz showed no progress, it didn't get a comment in Powerschool. If they showed progress but needed to fix a small detail in their understanding, they might get a comment. If they clearly got it, they got a comment saying that they aced it. Two or more positive comments (and my independent review) led to a 3/3 for each standard. The other promise I made was that if they clearly demonstrated proficiency on the exam (which had non-standard questions and some things they needed to explain), I would give them credit for the standard.

The other difficult issue was creating a bank of reassessment questions.
My system of making a quiz on the spot and handing it out to individual students was too time consuming. I created an app (using my new Udacity knowledge) to try to do this, the centerpiece being a randomized set of questions that emphasized knowing how to figure out the answers rather than students potentially sharing all the answers. They quickly found all the bugs in my system, and showed that it is far from ready to be an actually useful tool for this purpose. I appreciated their humor and patience in being guinea pigs for an idea.

As you might notice from the image above, there is a pretty strong relationship between the standards mastered and the exam scores. Most student exam scores were either the same or better under this system in comparison to previous exams. The most important metric is the fact that most students weren't hurt by going to this more student-centered model. Some students took more notes while working to understand the material than they have all year. Other students spoke more to their classmates, and both gave and received more help, in comparison to when I was at the front of the room asking questions and doing mini-lessons.

While there was a lot of staring at screens during this unit, there was also a lot of really great discussion. I had focused conversations with every single student three to four times a class, and they were directly connected to the level of understanding each student had developed. Some needed direct application questions. Others could handle deeper synthesis and 'why is this true' questions about more abstract concepts. It felt really great doing things this way. I have always insisted on crafting one good solid presentation to give the class - the perfect lesson - with good questions posed to the class and discussions inevitably resulting from them.
But I have to admit that having several smaller, unplanned, 'messier' conversations to guide student learning has nurtured this group to be more independent and self-driven than I expected before we started.
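As a coda to the reassessment-app experiment: the randomized-question idea boils down to generating fresh numbers for each student so that only the method transfers, not the answers. A minimal sketch in Python - the question type, number ranges, and function name here are hypothetical, not the app's actual code:

```python
import random

def make_translation_question():
    """Generate a random 'apply this translation' question with its answer.

    Each student gets different numbers, so sharing answers doesn't help,
    but the method for solving stays the same.
    """
    x, y = random.randint(-9, 9), random.randint(-9, 9)
    dx, dy = random.randint(-5, 5), random.randint(-5, 5)
    prompt = f"Translate the point ({x}, {y}) by the vector <{dx}, {dy}>."
    answer = (x + dx, y + dy)
    return prompt, answer

prompt, answer = make_translation_question()
print(prompt)
print("Answer:", answer)
```

The grading side then only needs to compare a student's response against the stored answer for their particular version of the question.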

Algebra 2

The unit focused on the students' first exposure to logarithmic and exponential functions. The situation in Algebra 2 was very similar to Geometry, with one key difference: almost all of the direct instruction was outsourced to video. I decided to follow the Udacity approach of several short videos (under 3 minutes each), because that meant there was opportunity (and the expectation) that only two minutes would go by before students would be expected to do something. I like this much better because it fit my own preferences in learning from the Udacity courses. I had two minutes to watch a video about hash functions in Python while brushing my teeth - my students should have that ability too. I wasn't going for the traditional flipped class model here. My motivation was less about requiring students to watch videos for homework, and more about students choosing how they wanted to go through the material. Some students wanted me to do a standard lesson, so I did a quick demonstration of problems for them. Others were perfectly content (and successful) watching the videos in class and then working on problems. Some really great consequences of doing things this way:

  • Students who said they watched all my videos and 'got it' after three two-minute videos had plenty of time in the period to prove it to me. Usually they didn't. This led to some great conversations about active learning. Can you predict the next step in the video when you try solving the problem on your own? What? You didn't try solving it on your own? <SMIRK>  The other nice thing about this is that it's a reinvestment of only two minutes to suggest they try again with the video, rather than a ten- or fifteen-minute lesson from Khan Academy.
  • I've never heard such spirited conversation between students about logarithms before. The process of learning each skill became a social event - they watched each video together, rewound or paused as needed, and then got into arguments while trying to solve similar problems from the day's handout. This sort of talk would often get in the way during teacher-centered lessons, and might be classified incorrectly as 'disruption' rather than the productive refining and conveyance of ideas that should be expected as part of real learning.
  • Having clear standards for what the students needed to be able to do, and making clear what tools were available to help them learn those specific standards, led to a flurry of students demanding to show me that they were proficient. That was pretty cool, and is what I had been trying to do with my quiz system for years, but failed at because there was just too much in the way.
  • Class time became split between working on the day's standards and then stopping at an arbitrary time to look at other cool math concepts. We played around with some Python simulations in the beginning of the unit, looked at exponential models, and had other time to just play with some cool problems and ideas so that the students might someday see that thinking mathematically is not just following a list of procedures; it's a way of seeing the world.
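The Python simulations were in this spirit - repeated multiplication by a constant factor is what makes a model exponential. A sketch in the same vein (this is illustrative, not the actual class activity; the starting value and rate are made up):

```python
def simulate_growth(initial, rate, steps):
    """Return the value after each step of growth at a fixed percent rate."""
    values = [initial]
    for _ in range(steps):
        # exponential growth: multiply by (1 + rate), don't add a constant
        values.append(values[-1] * (1 + rate))
    return values

# 1000 units growing at 5% per step, for 10 steps
print(simulate_growth(1000, 0.05, 10))
```

Comparing this to a version that adds a constant each step makes the linear-versus-exponential contrast concrete for students.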

I initially did things this way because a student needed to go back to the US to take care of visa issues, and I wanted to make sure the student didn't fall behind. I also hate saying 'work on these sections of the textbook' because textbooks are heavy, and usually blow it pretty big. I'm pretty glad I took this opportunity to give it a try. I haven't finished grading their unit exams (mostly because they took them today), but I will update with how they do if it is surprising.


Warning: some philosophizing ahead. Don't say I didn't warn you.

I like experimenting with the way my classroom is structured. I especially like the standards-based philosophy because it is the closest I've been able to get to recreating my Montessori classroom growing up in a more traditional school. I was given guidelines for what I was supposed to learn, plenty of materials to use, and a supportive guide on the side to help me when I got stuck. I have seen a lot of this process happening with my own students - getting stuck on concepts, and then getting unstuck through conversation with classmates and with me. The best part for me has been seeing my students realize that they can do this on their own, that they don't always need me to tell them exactly what to do at all times. If they don't understand an idea, they are learning where to look, and it's not always at me. I get to push them to be better at what they already know how to do rather than being the source of what they know. It's the state I've been striving to reach as a teacher all along, and though I am not there yet, I am closer than I've ever been before.

It's a cliché in the teaching world that a teacher has done his or her job when the students don't need you to help them learn anymore. This is a start, but it is also a closed-minded view of teaching as mere conveyance of knowledge. I am still just teaching students to learn different procedures and concepts. The next step is to show students not only that they can learn mathematical concepts, but that they can also make the big-picture connections and observe patterns for themselves. I think both sides are important. If students see my classroom as a lab in which to explore and learn interesting ideas, and my presence and experience as a guide to the tools they need to explore those ideas, then my classroom is working as designed. The first step for me was believing that students ultimately want and need to know how to learn on their own.
I get frustrated when students won't answer a question posed to the entire class, but will gladly help each other and have genuine conversations when that question comes naturally from the material. All the content I teach is out there on the internet, ready to be found/read/watched as needed. There's a lot of stuff out there, but students need to learn how to make sense of what they find. This comes from being forced to confront the messiness head on, to admit that there is a non-linear path to knowledge and understanding. School teaches students that there is a prescribed order to this content, and that learning needs to happen within its walls to be 'qualified' learning. The social aspect of learning is the truly unique part of the structure of school as it currently exists. It is the part that we need to really work to maintain as content becomes digital and schools get more wired and connected. We need to give students a chance to learn things on their own in an environment where they feel safe to iterate until they understand. That requires us as teachers to try new things and experiment. It won't go well the first time. I've admitted this to my students repeatedly throughout the past weeks of trying these things with my classes, and they (being teenagers) are generous with honest criticism about whether something is working or not. They get why I made these changes. By showing that iteration, reflection, and hard work are part of our own process of being successful, they just might believe us when we tell them it should be part of theirs.

Snacking on Statistics and Variability

One of my goals this year in Algebra 2 has been to include more discrete math, statistics, and probability when I can. I've been convinced by all sorts of smart people that as traditional as it may be to have Calculus as the ultimate goal for math students, statistics and probability are the math that people are more likely to need to use. It compels me to include it in my courses as more than a separate unit.

As if I didn't need another reason, we are also in a spell of reviewing properties of radicals, and it's refreshing to get my students thinking differently after a period of simplifying, multiplying, and rationalizing.

I gave them the following scenario:

  • Imagine yourself in twenty years - you are, of course, rich and famous. You are hiring someone to fly your personal jet. Your last pilot fell asleep on the job, though he was luckily parked at the gate when it happened.

Two pilots have applied for the position, both equally qualified as pilots. In order to help you make your decision (and avoid the previous situation), you have asked them to keep track of how many hours of sleep they get over a two week period before the interview.

Two weeks later, they return to you with the following data:

  • What differences do you notice about the two pilots?
  • What calculations would you make to describe any quantitative differences between them?
  • Which one would you hire? Why?

Note: This data is completely made up. My new semi-obsession is in using normal distributions to mess up clean functions and force my students (in physics and math) to deal with messy data.
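The data-messing trick is nothing more than sampling around a chosen mean with a chosen spread. A minimal version, with hypothetical means and spreads for the two pilots (the actual numbers I used came from a Geogebra file):

```python
import random

def fake_sleep_data(mean_hours, stdev_hours, nights, seed=None):
    """Scatter nightly sleep totals around a chosen mean using normal noise."""
    rng = random.Random(seed)
    return [round(rng.gauss(mean_hours, stdev_hours), 1) for _ in range(nights)]

consistent_pilot = fake_sleep_data(6.5, 0.5, 14, seed=1)   # small spread
erratic_pilot = fake_sleep_data(7.5, 2.0, 14, seed=2)      # large spread
print(consistent_pilot)
print(erratic_pilot)
```

Changing the standard deviation argument controls exactly the consistency-versus-average tension the scenario is built around.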

The students almost immediately started calculating means - exactly what I would have expected them to do given what they have been taught to do when faced with a table of data like this. Some did so manually, others used the Geogebra file that generated the data to make their calculation.

The results were fairly consistent - everyone chose the second pilot. When asked why, they said the pilot gets more sleep on average, and so would be the better choice.

When I asked who was more consistent in their sleep, they were easily able to identify the first pilot. When asked why, many had explanations that correctly danced around the fact that most of the data was closer to the average. No students brought this up before I asked, though, which leads me to believe one of two things:

  • They decided consistency doesn't really matter given the difference in the means for how much sleep the pilots got.
  • They didn't think to look at consistency at all.

Some other interesting tidbits:

  • None of the students thought to construct a histogram to look at the data. When asked, about half of the class said they knew how to construct a histogram. I didn't dig any deeper to flesh this out. I was going to throw one together in Geogebra, but decided that might be something we should look at with more time available.
  • The half of the class that is taking AP Psychology didn't think to find the standard deviation. Again, I didn't dig any deeper to find out whether this was because they didn't know it might apply here, or because they thought the values of the means were more important.
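For reference, both the calculation the students reached for (the mean) and the one they didn't (the standard deviation) are one-liners with Python's statistics module. The data below is made up, like the original, with pilot A consistent and pilot B not:

```python
import statistics

# Hypothetical two-week sleep logs (hours per night)
pilot_a = [6.4, 6.6, 6.5, 6.3, 6.7, 6.5, 6.4, 6.6, 6.5, 6.5, 6.4, 6.6, 6.5, 6.5]
pilot_b = [9.0, 4.5, 8.5, 5.0, 9.5, 4.0, 9.0, 5.5, 8.0, 6.0, 9.5, 4.5, 8.5, 7.0]

for name, data in [("Pilot A", pilot_a), ("Pilot B", pilot_b)]:
    print(name, "mean:", round(statistics.mean(data), 2),
          "stdev:", round(statistics.stdev(data), 2))
```

With the arithmetic out of the way, the discussion can stay on what a large standard deviation means for a pilot you are about to hire.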

There is plenty here to generate discussion, but the one thing I wonder about is if variation about a mean is a concept that comes naturally to students to consider when given a set of 1-D data. One of my professors mentioned offhand in an experimental design class that any measurement you take is a distribution, a point which I have never forgotten. Up to that moment, I had never really thought much about it either.

Sure, I had collected data in my biology, chemistry, and physics classes before and knew I had to take multiple data points. All I knew then was that doing so made my data "better". More data makes things better. Get it? My understanding in high school science was also that you never measure the same quantity at exactly the same value ten times in a row because someone in your lab group is always messing it up or doing it wrong. Averaging things together smooths that out. I don't recall ever discussing in either math or science class that the true beauty of statistics comes from managing, communicating, and understanding variability in data that will never really go away. I have always shuddered when students write lab report conclusions that discuss how "the data are/is wrong because" rather than focusing on what the data reveals about an experiment. We definitely want to work to minimize experimental error, but sometimes the variation in the data is an important characteristic of what is being measured.

Maybe this is something that needs to be explicitly taught in the way we present statistics to our students. It seems like something that needs to be drawn out over time, rather than in one big statistics unit of a course that focuses on other things. I think using technology to handle the mechanics of calculating statistical quantities allows students to focus more on what the statistics say and develop their intuition about it. Otherwise, we risk letting the important ideas of variation and statistics collect dust and stagnate as another box of content for students to throw in the closet of their busy, distracted brains.

Bringing robotic cars and Udacity to my classroom

I was really excited to learn about Udacity, a new online education system that premiered two courses on February 20th. That a course on programming a robotic car would appeal to me is probably not surprising to anyone that knows me. I also love having yet one more excuse to continue learning Python, especially one that gets me working with an expert in the field such as Professor Sebastian Thrun. I recall reading about him shortly before his team's successful bid at the DARPA Grand Challenge, and have since seen his name repeated at many key moments along my development as a robotics enthusiast.

The course is structured really well, with short videos introducing concepts, quizzes and programming tasks (with solutions) along the way to check comprehension, and homework assignments. The students love that I have homework.

I am busy, but this was too cool to pass up.

I also have a pretty hard time hiding the things I'm enthusiastic about in my classroom, so the content of the class has been something I've mentioned and shared with students at the start or end of planned activities. The whole classroom gasped at this video from the 25th second onward:

Based on that reaction, I really wanted to give them a sense for the things I was learning to do. The first week centered on learning about localization - a process that uses probability calculations to estimate the location of the car using sensor readings and a map of the surroundings. I did a quick overview of what this meant as a filler activity to break up work during class, but wanted to find a way to do much more.

Today's Algebra 2 class was going to be missing a couple students that are attending a Model UN conference, so I figured it would be a good time to try something different.

We started with the following warm-up problems:

Mr. Weinberg tells you we are guaranteed to have a quiz one of the days between Monday and Friday. He tells you that the probabilities of the quiz happening Monday through Thursday are 0.1, 3/8, 1/16, and 36%. What is the probability that the quiz will be on Friday? On which day is the quiz most likely to occur?

This helped review the total probability principle which is key to understanding the localization algorithm. We also did a review of finding the probability of compound independent events, first with a tree diagram, and then using multiplication and the counting principle.
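The warm-up comes down to the fact that the five probabilities must sum to 1. In code:

```python
days = ["Mon", "Tue", "Wed", "Thu"]
probs = [0.1, 3 / 8, 1 / 16, 0.36]         # the given Mon-Thu probabilities

p_friday = 1 - sum(probs)                  # the five days must total 1
print("P(Friday) =", round(p_friday, 4))   # 0.1025

all_days = days + ["Fri"]
all_probs = probs + [p_friday]
most_likely = all_days[all_probs.index(max(all_probs))]
print("Most likely day:", most_likely)     # Tue, since 3/8 is the largest
```

The mixed formats (decimal, fraction, percent) are part of the point: students have to convert everything to a common representation before they can add.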

We then went through the following activity for the rest of the period:
Robot Localization activity

I adapted parts of the course material provided by Udacity, primarily simplifying language, cleaning up diagrams, and adjusting the activities for my students, who do not have any programming background. We did have a Python activity back in October, but installing and running Python was a hassle on the 1:1 MacBooks running OS X, since I was trying to do it with Python 3 and IDLE. It was only shortly afterward that I learned that an earlier version of Python comes preinstalled. Oops. For this activity, we used http://repl.it/ to do the programming. This worked fantastically well.

The students seemed to do really well with the introductory material and filling things in, and modifying the basic programming went smoothly. They ran into some trouble around problem 7, which I half expected - that was the first part of the activity when I told them to do something without any rationale behind it. Most were generally able to implement the procedure and get to problem 9, but at this point at the end of the day on a Friday afternoon, fatigue started to take over. This was after around 45 minutes of working on the activity.

I added a section on motion for possible use in another class, as I ultimately would like students to be able to throw my own homework solution code into a simulator provided by Udacity user Anna Chiara. I did not deal with any of the sensor or movement probabilities. The intuition for how those apply in the algorithm is a bit subtle for my students' background, and would take more of an investment of time than I think they have the patience for at this stage. I think it would be easier to talk about how these issues exist, and then have them observe what they mean by looking at the output of the program.
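To give a sense of what the students were building toward, here is a stripped-down version of the sense step of the localization algorithm in the style Udacity presents it: start with a uniform belief over a strip of colored cells, weight each cell by how well it matches a sensor reading, then renormalize. The world and the 0.9/0.1 sensor reliability below are illustrative values, not the actual activity's:

```python
world = ["green", "red", "red", "green", "green"]
p = [1 / len(world)] * len(world)           # uniform prior: no idea where we are

def sense(p, world, measurement, p_hit=0.9, p_miss=0.1):
    """Update the belief distribution after one sensor measurement."""
    q = [prob * (p_hit if cell == measurement else p_miss)
         for prob, cell in zip(p, world)]
    total = sum(q)                          # total probability, used to renormalize
    return [val / total for val in q]

p = sense(p, world, "red")
print([round(v, 3) for v in p])             # [0.048, 0.429, 0.429, 0.048, 0.048]
```

The renormalization step is exactly where the total probability principle from the warm-up earns its keep, which is why we reviewed it first.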

All in all, it was a cool, low-key way to share my own learning with students after an exhausting week. I think we all needed a bit of a change.

My tutor's name is Geogebra CAS.

When I first started teaching, I learned that the best thing for students to do after factoring a trinomial was to check by multiplying out the binomials. At the time, it naively made total sense - students don't even need me to be there to practice! They can do this on their own while sitting on the subway or waiting for the bus - whatever dead time they have. The students that need to practice factoring can do as much of this as they need until they can factor with some degree of automaticity.

Some (not all) students took my advice. Of those that did, I often saw stuff like this:

x² - 4 = (x - 2)(x - 2)
       = x² - 2x + 2x - 4 = x² - 4 ✓

This was a worse situation than where we started - not only were they factoring incorrectly, but their inability to multiply binomials was giving them the false idea that they were doing a good job of factoring! This frustrated me to no end - even if I did give students time during class to practice and develop these skills, what could I tell them to do to improve outside of class? One colleague considered no longer assigning homework because he saw it repeatedly reinforcing student errors. I didn't go that far, but I did start grading homework to try to find mistakes.

The missing piece for these students is the lack of useful and correct feedback. Most of them learned the procedures, but made arithmetic or careless errors such as leaving out terms when simplifying. Without any correct data to make decisions on, these students were just going through a procedure and generating incorrect results, and using the incorrect results to validate an incorrect procedure. If they had a way to generate correct feedback, this experience would stop being worthless and instead become a useful method for developing student skills!

This is where CAS systems come in - Wolfram Alpha is nice, but Geogebra CAS is even better because of its speed. I worked with a student that needed practice both in simplifying polynomial expressions and factoring polynomials completely. This is what I had him do while he sat with me:

  • Make up a pair of binomials of the form (x - 5)(4x - 5), multiply them, and then find the quadratic and linear coefficients. When you are ready, use the Simplify[] command to check your answer.
  • Make up a product of polynomials of the form 4x^2(x+5)(2x-5) . Multiply it out all the way on paper, and then check your result using the Simplify[] command.

After this step, we talked about how he could do this on his own and check his work. While we were sitting there, he made mistakes, but was able to catch them himself. He was the source of the problems, and was able to check and see if his final answers were correct. We then moved on to factoring practice:

  • Write out 15 products of binomials like (3x-1)(x+5). For some of them, add a monomial factor. Include a couple of sum-and-difference products as well. Multiply any three of them out manually and check using Simplify[].
  • Use Geogebra to multiply any ten of the rest of them, and write down the resulting polynomials on a separate sheet of paper.
  • Eat dinner, watch TV, or something that has nothing to do with factoring.
  • Return to the paper and factor the ten polynomials you wrote down completely. Use Factor[] to check and make sure your final answers match what Geogebra produces - if there are differences, check to see if you have actually factored completely or not. Make a note of any repeat mistakes.
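The same checking loop works anywhere Python runs: sympy's expand() and factor() can stand in for Geogebra's Simplify[] and Factor[]. This substitution is mine, not part of the routine above, but it gives the same immediate, correct feedback:

```python
from sympy import symbols, expand, factor

x = symbols("x")

# The student multiplies (3x - 1)(x + 5) by hand; expand() is the check.
hand_multiplied = 3*x**2 + 14*x - 5
assert expand((3*x - 1)*(x + 5)) == hand_multiplied

# Going the other way, factor() checks a hand-factored polynomial.
print(factor(3*x**2 + 14*x - 5))
```

Either way, the point is the same: the student generates the problems, and the CAS supplies feedback that doesn't depend on the very skill being practiced.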

There is a whole lot of extra busy work involved in this process, but part of that is deliberate: it's easy to factor a polynomial that you generated moments before if you still remember the factors. For some students, this won't matter, but it helps ensure that the exercises generated are actually useful. This student was on fire during class today, even though we were looking at a different topic entirely. I should have asked him directly whether this is the case, but perhaps the boost of confidence this process gave him is part of the reason. I also really like that this method allows the student to simultaneously work on multiplying polynomials and factoring them. My method beforehand would have been to stick to multiplying, then factoring, and then mix them up - there's no reason to do it that way.

Computer algebra has been around for a while. The reason I think it's now to the point where it can be transformational is that it's easily accessible, easy to use, and almost instant. This idea of using technology (and particularly Geogebra) to help students develop their pencil and paper skills is one that really excites me. I'm excited to see if it works with the students that came in a bit behind but are willing to put in the time to catch up. I don't want my class time to be spent learning algorithms - that defeats my strong belief that we should focus on teaching mathematical thinking, modeling, and problem formulation instead of algorithms. That said, students do need to be able to develop their skills, and this offers a personalized way to help them do so on their own.

 

Impressing the parents with Wolfram Alpha...it's for your own good, kids.

I received a few emails from parents recently wondering how to help their children get better in math. Parents often apologize for not being strong at math themselves, and the students, in my case all teenagers, have trouble communicating with parents about, well, a lot of things, let alone math. Creating a genuine way for children and parents to communicate with each other about math has always been difficult. Thankfully, tools like Wolfram Alpha can come to the rescue.

Here is the advice I gave one parent this week whose child is learning to factor quadratic trinomials:


Think of four numbers, keep them between 1 and 8. For example, 2, 1, 5, 7.

You can then write them like this: (2x+1)(5x+7) or make some negative: (2x - 1)(5x+7).

Go to Wolfram Alpha, and in the main input bar, type what you wrote, as shown below:

A page will load with a part that looks like this, a bit of the way down the page:

Give the top one (10x^2+9x-7) to him, and ask him to factor it. A groan at this point is natural. And then he will remember how to do it. The final answer should be the same as what you entered into the website. You can come up with new numbers and do this as much as you want - it will only make him stronger. If he has trouble, make the 1st and 3rd numbers you pick 1, and it will simplify things a bit.

Yes, it will result in at least some expression of teenage 'come-on-mom/dad-ery'. But that's probably going to happen anyway, right?