# Projectile Motion with Python, Desmos, and Monte Carlo Simulation

I've written about my backwards approach to projectile motion previously here, here, and here.

I had students solving the warm-up problem to that first lesson, which goes like this:

A student is at one end of a basketball court. He wants to throw a basketball into the hoop at the opposite end.

What information do you need to model this situation using the Geogebra model? Write down [______] = on your paper for any values you need to know to solve it using the model, and Mr. Weinberg will give you any information he has.
Find a possible model in Geogebra that works for solving this problem.
What is the minimum speed at which he could throw the ball in order to get it into the hoop?

The students did what they usually do with the Geogebra projectile motion model and solved it with some interesting methods. One student lowered the hoop to the floor. Another started with a 45 degree angle, and then increased the speed successively until the ball made it into the hoop. Good stuff.

A student's comment about making lots of guesses here got me thinking about finding solutions more algorithmically. I've been looking for new ways to play around with genetic algorithms and Monte Carlo methods since they are essentially guess and check procedures made productive by the power of the computer.

I wrote a Python program that does the following:

• Get information about the initial characteristics of the projectile and the desired final location.
• Make a large number of projectiles (guesses) with random values for angle and initial speed within a specified range.
• Calculate the ending position of all of the projectiles, and sort them by how far they end up from the desired target.
• Take the twenty projectiles with the least error, and use these values to define the initial values for a new, large number of projectiles.
• Repeat until the error doesn't change much between runs.
• Report the projectile at the end with the least error.
• Repeat the entire procedure a number of times to see how consistent the 'best' answer is.
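The procedure above can be sketched in a few lines of Python. This is a simplified stand-in for the posted program: the error metric, parameter ranges, and jitter sizes here are my own assumptions, not the values in the actual code.

```python
import math
import random

def landing_error(v, theta_deg, target_x, target_y, g=9.81):
    """Distance between the target and where the projectile crosses the
    target height on the way down (a hypothetical error metric)."""
    theta = math.radians(theta_deg)
    vx, vy = v * math.cos(theta), v * math.sin(theta)
    disc = vy ** 2 - 2 * g * target_y   # from y(t) = vy*t - g*t^2/2 = target_y
    if disc < 0:
        return float('inf')             # never reaches the target height
    t = (vy + math.sqrt(disc)) / g      # later of the two crossing times
    return abs(vx * t - target_x)

def monte_carlo_search(target_x, target_y, generations=30, pool=200, keep=20):
    """Guess-and-check made productive: random guesses, keep the best
    twenty, jitter them into a new pool, and repeat."""
    guesses = [(random.uniform(1, 30), random.uniform(5, 85))
               for _ in range(pool)]
    for _ in range(generations):
        guesses.sort(key=lambda g_: landing_error(g_[0], g_[1], target_x, target_y))
        survivors = guesses[:keep]
        # Breed a new pool by jittering the survivors slightly
        guesses = [(v + random.gauss(0, 0.5), th + random.gauss(0, 2.0))
                   for v, th in survivors
                   for _ in range(pool // keep)]
    guesses.sort(key=lambda g_: landing_error(g_[0], g_[1], target_x, target_y))
    return guesses[0]
```

The jitter standard deviations control how quickly the guesses settle down; larger values explore more, smaller values refine faster.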

I've posted the code for this here at Github.

As a final step, I have the program output commands to graph the resulting projectile paths in Desmos. Pasting the result into the browser console while a Desmos calculator page is open makes a nice graph of each generated projectile and its intersection with the desired target:
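The command generation might look something like this (a sketch, not the actual script; `Calc` is the global calculator object Desmos exposes on its calculator pages, and the `parametricDomain` option is an assumption about the API version in use):

```python
import math

def desmos_commands(projectiles, g=9.81):
    """Build Calc.setExpression(...) lines, one per (speed, angle) pair,
    to paste into the browser console on an open Desmos calculator page."""
    lines = []
    for i, (v, theta_deg) in enumerate(projectiles):
        theta = math.radians(theta_deg)
        vx, vy = v * math.cos(theta), v * math.sin(theta)
        t_flight = 2 * vy / g  # time to return to launch height
        # Parametric path (x(t), y(t)); double backslashes survive the JS string
        latex = r'\\left({:.2f}t,{:.2f}t-{:.3f}t^2\\right)'.format(vx, vy, g / 2)
        lines.append(
            'Calc.setExpression({{id:"p{}", latex:"{}", '
            'parametricDomain:{{min:"0", max:"{:.2f}"}}}});'.format(
                i, latex, t_flight))
    return '\n'.join(lines)
```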

This is also on a live Desmos page here.

This shows that there is a range of possible answers, which is something I told my physics class based on their own solutions to the problem. Having a way to show (rather than tell) is always the better option.

I also like that I can change the nature of the answers I get if I adjust the way answers are sorted. This line in the code chooses how the projectile guesses are sorted by minimizing error:

`self.ordered = sorted(self.array, key=lambda x: abs(x.error))`

If I change this to instead sort by the sum of error and the initial speed of the projectile, I get answers that are much closer to each other, and to the minimum speed necessary to hit the target:
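The effect is easy to see on a toy example (the `Guess` tuple here is a stand-in for the projectile objects in the actual code, whose attribute names I am assuming):

```python
from collections import namedtuple

# Hypothetical stand-in for a projectile guess
Guess = namedtuple('Guess', ['speed', 'error'])

guesses = [Guess(12.0, 0.3), Guess(9.5, 0.4), Guess(15.0, 0.1)]

# Sorting by error alone prefers the fast, accurate throw
by_error = sorted(guesses, key=lambda x: abs(x.error))

# Adding speed to the key trades a little accuracy for a slower throw,
# pulling answers toward the minimum speed that hits the target
by_error_and_speed = sorted(guesses, key=lambda x: abs(x.error) + x.speed)
```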

Fun stuff all around.

# Coding For The Classroom: SubmitMe

For more than a year now, my process for sharing student work has involved walking around the class, snapping pictures on my phone, and uploading the results through a web page to my laptop. It's a lot smoother than using a document camera, and it also lets students upload pictures of their own work if they want to, or if I ask them to. Because the tool is accessed through a web page hosted locally on my laptop in the classroom, it's also faster than a native iOS or Android application.

I've written about my use of this tool before, so this is more of an update than anything else. I have cleaned up the code to make it easier for anyone to run this on their own computers. You can download a ZIP file of the code and program here:
submitMe

Unzip the file somewhere convenient on your computer, and make a note of where that is. You need a Python interpreter installed for this to run, so make sure you get that downloaded and working first. If you have a Mac, you already have one.

Here's what you need to do:

1. Edit the submit.py file in the directory containing the uncompressed files using a text editor.
2. Change the address in the line with HOST to match the IP address of your computer. You can obtain this in Network Preferences.
3. Change the root_path line to match the directory containing the uncompressed files. In the zip file, the line refers to where I have these files on my own computer. These files are located in the /Users/weinbergmath/Sites/submitMePortable directory. This needs to be the absolute address on your file system.
4. Run the submit.py file using Python. If you are on a Mac, you can do this by opening a terminal using Spotlight, navigating to the directory containing these files, and typing `python submit.py`. Depending on your firewall settings, you might need to select 'Allow' if a window pops up asking for permission for the Python application.
5. In a web browser, enter the IP address you set in Step 2, followed by port 9000 (example: http://192.168.0.172:9000). This is how students will access the page on their computers, phones, or tablets. Anyone on the same WiFi network should be able to reach it.
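For reference, the lines from steps 2 and 3 might look something like this (the values are examples only; your IP address and path will differ, and the exact layout of submit.py may not match):

```python
# Hypothetical configuration lines near the top of submit.py
HOST = '192.168.0.172'                          # your computer's IP address
root_path = '/Users/yourname/submitMePortable'  # absolute path to the unzipped files
```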

That should be it. As students upload images, they will be located in the /images directory where you unzipped the files. You can browse these using Finder or the File Browser. I paste these into my class notes for use and discussion with students.

Let me know if you need any help making this work for you. If needed, I can throw together a screen cast at some point to make it more obvious how to set this up.

# Math Caching and Immediately Useful Teaching Data

Last July, I posted a video in which I showed how to create a local, customized version of the Math Caching activity that can be found here.

I was inspired to revisit the idea last weekend after reading Dan Meyer's post about teacher dashboards. The part that got me thinking, and that stoked the fire that has been going in my head for a while, is identifying the information that is most useful to teachers. There are common errors that an experienced teacher knows to expect, but that a new teacher may not recognize as common until it is too late. Getting a measure of wrong answers, and more importantly, the origin of those wrong answers, is where we should ideally be making the most of the technology in our (and the students') hands. Anything that streamlines the process of getting a teacher to see the details of what students are doing incorrectly (and not just that they are getting something wrong) is valuable. The only way I get this information is by looking at student work, so I need to get my hands on student responses as quickly as I can to make sense of what they are thinking.

As we were closing in on the end of an algebra review unit with the ninth graders this week, I realized that the math cache concept was good fun and, at a minimum, was a remastering of the review sheet for a one-to-one laptop classroom. I came up with a number of questions and loaded them into the Python program. When one of my Calculus students stopped in to chat, I showed her what I had put together and told her I was thinking of adding a step where students had to upload a screenshot of their written work in addition to entering their answer into the location box. She stared at me and said flatly: 'You absolutely have to do that. They'll cheat otherwise.'

While I was a bit more optimistic, I'm glad that I took the extra time to add an upload button on the page. I configured the program so that each image that was uploaded was also labeled with the answer that the student entered into the box. This way, given that I knew what the correct answers were, I knew which images I might want to look at to know what students were getting wrong.
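The labeling step might be sketched like this (a stand-in for the actual upload handler; the function name and the filename scheme are invented):

```python
import os
import re
import time

def labeled_filename(answer, original_name, when=None):
    """Prefix a student's typed answer (sanitized) onto their uploaded
    image's name, so answers that don't match the key stand out when
    browsing the image directory."""
    # Keep only filesystem-safe characters from the answer
    safe = re.sub(r'[^A-Za-z0-9_-]+', '_', answer.strip()) or 'blank'
    stamp = int(when if when is not None else time.time())
    return '{}_{}_{}'.format(safe, stamp, os.path.basename(original_name))
```

With names like `no_solution_1367812345_IMG_001.jpg` in the directory, scanning for the images worth a closer look takes seconds.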

This was pure gold.

Material like this was quickly filling up the image directory, and I watched it happening. I immediately knew which students I needed to have a conversation with. The answers ranged from 'no solution' to 'identity' to 'x = 0' and I instantly had material to start a conversation with the class. Furthermore, I didn't need to throw out the tragically predictable 'who wants to share their work' to a class of students that don't tend to want to share for all sorts of valid reasons. I didn't have to cold call a student to reluctantly show what he or she did for the problem. I had their work and could hand pick what I wanted to share with the class while maintaining their anonymity. We could quickly look at multiple students' work and talk about the positive aspects of each one, while highlighting ways to make it even better.

In this problem, we had a fantastic discussion about communicating both reasoning and process:

The next step that I'd like to make is to have this process of seeing all of the responses be even more transparent. I'd like to see student work popping up in a gallery that I can browse and choose certain responses to share with the class. Another option to pursue is to get students to see the responses of their peers and offer advice.

Automatic grading certainly makes the job of answering the right/wrong question much easier. Sometimes a student does need to know whether an answer is correct or not. Given all the ways that a student could game the system (some students did discuss using Wolfram Alpha during the activity), the informative part on the teaching and assessment end is seeing the work itself. This is also an easy source of material for discussion with other teachers about student work (such as with Michael Pershan's Math Mistakes).

I was blown away with how my crude hack to add this feature this morning made the class period a much richer opportunity to get students sharing and talking about their work. Now I'm excited to work on the next iteration of this idea.

# Rethinking the headache of reassessments with Python

One of the challenges I've faced in doing reassessments since starting Standards Based Grading (SBG) is dealing with the mechanics of delivering those reassessments. Though others have come up with brilliant ways of making these happen, the design problem I see is this:

• The printer is a walk down the hall from my classroom, requires an ID swipe, and possibly the use of a paper cutter (in the case of multiple students being assessed).
• We are a 1:1 laptop school. Students also tend to have mobile devices on them most of the time.
• I want to deliver reassessments quickly so I can grade them and get them back to students immediately. Minutes later is good, same day is not great, and next day is pointless.
• The time required to generate a reassessment is non-zero, so there needs to be a way to scale for times when many students want to reassess at the same time. The end of the semester is quickly approaching, and I want things to run much more smoothly this semester in comparison to last.

I experimented last fall with having students run problem generators on their computers for this purpose, but there was still too much friction in the system. Students forgot how to run a Python script, got errors when they entered their answers incorrectly, and had scripts with varying levels of errors in them (and their problems) depending on when they downloaded their file. I've moved to a web form (thanks Kelly!) for requesting reassessments the day before, which helps me plan ahead a bit, but I still find it takes more time than I think it should to put these together.

With my recent foray into web applications through the Bottle Python framework, I've finally been able to piece together a way to make this happen. Here's the basic outline for how I think I see this coming together - I'm putting it in writing to help make it happen.

• Phase 1 - Looking Good: Generate cleanly formatted web pages using a single page template for each quiz. Each page should be printable (if needed) and should allow for questions that either have images or are pure text. A function should connect a list of questions, standards, and answers to a dynamic URL. To ease grading, there should be a teacher mode that prints the answers on the page.
• Phase 2 - Database-Mania: Creation of multiple databases for both users and questions. This will enable each course to have its own database of questions to be used, sorted by standard or tag. A user can log in and the quiz page for a particular day will automatically appear - no emailing links or PDFs, or picking up prints from the copier will be necessary. Instead of connecting to a list of questions (as in phase 1) the program will instead request that list of question numbers from a database, and then generate the pages for students to use.
• Phase 3 - Randomization: This is the piece I figured out last fall, and it has a couple of components. The first is my desire to pick the standard a student will be quizzed on, and then have the program choose a question (or questions) from a pool related to that particular standard. This makes reassessments look different for different students. On top of this, I want some questions themselves to have randomized values so students can't say 'Oh, I know this one - the answer's 3/5'. They won't all be this way, and my experience doing this last fall helped me figure out which problems work best for this. With this, I would also have instant access to the answers with my special teacher mode.
• Phase 4 - Sharing: Not sure when/if this will happen, but I want a student to be able to take a screenshot of their work for a particular problem, upload it, and start a conversation about it with me or other students through a URL. This will also require a new database that links users, questions, and their work to each other. Capturing the conversation around the content is the key here - not a computerized checker that assigns a numerical score to the student by measuring % wrong, numbers of standards completed, etc.
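The randomization in Phase 3 might be sketched like this (the standards, templates, and value ranges are invented for illustration; they are not from the actual question database):

```python
import random

def make_question(standard, teacher_mode=False):
    """Fill a randomly valued question template for the given standard.
    teacher_mode appends the answer to the text for fast grading."""
    a = random.randint(2, 9)
    b = random.randint(1, 20)
    # Each template returns (question_text, answer); values are chosen
    # so the answer stays clean despite the randomization
    templates = {
        'linear-solving': lambda: (
            'Solve {}x + {} = {} for x.'.format(a, b, 4 * a + b), 4),
        'slope': lambda: (
            'Find the slope of the line through ({}, {}) and ({}, {}).'
            .format(1, b, 1 + a, b + 2 * a), 2),
    }
    text, answer = templates[standard]()
    if teacher_mode:
        text += '  [answer: {}]'.format(answer)
    return text, answer
```

Two students requesting the same standard get different numbers, but the teacher-mode page still shows the answer instantly.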

The bottom line is that I want to get to the conversation part of reassessment more quickly. I preach to my students time and time again that making mistakes and getting effective feedback is how you learn almost anything most efficiently. I can have a computer grade student work, but as others have repeatedly pointed out, work that can be graded by a computer is at the lower level of the continuum of understanding. I want to get past the right/wrong response (which is often all students care about) and get to the conversation that can happen along the way toward learning something new.

Today I tried my prototype of Phase 1 with students in my Geometry class. The pages all looked like this:

I had a number of students out for the AP Mandarin exam, so I had plenty of time to talk with the students who were there about their answers. It wasn't the standard process of taking quiz papers from students, grading them on the spot, and then scrambling to have conversations over the paper they had just written on. Instead, I sat with each student and had them show me what they did to get their answers. If they were correct, I sometimes chose to talk to them about it anyway, because I wanted to see how they did it. If they had a question wrong, it was easy to immediately talk about what they didn't understand.

Though this wasn't my goal at the beginning of the year, I've found that my technological and programming obsessions this year have focused on minimizing the paperwork side of this job and maximizing opportunities for students to get feedback on their work. I used to have students go up to the board and write out their work. Now I snap pictures on my phone and beam them to the projector through an Apple TV. I used to ask questions of the entire class on paper as an exit ticket, collect them, grade them, and give them back the next class. I'm now finding ways to do all of this electronically, almost instantly, and without requiring students to log in to a third-party website or use an arbitrary piece of hardware.

The central philosophy of computational thinking is using the strengths of computers to organize, iterate, and find patterns in order to solve problems. The more I push myself to identify my own weaknesses and inefficiencies, the more I am seeing how technology can make up for those negatives and help me focus on what I do best.

# Who’s gone overboard modeling w/ Python? Part II - Gravitation

I was working on orbits and gravitation with my AP Physics B students, and as has always been the case (including with me in high school), they were having trouble visualizing exactly what it meant for something to be in orbit. They did well calculating orbital speeds and periods as I asked them to do for solving problems, but they couldn't picture what being in orbit actually looks like. What happens when the object moves faster than the speed they calculated? Slower? How would it actually get into orbit in the first place?

Last year I made a Geogebra simulation that used Euler's method to generate the trajectory of a projectile using Newton's Law of Gravitation. While they were working on these problems, I was having trouble opening the simulation, and I realized it would be a simple task to write the simulation again using the Python knowledge I had developed since. I also used this to-scale diagram of the Earth-Moon system in Geogebra to help visualize the trajectory.

I quickly showed them what the trajectory looked like close to the surface of the Earth and then increased the launch velocity to show what would happen. I also showed them the line in the program that represented Newton's 2nd law - no big deal from their reaction, though my use of the direction cosines did take a bit of explanation as to why they needed to be there.
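The heart of such a simulation is only a few lines. This is a sketch of the approach described, not the posted program; the step size, step count, and stopping rule are my own choices:

```python
import math

G = 6.674e-11   # gravitational constant (N m^2 / kg^2)
M = 5.972e24    # mass of Earth (kg)
R = 6.371e6     # radius of Earth (m)

def simulate(x, y, vx, vy, dt=1.0, steps=20000):
    """Euler's-method trajectory under Newton's Law of Gravitation.
    Returns the list of (x, y) points; stops early on impact."""
    path = [(x, y)]
    for _ in range(steps):
        r = math.hypot(x, y)
        if r < R:
            break                           # crashed into the surface
        a = G * M / r ** 2                  # Newton's 2nd law: |a| = GM/r^2
        ax, ay = -a * x / r, -a * y / r     # direction cosines aim it at Earth's center
        vx, vy = vx + ax * dt, vy + ay * dt
        x, y = x + vx * dt, y + vy * dt
        path.append((x, y))
    return path
```

Starting at altitude 400 km with the circular-orbit speed $v = \sqrt{GM/r}$ directed perpendicular to the radius keeps the path on a near-circle; any other speed or direction makes the ellipse (or crash) immediately visible.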

I offered to let students show their proficiency on my orbital characteristics standard by using the program to generate an orbit with a period or altitude of my choice. I insist that they derive the formulae for orbital velocity or period from Newton's 2nd law every time, but I really like how adding the simulation as an option turns this into an exercise requiring a much higher level of understanding. That said, no students gave it a shot until this afternoon. A student had correctly calculated the orbital speed for a circular orbit, but was having trouble configuring the initial components of velocity and position to make this happen. The student realized that the velocity he calculated through Newton's 2nd law had to be vertical if the initial position was to the right of Earth, or horizontal if it was above it. Otherwise, the projectile would go in a straight line, reach a maximum position, and then crash right back into Earth.

The other part of why this numerical model served an interesting purpose in my class was inspired by Shawn Cornally's post about misconceptions surrounding gravitational potential and our friend mgh. I had also just watched an NBC Time Capsule episode about the moon landing and was wondering about the specifics of launching a rocket to the moon. I asked students how they thought it was done, and they really had no idea. They were working on another assignment during class, but while floating around looking at their work, I was also adjusting the initial conditions of my program to try to get an object that starts close to Earth to arrive in a lunar orbit.

Thinking about Shawn's post, I knew that getting an object out of Earth's orbit would require the object to reach escape velocity, and that this would certainly be too fast to work for a circular orbit around the moon. Getting the students to see this theoretically was not going to happen, particularly since we hadn't discussed gravitational potential energy with the regular physics students, not to mention that they had no intuition about things moving in orbit anyway.

I showed them the closest I could get without crashing:

One student immediately noticed that this did seem to be a case of moving too quickly. So we reduced the initial velocity in the x-direction by a bit, which resulted in this:

We talked about what this showed - the object was now moving too slowly and was falling back to Earth. After getting the object to dance just between the point of making it all the way to the moon (and then falling right past it) and slowing down before it ever got there, a student asked a key question:

Could you get it really close to the moon and then slow it down?

Bingo. I didn't get to adjust the model during the class period to do this, but by the next class, I had implemented a simple orbital insertion burn opposite to the object's velocity. You can see and try the code here at Github. The result? My first Earth - lunar orbit design. My mom was so proud.

The real power here is how quickly students developed intuition for some orbital mechanics concepts by seeing me play with this. Even better, they could play with the simulation themselves. They also saw that I was experimenting myself with this model and enjoying what I was figuring out along the way.

I think the idea that a program I design myself could result in surprising or unexpected output is a bit of a foreign concept to those that do not program. I think this helps establish for students that computation is a tool for modeling. It is a means to reaching a better understanding of our observations or ideas. It still requires a great amount of thought to interpret the results and to construct the model, and does not eliminate the need for theoretical work. I could guess and check my way to a circular orbit around Earth. With some insight on how gravity and circular motion function though, I can get the orbit right on the first try. Computation does not take away the opportunity for deep thinking. It is not about doing all the work for you. It instead broadens the possibilities for what we can do and explore in the comfort of our homes and classrooms.

# Automating conference scheduling using Python

I've always been interested in the process of matching large sets of data to a set of constraints - apparently the Nobel committee agreed this past week in awarding the economics prize. The person in charge of programming at my school in the Bronx managed to create an algorithm that generated a potential schedule for over four thousand students given student requests and needs. There was always some tweaking that needed to be done at the end to make it work, but the fact that the computer was able to start the process always amazed me. How do you teach a computer to do this sort of matching in an efficient way?

This has applications within my classroom as well - generating groups based on ability, conflicting personalities, location - all complex situations that require time and attention to do correctly. In the end, though, this is the same problem as arranging the schedules. It's easy to start with a random arrangement and then make adjustments based on experience. There has to be an automated approach that teaches the computer which placements work and which do not. Andy Rundquist does this using genetic algorithms - I must know more about how he does it, as this is another approach to this type of problem.

This became a more tangible challenge for me to attempt to solve last year when I saw that the head of school was doing the two days of parent-teacher conference scheduling by hand. This is a complex process given the following constraints he was working to fulfill:

• Parent preferences for morning/afternoon conference times.
• Consecutive conference times for parents that had siblings so that the amount of time parents had to wait around was minimized.
• Balanced schedules between the two days for all teachers.
• Teachers with children had breaks in their schedule to attend conferences of their children.

This was apparently a process of 4 - 5 hours that sometimes required starting over because he discovered that the schedule he had started putting together was over constrained and could not meet all requirements. During this process, however, he had figured out an algorithm for what was most likely to work. Schedule the families with the largest number of children first, and work down the list in order of decreasing size. Based on the distribution of younger vs. older children in the school, start by scheduling the youngest children in a family first, and move to the older ones. Save all families with single children for last.

Hearing him talk about this process was interesting and heartbreaking at the same time - he works incredibly hard on all aspects of his job, and I wanted to provide some way to reduce the requirements of at least this task on his schedule. I was also looking for a reason to really learn Python, so this challenge became my personal exercise in problem based learning.

It took a while to figure out all of the details, but I broke it down into stages. How do you input the family data based on how it is already stored by the front office? (I didn't want to ask the hard-working office staff to reformat the data to make it easier for me - this was supposed to make things easier on everyone.) How do you create a structure for storing this data in Python? How do you implement the algorithm that the head of school used and balance it with the idea of fairness and balance to all families and teachers?

Over the following few months, I was able to piece it together. It was, needless to say, a really interesting exercise. I learned how to ask the right questions that focused on the big picture needs of the administration, so that I could wrestle with the details of how to make it happen. The students learned that I was doing this ("Mr. Weinberg is using his robots to schedule conferences!") and a few wanted to know how it worked. I have posted the code here as a gist.

I put in more than the 4-5 hours required to do this by hand. It was a learning experience for me. It also paid serious dividends when we needed to schedule conferences again for this year. We wanted to change the schedule slightly to be one full day rather than two half days, and it was a simple task adjusting the program to do this. We wanted to change the times of conferences so that the lower and upper schools had different amounts of time for each, rather than being a uniform twenty minutes each. (This I was not able to figure out before we needed conferences to go out, but I see a simple way to do it now.)

The big question from the administration was about the upper school conferences. Last year we had seven different rooms for simultaneous conferences, and the question was whether we could reduce the number to five. I ran the program with five rooms a number of different times, and it was unable to find a working schedule, even with different arrangements and constraints. It was able to find one that worked with six rooms, though, which frees administrators from needing to be in individual conference rooms so that they can address issues that come up during the day. Answering that question would not have been possible if scheduling were done by hand.

The question of using computers to automate processes that are repetitive has been in my head all this year. I've come to recognize when I am doing something along these lines, and try to immediately switch into creating a tool in Python to do this automatically. On the plane during a class trip last week, we needed to arrange students into hotel rooms, so I wrote a program to do this. I used it this week to also arrange my Algebra 2 students in groups for class. Generating practice questions for students to use as reassessment? I always find myself scrambling to make questions and write them out by hand, but my quiz generator has been working really well for doing this. Yesterday I had my first day of generating quizzes based on individual student needs.

The students get a kick out of hearing me say that I wrote a Python program to do XXX or YYY, and their reactions certainly are worth the effort. I think it just makes sense to use programming solutions when they allow me to focus on more important things. I have even had some success with getting individual students to want to learn to do this themselves, but I'll write more about that later.

# Rethinking my linear function approach in Algebra 2

My treatment of linear functions in the past has been pretty traditional. Solve for y, write y = mx + b, graph using slope-intercept, then move on to linear inequalities in two variables... it is just dull this way. Most students have seen it before in one form or another, and it wasn't exciting (or that novel) to them the first time they learned it. It doesn't have to be this way, and I committed myself this year to doing things differently.

My approach has been centered on two big ideas:

1. Linear functions have a constant rate of change. All of the other qualities they have are related to this important fact.
2. There is an amazing connection between graphs, tables of values, and the equations that generate linear functions. These are not three separate skills, they are three views of the same fundamental mathematical object. Corollary: Teaching them on three separate days or sticking to one view at a time creates an unnecessary pigeon-holing effect that sticks with students for as long as conditions in your class permit.

On day one, we did my Robot Tracking activity posted here at GeogebraTube. The video introduction was reviewed in class and students worked on it for much of the period. This emphasized a fundamental concept around linear functions of distance and time that was pretty intuitive to nearly all of the students that did this activity.

Predicting where something is located, assuming it continues moving at a constant rate, is one of the most common applications of linearity. We do it all the time. Can we cross the street in front of the bus? Mental calculation. Where should I kick the soccer ball to get it right in front of the forward moving toward the goal? Mental calculation. I don't mean actually sitting down and calculating where it will be, but that the human brain is pretty good at noticing the velocity of objects and making a pretty good guess of where they will be. They had a number of methods of coming to an answer that ranged from geometric (simply drawing a line) to counting grid squares, using the trace function, and proportional reasoning.

We ended the period looking at the Python script I posted here and trying to calculate speed from the information generated by the program. Part of the homework assignment for the next class was to try to answer the question posed by another Python program posted here. The table of values is randomly determined each time, and students could (and often did) try it multiple times to get it right.
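The randomized-table idea might be sketched like this (the ranges and table size are invented; this is not the posted program):

```python
import random

def make_table(n=5):
    """Generate a random constant-rate (time, position) table plus a
    later time whose position the student must predict."""
    t0 = random.randint(0, 5)
    x0 = random.randint(-20, 20)
    rate = random.choice([-4, -3, -2, 2, 3, 4])   # constant rate of change
    dt = random.randint(1, 3)                     # spacing between rows
    table = [(t0 + i * dt, x0 + i * dt * rate) for i in range(n)]
    ask_t = t0 + (n + 2) * dt                     # a time just past the table
    answer = x0 + (n + 2) * dt * rate
    return table, ask_t, answer
```

Because the table regenerates on every run, students can retry as many times as they like without ever seeing the same numbers twice.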

The next lesson had a single instance of this program as a warm-up for the whole class - everyone had to agree on what value of position I needed to enter for the given time value. They were pretty good at checking each other and having good conversations about how to go about it. They answered correctly, but we had a good conversation about the different ways to get there. They all centered on using the fact that there was equal spacing between all of the points. Most students used some variation of finding the distance moved per second and whether it was positive or negative, and then counted off intervals. In most cases, it was a bit complicated and required a lot of accounting to get to their answer.

We went over the reason we could do this - the constant rate of change - and verified it using a few different pairs of points. I then threw in the idea of taking a known point (a, b) from the table and using the constant rate of change between it and a general point (x, y). We got to $\frac{y - b}{x - a}=m$ and I asked them to write this using the slope we calculated and any point they liked from the table of data. Students seated next to each other were encouraged to use different points. I then asked them to answer the original question from the Python program using their equation. (Un)surprisingly enough, they all ended up with the correct (and same) answer as before.
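As a concrete instance (with made-up numbers): taking $m = 3$ and the point $(2, 7)$ from a table gives $\frac{y - 7}{x - 2}=3$, so $y - 7 = 3(x - 2)$, which distributes to $y = 3x + 1$. A neighbor starting from a different point on the same line, say $(4, 13)$, gets $y - 13 = 3(x - 4)$, which simplifies to the same $y = 3x + 1$.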

Some of them started distributing and writing in slope-intercept form. The thing I was kind of excited about was that they didn't feel the equation had to be written that way - they just felt like seeing what happened. Many discovered that their answers were the same after doing so, even though they started with different points. We did a couple of examples of solving more basic "Write an equation for a line that..." questions, but did so without making a huge deal out of slope-intercept form or point-slope form and why one might be better than the other in different situations.

Today was the third day going through this concept; the warm-up activity had three levels to it.

The goal here was to constantly push the students to go back and forth between the equation and numerical representations of these functions. There were lots of good things students figured out from these. We then made the jump to looking at how the graph is connected to the table and the equation - just one more way of looking at the same mathematical function, and it shares the meaning that comes with the other two representations: a constant rate of change. The new idea introduced as part of this was that of an intercept. What does it mean on the graph? What does it mean for the table? We didn't talk explicitly about the intercept's meaning in the equation (again, trying to avoid the "that's just y = mx + b, I know this already...TUNED OUT" response), but it came out in the process of identifying it algebraically, from tables, and then graphing.

By the end of the period, we were graphing linear functions. Students were asking excellent questions about when the intercepts alone can be used to graph the line and when they can't ($2x+3y=6$ versus $2x+3y=7$), but they again stuck to the idea of finding a point they know is on the graph and then using the constant rate of change to find others. Instead of spending a boring lesson explicitly telling them my expectations for graphing lines (labeled and scaled axes, a line going all the way across the extent of the axes, arrows on axes and lines), I was able to gently nudge students to do this while they worked.

We'll see how things go as we continue to move forward. The big thing I like about this progression so far is that modeling real phenomena will be a natural extension of what we've already done - not a lesson at the end of contrived examples with clean numbers. My original goal was to get this group comfortable with messy data and with using different tools to make sense of it.

I've kept my students hermetically sealed from this messiness in the past - integer coefficients, integer values, and explicit step-by-step ways of graphing, generating tables, and writing equations. As I mentioned before, it was, well, boring and predictable, and it perpetuated the idea that these skills are all separate from each other. It also continued the pattern that there would be one day in each unit where the numbers were messy - the real-world word-problems day - but the pain associated with it would last a day and be over soon enough.

I'm hoping to reduce this effect by changing my approach. By seeing the different aspects of linear functions together, it should seem natural to use a graph to figure out something that might not make sense algebraically, or to use numerical values to solve an algebraic problem. I especially like this because exploring the three views of functions really is, in my opinion, the primary learning goal of the Algebra 2 course. If I can establish this as an expectation early on, I think the later parts of the course will run much more smoothly.

# Progress on a Python-powered randomized quiz generator

One of the projects floating around in my head since the end of last year is an easy-to-use tool that automatically generates questions for students to test their skills, either on their own or in class. My first attempt at this was during a Geometry unit on translations, which was also my first attempt at implementing standards-based grading. I was taking a Udacity course on web applications and realized that if I could write a quiz generator online, it would be the easiest way to give students a sense of how they were doing without needing me to be part of the process.

As tends to happen with people doing reassessments, I was a bit overwhelmed with the paperwork side of things, especially because many of the students just wanted to know whether they were right or not. I had made some Python programs to generate quiz questions one by one and decided to try to adapt them to the web application so students could input their answers to questions that had different numbers every time. I had tried other options such as PollEverywhere and Socrative to at least collect the data and check it as right/wrong (which would have been good enough for a start in terms of collecting data, but left out the randomization part). The problem with these is that I believe they are hosted in the US and are incredibly slow here without a VPN. I needed a solution that was fast, and if I could add the randomization, that would be even better. I decided to adapt my quiz generator to a web application hosted on Google App Engine.

Needless to say (at least for me), this was not an easy task. I had only a loose understanding of how to manage GET and POST requests and how to use cookies to store the random values used. The biggest challenge came from checking answers on the server side. For someone figuring out Python concepts as he goes, it involved a lot of fists on the keyboard. My attempt is posted here. There were tons of bugs (and still are), but I at least got up the nerve to try it in class. The morning I was excited to premiere it, I also found out another interestingly infuriating nugget of info: Google App Engine is blocked in China.

I gave up at the time, as it was almost summer. I was interested in helping out with the development of the Physics Problem Database project over the summer, but opportunities for sitting down and coding during a whirlwind tour of the US seeing friends and family weren't that numerous. It's amazing to see how John, Andy, and others have gotten the database site together and doing functionally cool things in such a short amount of time. I spent some time over the summer learning PHP and MySQL, but was pulled back into Python when I saw the capabilities of webpy and web2py for building applications. I see a lot of features and possibility there, but fitting my ideas to that framework is beyond what I know how to do and what I have been able to figure out during my time prepping for and starting school. That will come later.

I keep coming back to the fact that randomization needs to be built into the program interface from the beginning. I want students that need practice to be able to get it with different problems each time, because that frees them from needing me to be there to either generate the problems myself or stop them from creating impossible ones. I want the reassessment process to be as simple as possible, and for the lowest-level skills, students don't necessarily need me testing them in person. That's what in-person interviews and conversations (including those through BlueHarvest) are for. I won't rely on a tool like this to check proficiency, but it's a start for giving students a tool that gets them thinking along those lines.

I've had the structure for how to do this in my head for a while, and I started sketching out what it would be in a new Python program last week. This morning, after learning a bit more about the newer string formatting options in Python that offer more options than basic string substitution, I hunkered down and put together what is at least a workable version of what I want to do.

Please visit here to see the code, and here to give it a shot on repl.it.

The basic structure is that every question can use random integers, an irrational decimal value, or signed integers in its text. With all of the messiness of the methods that generate and substitute the random numbers hidden inside the Question class, it is fairly easy to generate questions with random values and answers. I admit that the formatting stinks, but the structure is there. I could theoretically make some questions for students this way that could be used on Monday, but I probably won't just yet. I think a nap is in order.
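The real class lives in the linked code; as an illustration of the idea only, a stripped-down sketch (these names are hypothetical, not the actual interface) might look like this:

```python
import random

class Question:
    """Stripped-down sketch: substitute random values into templated text."""
    def __init__(self, template, answer_func, low=1, high=10):
        self.template = template        # e.g. "What is {a} + {b}?"
        self.answer_func = answer_func  # computes the answer from the values
        self.low, self.high = low, high

    def generate(self):
        # Pick fresh random integers for each placeholder in the template
        values = {name: random.randint(self.low, self.high)
                  for name in ('a', 'b')}
        text = self.template.format(**values)  # newer string formatting
        answer = self.answer_func(**values)
        return text, answer

q = Question("What is {a} + {b}?", lambda a, b: a + b)
text, answer = q.generate()
print(text, "->", answer)
```

Every call to generate() produces a new version of the question, which is the whole point: a student can practice the same skill repeatedly without ever seeing identical numbers.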

Next steps:

• I need to work on the answer-checking algorithm. At the moment it just checks whether an entered decimal answer is within a certain tolerance of the calculated answer. My plan is to expand the Question definition to include another input variable for question type. Single numerical answers are one question type, coordinates are another, and symbolic equations or expressions are yet another I'd like to include. Based on the question type, the answer method in the Question class can be adjusted.
• As an extension to this, I'd like to include sympy for both question generation and answer checking. It has the ability to show that two symbolic expressions are equal to each other, among many other really nice capabilities. This will let me generate all sorts of nice Calculus and algebraic manipulation questions without too much difficulty.
• I'd like to be able to format things nicely for algebraic questions, and possibly generate graphical questions as well.
• The ultimate goal is to then get this nicely embedded as a web application. As I mentioned before, there is too much going on in the web2py framework for me to really get how to do this, but I think this is something I can do with a bit of help from the right sources.
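The tolerance comparison in the first bullet is simple enough to sketch (the helper name here is mine, not from the actual code):

```python
def check_numeric(entered, calculated, tolerance=0.01):
    """Accept an answer if it lands within tolerance of the computed value."""
    try:
        return abs(float(entered) - calculated) <= tolerance
    except ValueError:
        return False  # non-numeric input is simply wrong

print(check_numeric("0.333", 1/3))  # close enough to pass
print(check_numeric("0.5", 1/3))    # too far off
print(check_numeric("abc", 1/3))    # not a number at all
```

A symbolic question type would replace the float comparison with something like sympy's ability to show two expressions are equivalent, which is why separating the checking logic by question type seems like the right structure.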

I'm having a ball learning all of this, and to know that it will eventually make for a nice learning tool that students will benefit from using is a nice incentive for doing it.

# Winning the battle over Python programming

Two stories to share after this week's activities with students about programming. I have posted previously about my interest in making Python a fundamental part of my classes this year, and so I am finding ways to include it when it makes sense to do so.

I have a couple of students that are bridging the gap between Algebra 2 and Precalculus with an independent study that I get to design. The tentative title of the course for their transcript is 'Fundamentals of Mathematical Thinking', and the overall goal is to give these students a chance to develop the fundamental skills they need to be successful in later classes. I see it as an opportunity to really dig in to some cool mathematical ideas and get them to, well, dig into the fundamentals of mathematical thinking. I don't plan much emphasis on algorithms (though we will spend some time working on skills in algebra, polynomial manipulation, functions, and other crucial topics where they are weak). Instead, the focus is on looking at a situation, exploring how different variables might be used to model it, and then really digging in to abstract those variables into a model.

We are starting with what I think is the most fundamental application of this: sequences and series. Even simpler, the first task I gave the students was to look at the number of bricks in the rows of a triangular tower and use Python to add up the bricks in each row. This started as a couple of exercises getting to know Python's syntax. They are now taking programs I wrote to model this problem and adjusting them to find other sums, including the sums of even and odd numbers. One student who completed this task was intrigued that the sums of the odd numbers were all perfect squares, but we didn't explore it any further at this point.
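A sketch of the kind of program involved (my illustration, not the students' exact code):

```python
# Sum the bricks in a triangular tower: row n has n bricks.
total = 0
for row in range(1, 11):
    total += row
print("Bricks in a 10-row tower:", total)  # 1 + 2 + ... + 10 = 55

# The same loop adapted to sum odd numbers - the running totals
# turn out to be perfect squares: 1, 4, 9, 16, 25, ...
odd_total = 0
for n in range(1, 11, 2):
    odd_total += n
    print(n, odd_total)
```

The adaptation from one sum to the other is a one-line change to the range, which is exactly the kind of small modification that makes this a good first programming exercise.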

I then gave this student a bunch of sequences. His task was simple: model each one in Python and generate the given terms. This is a standard exercise for Algebra and Precalculus students by hand, but I figured that if he could do this with Python, clearly he was able to figure out the pattern. I showed him how to write fractions using string concatenation (e.g. writing 1/3 as str(1) + "/" + str(3)), which enabled him to develop the harmonic series. Today he figured out Fibonacci and a couple of other new ones. It was really fascinating to see him mess around and think deeply about the patterns associated with each one. I did nudge him slightly in the right direction with Fibonacci, but I have otherwise been hands off. I am also having him write about his work to give him opportunities to work on his writing too. When he feels comfortable sharing it (and I have already warned him that this is the plan), I will post links to his work here.
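A sketch of what these sequence programs can look like (my reconstruction, not the student's code):

```python
# Harmonic series terms as fraction strings, via string concatenation
terms = []
for n in range(1, 6):
    terms.append(str(1) + "/" + str(n))
print(terms)  # ['1/1', '1/2', '1/3', '1/4', '1/5']

# Fibonacci: each new term is the sum of the previous two
a, b = 1, 1
fib = [a, b]
for _ in range(6):
    a, b = b, a + b
    fib.append(b)
print(fib)  # [1, 1, 2, 3, 5, 8, 13, 21]
```

The Fibonacci version is the interesting one pedagogically: unlike the other sequences, each term depends on earlier terms, so modeling it forces a student to think about what state the loop has to carry along.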

The other new thing was in Calculus. I have shortened my review of Pre-Calculus concepts substantially, and have made the first unit a survey of limits, rate of change, and definite integrals. Most of this has required technology to explore local linearity and difference quotients. On Thursday, I introduced using rectangular sums to find area - they were otherwise stuck on counting boxes, and I could tell they felt it was like baby math. They really didn't know any other way.

In showing them rectangular sums, we had some pretty good discussions about overestimating and underestimating. The students had conversations about how rough an approximation only 3 - 5 rectangles gave for the area under a parabola. A couple of them figured out how to use more rectangles. I told them I was going to write a program to do this while they were sitting and working. I created this program and talked them through how it works. They thought it was too complicated to be worth the time, but I think they did understand the basic idea. I then changed the value of N and asked them what they thought that meant. They got it right the first time. I then pushed N to higher and higher values, and they immediately saw that the sum was approaching a limit. Game, set, match.
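My actual program is linked above; the core idea fits in a few lines. A sketch using left-endpoint rectangles under a parabola as an illustration:

```python
def f(x):
    return x**2

def rectangle_sum(a, b, N):
    """Left-endpoint rectangular approximation of the area under f on [a, b]."""
    dx = (b - a) / N
    total = 0.0
    for i in range(N):
        total += f(a + i * dx) * dx  # height at left edge times width
    return total

# Raising N pushes the sum toward the exact area (1/3 on [0, 1])
for N in (5, 50, 5000):
    print(N, rectangle_sum(0, 1, N))
```

Running this with increasing N is the whole demonstration: the printed values creep toward 1/3, which is the limit the students spotted immediately.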

Today I had the AP students together working on another definite integral activity that focused on the trapezoidal rule. I showed them the code again and gave them the line that calculates area. It wasn't too much of a stretch for them to work their way to adjusting the program for the trapezoidal rule. We ran out of time to discuss comparisons between the two programs, but they stayed late after class and into their lunch getting it working on their own computers and playing a bit. Here is what we came up with.
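The rectangle-to-trapezoid adjustment amounts to averaging the heights at the two ends of each slice. A sketch of the idea (my illustration, not the exact class code):

```python
def f(x):
    return x**2

def trapezoid_sum(a, b, N):
    """Trapezoidal approximation: average the heights at each slice's ends."""
    dx = (b - a) / N
    total = 0.0
    for i in range(N):
        left = a + i * dx
        right = a + (i + 1) * dx
        total += (f(left) + f(right)) / 2 * dx  # trapezoid area for this slice
    return total

print(trapezoid_sum(0, 1, 100))  # close to the exact area of 1/3
```

For the same N, the trapezoid version is noticeably closer to the true area than the rectangle version, which makes for a nice comparison discussion when there's time for one.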

The big battle I see is two-fold.

• Help students not be intimidated by the idea of writing a program to do repetitive calculations.
• Give students opportunities to see it as necessary and productive to use a computer to solve a problem.

Sometimes these battles are the same, other times they are different. By using the built-in version of Python on their Macs, I have already started seeing them run commands and use text editors to create scripts without too much trouble. That's the first battle. My plan is to give lots of examples supporting the second one in the beginning, and slowly push the burden of writing these programs on to the students as time goes by and they become more comfortable with the idea. So far I am feeling pretty good about it - stay tuned.

# Python in Algebra 2 - An Experiment

One of my goals is to include some Python programming in all of my courses this year, and to do so in a way that isn't just wedging it in where it doesn't belong. Given our size, we don't have a formal programming class in our school, but I have heard that students are interested in the broad topic of programming, and I know that they could benefit from it. So, I am finding times to get students playing around with it as a tool.

The perfect opportunity in Algebra 2 came today in evaluating algebraic expressions. I don't like reviewing the topic - at least I haven't in the past - because in most cases the students remember enough of it to think that they know how to do it, but have forgotten all the nasty bits about order of operations, distributing negative signs, and the infamous mistake of getting 25 when evaluating -a^2 at a = -5 (the correct value is -25). They typically have great interactions reminding each other of the rules, but by the time they get to me in Algebra 2, the idea is no longer fresh. The lesson then ends up being the math-lesson equivalent of an air freshener - temporary and stale.

Following my goal, I figured this would be a perfect opportunity to introduce the topic first as a programming topic, and then use the computer as a resource for the students to check their arithmetic. We started the class with some basic order of operations questions:

This was followed by pasting the following into a Python interpreter as everyone was watching:

```python
print "Answers to the Warm-Up Questions:"
print (8*3 - 2*4)
print (27 + 18/9 - 3**2+1)
print (40 + 24)/8 - (2**3+1)
```

This followed a suggestion from Kevin Krenz to demonstrate how quickly the Python interpreter can solve these. While the students weren't wildly impressed, they did accept that this was an option for checking their work on these types of questions, and they were up for learning how it worked.

I then showed them how to run a Python file on their Macbooks, which all have at least Python 2.6 running on them in the terminal. I talked about working in the terminal as running around in the basement of their computer - lots of power and hidden secrets there to play around with (or mess up if not careful). After learning to do this, they edited a partially completed Python script which I have posted at Github here.

I really liked what happened afterwards, though it did not feel (at all) like a clean, straightforward way of going over algebraic expressions. It was messy. Different people were at different places during the entire 30 minutes we worked on it, which was much longer than I expected. Quite appropriately though, it slowly came together like writing a program often does. Lots of good discoveries and realizations of simple errors that I didn't need to force.

Students realized the difference, to the computer, between 2*x and 2x. They saw quite cleanly that they needed to tell the computer outright that there is multiplication between a coefficient and a variable. They saw this was not the case for -x, although some thought they might need to write it as -1*x; the interpreter showed them immediately that the plain negative sign works. The interpreter didn't do so well with 4(3 - x), since it treated it as a function call, but with some prodding, most students realized it was the same kind of error.

There was enough information in the script for them to figure out how to do exponents, so I was happy not to have to go through that separately. The only really big problem was the fact that Python 2.6 doesn't have the nice floating point capability for division that 3.2 has. For the first problem, part (a), the answer is 0.5, but Python returns 0 since it assumes integer division with a plain / symbol. I went around to student computers replacing x/y with x*1./y, but this became an opportunity to converse with students about division as multiplication by the multiplicative inverse or reciprocal. Another unintended complication that then resulted in more review of pure mathematical concepts.
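For reference, the behavior is easy to demonstrate in Python 3, where / became true division and // took over the old integer-division role:

```python
# Python 3: / is true division, // is floor (integer) division
print(1 / 2)       # 0.5 - Python 2.6 reports 0 here
print(1 // 2)      # 0   - the old integer-division result
print(1 * 1. / 2)  # 0.5 - the x*1./y workaround forces float division
```

The x*1./y trick works in both versions because multiplying by the float 1. promotes the whole expression to floating point before the division happens.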

With all of this done, the students were then pretty proficient at trying to do the substitution by hand and checking against the answers from the computer. Most caught the serious mistakes without too much input from me - the computer did that work for me.

After finishing problem 1, the students got a big kick out of how I told them to program Problem 2 at the end of the script. They were directly teaching the computer to answer these questions through code. I think they saw that programming really is how you teach a computer to do what you want it to do, and had at least a minimal sense of pride in being able to do so.

One student said this was pretty cool, and compared it to a video game. Another appeared to want to kill me the entire time. They were all pretty patient with the activity though, and trusted that this would make them better at what they needed to learn for my class - probably the most important part to this not leading to a serious case of Thursday afternoon mutiny.

In the grand scheme of technology implementation, this activity was nothing more than using Python to replace a graphing calculator with substitution capability. This type of knowledge, however, is important for doing more substantial applications of computational thinking. I think it's important for students to see what programming can do before being asked to create something as simple as 'Hello world', which doesn't seem to interest the vast majority of students. While I did most of the programming for this task, it is a gateway to the students doing and seeing more down the line. Now that they know the basics of editing and running a program, we will be more successful in doing more sophisticated things later on.