Monthly Archives: February 2015

When can we neglect air resistance?

This was supposed to be the shortest part of a warm-up activity. It turned into a long discussion that revealed a lot of student misunderstandings.

The question was about whether we could ignore air resistance on a textbook being thrown in the air. We spent most of our time discussing the differences and similarities between the three items here:
[Photo: the three items we compared]

There were interesting comments about what factors influence the magnitude of air resistance. I was definitely leading the conversation, but it wasn't until a student mentioned acceleration that anyone could precisely explain why one item fell differently from another. We eventually settled on comparing the gravity force to the air resistance force and calculating the resulting acceleration to see how close it was to the acceleration of gravity.
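As a rough illustration of the comparison we ended with (not something we coded in class), here is a quick estimate using a quadratic drag model, F = ½ρCdAv². Every value below is an assumption for the sake of the example, not a measurement from class:

# Rough estimate: how much does air resistance change the acceleration of a
# thrown textbook? Quadratic drag model; all of the values below are assumptions.
rho = 1.2         # air density (kg/m^3)
C_d = 1.1         # drag coefficient for a tumbling flat object (assumed)
A = 0.19 * 0.25   # face area of the textbook (m^2), assumed
m = 1.5           # mass of the textbook (kg), assumed
g = 9.8           # acceleration of gravity (m/s^2)
v = 5.0           # typical speed during the toss (m/s), assumed

F_grav = m * g
F_drag = 0.5 * rho * C_d * A * v**2
a = (F_grav - F_drag) / m   # acceleration while the book is falling

print(F_drag / F_grav)   # about 0.05, so the drag force is roughly 5% of the weight
print(a)                 # about 9.3 m/s^2, close to g

For a toss like this, the acceleration stays within a few percent of g, which is exactly the kind of comparison that justifies neglecting air resistance.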

Projectile Motion with Python, Desmos, and Monte Carlo Simulation

I've written about my backwards approach to projectile motion previously here, here, and here.

I had students solve the warm-up problem from that first lesson, which goes like this:

A student is at one end of a basketball court. He wants to throw a basketball into the hoop at the opposite end.

  • What information do you need to model this situation using the Geogebra model? Write down [______] = on your paper for any values you need to know to solve it using the model, and Mr. Weinberg will give you any information he has.
  • Find a possible model in Geogebra that works for solving this problem.
  • At what minimum speed could he throw the ball in order to get it into the hoop?

The students did what they usually do with the Geogebra projectile motion model and solved it with some interesting methods. One student lowered the hoop to the floor. Another started with a 45 degree angle, and then increased the speed successively until the ball made it into the hoop. Good stuff.

A student's comment about making lots of guesses here got me thinking about finding solutions more algorithmically. I've been looking for new ways to play around with genetic algorithms and Monte Carlo methods since they are essentially guess and check procedures made productive by the power of the computer.

I wrote a Python program that does the following:

  • Get information about the initial characteristics of the projectile and the desired final location.
  • Make a large number of projectiles (guesses) with random values for angle and initial speed within a specified range.
  • Calculate the ending position of each projectile, and sort the projectiles by how far they land from the desired target.
  • Take the twenty projectiles with the least error, and use these values to define the initial values for a new, large number of projectiles.
  • Repeat until the error doesn't change much between runs.
  • Report the projectile at the end with the least error.
  • Repeat the entire procedure a number of times to see how consistent the 'best' answer is.

I've posted the code for this here on GitHub.
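The posted code has more structure than this, but here is a stripped-down sketch of the core loop. The target location, launch height, and search ranges are hard-coded assumptions, and the names are mine rather than the ones in the repository:

import math
import random

g = 9.81                          # m/s^2
target_x, target_y = 25.0, 3.05   # hoop position in meters (assumed values)
launch_y = 2.0                    # release height in meters (assumed)

def error(speed, angle):
    """Vertical miss distance when the projectile reaches the hoop's x position."""
    vx = speed * math.cos(angle)
    vy = speed * math.sin(angle)
    t = target_x / vx                        # time to cover the horizontal distance
    y = launch_y + vy * t - 0.5 * g * t**2   # height at that moment
    return abs(y - target_y)

def random_guesses(n, speed_range, angle_range):
    """Generate n (speed, angle) pairs uniformly within the given ranges."""
    return [(random.uniform(*speed_range), random.uniform(*angle_range))
            for _ in range(n)]

speed_range = (5.0, 20.0)                             # m/s, assumed search range
angle_range = (math.radians(10), math.radians(80))

best = None
for generation in range(30):                  # the real code stops when the error levels off
    guesses = random_guesses(500, speed_range, angle_range)
    guesses.sort(key=lambda guess: error(*guess))
    keepers = guesses[:20]                    # the twenty guesses with the least error
    speeds = [s for s, a in keepers]
    angles = [a for s, a in keepers]
    speed_range = (min(speeds), max(speeds))  # narrow the ranges around the keepers
    angle_range = (min(angles), max(angles))
    best = keepers[0]

speed, angle = best
print("speed = %.2f m/s, angle = %.1f deg, error = %.4f m"
      % (speed, math.degrees(angle), error(speed, angle)))

Narrowing the ranges around the best twenty guesses each generation is what turns blind guessing into something that converges quickly.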

As a final step, I have this program output commands that graph the resulting projectile paths on Desmos. Pasting the result into the browser console while a Desmos calculator is open makes a nice graph of each generated projectile path, all intersecting at the desired target:
[Screenshot: Desmos graph of the generated projectile paths meeting at the target]

This is also on a live Desmos page here.
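The commands themselves are just strings built from each projectile's speed and angle. A sketch of that step might look like the following; the helper is hypothetical, it assumes the Desmos calculator page exposes its API object as Calc, and it leaves out the domain restriction that cuts each path off at the target:

import math

g = 9.81

def desmos_command(index, speed, angle, launch_y):
    """Build one Calc.setExpression(...) call that draws a trajectory as y(x)."""
    # y(x) = y0 + x*tan(angle) - g*x^2 / (2*v^2*cos^2(angle))
    b = math.tan(angle)
    c = g / (2 * speed**2 * math.cos(angle)**2)
    latex = "y=%.3f+%.4fx-%.5fx^2" % (launch_y, b, c)
    return 'Calc.setExpression({id: "path%d", latex: "%s"});' % (index, latex)

# Print commands for a few (speed, angle) pairs to paste into the browser console
for i, (speed, angle) in enumerate([(11.0, math.radians(50)), (12.5, math.radians(42))]):
    print(desmos_command(i, speed, angle, launch_y=2.0))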

This shows that there is a range of possible answers, which is something I had told my physics class based on their own solutions to the problem. Having a way to show (rather than just tell) is always the better option.

I also like that I can change the nature of the answers I get by adjusting how the guesses are sorted. This line in the code sorts the projectile guesses by the magnitude of their error:

self.ordered = sorted(self.array, key=lambda x: abs(x.error))

If I change this to instead sort by the sum of error and the initial speed of the projectile, I get answers that are much closer to each other, and to the minimum speed necessary to hit the target:

[Screenshot: Desmos graph of the resulting paths, clustered near the minimum-speed trajectory]
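That change is a one-line tweak to the sort key. Assuming the projectile objects store their launch speed in an attribute named speed (the name in the posted code may differ), it looks something like this:

self.ordered = sorted(self.array, key=lambda x: abs(x.error) + x.speed)  # speed attribute assumed

Since speed now adds to the quantity being minimized, the survivors in each generation get pushed toward the slowest throws that still hit the target.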

Fun stuff all around.

Coding WeinbergCloud - An Update

Over the past few weeks, I've made some changes to my standards-based grading system built with the Meteor framework. These changes address issues students have brought up that they say get in the way of making progress. Whether you view these as excuses or valid points, it makes sense to change some of the features to match the students' needs.

I don't know what standard 6.2 means, Mr. Weinberg.

There are many places students could look to get this information. It does make sense, however, to have it right where students sign up for reassessments.

When students select a standard, a link pops up (if the standard exists) with a description. This has made a big difference in students knowing whether the standard they sign up for is what they actually intend to assess.

[Screenshot: reassessment sign-up form with a link to the selected standard's description]

I also added an entry for the student's current mastery level, because this is important in selecting appropriate assessments. The extra step of looking it up in the online gradebook isn't worth it to me, and asking students to look it up makes it their responsibility. That's probably where it belongs.

Can you post example problems for each standard?

The biggest issue students have in searching for online resources for a specific standard is not knowing the vocabulary that will get the best resources. There's lots of stuff out there, but it isn't all great.

I post links to class handouts and notes on a school blog, so the information is already online. Collecting it in one place and organizing it according to the standards, however, hasn't been something I've put time into.

Students can now see the standards for a given course, listed in order. If they're interested, they can also look at other courses just to see what those classes are learning. I have no idea whether this has actually happened.

[Screenshot: the list of standards for a course]

Selecting a standard brings a student to its full text and description. There I can post links to the course notes and handouts, along with online resources that meet my criteria for being appropriately leveled and well written.

[Screenshot: a standard's detail page with its description and resource links]

At the moment, I'm the only one who can add resources. I've written much of the structure needed to eventually let students submit sites, up-vote the ones that are useful to them, and give me click data on whether students are actually using this. I'm holding off until I can tweak some UI details to make that work just the way I want it.

Mr. Weinberg, I signed up for an assessment, but it's not showing up.

The already flaky internet in China has gotten even flakier as of late. Students were signing up for reassessments, but because of the way I originally implemented inserting these requests into the database, some requests never actually made it to the server. I've learned a lot more about Meteor since I wrote this a year ago, so I've been able to make the process more robust. The sign-up window doesn't disappear until the server actually responds and confirms that the insert was successful. Most importantly, students know to look for this helper in the upper left-hand side of the screen:

[Screenshot: the connection status indicator]

If it glows red, students know to reload the page and reconnect. Under normal Meteor usage conditions, this isn't a problem because Meteor takes care of the connection process automatically. China issues being what they are, this feature is a necessity.

I've written before about how good it feels to build tools that benefit my students, so I won't lecture you, dear reader, about that again. In the year since first making this site happen, though, I've learned a lot more about how to build a tool like this with Meteor. The ease with which I can take an idea from prototype to production is a really great thing.

The next step is taking a concept like this site and abstracting it into a tool that works for anyone who wants to use it. That is a larger-scale project for another day.