# Testing physics models using videos & Tracker

I've gotten really jealous reading about how some really great teachers have stepped up and used programming as a learning tool in their classes. John Burk's work on using vPython to do computational modeling with his students is a great way to put together a virtual lab for students to test their theories and understand the balanced force model. I also like Shawn Cornally's progression of tasks using programming in Calculus, which ultimately enables his students to really understand concepts and algorithms once they get the basic mechanics down.

I've been looking for ways to integrate simple programming tasks into my Algebra 2 class, and I think I'm sold on Python. Many of my students run Chrome on their laptops, and the Python Shell app is easily installed on their computers through the app store. It would be easy enough to ask them to enter code I post on the wiki and then modify it as a challenge at the beginning or end of class. It's not a formal programming course at all, but the only way I really got interested in programming was when I was using it to do something with a clear application. I'm just learning Python now myself, so I'll need a bit more work on my own before I feel comfortable troubleshooting student programs. I want to do it, but I also need some more time to figure out exactly how I want to do it.

In short, I am not ready to make programming more than just a snack in my classes so far. I have, however, been a Tracker fan ever since I first saw it being used in a lab at the NASA Glenn Research Center ten years ago. Back then, it was a simple program that allowed you to import a video, click frame by frame on the location of objects, and export a table of the position values together with numerically differentiated velocity and acceleration. The built-in features have grown considerably since then, but numerical differentiation being what it is, it's really hard to get excellent velocity or acceleration data from position data. I had my students create their own investigations a month ago and was quite pleased with how they ran with it and made it their own. They came to this same conclusion though: noisy data does not a happy physics student make.
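To see why the differentiated data gets noisy, here is a quick sketch (my own toy example with made-up numbers, not anything from the lab). Central differences divide a small position error by a small time step, so the error gets amplified at each differentiation:

```python
import random

random.seed(1)
dt = 1 / 30       # hypothetical 30 fps video: one frame every 1/30 s
g = 9.8           # m/s^2
sigma = 0.002     # assumed ~2 mm error when clicking on the object
frames = 30

# "True" positions of a dropped object, plus small clicking errors
y = [0.5 * g * (i * dt) ** 2 + random.gauss(0, sigma) for i in range(frames)]

# Central-difference velocity and acceleration from the clicked positions
v = [(y[i + 1] - y[i - 1]) / (2 * dt) for i in range(1, frames - 1)]
a = [(v[i + 1] - v[i - 1]) / (2 * dt) for i in range(1, len(v) - 1)]

# A couple of millimeters of position scatter becomes scatter on the order
# of m/s^2 in the acceleration, so individual values land well off 9.8.
print(min(a), max(a))
```

The average of the acceleration values still hovers near 9.8 m/s², but the frame-to-frame values bounce around by whole m/s², which is exactly the noisy data my students complained about.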

I wanted to take the virtual laboratory concept of John's vPython work (such as the activities described here) for my students, but not have to invest the time in developing my students' Python ability because, as I mentioned, I barely qualify myself as a Python novice. My students spent a fair amount of time with Tracker on the previous assignment and were comfortable with the interface. It was at this point that I really decided to look into one of the most powerful capabilities of the current version of Tracker: the dynamic particle model.

My students have been working with Newton's laws for the past month. After discovering the power of the dynamic model in Tracker, I thought about whether it could be something that would make sense to introduce earlier in the development of forces, but I now don't think it makes sense to do so. It does nothing for the notion of balanced forces. Additionally, some level of intuition about how a net force affects an object is important for adjusting a model to fit observations. I'm not saying you couldn't design an inquiry lab that would develop these ideas, but I think hands-on and actual "let me feel the physics happening in front of me" style investigation is important in developing the models - this is the whole point of modeling instruction. Once students have developed their own model for how unbalanced forces work, then handing them this powerful tool to apply their understanding might be more meaningful.

The idea behind using the dynamic particle model in Tracker is this: any object being analyzed in video can be reduced to a particle moving in response to forces. The free body diagram is the fundamental tool for analyzing these forces and relating them to Newton's laws. The dynamic particle model is just a mathematical way to combine the forces acting on the particle with Newton's second law. Numerical integration of the acceleration then produces the velocity and position of the particle as functions of time. Tracker superimposes these calculated positions onto the video frames so the model and reality can be compared.
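In code, the whole scheme fits in a few lines. This is a minimal sketch of the idea (a simple Euler-Cromer integrator of my own, not Tracker's actual implementation; the function name and numbers are mine):

```python
def simulate(m, forces, x0, v0, dt, steps):
    """Integrate a 1-D particle model with the Euler-Cromer method.

    forces: list of functions f(x, v, t) -> force in newtons
    Returns one position per frame, ready to overlay on the video.
    """
    x, v, t = x0, v0, 0.0
    positions = [x]
    for _ in range(steps):
        f_net = sum(f(x, v, t) for f in forces)  # net force on the particle
        a = f_net / m                            # Newton's second law
        v += a * dt                              # integrate a -> v
        x += v * dt                              # integrate v -> x
        t += dt
        positions.append(x)
    return positions

# Example: a 0.2 kg ball dropped from rest, gravity only, 30 fps video.
# After 1 s the model lands near the analytic 0.5 * 9.8 * 1^2 = 4.9 m.
path = simulate(m=0.2, forces=[lambda x, v, t: 0.2 * 9.8],
                x0=0.0, v0=0.0, dt=1/30, steps=30)
```

Note that this direction (forces in, positions out) is the reverse of the noisy differentiation problem above: integration smooths rather than amplifies, which is part of why the model overlay looks so clean.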

This is such a powerful way for students to see if their understanding of the physics of a situation is correct. Instead of asking students to check the order of magnitude or answer the vague question "is it reasonable?", you ask them whether the model stops at the same point in the video as the object being modeled. Today, I actually didn't even need to ask this question - the students knew not only that they had to change something, but they figured out which aspect of the model (initial velocity or force magnitude) they needed to change.

It's actually a pretty interesting progression of things to do and discuss with students.

- Draw a system schema for the objects shown in the video.

- Identify the object(s) that you want to model from the video. Draw a free body diagram.

- Decide which forces from the diagram you CAN model. Forces you know are constant (even if you don't know the magnitude) are easy to model. If there are other forces, you don't have to say "ignore them" arbitrarily as the teacher because you know they aren't important. Instead, you encourage students to start with a simple model and adjust the parameters to match the video.

- If the model cannot be made to match the video, no matter what the parameter values, then they understand why the model might need to be adjusted. If the simple model is a close enough match, the discussion is over. This way we can stop having our students say "my data is wrong because..." and instead have them really think about whether the fault is with the data collection or with the model they have constructed!

- Repeat this process of comparing and adjusting the model to match the observations until the two agree within a reasonable amount.
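The adjust-and-compare loop in the steps above can even be sketched in code. This is entirely my own example, not part of the lab handout: a constant-force particle model of a sliding object, with a bisection search standing in for the student who nudges the friction-like force until the model's final position matches a hypothetical value read off the video.

```python
def model_final_position(f_mag, m=0.5, x0=0.0, v0=2.0, dt=1/30, steps=60):
    """Constant-force particle model; returns the position at the last frame."""
    x, v = x0, v0
    for _ in range(steps):
        v += (f_mag / m) * dt
        x += v * dt
    return x

observed_stop = 1.9   # hypothetical position (m) clicked from the video

# Crude bisection on the friction-like (negative) force magnitude
lo, hi = -5.0, 0.0
for _ in range(40):
    mid = (lo + hi) / 2
    if model_final_position(mid) > observed_stop:
        hi = mid   # model travels too far: the force must be more negative
    else:
        lo = mid   # model falls short: the force must be less negative
f_fit = (lo + hi) / 2
```

Of course, the point of the activity is for students to do this adjustment by eye and by reasoning, not by algorithm - but it is worth noticing that "tweak the parameter, compare, repeat" is itself a procedure you could hand to a computer.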

**Isn't the habit of comparing our mental models to reality the sort of thing we want our students to develop and possess long after they have left our gradebook?**

It's so exciting to be able to hand students this new tool, give them a quick demo on how to make it work, and then set them off to model what they observe. The feedback is immediate. There's some frustration, but it's the kind of frustration that builds intuition for other situations. I was glad to be there to witness it so we could troubleshoot together, rather than over-planning and over-structuring the activity.

Here is the lab I gave my students: Tracker Lab - Construction of Numerical models. If you are interested in an editable version, let me know. I have also posted the other files at the wiki page. Feel free to use anything you like with your students.

I am curious about the falling tissue video and what students find - I purposely did not do that part myself. Took a lot of will-power to not even try. How often do we ask students to answer questions we don't know the answer to? Aren't those the most interesting ones?

I promise I won't break down and analyze it myself. I've got some Python to learn.