Monthly Archives: October 2014

Coding for the Classroom with Meteor Series: ImageShare

When I visited Meteor headquarters for their monthly DevShop, I gave a lightning talk on my use of the Meteor framework for developing tools that help me do my job as a teacher. Both during and after my talk, I was asked how I might help other teachers learn to do what I had done. I pledged at the time to make some videos and tutorials about using Meteor for classroom-specific applications. Since then, I've had some ideas for what I might do.

When I asked idea-man Dan Meyer what he thought the first project should be, the response came back surprisingly quickly:

As usual, Dan's expectations were high. I was waiting for Meteor to release its 1.0 version before getting started, so when that happened this week, I hit the interwebs hard to figure out how to make the response viewer a reality with Meteor.

Thankfully, it came together quite quickly. That speaks to the power of the Meteor framework for minimizing the idea-to-app lifecycle and making it easy to get these tools into the hands of teachers.

You can check out my 26-minute tutorial video below. I recorded it almost in real time (minus some edited-out flubs) to show how quickly you can get started.

I have also included the files I made in the tutorial on GitHub here:
https://github.com/emwdx/image-share

Take a look and let me know what you think. I would like to do others if there are requests for teaching-related apps out there. Keep me posted on what you would like to see.

Computational Thinking in Teaching and Learning (Re-post)

A modified version of this post appeared on the TechSmith blog here and in their quarterly newsletter, the Learning Lounge. I appreciate their interest in my perspective, and I hope to continue this important discussion here with my readers.

The idea of computational thinking has radically changed my approach to teaching over the past few years. The term, coined by Jeannette Wing, a professor of computer science at Carnegie Mellon University, refers to several habits of mind that are essential to computer science. Her paper identifies the reality that some tasks are ones computers do extremely well, while others are better suited to the human brain. Traditionally, computer scientists have outsourced the calculating, organizing, searching, and processing work of a task to a computer so that they can focus on its more complex, challenging, and engaging aspects. According to Wing, one of the most essential skills we should develop in students is the ability to sort tasks into these two groups.

My classroom, at its best, is a place where students spend the maximum amount of time wrestling with an engaging task. They should be working together to develop both intuition and understanding of the required content. I can read the smiles or frowns and know whether I should step in, and I can use my skills to nudge students in the right direction when I think they need it. Precisely when they need it is not something an algorithm can easily determine. For some students, that moment comes early on after encountering a new concept. Others require just one more minute of struggle before the idea clicks and is in their brains for good. Knowing the difference comes from the very human experience of spending time in classrooms with learners.

This is the human side of teaching. It is easy to imitate and approximate using technology, but difficult to produce authentically. Ideally, we want to maximize these personal opportunities for learning and minimize the obstacles. For me, the computer has been essential to doing both, and that starts with identifying the characteristics of tasks that a computer does better. If a computer can perform a task better than my students and I can alone, I'm willing to explore that potential.

The most consistent application of this principle has been in reducing what I call 'dead time': time spent on tasks that are required for learning to be possible, but that are not learning tasks themselves. Displaying information on the board, collecting student answers, figuring out maximum and minimum guesses for an estimation problem - these take time. These sorts of tasks - displaying, collecting, processing - also happen to be the sort at which computers excel. I wrote a small web application, running from my classroom computer, that lets students snap a picture of their work and upload it to my computer, anonymously if they choose. We can then browse student answers as a class and have discussions about what we see. The end result is equivalent to students writing their work on the board, but the increased efficiency of sharing the work, archiving it, and freeing up class time to build richer activities on top of it makes it that much more valuable to let the computer step in.
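
The core of an app like that is small. My own version is built with Meteor, but as a rough sketch of the idea, here is what a minimal version might look like in Python with Flask; Flask, the routes, and the file layout are illustrative choices here, not how my app actually works:

```python
# Minimal sketch of a classroom photo-sharing page (illustrative only).
# Students open the page on their devices, upload a photo of their work,
# and the class page lists every uploaded image for discussion.
import os
from flask import Flask, request, redirect, send_from_directory

UPLOAD_DIR = "uploads"
os.makedirs(UPLOAD_DIR, exist_ok=True)

app = Flask(__name__)

@app.route("/")
def gallery():
    # A bare-bones upload form followed by every shared image.
    images = "".join(
        f'<li><img src="/work/{name}" width="300"></li>'
        for name in sorted(os.listdir(UPLOAD_DIR))
    )
    return (
        '<form method="post" action="/upload" enctype="multipart/form-data">'
        '<input type="file" name="photo" accept="image/*">'
        '<button>Share</button></form>'
        f"<ul>{images}</ul>"
    )

@app.route("/upload", methods=["POST"])
def upload():
    # Save the submitted photo; no names are attached, so uploads stay anonymous.
    photo = request.files["photo"]
    photo.save(os.path.join(UPLOAD_DIR, photo.filename))
    return redirect("/")

@app.route("/work/<name>")
def work(name):
    return send_from_directory(UPLOAD_DIR, name)

if __name__ == "__main__":
    # Run on the classroom computer; students connect over the local network.
    app.run(host="0.0.0.0", port=5000)
```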

I've also dabbled in making videos of direct instruction, but I have students watch and interact with them while they are in the classroom. During whole-class instruction, I can't really keep track of what each student is and isn't writing down because I am typically in one static location in the room. With videos playing simultaneously throughout the classroom, I can see what students write down, or what they might be breezing through too quickly. I get a much better sense of what students are not understanding because I can read their faces, and I can ask individualized questions to assess comprehension. The computer distributes and displays what I've put together or curated for my students, which is one of its strengths. My own processing power and observation skills are free to scan the room and figure out what the next step should be.

Letting the computer manage calculation (another of its strengths) lets students focus on the significance of a calculation rather than its details. Students can explore and build intuition about a concept with software such as GeoGebra or a spreadsheet before they are required to manage the calculations themselves. For students who struggle with arithmetic operations, this still allows them to observe mathematical objects and see how one quantity affects another, and that involvement has the potential to inspire these same students to make the connections that underlie their skill deficiencies.

Full disclosure, though: I don't have a 100% success rate in doing this correctly. I've invested time programming applications that required much more effort than an analog solution. For instance, I spent a week writing all of my class handouts in HTML because the web browser seemed like a more platform-independent solution than a PDF. That ended when I realized the technology was getting in the way of my students making notes on paper, a process I respect for its role in helping students make their own learning tools. Some tasks work much more smoothly (or are just more fun) with paper and a marker.

I value my students' time. I value their thoughts. I want to spend as much class time as possible building a community that values them as well. Where technology gets in the way of this, or adds too much friction to the process, I set it aside. I sit with students and tell stories. I push them to see how unique it is to be in a room together for no other reason than to learn from each other. When I can write a program to randomize groups or roll a pair of dice a thousand times to prove a point about probability, I do so.
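
To make that concrete, here is a quick sketch in Python of both of those little programs (the exact scripts I use vary, and the class list below is made up): one rolls a pair of dice a thousand times and tallies the sums, the other shuffles the class into groups.

```python
import random

def roll_pair_of_dice(trials=1000):
    """Roll two dice `trials` times and count how often each sum appears."""
    counts = {total: 0 for total in range(2, 13)}
    for _ in range(trials):
        counts[random.randint(1, 6) + random.randint(1, 6)] += 1
    return counts

def randomize_groups(names, group_size=3):
    """Shuffle the class list and chop it into groups of roughly group_size."""
    shuffled = names[:]
    random.shuffle(shuffled)
    return [shuffled[i:i + group_size] for i in range(0, len(shuffled), group_size)]

if __name__ == "__main__":
    # The point about probability: sums near 7 show up far more often than 2 or 12.
    for total, count in sorted(roll_pair_of_dice().items()):
        print(f"{total:2d}: {'*' * (count // 10)}")
    # Invented class list, just to show the group randomizer in action.
    print(randomize_groups(["Ana", "Ben", "Cai", "Dee", "Eli", "Fen", "Gus"]))
```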

Knowing which choice is better is the problem I wish I could write an algorithm to solve. Then again, that would take a lot of the fun out of figuring it out for myself.

Sensors First - Progress Report

I wrote previously about my plans to change how I teach programming to my LEGO robotics students. By making sensors the starting point, I hope to give students the experience to know when a sensor can do a better job than simply aiming the robot toward the target and hoping for the best.

Yesterday was my first open-ended challenge since beginning this approach. Students needed to build and program their robots to retrieve the loops located at the ends of the black line paths, and the time available for them to do so was kept short. As one more way to give sensors an advantage over a trial-and-error approach, I told them that I might have them start their robot anywhere along the line, and that they could pick up their robot only once while retrieving the two loops.

I really didn't need that final requirement. Students quickly figured out how to adapt the line-following tricks I had taught them to this task. In a forty-minute period, all of the teams made progress and were able to make contact with the loop using a collection mechanism.
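
Their programs live in the LEGO software, but the line-following idea they adapted boils down to a small proportional controller. Here is a rough sketch of that logic in Python; the target value, gain, and speeds are made-up numbers, and the real motor and sensor calls depend on whatever API the robot provides.

```python
# Sketch of proportional line following: steer based on how far the light
# sensor reading is from the value at the edge of the line.
TARGET = 50      # reading on the line's edge (0 = black line, 100 = white mat)
GAIN = 0.8       # how sharply the robot steers back toward the edge
BASE_SPEED = 40  # forward speed shared by both wheels

def motor_speeds(light_reading):
    """Return (left, right) wheel speeds that turn the robot back toward the edge."""
    error = light_reading - TARGET     # positive means the robot drifted onto white
    correction = GAIN * error
    return BASE_SPEED + correction, BASE_SPEED - correction

# Quick check of the behavior at a few readings: on the line, on the edge, off it.
for reading in (30, 50, 70):
    print(reading, motor_speeds(reading))
```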

The most satisfying result? Not a single group spent significant time aiming their robot. They clearly didn't feel the need, which is a step in the right direction.


Standards Based Grading (SBG) and the SUMPRODUCT Command

I could be very late to the party finding this out. If so, excuse my excitement.

I gave a multiple choice test for my IB Physics course last week. Since I am using standards based grading (SBG), I wanted a quick way to see how students did on each standard. Eight or so years ago, I made a manually coded spreadsheet to do this. It involved multiple columns comparing answers, multiple logical expressions, and then a final column that could be tallied for one standard. Multiply that by the total number of standards...you know the drill.

I was about to start piecing together an updated version using that same exhausting methodology when I asked myself the question that always gets me going: is there a better way?

Of course there is. There pretty much always is, folks.

For those of you who don't know, the SUMPRODUCT command in Excel does exactly what I was looking for here. It lets you add together the quantities in one range that match a set of criteria in another. Check out the example below:

[Screenshot: the exam spreadsheet, showing student answers, the 'Response Code' column, and the per-standard totals computed with SUMPRODUCT]

The column labeled 'Response Code' contains the formula '=1*(B6=E6)', which tests whether the answer is correct. I wanted to add together the cells in F6 to F25 that were correct (Response Code = 1) and had the same standard as the cell in H6. The formula in cell I6 is '=SUMPRODUCT((F6:F25)*(E6:E25=H6))'. This is equivalent to the sum F6*(E6=H6) + F7*(E7=H6) + F8*(E8=H6) + ... and so on.
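
If you'd rather script the same tally than build it in Excel, the equivalent calculation is short. Here is a sketch in Python; the standards and scores below are invented stand-ins, not my actual exam data.

```python
# Each entry: (standard assessed by the question, 1 if correct, 0 if not).
# Values are made up for illustration.
responses = [
    ("Kinematics", 1), ("Kinematics", 0), ("Forces", 1),
    ("Forces", 1), ("Energy", 0), ("Energy", 1),
]

def score_for(standard, responses):
    """The SUMPRODUCT idea: sum the correct flags only where the standard matches."""
    return sum(correct for std, correct in responses if std == standard)

for standard in ("Kinematics", "Forces", "Energy"):
    possible = sum(1 for std, _ in responses if std == standard)
    print(f"{standard}: {score_for(standard, responses)}/{possible}")
```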

If I had known about this before, I would have been using it in some way for all of my classes since moving to standards based grading. In the past, I've assessed students for SBG after unit exams by looking at a student's paper and then, one standard at a time, finding and evaluating the related questions. The problem has been communicating my rationale to students.

This doesn't solve the issue for the really great problems that combine several standards, but for students who need a bit more to go on, I think this is a nice tool that (now) doesn't require much clerical work on my part. I gave a printout of this page (with column F hidden) to each student today.

Here is a sample spreadsheet with the formulas all built in so you can see how it works. Let me know what you think.
Exam Results Calculator

Revising my thinking: Force Tables

I've avoided force tables as a lab in the past, primarily because when I first started teaching physics and found some collecting dust in the lab equipment room, the activities written for them seemed so formulaic that I was bored by them. I didn't know then what I might do to make them more interesting.

While putting together an activity using them today, I actually played around with them a bit. They are tricky to set up, but once you have the weights balanced, it's oddly satisfying to see the ring in the center floating there:
[Photo: the force table with the center ring balanced]

The theme of my lesson planning is a search for this type of gold: how can we play with this?

I've done 'find the unknown mass' activities before, and the force table offered an efficient way in.

[Photo: the force table setup, with the unknown mass circled in blue]

I asked students to figure out the mass of the weight circled in blue, and to decide what information they needed to do so. They requested the other two masses, which I provided.

Students worked quickly using their knowledge of forces and the equations of equilibrium. They figured out early on that the angles between the threads were approximately equal, a fact I didn't notice until I looked from above:
[Photo: the force table viewed from above, showing the nearly equal angles between the threads]

Their predicted answer of 290.9 grams was impressively close to the actual answer of 292.2 grams. We discussed how the assumption that the angles were equal might account for the error.
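
For anyone who wants to see the calculation, the equilibrium condition the students were working from looks like this (my notation, not theirs: m1 and m2 are the two known masses, m3 is the unknown, and θ is the angle between the two known threads; treating θ as exactly 120° is the assumption that introduces the small error):

```latex
% The ring is in equilibrium, so the unknown weight balances the resultant
% of the two known weights, whose threads meet at an angle \theta:
m_3 g = \sqrt{(m_1 g)^2 + (m_2 g)^2 + 2\, m_1 m_2 g^2 \cos\theta}
\quad\Longrightarrow\quad
m_3 = \sqrt{m_1^2 + m_2^2 + 2\, m_1 m_2 \cos\theta}
% With \theta \approx 120^\circ, \cos\theta = -\tfrac{1}{2}, so
% m_3 \approx \sqrt{m_1^2 - m_1 m_2 + m_2^2}.
```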

On the whole, this was a fun way to put to use a piece of equipment that I've kept out of my classroom for largely silly reasons. I think I'll definitely add this to the playlist for future units on equilibrium.