Sensors First – A Changed Approach

Last week I presented to some FIRST LEGO League teachers on the programming software for the LEGO Mindstorms EV3. My goal was to cover the basics of programming in the system so that these teachers could coach their students through the process of building a program.

The majority of programs that students create are the end product of a lot of iteration. Students generally go through this process to build a program to do a given task:

  1. Make an estimate (or measurement) of how far the motors must rotate in order to move the robot to a given location.
  2. Program the motors to run for this distance.
  3. Run the program to see how close the robot gets to the desired location.
  4. Adjust the estimate from Step 1 and repeat until the robot ends up in the right location.

Once the program gets the robot to the right location, this process is repeated for the next task the robot must perform. I’ve also occasionally suggested a mathematical approach to calculating these distances, but the reality is that students would rather just try again and again until the program works. It’s a great way to introduce students to the idea of programming as a sequence of instructions, and to the fact that getting a program right on the first try is a rarity. It’s how I’ve instructed students for years – a low bar for entry, since it requires only a simple program, and a high ceiling, since the rest of my programming instruction extends this concept.
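For coaches who do want to show the mathematical approach, the calculation is just the target distance divided by the wheel’s circumference. Here’s a minimal Python sketch; the 5.6 cm wheel diameter is an assumption based on the standard EV3 kit wheel, so measure the wheels on your own robot:

```python
import math

# Convert a target driving distance into motor rotations.
# The 5.6 cm diameter is an assumption (the standard EV3 wheel);
# measure your robot's wheels and adjust.
WHEEL_DIAMETER_CM = 5.6
WHEEL_CIRCUMFERENCE_CM = math.pi * WHEEL_DIAMETER_CM  # about 17.6 cm per rotation

def rotations_for_distance(distance_cm):
    """Return the wheel rotations needed to drive a given distance."""
    return distance_cm / WHEEL_CIRCUMFERENCE_CM

# Example: driving 50 cm works out to roughly 2.84 rotations.
print(rotations_for_distance(50))
```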

I now believe, however, that another common complaint coaches (including me) have had about student programs is a direct consequence of this approach. Most programs (excluding those written by students with a lot of experience) require the robot to be aimed correctly at the beginning of the program. As a result, students spend substantial time aiming their robot, believing that this effort will result in a successful run. While repeatability is something we emphasize with students (I have a five-in-a-row success rule before calling a mission program completed), it’s the method that is more at fault here.

The usual approach in this situation is to suggest that students use sensors in the program to help with repeatability. The reason they don’t do so isn’t that they don’t know how to use sensors. It’s that the aim-and-shoot method is, or seems, good enough. In the student’s mind it is so much easier to continue the simpler approach than to invest in a new method. It’s like when I’ve asked my math students to add the numbers from 1 to 30. Despite having learned how to quickly calculate arithmetic series – the formula n(n+1)/2 gives 30 · 31/2 = 465 in one step – many of them pick up their calculators and enter the numbers into a sum, one at a time, and then hit enter. The human tendency is to stick to the patterns and ideas that are familiar until there is truly a need to expand beyond them. We stick with what works for us.

One of my main points to the teachers in my presentation was that I’m making a subtle change to how I coach my students through this process. I’m calling it ‘sensors first’.

The tasks I give my students at the beginning, as they learn programming, will require sensors to complete. Instead of telling students to program their robot to drive a given distance and stop, I’ll ask them to drive their robot forward until a sensor on the robot sees a red line. I’ll also reserve the right to start the robot anywhere I want when testing their program.

It’s a subtle difference, and it requires no extra programming effort. In the EV3 software, here’s what it looks like in both cases, one using wheel rotations to control the distance and one using a sensor:
[Screenshot: the two EV3 programs, one using wheel rotations to control the distance and one using a sensor]
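For anyone coaching in a text-based environment rather than the graphical blocks, here’s a rough sketch of the same two programs using the python-ev3dev2 library. The motor ports (B and C) and the downward-facing color sensor are assumptions about the robot’s build:

```python
from ev3dev2.motor import MoveSteering, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import ColorSensor

steer = MoveSteering(OUTPUT_B, OUTPUT_C)  # assumed drive motors on ports B and C
color = ColorSensor()                     # assumed downward-facing color sensor

# Aim-and-shoot: drive straight for a fixed number of rotations,
# a number found through the guess-and-check cycle described earlier.
steer.on_for_rotations(steering=0, speed=30, rotations=2.84)

# Sensors first: drive straight until the sensor sees red, so the
# robot's starting position no longer matters.
steer.on(steering=0, speed=30)
while color.color != ColorSensor.COLOR_RED:
    pass  # poll the sensor until it reports red
steer.off()
```

Both versions take about the same effort to write, which is the point: switching to a sensor costs the student almost nothing.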

What am I hoping will be different?

  • Students will approach the challenges I give them with the design requirement built in that aim-and-shoot isn’t an option that will result in success. If they start off thinking that way, they might always consider how a sensor could be used to make the initial position of the robot irrelevant. FLL games always have a number of printed features on the mat that can be used to help with this sort of task.
  • When I do give tasks where students can start the robot wherever they choose, students will (hopefully) first think about whether the starting position should matter. In cases where it doesn’t, they might still decide to use a sensor to guide them (hopefully for a reason), or drop down to a distance-based approach when it makes sense to do so. This means students will routinely be thinking about which tool best does the job, rather than trying to use one tool to do everything.
  • This philosophy might even prompt a more general search for ways to reduce the uncertainty and compounding error associated with an aim-and-shoot approach. Using the side of the table to guide straight-line driving is a common and simple example.

These sorts of problem-solving approaches are exactly how a successful engineering design cycle works. Solutions should maximize the effectiveness of a design while minimizing costs. I’m hoping this small change to the way I teach my students this year gets them spending more time using the tools built into the robot well, rather than trying to make a robot with high variability (caster wheels, anyone?) do the same thing twice in a row.

2 thoughts on “Sensors First – A Changed Approach”

  1. I agree, students don’t seem to understand sensors and so don’t usually use them. I am all for trying something to help them learn how to make the robot work more efficiently. Is your “sensors first” rule working?

    1. Not yet – just getting started. At this point there are three students (out of around fifteen) that have done LEGO robotics before, so I’m really curious to see how the group develops as a whole in comparison to previous years.
