Color-Following Robot – RVR, Adafruit Metro, Pixy2

One of the fundamental challenges in maker education right now is that there are many interesting tools available, but not all of them talk to each other. A notable change, however, is that many of them use open source software, which makes it possible to understand how each one works. All that remains is connecting the dots.

About a year ago, I learned about the Pixy2 camera platform. This is an impressive device that can easily learn colors and report the coordinates of objects with those colors in the camera image over a number of different protocols. The example code is primarily provided through a series of Arduino projects. After working through some Arduino projects with my robotics class a year ago, I decided that bare-bones Arduino isn’t the way to get students excited about programming microcontrollers. There are way too many semicolons, error-prone libraries, and finicky elements of the C++ language that get in the way of students making things on their own. That said, the Pixy2 is incredibly versatile, and it tracks colored blobs quickly and accurately.

Over the past year, I’ve also spent time in the CircuitPython ecosystem. The Adafruit team has done an amazing job of documenting their extensive line of boards (such as the Metro and Feather Express lines) and providing CircuitPython resources that make these boards usable out of the box.

Another release last October was the Sphero RVR. The first project I gave students a year ago was to build a motorized car using VEX motor controllers, a motor platform, and an Arduino board. That was a lot more trouble than I expected, especially given that I wanted students to then use the cars they built with a Pixy2. It was a flimsy platform, and I should have seen that coming from a mile away. The Sphero RVR, by contrast, is packed with strong motors, a robust sensor assembly, and a solid, hackable frame upon which other things can be mounted. It drives quickly and reliably, and that’s after only a few months on the market. Commands can be sent to it through a serial port to get it to drive. As of now, the API is limited to driving at a given heading and speed, raw motor power control, and flashing LEDs.
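For the curious, here’s a rough sketch of what a single drive command looks like from CircuitPython, based on my reading of Sphero’s published API v2 packet spec. The flag, target, device, and command constants are assumptions from that spec, and a real implementation also escapes reserved bytes that happen to appear inside the packet body:

```python
import board
import busio

# The RVR listens on its UART expander port at 115200 baud.
uart = busio.UART(board.TX, board.RX, baudrate=115200)

SOP, EOP = 0x8D, 0xD8  # start/end-of-packet framing bytes
FLAGS = 0x18           # assumption: activity packet + target id present
TARGET = 0x02          # assumption: the RVR processor that owns the motors
DID, CID = 0x16, 0x07  # assumption: drive device, drive-with-heading command

seq = 0

def drive(speed, heading):
    """Drive at speed 0-255 along heading 0-359 degrees."""
    global seq
    body = bytes([FLAGS, TARGET, DID, CID, seq,
                  speed & 0xFF,
                  (heading >> 8) & 0xFF, heading & 0xFF,  # heading is big-endian
                  0x00])  # drive flags: 0 = forward
    chk = ~sum(body) & 0xFF  # one's-complement checksum over the body
    uart.write(bytes([SOP]) + body + bytes([chk, EOP]))
    seq = (seq + 1) & 0xFF

drive(64, 0)  # quarter speed, straight ahead
```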

I wanted to demonstrate some concepts of control systems to my high school students. I had a number of Pixy2 cameras available from my experiment last year, and I had decided that CircuitPython was the way to go for teaching students embedded microcontroller programming on the Adafruit boards. It was time to combine these three resources to build a color-following robot.

After a lot of experimentation, I eventually ended up with the code linked below. The Pixy2 code is a modification of Robert Lucian’s Python port of the API, originally written (I think) for a Raspberry Pi. I did some work to adapt it to use the CircuitPython I2C functions, though it doesn’t cover the complete feature set. The Sphero Python API is made for use with a micro:bit and MicroPython, but I swapped out the MicroPython serial port code for the CircuitPython serial port on a Metro board. I had to comment out the parts of both the Pixy2 code and the Sphero code that I wasn’t using in order to get everything to fit on the Metro. I’m still not convinced that memory was the problem, as there appeared to be plenty free when I checked, but without the unused code commented out I had a number of MemoryErrors pop up.
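Condensed, the CircuitPython side of the Pixy2 conversation looks something like the sketch below. It follows the getBlocks request from the Pixy2 serial protocol documentation; the function name and the single-block parsing are my own simplification, not the full port:

```python
import struct
import board
import busio

PIXY_ADDR = 0x54  # Pixy2's default I2C address

i2c = busio.I2C(board.SCL, board.SDA)
while not i2c.try_lock():  # CircuitPython requires locking the bus
    pass

# getBlocks request: two sync bytes, type 0x20, payload length 2,
# then a signature bitmap (0xFF = all signatures) and a max block count.
GET_BLOCKS = bytes([0xAE, 0xC1, 0x20, 0x02, 0xFF, 0x01])

def get_largest_block():
    """Return (x, y, width, height) of the largest tracked block, or None."""
    i2c.writeto(PIXY_ADDR, GET_BLOCKS)
    header = bytearray(6)  # sync, sync, type, length, checksum (2 bytes)
    i2c.readfrom_into(PIXY_ADDR, header)
    length = header[3]
    if length == 0:
        return None  # no blocks detected in this frame
    payload = bytearray(length)
    i2c.readfrom_into(PIXY_ADDR, payload)
    # Each 14-byte block starts with five little-endian u16 values:
    # signature, x, y, width, height.
    sig, x, y, w, h = struct.unpack_from("<5H", payload, 0)
    return x, y, w, h
```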

Here is a GitHub repository with the code.

I’m sure this isn’t as efficient as it could be, but I wanted to demonstrate how simply information from a camera can be used to steer a robot, and to help students understand a proportional algorithm as a basic control system. My hope is that students will apply this to autonomously steering a robot as part of this year’s FIRST Robotics Competition game: Infinite Recharge.
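The heart of the control loop is just a few lines. Here’s a minimal sketch in the spirit of what’s in the repository, reusing the hypothetical drive() and get_largest_block() helpers from the sketches above; the gain and speed values are placeholders to tune, not the ones I settled on:

```python
import time

FRAME_CENTER_X = 158  # the Pixy2 frame is 316 px wide, so center is ~158
KP = 0.3              # proportional gain: how hard to turn per pixel of error
SPEED = 60            # placeholder cruising speed

heading = 0
while True:
    block = get_largest_block()
    if block:
        x, y, w, h = block
        error = x - FRAME_CENTER_X      # positive: target is to the right
        heading = int(heading + KP * error) % 360
        drive(SPEED, heading)           # turn toward the target while moving
    else:
        drive(0, heading)               # target lost: stop and hold heading
    time.sleep(0.05)                    # ~20 Hz update rate
```

The nice part for students is that the entire steering decision is the one line computing the error: the farther the blob drifts from the center of the frame, the harder the robot turns back toward it.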
