App Development / Distance Learning: SpriteKit

It has been far too long, my friends. In challenging times like these, it's all the more important to help each other out.

We have been running classes for students at home for the past six weeks here in Vietnam. I want to share the activities I've been doing, along with the resources behind them, as you look for ideas to use with your own classes.

The first is App Development. This course is meant to be an open workshop class where students learn to code in Swift and build projects in Xcode and Swift Playgrounds. Needless to say, doing this remotely has been difficult. I am used to running around the classroom and helping students resolve issues, which is much harder when we are not in the same room.

The most frequent mode of operation for class is the try-tweak-reflect cycle. I give students some code that does something, ask them to change something in the code, predict what will happen, and then reflect on the result. They do this a few times until they end up with something that is uniquely their own.

SpriteKit Activities

SpriteKit is a framework on iOS that lets you create games. The best way to work with it right now is the Swift Playgrounds app, which is available for iPad and macOS. The app itself has some phenomenal activities that are ready to hand to students and get them learning from the start. If you want a bit more directed learning related to SpriteKit, read on: here are some of the activities I've done with students. Feel free to use any of it.

I created a video that walks my students from start to finish through creating a simple SpriteKit app that moves a block around the screen with a tap. It was recorded entirely in the iPad Swift Playgrounds app using the screen recording feature.

SpriteKitStartToFinish (YouTube)

This next video leads to something a bit more sophisticated: numbered circles falling and bouncing around the screen.

AppDevelopment - Playing with Classes (YouTube)

To follow along, download this playground to your computer for use in Swift Playgrounds or Xcode.

After taking students through a few activities like this, I gave them a design challenge. Before we closed school, my plan was to have students build a game to spread awareness of an issue related to the UN sustainable development goals, or SDGs. Here is the description of the project I gave them. (Link here)

Students created some really neat stuff. It took a lot of remote debugging and use of Chrome remote desktop in some cases to make this work, but it was worth it in the end. Here are some highlights of student work:

Write me if you have any questions on how you might use this yourself or with students. We are in this together.

Color Following Robot - RVR, Adafruit Metro, Pixy2

One of the fundamental challenges in maker education right now is the reality that there are many different interesting tools available, but not all of them talk to each other. A notable change, however, is that many of them use open source software that makes it possible to understand how each one works. All that remains is connecting the dots.

About a year ago, I learned about the Pixy2 camera platform. This is an impressive device that can easily learn colors and deliver coordinates of those colors in a camera image through a number of different protocols. The example code is primarily provided through a series of Arduino projects. After working through some Arduino projects with my robotics class a year ago, I decided that barebones Arduino isn't the way to get students excited about programming microcontrollers. There are way too many semicolons, error-prone libraries, and finicky elements of the C++ language that get in the way of students being able to make things on their own. That said, the Pixy2 is incredibly versatile. It tracks colored blobs quickly and accurately.

Over the past year, I've also spent time in the CircuitPython ecosystem. The Adafruit team has done an amazing job of documenting their extensive line of boards (such as the Metro and Feather Express lines) and providing resources through CircuitPython to use these out of the box.

Another release last October was the Sphero RVR. The first project I gave students a year ago was to build a motorized car using VEX motor controllers, a motorized platform, and an Arduino board. That was a lot more trouble than I expected, especially given that I wanted students to then use the cars they built with a Pixy2; the platform was flimsy, and I should have seen that from miles away. The Sphero RVR, by contrast, is packed with strong motors, a robust sensor assembly, and a solid, hackable frame upon which other things can be mounted. It drives quickly and reliably, and that's after only a few months on the market. Commands can be sent to it over a serial port to make it drive. As of now, the API is limited to driving at a given heading and speed, raw motor power control, and flashing LEDs.

I wanted to demonstrate some concepts of control systems to my high school students, and I had a number of Pixy2 cameras available from last year's experiment. I had also decided that CircuitPython on the Adafruit boards was the way to go for teaching students embedded microcontroller programming. It was time to combine these three resources to build a colored-object-following robot.

After a lot of experimentation, I eventually ended up with the code linked below. The Pixy2 code is a modification of Robert Lucian's Python port of the API, written (I think) for use with a Raspberry Pi. I adapted it to use the CircuitPython I2C functions, though it does not cover the complete feature set. The Sphero Python API is made for a micro:bit running MicroPython, so I swapped out its MicroPython serial port code for the CircuitPython serial port on the Metro board. To get everything to fit on the Metro, I had to comment out the parts of both the Pixy2 code and the Sphero code that I wasn't using. I'm still not convinced memory was actually the problem, since there appeared to be plenty free when I checked, but MemoryErrors kept popping up until the unused code was commented out.
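The serial-port swap is essentially an adapter: a small class that exposes the read/write interface the Sphero library expects, but delegates to a different UART underneath. Here is a rough sketch of the idea in plain Python. Every name here is hypothetical, for illustration only; the actual Sphero and Adafruit interfaces differ in their details:

```python
# Sketch of the adapter idea: the Sphero library talks to a serial object
# through read/write calls, so we hand it an object backed by whatever
# UART we actually have. All class and method names are hypothetical.

class FakeUART:
    """Stand-in for a hardware UART, so the sketch runs off-device."""
    def __init__(self):
        self.sent = bytearray()      # bytes the robot would receive
        self.incoming = bytearray()  # bytes the robot would send back

    def write(self, data):
        self.sent.extend(data)
        return len(data)

    def read(self, nbytes):
        chunk = self.incoming[:nbytes]
        del self.incoming[:nbytes]
        return bytes(chunk)


class SerialAdapter:
    """Exposes a MicroPython-style serial interface, delegating to a
    CircuitPython-style UART underneath."""
    def __init__(self, uart):
        self._uart = uart

    def write(self, data):
        return self._uart.write(bytes(data))

    def read(self, nbytes=1):
        # MicroPython serial reads return None when no data is waiting
        data = self._uart.read(nbytes)
        return data if data else None


uart = FakeUART()
port = SerialAdapter(uart)
port.write(b"\x8d\x0a")  # pretend this is a command frame for the robot
```

With this pattern, the rest of the library code never has to know which UART implementation it is talking to.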

Here is a GitHub repository with the code.

I'm sure this isn't as efficient as it could be, but I wanted to demonstrate how simply information from a camera can drive a control system for steering a robot, and to help students understand a proportional algorithm as a control system. My hope is that students will apply this to autonomously steering a robot as part of this year's FIRST Robotics Competition game, Infinite Recharge.
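To make the proportional idea concrete, here is a stripped-down sketch of such a control loop in plain Python. The frame width, gain, and speed constants are illustrative placeholders, and the function stands in for the real Pixy2 and RVR calls, which look different in the actual project code:

```python
# Proportional steering sketch: turn toward a colored blob in proportion
# to how far it sits from the center of the camera image.
# FRAME_WIDTH, KP, and BASE_SPEED are made-up illustrative values.

FRAME_WIDTH = 316           # approximate width of a Pixy2 frame in pixels
CENTER_X = FRAME_WIDTH / 2
KP = 0.5                    # proportional gain: bigger error, harder turn
BASE_SPEED = 60             # forward speed when the blob is dead center

def steering(blob_x):
    """Return (left_speed, right_speed) for a blob at x-coordinate blob_x."""
    error = blob_x - CENTER_X   # negative: blob is left of center
    turn = KP * error
    left = BASE_SPEED + turn    # blob to the right -> left side speeds up
    right = BASE_SPEED - turn   # ...so the robot turns toward the blob

    # Clamp to the motor command range so large errors don't overflow
    def clamp(v):
        return max(-255, min(255, v))
    return clamp(left), clamp(right)

print(steering(CENTER_X))   # blob centered: both sides drive equally
print(steering(0))          # blob far left: right side faster, robot turns left
```

The whole control system is one subtraction and one multiplication per frame, which is exactly what makes it a good first example for students.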