The Transterpreter Project

Concurrency, everywhere.


Firmware... as user code!

NOTE: We will be unleashing the things you read about here on students at Olin College very soon. We'll bundle up a release for general consumption (as well as the documentation that we alpha test on the Olin students) Real Soon Now(TM).


I've been experimenting with the new firmware for the Surveyor SRV-1b that Carl and Jon have done a lot of excellent work on. Here's something wacky: what used to be the "firmware" can now be executed as "user code".

I'll get there in a few steps. Consider this piece of test code that Jon wrote:

PROC tests (CHAN BYTE in?, out!, CHAN P.LASER lasers!, 
            CHAN P.LED leds!, CHAN P.MOTOR motors!)
  SEQ
    out.string("SRV-1 Test Program (of Doom)*n", 0, out!)
    out.string("Testing death lasers*n", 0, out!)
    test.lasers(lasers!)
    out.string("Testing less deadly LED*'s*n", 0, out!)
    test.leds(leds!)
    out.string("Testing harmless motors*n", 0, out!)
    test.motors(motors!)
:

This is the main process from Jon's test program. The parameters coming into the process header are channels to and from the environment. In this case, the environment is the Surveyor, so we have channels for the serial communications over the WiFi radio (those are the channels 'in' and 'out'), and there are output channels for the lasers, the LEDs, and the motors. Each of these is named as you might expect. To run this on the Surveyor, we compile it, upload the bytecode, and things just go.
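
Just to emphasize how little ceremony is involved: stripped to its essentials, a "user program" is nothing more than a PROC with that header. Here's a minimal sketch of my own (assuming the same includes, top-level wiring, and out.string helper as Jon's test program; it only talks to the serial channel and ignores the rest of the environment):

-- Minimal sketch of a user program with the same header shape as
-- Jon's test program. Only the serial output channel is used; the
-- laser, LED, and motor channels are accepted but ignored.
PROC hello (CHAN BYTE in?, out!, CHAN P.LASER lasers!,
            CHAN P.LED leds!, CHAN P.MOTOR motors!)
  out.string("Hello from user code on the SRV-1*n", 0, out!)
: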

But what's great is that this program, although it doesn't do much, is not really different from the firmware that used to be written in C. The original firmware would handle commands from the SRV Console, and then spit images back, drive around, and do whatever else you commanded your mobile wireless camera platform to do. That is, all the default firmware did was respond to commands over a textual protocol. (There was a C interpreter, too, but my point is that you used a textual protocol to initiate all kinds of things.) Given that we have a channel of bytes representing the textual input coming over the radio, it seems like we could implement the old protocol completely in occam-pi.

And that's what Carl and Jon have done. They've implemented what used to be firmware as a program that any user can now write and upload to the Surveyor. The program srv1.occ can be compiled, uploaded to the Surveyor, and executed as a user program, even though it implements the entire SRV-1 protocol; it is a "user program" that does the job of the "old firmware." If we want to kill this "new firmware," we issue a '!', and it shuts down cleanly. Then we can upload a newer version, or perhaps a completely different program.
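
For flavour, here's my own sketch of the shape such a program takes: a loop over the incoming byte channel that dispatches on command characters and stops on '!'. The real srv1.occ is, of course, far more involved; the PROC name is mine and the "unrecognised command" reply is just a placeholder, not the real protocol's response.

-- Sketch of a protocol loop: read command bytes from the radio,
-- dispatch on them, and shut down cleanly when '!' arrives.
PROC protocol.loop (CHAN BYTE in?, out!)
  INITIAL BOOL running IS TRUE:
  WHILE running
    BYTE cmd:
    SEQ
      in ? cmd
      IF
        cmd = '!'
          running := FALSE        -- clean shutdown on '!'
        TRUE
          -- the real srv1.occ dispatches on cmd here: grab an image,
          -- drive the motors, fire the lasers, and so on
          out.string("unrecognised command*n", 0, out!)
: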

This has its tradeoffs. For example, it means that my SRV-1 doesn't wake up ready to send me images. On the other hand, it does mean that I can easily modify and extend the firmware, including extending the protocol or (because occam-pi is a parallel-safe language) running my own additional code alongside the original firmware. I can filter the channel carrying the commands (perhaps ignoring every other request for an image, or drawing on the images to add data to them), and so on. Over time, we'll probably end up refactoring the "firmware" into a bunch of reusable components that program authors can selectively include to get parts of the original firmware's behavior in their own programs. (We'll see... I haven't given this much thought yet, but perhaps Carl and Jon have.)
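
To make the filtering idea concrete, here's a sketch of a process that could sit between the radio and the protocol handler and drop every other image request. I'm assuming 'I' is the image-grab command (check the protocol before trusting me on that), and the process and channel names are mine:

-- Sketch of a command filter: pass everything through, but drop
-- every other image request. 'I' is assumed to be the image-grab
-- command in the SRV-1 protocol.
PROC drop.alternate.images (CHAN BYTE from.radio?, to.firmware!)
  INITIAL BOOL pass.this.one IS TRUE:
  WHILE TRUE
    BYTE b:
    SEQ
      from.radio ? b
      IF
        b = 'I'
          SEQ
            IF
              pass.this.one
                to.firmware ! b   -- forward this image request
              TRUE
                SKIP              -- quietly ignore this one
            pass.this.one := NOT pass.this.one
        TRUE
          to.firmware ! b         -- everything else passes straight through
:

Because everything talks over channels, dropping a process like this into the pipeline shouldn't require touching the rest of the code.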

This is still evolving rapidly, but it is an absolute joy to be able to easily modify my occam-pi code, send it over the WiFi to the Surveyor, and get completely new, low-level behavior without having to go through a lengthy reflashing of the bot over JTAG (or, worse, WiFi).

As can be seen above (sorta), I've begun to explore drawing on the images before shipping them from the SRV-1 back to the user. I believe this is important for students exploring robotic vision, as they need a way to indicate what they are looking for; drawing onto the image strikes me as a very simple way for their code to communicate that "this looks important!". It might be that they're doing edge detection, or blob finding, or any of a host of other things. My code doesn't do anything exciting yet, but tomorrow is another day; more excitement will ensue.
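
As a sketch of what "drawing on the image" means at the pixel level, imagine the frame as a row-major array of greyscale bytes; marking a region of interest is then just writing into that array before the image is shipped back. The real SRV-1 image handling is certainly more involved than this, and the procedure name is mine:

-- Sketch only: draw a horizontal line into a hypothetical greyscale
-- frame stored row-major as width * height bytes, marking a region
-- of interest before the image goes back to the user.
PROC draw.h.line ([]BYTE frame, VAL INT width, x0, x1, y, VAL BYTE shade)
  SEQ x = x0 FOR ((x1 - x0) + 1)
    frame[(y * width) + x] := shade
: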

Nicely done to the UK team! Wootness.



Metadata

  • Posted: February 6, 2008
  • Author: Matthew Jadud