Archive for the ‘Programming’ Category

A Light Diversion


In the last days of Radio Shack, I was in a store on Michigan Avenue when I spotted, buried amongst the disassembled shelving units and discarded phone cases, a small red box that turned out to be an Arduino-based soldering project: the 2770158 LED Cube. I bought it for something like $5, took it home, and promptly put it on the shelf as a project I’d ‘get to’ at some point.

The honest truth is that I was somewhat intimidated by the soldering; it’s a 3x3x3 cube of LEDs soldered together, the lights were smaller than I was expecting, and after looking at some pictures of the final result, I resigned myself to likely screwing it up, at best hoping that I might learn something from what I assumed would be a complete failure. So I somehow justified to myself that, in order to not waste my $5, I shouldn’t actually try to make the thing I spent $5 on.

At some point I hit myself with a clue-by-four and realized the stupidity of my situation: accept the possible loss of the $5 and actually try, instead of fretting about what-ifs. So I took the kit to PS:1, sat down in the Electronics area, got out the soldering iron and magnifying glass, and went to work. It took a couple of hours, and I was certain, absolutely positively certain, that, even though it looked right, there was no chance I had actually gotten the leads all wired together correctly, especially the ones in the middle that were extremely hard to reach with the big tip of the soldering iron. Okay, well, the only thing left was to actually plug it into the Arduino Uno I had, load up the sample sketch (available in the RS GitHub repo), and see what happened.

I fired up the Arduino IDE, loaded the sample sketch, hit upload, and all of a sudden the lights came on as it started through the canned routines. I was initially skeptical, checking every single light to see which one was never lighting up, and then it dawned on me that I had actually done it: all the lights actually lit up as part of the demo routine, and HOLY CRAP I MADE A THING AND IT WORKED!!!!1111

And then in my excitement I dropped it, ripping the USB cord from the Arduino, and it landed lights-down on the floor. Well, of course I did. Of course I broke it, right? But as I checked the connections, nothing had come loose; there were no broken joints. I plugged the Arduino back in, and sure enough, it happily came back to life and started going through the routine. Whew!


So I resolved to make this truly my own; running a demo program that I didn’t write was not ‘finishing the job’. I remembered the QBasic ‘Snake’ program that drew a line bouncing around the screen, hitting the edge and then randomly turning and going off in another direction. Ah, but this is a cube, in threeeee deeeeeeee, so the challenge would be that much more interesting, especially as I resolved to sit down and actually try to implement it without any help from the Internet: a three-dimensional matrix of lights, translated into C++.

This is where I remembered a line from Top Gun that went something along the lines of “our pilots had become dependent on missiles” as a reason for the loss of dogfighting ability. (And then I got that Everly Brothers song stuck in my head.) Well, after writing C++ for years, I had become dependent on the containers provided by the Standard Template Library (map, vector, etc.). While the Arduino is programmed using C++, it’s really a pretty small subset of C++ (which sort-of-kinda-not-really makes sense) and the STL is not available; go ahead and #include <map> all you like, all the compiler’s gonna do is complain. So I knew I’d have to regain some amount of dogfighting capability and do all the array/matrix work in pure C. I decided the best way to keep myself honest and regain some of the skills I think I used to have was to create a C file in Vim (using Emacs always made me angry, straight-up I hate this, whatever this is), write the program, save, compile, and run straight from the terminal prompt. Again and again and again.

One of the biggest problems was forcing myself to get past the ‘sheesh, this woulda been easy with <insert some STL thing> here’ and just focus on getting the values into the right cells of the matrix. It took a few hours to get the algorithm right, but pretty soon I had it spitting out numbers that seemed right. But how was I gonna know that it was right?
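Stripped of the C bookkeeping, the algorithm is small: advance the head one cell along the current direction, and whenever the move would leave the cube, pick a new random axis and sign until a legal move is found. Here is a sketch of that idea in Python (the actual program was plain C with fixed-size arrays; this is only an illustration of the approach, not the original code):

```python
import random

SIZE = 3  # 3x3x3 cube

def step(pos, direction):
    """Advance the snake head one cell; when it would leave the cube,
    pick a new random axis and sign until the move is legal."""
    x, y, z = (pos[i] + direction[i] for i in range(3))
    while not all(0 <= c < SIZE for c in (x, y, z)):
        # hit a wall: choose a new direction along a random axis
        direction = [0, 0, 0]
        direction[random.randrange(3)] = random.choice([-1, 1])
        x, y, z = (pos[i] + direction[i] for i in range(3))
    return (x, y, z), direction

# walk a few steps starting from the center of the cube
pos, d = (1, 1, 1), [1, 0, 0]
for _ in range(10):
    pos, d = step(pos, d)
```

Every step stays inside the cube by construction, which is exactly the property that was hard to eyeball from a column of printed coordinates.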

This is where I decided to make a quick diversion and build a virtual version of the matrix in OpenSCAD:


Using this model, I could walk through the output of the program and verify that the snake was truly moving correctly around the matrix. I rotated the model around, checking that the numbers were right and HOLY CRAP I MADE ANOTHER THING THAT WORKED!


The last thing to do was to actually get the program to work with the LEDs. This is where the spartan documentation of the original Radio Shack code became a problem; the sketch did a passable job of explaining how the lights were addressed, but the examples were all arrays of pre-baked values without anything dynamic, and my program was all dynamic. I studied how the demo program worked, started fiddling with the values, and discovered how to set the bits in the right way to turn on individual lights on specific levels. From there I modified my C program and added some code to translate my positioning, which turned out to be the mirror opposite of the way the lights are addressed; I solved that problem by physically turning the Arduino around so I was looking at the other side. Problem solved!
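I won’t reproduce the Radio Shack sketch’s exact register layout, but the general shape of the trick is easy to show: treat each level of the cube as a small bitmask, one bit per column. A hedged illustration in Python (the bit ordering here is made up for the example, not the cube’s actual wiring):

```python
def set_led(frame, x, y, z, on=True):
    """Set one LED in a frame stored as a list of three 9-bit level
    masks: z selects the level, x/y select the column within it."""
    bit = 1 << (x + 3 * y)   # illustrative bit order, not the RS wiring
    if on:
        frame[z] |= bit
    else:
        frame[z] &= ~bit
    return frame

frame = [0, 0, 0]                # one mask per level, all LEDs off
set_led(frame, 2, 1, 0)          # light the LED at column (2,1), bottom level
```

Translating a dynamic (x, y, z) position into the right bit of the right mask is the whole job; mirroring comes down to flipping one of the coordinate axes (or, as above, turning the Arduino around).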

I uploaded my sketch to the Arduino and suddenly the lights were lighting up in what appeared to be a snake moving around the matrix. HOLY CRAP I GOT IT TO WORK!!!!!!11111


This is a long post for what amounts to a small light toy, but whilst I was feeling rather verbose (a consequence of sitting and waiting for an unrelated program to finish), I can’t emphasize enough how foolish I feel for not starting all this earlier; fear of failure is a very, very powerful emotion, and if there’s a TL;DR in here somewhere, it’s that it is always better to try and fail than to never try at all, which is something PS:1 has done a very good job of teaching me.


The code is available at


08 2016

Configuring Pi 3 with a TFT touchscreen and GPIO buttons

A while back I built a pair of sound-reactive LED towers, which have been on display a few times at local Chicago events.

Plugging into the DJ mixer output required a relatively expensive device to get the stereo signals (with external gain control) into a computer as a microphone input.  It was also a relatively bulky affair to have yet another laptop sitting next to the mixer when the artist wasn’t using it to make the music.

Recently I discovered that, starting with the 3.1 release, Processing can run on a Raspberry Pi and has a built-in ability to manipulate the GPIO ports.  This blog entry highlights the release.  This is exciting news, as the laptop can now be replaced by a Pi 3 with a small touchscreen.

Here is a video of the light towers in action.

The towers each have 8 panels with 60 individually addressable RGB LEDs.  These pictures show off more of the visual aesthetic, and the cutaway view reveals something of the construction.  The LED strips are hidden in a channel in the wooden supports and side-illuminate acrylic panels (backed with just a strip of white cardboard).  The acrylic is impregnated with a diffuser which redirects the edge-lit light 90 degrees to exit out of the faces of the panels; the white cardboard reflects the half of the light that would otherwise be directed inwards.  The acrylic is produced by Evonik and is called Acrylite EndLighten.  The towers themselves only require 110 VAC power.  The data frames that control the LED strips are sent wirelessly from the Processing script using an Open Pixel Control module, which maps points on the Processing screen into frames sent to a Fadecandy server running inside an OpenWRT WiFi device that is physically connected to a Fadecandy board.  I used TP-Link TL-MR3040 WiFi devices to run OpenWRT and added the Fadecandy server application into the img file used to reflash them.  The Fadecandy Git repository can be found here.
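The Open Pixel Control wire format itself is pleasantly simple: a 4-byte header (channel, command 0 for “set pixel colours”, and a 16-bit big-endian payload length) followed by one RGB triple per LED. A minimal packet builder, sketched in Python (the real project uses the Processing OPC client; this is only to show the format):

```python
import struct

def opc_packet(channel, pixels):
    """Build an Open Pixel Control 'set pixel colours' message.
    pixels is a list of (r, g, b) tuples, one per LED."""
    data = bytes(c for rgb in pixels for c in rgb)
    header = struct.pack(">BBH", channel, 0, len(data))  # command 0
    return header + data

# one 60-LED strip, all dim white
pkt = opc_packet(0, [(16, 16, 16)] * 60)
```

Each frame the sketch sends is just one of these packets; the Fadecandy server takes care of dithering and pushing the data out to the strips.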


This is the assembled Raspberry Pi 3 w/ 2.8″ TFT Capacitive Touchscreen mirroring the HDMI frame buffer in a Zebra case without the top cover.


There were a lot of possible paths to follow in getting this build working the way I wanted.  Most of my Google searching turned up outdated examples, particularly due to the changes introduced in the 4.4 kernel with /boot/config.txt’s use of overlays.  Adafruit had this very nice looking tutorial on getting the touchscreen working with their version of the Raspbian Jessie OS image.  The inclusion of how to use FBCP was of particular interest, as mirroring the HDMI output is important for displaying Processing sketches that use the 2D or 3D graphics libraries.  Their Raspbian image was based on an older kernel, however, and updating the OS (sudo apt-get update; sudo apt-get dist-upgrade) turned out not to just work.

After much tinkering, these are the steps that worked for me.  (Note that I was working with OS X 10.11.)

  1. Download the latest Raspbian Jessie image here.
  2. Extract the .img file using “The Unarchiver” as opposed to the built-in “Archive Utility”, as I saw many comments that the default app caused issues.
  3. I chose a 32GB Samsung EVO Plus (model MB-MC32D) micro SD.  It has a red background.
  4. Flash the SD card with the extracted image file.  Instructions for doing this can be found easily.  I used the following procedure:
    • open a terminal window and change to the directory with the extracted image file
    • $ diskutil list
    • note the device path of the SD card (eg: /dev/disk4)
    • unmount the SD card, replace disk4 with what was discovered in the previous step
    • $ diskutil unmountDisk /dev/disk4
    • flash the SD card, again update rdisk4 and also make sure the if= filename is correct
    • $ sudo dd if=./2016-05-27-raspbian-jessie.img of=/dev/rdisk4 bs=1m
    • this will take at least 5 minutes to complete, but it is possible to see some status without interrupting the transfer by pressing ctrl-t
    • exit the terminal window and eject the SD card
  5. Insert the SD card into the Pi and hook it up to an HDMI monitor.  You will need a keyboard and mouse as well.
  6. Open a terminal window
    • disable power management for the onboard WiFi module for stability
    • $ sudo nano /etc/network/if-up.d/wlan0
      iwconfig wlan0 power off
    • $ sudo chmod +x /etc/network/if-up.d/wlan0
    • $ sudo raspi-config
    • select Expand Filesystem and reboot
  7. Configure the WiFi as usual from icon at the top of the desktop
  8. Open a terminal window
    1. install updates
    2. $ sudo apt-get update
    3. $ sudo apt-get dist-upgrade
    4. install build utility
    5. $ sudo apt-get install cmake
    6. fetch the FBCP source, compile and install
    7. $ git clone
    8. $ mkdir ./rpi-fbcp/fbcp/build
    9. $ cd ./rpi-fbcp/fbcp/build
    10. $ cmake ..
    11. $ make
    12. $ sudo install fbcp /usr/local/bin/fbcp
    13. configure the touchscreen by uncommenting, changing or adding the following config entries
    14. $ sudo nano /boot/config.txt
      # match console size
      # force 640x480 VGA on HDMI
      # 2.8" Capacitive 320x240 Touchscreen
    15. expose touchscreen events
    16. $ sudo nano /etc/udev/rules.d/95-ft6206.rules
      SUBSYSTEM=="input", ATTRS{name}=="ft6236", ENV{DEVNAME}=="*event*", SYMLINK+="input/touchscreen"
    17. select an easier to read console font
    18. $ sudo dpkg-reconfigure console-setup
      • UTF-8
      • “Guess optimal character set”
      • Terminus
      • 6×12 (framebuffer only)
    19. remove the extra GLES library (see this issue)
    20. $ sudo aptitude remove libgles2-mesa
    21. install processing
    22. $ curl | sudo sh
    23. disable auto monitor-off
    24. $ sudo nano /etc/lightdm/lightdm.conf
      xserver-command=X -s 0 -dpms
  9. Reboot

The touchscreen should now display the 640×480 desktop scaled down to the 320×240 PiTFT screen.  This makes things look less crisp, but has the advantages that connecting an external HDMI display will still work and that most apps get the larger dimensions they need to be usable.  Note that many HDMI displays will not be able to handle a 320×240 HDMI signal.

FBCP stands for frame buffer copy; it rescales and mirrors the HDMI framebuffer (/dev/fb0) onto the PiTFT framebuffer (/dev/fb1).
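Conceptually the copy is just a rescale between two framebuffers; a nearest-neighbour 640×480 to 320×240 downscale amounts to sampling every other pixel in each direction. A toy Python illustration of the idea (fbcp itself does this on the GPU side, not in a loop like this):

```python
def downscale_2x(fb, width, height):
    """Nearest-neighbour 2:1 downscale of a row-major framebuffer
    (one value per pixel): the idea behind mirroring fb0 onto fb1."""
    return [fb[y * width + x]
            for y in range(0, height, 2)
            for x in range(0, width, 2)]

src = list(range(640 * 480))          # stand-in for the HDMI framebuffer
dst = downscale_2x(src, 640, 480)     # stand-in for the 320x240 PiTFT
```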

The version of the 2.8″ PiTFT I got from Adafruit comes with 4 buttons, and I created this test Python script to demonstrate not only how to use the RPi.GPIO library, but also how to manipulate the PiTFT backlight (so as to not burn in the screen), use multi-threaded event handlers, and shut down the OS to safely disconnect the power.

I created this script as: /home/pi/  (chmod +x to make it executable) and test it by typing ./  Note that pressing the bottom right button (#27) will ask for authentication before powering off the Pi.  See below for setting this script up as a service, in which case the user will not be asked for authentication.

#!/usr/bin/env python2.7

# example code tested with Pi 3
# Raspbian Jessie (4.4 kernel):
# Adafruit 2.8" Capacitive Touchscreen:
# for running on startup see:
# make sure to update the ExecStart= entry in the Adafruit script after copying from the example

import subprocess
import time
import RPi.GPIO as GPIO

# list of BCM channels for RPi.GPIO (printed on the Adafruit PCB next to each button)
channel_list = [17, 22, 23, 27]
backlightOn = True

# event handler to toggle the TFT backlight
def toggleBacklight(channel):
    global backlightOn
    if backlightOn:
        backlightOn = False
        backlight.stop()        # turn the TFT backlight off
    else:
        backlightOn = True
        backlight.start(100)    # back on at full brightness

# event handler to manage button presses
def buttonEvent(channel):
    startTime = time.time()
    while GPIO.input(channel) == GPIO.LOW:
        time.sleep(0.01)        # wait for the button to be released
    print "Button #%d pressed for %f seconds." % (channel, time.time() - startTime)

# event handler to manage Pi shutdown
def poweroff(channel):
    startTime = time.time()
    while GPIO.input(channel) == GPIO.LOW:
        time.sleep(0.01)        # wait for release, timing how long it was held
    if (time.time() - startTime) > 2:["poweroff"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)

# initialize GPIO library
GPIO.setmode(GPIO.BCM)          # use BCM channel numbers, as printed on the PCB
GPIO.setup(channel_list, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup(18, GPIO.OUT)
backlight = GPIO.PWM(18, 1000)  # 1 kHz PWM for the TFT backlight
backlight.start(100)            # start with the backlight fully on

print "Button #17 exits."
print "Button #22 toggles the TFT backlight."
print "Button #23 displays the time the button is pressed."
print "!!! Pressing button #27 for at least 2 seconds, powers down the Pi !!!"

GPIO.add_event_detect(22, GPIO.FALLING, callback=toggleBacklight, bouncetime=200)
GPIO.add_event_detect(23, GPIO.FALLING, callback=buttonEvent, bouncetime=200)
GPIO.add_event_detect(27, GPIO.FALLING, callback=poweroff, bouncetime=200)

# block here until the exit button fires a falling edge
    GPIO.wait_for_edge(17, GPIO.FALLING)
    print "Exit button pressed."
except KeyboardInterrupt:
    pass


# exit gracefully
GPIO.cleanup()

To install as a service, create the following unit file and enable it with systemctl enable pitft_buttons:

# pitft_buttons service file, start a daemon on startup
# file: /etc/systemd/system/pitft_buttons.service

[Unit]
Description=Start PiTFT buttons daemon

[Service]
ExecStart=/usr/bin/python -u /home/pi/

[Install]

Run a Processing sketch from a script, from a terminal window, or from an ssh session:
$ DISPLAY=:0 processing-java --sketch=/home/pi/HelloWorld --present


07 2016

100th NERP Tonite! Embedded GUIs part 2 of 2: Qt

For the 100th NERP Meetup (Yea!), we’ll have the second of a two-part discussion of embedded GUIs on small Linux devices. Qt is much more than a GUI library. Tonight Ron Olson will share some wisdom on the Qt application framework. Ron tells me that Qt promises a lot, and it delivers. Sounds good to me!

Ron says “I figured the main thrust of the talk would be Qt, how it works, its two main parts (C++, QML), and how it works with the BBB as well as a Qt project controlling an Arduino, all with an eye towards demonstrating the QML, and lightly, the C++ connection.”

After graduating from NYU’s film school, Ron took full advantage of his film and theatre background by becoming a software developer. For 24 years, Ron has been one of the developers that companies go to when they want to make their customers’ lives worse; he helped write the system to show commercials at MTV, worked on cancelled projects at an animation studio that went out of business, pioneered allowing you to lose your retirement savings on the web at Bankers Trust, came up with new ways to target ads to you at DoubleClick, did his part in the financial crisis at Goldman Sachs, and lately has been writing software to help your attorney remember when your trial date is.
Mostly in C++ and Java.

NERP is not exclusively Raspberry Pi, the small computer and embedded systems interest group at Pumping Station:One in Chicago. NERP meets every other Monday at 7pm at Pumping Station:One, 3519 N. Elston Ave. in Chicago. Find NERP and Pumping Station:One at


Doors open at 6:30pm. NERP is free and open to the public.
Ed Bennett ed @ kinetics and electronics com
Tags: electronics, embedded, NERP, Open Source, raspberry pi, hackerspace, Beagle Bone, Pumping Station One, programming, Qt


06 2016

X,Y,Z Finder for the ShopBot

The PS:One ShopBot is a great CNC machine that has the benefit, among other things, of being huge, allowing for a lot of cuts on large pieces of material. One of the difficulties of working with the machine, however, is getting the bit at exactly 0,0,0 in the X, Y, and Z axes, so that if you need something cut at exactly six inches from the edge of the material, it will be exactly six inches. There is already a built-in method for setting the Z axis, using a metal plate and clip and running a specific program on the ShopBot, but there is no such program for setting the X and Y, requiring the user to manually position the bit. This can lead to inaccuracies and wasted work.

To help everyone with accurate setting of the X, Y, and Z axes, I made a thing:

The front of the plate, looking down on a test piece of wood for calibration


This is an aluminum plate milled to be as precise as I could make it (read: probably a lot of room for improvement). It sits on the lower left-hand corner of the piece to be cut, with the corner of the work sitting directly in the middle of the circle.

Side view of the plate

With the plate placed on the work, the cable is plugged into the back (I had originally drilled two holes on the front left and bottom of the plate, forgetting that is where the bit has to touch so as to not push the plate off the work, so I drilled a new hole on the back and wrote “Do not use this hole” on the other two) and attached via the alligator clips (ToDo: make a better cable) to the Z plate.

The cable connects the XYZ plate to the Z plate that comes with the Shopbot for finding the Z axis.


The user positions the bit somewhere over the top part of the plate; exactly where doesn’t matter. The user then loads xyz-zero-finder.sbp (the code is available at this GitHub repository) into the ShopBot software and runs it. Assuming the bit is somewhere over the top, it will slowly move the bit down until it touches the top, at which point it will move to the side (visually this appears to be moving towards the front of the machine, but in reality the side of the machine with the power switch is technically the bottom, or X axis). The program will move the bit inside the circle to what it believes is exactly 0,0,0 and, after displaying a message, will move the bit up two inches to allow the user to remove the plate and put it away.
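For anyone curious about the logic rather than the OpenSBP syntax, the probe sequence boils down to: step along an axis until the bit makes electrical contact with the plate, record that position, and subtract the plate’s known offset. A toy Python simulation of the Z leg of that sequence (the step size and plate thickness here are made-up numbers for illustration; the real logic lives in xyz-zero-finder.sbp):

```python
def probe(start, touching, axis, step=-0.001):
    """Move along one axis until the (simulated) contact circuit
    closes, returning the coordinate at first contact."""
    pos = list(start)
    while not touching(pos):
        pos[axis] += step
    return pos[axis]

PLATE_TOP = 0.250   # made-up plate thickness above the work surface

# start an inch above the plate and probe down until "contact"
plate_z = probe([0.0, 0.0, 1.0], lambda p: p[2] <= PLATE_TOP, axis=2)
work_z = plate_z - PLATE_TOP   # true material top = contact minus plate thickness
```

The X and Y legs work the same way, each ending with a subtraction of the plate geometry, which is why the milling accuracy of the plate matters so much.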

The bit at the corner of the work after the plate has been removed and the bit put back to 0


The plate is in the drawer under the ShopBot in the Arduino box (ToDo: Make a real box for the plate). Feel free to use it and report back how it worked for you, so that we can make it better.

I want to thank Dean, Everett and Todd for giving me valuable advice about how to mill the plate on the Bridgeport; it was tricky because both sides of the plate are milled and getting it to sit properly in the vice was very worrying to me. I also want to thank Eric for suggesting the project in the first place.



09 2015

NERP Tonite! Roll your own firmware: The ESP-8266 Revisited

NERP is not exclusively Raspberry Pi, the small computer and embedded systems interest group at Pumping Station:One in Chicago. NERP meets every other Monday at 7pm at Pumping Station:One, 3519 N. Elston Ave. in Chicago.

The ESP8266 module has come up several times in discussions at NERP, and it keeps getting better. Tonight at NERP, Jay Hopkins will tell us about some of his recent findings as he revisits the esp8266. In Jay’s own words:

“The esp8266 is an ultra low cost module (sub $10) with an 80Mhz 32 bit processor, up to 4 MB flash memory, 100k of ram and 802.11 radio.   What sets the module apart from other ultra low cost modules is the inclusion of an 802.11 b/g/n radio and in firmware the IP stack for connectivity in the IoT (internet of things).

“At NERP we will be looking at the tools available to build firmware for the ‘8266.  MicroPython, Lua, and the Arduino IDE are all available for programming the ‘8266.”

Find NERP and Pumping Station:One at
Doors open at 6:30pm.
NERP is free and open to the public.
Ed Bennett ed @ kinetics and electronics com
Tags: electronics, embedded, NERP, Open Source,
raspberry pi, hackerspace, Beagle Bone, Element14, Pumping Station One


08 2015

Newbie Programmers’ Office Hours (NPOO)

Officially announcing the creation of Newbie Programmers’ Office Hours! This will be like PYOO, but specifically with a focus on beginning programmers. We are language agnostic.


Please bring a laptop and we will try to help each other with projects and tutorials. If you don’t know what to work on, we will give you a suggestion from our resources page on the wiki.

For experienced programmers: you are welcome too!

More info here:


When: Every Saturday at 7 PM
Where: Upstairs in the Electronics Lab


08 2015

NERP Tonite: Pingo means “pin, go!”

NERP is Not Exclusively Raspberry Pi, the small computer and
embedded systems interest group at Pumping Station:One in
Chicago. NERP meets every other Monday at 7pm at Pumping
Station:One, 3519 N. Elston Ave. in Chicago.

Luciano Ramalho is a member of Garoa Hacker Clube in Sao Paulo,
Brazil (
Tonight at NERP, Luciano will tell us about the Pingo
project in progress at Garoa HC
( Pingo aims to make
interconnecting small controllers of all sorts easy and
transparent, so that they can use each other’s peripherals. An
example use case would be using Python on a Beagle (or similar)
to effectively “program” one or more attached Arduinos.

From the website:
“Pingo provides a uniform API to program devices like the
Raspberry Pi, BeagleBone Black, pcDuino etc. just like the
Python DBAPI provides a uniform API for database programming in Python.

The API is object-oriented but easy to use: a board is an
instance of a Board subclass. Every board has a dictionary
called pins which lists all GPIO pins on the board. Each pin is
an instance of a Pin subclass with attributes that you can
inspect to learn about its capabilities.”
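That description is enough to sketch the shape of the API with a toy mock (this is only an illustration of the structure in the quote, not the real pingo classes):

```python
class Pin:
    """A minimal stand-in for pingo's Pin: a location plus
    capabilities you can inspect."""
    def __init__(self, location, capabilities=("in", "out")):
        self.location = location
        self.capabilities = capabilities
        self.mode = None

class Board:
    """A minimal stand-in for a pingo Board subclass: every board
    carries a dictionary called pins listing its GPIO pins."""
    def __init__(self, locations):
        self.pins = {loc: Pin(loc) for loc in locations}

# inspect a board the way the quote describes
board = Board(range(1, 27))
pin = board.pins[13]
pin.mode = "out"
```

The point of the design is that code written against Board and Pin doesn’t care whether the pins live on a Raspberry Pi, a BeagleBone Black, or an attached Arduino.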

Find NERP and Pumping Station:One
Doors open at 6:30pm. The next meeting is July 7th, 2014.
NERP is free and open to the public.
Ed Bennett ed @ kinetics and electronics com

Tags: electronics, embedded, NERP, Open Source, raspberry pi,
hackerspace, BeagleBone, Element14, Pumping Station One


07 2014

NERP June 23rd – BeagleBoard Project co-founder Jason Kridner



NERP is Not Exclusively Raspberry Pi, the small computer interest group at Pumping Station:One in Chicago. NERP meets every other Monday at 7pm at Pumping Station:One, 3519 N. Elston Ave. in Chicago.

Jason Kridner, BeagleBoard project co-founder, community manager, and software cat herder will speak at NERP on June 23. His topics will be “JavaScript on BeagleBone” and “Real-time programming with BeagleBone PRUs”. As part of his job at Texas Instruments, Jason provides support and development of the project. He is also a member of hackerspace i3 Detroit.

The BeagleBone Black is the most recent in a series of single board Linux computers created by the folks behind The BeagleBone is designed for educators, designers, makers, and hackers. The BoneScript language, based on JavaScript, reaches out to Web software developers who want to get out of the box. Being a Linux (plus Android and others) computer, the BBB natively runs Python, C++, and the usual suspects. There is also a growing ecosystem of hardware add-on “capes” that shield the user from the complexity of developing interface devices.

Some quick specs on the BeagleBone Black rev C: Texas Instruments Sitara® 32-bit ARM core CPU @ 1GHz, 512MB SDRAM, accelerated HDMI, Ethernet, USB, 69 (max) GPIO, and a host of I/O peripherals directly accessible from onboard headers. Dimensions 3.4”x2.1”, weight 1.4 oz., Debian Linux pre-installed, price $55. Availability: everywhere; Element14 is a good place to look

PLEASE NOTE: For this special NERP, we will keep introductions short and start promptly at 7pm. Please use the meetup (below) to confirm your attendance!

Find NERP and Pumping Station:One

Doors open at 6:30pm. The next meeting is June 23rd, 2014.
NERP is free and open to the public.

Ed Bennett ed @ kinetics and electronics com

Tags: electronics, embedded, NERP, Open Source, raspberry pi, hackerspace, BeagleBone, Element14, Pumping Station One


06 2014

NERP tonight! Eric Stein: RPi hardware i/o and ZeroMQ

NERP is not exclusively Raspberry Pi, the small computer and embedded control interest group at Pumping Station:One in Chicago.

Tonight, Eric Stein, Chief Cat Herder and president of Pumping Station:One, will take time out of his busy schedule to show us a Raspberry Pi- and Python-based system that sends messages by passing messages.

Eric’s system, which is currently in development, receives input from IRC (Internet Relay Chat) and responds by playing an informational message over one of several loudspeakers located around the PS:1 facility. The Pi does text-to-speech conversion on stored messages and handles the I/O logic and control to select the appropriate audio output channel.

A very interesting aspect of the system is the use of ZeroMQ to pass control messages between the Pi and a server that does something important that Eric will explain. From Wikipedia:

“ØMQ (also spelled ZeroMQ, 0MQ or ZMQ) is a high-performance asynchronous messaging library aimed at use in scalable distributed or concurrent applications. It provides a message queue, but unlike message-oriented middleware, a ØMQ system can run without a dedicated message broker. The library is designed to have a familiar socket-style API.”
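A minimal pyzmq sketch of that brokerless request/reply pattern (the endpoint name and message text here are invented for illustration; the real system’s details are Eric’s to explain):

```python
import zmq

# brokerless request/reply: no dedicated message broker in the middle.
# inproc:// keeps the demo in one process; the same pattern works over
# tcp:// between the Pi and the server.
ctx = zmq.Context.instance()

rep = ctx.socket(zmq.REP)
rep.bind("inproc://speech")      # the "server" end

req = ctx.socket(zmq.REQ)
req.connect("inproc://speech")   # the "Pi" end

req.send_string("play: meeting in 10 minutes")
msg = rep.recv_string()          # server receives the control message
rep.send_string("ok")            # and acknowledges it
reply = req.recv_string()
```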

Find NERP and Pumping Station:One

NERP meets at 7pm 4-8-13 at Pumping Station:One, 3519 N. Elston Ave. in
Chicago. NERP is free and open to the public.

Ed Bennett
Tags: announcement, electronics, embedded, meetup, NERP, Open Source, raspberry pi

Happy Happy
-Ed Bennett


04 2013

NERP is Not Exclusively Raspberry Pi Monday 2/11 @ 7pm

This NERP, Drew Fustini will show us how to use WebIDE to program the Raspberry Pi. WebIDE is free from Adafruit. Put simply, “The Raspberry Pi WebIDE is by far the easiest way to run code on your Raspberry Pi. Just connect your Pi to your local network, and log on to the WebIDE in your web browser to edit Python, Ruby, JavaScript, or anything and easily send it over to your Pi.” Drew will use an RGB 16×2 character “Pi Plate” as the output device.

At the 1/28 NERP, we announced that the Adafruit Industries community grants people intended to send us a care package with Raspberry Pi goodies. The package arrived last week, and rather than opening it right away I thought it would be nice to share the surprise on Monday. It’ll be fun to see what kinds of ideas come out of the box and what kinds of projects they might inspire.

As always, if you have a NERP related project that you’d like to share, bring
it along!

Find NERP and Pumping Station:One

NERP meets at 7pm 2-11-13 at Pumping Station:One, 3519 N. Elston Ave. in Chicago. NERP is free and open to the public.


02 2013