Category Archives: Projects

Block Rockin’ Knits

The arts and crafts area has a sewing table with a pinnable canvas surface for pattern design, which is also great for blocking hand-knitted items.

Blocking means stretching the knitted item and steaming it to get a better shape. Acrylic, wool, alpaca, and pretty much any fiber used in a knitted item tends to roll at the edges and stay floppy until it is blocked. Factory knitwear is blocked, just like hand knits.

I discovered a new tool called blocking wires, which I used on this large lace shawl I recently completed. The wires run through the edges and are pinned in place; they need fewer pins and bring the rectangular shape under even tension faster. I steamed the whole piece with a sewing iron and could see it adjust, tightening and relaxing, along the pattern.

Many people wash an item, block it into shape to dry, and then steam it. A hat can be blocked on a large party balloon or a Styrofoam head form like the kind sold in beauty supply shops. Steam alone does a good job of giving your knitted item a crisp shape. Just be careful to hover the steam iron a few inches above the item so you don’t scorch or melt the fibers.

 


“Digit” Sensors

Knitted Finger Sensor from Jesse Seay on Vimeo.

I machine-knit these finger sleeves from a conductive yarn that changes resistance as the knit is stretched.

With this project, I wanted to design a glove that could be machine-knit for workshops cheaply and quickly, making a wearable bend sensor available to people with no textile skills.

With a range of sleeve sizes, users can select the sleeve with the best fit and resistance range for each digit. We attach flexible silicone wires with a snap press; once the sleeve is finished, the wearer uses a tapestry needle and yarn to sew the wire leads in place along a fingerless glove. Very easy!

Get your own “digit” sensor at the PS:One workshop on March 25. Details and RSVP on Meetup.  (Workshop fee: $10.)

Jenna Boyles, Kyle Werle, and Christine Shallenberg beta-tested the sensors at Pumping Station: One. They selected sleeves for fit, then stitched on the wires themselves. Kyle and Christine were able to use the sensors to control an analog synth and a Processing sketch.
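
If you’re curious how a knitted resistor like this turns into numbers, one common approach (a sketch of the general idea, not necessarily the exact circuit we use in the workshop) is to put the sleeve in a voltage divider with a fixed resistor and read the midpoint with a microcontroller’s ADC. Here is a minimal plain-C illustration; the 10-bit ADC range and 10k fixed resistor are assumed values.

/* Illustration only: converting an ADC reading from a stretch sensor in a
 * voltage divider back into a resistance. The 10-bit ADC and the 10k fixed
 * resistor are assumptions, not measurements from the actual sleeves. */
#include <stdio.h>

#define ADC_MAX  1023.0   /* 10-bit converter */
#define R_FIXED 10000.0   /* fixed resistor in ohms (assumed) */

/* Divider: Vout = Vcc * R_FIXED / (R_FIXED + R_sensor), so
 * R_sensor = R_FIXED * (1 - Vout/Vcc) / (Vout/Vcc). */
static double sensor_resistance(int adc_count)
{
    double ratio = adc_count / ADC_MAX;     /* Vout / Vcc */
    return R_FIXED * (1.0 - ratio) / ratio;
}

int main(void)
{
    int samples[] = {300, 450, 600, 750};   /* pretend readings */
    for (int i = 0; i < 4; i++)
        printf("count %4d -> about %6.0f ohms\n",
               samples[i], sensor_resistance(samples[i]));
    return 0;
}

On an Arduino the same arithmetic would sit behind analogRead(); in Processing you would simply map the incoming value to whatever the sketch is doing.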

More details here.


Coloring Book Adventure

I am making a coloring book. The way is fraught with fears, doubts, and time-eating mechanical failures. Fears of being unable to make my goals. Doubt that my art is worth the investment of strangers. Battles with an old scanner not being compatible with my computer. Then a crashed computer BIOS that corrupted my RAID drive. I lost a lot of files. But I am winning. I am winning thanks to very good friends who encouraged my talents. I am winning with the support of my very wonderful family that helped me in times of need. I am winning because of my tenacity in the face of problems. It is only a matter of time in this book battle of attrition. “Today I Draw Dragons” will be a thing.

I encourage you, too, to tread the path of bookmaking. Be not daunted by the endless tasks before you.

This project began when I started to draw dragons before work and then after work. I began to count them. I told myself that when I had made thirty-five of them I would pursue making them into a coloring book. I ended up making one hundred and fourteen of them.

I shopped around for publishers. It is a sea of frustration. You have the easiest ride if you can wrestle the support of a professional publishing company, but they will have a say in your product, and it is hard to convince them that you are worth it. So I decided to pursue self-publishing, at least for now. If I prove myself with a successful project, then I will show them what I can do.

None of this is the way to wealth but it is the way of artists.

Knittin’ Kitten

I learned many things. I learned that even if I print only 30 dragon images, a printer will consider it a book of 60-plus pages, even if I don’t print on both sides of the sheet of paper. If you have a place to store 1000 books and the cash to buy and ship them, then you might be able to get them printed for a competitive price. ISBN numbers are expensive if you buy just one.

Advertising matters.  My Kickstarter shows a definite lull in support when my computer crashed and I could not reason out how to advertise without my scanned and worked drawings.  My friends and family took up the slack then.  I continued.  I made business cards and flyers to paint the town.  I wish I had done more.  But I am still winning.

Cleaning up and re-working scans for print TWICE is annoying.

I have an external hard drive now so I can back up the back ups.

planned cover image

Learning all the programs for formatting everything for print is a huge pain in my pinky toe.

I still have many tasks ahead. I need to subscribe to a download service so that I can deliver my PDF files. I need to secure a high-quality printer for the prints I have sold. I need to prepare to wrap and mail out my books. I need to make all the custom sketch cards and commissioned art sold to fund this endeavor. I will need a plan in place to sell the extra copies I am going to order. And I need to draw more, lots more.

This will not be my last publishing adventure, by far.

There are still a few more days if you want a copy of the book yourself:

“Today I Draw Dragons” By Shelly Loke

My Kickstarter ends March 8th, but that is really just barely the beginning. I hope to see your adventuresome projects up here too, soon.

Dragon Making Toast in the Style of the Ancients

 


Drossel von Flügel Cosplay Project

After months of work, hours of troubleshooting 3D printers and lasers, and a lot of patience, I’m proud to present my completed cosplay mask of the gynoid Drossel von Flügel. My friend Jaina helped me take pictures at Katsucon last weekend in National Harbor. (Yes, the same convention center, unfortunately.)

Note: almost all images can be clicked for full size.

Skylar at Katsucon dressed as Drossel with a hoodie that says "I HEART HUMANS"

ByNEET’s 3D model of Drossel; Sky’s picture of her Drossel Figma


A picture of the Drossel mask with the Drossel Figma next to it for reference; I HEART HUMANS sweater with blue heart


I have received no shortage of help from various people. The CNC department at Pumping Station: One has been great at supporting those who want to make things. Twitter user @ByNEET released a full model of Drossel, which my friend Faraday (she does 3D work! fortunafaradaze at gmail dot com) helped disassemble for conversion into 3D-print-friendly STL files. My friends spent countless late nights with me while I worked on this project. My mom was very helpful in assembling the mounts that hold it on my head at the last minute. My friend Amir introduced me to Pumping Station: One, which has made a huge impact on me. Lastly, thanks to the PS:One community itself, for maintaining such a wonderful place to create and share.

Below the read-more is a fairly detailed explanation of how I created the mask and what tools I used, for those who are interested in pursuing similar projects. Feel free to contact me (Skylar) with questions at SKY at TUNA dot SH or find me at the space! I also have a (photography) website, http://hexbee.net.

Click the read more below!

Continue reading Drossel von Flügel Cosplay Project


Mandolin Plates on the ShopBot

My name is Ralph, and I’m an amateur luthier and PS:1 “starving hacker”.

I make all kinds of instruments: guitars, ukuleles, bouzoukis, and more, but my favorite thing to build is mandolins.  They are far and away the most difficult instrument that I make, and require a level of craftsmanship not found in the simpler instruments.

There’s only one real downside to building mandolins— the carving.  Mandolin plates are made from 1” thick stock, carved into a very precise dome shape ranging from 3mm thick at the rim up to 6mm thick at the bridge.  Making the plate accurately is the key to getting a good tone from the instrument: too thick and it sounds “dead”, too thin and the top can’t withstand the force of the strings.

Making the plates requires a set of inside and outside templates that show the proper curves (making these templates on the laser cutter was a primary reason I joined PS:1), carving gouges to get close to the final shape, and then curved planes and scrapers to get the dimensions exact. About 40-50 hours of carving and scraping go into a set of mandolin plates. To make things worse, the back plate is made of hard maple, which is VERY difficult to carve. Even with leather carving gloves, my hands are a mess of blisters and calluses after making a plate.

When I saw the CNC routers at PS:1, I was immediately struck by the idea of using CNC to produce a rough mandolin plate.  Even if I would still need to scrape to get things perfect, the hard carving work (and blisters) would be taken care of by the machine.

Thus began a year-long journey of discovery…

I learned about CAM, and taught myself to use Fusion 360, only to discover that this 3D modeling stuff is HARD.  I managed to turn out some pretty simple models for bridges and headstocks, which I was able to make on the Shapeoko and ShopBot, but every attempt at modeling a mandolin plate failed.

After flailing around for many months, I discovered the Fusion 360 meetup (sponsored by Autodesk and held at PS:1), and everything changed.  With the help of Autodesk’s Michael Aubrey (Fusion evangelist), and PS:1’s resident CAD experts, I improved my skills to the point where I was able to make a reasonable model of the top plate for an A-style mandolin.

fusion-mando

Last weekend, I got to test the model on the ShopBot!  The initial version is in MDF, just to test the model and the machining commands.  Once everything is tweaked, I will do the real thing in Sitka spruce.

Since the plate needs to be machined on both sides, I needed to create a fixture to align everything. It’s a pretty straightforward fixture plate, with two alignment pegs that match holes drilled into the ShopBot wasteboard. All of the shaping was done with a 1/2” round-nose bit running at 12000 rpm and a chip load of 0.35 mm.
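
(For anyone translating that chip load into a feed rate: feed = chip load × number of flutes × RPM, so a two-flute cutter at 0.35 mm and 12,000 rpm works out to roughly 8,400 mm/min, or about 330 inches per minute. Scale accordingly if your bit has a different flute count.)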

The inside is machined first, referencing the stock top.  It uses a pretty simple adaptive pocket to remove most of the waste, followed by a spiral with a 1mm overlap to take things to the finished size.

img_2194

You’ll note that the pocket is not centered in the picture— my origin was in the wrong place in my model.  I fixed that, and the second attempt came out much better.  There is still a bit of scraping/sanding to remove the machine marks, but that was to be expected.

img_2195

After the inside surface was machined, I flipped the workpiece over and re-registered the Z axis to the bottom of the piece.  That way, I know the thickness of the part will be accurate even if my stock thickness is off by a little bit.

Once again an adaptive pocket removed most of the stock, starting with a channel around the rim.

img_2196

After the rim was rough-sized, the “hump” was roughed in.

img_2197

A second pass of the adaptive pocket got the rim down to 4mm thick, and smoothed the transitions.

img_2198

Just as with the inside, the finishing step used a spiral to clean the surface and eliminate the tool marks. The net result was quite good, and will need only a bit of scraping to finish.

img_2199

After all was said and done I swapped in a 1/8” straight bit to cut the outer profile and f-holes.

img_2200

Cutting off the excess stock left me with a quite nice-looking mandolin top plate!  Total elapsed time (not counting my initial screwup) was about 90 minutes.

img_2201

When I put my micrometer to the finished product, the results were better than I expected. Thickness is accurate to within 1mm across the entire profile, with most areas within 0.5mm. That leaves only a bit of scraping to get things perfect!

Next weekend… the real thing, in Sitka spruce.

Many thanks to Michael Aubrey from Autodesk, Ray Doeksen and Andrew Carmadella from PS:1, and all of the Fusion 360 Meetup crew that helped me along the way!  I’m still a modeling rookie, but I’ve come a LONG way with your help!

Ralph Brendler


A Light Diversion

IMG_9659

In the last days of Radio Shack, I was in a store on Michigan Avenue when I spotted, buried amongst the disassembled shelving units and discarded phone cases, a small red box that turned out to be an Arduino-based soldering project, the 2770158 LED Cube (https://github.com/RadioShackCorp/2770158-LED-Cube). I bought it for something like $5, took it home, and promptly put it on the shelf as a project I’ll ‘get to’ at some point.

The honest truth is that I was somewhat intimidated by the soldering; it’s a 3x3x3 cube of LEDs that are soldered together, and the lights were smaller than I was expecting. Looking at some pics of the final result, I resigned myself to likely screwing it up and, at best, hoped I might learn something from what I assumed would be a complete failure. So I somehow justified to myself that, in order to not waste my $5, I shouldn’t actually try to make the thing I spent $5 on.

At some point I hit myself with a clue-by-four and realized the stupidity of my situation: accept the possible loss of the $5 and actually try, instead of fretting about what-ifs. So I took the kit to PS:1, sat down in the Electronics area, got out the soldering iron and magnifying glass, and went to work. It took a couple of hours, and I was certain, absolutely positively certain, that even though it looked right, there was no chance I had actually gotten the leads all wired together correctly, especially the ones in the middle that were extremely hard to reach with the big tip of the soldering iron. Okay, well, the only thing left was to actually plug it into the Arduino Uno I had, load up the sample sketch (available in the RS GitHub repo above), and see what happened.

I fired up the Arduino IDE, loaded the sample sketch, hit upload, and all of a sudden all the lights came on as it started through the canned routines. I was initially skeptical, checking every single light to see which one was never lighting up, and all of a sudden it dawned on me that I had actually done it, all the lights actually lit up as part of the demo routine, and HOLY CRAP I MADE A THING AND IT WORKED!!!!1111

And then, in my excitement, I dropped it, ripping the USB cord from the Arduino, and it landed lights-down on the floor. Well, of course I did. Of course I broke it, right? But as I checked the connections, nothing had come loose and nothing was broken. I plugged the Arduino back in, and sure enough, it happily came back to life and started going through the routine. Whew!

 

So I resolved to make this truly my own; running a demo program that I didn’t write was not ‘finishing the job’. I remembered the QBasic ‘Snake’ program that drew a line bouncing around the screen, hitting the edge and then randomly turning and going off in another direction. Ah, but this is a cube, in threeeee deeeeeeee, so the challenge would be that much more interesting, especially as I resolved to sit down and actually try to implement it without any help from the Internet: a three-dimensional matrix of lights, translated into C++.

This is where I remembered a line from Top Gun that went something along the lines of “Our pilots had become dependent on missiles” as a reason for the loss of dogfighting ability. (And then I got that Everly Brothers song stuck in my head.) Well, after writing C++ for years, I had become dependent on the containers provided by the Standard Template Library (map, vector, etc.). While the Arduino is programmed using C++, it’s really a pretty small subset of C++ (which sort-of-kinda-not-really makes sense) and the STL is not available; go ahead and #include <map> all you like, all the compiler’s gonna do is complain. So I knew I’d have to regain some amount of dogfighting capability and do all the array/matrix stuff in pure C. To keep myself honest and regain some of the skills I think I used to have, I created a C file in Vim (using Emacs always made me angry; straight-up I hate this, whatever this is), wrote the program, saved, compiled, and ran it straight from the terminal prompt. Again and again and again.
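
For the curious, here is a stripped-down illustration of the kind of pure-C bookkeeping I mean: a random walk through a 3x3x3 array that bounces off the edges. This is just a sketch; the real version, with the snake body and the LED output, is in the repo linked at the bottom of this post.

/* Illustration only -- a minimal random walk on a 3x3x3 grid, bouncing off
 * the edges, with the matrix bookkeeping done in plain C arrays. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define SIZE 3

int main(void)
{
    int cube[SIZE][SIZE][SIZE] = {{{0}}};   /* 1 = this cell has been visited */
    int pos[3] = {0, 0, 0};                 /* x, y, z of the snake's head */

    srand((unsigned)time(NULL));

    for (int step = 0; step < 20; step++) {
        cube[pos[0]][pos[1]][pos[2]] = 1;
        printf("step %2d: (%d, %d, %d)\n", step, pos[0], pos[1], pos[2]);

        /* pick an axis and a direction; if we'd leave the cube, go the other way */
        int axis = rand() % 3;
        int dir  = (rand() % 2) ? 1 : -1;
        int next = pos[axis] + dir;
        if (next < 0 || next >= SIZE)
            next = pos[axis] - dir;
        pos[axis] = next;
    }

    int visited = 0;
    for (int x = 0; x < SIZE; x++)
        for (int y = 0; y < SIZE; y++)
            for (int z = 0; z < SIZE; z++)
                visited += cube[x][y][z];
    printf("visited %d of %d cells\n", visited, SIZE * SIZE * SIZE);
    return 0;
}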

One of the biggest problems was forcing myself to get past the ‘sheesh, this woulda been easy to use <insert some STL thing> here’ and just focus on getting the values in the right cells of the matrix. It took a few hours to get the algorithm right, but pretty soon I had it spitting out numbers that seemed right, but how was I gonna know that it was right?

This is where I decided to make a quick diversion and build a virtual version of the matrix in OpenSCAD:

9x9-grid

Using this model, I could walk through the output of the program and verify that the snake was truly moving correctly around the matrix. I rotated the model around, checking that the numbers were right and HOLY CRAP I MADE ANOTHER THING THAT WORKED!

 

The last thing to do was to actually get the program to work with the LEDs. This is where the spartan documentation of the original Radio Shack code became a problem; the sketch did a passable job of explaining how the lights were addressed, but the examples were all arrays of pre-baked values that never had to do anything dynamic, and my program was all dynamic. I studied how the demo program worked, started fiddling with the values, and discovered how to set the bits in the right way to turn on individual lights on specific levels. From there I modified my C program and added some code to translate my positioning, which turned out to be the mirror opposite of the way the lights are addressed; I solved that problem by physically turning the Arduino around so I was looking at the other side.
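
To give a flavor of the bit-twiddling involved (an illustration only, not the Radio Shack cube’s actual addressing scheme, which lives in their sketch), packing one 3x3 level of on/off flags into a single mask looks something like this:

/* Illustration only -- not the cube's real addressing. Pack a 3x3 level of
 * on/off flags into a 9-bit mask, one bit per LED, with bit index y*3 + x. */
#include <stdio.h>

int main(void)
{
    int level[3][3] = {      /* 1 = LED on: a diagonal across the level */
        {1, 0, 0},
        {0, 1, 0},
        {0, 0, 1},
    };

    unsigned int mask = 0;
    for (int y = 0; y < 3; y++)
        for (int x = 0; x < 3; x++)
            if (level[y][x])
                mask |= 1u << (y * 3 + x);

    printf("level mask: 0x%03x\n", mask);   /* prints 0x111 for this pattern */
    return 0;
}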

I uploaded my sketch to the Arduino, and suddenly the lights were lighting up in what appeared to be a snake moving around the matrix. HOLY CRAP I GOT IT TO WORK!!!!!!11111

TL;DR:

This is a long post for what amounts to a small light toy, but I was feeling rather verbose (a consequence of sitting and waiting for an unrelated program to finish). I can’t emphasize enough how foolish I feel for not starting all this earlier; fear of failure is a very, very powerful emotion, and if there’s a TL;DR in here somewhere, it’s that it is always better to try and fail than to never try at all, which is something PS:1 has done a very good job of teaching me.

TL;DR(2):

The code is available at https://github.com/tachoknight/arduino-snakey.


Spacecats Rocket Build!

A good friend of mine had the vision to make a memorial to lost cats at Burning Man 2016. It would be a whimsical project with a deeper side to it, honoring our fallen feline companions. To see more of what is behind the project, check out the Spacecats Indiegogo at https://www.indiegogo.com/projects/art-installation-for-burning-man-spacecats#/. She asked me to assist with creating the rocketship part of the project for the intrepid spacecats. I just started doing CNC work this year and leaped at the opportunity to further improve my skills with a big project. Over a period of two months, many models were created to arrive at the final form. I will detail the workflow for this and share some of the iterations!

It all started with Fusion 360, a great program for makers, which I used to create a basic rocketship model. Well, I thought it was basic, but my inexperience made it a bit harder than expected, and I went through many hours of “learning time” to arrive at a model I was happy with. From the 3D model in Fusion 360, I took it to 123D Make to slice it into radial pieces so that it could be put together in real life! With the parts generated from 123D Make I was able to cut some laser models to show my friend and get her input for her vision. As you can see, it took about four tries to get it right. These models were done at 1:10 scale, then 1:7 scale. It really helps to have something in front of you to decide what will look best.

13528376_10153841100737746_2223592556338790308_o

After finally arriving at a model that was good, it was time to bring it to the ShopBot for a 1:2 model (which is also one of the Indiegogo rewards!). There was much dialing in to make sure that the slot fit was tight, but not so tight that the pieces couldn’t be fitted together. I found that adding in .01 helped immensely to get the perfect fit. I did many test notch pieces to ensure the fit. One problem I had was making the test pieces too small, so they didn’t give the full effect of sliding all the way into the wood; making them larger really helped. It paid off to prototype and make test pieces, which saved me from wasting a lot of material, especially when I moved to the more expensive wood! Finally, we had something that the Spacecats seemed somewhat happy about, other than that orange tabby Floyd at least!

13575832_10153871639597746_8500954300887324922_o

I also learned how to use a V-bit for this project; it was very challenging to get the right font so that it looks nice but isn’t too thin. This is the plate with the names of departed cats.

13765807_10153902665842746_91960329919297892_o

And finally, I was able to do the full-size model that will go out to Burning Man! The pieces were displayed at an event last weekend, a Hawaiian luau, and people seemed pretty pleased with them, if I do say so myself!

13627040_10153915405857746_872954163030275718_n

Thanks to everyone at PS:One for the patience to answer many of my questions, and for putting up with my excessive use of the ShopBot to dial this project in 😀

If you’re curious about the Indiegogo project and the other elements of the installation, check out the Spacecats page. And if you’re going to Black Rock City this year, look for some spacecats in the deep playa!


Configuring Pi 3 with a TFT touchscreen and GPIO buttons

A while back I built a pair of sound-reactive LED towers, which were on display a few times at some local Chicago events.

Plugging into the DJ mixer output required a relatively expensive device so that the stereo signals (with external gain control) could be seen as a microphone input by a computer running Processing (processing.org). It was also a relatively bulky affair to have yet another laptop sitting next to the mixer when the artist wasn’t using it to make the music.

Recently I discovered that, starting with the 3.1 release, Processing can run on a Raspberry Pi and has a built-in ability to manipulate the GPIO ports. This blog entry highlights the release. This is exciting news, as the laptop can now be replaced by a Pi 3 with a small touchscreen.

Here is a video of the light towers in action.

The towers each have 8 panels with 60 individually addressable RGB LEDs. These pictures show off more of the visual aesthetic, and the cutaway view reveals something of the construction. The LED strips are hidden in a channel in the wood supports and side-illuminate the acrylic panels (backed with just a strip of white cardboard). The acrylic is impregnated with a diffuser which redirects the edge-lit light 90 degrees so it exits out of the faces of the panels; the white cardboard reflects the half of the light that would otherwise be directed inwards. The acrylic is produced by Evonik and is called Acrylite EndLighten. The towers themselves only require 110 VAC power.

The data frames that control the LED strips are sent wirelessly from the Processing script using an Open Pixel Control module, which maps points on the Processing screen into frames sent to a Fadecandy server running inside an OpenWrt WiFi device that is physically connected to a Fadecandy board. I used TP-Link TL-MR3040 WiFi devices to run OpenWrt and added the Fadecandy server application into the img file used to reflash them. The Fadecandy Git repository can be found here.
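
For reference, an Open Pixel Control message is just a small TCP frame: a channel byte, a command byte (0 means “set pixel colors”), a big-endian 16-bit data length, and then three bytes of RGB per LED. Here is a rough standalone sketch of building one such frame for a 60-LED strip; it’s an illustration only, since the installation uses the Open Pixel Control module mentioned above rather than hand-rolled frames.

/* Illustration only: the layout of an Open Pixel Control "set pixel colors"
 * message. Header = channel, command, 16-bit big-endian data length; then
 * 3 bytes (R, G, B) per LED. A Fadecandy server listens for these frames
 * on TCP port 7890. */
#include <stdio.h>

#define NUM_LEDS 60                      /* one panel's strip */

int main(void)
{
    unsigned char frame[4 + NUM_LEDS * 3];
    unsigned int len = NUM_LEDS * 3;

    frame[0] = 0;                        /* channel 0 = broadcast to all channels */
    frame[1] = 0;                        /* command 0 = set 8-bit pixel colors */
    frame[2] = (len >> 8) & 0xFF;        /* data length, high byte */
    frame[3] = len & 0xFF;               /* data length, low byte */

    for (int i = 0; i < NUM_LEDS; i++) { /* fill the strip with a dim blue */
        frame[4 + i * 3 + 0] = 0;        /* R */
        frame[4 + i * 3 + 1] = 0;        /* G */
        frame[4 + i * 3 + 2] = 64;       /* B */
    }

    /* In the real setup this buffer would be written to the Fadecandy
     * server's TCP socket; here we just report its size. */
    printf("OPC frame: %zu bytes\n", sizeof frame);
    return 0;
}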

IMG_1689 IMG_1682

This is the assembled Raspberry Pi 3 w/ 2.8″ TFT Capacitive Touchscreen mirroring the HDMI frame buffer in a Zebra case without the top cover.

Pi 3 w/ 2.8" TFT Capacitive Touchscreen HDMI framebuffer

There were a lot of possible paths to follow in getting this build working the way I wanted. Most of my Google searching turned up outdated examples, particularly due to the changes introduced in the 4.4 kernel with the /boot/config.txt use of overlays. Adafruit had this very nice-looking tutorial on how to get the touchscreen working with their version of the Jessie Raspbian OS image. The inclusion of how to use FBCP was of particular interest, as mirroring the HDMI output is important for displaying Processing sketches that use the 2D or 3D graphics libraries. Their Raspbian image was based on an older kernel, though, and updating the OS (sudo apt-get update; sudo apt-get dist-upgrade) turned out not to just work.

After much tinkering, these are the steps that worked for me. (Note that I was working with OS X 10.11.)

  1. Download the latest Raspbian Jessie image here.
  2. Extract the .img file using “The Unarchiver.app” as opposed to the built-in “Archive Utility.app” as I saw many comments that the default app caused issues.
  3. I chose a 32GB Samsung EVO Plus (model MB-MC32D) micro SD.  It has a red background.
  4. Flash the SD card with the extracted image file.  Instructions for doing this can be found easily.  I used the following procedure:
    • open a terminal window and change to the directory with the extracted image file
    • $ diskutil list
    • note the device path of the SD card (eg: /dev/disk4)
    • unmount the SD card, replace disk4 with what was discovered in the previous step
    • $ diskutil unmountDisk /dev/disk4
    • flash the SD card, again update rdisk4 and also make sure the if= filename is correct
    • $ sudo dd if=./2016-05-27-raspbian-jessie.img of=/dev/rdisk4 bs=1m
    • this will take at least 5 minutes to complete, but it is possible to see some status without interrupting the transfer, by pressing ctrl-t
    • exit the terminal window and eject the SD card
  5. Insert the SD card into the Pi and hook it up to an HDMI monitor.  You will need a keyboard and mouse as well.
  6. Open a terminal window
    • disable power management for the onboard WiFi module for stability
    • $ sudo nano /etc/network/if-up.d/wlan0
      #!/bin/bash
      iwconfig wlan0 power off
      
    • $ sudo chmod +x /etc/network/if-up.d/wlan0
    • $ sudo raspi-config
    • select Expand Filesystem and reboot
  7. Configure the WiFi as usual from icon at the top of the desktop
  8. Open a terminal window
    1. install updates
    2. $ sudo apt-get update
    3. $ sudo apt-get dist-upgrade
    4. install build utility
    5. $ sudo apt-get install cmake
    6. fetch the FBCP source, compile and install
    7. $ git clone https://github.com/tasanakorn/rpi-fbcp
    8. $ mkdir ./rpi-fbcp/fbcp/build
    9. $ cd ./rpi-fbcp/fbcp/build
    10. $ cmake ..
    11. $ make
    12. $ sudo install fbcp /usr/local/bin/fbcp
    13. configure the touchscreen by uncommenting, changing or adding the following config entries
    14. $ sudo nano /boot/config.txt
      # match console size
      framebuffer_width=640
      framebuffer_height=480
      
      # force 640x480 VGA on HDMI
      hdmi_force_hotplug=1
      hdmi_group=2
      hdmi_mode=4
      
      # 2.8" Capacitive 320x240 Touchscreen
      dtoverlay=pitft28-capacitive,rotate=90,speed=80000000,fps=60
      dtoverlay=pitft28-capacitive,touch-swapxy=true,touch-invx=true
      
    15. expose touchscreen events
    16. $ sudo nano /etc/udev/rules.d/95-ft6206.rules
      SUBSYSTEM=="input", ATTRS{name}=="ft6236", ENV{DEVNAME}=="*event*", SYMLINK+="input/touchscreen"
      
    17. select an easier to read console font
    18. $ sudo dpkg-reconfigure console-setup
      • UTF-8
      • “Guess optimal character set”
      • Terminus
      • 6×12 (framebuffer only)
    19. remove the extra GLES library (see this issue)
    20. $ sudo aptitude remove libgles2-mesa
    21. install processing
    22. $ curl https://processing.org/download/install-arm.sh | sudo sh
    23. disable auto monitor-off
    24. $ sudo nano /etc/lightdm/lightdm.conf
      xserver-command=X -s 0 -dpms
      
  9. Reboot

The touchscreen should now display the 640×480 desktop scaled down to the 320×240 PiTFT screen. This makes things look less crisp, but it has the advantage that connecting to an external HDMI display will still work, and most apps need the larger dimensions to be usable anyway. Note that many HDMI displays will not be able to handle a 320×240 HDMI signal.

FBCP stands for frame buffer copy; it rescales and mirrors the HDMI framebuffer (/dev/fb0) onto the PiTFT framebuffer (/dev/fb1).

The version of the 2.8″ PiTFT I got from Adafruit comes with 4 buttons, and I created this test Python script to demonstrate not only how to use the RPi.GPIO library, but also how to manipulate the PiTFT backlight (so as to not burn in the screen), use multi-threaded event handlers, and shut down the OS to safely disconnect the power.

I created this script as /home/pi/pitft_buttons.py (chmod +x pitft_buttons.py to make it executable) and tested it by typing ./pitft_buttons.py. Note that pressing the bottom-right button (#27) will ask for authentication before powering off the Pi. See below to set this script up as a service, in which case the user will not be asked for authentication.

#!/usr/bin/env python2.7

# example code tested with Pi 3
# Raspbian Jessie (4.4 kernel): https://www.raspberrypi.org/downloads/raspbian/
# Adafruit 2.8" Capacitive Touchscreen: https://www.adafruit.com/products/2423
# for running on startup see: https://learn.adafruit.com/running-programs-automatically-on-your-tiny-computer/systemd-writing-and-enabling-a-service
# make sure to update the ExecStart= entry in the Adafruit script after copying from the example

import subprocess
import time
import RPi.GPIO as GPIO

# list of BCM channels for RPi.GPIO (printed on the Adafruit PCB next to each button)
channel_list = [17, 22, 23, 27]
backlightOn = True

# event handler to toggle the TFT backlight
def toggleBacklight(channel):
    global backlightOn
    if backlightOn:
        backlightOn = False
        backlight.start(0)
    else:
        backlightOn = True
        backlight.start(100)

# event handler to manage button presses
def buttonEvent(channel):
    startTime = time.time()
    while GPIO.input(channel) == GPIO.LOW:
        time.sleep(0.02)
    print "Button #%d pressed for %f seconds." % (channel, time.time() - startTime)

# event handler to manage Pi shutdown
def poweroff(channel):
    startTime = time.time()
    while GPIO.input(channel) == GPIO.LOW:
        time.sleep(0.02)
    if (time.time() - startTime) > 2:
        subprocess.call(['poweroff'], shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

# initialize GPIO library
GPIO.setmode(GPIO.BCM)
GPIO.setup(channel_list, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup(18, GPIO.OUT)
backlight = GPIO.PWM(18, 1000)
backlight.start(100)

print "Button #17 exits."
print "Button #22 toggles the TFT backlight."
print "Button #23 displayed the time the button is pressed."
print "!!! Pressing button #27 for at least 2 seconds, powers down the Pi !!!"

GPIO.add_event_detect(22, GPIO.FALLING, callback=toggleBacklight, bouncetime=200)
GPIO.add_event_detect(23, GPIO.FALLING, callback=buttonEvent, bouncetime=200)
GPIO.add_event_detect(27, GPIO.FALLING, callback=poweroff, bouncetime=200)

try:
    GPIO.wait_for_edge(17, GPIO.FALLING)
    print "Exit button pressed."

except:
    pass

# exit gracefully
backlight.stop()
GPIO.cleanup()

To install pitft_buttons.py as a service, create the following unit file:

# pitft_buttons service file, start a daemon on startup
# file: /etc/systemd/system/pitft_buttons.service

[Unit]
Description=Start PiTFT buttons daemon

[Service]
RemainAfterExit=true
ExecStart=/usr/bin/python -u /home/pi/pitft_buttons.py

[Install]
WantedBy=multi-user.target
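
Then reload systemd and enable the service so it starts on boot (standard systemd housekeeping; this assumes the unit file is saved at the path shown in its comment):
$ sudo systemctl daemon-reload
$ sudo systemctl enable pitft_buttons.service
$ sudo systemctl start pitft_buttons.service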

Run a Processing sketch from a script, a terminal window, or an ssh session:
$ DISPLAY=:0 processing-java --sketch=/home/pi/HelloWorld --present
