ASCII selfies

I’ve been experimenting, after a long break, with jp2a again, converting captured images into ASCII representations, which are then sent to a local printer. I remember, long ago as a child, having my face printed onto a t-shirt in Blackpool, with the ASCII character set making up the individual “pixels”.

With this experiment, I’ve used an old Deskjet 930c printer and a cheap USB web cam.

From a software point-of-view, I’ve installed the following:

  • CUPS – the Common Unix Printing System
  • jp2a – takes a JPEG image and turns it into ASCII text
  • GUVCVIEW – captures images from a webcam
  • Figlet – creates the text for an interesting banner at the bottom of the page

“lp” is the command that sends this text to the printer. There are loads of command line options available. I’ve wrapped this all up in a bash script and set it to run from the command line by making it executable.
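The bash script itself isn’t reproduced here, but the pipeline it wraps can be sketched roughly like this – the file name, jp2a width and banner text are my assumptions, not the original script:

```shell
#!/bin/bash
# Rough sketch of the capture-to-printer pipeline.
# Bail out gracefully if jp2a isn't installed.
command -v jp2a >/dev/null || { echo "jp2a not installed"; exit 0; }

jp2a --width=80 my_photo.jpg > selfie.txt   # JPEG -> ASCII text
figlet "ASCII Selfie" >> selfie.txt         # banner for the bottom of the page
lp selfie.txt                               # send to the default CUPS printer
```

Making the script executable with `chmod +x` then lets it run straight from the command line.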

To make this all work, I capture an image, which gets saved as my_photo.jpg. I then run the script by typing ./ followed by its name, and the printer spews out a rendered version of whoever is in front of the camera. Neat… and I’ll demo it at the next Raspberry Jam.

A future option would be to use a big button to start the whole process off. I’ve done it before, but there isn’t enough time remaining for that.


Minecraft Molehills?

Here’s a little program for generating random holes in the landscape and automatically putting a candle at the bottom. The idea is that this could be run on the Raspberry Pi running as a server so that other players can have a little fun hunting for them.

I’ve discovered that candles don’t like being put underwater… no surprise there!

Hole generated in Python with a candle at the bottom.

Watch your step – a deep hole created by the Python program.

Screenshots have been created following the instructions at Raspberry Pi – Spy.

from mcpi import minecraft
import time
import random

mc = minecraft.Minecraft.create()

mc.postToChat("Python is now in control!")

def Molehole(x, y, z, depth):
    mc.setBlocks(x, y+40, z, x, y-depth, z, 0)  # clear the space first, right down to the bottom
    mc.setBlock(x, y-depth, z, 50)              # put a torch (the "candle") at the bottom

for loop in range(20):
    startx = random.randint(-120, 120)
    startz = random.randint(-120, 120)
    starty = 0
    # the original listing stopped here; a call like this is needed
    # to actually dig each hole (the depth range is my assumption)
    Molehole(startx, starty, startz, random.randint(10, 30))
    time.sleep(0.5)

Poundland Universal Remote Control

I’ve been experimenting with the infra-red input commands on the PICAXE for some simple remote control Lego robots. There is an official remote control for the PICAXE – in an eye-catching blue case, but for economy I’ve bought a bunch of Poundland universal remote controls.

They don’t give the same hardwearing impression that the official ones give – the cases are a little more flimsy, but they do seem more than adequate to remote control a PICAXE and can therefore add a whole new dimension to PICAXE circuits. An infra-red receiver is an easy thing to add and it means that one input can effectively pretend to be many switches – all present remotely on the handset.

To program the remote, press and hold the TV button for five seconds; the LED will light at this point. Type in the code 0495. When this is completed, the LED will go out and the remote is ready to use.

To receive the remote’s signals, use:

irin [100],3,b0

This assumes that you’re using input 3. I often use the 08M2 PICAXE, and this works fine. The received value is placed into byte variable b0.

Often I use the following program:

    main:
    irin [100],3,b0
    debug b0
    goto main

to see the effect of pressing different buttons in the debug window.

I’ve discovered the following codes (File: PoundlandUniversalRemote as a PDF file):


Bluedot – Bluetooth and the 4-tronix Initio Robot.

Some time ago, I wrote about Bluedot – and now I’ve used this to control the Initio robot.

Bluedot consists of two parts – an App and some Python code which runs on the Raspberry Pi. The Initio takes a variety of Raspberry Pis, but I recently received some PTFA funding to upgrade the club robot to a Pi 3 so that it would run Scratch more effectively.

Setting up Bluedot is easy – I then took the demonstration program from the Bluedot website and added GPIO commands into the mix.

While my preference is to have robots running autonomously, using sensors to find their way around (and the Initio certainly excels at this), having an App control the motors is certainly fun and a very immediate way of creating a remote control vehicle.

From here? Perhaps a Web cam streaming images to a remote monitor… Watch this space!



from bluedot import BlueDot
from signal import pause

import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BOARD)  # my assumption – the original didn't show a setmode call
GPIO.setup(19, GPIO.OUT)
GPIO.setup(21, GPIO.OUT)
GPIO.setup(24, GPIO.OUT)
GPIO.setup(26, GPIO.OUT)

def dpad(pos):
    if pos.top:
        print("Both Forward")
        GPIO.output(26, 1)
        GPIO.output(21, 1)
        GPIO.output(19, 0)
        GPIO.output(24, 0)

    elif pos.bottom:
        GPIO.output(26, 0)
        GPIO.output(21, 0)
        GPIO.output(19, 1)
        GPIO.output(24, 1)

    elif pos.left:
        GPIO.output(26, 1)
        GPIO.output(21, 0)
        GPIO.output(19, 1)
        GPIO.output(24, 0)

    elif pos.right:
        GPIO.output(26, 0)
        GPIO.output(21, 1)
        GPIO.output(19, 0)  # these two lines were missing – filled in by mirroring 'left'
        GPIO.output(24, 1)

    elif pos.middle:
        GPIO.output(19, 0)
        GPIO.output(21, 0)
        GPIO.output(24, 0)
        GPIO.output(26, 0)

bd = BlueDot()
bd.when_pressed = dpad
pause()  # keep the script running, waiting for Blue Dot presses


Pi-Hut Christmas Tree

My son was fortunate enough to receive a Pi-Hut Christmas tree as an early Christmas present. He’s only eight, so I wanted to supervise his soldering, but I needn’t have worried – the only problem came when I fitted a missing LED to the wrong side of the board, removed it and broke a leg. Fortunately, I had a spare lying around in the shed, so it was an easy fix.

The Christmas tree is a lovely kit. Gold plating means that the soldering is effortless, and the edges have also been gilded for a bit of luxury. The whole board sits rather neatly on top of a Pi Zero W. I’ve been logging in remotely over SSH, but a development I’d like to try is to use BlueDot to control it from a phone.

There is a set of instructions for installing the gpiozero software, although I’m unfamiliar with it – I’ve always used RPi.GPIO in the past with good success. One thing I found missing was a map of all of the LEDs. They’re all numbered, but the numbers don’t relate to anything on the pin connections. Mapping the outputs (BCM or board number) was a bit of a mission, but I think I’ve succeeded.


LED     Output (BCM)   Output (Pin)
Star         2              3
1            4              7
2           15             10
3           13             33
4           21             40
5           25             22
6            8             24
7            5             29
8           10             15
9           16             36
10          17             11
11          27             13
12          26             37
13          24             18
14           9             21
15          12             32
16           6             31
17          20             38
18          19             35
19          14              8
20          18             12
21          11             23
22           7             26
23          23             16
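To make the mapping easier to use in code, it can be captured as a small Python dict keyed by the number printed on the board – a sketch, and the key names are mine, not from the kit’s software:

```python
# LED number as printed on the board -> (BCM number, physical pin)
LED_MAP = {
    "star": (2, 3),
    1: (4, 7),    2: (15, 10),  3: (13, 33),  4: (21, 40),
    5: (25, 22),  6: (8, 24),   7: (5, 29),   8: (10, 15),
    9: (16, 36),  10: (17, 11), 11: (27, 13), 12: (26, 37),
    13: (24, 18), 14: (9, 21),  15: (12, 32), 16: (6, 31),
    17: (20, 38), 18: (19, 35), 19: (14, 8),  20: (18, 12),
    21: (11, 23), 22: (7, 26),  23: (23, 16),
}

# look up LED 4: its BCM number and physical pin
bcm, pin = LED_MAP[4]
print(bcm, pin)  # 21 40
```

A program can then light LEDs by their printed number without re-reading the table each time.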




The next step I’d like to try is to run this from ScratchGPIO, possibly using SID as a starter so that I can run Scratch on a laptop and transfer it over later. In the meantime, I’ve just got a couple of alternative Python programs to play with.

All in all, this has been a lovely kit that uses loads of outputs on the Pi Zero. Soldering is effortless and programming with the standard program supplied was easy. One day I’ll read the docs for gpiozero, but for now, I’ll be teaching with Scratch.

Merry Christmas and a Happy 2018 to everyone!





Using a Raspberry Pi as a desktop PC replacement.

It started with a glass of water… and our old laptop. We probably wouldn’t have used the laptop at all, except that it ran Vista, which meant it supported our old-ish Canon CanoScan LiDE scanner. The scanner is, in my view, gorgeous – it’s light and slim and produces good scans. Sadly, no Windows 7 or Windows 10 drivers are available for it. I’m not binning or replacing perfectly good kit, so we’re keeping it!

Unfortunately, a clumsy accident with the glass of water meant that the laptop doesn’t work any more – just when I needed to do some scanning.

Back in the early days of the Raspberry Pi, I had some minor success with scanning from the command line, but now I know a bit more. I installed XSane, fixed some permissions around the way it writes to a folder (I can’t remember exactly what, though), and off I went. No problems. The batch mode is especially good for scanning a bunch of photos – four at a time, and you can just walk away.
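Command-line scanning still works too – SANE’s scanimage has a batch mode much like XSane’s. This is just a sketch: the resolution and file names are my choices, and PNG output needs a reasonably recent scanimage (older versions only write pnm or tiff):

```shell
# Bail out gracefully if SANE's scanimage isn't installed.
command -v scanimage >/dev/null || { echo "scanimage (SANE) not installed"; exit 0; }

# Scan four photos in a row, saving photo1.png ... photo4.png,
# prompting between scans so the photos can be swapped over.
scanimage --batch=photo%d.png --batch-count=4 --batch-prompt \
          --format=png --resolution 300
```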

Pi-top CEED

All this was running on my Pi-top CEED. I’d picked this up as a Kickstarter backer and I’m so pleased with it now. I’ve got a wireless keyboard and mouse, and the Raspberry Pi 3 inside really shifts it along well. It boots faster than our Windows 10 laptop and I’m not having to put up with the sound of a fan. It connects with no problem to our NAS, and I’ve also managed to pair it with the shared printer.

So, the question remains: what would I install? I’ve got the following squeezed onto the 8GB card:

  • CUPS, used to connect to the shared printer on our laptop. I’ve done this before and found it easier than trying to get the now-deceased Vista laptop to share the printer.
  • XSane to run the scanner. I can scan individual images or a bunch at a time in batch mode.
  • GIMP loads WAY faster on the Pi-top CEED than it does on the Windows laptop. Very useable for the brief experiments I tried.
  • Open Office, of course.
  • Seq24 for making music. It runs on my MIDI setup elsewhere in the house, but the Raspberry Pi 3 is also able to run some software synthesizers. I’ve installed Rosegarden, which seems to work well, but I need the time to learn it. It certainly looks full-featured.
  • Audacity seems to run well, although it did crash once on a file. I’m not sure why, and I didn’t have the chance to try again.
  • Fritzing is a PCB and breadboard design package. It works well and could be handy, although I’ve spent so much time in our school’s PCB package that I’m not ready to leave it quite yet.

There are a few other packages that I’m using, but for the most part these are the ones I’ll need for general office-type productivity. My conclusion is that, for me, the Pi-top CEED is an excellent desktop replacement.

I’ll just move the glass of water out of the way…