Clock using a Max7219 Matrix Display

I guess many of us find lighting up loads of LEDs rather interesting, so when I saw this group of four matrix displays for sale cheaply I had to pounce, figuring I’d find a use for them later.

There are a number of methods described for running them with the Raspberry Pi but I stumbled on an excellent library which does all of the hard work. The documentation is comprehensive and the examples really show what is possible.


Matrix Display resting on a Lego compatible case. The sizes fit perfectly, so I guess a construction brick display device is on the way soon!

I started out with just an 8×8 display originally, not trusting that I’d have any success with a larger module, but soon I got bored and upgraded. Woah… this thing works well, especially once the correct rotation had been added.


#!/usr/bin/env python

import time
from random import randrange

import max7219.led as led
from max7219.font import proportional, SINCLAIR_FONT, TINY_FONT, CP437_FONT
import feedparser

weather_rss_url = "http://open.live.bbc.co.uk/weather/feeds/en/2649808/3dayforecast.rss"  # BBC 3-day forecast feed

# create matrix device
device = led.matrix(cascaded=4)
device.orientation(90)  # rotate each 8x8 block so the scrolled text reads the right way up
print("Created device")

while True:
    print("Getting feed")
    feed = feedparser.parse(weather_rss_url)

    for repeats in range(10):
        print(repeats)
        for items in feed["items"]:

            msg = items["title"]
            msg = msg[0:msg.find(",")]   # trim the headline at the first comma
            print(msg)
            device.show_message(msg, font=proportional(SINCLAIR_FONT))
            time.sleep(1)

            msg = time.strftime("%H:%M")
            print(msg)
            device.show_message(msg, font=proportional(SINCLAIR_FONT))
            time.sleep(10)

A quick look at reading RSS feeds and at Python’s strftime later, and the program above shows a clock for 10 seconds followed by parts of the 3-day forecast.

WS2812B LED ring driven from a Pi Zero

I’ve already written about using WS2812B RGB LEDs with the Pixel Cushion, but seeing a set of concentric circle PCBs available got me thinking. I originally used these as a sort-of superhero badge as part of a fancy-dress outfit, but they were crudely mounted onto a plastic disc with cable ties.

A bit of work with 3d software and a 3d printer produced this.

I’ve used a servo extension lead to power and control the LEDs. They’re all daisy-chained as before, starting from one of the outside LEDs and finishing on the final LED in the centre. In total there are 61 LEDs, which means that if they were run at full power the load on the battery would be rather high (60mA per LED × 61 LEDs = 3.66 Amps). Instead, I’m limiting the maximum value for each LED so that they’re dimmer; an added advantage is that I can still see after looking directly at them. The overall effect is rather striking, although the camera used in the video above struggled to focus (perhaps macro mode might have been better); it’s easy to see the exposure control kicking in as more LEDs are lit.
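The limiting itself is just a cap applied to each colour channel before the values are sent out. Here’s a minimal sketch of the idea, assuming the rpi_ws281x Python library, a 61-pixel chain on GPIO 18 and a cap of 64 – the names and numbers are illustrative rather than the exact code I used.

#!/usr/bin/env python

# Minimal sketch: clamp each colour channel so 61 WS2812Bs stay well below
# their theoretical 3.66A maximum draw. Assumes the rpi_ws281x library;
# the pin, pixel count and cap are illustrative.
from rpi_ws281x import PixelStrip, Color

NUM_PIXELS = 61    # total LEDs across the concentric rings
DATA_PIN = 18      # PWM-capable GPIO pin
MAX_LEVEL = 64     # cap each channel at 64 rather than the full 255

strip = PixelStrip(NUM_PIXELS, DATA_PIN)
strip.begin()

def set_pixel(index, r, g, b):
    # clamp every channel before it reaches the LEDs
    strip.setPixelColor(index, Color(min(r, MAX_LEVEL),
                                     min(g, MAX_LEVEL),
                                     min(b, MAX_LEVEL)))

for i in range(NUM_PIXELS):
    set_pixel(i, 255, 255, 255)    # asked for white, sent dimmed white
strip.show()

(The library also accepts a global brightness value when the strip is created, which achieves much the same thing, but a per-channel cap makes the current limit explicit.)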

The Raspberry Pi power connections are fed directly into the GPIO connector, bypassing a somewhat delicate micro-usb port as well as ensuring that large currents for the LEDs are going to the adjacent 5v pin which is then fed out to the pixel rings.

Files for the clip-together case can be found and 3D printed from the YouMagine Website.

I suppose it would be prudent to add some sort of switch to initiate a shutdown routine, but for the moment I’ve been just pulling the plug.
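For what it’s worth, a shutdown button only needs a few lines of Python. This is just a sketch of the sort of thing I have in mind, assuming the gpiozero library and a momentary switch wired between GPIO 3 and ground – the pin and hold time are arbitrary choices.

#!/usr/bin/env python

# Sketch of a shutdown button: hold the switch for two seconds and the Pi
# powers off cleanly instead of having the plug pulled. Assumes gpiozero
# with a switch between GPIO 3 and ground; pin and hold time are examples.
from gpiozero import Button
from subprocess import check_call
from signal import pause

def shutdown():
    check_call(['sudo', 'poweroff'])

button = Button(3, hold_time=2)    # gpiozero enables the internal pull-up
button.when_held = shutdown

pause()    # keep the script alive, waiting for the button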

Seq24 on the Pi 3

Yesterday’s Raspberry Jam in Exeter gave an opportunity to briefly demonstrate Seq24 on the Raspberry Pi. I’ve got a Pi-TopCEED desktop, and fitted inside it is a small amplifier and speaker. It’s a simple addition but gives me a chance to use sound without trailing wires.

Seq24 is a block-based sequencer. Patterns of notes are entered into blocks and each of these can be triggered with either the mouse or keyboard. As an experiment, I’ve entered parts of Pachelbel’s Canon in D into the blocks so that they can be cued one at a time. I’ve chosen Canon as it’s based on a repeating chord pattern of D, A, Bm, F#m, G, D, G, A over 4 bars.

Any Pi is fast enough to run Seq24 with external MIDI gear, but now that I’ve got a Pi 3 I’ve installed QSynth (which can emulate an external polyphonic sound module and uses SoundFonts) as well as AMSynth, which emulates a classic analogue synthesizer. Both seemed pretty easy to set up once I’d found a GM SoundFont for QSynth.

Cueing different parts of the Canon gives an entirely new mix which certainly adds an interesting perspective on the original music. Hopefully a YouTube clip will appear here soon!

Seq24 Demonstration

Seq24 and two virtual sound generators running on the Pi-TopCEED at Exeter Raspberry Jam.

 

LED cushion… using e-Textiles as a display device


LED cushion… enough to make your heart skip a beat?

I’ve been experimenting over the Christmas period with WS2812 RGB LEDs. I intend to write up these experiments in another post, but having a bit of free time enabled me to try something a little more ambitious.

Connecting RGB LEDs to the Raspberry Pi has been simplified and they are now supported in ScratchGPIO once the Pimoroni unicornhat software is installed. By wiring together a bunch of these, I was able to create a 25 LED matrix.

Control via ScratchGPIO was a little slower than I needed, and in addition I wanted to see if I could write text. I’ve done this in the past using figlet and writing the result to a Minecraft screen as a pile of blocks, so it seemed the same approach might work here. I originally intended to go for a set of 8×8 LEDs (similar to Pimoroni’s Unicorn HAT, or the Sense HAT) but with time running short for a demo I had to opt for just 5×5. The maths is what stings here… 25 LEDs, 3 pieces of wire between each, two ends to strip, twist and solder = 150 joints. For 8×8 this would be… well, ouch!

In the video below, I present the results of the first set of experiments. I fully intend to solder up the remaining 39 LEDs to make 8×8, but this will take some time. I’ve mounted the LEDs on cardboard inside the cushion and covered them with bubble wrap. I will eventually use something much softer in the final product.

The controlling program is written in Python and uses a slight modification of the Unicorn Hat module. It took me a little while to find where Python modules are stored but I found it in

Here’s the final program:

#!/usr/bin/env python

import iprpixel as unicorn
import time
import datetime
import random
from mcpi import minecraft  # not actually used below
import commands

#print("Here goes")

# lookup holds the row-major position (0-24, left to right, top to bottom) of
# each LED along the chain; the wiring zig-zags, so alternate rows are reversed
lookup=[0,1,2,3,4,9,8,7,6,5,10,11,12,13,14,19,18,17,16,15,20,21,22,23,24]
unicorn.brightness(0.6)

#---------------------
#wr_str = raw_input('What would you like to write? ') # ask the user what they'd like to write at the current position
#wr_str = "Hello"

def choosecol():
	colval=random.randint(0,5)
	global r_val, g_val, b_val
#	print "Colour is :",colval
	if colval==0: #red
		r_val=255
		g_val=0
		b_val=0

	if colval==1: #green
		r_val=0
		g_val=255
		b_val=0

	if colval==2: #blue
		r_val=0
		g_val=0
		b_val=255

	if colval==3: #orange/yellow
		r_val=128
		g_val=128
		b_val=0

	if colval==4: #cyan
		r_val=0
		g_val=128
		b_val=128

	if colval==5: #magenta
		r_val=128
		g_val=0
		b_val=128

def sparklefill():
	for led in range(400):
		ledpos = random.randint(0,24)
		r_val = random.randint(64,255)
		g_val = random.randint(64,255)
		b_val = random.randint(64,255)
		unicorn.setapixel(ledpos,r_val,g_val,b_val)
		unicorn.show()
		time.sleep(.05)
	for led in range(400):
		ledpos = random.randint(0,24)
		unicorn.setapixel(ledpos,0,255,255)
		unicorn.show()
		time.sleep(.05)
		unicorn.setapixel(ledpos,0,0,0)
		unicorn.show()
#		time.sleep(.05)

def timestring():

	global figletstring, list_line
#	wr_str= time.strftime('%I:%M %p')
	wr_str= "Time Now : "+time.strftime('%I:%M %p')
	figletstring=str(wr_str).upper()
	cmd = 'figlet -w 400 -f 3x5 '+figletstring # create an operating system command
	line = commands.getoutput( cmd ) # send cmd as a command to the operating system and receive the result.
	list_line = line.rstrip().split('\n') # split the result from 'figlet' into separate lines (right strip new line feeds)

def ipaddress():
	global figletstring, list_line
	figletstring = commands.getoutput( "hostname -I" )
#	print figletstring
	cmd = 'figlet -w 400 -f 3x5 '+figletstring # create an operating system command
	line = commands.getoutput( cmd ) # send cmd as a command to the operating system and receive the result.
	list_line = line.rstrip().split('\n') # split the result from 'figlet' into separate lines (right strip new line feeds)

def heart1():
	global list_line
	list_line = ["     "," # # ","#####","#####"," ### ","  #  "]

def heart2():
	global list_line
	list_line = ["     ","     "," # # "," ### ","  #  ","     "]

# displayleds scrolls whatever pattern is in list_line (usually figlet output)
# across the 5x5 matrix; it uses rows 1-5 of the pattern and lookup.index()
# to translate grid positions into the zig-zag chain order
def displayleds():
	for startcolumn in range(len(list_line[1])-4):
		column = 0
		rownumber=-1
		time.sleep(0.1)
		for row in list_line: # one row at a time from list_line (the result from figlet)
#			print row, len(row)
			if rownumber>-1 and rownumber<5:
				column=0
#				print "working with row",row,"completed"
				for letter in row[startcolumn:startcolumn+5]: # work along each row - check each character. If it's a '#' then print a block else leave it as air
					if letter == "#":
						unicorn.setapixel(lookup.index(column+(rownumber*5)),r_val,g_val,b_val)
					else:
						unicorn.setapixel(lookup.index(column+(rownumber*5)),0,0,0)
					column = column+1
					unicorn.show()
			rownumber=rownumber+1

def boxdraw():
	global list_line, r_val, g_val, b_val
	for boxes in range(6):
		choosecol()
		list_line = ["     ","     ","     ","  #  ","     ","     ","     "]
		displayleds()
		time.sleep(0.1)
		list_line = ["     ","     "," ###  "," # # "," ### ","     "]
		displayleds()
		time.sleep(0.1)
		list_line = ["     ","#####","#   #","#   #","#   #","#####"]
		displayleds()
		time.sleep(0.1)

def pacman():
	global list_line, r_val, g_val, b_val
	r_val=255
	g_val=255
	b_val=0
	for bite in range(6):
		list_line = ["     "," ### ","#####","#####","#####"," ### ","     "]
		displayleds()
		time.sleep(0.5)
		list_line = ["     "," ### ","#####","##   ","#####"," ### ","     "]
		displayleds()
		time.sleep(0.5)

def smiley():
	global list_line, r_val, g_val, b_val
	for flashface in range(5):
		choosecol()
		list_line = ["     ","## ##","## ##","     ","#   #"," ### ","     "]
		displayleds()
		time.sleep(1)

def spinner():
	global list_line, r_val, g_val, b_val
	for boxes in range(5):
		choosecol()
		for spin in range(4):
			list_line = ["     ","#    "," #   ","  #  ","   # ","    # "]
			displayleds()
			time.sleep(0.01)
			list_line = ["     ","  #  ","  #  ","  #  ","  #  ","  #   "]
			displayleds()
			time.sleep(0.01)
			list_line = ["     ","    #","   # ","  #  "," #   ","#     "]
			displayleds()
			time.sleep(0.01)
			list_line = ["     ","     ","     ","#####","     ","      "]
			displayleds()
			time.sleep(0.01)

def heartbeat():
	global r_val, g_val, b_val
	for hearts in range(10):
		r_val=255
		g_val=0
		b_val=0

		heart1()
		displayleds()
		time.sleep(1)
		heart2()
		displayleds()
		time.sleep(0.25)

for displayip in range(5):
	choosecol()
	boxdraw()
	choosecol()
	ipaddress()
	displayleds()
	time.sleep(1)

while True:
#	pacman()
	smiley()
	spinner()
	boxdraw()
	heartbeat()
	choosecol()
	timestring()
	displayleds()
	time.sleep(1)
	sparklefill()

Construction Brick Scratch Interface Device


Lego compatible Scratch Interface Device

I’ve been thinking about this for a while. It’s all very well having a battery pack connected to some motors and LEDs with Lego, but what about making it all controllable remotely from Scratch running on a laptop or desktop?

In previous articles, I’ve looked at using the Raspberry Pi and the Lego firmly next to each other, with a wireless keyboard and mouse and that oh-so-bulky monitor cable.

Then I figured out the implications of "Scratch Interface Device" when I spotted it on CympleCy’s website.

Having bundled the Raspberry Pi into the case and added a set of LEDs and switches, it was a simple job to install SID and get it working. The last four digits of the Raspberry Pi’s identifier are used to identify it, and these must be entered on the PC (or Mac, if you have one). Once this is done, starting Scratch and enabling Remote Sensor Connections allows the laptop to control the Raspberry Pi without the bulky connections. Now it’s possible to seriously integrate Lego and intelligent control.
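Under the hood, Scratch’s Remote Sensor Connections feature is a very simple TCP protocol, which is what SID and ScratchGPIO build on. Purely as an illustration of the wire format (not SID’s actual code), here’s a minimal Python sketch that sends a broadcast into a running copy of Scratch listening on the usual port, 42001 – the address and broadcast name are just examples.

#!/usr/bin/env python

# Minimal sketch of the Scratch 1.4 remote-sensor protocol: each message is
# the command text prefixed by its length as a 4-byte big-endian integer.
# The host address and broadcast name are illustrative only.
import socket
import struct

def scratch_broadcast(host, name):
    payload = ('broadcast "%s"' % name).encode('utf-8')
    sock = socket.create_connection((host, 42001))
    sock.sendall(struct.pack('>I', len(payload)) + payload)
    sock.close()

scratch_broadcast('192.168.1.50', 'hello')    # hypothetical address of the machine running Scratch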


ScratchGPIO program to run the lighting

The whole setup now runs faster as the Scratch project runs on a PC, while the Raspberry Pi handles the GPIO control remotely. I haven’t yet tried all of the functions, but all those that I’ve tried have been supported. My PiBlox case has space for the Raspberry Pi camera, so running “Broadcast Photo” causes a photograph to be taken and stored on the Pi’s SD card. At the moment, it’s directed to /home/root/photos but maybe that’s something I’ve done wrong – ScratchGPIO running directly sends the files to /home/pi/photos which is a little more convenient (more later on why…).

Now… I could have stopped here, but inspiration has now kicked in. At the time of these early experiments, I was using an external battery pack and then…

…the Raspberry Pi Zero was released!

And the reason I’m now thinking of experimenting further?

Proposed Lego compatible Scratch Interface Device:

  • PiBlox case
  • Raspberry Pi Zero… when I can get hold of one.
  • WiFi dongle – perhaps a hacked and built-in version?
  • Li-Ion battery and 5v converter/charger
  • Connector setup to bring 0.1″ headers out easily at the side of the case. Digital (and analogue?) inputs, motors and servos.
  • Motor and GPIO controller (Explorer pHAT or maybe PiRoCon v2 depending on the size)
  • Mini amplifier PCB and small speaker.

For now, here’s a video of some of the work in action.

Cheerlights with the Raspberry Pi

I had seen a reference to Cheerlights on CympleCy’s website and was intrigued. The whole idea of a set of lights being controlled by a Twitter feed seemed a useful demonstration of bringing electronics, user interaction and social media together.

By tweeting #cheerlights, it’s possible to change the current Cheerlights colour. This is picked up by a variety of devices around the world and the idea is that all of the connected lights would display the same colour.

CympleCy has added to ScratchGPIO7 so that a “broadcast getcheerlights” command will trigger a fetch of the current colour.
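Behind the scenes the current colour is published as a tiny piece of text, so it can also be fetched directly from Python. Here’s a minimal sketch, assuming the public ThingSpeak feed that Cheerlights uses is still at this address – the URL and polling interval are illustrative.

#!/usr/bin/env python

# Minimal sketch: poll the current Cheerlights colour name directly.
# Assumes the public ThingSpeak feed for the Cheerlights channel; the URL
# and the polling interval are illustrative.
import time
import urllib2    # use urllib.request on Python 3

FEED = "http://api.thingspeak.com/channels/1417/field/1/last.txt"

while True:
    colour = urllib2.urlopen(FEED).read().strip()
    print("Current Cheerlights colour: " + colour)
    time.sleep(15)    # no need to hammer the feed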

My idea was to take the colour and use it to change the background colour of a Scratch screen to match the cheerlights around the world. I created a set of background colours that match the Cheerlights-defined colours; it’s then a simple job to see what the current colour is and choose a background to match. With the Scratch window set to full screen, it’s quite an effective colour source and a useful demonstration. Eventually the screen-blanking will kick in, and I haven’t investigated disabling that yet.

With a few RGB LEDs it might be possible to create a physical light that matches the current colour. Equally, it would be lovely to see this controlling a set of NeoPixels (RGB LEDs) although this isn’t so straightforward with the Raspberry Pi.

 

Raspberry Pi Construction Brick Camera


Lego compatible Raspberry Pi case with LEDs, Switch and Piezo Buzzer

I recently saw a Raspberry Pi case, compatible with Lego construction bricks, being sold by CPC and decided to get one. Just the sort of thing I’d been looking for in order to extend the experiments I’d been working on.

In a very short time I’d built a Raspberry Pi selfie-camera with count-down and WiFi capability. Apache handles remote access to the photos, whilst ScratchGPIO manages the interfacing and triggers the camera.

Pressing the trigger button causes a gentle beep-beep-beep and flash from the red LED for a couple of seconds, followed by a green flash while the picture is taken.

The Raspberry Pi case easily handles the camera – it pushes onto two little pegs and the lens pokes through where a stud would normally be moulded. It’s so discreet that you wouldn’t guess it was there unless you knew.


Scratch Program for the selfie-cam

The programming bit goes around in a loop waiting for the switch to be pressed. When that happens, the camera goes through a loop flashing and beeping, then pulsing the green LED. ScratchGPIO allows a simple “Broadcast Photo” command which works with either a USB or Raspberry Pi camera module. If Apache is set up to allow access to the same folder, then those photos can be viewed remotely.
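The project itself uses ScratchGPIO for all of this, but the same loop is easy to express in Python. Purely as an illustrative sketch (not the actual project code), assuming gpiozero and picamera, with made-up pin numbers and save path:

#!/usr/bin/env python

# Illustrative Python version of the selfie-cam loop: wait for the button,
# beep and flash red as a countdown, then show green while the photo is taken.
# Assumes gpiozero and picamera; pins and the save path are examples only.
import time
from gpiozero import Button, LED, Buzzer
from picamera import PiCamera

button = Button(17)
red = LED(22)
green = LED(27)
buzzer = Buzzer(18)
camera = PiCamera()

while True:
    button.wait_for_press()
    for beep in range(3):          # gentle beep-beep-beep with red flashes
        buzzer.on()
        red.on()
        time.sleep(0.2)
        buzzer.off()
        red.off()
        time.sleep(0.5)
    green.on()                     # green on while the picture is taken
    camera.capture('/home/pi/photos/selfie-%d.jpg' % int(time.time()))
    green.off()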

A quick video should appear here later…