Saturday, 30 December 2017

Setting up a new Raspberry Pi (2017 Edition)

Last post of the year, and admittedly not all that many in 2017. This one is more of a personal reference for the steps taken to set up a new Raspberry Pi, in this case using the Raspbian Jessie OS.

The version of the Pi used here, the Zero W, includes built-in WiFi.

First steps were performed using a Windows 10 machine.

1. Download image. The one I'm using is the 10th April 2017 Raspbian Jessie image.
2. Get a new Micro SD card and plug it into the PC. I have an external USB card reader for this. It should appear as a newly mounted drive.
3. Unzip and flash the .img file to the SD card. I use Etcher. Follow the workflow to select the image, select the drive corresponding to the SD card and then Flash.
4. Disconnect the external drive and insert the Micro SD into the Raspberry Pi.

Now I plugged the Raspberry Pi into a TV via HDMI (an HDMI to Mini HDMI adapter is required for the Zero) and switched on. At this point you should see it boot and load the desktop.

5. Next, the important step: change the default password. Open a terminal window and use the passwd command.
6. From the same command prompt, enter sudo raspi-config. Select the Advanced Options and Expand Filesystem. This ensures the whole of the SD card is available to the Pi.
7. In the same tool, enable SSH and VNC. This is because I want to remotely log on to the machine from elsewhere on my network.
8. Now reboot.
9. After the machine has restarted I connect to the network using WiFi. I used the UI from Raspbian to do this.
10. Now I can get the latest updates for the OS: sudo apt-get update followed by sudo apt-get upgrade.
11. For the purposes of my network, I want to use a static IP address rather than one assigned by DHCP. To do this, edit /etc/dhcpcd.conf and add the following lines to the bottom, assuming the router IP is 192.168.0.1 and we want to set our static IP to 192.168.0.207

   interface wlan0

   static ip_address=192.168.0.207/24
   static routers=192.168.0.1
   static domain_name_servers=192.168.0.1
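As a quick sanity check, Python's ipaddress module can confirm that the chosen static address and the router sit in the same /24 subnet (illustrative only, using the values from the config above):

```python
import ipaddress

# Values matching the dhcpcd.conf entries above
router = ipaddress.ip_address("192.168.0.1")
static_ip = ipaddress.ip_interface("192.168.0.207/24")

# The /24 suffix means the first 24 bits (192.168.0) identify the network
print(static_ip.network)            # 192.168.0.0/24
print(router in static_ip.network)  # True: router and Pi share a subnet
```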
 
12. Connect to the network share. From the command line it's possible to do this using:

    sudo mount -t cifs -o guest //ip.of.nas/Public /home/pi/NAS

I want the share to connect automatically at startup. This requires a couple of steps. First, add the following to the end of /etc/fstab

  //ip.of.nas/Public /home/pi/NAS cifs guest 0 0

However, the problem with this as it stands is that the mount is attempted before the network connection is up at boot time. To fix this, open raspi-config, select Boot Options and enable Wait for network at boot.
13. Finally reboot. The Raspberry Pi setup is now complete.

Saturday, 18 November 2017

Stepping it up a gear

Having previously played with a Servo, I bought a cheap stepper motor to see how easy it is to use with the Arduino.

Stepper motors have multiple electromagnets arranged around the central shaft. These electromagnets can be controlled in a specific way that allows the shaft to rotate, or step, a fixed amount. Typically steppers are controlled with a driver circuit. A microcontroller such as the Arduino can do this.
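The idea of stepping can be sketched in a few lines of Python (illustrative only; the real control here is done by the Arduino and its Stepper library). Energising the four coils in a repeating pattern steps the shaft one way, and running the same pattern in reverse steps it back:

```python
# Half-step sequence for a 4-coil unipolar stepper such as the 28BYJ-48.
# Each tuple says which of the four driver outputs (IN1..IN4) are energised.
HALF_STEPS = [
    (1, 0, 0, 0),
    (1, 1, 0, 0),
    (0, 1, 0, 0),
    (0, 1, 1, 0),
    (0, 0, 1, 0),
    (0, 0, 1, 1),
    (0, 0, 0, 1),
    (1, 0, 0, 1),
]

def step_pattern(n_steps, clockwise=True):
    """Yield the coil states for n_steps half-steps in either direction."""
    seq = HALF_STEPS if clockwise else list(reversed(HALF_STEPS))
    for i in range(n_steps):
        yield seq[i % len(seq)]

# Stepping "backwards" is just the same sequence in reverse
forward = list(step_pattern(8))
backward = list(step_pattern(8, clockwise=False))
print(forward == list(reversed(backward)))  # True
```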

The motor is a 28BYJ-48 with a ULN2003 motor driver chip. This is a unipolar stepper with 5 wires.


A unipolar stepper has one winding with a "center tap" per phase. The center tap is a contact made halfway along the winding. In the 5 wire stepper, the center taps are joined to a common wire.



Wiring the stepper as below and using the stepper_oneRevolution sketch from the Arduino examples didn't work out of the box. Instead of revolving clockwise and then anticlockwise, it always rotated in the same sense.

So, to get it working I followed the steps in this Instructables tutorial.



Here is the Arduino and stepper put together:

Next step is to include this with the Raspberry Pi for some more interesting applications.

Wednesday, 6 September 2017

Fixing Linux Mint WiFi

With Windows 10 completely breaking in a fresh and exciting (not to mention unfixable) way on a laptop earlier in the year, I decided to wipe it and install Linux Mint.

This was all very straightforward, but as the laptop in question is a 6 year old Dell, I needed to install a new driver to get the WiFi working.

From the Ubuntu wiki the command:

lspci -vvnn | grep -A 9 Network

Gave this output:
 
Network controller [0280]: Broadcom Corporation BCM4312 802.11b/g LP-PHY [14e4:4315] (rev 01)

4315 is not listed on that wiki page, but installing the following package successfully got the WiFi working:

sudo apt-get install firmware-b43-installer

Now, all this worked very well for some weeks until after a restart the WiFi decided not to connect at all.

I looked at the logs from

cat /var/log/syslog

Sep  5 20:42:27 jonathan-Inspiron-1764 avahi-daemon[926]: Registering new address record for fe80::9a27:20f7:ee67:8a6a on wlan0.*.
Sep  5 20:42:28 jonathan-Inspiron-1764 dhclient[5211]: DHCPDISCOVER on wlan0 to 255.255.255.255 port 67 interval 5 (xid=0x600e0419)
Sep  5 20:42:33 jonathan-Inspiron-1764 dhclient[5211]: DHCPDISCOVER on wlan0 to 255.255.255.255 port 67 interval 11 (xid=0x600e0419)
Sep  5 20:42:44 jonathan-Inspiron-1764 dhclient[5211]: DHCPDISCOVER on wlan0 to 255.255.255.255 port 67 interval 18 (xid=0x600e0419)
Sep  5 20:43:02 jonathan-Inspiron-1764 dhclient[5211]: DHCPDISCOVER on wlan0 to 255.255.255.255 port 67 interval 16 (xid=0x600e0419)
Sep  5 20:43:10 jonathan-Inspiron-1764 NetworkManager[940]: <warn>  [1504640590.6173] dhcp4 (wlan0): request timed out
Sep  5 20:43:10 jonathan-Inspiron-1764 NetworkManager[940]: <info>  [1504640590.6174] dhcp4 (wlan0): state changed unknown -> timeout
Sep  5 20:43:10 jonathan-Inspiron-1764 NetworkManager[940]: <info>  [1504640590.6338] dhcp4 (wlan0): canceled DHCP transaction, DHCP client pid 5211
Sep  5 20:43:10 jonathan-Inspiron-1764 NetworkManager[940]: <info>  [1504640590.6338] dhcp4 (wlan0): state changed timeout -> done
Sep  5 20:43:10 jonathan-Inspiron-1764 NetworkManager[940]: <info>  [1504640590.6345] device (wlan0): state change: ip-config -> failed (reason 'ip-config-unavailable') [70 120 5]
Sep  5 20:43:10 jonathan-Inspiron-1764 NetworkManager[940]: <info>  [1504640590.6350] manager: NetworkManager state is now DISCONNECTED

Strangely, this seemed to be a symptom of not using the b43 driver. So I decided to try the following:
  • Connect to router by ethernet and update
  • sudo apt-get remove firmware-b43-installer 
  • sudo apt-get install firmware-b43-installer
  • Reboot
This worked and I've been able to post this blog from the now Minty fresh (sorry!) laptop.

Tuesday, 23 May 2017

Birdbox cam odds and ends


Unfortunately, although I did get a few visitors to the birdbox, including wrens, none of them decided to set up a nest.



However, I did make a few changes to make life simpler for me. First off, using an old router made life harder than necessary when transferring files or viewing the video output. I played with setting up a network bridge but this proved to be too problematic.
The simple solution was to buy a Wi-Fi Range extender with an Ethernet port. This allowed me to seamlessly add the Pi on the Power over Ethernet to my main home network.


Next, the image processing code used by the motion detection requires the camera to output to a device on /dev/video0.

This is not set up automatically, so to load the driver enter the following:

sudo modprobe bcm2835-v4l2

To load it automatically after every boot, add the final two lines shown below to the end of the /etc/modules file (the rest of the file is included for context).

# /etc/modules: kernel modules to load at boot time.
#
# This file contains the names of kernel modules that should be loaded
# at boot time, one per line. Lines beginning with "#" are ignored.
# Parameters can be specified after the module name.

snd-bcm2835

# v4l2 driver for Raspberry Pi Cam
bcm2835-v4l2


To copy files between the Pi and my laptop I used scp:

tar -cvf images.tar *.jpg
scp images.tar username@destination.ip:images.tar
tar -xvf images.tar

And finally, to measure the temperature of the CPU, interesting on both cold winter nights and hot summer days:

/opt/vc/bin/vcgencmd measure_temp
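The output of measure_temp looks like temp=48.3'C. If you want the number on its own, say for logging, a tiny parser does the trick (a Python sketch, assuming that output format):

```python
import re

def parse_temp(vcgencmd_output):
    """Extract degrees Celsius from vcgencmd measure_temp output."""
    match = re.match(r"temp=([\d.]+)'C", vcgencmd_output)
    return float(match.group(1)) if match else None

print(parse_temp("temp=48.3'C"))  # 48.3
```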



Thursday, 20 April 2017

Improving the Windows Photo Screensaver

Way back in 2011 I decided to set up the laptop screensaver to show my photos. The problem was that although I had around 10,000 or so photos, the screensaver in shuffle mode was apparently using a random seed based on the photo folder properties. This meant I only ever saw the same photos, at least until I'd added or removed some.

(I have no idea if the latest incarnation on Windows 10 uses a time-based seed now, mainly because my own version has worked for me without a hitch for the last 6 years.)

As you can guess, I wasn't particularly impressed with this so I decided to write my own photo display screensaver.

I wanted to write this using .NET and WPF, so I first created a Window with 3 vertical columns. The reason for this was that I wanted portrait photos to be centered. The XAML is something like this:

<Window x:Class="WPFScreenSaver.PhotoScreenSaver"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    Title="PhotoScreenSaver" WindowStyle="None"
    WindowState="Maximized" ResizeMode="NoResize"
    Loaded="OnLoaded" Cursor="None" KeyDown="PhotoStack_KeyDown"
    MouseDown="PhotoStack_MouseDown"
    MouseMove="PhotoStack_MouseMove" >
    <Grid Name="ImageGrid" Background="Black">
        <Grid.ColumnDefinitions>
            <ColumnDefinition Name="Col1"/>
            <ColumnDefinition/>
            <ColumnDefinition Name="Col2"/>
        </Grid.ColumnDefinitions>
        <Image Name="ScreenImage" Stretch="Uniform" Grid.Column="1"/>
    </Grid>
</Window>


Note the KeyDown, MouseDown and MouseMove events are used to detect activity to stop the screensaver. In the code-behind these call Application.Current.Shutdown(); to kill the screensaver.

The WindowStyle, WindowState and ResizeMode properties ensure the window fills the whole screen, including covering the Taskbar.

In the PhotoScreenSaver constructor, I run a file discovery step to obtain a list of the filenames of all files with a .jpg extension under a root folder. (This root folder is stored in the Registry.)

Then I create a new Random instance. This has a seed based on the current time.

A Timer is set up with a callback to a ShowNextImage method.

It's important to note here that the timer callback is not on the main UI thread, which means we won't be able to update the image directly; we need to use the Dispatcher to invoke it on that thread:

if (this.Dispatcher.Thread != Thread.CurrentThread)
{
    this.Dispatcher.Invoke(new Action<Object>(ShowNextImage), new object[] { stateInfo });
}


I obtain an index to the file to show using the random number generator. NextDouble returns a double that is at least 0.0 and less than 1.0, so I can multiply this by the number of images to get an index.

var index = RandomGenerator.NextDouble() * imageFileCount;
var filename = ImageFiles[(int)index];

I don't worry about rounding, since it's not really a problem if I get ImageFiles[7689] or ImageFiles[7688], and because NextDouble never returns 1.0 the index can't run off the end of the list.
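For illustration, the same index trick sketched in Python (the names here are stand-ins, not the actual C# code):

```python
import random

image_files = [f"img{i:04}.jpg" for i in range(10000)]  # stand-in file list
rng = random.Random()  # seeded from the OS / current time by default

# random() returns a value in [0.0, 1.0), so truncating the product
# always yields a valid index 0 .. len(image_files) - 1
index = int(rng.random() * len(image_files))
filename = image_files[index]
print(0 <= index < len(image_files))  # True
```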

The image itself is loaded as a BitmapImage. This is then scaled to fit the screen.

To turn this from a photo display app into a fully fledged screensaver we need to do some work with the Application startup. In App.xaml we need to change Startup to point to our own startup method

<Application x:Class="WPFScreenSaver.App"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    Startup="OnStartup"
    >
    <Application.Resources/>
</Application>


The OnStartup in the code-behind class does two things: first it reads the command line arguments, and second it shows either the screensaver, a settings window or a preview. The arguments are:
  • /c  Show the settings dialog
  • /p Show a preview (I just call Shutdown for this)
  • /s or no argument, show the screensaver window
The last step is to display the photo window. I've just updated this part (and hence the blog!), mainly because the original implementation didn't work with two monitors. I now enumerate each screen, determine its dimensions and create a photo window accordingly.

foreach (System.Windows.Forms.Screen screen in System.Windows.Forms.Screen.AllScreens)
{
    PhotoScreenSaver window = new PhotoScreenSaver();
    window.WindowStartupLocation = WindowStartupLocation.Manual;
    System.Drawing.Rectangle location = screen.Bounds;
  
    // Set window position and size
    window.Left = location.X;
    window.Top = location.Y;
    window.Width = location.Width;
    window.Height = location.Height;

    // Tip, if this isn't the primary window and you set
    // window.WindowState = WindowState.Maximized;
    // Before the window has been generated, it'll maximise into the primary
    // In any case, using normal seems fine.
    window.WindowState = WindowState.Normal;
}

//Show the windows
foreach (Window window in System.Windows.Application.Current.Windows)
{         
    window.Show();
}

The windows are entirely independent. I could improve this by sharing the ImageFiles list between them and synchronising the updates, but it really works quite nicely as it is.

Finally, after building the exe, rename it to Screensaver.scr to turn it into a screensaver. After this, you can right-click on the .scr file and select Install.

Thursday, 9 March 2017

Bird Motion Detector

It was pointed out to me that while live streaming was very impressive, a view of an empty box wasn't answering any questions about whether it was getting visited.

The solution was to set-up motion detection and save still images whenever something happened.

The code is based on the Image processing tutorial here.

By installing the libraries, you get a Camera object that you can use to quickly grab and manipulate images. The Camera class is imported from the imgproc library.

The motion detector code can then be built as a small Python file:

import subprocess
from imgproc import *

cam = Camera(320, 240)
continue_processing = True

last_frame_colour = 0
index = 1
savecount = 0
threshold = 40

try:
    while continue_processing:
        average_red = 0
        average_green = 0
        average_blue = 0

        image = cam.grabImage()
        
        total_red = 0
        total_green = 0
        total_blue = 0
       
        # Only check central 40 x 40 box  
        pixel_count = 1600

        for x in range(140, 180):
            for y in range(100, 140):
                red, green, blue = image[x,y]
                total_red += red
                total_green += green
                total_blue += blue

        # average rgb per pixel
        average_red = total_red / pixel_count
        average_green = total_green / pixel_count
        average_blue = total_blue / pixel_count

        frame_colour = average_red + average_green + average_blue

        filename = "image" + str(index) + ".jpg"
                
        if abs(frame_colour - last_frame_colour) > threshold:
            # Release the camera so raspistill can use the sensor
            del cam
            subprocess.call(["raspistill", "-o", filename])
            savecount += 1
            cam = Camera(320, 240)

        if savecount > 20:
            continue_processing = False

        last_frame_colour = frame_colour

        index += 1

finally:
    print "Finished"


The code does the following:
  1. Create a fairly low resolution camera 320x240 pixels
  2. Enter a loop and grab an image  image = cam.grabImage()
  3. Take the pixels in the central 40x40 box and build up a representative average colour
  4. Compare this with the average from the previous iteration. If the difference exceeds a threshold, in this case set at 40 after some experimentation, then take a still image
  5. To save the still I call out to raspistill using subprocess. The problem here is that I first need to release the Camera used for detection so raspistill can access the sensor, which I do by calling del on the object. Leaning on del and garbage collection like this doesn't really play nicely with Python, and it's certainly something to improve in future.
  6. Finally, I set a limit of saving 20 images before exiting the loop. This is a conservative limit intended to prevent the SD card filling up. The images are just named with an index based on the iteration through the loop. Again something more sophisticated would improve things here.
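The comparison in steps 4 and 5 boils down to a single expression. Pulled out as a function (a sketch with illustrative numbers, not the loop above), it's easy to check in isolation:

```python
def motion_detected(last_frame_colour, frame_colour, threshold=40):
    """True when the average frame colour shifts by more than the threshold."""
    return abs(frame_colour - last_frame_colour) > threshold

print(motion_detected(300, 360))  # True: a bird-sized change
print(motion_detected(300, 310))  # False: within sensor noise
```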
I wanted to run this process continuously while I'm not logged on, so I used the following command:

nohup python BirdDetector.py > /dev/null &
  • nohup keeps the process running even after logging out of the session
  • & runs the process in the background
  • > /dev/null redirects the output to the null device
So, did I detect a bird?

Yes!

Was the bird (a great tit) concerned about being spied on?

Who knows!

Thursday, 23 February 2017

LeapMotion Robot controller

Sometimes impulse takes over and I end up buying something that seems really cool but in the cold light of day doesn't quite live up to its promise.

In this case, about two years ago, I bought a LeapMotion controller.

The premise seemed great, a way of using hand gestures to intuitively control your computer. It seemed especially promising given the controller could track hand and finger positions for both hands. Unfortunately, my old laptop didn't have a good enough spec to do anything more than the most basic samples and even then the experience seemed too prone to errors.

The new laptop certainly has enough power but even now the tracking sometimes seems to get the jitters. So it seemed like this piece of kit was destined to lie in a cupboard like so many other pieces of technology around the world.

But cometh the hour, cometh the robot-arm and it occurred to me LeapMotion would make for an interesting control interface.

Starting simply, my intention was to control the open and shut movement of the grabber with my index finger and thumb. By moving them apart and together I hoped to get the grabber to follow this motion.

To begin, I downloaded the LeapMotion 2.3.1 SDK. Using Eclipse, I created a new Java application and added LeapJava.jar (right-click on the project, select Properties -> Java Build Path -> Libraries tab and Add External JARs) so that it appeared in the project:


To access the tracking data I created a class that extended the com.leapmotion.leap.Listener class and added this to a Controller:

LeapMotionListener listener = null;
Controller controller = new Controller();

try
{
    // writer and TcpClient come from the earlier robot arm project (not shown here)
    listener = new LeapMotionListener( writer, new TcpClient() );

    controller.addListener(listener);
    System.out.println("Press Enter to quit...");
    System.in.read();
}
catch( Exception e )
{
    e.printStackTrace();
}
finally
{
    if (listener != null)
    {
        controller.removeListener(listener);
    }
}


In the Listener class I implemented onConnect and onFrame. The interesting implementation was onFrame. This is called in a separate thread every time data is sampled. The event details can be obtained with a Frame instance

@Override
public void onFrame(Controller controller) {

    Frame frame = controller.frame();

The API then quite nicely allows you to obtain the information about the index finger and thumb in the shape of LeapMotion Vectors:

FingerList indexFingerList = frame.fingers().fingerType(Finger.Type.TYPE_INDEX);
Finger indexFinger = indexFingerList.get(0); // only one index finger per hand, and I'm only holding one hand in the field of view

FingerList thumbList = frame.fingers().fingerType(Finger.Type.TYPE_THUMB);
Finger thumb = thumbList.get(0);

Vector thumbTip = thumb.tipPosition();
Vector indexTip = indexFinger.tipPosition();



At this point I started investigating the behaviour of the sensor. Calculating the distance between finger and thumb is simple:

final float distance = thumbTip.distanceTo(indexTip);

Unfortunately this measure is too noisy to be able to reliably control the arm. So I decided on a velocity measure:

v = delta (distance) / delta (time)

If the velocity exceeded a threshold then the signal to drive the motor in the appropriate direction would be sent.
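The thresholding can be sketched like this (illustrative Python, not the actual Java client; the threshold values and units are made up for the example):

```python
OPEN_THRESHOLD = 200.0    # mm/s, illustrative value
CLOSE_THRESHOLD = -200.0  # mm/s, illustrative value

def grabber_command(prev_distance, distance, dt):
    """Map pinch velocity to a motor command: 'open', 'close' or None."""
    velocity = (distance - prev_distance) / dt
    if velocity > OPEN_THRESHOLD:
        return "open"    # finger and thumb moving apart quickly
    if velocity < CLOSE_THRESHOLD:
        return "close"   # finger and thumb moving together quickly
    return None          # below threshold: leave the motor alone

print(grabber_command(50.0, 80.0, 0.1))  # open: +300 mm/s
print(grabber_command(80.0, 50.0, 0.1))  # close: -300 mm/s
```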

I set up the robot arm and Arduino as per my earlier blog, replacing the Android app with this LeapMotion client.

This is certainly a work in progress. For one, the velocity measure is still a bit noisy. Also, the motor, being a simple DC motor, is set up to only work at one speed. Finally, if the controller loses tracking of the hand, the motor is left in its last known state. I'll add a post in future to address some of these problems and make the source available on GitHub.
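One possible way to tame that velocity noise (my own suggestion, not something in the current code) is to low-pass filter the raw readings with an exponential moving average:

```python
def ema(values, alpha=0.3):
    """Exponentially weighted moving average: higher alpha = less smoothing."""
    smoothed = []
    current = values[0]
    for v in values:
        current = alpha * v + (1 - alpha) * current
        smoothed.append(current)
    return smoothed

noisy = [0, 10, 0, 10, 0, 10]
print(ema(noisy))  # the jitter is damped toward the mean
```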