Raspberry Pi + Camera


The idea of setting up a surveillance system has been on the to-do list for a long time. There have been a few break-and-enters in our neighbourhood for petty stuff, and the last straw was a break-in of my vehicle, so I decided to set one up. Why make it easy and purchase a ready-to-go camera like a Foscam? It would not be as exciting. After some searching, I ended up purchasing the following from Newark Element14:


I placed the order on a Thursday, received the goods the next day, was up and running on Sunday, and debugged for the next few days. :-)  This was a trivial project given the amount of information on the web from people who have already done the same thing.

Partial View of my Home Network

The home network has several devices connected. For the purpose of this discussion, I utilized my WD MyBookLive, which was basically doing nothing. Storing media on an SD card in the Raspberry did not appeal to me either. Besides, with SSH enabled on the WD, I can do all the Linux stuff. More about that later. The following is a partial network diagram of what is going on at home.


The steps taken are the following:

  1. Setup Raspberry with Raspbian OS
  2. Configure WIFI
  3. Install and Configure Motion
  4. Setup WD MyBookLive and share folder
  5. Update fstab in Raspberry to automatically mount the share
  6. Lots of files will be created. Some purging of files via CRON is in order

Setup Raspberry with Raspbian OS

This horse has been beaten enough and is well explained on the web. I ordered the SD card with NOOBS and, when powered up, one is presented with the option to install Raspbian. This was absolutely painless. Update Raspbian and its packages at this point.
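For completeness, the update step amounts to the usual pair of commands:

```
sudo apt-get update
sudo apt-get upgrade
```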


Configure WIFI

This was the part that created the most grief. That light on the USB WiFi dongle did not want to light up. The following entries in the /etc/network/interfaces file worked; I opted not to use the wpa_supplicant configuration file.

e.g. wpa-roam /etc/wpa_supplicant/wpa_supplicant.conf 

For the wpa-psk entry, invoke the command wpa_passphrase <ssid> to generate a hashed key for an added layer of security, and then copy the generated key into the interfaces file.
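The interfaces entries from the original post were lost, so the following is a reconstruction of the general shape rather than my exact file; the SSID is a placeholder and the wpa-psk value is whatever wpa_passphrase emits for your network:

```
auto wlan0
allow-hotplug wlan0
iface wlan0 inet dhcp
    wpa-ssid "YourSSID"
    wpa-psk <hashed key from wpa_passphrase>
```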

Install and Configure Motion

This is well explained here and works. My configuration is a little different than posted in the link and is as follows:

The camera captures people walking and now points more at the yard than the road. Although not a freeway, car movement was generating too much motion and lots of files. Creating a mask file is simple. I followed these instructions and it worked fine. The exact steps on this end included the following:

  • Copy a jpg from the camera that includes a moving car
  • Open the file using Gimp (or Paint/Photoshop); Gimp can save to PGM in raw format
  • Black out areas where you don’t want movement detected
  • White out areas that are in scope
  • Save the file as PGM (raw)
  • Copy file to a directory that motion can read. I used WinSCP to transfer the file to the Raspberry
  • Change the ownership of the file so that motion can access it
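With the PGM in place, wiring it into motion is a one-line setting in motion.conf; the path below is illustrative, not my exact layout:

```
# in /etc/motion/motion.conf
mask_file /etc/motion/mask.pgm
```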

Note that sudo motion start/stop/restart and the motion.log file are your friends when trying to tune motion.


Setup WD MyBookLive and share folder

Browsing the files from OS X proved painfully slow. What can be done, though, is to use the TwonkyMedia Server installed on the WD. Mine is an old version (5.1.9) and I felt that upgrading was not worth it at this point.

Enabling SSH

Enabling SSH so that one can remote into the WD is simple. Log into your WD device as per usual, e.g. http://<ipaddress>. Once logged in, change the URL to


and a screen similar to the one below presents itself.




Enable SSH and then log into the WD via a terminal (built into OS X; on Windows use something like PuTTY). At the prompt, type ssh root@<ipaddress> and use the default password of welc0me. Be sure to change the password when you log in to close that security hole.

Go ahead and create a user and a share for the Raspberry to push files to.


Update FSTAB

Now that a share is created on the MyBookLive, it is time to have it automatically mount on the Raspberry by adding a line to the /etc/fstab file. For example, assuming you know the IP address of your MyBookLive, your share is called camera on the WD, and you are mounting to the directory /mnt/nas/camera on the Raspberry, the following would end up in your fstab file.
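The fstab line itself was posted as an image and did not survive; a hedged reconstruction of a CIFS entry of that shape, with placeholder address and credentials, looks like this:

```
//<ipaddress>/camera /mnt/nas/camera cifs username=<user>,password=<password>,rw 0 0
```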

Note: don’t leave spaces around the commas in the options field. Type mount -a to see if all is good; the contents of the share on the MyBookLive should show up. Reboot for good measure, e.g. sudo reboot.


A script on the MyBookLive runs weekly: files older than 7 days are tarred into an archive and then removed. anacron is what is already configured on the WD, so in the /etc/cron.weekly folder a file archivecam with the following content does the work. The execute flag needs to be set via chmod +x archivecam.

The crontab file was already configured to run what is in the cron.weekly directory.
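The archivecam listing was likewise lost. A sketch of a script matching the description (tar files older than 7 days, then remove them), with /shares/camera standing in as a placeholder for the actual share path:

```
#!/bin/sh
# archivecam - tar camera files older than 7 days, then remove them
CAMDIR=/shares/camera            # placeholder for the share path on the WD
cd "$CAMDIR" || exit 1
find . -maxdepth 1 -type f -mtime +7 ! -name 'archive-*' > /tmp/oldcam.$$
if [ -s /tmp/oldcam.$$ ]; then
    tar -cf "archive-$(date +%Y%m%d).tar" -T /tmp/oldcam.$$ \
        && xargs rm -f < /tmp/oldcam.$$
fi
rm -f /tmp/oldcam.$$
```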



The AVI files generated present a convenient way to review the motion throughout the day, and I’m quite impressed with the motion functionality out of the box. That said, one of the goals is to perform image processing on the files. OpenCV is still sitting, waiting for something to do. I still need hardware with more punch to do this in a mobile setting, so for now it is an off-line process. Gait analysis is an interesting area to explore, and this little setup is not equipped for that.

The camera purchased is compact and quite clear. The Pi NoIR shall be used for an inside-the-birdhouse view in springtime on another Raspberry. The following is a motion capture from before moving the camera to its new location.









The camera is temporarily mounted, until springtime, on the vertical end of a window. I threaded some 14/2 copper wire through the two holes in the camera PC board, which allowed me to point the camera and have it stay in that position. Not a pretty job, yet functional.



I got around to setting up the 120 GB SSD on the Quad. I ended up making my own power cable with header wires to connect to the +5 and ground pins. I could not find a power connector that fit J12 on the board at the local stores and didn’t want to order a small part online and pay for shipping. What I have works.

The process was rather painless at first. I followed the procedure for creating a bootable image via OS X as described at the Udoo website. I then booted the board with the newly imaged SD card, with the SSD connected to the board. I then installed gparted via

then ran

to partition the SSD and also to create another partition on the SD card for backing up stuff, since I had about 23 GB extra on the card to use.
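The commands were lost with the original formatting; from context they were along these lines:

```
sudo apt-get install gparted
sudo gparted
```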


I liked what another Udoo user suggested and allocated space for u-boot in the event that a complete boot from the SSD one day arrives. I created a partition for root/home and another for the swap, as shown below.


Once the partitioning was complete, I did a quick check with fdisk -l to make sure the device was listed. I then mounted the SSD, e.g. sudo mount /dev/sda1 /media/ssd for my environment, then followed the instructions posted at elinux.org using the wget/tarball approach. The only thing left was to prepare u-boot via setenv root… as stated in the instructions. I noticed that OS X did not recognize the Udoo Quad plugged into the USB port and concluded I was missing a driver. I ended up installing the OS X driver from Silicon Labs. Once that was complete, I proceeded with SerialTools to interrupt the boot process.

I set the root parameter as suggested and resumed the boot process. Under that scenario, I could not get the root file system to point to the SSD; it was just booting as if there were only an SD card. What I had to do was change the mmcroot to point as follows

rather than /dev/mmcblk0p1
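The u-boot fragment is another lost listing; from the surrounding text, the change at the u-boot prompt amounts to something like the following (variable handling may differ by u-boot version, so treat this as a sketch):

```
setenv mmcroot /dev/sda1 rootwait rw
saveenv
boot
```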

I could not otherwise make the system root/home point to the SSD. I’m not a hard-core Linux person. This approach was not described as part of the instructions, and I ended up looking at the boot arguments to make this work. The last touch was to set up /etc/fstab to deal with the swap partition and my backup space on the SD card.

The fstab for my environment is the following:
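The listing was an image; a hedged reconstruction with placeholder identifiers (run blkid to find your own UUIDs) would be along these lines:

```
# swap partition on the SSD
/dev/sda2        none           swap  sw        0  0
# backup partition on the SD card
UUID=<your-uuid> /media/backup  ext4  defaults  0  2
```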

Note that the UUID is specific to my environment and can be found via the blkid command. I rebooted the board (sudo shutdown -r now) and then ran


and the output is shown below.



I then installed MySQL server as per the standard Ubuntu instructions. What is left to do is to port what I have on my PC to the Quad and replicate the SCADA side. Once that works, I will move the Arduino code and energy monitoring hardware to the Quad as well. Then call this project a day.

A Distraction – UDOO Quad

I received my UDOO Quad today. I was not expecting to dust off the faithful home energy system I wrote a couple of years ago. It has been running well, yet I feel guilty having a home computer running 24×7 to act as my SCADA host. The UDOO is a board that blends the Arduino and Linux nicely and allows a solid-state drive to hold all the data. On its own, without an SSD, it consumes around 3.7 watts in standby.

I currently have about three years’ worth of energy and temperature data that I don’t want to lose, so it has to be migrated as well. I’m hoping to analyze it using the R language one day. It should be a relatively easy port, as everything is cross-platform.




GNU Octave OS X

Why GNU Octave

Learning image processing using C++ is not practical for a newbie like myself, as it is not conducive to trial and error. Besides, I would like to create a model and explore it in an iterative fashion before I code it in C/C++/Objective-C.

I opted to install GNU Octave on the Mac Mini since all my dev is on that platform. I did install it on Windows 7 a while back for an earlier project, so I know what it can do. I also wanted a tool that was almost 100% compatible with MATLAB code.

Installing GNU Octave

My dev box is a Mac Mini with a 2.3 GHz Intel Core i7 and 16 GB RAM, running OS X version 10.8.3 (Mountain Lion). There seem to be a lot of issues with installing Octave on OS X based on what I see on the web. I went down the MacPorts path since I had used it for some other installation.

The Steps

  • Download and install Xcode from the Apple App Store
  • Command Line Tools need to be installed manually. Apparently, one can download them without Xcode (web rumour). I did not go down that path as I needed Xcode for iOS development. To install the Command Line Tools, select Preferences from the Xcode menu as shown below, then proceed to the Downloads tab where you will find the Command Line Tools with an install button. Mine are already installed.

cmdline tools

  •  Install MacPorts as per instructions
  • Since I already had MacPorts installed, I ran sudo port selfupdate followed by sudo port upgrade outdated to bring everything up to the latest versions. Note, I use sudo to elevate the privilege required to run port.
  • The documented way to install, e.g. sudo port install octave-devel +atlas+docs, did not work for me. I kept getting errors and had to perform the installation with sudo port install octave-devel gcc47. I think it has to do with atlas, which forces everything to recompile. I suppose one could run it with gcc48 as well.
  • When I reran sudo port install octave-devel +atlas+docs things worked ok.

Note: atlas takes a long time to build. I thought the process was frozen, but Activity Monitor stated that it was busy. Eventually it completed.

Optional Packages

Octave comes with several optional packages depending on what you want to do. I wanted the image and signal packages. To install a package, run octave (type octave at the terminal prompt) and execute the command pkg install -forge <package name>; for image, pkg install -forge image. The package is not automatically loaded. To load it, type pkg load <package name>; for the image package, pkg load image.

Testing It Out

I tested my installation with the image package by taking a white square on a black background, performing a 2D FFT, displaying the magnitude and phase of the signal, and recreating the original image by performing an inverse 2D FFT. For the curious, the FFT of this image results in a cross in the frequency domain, which is in reality a 2D sinc function. In the 1D world, the FFT of a square pulse is a sinc function; in the 2D world, a square is the equivalent of a pulse. My phase looks wonky, but at this point the goal was just to test the installation on OS X. Passed.


octave 2dFFT

Spring Cleaning-Flow Meter

I read yet another book on iOS development and got the creative juices flowing with lots of ideas for some image processing. That said, I’m starting to feel like a hoarder of electronic parts and opted to put a couple of SeeedStudio Water Flow Sensors to some use. Besides, someone asked me to describe how to use them in simple terms. So this side trip’s goal is to put something together to measure the flow from the kitchen faucets. The functional requirements include the following:

  • Hot water line measurement
  • Cold water line measurement
  • Current flow rate in L/sec
  • Running volume for the day in L
  • Max flow Duration for the day in seconds
  • Min flow Duration for the day in seconds
  • Average Duration for the day in seconds
  • Total flow duration for the day in seconds 
  • Integration with my existing M2M Mango installation over Modbus

The Approach

The code itself is not complicated and is derived from the product wiki page. I briefly entertained an event-based model to abstract the interrupts and looked at the Cosa OO platform for the Arduino as a possible candidate. In the end, the spirit of Arduino is to allow “artists, designers, hobbyists” to innovate, so I opted for something simple.

From the spec sheet,

f=7.5Q where

f is the frequency in Hz (±3%, horizontal mounting position)
Q is the flow rate in \frac{L}{min}

(1) \therefore Q=\frac{f}{7.5}

To convert Q to \frac{L}{s}, multiply (1) by \frac{1\,min}{60\,s}, which leads to

(2) Q=\frac{f}{7.5\cdot 60}=\frac{f}{450} in \frac{L}{s}
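As a quick numeric check of equation (2), and nothing more than that: suppose the counter reads 30 pulses in one second; then Q = 30/450 ≈ 0.0667 L/s. A one-liner to sketch the conversion:

```shell
# Q = f / 450 gives L/s per equation (2); f is the pulse count over 1 s
f=30
awk -v f="$f" 'BEGIN { printf "%.4f\n", f / 450 }'   # prints 0.0667
```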

The following oscilloscope screen capture shows a pulse train generated from the flow sensor. The aim is to count the number of rising edges (low to high) during a 1-second interval.

The following figure illustrates a three second view of the flow rate. To calculate a running total of the volume of water consumed, one has to integrate the flow rate over a period of time. In this case we just perform a running sum of the area under the curve. In this example, A1 + A2 + A3.

For A1, v=Q*t where
v is the volume in L
t is the sample period which is 1 sec
Q is the flow rate calculated as shown in equation (2) earlier


In general terms, v=\sum_{k=1}^{n}Q_k for n samples, assuming t = 1 second

Variations in flow rate during the 1-second sample period show up as an increase or decrease in the number of pulses. If, for example, the flow was 0 for half a second and X over the other half, the end result is the average of the two flow rates.
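To make the running sum concrete: if three consecutive one-second samples counted 30, 45, and 15 pulses, the running volume is (30+45+15)/450 = 0.2 L. A sketch of the totalizer arithmetic:

```shell
# v = sum(f_k) / 450 when the sample period t is 1 s
awk 'BEGIN { n = split("30 45 15", f, " ");
             v = 0;
             for (k = 1; k <= n; k++) v += f[k] / 450;
             printf "%.3f\n", v }'   # prints 0.200
```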

The Core Components 

Putting things Together

The circuitry for this, as well as the code, is simple: not much math and just a couple of resistors. It is well described at the product wiki page. What I did was create a host board for the Pro Mini and wired it up with old wire-wrapping wire.

The Code

I opted to create a class for the basic flow meter functions, as I may want to reuse the code later if I get a different flow sensor. I opted not to create a base flow-meter class from which sensor-specific classes derive. It keeps things simple for the target audience, the hobbyist.

The source can be found here: FlowMeter.h and FlowMeter.cpp. Create a FlowMeter folder under the libraries folder of your Arduino IDE installation. As for the main code, it can be downloaded from here. Create a new Arduino project and paste it into the editor.

The main loop is shown below. It is far from complex. The sei/delay/cli sequence is key. I did not get into using millis() to measure the delta-T; I did a quick test and it was 1001/1000 ms, so close enough. begin()/end() need to be invoked before and after the sei()/cli(). The rest of the code just checks if the totalizers need to be reset based on time of day and processes requests from the host. To avoid introducing too many errors, the time spent on that overhead should be kept to a minimum so that we keep an eye on the pulses.

Installation and Testing

The following photo shows how the solution was mounted. Note that the flow meters were mounted horizontally.


For testing, I took a 1-litre measuring container, filled it, and checked the value on the HMI host for both hot and cold water lines. The volume incremented by 1.1 L on the host. I repeated this several times for different flow rates. Close enough; I do round to one decimal place. The HMI polls every 30 seconds, so the flow rate reading is usually zero unless the tap is running for a long time. I also track the flow time, though I am more interested in the total volume consumed. The screenshot fragments below show what is seen from the host.


Back to Room One

Time to explore something outside my comfort zone: image processing. I’ve gone through a few geek books during hard-to-find spare time. There are so many apps in the iOS world that I needed to dig a little deeper to get a better understanding of what is under the hood. iOS Programming: The Big Nerd Ranch Guide and Objective-C Programming: The Big Nerd Ranch Guide are good introductions. If you know C/C++, the Objective-C book can be read rather quickly. I liked going through the exercises in the iOS programming book (well, Kindle version) to force me to navigate the Xcode/iOS documentation. My real motivation is to do some image processing, so I opted to read the OpenCV 2 Computer Vision Application Programming Cookbook.

I got a “hello world” to work on the iPhone, MacBook Air, and Ubuntu 12.04 running under Oracle VirtualBox for Windows. I got something going on the MacBook Air to perform basic motion tracking and facial recognition over a video stream through the built-in camera. OpenCV makes this rather trivial. There are a lot of threads out there on getting OpenCV installed under Ubuntu, of which this one gave me the best results. The following is the output from a program to detect faces in a picture downloaded from the web, compiled in an Eclipse environment. I wanted to ensure my dev environment was in working order.

Now what? Well, time to roll up the sleeves, dust off the math, and start exploring. I purchased a GoPro cam, as everywhere I go we are under constant surveillance. Time for sousveillance, a term coined by Steve Mann.

Power Measurement Revisited

Spot Measurement

I finally got around to tinkering with things again and opted to digress from the ZigBee standalone mote-like development to revisit power measurement.

I have three types of current sensors lying around and decided that, given I already wrote the C++ classes and Modbus integration for total home energy consumption, I could easily create a spot-measurement Modbus slave device to monitor specific loads. Note that all my Arduino+XBee nodes are Modbus slave devices; only the mote-like devices shall use a different protocol.


The sensors in question include a SeeedStudio non-invasive AC current sensor, a general-purpose 0-20 A CT from CR Magnetics (CR8410-1000) for experimentation, and a Hall-effect based chip from Allegro (ACS712ELCTR-20A-T).







With known loads, I compared the output from each of the sensors as illustrated below.

Allegro hall effect chip.

What I liked about the Hall-effect chip was that a DC offset of 2.5 V was already in place to feed into the A/D of the Arduino and fit the 0-5 V range requirement. The CT from CR Magnetics was clean and sensitive down to the half-amp range, which I wanted for spot measurements. The non-invasive one was bulky and noisier on the low end.

I did not spend much effort on the linearity of the sensors. Like the total home energy solution completed a while back, I generated a scatter plot of known currents vs. my calculated currents, then fit a line of the form mx+b to linearize the calculations in the desired operating regions.

Phase Shift

Delays introduced by the CT relative to the voltage can introduce errors for non-unity power factor loads. I put both signals on the scope and examined the phase offset introduced; my scope software can calculate the phase difference between two signals. I was not seeing enough phase delay to warrant compensation for my home application.

Housing the Components

A plastic electrical box was used to host the Arduino (Fio) and XBee, plus the power supply and voltage measurement transformer. Not the prettiest, but it serves the purpose.

The same host software used for total home energy monitoring was used to poll, display, and trend the information from my Modbus-aware device. The chart below shows the power consumption of my PC and other stuff plugged into the power bar.

Pin Sleep Xbee with Arduino Host

Making it work.

I got everything to function with a rather messy board setup as shown below.

The output from the Arduino shows the delta-T between messages received from the end devices. It is pretty close to the calculated one. I will change the duration to 15 minutes later on, but for debugging purposes, 10 s intervals for pin sleep are tolerable.


The following libraries were referenced:

Arduino library for communicating with XBees in API mode – I used both the Java and C++ versions for this.

Standard Template library (STL) for AVR – I wanted some maps and other stuff in the dev environment.

Streaming C++-style Output – print and println get to be a pain after a while.

Arduino Modbus Slave – I used this for my energy monitoring project as well and hacked in Modbus function 6. Since then, a new library that includes function 6 has been released, but I still used the older version, as I had wrapped the C++ class (SerialModbusSlave) and did not feel like manually merging code.


I leveraged the serial ports available in the Mega2560 and created global variables to reference instances of the serial port as follows:

Which allowed me to do things like the following:
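The original listings did not survive the post, so here is a hedged sketch (my own names, not the original code) of the alias idea and its use on the Mega2560:

```cpp
// References to the Mega2560 hardware serial ports; swap roles here only.
// Names below are illustrative, not from the original sketch.
HardwareSerial &consolePort = Serial;   // USB debug console
HardwareSerial &xbeePort    = Serial1;  // XBee radio

void setup() {
  consolePort.begin(57600);
  xbeePort.begin(9600);
}

void loop() {
  if (xbeePort.available()) {
    consolePort.println("frame byte received");
  }
}
```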

I can change ports around in one spot in the code rather than find and replace things.

I opted to over-design things a bit, as I wanted to avoid changing the code later on; tacit information tends to be forgotten over time. What I wanted was to be able to provision new end devices without changing the Arduino code, following a simple four-step process:

  1. update to the appropriate firmware on the XBee, set the PAN ID, and enable API mode 2
  2. utilize the home-grown Java app to set up the details such as IO types and lines (I am thinking of adding a clone function to make this even simpler)
  3. wire up the XBee-related circuitry
  4. provision the end device in the Arduino via the Mango HMI over Modbus

The following illustrates the concept in a non-formal way of describing software. When an ioresponse is received from an end device, I look up the corresponding meshdevice entry in the map using the string representation of the address64. I was thinking of using the int64_t type rather than a string, as the string takes more space, but space is not at a premium at this point.

The address64 strings for managed end devices are stored in EEPROM and loaded in setup(). Provisioning of end devices is sent over Modbus from the HMI host and written to EEPROM as well. Yes, overkill, but I want to evolve the system and focus on some abstraction to allow the meshdevice representation to evolve. All the smarts will be confined to one class. The tacit information, like how to provision and communicate with the HMI, will hopefully remain static.

Map of Meshdevices

A map using the STL for AVR to reference the MeshDevice instances is as follows:

And updating meshdevices is coded as follows; this will change later on, as the error handling is uber weak.

Modbus Register Buffer

Lastly, the mapping to Modbus-style registers is set up as clusters for each device, terminated with a set of registers to accept commands and address64 strings from the host. The code for it is rather simple and I will shoehorn this in later on. Up to 13 discrete (digital) and 5 analog I/O are reserved for each device. What is used depends on how the XBee device is configured, which is explained in the XBee datasheet (90000976_D).
















The prototype basically implemented the following data flow model. The issue for me is that a battery-powered end device is not green enough a solution. I’m thinking of exploring energy harvesting to help things along. The plan was also to send the battery voltage level from the end device to the SCADA host to generate low-battery alarms. Still, it would be nicer to tap into the ambient environment (vibration, etc.) to generate energy to help power the end device. This will be explored later.

Pin Sleep Xbee and Documentation Rant

Well, everything is wired up and works as planned. Sort of. The journey there was not simple. The XBee datasheet covers everything one needs to set up sleep modes. The problem is that it is not a first-pass read for a newbie. I had to scratch my head a few times. Specifically, the following:

Pin sleep for periods longer than 5 seconds caused the end device to send information every 30 seconds or so. In the back of my mind, I knew that the coordinator and router handled the registration process for end devices, and that if one of them went MIA it would be removed from the child table. It did not dawn on me that I had to set SP/SN on the coordinator and router to a value longer than that of the sleepiest end device; I thought those parameters were used only for sleep mode configuration. Alas, I was wrong. Perusing the datasheet, it states for the SN value:

My initial thought was that I don’t have any messages sent from the host (they are initiated by the end device upon waking), so why bother changing the defaults on the coordinator and router? Then I started questioning the datasheet version I was using. Although the datasheet supported the firmware version of the XBee, I did not pay attention to the differences between the OEM (doc #9000976a) and non-OEM (doc #9000976d) versions. The D version has a lot more updates and includes a section on the child poll timeout. That was the aha moment that got things to work.

To quote the datasheet, in “D” and not in “A”:


What is the difference between A and D from the front page? A hardware line not in previous versions, and the absence of “OEM”. Nice change log.

Now, as a hobbyist, I have to ramp up on a lot of fronts, and time wasting, although annoying, is par for the course. When one googles XBee and sleep modes, there is a lot of chatter. I’m sensing the XBee’s evolution and its corresponding documentation suffer from a quality issue around information management. What is the cost of ignoring quality in productivity losses in companies? A lot.

Perhaps the problem was between the chair and keyboard in this case. If I compare the effort of ramping up on the Arduino vs. the XBee, the Arduino is a no-brainer: I can focus on algorithms and more cerebrally challenging problems. I can’t say that for the XBee.

I’ll get into the mechanics next.

Sleep Mode – Timer Setup

Upgraded to WordPress 3.2. Quite painless.

Setting up the Timer

I received my components from Digikey to finish the mesh-based data collection prototyping. One of the components is the LTC6991 timer chip. I had never soldered surface-mount devices before, so I did my share of reading up on it. Mine turned out functional but not the best looking; hats off to those who make it look easy. The IC is a TSOT-23, and I purchased a breadboard-friendly board to solder it to.



 Note US penny is same diameter as Canadian penny.

This chip is awesome and worked flawlessly. The datasheet has all the information and includes a sample circuit for a self-resetting timer. The vendor also has an Excel-based calculator to facilitate resistor value calculations. It does not have provisions for the resistor/capacitor portion of the self-resetting circuit; one can refer to the datasheet for the formulas to compute those. For my experiment, the circuit used from the datasheet is reproduced below.

Solve the above equation for R_{pw} to get

R_{pw} = \frac{-t_{pulse}}{C_{pw}\cdot ln(1.429/3.3)}  (1)

C_{pw} = 4.3 nF
t_{pulse} = 55 to 56 ms. I may have to lengthen the pulse duration later.


R_{pw} came to 15.4 M\Omega and I had a couple of 7.68 M\Omega 1% resistors available; in series they give 15.36 M\Omega. Close enough.
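Plugging the numbers back into (1) as a sanity check, taking t_pulse ≈ 55.5 ms in the middle of the stated range:

```shell
# R_pw = -t_pulse / (C_pw * ln(1.429/3.3)) with t = 55.5 ms, C = 4.3 nF
awk 'BEGIN { t = 0.0555; c = 4.3e-9;
             r = -t / (c * log(1.429 / 3.3));
             printf "%.1f Mohm\n", r / 1e6 }'   # prints 15.4 Mohm
```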

The rest of the component values were calculated with the vendor tool.

R_{set}= 118k\Omega \approx 120k\Omega
R1 = 392k\Omega
R2 = 1M\Omega

I wanted a period of 10 seconds, active low. That is a very high duty cycle that you would not get with a 555 timer. The final solution will have around 5 minutes between pulses, but for testing purposes, 10 seconds is easier.

The scope output below shows the result (click on it to zoom). The period is 10.1 s.

The pulse came in at 57 ms, as shown below.

Now time to move on to setting up the Arduino and XBee integration.