In 2015, I worked with a team of seven amazing students and Olin’s new Library Director, Jeff Goldenson, on a summer design/build project called the “Olin Workshop on the Library” (OWL). The project was about designing new life into Olin’s library space and the resources it provides to students. We started with a mantra (a la Guy Kawasaki): “Make Olin’s culture manifest.” It was the highest performing team I’ve ever worked on, and we generated some pretty interesting work. Posts in this series are efforts to share our results, whether they be processes or interesting products that emerged. The post below documents a project aimed at passive documentation collection using open source hardware and software tools.
A major theme of the library reinvention project is documentation. Documentation can serve many critical purposes, including storytelling, enabling someone else to reproduce your work, providing compelling materials for fundraising, and building a personal portfolio of work, to name just a few. Design/build projects, specifically, are particularly well captured by “before/after” documentation that represents progress in a discontinuous fashion and by time-lapse photography that provides a sense of process and temporal/spatial evolution.
This post is intended to provide an enabling tutorial for setting up low-cost, open source, cloud-enabled time lapse with free or inexpensive tools (digital and physical alike). I’ll attempt to provide step-by-step instructions for creating a highly configurable time-lapse system using the popular Raspberry Pi computer that runs Linux and the photo-sharing service, Flickr.
Why Raspberry Pi?
Time lapse is not a revolutionary application of digital cameras. So, why not pick a more polished product like GoPro or Brinno? There are many reasons to go with the Pi, but our primary motivation was openness. We want as much of what we’re doing in our library to be transferable to as many other places as possible. Accordingly, whenever possible, we try to choose open source tools and inexpensive hardware so that our designs and “recipes” are as accessible as possible. Apart from the openness and the existing community, the Pi affords functional flexibility that is unmatched by closed systems.
What you need
Note: With the Raspberry Pi 2, we encountered some performance issues with a cheap WiFi USB dongle. After talking to a colleague, it seems possible that the USB drivers for cheap dongles may not properly handle the parallel processing of network packets (the Pi 2 has a quad-core processor), leading to an absurdly high packet loss percentage (~80–90%) in some cases. However, we’ve had good luck so far with the TP-Link TL-WDN3200 N600 pictured above.
Apart from the hardware shown above, a passing familiarity with the Linux command line and the Python programming language will be helpful.
The original intention of the project was to create a system that could take high-quality photos at regular intervals and upload them to Flickr. I used a similar system last summer to document a student-led redesign of a classroom here at Olin:
This time around, I decided to add motion sensing to avoid recording painfully long periods of no action in the space.
Installing the OS
First things first, you’ll need to install an appropriate distribution on your Raspberry Pi. I didn’t have a lot of experience with the Pi when I started, so the NOOBS setup of Raspbian appealed to me immediately. After copying the distribution to the micro SD card, I popped it in the slot on the Pi 2, booted with a keyboard and display attached and we were up and running!
Raspbian comes with Python 2.7 installed by default, but you’ll need an extra module to use the camera from a Python script. The module is called picamera and can be installed from the command line by running:
pi@raspberrypi$ sudo apt-get install python-picamera
To test the camera out, I used the following snippet of Python code:
#!/usr/bin/env python
import picamera

camera = picamera.PiCamera()
camera.resolution = (2592, 1944)  # use the full camera resolution
camera.led = True                 # the red LED should come on at this point
camera.capture('test.jpg')        # save the image to a file named "test.jpg"
camera.led = False
Note: You’ll need to run this file using sudo, like so:
pi@raspberrypi$ sudo python snippet.py
Because the camera accesses the general purpose input/output (GPIO) pins of the Pi, executing the script requires root-level permissions. The output of this script will be the file “test.jpg”, saved in the directory from which the script is run. If you’re connected to a display, you can view the image from the command line using the fim utility:

pi@raspberrypi$ sudo apt-get install fim
pi@raspberrypi$ fim -a test.jpg
If you’ve made it this far, you should be snapping photos with your Pi camera like a champ. But, those photos are pretty huge, and if your project takes place over the course of weeks or months, you’re going to run out of space quickly. Enter, Flickr.
The idea here is that after you’ve snapped a photo, why not simply upload it to a dedicated album on Flickr? After all, you get a whopping 1TB of storage for free.
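To put rough numbers on why local storage runs out, here’s a quick back-of-envelope estimate (the per-photo file size and free-space figures below are my assumptions, not measurements from this setup):

```python
# Back-of-envelope storage estimate for an always-on time lapse.
# Assumptions: ~2.5 MB per 2592x1944 JPEG, one frame every 3 minutes,
# and ~5 GB of free space left on the SD card after the OS.
MB_PER_PHOTO = 2.5
PHOTOS_PER_DAY = 24 * 60 // 3   # one frame every 3 minutes -> 480 frames/day
FREE_SPACE_MB = 5 * 1024        # ~5 GB free on the card

days_until_full = FREE_SPACE_MB / (MB_PER_PHOTO * PHOTOS_PER_DAY)
print(round(days_until_full, 1))  # -> 4.3
```

Well under a week at these rates, which is why offloading each frame to Flickr as soon as it’s captured is so attractive.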
Before I knew if this was even possible, I started by searching for other folks who had integrated the Raspberry Pi with Flickr. I stumbled across this post about making a digital photo frame. In the picture frame project, they used a Python module named flickrapi. Here’s the rub: if you want to upload photos to Flickr (and not just download), you’ll need to use an authentication method called “OAuth.” The best documentation I could find for setting up OAuth with Python used a different Python module named flickr_api (confusing, I know – thanks, developers). The easiest way to install many Python modules is with a utility called pip, so let’s install that first so that we can use it to install the flickr_api module:
pi@raspberrypi$ sudo apt-get install python-pip
Now, let’s use pip to install the flickr_api Python module:
pi@raspberrypi$ sudo pip install flickr_api
Before we can start uploading photos to Flickr, we need to apply for an API key. “Apply” is a strong word, particularly if you’re requesting access for a non-commercial app; after you fill out your information, you’re likely to be granted a key immediately. What you get when your application is approved are two strings of numbers and letters: one is your “API Key” and the other is your “API Secret.” Following the instructions in this tutorial, you’ll use those two strings to generate an authentication token in the form of a file that you can store on your Pi’s filesystem. (Note: the tutorial only grants “read” permissions to your app. You’ll want to change that string to “write” to enable uploading.) You’ll use that token to authenticate automatically via the Flickr API so that you can upload photos from a Python script running on your Pi.
Let’s write another snippet of Python to test our ability to upload a photo:
#!/usr/bin/env python
import flickr_api

API_KEY = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
API_SECRET = 'xxxxxxxxxxxxxxxx'

flickr_api.set_keys(api_key = API_KEY, api_secret = API_SECRET)
a = flickr_api.auth.AuthHandler.load('')
flickr_api.set_auth_handler(a)
user = flickr_api.test.login()
photo = flickr_api.upload(photo_file='test.jpg')
Let’s say you called that snippet above test_upload.py. You can run it from the command line like so:
pi@raspberrypi$ python test_upload.py
If all goes well, you should see your “test.jpg” image in your Flickr photostream.
In the last project for which I used a time-lapse camera, I tried to limit the dead time in the image stream by specifying days of the week and hours of the day during which the camera should be active. A good start was 9 am – 6 pm, Monday through Friday. But what about lunch? Well, I guess we could assume lunch happens from 12 pm – 1 pm, so we can remove that hour. Then, there are the days the team attends research talks or goes out shopping for materials. It’s pretty easy to quickly end up with lots of frames in which nothing is happening.
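The schedule-based approach can be sketched as a small helper that decides whether the camera should be active at a given moment (the windows below are just the example hours from above):

```python
from datetime import datetime

def is_active_time(now):
    """Return True if the camera should be capturing at `now`.

    Active Monday-Friday, 9 am - 6 pm, minus an assumed 12 pm - 1 pm lunch.
    """
    if now.weekday() >= 5:        # 5 = Saturday, 6 = Sunday
        return False
    if not (9 <= now.hour < 18):  # outside 9 am - 6 pm
        return False
    if now.hour == 12:            # assume lunch happens 12 pm - 1 pm
        return False
    return True

print(is_active_time(datetime(2015, 7, 6, 10, 30)))  # Monday morning -> True
print(is_active_time(datetime(2015, 7, 6, 12, 15)))  # lunch break -> False
print(is_active_time(datetime(2015, 7, 4, 10, 0)))   # Saturday -> False
```

Every research talk or shopping trip means another special case in a function like this, which is exactly the brittleness that motivated switching to a motion sensor.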
To solve this problem, I decided to integrate a passive infrared (PIR) motion sensor. The motion sensor works by detecting changes in the infrared emission within its field of view with a range of around 20 feet. (You can always make a longer cable for it if the action is far from your camera.) The idea is to drive the camera with a script that wakes at a regular interval and checks whether any motion has been detected since the last time it woke up. If so, it takes a photo and uploads it to Flickr. The interface to the sensor is a simple digital one: the sensor’s output voltage is digital “high” (3.3V) if motion is detected; otherwise, the sensor output is “low” (~0V). To use the sensor, we need to install the RPi.GPIO Python module:
pi@raspberrypi$ sudo apt-get update
pi@raspberrypi$ sudo apt-get install python-rpi.gpio
Bringing It All Together
All that’s left to do is integrate it all into a single Python script:
#!/usr/bin/python
import RPi.GPIO as GPIO
import time
import flickr_api
import picamera
from datetime import datetime as d
import os
import sys
import logging

GPIO.setmode(GPIO.BCM)        # use the Broadcom channel names (instead of the physical pin numbers)
PIR_PIN = 4                   # we use pin 4, but any GPIO pin will work
GPIO.setup(PIR_PIN, GPIO.IN)  # set the PIR_PIN to be an input
motion_detected = False       # default to no motion detected

def motion_cbk(PIR_PIN):  # the callback that will be called when PIR_PIN goes "high"
    global motion_detected
    motion_detected = True

logging.basicConfig(filename='time_lapse.log', level=logging.DEBUG)  # it's good to log stuff

API_KEY = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'  # replace with your API key
API_SECRET = 'xxxxxxxxxxxxxxxx'               # replace with your API secret
USER_ID = 'xxxxxxxxxxxxx'            # replace with your Flickr user id (you can get this from http://idgettr.com/)
DEFAULT_INTERVAL = 3                 # in minutes
PHOTO_SET_ID = 'xxxxxxxxxxxxxxxxx'   # useful if you want to have an album for your time lapse

flickr_api.set_keys(api_key = API_KEY, api_secret = API_SECRET)
a = flickr_api.auth.AuthHandler.load('')
flickr_api.set_auth_handler(a)
user = flickr_api.test.login()

photosets = user.getPhotosets()
for p in photosets:
    if p.id == PHOTO_SET_ID:
        break

if len(sys.argv) == 2:
    interval = int(sys.argv[1])
else:
    interval = DEFAULT_INTERVAL

camera = picamera.PiCamera()
camera.resolution = (2592, 1944)

try:
    GPIO.add_event_detect(PIR_PIN, GPIO.RISING, callback=motion_cbk)
    while True:
        n = d.now()
        if motion_detected:
            try:
                camera.led = True
                fname = n.strftime('%Y%m%d%H%M') + '.jpg'  # this naming convention will help with sorting and parsing later
                camera.capture(fname)
                photo = flickr_api.upload(photo_file=fname)
                p.addPhoto(photo_id = photo.id)
                os.remove(fname)
                camera.led = False
                motion_detected = False
            except:
                print("Encountered an exception")
                e = sys.exc_info()
                logging.error(e)
                camera.led = False
        time.sleep(interval*60)
except KeyboardInterrupt:
    print("Quitting...")
    GPIO.cleanup()  # release the GPIO pins on exit
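The %Y%m%d%H%M naming convention mentioned in the script pays off later: filenames sort chronologically as plain strings and can be parsed back into timestamps when you assemble the video. A quick sketch (the sample filenames here are made up):

```python
from datetime import datetime

# Hypothetical filenames produced by the capture script's naming scheme
fnames = ['201507061432.jpg', '201507061135.jpg', '201507051710.jpg']

# With this scheme, lexicographic order is chronological order
fnames.sort()

# Recover the capture time of each frame from its name
timestamps = [datetime.strptime(f[:-4], '%Y%m%d%H%M') for f in fnames]

print(fnames[0])            # -> 201507051710.jpg (the earliest frame)
print(timestamps[0].hour)   # -> 17
```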
Let’s say you saved the above script as time_lapse.py. You can try out your time-lapse camera now by running the following command:
pi@raspberrypi$ sudo python time_lapse.py 1
This will take photos at 1-minute intervals and upload them to your Flickr account, but when you actually deploy it you’ll probably want to use a longer interval. The default is 3 minutes.
Once you’ve verified that your Python script runs as expected and your Flickr account is being generously populated with time-lapse images, you’ll want to deploy your camera. Once deployed, it’s likely your Pi will no longer be connected to a display and keyboard. So, to access it, you’ll want to use the secure shell, or ssh, utility. Raspbian comes with ssh enabled by default, but you’ll need to get your Pi’s IP address using the ifconfig utility:

pi@raspberrypi$ sudo ifconfig
The output you care about for your wireless adapter is this:
wlan0     Link encap:Ethernet  HWaddr 14:cc:20:15:85:5d
          inet addr:xxx.xxx.xxx.xxx  Bcast:xxx.xxx.xxx.255  Mask:255.255.252.0
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:298986 errors:0 dropped:0 overruns:0 frame:0
          TX packets:3345086 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:45490521 (43.3 MiB)  TX bytes:550684935 (525.1 MiB)
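As an aside, if you end up scripting this step, the address can be pulled out of ifconfig’s output with a regular expression. A convenience sketch (the sample output is abbreviated and the address made up):

```python
import re

# Abbreviated ifconfig output of the kind shown above (address is made up)
ifconfig_output = """wlan0     Link encap:Ethernet  HWaddr 14:cc:20:15:85:5d
          inet addr:192.168.1.42  Bcast:192.168.1.255  Mask:255.255.252.0"""

# Grab the IPv4 address following "inet addr:"
match = re.search(r'inet addr:(\d+\.\d+\.\d+\.\d+)', ifconfig_output)
if match:
    print(match.group(1))  # -> 192.168.1.42
```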
To access the Pi over the network, you’ll need to copy the IP address shown above in the
inet addr field and run the following:
pi@raspberrypi$ ssh pi@xxx.xxx.xxx.xxx
This will run your user’s shell; however, any programs you start in your session will terminate when you log out. To avoid this, you probably want to use the screen utility. To install it, run:
pi@raspberrypi$ sudo apt-get install screen
To run screen, enter:

pi@raspberrypi$ screen

This will start another process that looks like another login session. Run your time-lapse program in the background with the following command:
pi@raspberrypi$ sudo python time_lapse.py &
Once the time lapse is up and running, disconnect from your screen session using ctrl+a followed by ctrl+d. You can now log out without worrying that your time-lapse script will terminate. To reconnect to the screen session, you simply run:
pi@raspberrypi$ screen -r
You can check out our time lapse album to see photos from our setup.
All of the Python code in this post is available from our Github repository.
So, all of this should prompt the question, “how do I get my photos off of Flickr to make the time lapse video?” This is a great question that will be answered in a follow-up blog post.