Long-Term Deep-Water Monitoring from the Chagos Archipelago
ZSL have developed low-cost cameras to monitor marine biodiversity in large marine protected areas (MPAs), using $35 Raspberry Pi single-board computers, standard webcams, and the open-source Motion detection software. ZSL reached out to UK hackspaces to help design the cameras, achieving unprecedented economy and features.
Why Raspberry Pi?
Traditionally it has been incredibly difficult to capture events underwater: the usual apparatus, such as PIR/heat, infrared and ultrasonic sensors, simply does not work underwater. The Raspberry Pi opened a new door with its low power consumption and processing power, letting us deploy a solution that really fits the bill; without it, this would have been very troublesome to achieve!
Each camera was deployed on an anchored buoy. Mounted directly onto the buoy were two solar panels charging two 90 Ah deep-cycle lead-acid gel batteries, the aerial, and a waterproof box containing the communications system. This was connected by a 50 m SWA Cat-5 cable running down to the pressure vessel containing the camera itself.
The cameras are designed to operate at depths between 20 and 50 metres. Rlab's (Reading Hackspace) Ryan White suggested basing the design around a clear polycarbonate tube, with machined HDPE end caps secured by threaded rods and double O-rings. One end cap had a threaded hole through which the SWA Cat-5 cable was run, anchored to the inside and then potted; this cable carries the power and communications. BuildBrighton's Mike Pountney and Paul Strotten machined the end caps on their lathe and offered some great technical advice, which was very well received. The outer pressure vessels easily survived a 100 m test in a hydrostatic chamber, and would likely have gone significantly deeper had the internal structure not failed at that point.
Rlab's Barnaby Shearer designed the internal support structure, which was laser-cut from 3 mm acrylic. The designs were done in 3D in OpenSCAD to check that all the components fitted together, then projected to 2D for laser cutting. The acrylic was glued with Tensol cement.
The junction box was 3D-printed and then sealed with potting compound, which was left to cure; this also formed a mechanical join between the interior and the cable gland.
Attached to the buoy in a waterproof case was a Raspberry Pi to coordinate the communications. This had an Ethernet link to the Raspberry Pi in the pressure vessel, and a WiFi dongle running in access-point mode to allow easy monitoring and reconfiguration from the research vessel. The Pi also had a serial connection to an Iridium satellite modem so it could transmit the captured images. The satellite image-transfer software was specially developed by Cambridge Consultants, and the equipment and satellite bandwidth for this trip were kindly sponsored by Iridium.
Attached to the bottom Pi was an Eve board to provide the Pi with an RTC and a temperature sensor, along with Ciseco's Humble Pi hosting an AVR and a MOSFET to turn the Pi off at night (and, critically, back on each morning). This Pi-wake was developed by Miles and Matt from Ciseco, who make an amazing range of Raspberry Pi add-ons and microelectronics and are well worth a look – http://www.ciseco.co.uk/
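The nightly shutdown described above boils down to a time-window check against the RTC. The real decision logic lived in Ciseco's AVR firmware; purely as an illustration, a Python sketch of equivalent logic might look like this (the window times are hypothetical, not the deployed schedule):

```python
from datetime import time

# Hypothetical operating window; the real schedule was set in Ciseco's AVR firmware.
POWER_ON = time(6, 0)     # drive the MOSFET on around dawn
POWER_OFF = time(18, 30)  # cut the Pi's power after dusk to conserve the batteries

def pi_should_be_on(now):
    """True when the current RTC time falls inside the daylight operating window."""
    return POWER_ON <= now < POWER_OFF
```

The AVR stays powered around the clock (its draw is negligible next to the Pi's), polls the RTC, and switches the MOSFET accordingly.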
These boards were slightly modified so the AVR could also trigger a HackHD camera, letting us capture high-definition footage as well as stills.
The boards were assembled at Nottingham Hackspace.
The camera used is the Microsoft LifeCam Cinema, a cost-effective camera conforming to the UVC specification. The only gotcha proved to be that, despite its claims, it only responds to a few 'magic' exposure settings (5, 10, 20, 39, 78, 156, 312, 625, 1250, 2500, 5000, 10000, 20000), and you have to wait 100 ms and reset the brightness after any exposure change.
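One way to work around this quirk is to snap any requested exposure to the nearest supported value before writing it to the camera, then re-apply brightness after the 100 ms settle. A minimal Python sketch of that idea (the `v4l2-ctl` invocation and control name are assumptions for a typical UVC setup, not the actual ZSL code):

```python
import subprocess
import time

# The only exposure values the LifeCam Cinema actually honours.
MAGIC_EXPOSURES = [5, 10, 20, 39, 78, 156, 312, 625,
                   1250, 2500, 5000, 10000, 20000]

def snap_exposure(requested):
    """Return the supported exposure value closest to the one requested."""
    return min(MAGIC_EXPOSURES, key=lambda v: abs(v - requested))

def set_exposure(device, requested, brightness):
    """Set exposure via v4l2-ctl (control name may differ by kernel),
    then wait 100 ms and re-apply brightness, which the camera resets
    after any exposure change."""
    value = snap_exposure(requested)
    subprocess.run(["v4l2-ctl", "-d", device,
                    "--set-ctrl", f"exposure_absolute={value}"], check=True)
    time.sleep(0.1)  # camera needs ~100 ms before brightness sticks
    subprocess.run(["v4l2-ctl", "-d", device,
                    "--set-ctrl", f"brightness={brightness}"], check=True)
    return value
```

For example, a requested exposure of 100 would be snapped to 78 before being sent to the camera.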
Rlab's Gary Fletcher and Doug Snead provided a simple command-line program to control the camera, plus a slimmed-down version of MJPEG-Streamer optimised for this camera with some additional time-stamping. This stream feeds into Motion, which starts saving frames as JPEGs once it detects an event. The JPEGs are then rsynced up to the top Pi (backups are always a good thing), and ImageMagick thumbnails and montages the images for efficient sending over the (slow) satellite link.
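The final thumbnail-and-montage step can be reproduced with ImageMagick's standard `montage` tool. A hedged sketch of how such a step might be scripted (the paths, tile geometry and quality values here are illustrative, not the deployed configuration):

```python
from pathlib import Path

def build_montage_cmd(jpegs, out_path, tile="4x", thumb="160x120"):
    """Build an ImageMagick `montage` command that shrinks each event
    frame to a thumbnail and tiles them into one JPEG, which is far
    cheaper to send over a slow satellite link than the full frames."""
    return ["montage", *map(str, jpegs),
            "-tile", tile,        # thumbnails per row
            "-geometry", thumb,   # shrink each frame
            "-quality", "60",     # trade quality for bandwidth
            str(out_path)]

# Hypothetical paths; gather the event frames Motion saved and build the command.
cmd = build_montage_cmd(sorted(Path("/data/events").glob("*.jpg")),
                        "/data/outbox/summary.jpg")
```

The resulting `cmd` list can be handed to `subprocess.run` on a machine with ImageMagick installed; only the single summary JPEG then needs to cross the satellite link.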
What did it Look Like?
The deepest ever Pi?
At 50 metres deep – could this be the deepest Pi to date?
Where was it Deployed?
The system was tested behind the scenes at ZSL London Zoo and then went on to open-ocean tests in the largest marine protected area in the world, the Chagos Archipelago.
Gary Fletcher and Barnaby Shearer testing the camera at ZSL London Zoo, behind the scenes
The proposed route for the expedition: (a) 3–4 days in Diego Garcia, (b) 3 days in the Salomon atoll, (c) 1 day at East Peros Banhos, (d) 2 days at West Peros Banhos, (e) 4 days at Three Brothers, (f) 2 days at Eagle-Danger, and (g) 1 day at Egmont.
As you can see, the results speak for themselves. There is still quite a lot of development work to do, but once these units are complete they will offer a low-cost monitoring system that, when deployed as a network, will greatly expand the ocean areas that can be observed. For those who would like a little further reading on the actual deployment, please have a look on the Chagos Trust website.
Gary Fletcher, Barnaby Shearer, Ryan White, Richard Ibbotson, Doug Snead, Paul Strotten, Mike Pountney, Miles Hodkinson, Matt Lloyd, Adam Markwell, Gary Fletcher Senior, Anna Fletcher, Charles Turner, Marty Morriss, David Curnick, Matthew Gollock, Heather Koldewey, Alasdair Davies, Charles and Anne Shepard, Yannick Mandarin, Ronan Roche, Reece Pitts, Richard Traherne, Marion Campbell, Jonathan Pallant, Ant Skelton.