Tuesday, 10 June 2014

Narrative Clip - The good, the bad and the hacks

I got my Narrative Clip lifelogging camera a few days ago. I pre-ordered it way back when, right after its successful Kickstarter campaign as Memoto.
Image from: http://www.engadget.com/2013/11/10/narrative-clip-lapel-camera-final-version-hands-on/
It is a small wearable camera that takes a photo every 30 seconds or when you double tap on it. It has a bunch of sensors inside, including an accelerometer, magnetometer, GPS and light sensor. Here's a teardown of the hardware performed by Adafruit.

After having a good play with the clip, here's what I think. Overall, I am happy with my purchase but would advise against buying the clip unless you are OK with the bad points listed below or plan to use it for various hacks and projects.

The Good

  • The industrial design is very nice - It is light, compact, easy to attach to clothing and looks quite nice when worn.
  • The camera is of reasonable quality given the battery life, which is around two days.
  • The non-camera sensors seem to function reasonably well, apart from the GPS (see "The Bad" below). 
  • As can be seen from the teardown video, it looks to be a well-engineered piece of electronic hardware.

The Bad

  • The software, or lack thereof. You can upload photos and sensor data to the cloud and view the photos in a mobile app. There is an attempt to group photos into "moments", show photos that you took using a double tap (or by accident...), and the cloud service does 90-degree auto rotation. That's it. No GPS data. Nothing else is implemented at the time of writing.
  • To reiterate, the software for the thing is horrible. There is no API, not even for viewing your cloud data via the web. There is no way to view and edit your data "offline". 
  • There is also no way to set parameters in the "firmware", such as the interval between snapshots. The device itself is not programmable at all.
  • It continues to take photos in very low light conditions, and often takes photos of the inside of my pocket.
  • It seems to ignore large accelerations and takes blurry photos because of this.
  • Sometimes, it senses double taps when none occurred, and ignores them when I try to double tap. Overall, the double-tapping feature isn't very well implemented and leads to some blurry photos and shots of fingers. It is also quite hard to aim one's body at an object or scene of interest without a view finder.

Not Sure

  • Toilets and other places where one should be careful about taking photos mean that it is a bad idea to just forget about the device and use it for lifelogging. This is both a good and a bad point about the device. I found it quite easy to forget that I have on me a camera that takes photos without any notification.
  • The above is especially troubling given that the software is so limited and the only mode of useful operation is through a subscription-based cloud service. I really want to be able to edit and remove photos before uploading them to the cloud - this is currently not an option (but can be worked around with the hacks below).
  • It takes many hours to upload a day's worth of photos due to the horrible state of Australian Internet. I assume this is also an issue for many people, but I also think it is a non-issue for many living in sane countries with good Internet infrastructure that don't operate using smoke signals and homing pigeons...

The Hacks

Given the limitations of the software, I have gone ahead and started working on some code to build an "offline narrative". The long term goal of this project is to make my own set of computer vision and sensor processing functions that enable useful lifelogging capabilities without the need for an Internet connection and cloud services.
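As an example of the kind of processing I have in mind, one obvious first step is automatically flagging the blurry photos mentioned above. My actual code is MATLAB, but here's a rough sketch of the idea in Python (names, threshold value and the use of NumPy are my own choices, not part of the project): score each greyscale image by the variance of its Laplacian, which is low when the image lacks high-frequency detail.

```python
import numpy as np

def laplacian_variance(gray):
    """Sharpness score: variance of a 4-neighbour discrete Laplacian.

    Blurry images have little high-frequency content, so their
    Laplacian response is nearly flat and the variance is low.
    """
    g = np.asarray(gray, dtype=float)
    # Laplacian over interior pixels: sum of 4 neighbours minus 4x centre
    lap = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
           - 4.0 * g[1:-1, 1:-1])
    return lap.var()

def is_blurry(gray, threshold=100.0):
    """Flag an image as blurry if its sharpness score is below
    `threshold`. The threshold needs tuning per camera; 100 is
    just a placeholder starting point."""
    return laplacian_variance(gray) < threshold
```

The same score could be combined with the clip's accelerometer data, e.g. discarding photos taken while the measured acceleration was large.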

Firstly, here's the code so far. It is some MATLAB that I cobbled together to test things out. It should run on Octave but I haven't tested it yet :)

The github page includes an extensive Readme that has a lot more information about the clip, including technical details such as how the accelerometer axes are arranged and how files are timestamped. 

The code currently operates on the cached folder of data that is created after the Narrative Uploader detects a clip connected via USB to a host PC. Note that if you disable Internet access for the uploader app, it will still cache the data from the clip and free up memory. This means you should be able to run the clip in offline mode indefinitely! (Assuming you don't update the firmware when Narrative decides to fix this...)
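Once the photo timestamps have been parsed out of the cached files, reproducing the cloud service's "moments" grouping offline is straightforward. Here's a small sketch of one way to do it (the function name and the 10-minute gap are my own made-up choices, not anything Narrative uses): start a new moment whenever the gap between consecutive photos exceeds a threshold.

```python
from datetime import datetime, timedelta

def group_into_moments(timestamps, gap=timedelta(minutes=10)):
    """Group photo timestamps into 'moments'.

    Timestamps are sorted, then a new moment starts whenever the
    time since the previous photo exceeds `gap`. Returns a list of
    lists of timestamps.
    """
    moments = []
    for ts in sorted(timestamps):
        if moments and ts - moments[-1][-1] <= gap:
            moments[-1].append(ts)  # continue the current moment
        else:
            moments.append([ts])    # long gap: start a new moment
    return moments
```

With the clip's default 30-second interval, any gap much larger than 30 seconds (e.g. the device sitting in a drawer) naturally splits the day into separate moments.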