A datacenter decommissioning last year gave me access to some old Netbotz (now APC) monitoring gear. I jumped at the chance to save it from the garbage, as I remembered it being a pretty slick way to get basic environmental data (temp, door switches, etc.) and it was a small physical footprint way to control a number of networked cameras at once.

My recollection was right, and while the management server appliance proved both annoying to work with and massive overkill for my home projects, I’ve had some luck scraping data out of its web UI.

I wrote a few Python classes to make it reasonable to programmatically access the data generated by the cameras and sensors in my backyard chicken coop; you can see the end result at Pachube.

The code is available here.  It’s driven by a small db schema that holds basic location info for the monitoring gear and metadata about sensors being watched (e.g. polling intervals and alert thresholds), and produces simple name/value pairs with sensor data satisfying given conditions.  By “conditions” here I mean things like “it’s time to report this sensor value” or “this value changed too much, so I’m reporting on it” — this is by no means meant to replace actual monitoring systems (though it could easily be used to interface with them).
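By way of illustration, the decision logic behind those conditions amounts to something like the following (a minimal sketch; the function and metadata key names here are hypothetical, not the ones in the actual code):

```python
import time

def should_report(sensor, value, last_value, last_report_time, now=None):
    """Decide whether a sensor reading should be reported.

    `sensor` is a dict of per-sensor metadata as pulled from the db,
    e.g. {'interval': 300, 'delta_threshold': 2.0}.
    """
    if now is None:
        now = time.time()
    # "it's time to report this sensor value"
    if now - last_report_time >= sensor['interval']:
        return True
    # "this value changed too much, so I'm reporting on it"
    if last_value is not None and abs(value - last_value) >= sensor['delta_threshold']:
        return True
    return False
```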

While I wrote fairly thorough pydoc, the overall documentation is hardly extensive, so I’d be happy to answer questions if getting at Netbotz data from Python is interesting to anybody else.


Pachube provides an API-exposed store for data from pretty much anything, with a particular focus on sensor data.  It’s free to use and has what seems to be a pretty complete, reasonably designed, and easy-to-use API with a nicely flexible authorization model.

I’ve been working on some code to scrape data out of the Netbotz web UI (which does not have an easy to use API) and, as I got to working on the data storage backend for the scraper, remembered Pachube and decided to give it a try rather than reinventing what was effectively the same wheel.

I have generally positive impressions of Pachube after a couple of days messing around with it.  It was easy to get going, and after stumbling into one quickly-fixed bug I had my prototype scraper up and sending data in short order.

I’m using the following python class to simplify the interaction.  It’s trivial, but I didn’t see anything similar already done, so am throwing it here for future google results:


# This code is released into the public domain (though I'd be interested in seeing
#  improvements or extensions, if you're willing to share back).

import mechanize
import json
import time

class PachubeFeedUpdate:

  _url_base = "http://api.pachube.com/v2/feeds/"
  _feed_id = None
  _version = None
  ## the substance of our update - list of dictionaries with keys 'id' and 'current_value'
  _data = None
  ## the actual object we'll JSONify and send to the API endpoint
  _payload = None
  _opener = None

  def __init__(self, feed_id, apikey):
    self._version = "1.0.0"
    self._feed_id = feed_id
    self._opener = mechanize.build_opener()
    self._opener.addheaders = [('X-PachubeApiKey',apikey)]
    self._data = []
    self._payload = {}

  def addDatapoint(self,dp_id,dp_value):
    self._data.append({'id':dp_id, 'current_value':dp_value})

  def buildUpdate(self):
    self._payload['version'] = self._version
    self._payload['id'] = self._feed_id
    self._payload['datastreams'] = self._data

  def sendUpdate(self):
    url = self._url_base + self._feed_id + "?_method=put"
    try:
      self._opener.open(url, json.dumps(self._payload))
    except mechanize.HTTPError as e:
      print "An HTTP error occurred: %s " % e

Usage is pretty straightforward.  For example, assuming you have defined key (with a valid API key from Pachube) and feed (a few clicks in the browser once you’ve registered) it’s basically like:

pfu = PachubeFeedUpdate(feed,key)
# do some stuff; gather data, repeating as necessary for any number of datastreams
pfu.addDatapoint(datastream_id, value)
# finish up and submit the data
pfu.buildUpdate()
pfu.sendUpdate()

The resulting datapoints basically end up looking like they were logged at the time the sendUpdate() call is made.  In my situation, I want to send readings from a couple of dozen sensors each into their own Pachube datastream in one shot, so this works fine.  If, instead, for some reason you need to accumulate updates over time without posting them, you’d need to take a different approach.


I set out this weekend to get an Arduino board to control my Roomba.  (The Roomba has a great – and generally open – serial interface, and iRobot deserves significant credit for encouraging creative repurposing/extensions of their products.)  I’ve got a few project ideas in mind, but for an initial step I just wanted to verify that the Arduino could a) send control commands (“move forward”, “turn right”, etc.) to the Roomba, and b) read sensor data (“something is touching my left bumper”, “I’m about to fall down the stairs”).  This post contains my notes, which will hopefully help others doing this sort through some of the issues in less time than I spent.
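As a preview of the command side: per iRobot’s published SCI documentation, each command is an opcode byte followed by data bytes.  Here’s a sketch of building the Drive command (in Python for illustration, though the actual project does this from Arduino C; the serial port name in the comments is a placeholder):

```python
import struct

# Opcodes from iRobot's published Roomba SCI spec
START, CONTROL, DRIVE = 128, 130, 137

STRAIGHT = 0x8000  # special radius value meaning "drive straight"

def drive_command(velocity_mm_s, radius_mm=STRAIGHT):
    """Build the 5-byte Drive command: the opcode, then 16-bit
    big-endian velocity (mm/s) and turn radius (mm)."""
    radius_fmt = '>H' if radius_mm == STRAIGHT else '>h'
    return (struct.pack('>B', DRIVE)
            + struct.pack('>h', velocity_mm_s)
            + struct.pack(radius_fmt, radius_mm))

# With pyserial it would go something like this (port name is a placeholder):
#   import serial
#   port = serial.Serial('/dev/ttyUSB0', 57600)
#   port.write(struct.pack('>B', START))    # wake up the SCI
#   port.write(struct.pack('>B', CONTROL))  # enable control commands
#   port.write(drive_command(200))          # forward at 200 mm/s
```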


I’ve been playing with Arduino boards in my limited spare time over the past few months.  It’s a fun way to spend quality hands-on geek time that is clearly distinct (at least to me) from my day job.  Plus, I’m able to start actually instantiating some of the ubiquitous computing / distributed sensor ideas that have been floating around in my head.

I’ve been working on a simple wireless light, temp, and motion sensor.  Light was a trivial CdS photocell connected to an analog port of the Arduino.  My first attempt at temp uses the Dallas Semiconductor DS18B20 digital one-wire sensor, which is pretty slick for $4.25.

There was some good sample code on the main Arduino site, but I spent a bit of time fleshing it out more completely, adding the ability to configure sensor resolution and extracting the temp value from the returned data.  Code is here, if this is interesting or useful to you.
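For reference, the temperature extraction boils down to interpreting the first two scratchpad bytes as a signed 16-bit count of 1/16 °C units (sketched here in Python rather than Arduino C; the test values below are from the DS18B20 datasheet):

```python
def ds18b20_celsius(lsb, msb):
    """Convert the two temperature bytes from the DS18B20 scratchpad
    (LSB first) to degrees C; at 12-bit resolution each count is 1/16 C."""
    raw = (msb << 8) | lsb
    if raw & 0x8000:      # sign bit set: negative temperature
        raw -= 0x10000    # two's-complement conversion
    return raw / 16.0
```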


I was going through an old pile of paper in my office recently and encountered a set of note cards I’d accumulated years ago, back in the very small, scrappy, it-definitely-might-not-make-it startup stage. Most of the cards contained miscellaneous reminders, todos, or ideas I thought worthy of further exploration.

A few of the cards, though, had the record of an idiom mixing game my colleagues and I played back in the day (I’ve had the distinct pleasure of working with strongly multi-disciplinary and linguistically inclined geeks).

At its basic level, the game produced comprehensible phrases that amusingly combined two familiar idioms, such as “there are other fish to skin”, or “there are other cats in the sea”.

These are good for a chuckle, but not fundamentally anything more than language slapstick. Some combined idioms of similar intent in ways that made more vibrant images than did the originals, such as

“that opens up a whole new can of monkeys”

(A “can of worms” is one thing, but monkeys make everything funnier.) Rather than just “getting ducks in a row” or having things “fall in line”, we had

“all the ducks are falling in line”

Others are amusing but confusing, and almost seem as if they mean something, at least until you actually think about them. Example:

“Happier than a clam in pigsh*t”

The pinnacle of our mixed-idiom game, though, was those hard-to-find combinations whose meanings were a novel blend of the original idioms. Most of these tended to mockingly riff on various elements of commonly accepted corporate-speak.

“I’m just putting them on the table as I see them”

for example, combines the casual (if sometimes cowardly) innocence of the defensive verbal standby “I’m just calling them as I see them” with the trite business-ese of “putting something on the table” to create an all-new description of an impetuous laziness thrust upon others.

Better still, in my opinion, is the lighthearted cynical foreshadowing of:

“We’ll burn that bridge when we come to it”

But my favorite, by far, is a sadly apt commentary on organizational politics gone awry:

“I dropped the ball in your court”

Have more? Oh yes you do … comment away!


Thanks to Amy and JB‘s motivational dissing of our old (and oft-broken) mythtv installation, I set out last weekend to rebuild the setup, which involves a backend system in the basement and a frontend in the TV room driven by an xbox that can retrieve recorded video from the myth system downstairs.

I had a very easy time getting current myth (0.20) installed on a SuSE 10.2 box with a Hauppauge PVR-350 last weekend, and as I’ve come to expect in 4+ years of myth use, 0.20 is noticeably better than 0.18. Since the xbox scripts that interface with myth are version-specific, I needed to update them too, and this was enough motivation for me to go ahead and get a modern XBMC install.

It’s all working now, and it seems pretty cool.

I’ll spare you the particulars, but there were 2 non-obvious problems I encountered along the way, the corresponding solutions to which I thought I’d leave here where google can find them:

problem #1: database connection errors from the xbmcmythtv script. This was puzzling as I’d verified that the host-level packet filters on the server running mysqld were allowing traffic, I was seeing successful TCP connections, and I had verified that a different machine on the network could connect to the target db using the username+password I had configured xbmcmythtv to use. Since the xbox has relatively few other diagnostic capabilities, I used Wireshark (formerly Ethereal) to watch more closely, and found a mysql auth error being sent back by the server that read:

Client does not support authentication protocol requested by server; consider upgrading MySQL client

solution #1: use pre-mysql-4.1 style password encoding on the server. With the specific error string (unhelpfully obscured by the xbmc script), google quickly found this note in the MySQL reference manual.
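In case it saves someone a trip to the manual, the server-side fix looks roughly like this (the account name and password are placeholders; you may also want old_passwords=1 in my.cnf so future password changes use the old scheme):

```sql
-- re-hash the account's password with the pre-4.1 scheme
SET PASSWORD FOR 'xbmc'@'%' = OLD_PASSWORD('your-password-here');
```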

problem #2: script says “caching subtitles” when I try to play a recorded show, then appears to hang for a while before returning to the program listing. This one was quite a head-scratcher for a while, since I wasn’t trying to do anything with subtitles, I couldn’t find any caching options that seemed related, and there was no other indication of something that might be failing (permissions, etc.). What’s more, this problem was coming up after I was successfully getting the list of recorded programs, which meant the xbox was successfully talking to the backend server (mysql, smb, and the myth backend process are all on the same box).

I found some threads on various xbox fora that described very similar problems, but none with solutions that were even potentially relevant.

solution #2: use IPs in the xbmcmythtv config. I had been using the FQDN of my backend box in both the db and general paths config screens of xbmcmythtv. The xbox is configured to use an internal DNS server that is authoritative for the domain in question, and to remove all doubt that DNS was actually working, the DB connection and connection to the mythtv backend worked fine (as evidenced by my ability to retrieve the program listing).

While desperately searching for clues on the “caching subtitles” problem, I found a mention in some random “common problems” document that emphasized that unqualified hostnames (i.e. missing the full domain) in the xbmcmythtv config would not work. I was already using FQDNs, but on a lark I tried replacing the FQDN with the IP of the backend server in the SMB path part of the config, and sure enough, that did the trick.

Apologies for what was certainly a very boring post for, well, anybody who came here except via a search for one of the aforementioned problems. For those of you who did get here looking for answers, I hope this helped.


I had the following dialogue with my 3-year-old son after I got home from work last night:

Ben: “What does contrary mean?”
Me: “It means inclined to disagree.”
Ben: “No. No it doesn’t. Don’t say that.”


For those of you who are in the Madison area, the Badger Bonsai Society (of which I’ve been a member since I started working with bonsai in 2004) has its annual show this weekend.  The show is Saturday and Sunday (May 19th & 20th) from 10am to 4pm at Olbrich Botanical Gardens on Atwood.  Admission is free, though donations are accepted.  Stop by and check out some cool trees in pots for a few minutes during your weekend.  This year, for the first time, I’ll have four trees of my own in the show as well.

For those interested in delving a bit deeper, there will be demonstrations at 11am and 2pm on both days.

(photo courtesy of Grufnik via a Creative Commons license)


It’s bike to work week here in Madison, which makes linking to this guy who mounted a camera and a bit o’ electronics to take a photo every 10 seconds as he bikes across NYC apropos:

(You get the point after a few seconds of the video, but it’s a neat project.)


… that Joseph Schumpeter once challenged a librarian to a duel (over his students’ access to books).  He won (after taking a chunk of the librarian’s shoulder out with his sword).  This book looks interesting.

© 2011 Joshua Heling