Building Bluetooth LE devices

While waiting for my flic.io buttons to turn up I’ve been playing with building my own Bluetooth Low Energy devices.

Since I already had a couple of sensors hooked up to publish their values via MQTT I thought I would try and build a bridge between MQTT and BLE.

I’m using a Raspberry Pi, a Bluetooth 4.0 USB dongle and a NodeJS npm module called bleno.

It turned out to be pretty easy. First, a short file to set up the BLE service and connect to MQTT:

var util = require('util');
var bleno = require('bleno');
var mqtt = require('mqtt');

var BlenoPrimaryService = bleno.PrimaryService;

var TopicCharacteristic = require('./topicCharacteristic');

var config = require("./config");

var client = mqtt.connect(config.broker);

var topicCharacteristic = new TopicCharacteristic(config, client);

client.on('connect', function(){
  client.subscribe(config.topic);
});

client.on('message', function(topic, message){
  topicCharacteristic.update(message);
});

bleno.on('stateChange', function(state){
  if (state === 'poweredOn') {
    bleno.startAdvertising('mqtt', ['ba42561bb1d2440a8d040cefb43faece']);
  } else {
    bleno.stopAdvertising();
  }
});

bleno.on('advertisingStart', function(error){
  if (!error) {
    bleno.setServices([
      new BlenoPrimaryService({
        uuid: 'ba42561bb1d2440a8d040cefb43faece',
        characteristics: [
          topicCharacteristic
        ]
      })
    ]);
  }
});

And then something to add the characteristic for the topic:

var util = require('util');
var bleno = require('bleno');

function TopicCharacteristic(config, client) {
	bleno.Characteristic.call(this, {
		uuid: '6bcb06e2747542a9a62a54a1f3ce11e6',
		properties: ['read', 'write', 'notify'],
		descriptors: [
			new bleno.Descriptor({
				uuid: '2901',
				value: 'Topic: ' + config.topic
			})
		]
	});

	this._value = Buffer.alloc(0);
	this._updateValueCallback = null;
	this._client = client;
	this._topic = config.topic;
}

util.inherits(TopicCharacteristic, bleno.Characteristic);

TopicCharacteristic.prototype.onWriteRequest = function(data, offset, withoutResponse, callback) {
	this._value = data;
	this._client.publish(this._topic, data);
	callback(this.RESULT_SUCCESS);
};


TopicCharacteristic.prototype.onReadRequest = function(offset, callback) {
	callback(this.RESULT_SUCCESS, this._value);
}

TopicCharacteristic.prototype.onSubscribe = function(maxValueSize, updateValueCallback) {
	this._updateValueCallback = updateValueCallback;
}

TopicCharacteristic.prototype.onUnsubscribe = function () {
	this._updateValueCallback = null;
}

TopicCharacteristic.prototype.update = function(value) {
	this._value = value;
	if (this._updateValueCallback) {
		this._updateValueCallback(this._value);
	}
}

module.exports = TopicCharacteristic;

I’ve used 2 randomly generated UUIDs, one for the service and one for the characteristic.

The code should allow you to read the last value published, publish a new value and subscribe to notifications when new values arrive.

I pulled together a quick Android app to subscribe to the notifications and update when a new value is published, and it seems to be working well.

The code is all up on Github and on npmjs so feel free to have a play.

You can test it with the BlueZ gatttool.

[hardillb@bagend ~]$ gatttool -b 00:1A:7D:DA:71:15 -I
[00:1A:7D:DA:71:15][LE]> connect
Attempting to connect to 00:1A:7D:DA:71:15
Connection successful
[00:1A:7D:DA:71:15][LE]> primary
attr handle: 0x0001, end grp handle: 0x0005 uuid: 00001800-0000-1000-8000-00805f9b34fb
attr handle: 0x0006, end grp handle: 0x0009 uuid: 00001801-0000-1000-8000-00805f9b34fb
attr handle: 0x000a, end grp handle: 0x000e uuid: ba42561b-b1d2-440a-8d04-0cefb43faece
[00:1A:7D:DA:71:15][LE]> characteristics 
handle: 0x0002, char properties: 0x02, char value handle: 0x0003, uuid: 00002a00-0000-1000-8000-00805f9b34fb
handle: 0x0004, char properties: 0x02, char value handle: 0x0005, uuid: 00002a01-0000-1000-8000-00805f9b34fb
handle: 0x0007, char properties: 0x20, char value handle: 0x0008, uuid: 00002a05-0000-1000-8000-00805f9b34fb
handle: 0x000b, char properties: 0x1a, char value handle: 0x000c, uuid: 6bcb06e2-7475-42a9-a62a-54a1f3ce11e6
[00:1A:7D:DA:71:15][LE]> char-write-req 0x000d 0300
Characteristic value was written successfully
Notification handle = 0x000c value: 42 61 72 
Notification handle = 0x000c value: 48 65 6c 6c 6f 57 6f 72 6c 64 
Notification handle = 0x000c value: 54 65 73 74 69 6e 67 20 31 32 33 
[00:1A:7D:DA:71:15][LE]> 
[00:1A:7D:DA:71:15][LE]> quit

The lines that start with Notification handle contain the bytes published, in this case:

  • Bar
  • HelloWorld
  • Testing 123
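The notification values are just the raw published bytes; Node can decode the hex dumps gatttool prints straight back into strings:

```javascript
// Decode the hex byte dumps from the gatttool transcript back into the
// original published strings.
var decoded = ['426172', '48656c6c6f576f726c64', '54657374696e6720313233']
  .map(function (hex) {
    return Buffer.from(hex, 'hex').toString('utf8');
  });
console.log(decoded); // [ 'Bar', 'HelloWorld', 'Testing 123' ]
```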

Securing Node-RED

Node-RED added some new authentication/authorisation code in the 0.10 release that allows for a pluggable scheme. In this post I’m going to talk about how to use this to make Node-RED use a LDAP server to look up users.

HTTPS

First of all, to do this properly we will need to enable HTTPS to ensure the communication channel between the browser and Node-RED is properly protected. This is done by adding an https value to the settings.js file like this:

...
},
https: {
  key: fs.readFileSync('privkey.pem'),
  cert: fs.readFileSync('cert.pem')
},
...

You also need to un-comment the var fs = require('fs'); line at the top of settings.js.

You can generate the privkey.pem and cert.pem with the following commands in your node-red directory:

pi@raspberrypi ~/node-red $ openssl genrsa -out privkey.pem
Generating RSA private key, 1024 bit long modulus
.............................++++++
......................++++++
e is 65537 (0x10001)
pi@raspberrypi ~/node-red $ openssl req -new -x509 -key privkey.pem -out cert.pem -days 1095
You are about to be asked to enter information that will be incorporated
into your certificate request.
What you are about to enter is what is called a Distinguished Name or a DN.
There are quite a few fields but you can leave some blank
For some fields there will be a default value,
If you enter '.', the field will be left blank.
-----
Country Name (2 letter code) [AU]:GB
State or Province Name (full name) [Some-State]:Hampshire
Locality Name (eg, city) []:Eastleigh
Organization Name (eg, company) [Internet Widgits Pty Ltd]:Node-RED
Organizational Unit Name (eg, section) []:
Common Name (e.g. server FQDN or YOUR name) []:raspberrypi.local
Email Address []:

The important bit is the Common Name value; this needs to match either the name or IP address that you will use to access your Node-RED console. In my case I have avahi enabled so I can access my pi using its host name raspberrypi with .local as the domain, but you may be more used to using an IP address like 192.168.1.12.
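If you want to script this, a non-interactive version of the same commands looks like the following (-subj skips the prompts; the subject values here are just examples, with the last command printing the subject back so you can confirm the Common Name made it into the certificate):

```shell
# Generate a throwaway key and self-signed cert without the interactive
# prompts, then print the subject to confirm the Common Name.
openssl genrsa -out privkey.pem 2048
openssl req -new -x509 -key privkey.pem -out cert.pem -days 1095 \
  -subj "/C=GB/O=Node-RED/CN=raspberrypi.local"
openssl x509 -in cert.pem -noout -subject
```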

Since this is a self signed certificate your browser will reject it the first time you try to connect with a warning like this:

Chrome warning about unsafe cert.

This is because your certificate is not signed by one of the trusted certificate authorities. You can get past this error by clicking on Advanced then Proceed to raspberrypi.local (unsafe). With Chrome this error will be shown every time you access the page; you can avoid this by copying the cert.pem file to your client machine and importing it into Chrome:

  1. Open Chrome settings page chrome://settings
  2. Scroll to the bottom of page and click on the “+Show advanced settings” link
  3. Scroll to the HTTPS/SSL and click on “Manage certificates…”
  4. Select the Servers tab and select import
  5. Select the cert.pem you copied from your Raspberry Pi

Usernames and Passwords

In previous Node-RED releases you could set a single username and password for the admin interface, another for any static content, or one that covered both. This was done by adding an object to the settings.js file containing the username and password. This was useful but could be a little limited. Since the 0.10 release there is now a pluggable authentication interface that also includes support for things like read-only access to the admin interface. Details of these updates can be found here.

To implement an authentication plugin you need to create a NodeJS module based on this skeleton:

var when = require("when");

module.exports = {
   type: "credentials",
   users: function(username){
      //returns a promise that checks the authorisation for a given user
      return when.promise(function(resolve) {
         if (username == 'foo') {
            resolve({username: username, permissions: "*"});
         } else {
            resolve(null);
         }
      });
   },
   authenticate: function(username, password) {
      //returns a promise that completes when the user has been authenticated
      return when.promise(function(resolve) {
         if (username == 'foo' && password == 'bar' ) {
            resolve({username: username, permissions: "*"});
         } else {
            resolve(null);
         }
      });
   },
   default: function() {
      // Resolve with the user object for the default user.
      // If no default user exists, resolve with null.
      return when.promise(function(resolve) {
         resolve(null);
      });
   }
};

This comprises 3 functions: one to authenticate a user against the backend, one to check the level of authorisation (used by Node-RED’s built-in OAuth mechanism once a user has been authenticated), and finally default, which matches unauthenticated users.

For my LDAP example I’m not going to implement separate read-only/write levels of authorisation, to make things a little easier.

The source for the module can be found here or on npmjs here. The easiest way to install it is with:

npm install -g node-red-contrib-ldap-auth

Then edit the Node-RED settings.js to include the following:

adminAuth: require('node-red-contrib-ldap-auth').setup({
    uri:'ldap://url.to.server', 
    base: 'ou=group,o=company.com', 
    filterTemplate: 'mail={{username}}'
}),

The filterTemplate is a mustache template used to build the LDAP search filter for the username; in this case the username is the user’s mail address.
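As an illustration of what that template expands to for a given login (the real module uses the mustache library; this sketch just mimics the substitution, and the address is made up):

```javascript
// Illustration only: mimic mustache's substitution to show what the LDAP
// search filter looks like once a username is slotted in.
function renderFilter(template, username) {
  return template.replace('{{username}}', username);
}

var filter = renderFilter('mail={{username}}', 'b.hardill@example.com');
console.log(filter); // -> mail=b.hardill@example.com
```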

Once it’s all up and running Node-RED should present you with something that looks like this:

Node-RED asking for credentials

Openstreetmap Video overlays

So as I mentioned in my last post I’ve been playing with generating map overlays for the cycling videos I’ve been making while out training. I’d run into a rate limiting problem when using Google Maps static map API.

To work round this I thought I’d see what I could do using Openstreetmap. Openstreetmap doesn’t have a static image API so I’m going to try and build something similar using LeafletJS and a way to grab images of the pages generated.

<html>
<head>
	<title>Maps</title>
	<link type="text/css" href="leaflet.css" rel="stylesheet"/>
	<script type="application/javascript" src="leaflet.js"></script>
	<style>
	#map { 
		height: 250px; 
		width: 250px;
	}
	</style>
</head>
<body>
	<div id="map"></div>
	<script type="application/javascript">
function getUrlVars()
{
    var vars = [], hash;
    var hashes = window.location.href.slice(window.location.href.indexOf('?') + 1).split('&');
    for(var i = 0; i < hashes.length; i++)
    {
        hash = hashes[i].split('=');
        vars.push(hash[0]);
        vars[hash[0]] = hash[1];
    }
    return vars;
}

var args = getUrlVars();

var line = args["line"].split("|");

var last = line[(line.length - 1)];

var centre = [last.split(',')[0], last.split(',')[1]];

var map = L.map('map',{
	zoomControl: false,
	zoom: 15
});
map.setView(centre, 15);

L.tileLayer('http://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', 
	{
		maxZoom: 20,
	}).addTo(map);

var latlngs = [];

for (var i=0; i<line.length; i++) {
	latlngs.push(L.latLng(line[i].split(',')[0],line[i].split(',')[1]));
}

var polyline = L.polyline(latlngs, {color: 'red'}).addTo(map);
	</script>
</body>
</html>

This generates the map tiles and overlays the route, but it’s as a web page, now I needed a way to convert this into a PNG image. There are two options, html2canvas or PhantomJS. I decided to go with PhantomJS first. The following loads and renders the page and then generates a PNG image.

var page = require('webpage').create();
var system = require('system');


page.onConsoleMessage = function(msg, lineNum, sourceId) {
  //console.log('CONSOLE: ' + msg + ' (from line #' + lineNum + ' in "' + sourceId + '")');
};

page.viewportSize = {
  width: 265,
  height: 250
};

var url = "file:///opt/share/playing/map-overlay/index.html?" + system.args[1];

console.log(url);

page.open(url, function(){
  setTimeout(function() {	
    page.render(system.args[2]);
    phantom.exit();
  },500);
});

The coordinates for the line are passed in on the command line along with the file name to write the file to.


phantomjs --local-to-remote-url-access=true --max-disk-cache-size=1024 --disk-cache=true map.js [Path] [file name]

Running PhantomJS with the disk cache enabled should keep the load on the Openstreetmap servers to a minimum but I’m also looking at how easy it is to set up my own tile server.

I can now feed this in to the scripts I spun up last time.


DIY Video overlay

I got myself a Garmin Virb Elite in the post Christmas sales, the plan was to use it while riding my bike and when snowboarding.

The camera will shoot in full 1080p HD and has a super wide angle lens to grab loads of what’s going on.

I finally managed to get out on my bike at the weekend and remembered to hook the camera up so it was hung under the handle bars. The raw video looks pretty good.

Having got some video I wanted to overlay some of the stats from my Garmin 810 cycling computer, like location, speed, elevation, gradient and heart rate. Garmin provide some software called Virb Edit which will do all this; unfortunately it only runs on Windows or OS X. No problem I thought, I’ll just throw it in the same Windows 7 VM I use for Garmin Express, the app I use to upload my workouts. This was going well until I tried to view one of the videos I’d just imported and it complained about not having DirectX 10 support. This is the bit of Windows that handles all the multimedia stuff, and for video it tends to need access to a GPU to accelerate things. While it is possible to get things like this to work within a VM it is a lot of work and requires the machine you’re using to have 2 graphics cards[1].

Since the standard software wasn’t going to work I thought I’d have a go at trying to build some scripts to do this with Linux. I decided to start with a simple map overlay. I’ve played with Google Map’s static maps API before so I ran up some simple NodeJS code to read a TCX file I generated from the FIT file created by the camera.

var tcx = require('tcx-js');
var http = require('https');
var fs = require('fs');

var parser = new tcx.Parser();
parser.parse_file("test.tcx");

var tail = [];


parser.activity.trackpoints.every(function(element, index, array){
	
	var filename = index + '.png';
	var url = "https://maps.googleapis.com/maps/api/staticmap?center=" + element.lat + "," + element.lng + "&zoom=16&size=300x300";
	if (index != 0) {
		url = url + '&path=color:0x0000ff80|weight:2|' + tail.join('|');
	}
	console.log(url);


	(function(uri, fn){
		http.get(uri, function(res){
			var data = '';
			res.setEncoding('binary');
			res.on('data', function (chunk){
				data += chunk;
			});

			res.on('end', function(){
				fs.writeFile(fn, data, 'binary', function(err) {
					if (err) {
						console.log(err);
					}
				});
			});
		});
	})(url, filename);


	tail.push(element.lat + "," + element.lng);
	if (tail.length >= 25) {
		tail.shift();
	}

	if (index != (array.length-1)) {
		var now = new Date(element.time);
		var then = new Date(array[index +1].time);

		var diff = Math.abs((then.getTime() - now.getTime() ) / 1000);
		//console.log('%d', diff);
		for (var i=0; i<diff; i++) {
			fs.appendFile('list.txt',filename + '\n',function(err){});
		}
	}

	return true;
	
});

This worked pretty well until I hit Google’s rate limit, so I only got the first 100 map tiles.

I used the ImageMagick tools to build a composite image of the map tile on a transparent background.

Create an empty 1920×1080 image with a transparent background

$ convert -size 1920x1080 xc:transparent background.png

Overlay the map tile in the bottom right hand corner

$ composite -geometry 300x300+1520+700 1.png background.png scaled/1.png

The script also generates a file called list.txt which contains a list of the filenames and how many times to repeat them at 1 frame per second, to match up with the original video. This list is fed to mencoder to generate the video like this.

$ mencoder mf://@list.txt -mf w=1920:h=1080:fps=1:type=png -ovc copy -oac copy -o output.avi

Finally I used OpenShot to generate the composite video.

I’m going to have a look at using OpenStreetmap instead of Google Maps to avoid the rate limiting, and also try generating some gauges for speed, cadence, altitude and heading.

[1] The laptop I’m doing this on does actually have 2 graphics cards, but I’m using the better one to actually run the OS and it’s a royal pain to switch them round.

WEMO Event notifications

As part of my ongoing playing with some Belkin WeMo devices I started to have a look at the UPnP Event support.

The UPnP standard, as well as covering discovery and endpoints for control, includes an event system; this allows devices to publish their status changes to interested parties. The event system uses some extensions to HTTP.

SUBSCRIBE

To subscribe to events from a given device you send a request similar to the following to the eventSubURL defined in the setup.xml, which is linked to from the UPnP discovery response.

SUBSCRIBE /upnp/event/bridge1 HTTP/1.1
CALLBACK: <http://192.168.1.1:3000/>
NT: upnp:event
TIMEOUT: Second-600
Host: 192.168.1.2:49154

CALLBACK -> is the URL to be notified
TIMEOUT -> how long to send notifications

Gets the following response:

HTTP/1.1 200 OK
DATE: Sun, 11 Jan 2015 18:27:05 GMT
SERVER: Unspecified, UPnP/1.0, Unspecified
CONTENT-LENGTH: 0
X-User-Agent: redsonic
SID: uuid:7206f5ac-1dd2-11b2-80f3-e76de858414e
TIMEOUT: Second-600

SID -> the Subscription ID

The SID is used to identify which notifications come from this subscription; it can also be used to renew the subscription before the timeout expires, by sending a SUBSCRIBE message like this:

SUBSCRIBE /upnp/event/bridge1 HTTP/1.1
SID: uuid:7206f5ac-1dd2-11b2-80f3-e76de858414e
TIMEOUT: Second-600
Host: 192.168.1.2:49154
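Since UPnP just extends HTTP with extra methods, these requests can be assembled by hand and pushed over a plain TCP socket; a sketch of building the initial SUBSCRIBE (the host, port and event path are the example values from above):

```javascript
// Build the raw UPnP SUBSCRIBE request as a string, ready to write to a
// TCP socket pointed at the device's event port.
function buildSubscribe(host, path, callbackUrl, timeoutSecs) {
  return 'SUBSCRIBE ' + path + ' HTTP/1.1\r\n' +
    'CALLBACK: <' + callbackUrl + '>\r\n' +
    'NT: upnp:event\r\n' +
    'TIMEOUT: Second-' + timeoutSecs + '\r\n' +
    'Host: ' + host + '\r\n\r\n';
}

console.log(buildSubscribe('192.168.1.2:49154', '/upnp/event/bridge1',
  'http://192.168.1.1:3000/', 600));
```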

NOTIFY

Incoming event notifications get delivered every time the state of the device changes and look like this for the WeMo Socket:

NOTIFY / HTTP/1.1
HOST: 192.168.1.1:3000
CONTENT-TYPE: text/xml; charset="utf-8"
CONTENT-LENGTH: 212
NT: upnp:event
NTS: upnp:propchange
SID: uuid:7206f5ac-1dd2-11b2-80f3-e76de858414e
SEQ: 0

<e:propertyset xmlns:e="urn:schemas-upnp-org:event-1-0">
  <e:property>
    <BinaryState>0</BinaryState>
  </e:property>
</e:propertyset>

The events from the light bulbs are a bit more complex; how to parse them is demonstrated in the code.
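For the simple BinaryState property in the socket event above, a regex is enough to pull the value out of the NOTIFY body (the bulb events really want a proper XML parser):

```javascript
// Pull the BinaryState value out of a WeMo Socket NOTIFY body. Good enough
// for the flat propertyset above; not a general XML parser.
function parseBinaryState(body) {
  var m = /<BinaryState>(\d+)<\/BinaryState>/.exec(body);
  return m ? m[1] : null;
}

var body = '<e:propertyset xmlns:e="urn:schemas-upnp-org:event-1-0">' +
  '<e:property><BinaryState>0</BinaryState></e:property></e:propertyset>';
console.log(parseBinaryState(body)); // -> "0"
```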

UNSUBSCRIBE

And when you’re done you can unsubscribe with the following:

UNSUBSCRIBE / HTTP/1.1
Host: 192.168.1.2:49154
SID: uuid:7206f5ac-1dd2-11b2-80f3-e76de858414e

WeMo

Having worked all these bits out I put the following NodeJS app together to test it all out with the WeMo devices I have:

Next step is to put all this together to build a new WeMo node module and then an improved Node-RED node.

Out Of Office Engine

Over Christmas (among other things) I’ve been messing around writing a little app to handle Out Of Office notifications for my Dad’s business.

Beaches and Blenders

Dad’s mail server runs Postfix on Linux, and there are a number of options to set up auto response messages, such as the vacation application. This seems to work well, but requires the user to edit files on the server to make it work. I wanted something a little bit more user friendly, as none of the userbase for this tool has any idea what a Linux command line looks like.

I’ve finally started to move all my little side projects from Java to NodeJS, so I set out by crawling through npmjs to see which modules I could use to help.

I started out with mailparser to be able to read and pull out the useful stuff from the original mail like the headers and any attachments.

Next up came email-templates to allow me to build a standard reply template and then slot in a per user specific message. This was paired with nodemailer to actually send the response.

That was it for the bit that actually read and replied to the mail, but I needed a user interface, so I opted for a simple express web server using ejs to template the data.

Out Of Office Editor

All this is tied together with a small Mongodb database to hold the details of each Out Of Office period and the list of email addresses that have been replied to. Mongodb is probably overkill for this, but it’s simple to set up. It holds a record like this one for each user:

{
   "_id" : ObjectId("54a7d6f4176a69088a22b4b5"), 
  "name" : "b.hardill", 
  "startDate" : "2015-01-02T12:29:57.238Z", 
  "endDate" : "2015-01-30T21:59:17.971Z", 
  "subject" : "Away", 
  "message" : "I'll be out of the office for the next week, I'll get back to you when I get back", 
  "list" : [
    "bob@example.com", 
    "jim@example.com" 
  ]
}
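Given a record like that, the check the responder has to make before sending an auto-reply is roughly the following (the field names match the record above; the exact logic is my assumption about how the app behaves):

```javascript
// Should we auto-reply? Only if the user is away right now and we haven't
// already replied to this sender. Field names match the Mongodb record;
// the logic itself is a sketch, not the app's actual code.
function shouldReply(record, sender, now) {
  if (now < new Date(record.startDate) || now > new Date(record.endDate)) {
    return false; // not away right now
  }
  if (record.list.indexOf(sender) !== -1) {
    return false; // already sent this sender a reply
  }
  return true;
}

var record = {
  startDate: '2015-01-02T12:29:57.238Z',
  endDate: '2015-01-30T21:59:17.971Z',
  list: ['bob@example.com', 'jim@example.com']
};
console.log(shouldReply(record, 'new@example.com', new Date('2015-01-10'))); // true
console.log(shouldReply(record, 'bob@example.com', new Date('2015-01-10'))); // false
```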

I’ve stuck the project up on github here.

It probably still needs some more testing but it should be good enough for what my Dad needs.

If I end up feeling really adventurous I may even try and remember how to write a Thunderbird plugin to enable/disable things and set the dates.

Quick tip, don’t run a public facing SMTP server if you can help it

In between the Gingerbread engineering and Lego building, and as part of my usual round of Christmas-time IT support, I ran into a new issue for me.

Towards the end of last year my dad acquired a small engineering business, and as usual I got roped in to help sort out the IT side of things. I had a quick look round at what they had already over a weekend and worked out how to set up a quick off site backup system similar to the one I’ve mentioned before. The system was all Windows based, so I stuck a Raspberry Pi in the wiring cupboard to give a remote access jumping off point to be able to poke at things from afar and to host an OpenVPN server to allow proper remote access.

The previous owners had set up an email system using a Windows based SMTP/POP/IMAP server app called MDaemon. It all seemed to work and I didn’t have time to swap it out for something I’m familiar with (Postfix/Dovecot on Linux), so I left it alone when I had my first look round. It turns out that the server was set up to not only accept incoming email on its internet facing side but also to forward mail for signed in users. Normally I would set a mail server to only forward from users on the private internal LAN side of things, but if properly secured this arrangement should be OK. In this case things were not properly secured; let’s just say that some people will be coming into the office in the new year to find polite notes about selecting secure passwords, and probably a link to this.

So one of the accounts had its password compromised and the server was used to send a bunch of SPAM. Once this was spotted the offending account was removed and the server config tweaked to be a bit more sensible. This stopped the flow of SPAM and things seemed to be OK for a while, until a couple of people that the business interacts with started to mention that they had not been receiving some of the email that had been sent. It turns out the IP address for the mail server had been added to a number of blacklists.

There is a site called mxtoolbox which hosts a tool for checking IP addresses against a number of the more popular blacklists. In our case we had ended up on 3 of them. The tool provides links to the sites which manage the lists, which have forms to submit requests to be removed from the list once whatever has been sending the SPAM has been fixed. I have managed to successfully submit removal requests to 2 out of the 3 lists, but the Barracuda form is currently terminating the connection whenever I submit it. As it’s the holiday period it looks like I’m going to have to wait until these guys are back in the office to get this fixed and hopefully get everybody accepting email.

I also found a very well hidden link to a form to get Google Gmail to accept mail from us again as well; they pretty much say that you will never hear anything back from them and it could take weeks to get actioned, if it happens at all. I’ve sent this off, but I’m hoping getting cleared from the other lists will help with this as well.

The other option I considered was to see if I could get the static IP assigned to the ADSL changed, but the ISP didn’t really want to entertain that, so it looks like we’ll be changing ISP if I can’t get that last list fixed.

So the moral of this story is find somebody to host your email for you.

Moo Sticker engines fixed

A few years ago I created a web app to allow people to order their own sets of Moo stickers with the MQTT, Node-RED and Owntracks (at the time called MQTTitude) logos.

These worked well, with people ordering about 1 pack a month, until Moo changed the way they did authentication. I didn’t have time to fix it until today.

I’ve moved the app from being a J2EE app over to one written in NodeJS using express, this along with the change in Moo’s authentication method has made the code a lot shorter and easier to read. I’ve published the new code on github here.

To order a set of stickers click on the appropriate image: