Node-Red – Ti SensorTag Node

Last weekend I spent some time working on yet another Node-RED node. This one is an input node that reads the data published by a small sensor platform from TI.

The TI SensorTag is a Bluetooth 4.0 LE platform designed as a test source for building new BLE applications, and it is very accessible at only $25, especially with the following list of sensors on board:

  • Ambient Temperature
  • IR remote Temperature
  • Air Pressure
  • Humidity
  • Accelerometer
  • Magnetometer
  • Gyroscope
  • 2 Push Buttons

TI has also set up an open wiki to allow people to document their experiments with the device.

You can find the node in the new node-red-nodes repo on GitHub here. It relies on a slightly updated version of Sandeep Mistry's node-sensortag; the README explains how to install my update (until I get round to submitting the pull request), but here is the command to run in the root of your Node-RED directory:

npm install sensortag@https://api.github.com/repos/hardillb/node-sensortag/tarball

(you may need to install the libbluetooth-dev package for Debian/Ubuntu based distros and bluez-libs-devel on Redhat/Fedora first)

UPDATE:
Sandeep has now merged my changes, so the sensortag node can be installed normally with npm:

npm install sensortag

Once installed, you need to run Node-RED as root, as this is currently the only way to get access to the BLE functions. You can then add the node to the canvas and configure which sensors are pushed as events.

Please feel free to have a play and let me know what you think.

Google Chromecast

I managed to get my hands on a Google Chromecast at the weekend. Many thanks to Mike Carew for bringing one back from the US for me via Dale.

Having unpacked the stick, I plugged it into my TV and plugged in the USB cable to power it. At first nothing happened and the little notification light on the device stayed red, but after replugging the power cable it jumped into life. The instructions directed me to http://www.google.com/chromecast/setup; I had to do this in the Chrome browser on my Windows laptop, as there is no setup app for Linux at the moment (there is a config app for Android, but it is only available to US users at the moment).

When I got to the point where I had to configure which WiFi network the Chromecast should connect to there was a problem: my router's SSID was not showing in the list. It took me a couple of minutes to remember that I had set my router to use channel 13, as it is normally lightly used. The reason it is lightly used is that in the US you can only use channels up to 11. A quick change of channel later and the network showed up in the list.

The next part is the only bit that is not as slick as it should be. The Chromecast was fully configured, but when I tried to use one of the apps (I'll talk about those in a moment) it would not show a Chromecast available to send data to. The problem was that my router had done its usual trick of walling each of the separate WiFi devices off from the others. This feature goes by a few names, but the most common seems to be "AP isolation". In a place with shared WiFi, like a coffee shop or hotel, this is good as it stops people snooping on or attacking your machine; in the home environment it may not be suitable, and in this case it was very much unwanted. I had run into this problem before, as one of my MythTV frontends is on WiFi and I had changed the settings to allow WiFi cross-talk, but the router seems to forget the setting pretty quickly. My usual trick was to reboot the router if I needed to log into it from my laptop to fix things, but this was going to become a real issue with the Chromecast. After a bit of digging I found a forum post about how to tweak the settings via the telnet interface, so I quickly ran up an expect script to do it when needed.

#!/usr/bin/expect

set timeout 20
set name SuperUser
set pass ###########

spawn telnet 192.168.1.254

expect "Username : "
send "$name\r"
expect "Password : "
send "$pass\r"
expect "{SuperUser}=>"
send "wireless mssid ifconfig ssid_id=0 apisolation=disabled\r"
expect "{SuperUser}=>"
send "saveall\r"
expect "{SuperUser}=>"
send "exit\r"

This gets called by the script I've got bound to a button on my remote driving LIRC, which changes the input on my TV from the RGB socket used for MythTV to the HDMI socket used by the Chromecast, ensuring my network is always set up properly. I really shouldn't have to do this, but the O2 Wifibox III I have is not the best.

Once I'd got all that out of the way, it was time to start actually using this thing for what it's made for. Out of the box there is support for the Chromecast baked into the latest versions of the Android YouTube, Google Play Music, Google Play Movies and Netflix apps. I don't have a Netflix account at the moment so I tried out the other three.

YouTube app

When the YouTube app finds a Chromecast on the local network it adds the little cast icon to the action bar. When you tap on this it displays a pop-up to allow you to select the Chromecast (if you have more than one on the network), and then rather than playing the video on the device's screen it is played on the TV. Play/pause and volume control are available on the device. One other really nice feature is that the Chromecast maintains a queue of videos to play, so you can add to the queue from your phone while it's playing the current video; in fact you can do this from multiple devices at the same time. This means you can take it in turns with your mates to see who can find the best cat video.

Google Play Movies
Much like YouTube, Google Play Movies lets you play content on the Chromecast. I had rented a copy of Mud the week before getting hold of my Chromecast, so I watched it on the TV rather than on my Nexus 7. The only odd part was that I had downloaded a copy to the device, and it would not let me watch it via the Chromecast without deleting the local copy.

Google Play Music
The music app works as expected, showing the cover art on the screen while it plays the tracks. Because it streams tracks directly from the cloud, if you are working through a playlist and hit a track that you added directly to the phone's storage, it will refuse to play it, even if you have pushed a copy of the file to Google Music's cloud storage.

Away from applications on your Android device there is a plugin for the Chrome browser which allows you to share the content of any tab on the large screen. I need to have a look at using this for giving HTML5 based presentations.

There is an API for interacting with the Chromecast, and I'm going to have a look at writing an app to push MythTV recordings so I can replace one of my MythTV frontends. First impressions of the API make me think this shouldn't be too hard if I can set up the right transcoding.

Overall I'm pretty impressed with the Chromecast, and I'm still debating whether I should ask my folks to bring me another one back as they are out in the US at the moment.

Taking a look at the neighbourhood

A recent article about detecting offline social networks from the information about preferred WIFI networks leaked by mobile devices led me to take another look at the work I'd done on WIFI location detection, to see what other information I could derive.

I had been having problems with finding WIFI adapters with the right chipsets to allow me to use them in monitor mode in order to capture all the available packets from different networks, but recent updates to some of the WIFI drivers in the Linux kernel have enabled some more of the devices I have access to.

Previously I built a small application which works with the Kismet wireless scanner to publish details of each device spotted to an MQTT topic tree. With a small modification it now also publishes details of the WIFI networks that devices are searching for.
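As a sketch of the sort of device-to-topic mapping involved (the topic layout, field names and the `sighting_to_mqtt` helper here are illustrative guesses, not the actual application's):

```python
import json

def sighting_to_mqtt(device_mac, probed_ssid=None, root="kismet"):
    """Map a Kismet client sighting onto an MQTT topic tree.

    Devices seen go under <root>/clients/<mac>; the networks a
    device probes for go under <root>/probes/<mac>.
    """
    # Normalise the MAC so it is safe to use as a topic segment
    mac = device_mac.lower().replace(":", "-")
    if probed_ssid is None:
        topic = "%s/clients/%s" % (root, mac)
        payload = json.dumps({"mac": device_mac})
    else:
        topic = "%s/probes/%s" % (root, mac)
        payload = json.dumps({"mac": device_mac, "ssid": probed_ssid})
    return topic, payload

topic, payload = sighting_to_mqtt("00:1A:2B:3C:4D:5E", "HomeHub-1234")
print(topic)  # kismet/probes/00-1a-2b-3c-4d-5e
```

Each sighting then just becomes one publish of `payload` to `topic` with whatever MQTT client library is to hand.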

Then, using two simple Node-RED flows, this data is stored in a MongoDB instance.

You can have a look at the flow here.

From the device's MAC address it is possible to determine the manufacturer, so with another little node application to query the MongoDB store I can generate this d3js view of what type of devices are in use in the area around my flat.
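The lookup itself is just a matter of matching the first three octets of the MAC (the OUI) against the IEEE's vendor registry. A minimal sketch, using a tiny hand-picked prefix table for illustration; a real version would load the full OUI listing published by the IEEE:

```python
# Tiny illustrative sample of vendor prefixes; a real lookup
# would be built from the IEEE's full OUI registry file.
OUI_TABLE = {
    "00:23:76": "HTC",
    "28:e7:cf": "Apple",
    "00:26:68": "Nokia",
}

def manufacturer(mac):
    """Look up the vendor from the first three octets of a MAC."""
    oui = mac.lower()[0:8]
    return OUI_TABLE.get(oui, "Unknown")

print(manufacturer("28:E7:CF:12:34:56"))  # Apple
```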

The view dynamically updates every 5 seconds to pick up the latest information.

Now that I know who owns what type of device, it's time to see who might know whom. By plotting a force-directed graph of all the clients detected and linking them based on the networks they have been searching for, I can build up a view of which devices may belong to people who know each other.

Force directed network graph

There are a couple of clusters in the data so far, but most of them are from public WIFI networks like BTOpenzone and O2 Wifi. After filtering these services out, there were still three devices that look to be using Mike's Lumia 800 for internet access and four devices connected to the same Sky Broadband router. I expect the data to be a lot more interesting when I get to run it somewhere with a few more people.
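The linking step boils down to: give two clients an edge if they have probed for the same non-public SSID. A small sketch of that idea, with made-up MACs and SSIDs:

```python
from itertools import combinations
from collections import defaultdict

# Public networks to ignore when linking clients: shared SSIDs
# like these connect strangers rather than friends.
PUBLIC_SSIDS = {"BTOpenzone", "O2 Wifi"}

def build_edges(probes):
    """probes maps client MAC -> set of SSIDs it has probed for.
    Returns edges between clients sharing a non-public SSID."""
    by_ssid = defaultdict(set)
    for mac, ssids in probes.items():
        for ssid in ssids - PUBLIC_SSIDS:
            by_ssid[ssid].add(mac)
    edges = set()
    for clients in by_ssid.values():
        for a, b in combinations(sorted(clients), 2):
            edges.add((a, b))
    return edges

probes = {
    "aa:aa": {"BTOpenzone", "MikesLumia"},
    "bb:bb": {"MikesLumia"},
    "cc:cc": {"BTOpenzone"},
}
print(build_edges(probes))  # {('aa:aa', 'bb:bb')}
```

The resulting edge list is exactly what a d3js force-directed layout wants as input.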

At the moment this is all running on my laptop, but it should run fine on my Raspberry Pi or my home server; as soon as I've transferred it over I'll put up a link to a live version of the charts.

d3 MQTT Tree visualiser updated

I’ve been having a bit of a play updating my d3 based MQTT topic tree visualiser this weekend.

  1. I’ve been trying to tidy things up a bit, breaking the code out into a little standalone library and leaving only the MQTT connection code in the index.html.
  2. I’ve been improving the handling of long payloads. There is now a nicer popup tooltip with a longer version of the payload; it’s still cropped at 100 chars, but I’m working on wrapping it and showing a few lines to get more in.
  3. I’ve been moving the MQTT connection code over to use binary WebSocket connections* rather than the JSON based proxy I was using before. The version hosted here is still using the proxy for now, but the binary versions work and I’ll move it over once I’ve finished playing with my internal broker setup.

I still need to try and make the whole thing resize or at least not have hard coded dimensions, but that might have to wait for next time.

I’ve stuck the updated version up here, I’ll stick the code up on github once I get sign off from the boss.

https://github.com/hardillb/d3-MQTT-Topic-Tree

* There is a new IBM WebSphere MQ feature pack supporting this, details can be found here

RGB LED Meeting Warning Light (Lotus Notes Edition)

At the end of my last post I mentioned trying to get the same set up working with my work Lotus Notes calendar.

After a little poking around with the Lotus Notes Java API here is the result:

package uk.me.hardill.notes;

/**
 * NextCalEntry
 * 
 * Sets the RGB values for Blink(1)/Digispark+RGB
 * according to the time to the next meeting in 
 * your Lotus Notes Calendar
 * 
 * This should be run with the JRE that ships with
 * Lotus notes as it has the required classes on 
 * classpath and have Lotus notes directory on 
 * library path e.g.
 * LD_LIBRARY_PATH=/opt/ibm/lotus/notes
 * /opt/ibm/lotus/notes/jvm/bin/java NextCalEntry -d
 */

import java.text.SimpleDateFormat;
import java.util.Date;

import lotus.domino.Database;
import lotus.domino.DateTime;
import lotus.domino.DbDirectory;
import lotus.domino.Document;
import lotus.domino.NotesFactory;
import lotus.domino.NotesThread;
import lotus.domino.Session;
import lotus.domino.View;
import lotus.domino.ViewEntry;
import lotus.domino.ViewEntryCollection;

public class NextCalEntry extends NotesThread {

   static SimpleDateFormat dateFormat = new SimpleDateFormat(
         "dd/MM/yy H:m:s z");

   static int rgb[] = { 0, 200, 0 };

   static int pollInterval = 300;
   static boolean mqtt = false;
   static String topic = "";
   static boolean digi = true;

   public static void main(String argv[]) {

      for (int i = 0; i < argv.length; i++) {
         if (argv[i].equals("-b")) {
            // drive a Blink(1) via blink1-tool
            digi = false;
         } else if (argv[i].equals("-d")) {
            // drive a Digispark via DigiRGB.py (the default)
            digi = true;
         } else if (argv[i].equals("-t")) {
            mqtt = true;
            topic = argv[++i];
         }
      }

      NextCalEntry nextCalEntry = new NextCalEntry();
      nextCalEntry.start();
   }

   public void runNotes() {
      try {
         Session s = NotesFactory.createSession();

         DbDirectory dir = s.getDbDirectory(null);
         Database db = dir.openMailDatabase();

         if (!db.isOpen())
            db.open();

         View calendarView = db.getView("($Calendar)");

         DateTime sdt = s.createDateTime("today");
         sdt.setNow();
         DateTime edt = s.createDateTime("today");
         edt.setNow();
         edt.adjustDay(+1);

         ViewEntryCollection vec = calendarView.getAllEntries();

         ViewEntry entry = vec.getFirstEntry();

         int offset = 3600;
         boolean poisonPill = false;
         while (entry != null) {
            Document caldoc = entry.getDocument();
            String sub = caldoc.getItemValueString("Subject");
            DateTime startDate = null;
            try {
               startDate = (DateTime) caldoc.getItemValueDateTimeArray(
                     "StartDate").firstElement();

            } catch (Exception e) {
               // entry has no StartDate; skip it
            }

            if (startDate != null) {

               for (int i = 0; i < caldoc.getItemValueDateTimeArray(
                     "StartDateTime").size(); i++) {

                  int st = sdt.timeDifference((DateTime) caldoc
                        .getItemValueDateTimeArray("StartDateTime")
                        .get(i));
                  int en = edt.timeDifference((DateTime) caldoc
                        .getItemValueDateTimeArray("EndDateTime")
                        .get(i));

                  Date start = dateFormat.parse(caldoc
                        .getItemValueDateTimeArray("StartDateTime")
                        .get(i).toString());
                  Date end = dateFormat.parse(caldoc
                        .getItemValueDateTimeArray("EndDateTime")
                        .get(i).toString());
                  Date now = new Date();

                  if ((st <= 0) && (en >= 0)) {
                     if ((-1 * st) < offset) {
                        offset = (-1 * st);
                     }

                  } else if (now.after(start) && now.before(end)) {
                     offset = -1;
                     poisonPill = true;
                     break;
                  }
               }
            }
            if (poisonPill) {
               break;
            }
            entry = vec.getNextEntry();
         }

         if (offset > 0 && offset <= 300) {
            // red
            rgb[0] = 20;
            rgb[1] = 0;
            rgb[2] = 0;
         } else if (offset > 300 && offset <= 600) {
            // redish
            rgb[0] = 15;
            rgb[1] = 5;
            rgb[2] = 0;
         } else if (offset > 600 && offset <= 900) {
            // greenish/redish
            rgb[0] = 10;
            rgb[1] = 10;
            rgb[2] = 0;
         } else if (offset > 900 && offset <= 1200) {
            // greenish
            rgb[0] = 5;
            rgb[1] = 15;
            rgb[2] = 0;
         } else if (offset == -1) {
            // blue
            rgb[0] = 0;
            rgb[1] = 0;
            rgb[2] = 20;
         } else {
            // green
            rgb[0] = 0;
            rgb[1] = 20;
            rgb[2] = 0;
         }

         if (mqtt) {
            // TODO
            // connect to broker and publish
         } else {
            Runtime runtime = Runtime.getRuntime();
            String cmd[];
            if (digi) {
               System.out.println(rgb[0] + " " + rgb[1] + " " + rgb[2]);
               cmd = new String[4];
               cmd[0] = "DigiRGB.py";
               cmd[1] = rgb[0] + "";
               cmd[2] = rgb[1] + "";
               cmd[3] = rgb[2] + "";
            } else {
               System.out.println(rgb[0] + "," + rgb[1] + "," + rgb[2]);
               cmd = new String[3];
               cmd[0] = "blink1-tool";
               cmd[1] = "--rgb";
               cmd[2] = rgb[0] + "," + rgb[1] + "," + rgb[2];
            }
            Process proc = runtime.exec(cmd);
            proc.waitFor();
         }

      } catch (Exception e) {
         e.printStackTrace();
      }
   }
}

It is also available as a gist here (not embedded as there seems to be a CSS background-colour issue).

Run this in a script every 5 mins, either with a sleep loop or as a crontab entry similar to the ones described at the end of the previous post.
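For the crontab route, a line along these lines should do, reusing the paths from the comment at the top of the source (the directory holding NextCalEntry.class is a placeholder you would need to fill in):

```shell
# Run every 5 minutes using the JRE that ships with Lotus Notes,
# with the Notes native libraries on the library path
*/5 * * * * cd /path/to/classes && LD_LIBRARY_PATH=/opt/ibm/lotus/notes /opt/ibm/lotus/notes/jvm/bin/java NextCalEntry -d
```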

RGB LED Meeting Warning Light

In my previous post I talked about the Digispark board and the RGB LED shield I had just received.

Digispark and RGB Shield

This afternoon, in a spare 30 mins I had while some tests ran, I built my first little hack to use one of them. It is inspired by this really cool hack that got mentioned on Twitter by Dave CJ while we were debating Blink(1) vs Digispark/RGB shields. The idea of a strip of LEDs that represents the working day, with different colours for free time and meetings, is cool, but I only have one LED to work with, so I thought how about just showing how close the next meeting is.

The idea is that if the next entry in my calendar is more than 20 mins away then the light is green; as the meeting start time gets closer the LED changes closer to red, and finally during the meeting it glows blue.

I started out prototyping this against my Google calendar, as it's pretty easy to get at the data: if you poke around in the settings page you can get hold of a URL that points to either an XML or iCal version of the data.

I'm running this on my Raspberry Pi, so Python seemed like a good choice to run this up in.

#!/usr/bin/python

import sys
import pycurl
import StringIO
from icalendar import Calendar, Event
from datetime import datetime, timedelta

curl = pycurl.Curl()
curl.setopt(pycurl.URL,sys.argv[1])
body = StringIO.StringIO()
curl.setopt(pycurl.WRITEFUNCTION, body.write)
curl.perform()

cal = Calendar.from_ical(body.getvalue())
body.close()

now = datetime.now()

offset = 3600

for component in cal.walk():
  if component.name == "VEVENT":
    dtstart = component["dtstart"].to_ical()
    dtend = component["dtend"].to_ical()
    if "T" in dtstart:
      dt = datetime.strptime(dtstart, "%Y%m%dT%H%M%SZ")
      end = datetime.strptime(dtend, "%Y%m%dT%H%M%SZ")

      if dt > now:
        delta = dt - now
        if delta.days == 0:
          if delta.seconds < offset:
            offset = delta.seconds
      elif end > now and dt < now:
        offset = -1
        break

if offset > 0 and offset < 300:
  print "200 0 0"
elif offset >= 300 and offset < 600:
  print "150 50 0"
elif offset >= 600 and offset < 900:
  print "100 100 0"
elif offset >= 900 and offset < 1200:
  print "50 150 0"
elif offset == -1:
  print "0 0 200"
else:
  print "0 200 0"

This script outputs RGB values that can be fed directly to the DigiBlink.py script, like this in a cron job:

*/5 * * * * DigiBlink.py `calColur.py <calendar URL>`

Or since I’ve set up a little script to control my DigiSpark via MQTT the following publishes the output of the script on the ‘digiblink’ topic on the local broker.


*/5 * * * * calColur.py <calendar URL> | mosquitto_pub -h localhost -t digiblink -l

This script should also work with the command-line tool for the Blink(1).

I still need to tweak the colours for each step, and next I need to work out how to get the same feed out of the company calendaring system, but it's a good start.

Budget Blink(1)

Production blink(1) units

About a month ago I spotted a Kickstarter project for a device called a Blink(1). This is a little USB device that acts like an Ambient Orb, made by the same folks that make the BlinkM 3-colour LED board used in them. I had loads of ideas for what to do with one.

The project had already reached its funding target and had actually delivered; the devices were available to order directly from ThingM here. I was all set to buy one, except for one thing: the price. I don't really have a problem with ThingM wanting $30 each for them, and the problem is not even that HMRC will charge me VAT (at 20% on anything imported with a price of more than £15) when it is imported; the real issue is that whichever courier handles the UK end of the delivery will charge me at least £8 in "handling" fees for the privilege.

Digispark and RGB Shield

I put the idea to the back of my mind for a while in the hope that the price would come down or a UK supplier would be found. While waiting, I came across another Kickstarter project for a really tiny Arduino called the Digispark, from the Digistump team. As well as the Digispark boards themselves they were doing a range of shields, one of which was an RGB LED shield. Being a miniature Arduino means I should be able to add the odd sensor or two to the board, which I can use to influence the colour shown by the LED.

And to top it off, I could order two Digisparks and two RGB LED shields plus shipping for $26, which comes in just under the import limit for VAT.

The package was waiting for me when I got back from a weekend away and I managed to grab 5 mins today at the office to use the soldering iron to put one of the shields together and add the headers to the first digispark.

I plugged it into my laptop and the standard blink sketch seemed to be running fine. I started to follow the instructions on the wiki to set up the development environment under Linux, and I think I followed it all properly, but I could not get the tools to upload a new sketch to the device. I have added a comment to a similar question on the forum. In the meantime I have managed to get it all working on my Windows machine.

EDIT: Following a comment on the forum suggesting I try plugging the Digispark into a powered hub to program it, I have managed to get my second Digispark programmed from Linux.

Along with the RGB shield there is some sample code that makes the Digispark behave as a HID device, and some Python code that allows you to send it RGB values to set the colour shown, much the same way the Blink(1) works. When I get 5 mins I will be doing the obvious MQTT hack to allow RGB values to be posted to a topic to set the colour, then look at hooking it up to some of the data feeds available.
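That MQTT hack boils down to subscribing to a topic and shelling out to DigiRGB.py with the payload's three channel values. A sketch of just the payload-handling part, assuming space-separated values like the calendar script emits (the MQTT client wiring itself is left out):

```python
def payload_to_cmd(payload):
    """Turn an 'R G B' message payload into the DigiRGB.py
    command line, clamping each channel to 0-255."""
    parts = payload.split()
    if len(parts) != 3:
        raise ValueError("expected 'R G B', got %r" % payload)
    rgb = [max(0, min(255, int(p))) for p in parts]
    return ["DigiRGB.py"] + [str(v) for v in rgb]

print(payload_to_cmd("0 200 0"))  # ['DigiRGB.py', '0', '200', '0']
```

An MQTT on-message callback would then just pass the returned list to subprocess.call.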

Ingress

So there has been a bit of buzz recently (mainly about how to get an invite) about a new game called Ingress from Google's NianticLabs. It is an AR game where users do battle for one of two factions in a virtual world overlaid on top of the real world.

Ingress Logo

The game is still in closed beta at the moment and only open to people with invites; you can apply for an invite here. You can also submit Ingress-inspired artwork to Google Plus, tagging Brandon Badger, Brian Rose, Anne Beuttenmüller and +Joe Philley, and hope they are impressed enough to let you in.

The premise is that the work at CERN hunting for the Higgs boson has caused the release/discovery of something called Exotic Matter (XM). This XM seems to be capable of influencing human behaviour, especially in the creative and scientific direction. Following this discovery, those who know of its existence fall into two factions: the Enlightened, who believe that XM is sent to help humanity, and the Resistance, who believe that XM is part of a slow, insidious invasion. As well as XM there are portals, which seem to be related. Portals can be captured and things called Resonators added; with these, three or more portals can be linked together to create fields in the enclosed space. These fields allow the Enlightened or the Resistance to try and influence the public to their way of thinking. The battle is to create the largest fields and influence the most people.

If you happen to live in an area with no portals you can add new ones to the game by sending geo-tagged photos of "interesting" locations to Google via the app, but they take 2-3 weeks to be added. That seems a long time for somebody to maintain an interest if they are in an area with few or no existing portals. I'm still waiting for any of mine to appear, but I've managed to stay engaged by playing around the four portals up in Winchester.

There is the start of a strong social aspect to the game as well: you need other members of your faction to coordinate attacks on portals (especially the higher level ones) in order to capture them back from the opposition, and you also need their help to upgrade portals to higher levels to enable longer range links.

There are plenty of more detailed discussions about the game online, so I won't go on any more about that here. What is more interesting is the potential this sort of platform has. The combination of crowdsourcing and gamification may well lead to something like the game Spooks, from Charles Stross's Halting State. I'm not suggesting anything as sinister as Spooks, where the EULA turned out to be a click-through copy of the Official Secrets Act and the whole thing was being run by MI5, but it's important to remember that Google are not running this game for fun: to start with, it's a way to get more people to volunteer more location data, and also a nice way to collect a bunch of geo-tagged photos from the portal submission process. I'm already signed up to Latitude, so it's not a problem for me to send my location to Google again as part of the game, but I do wonder how many other players are aware of the trade they are making to play the game, and what else they may need to trade as the game continues.

I'm also sure that Google have a bunch more plans for the project. Thinking about how this could be extended leads to any number of avenues: what could you do with a large enough group of people for the promise of some notional in-game reward (at little to no cost to the host)? Things that come to mind:

  • Set tasks to submit photos of new buildings/locations to keep things like Streetview up to date.
  • Have users walk new roads/paths to update mapping data
  • Taking it a step further, with the right mechanisms built in to evaluate trust could you build a cheap/free delivery service having players deliver packages across a city?

These are just a few that came to mind as I was putting this post together, I’m sure there are many more. I’ll be keeping an eye on how it develops and of course capturing portals for the Resistance.