Google Assistant Sensors

Having built my 2 different LoRa connected temperature/humidity sensors, I was looking for something other than the Grafana instance that shows the trends.

Being able to ask Google Assistant the temperature in a room seemed like a good idea and an excuse to add the relatively new Sensor device type to my Google Assistant Bridge for Node-RED.

I’m exposing 2 options for the Sensor to start with: Temperature and Humidity. I might look at adding Air Quality later.

Once the virtual device is set up, you can feed data into the Google Home Graph using a flow similar to the following:

The join node is set to combine the 2 incoming MQTT messages into a single object keyed by their topics. The function node then builds the right payload to pass to the Google Home output node, and finally it all runs through an RBE node to make sure we only send updates when the data changes.
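
After the join node, the combined message arriving at the function node looks something like this (the readings themselves are just illustrative):

{
  "bedroom/temp": 21.5,
  "bedroom/humidity": 48.2
}

which the function node then reshapes: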

// Build the params object the Google Home output node expects
msg.payload = {
  params: {
    temperatureAmbientCelsius: msg.payload["bedroom/temp"],
    humidityAmbientPercent: Math.round(msg.payload["bedroom/humidity"])
  }
}
return msg;

Google Assistant Camera Feeds

As mentioned in a previous post I’ve been playing with Streaming Camera feeds to my Chromecast.

The next step is to enable access to these feeds via the Google Assistant. To do this I’m extending my Node-RED Google Assistant Service.

You should now be able to add a device with the type Camera and a CameraStream trait. You can then say “OK Google, show me View Camera on the Livingroom TV”

This will create an input message in Node-RED that looks like:

{
  "topic": "",
  "name": "View Camera",
  "payload": {
    "command": "action.devices.commands.GetCameraStream",
    "params": {
      "StreamToChromecast": true,
      "SupportedStreamProtocols": [
        "progressive_mp4",
        "hls",
        "dash",
        "smooth_stream"
      ],
      "online": true
    }
  }
}

The important part is mainly the SupportedStreamProtocols array, which shows the types of video stream the display device supports. In this case, because the target is a Chromecast, it shows the full list.

Since we need to reply with a URL pointing to the stream, the Node-RED input node cannot be set to Auto Acknowledge and must be wired to a Response node.

The function node updates msg.payload.params with the required details, in this case:

// Point the Assistant at the HLS version of the stream served by nginx
msg.payload.params = {
    cameraStreamAccessUrl: "http://192.168.1.96:8080/hls/stream.m3u8",
    cameraStreamProtocol: "hls"
}
return msg;

It needs to include the cameraStreamAccessUrl which points to the video stream and the cameraStreamProtocol which identifies which of the requested protocols the stream uses.

This works well when the cameras and the Chromecast are on the same network, but if you want to access remote cameras then you will want to make sure they are secured, to prevent them being found by an IoT search engine like Shodan and left open to the world.

Streaming Camera to Chromecast

I have a little cheap WiFi camera I’ve been meaning to do something with for a while. The on-board web access doesn’t really work any more because it only supports viewing the stream via Flash or a Java applet.

But the camera supports the ONVIF standard, so it offers an rtsp:// feed that I can point Linux apps like mplayer at to see the stream.
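
For example (this is the same feed URL used with ffmpeg later in the post):

$ mplayer rtsp://192.168.1.104:554/live/ch1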

The camera is currently sat on my window sill looking out over the valley which is a pretty good view.

View from upstairs window

I thought it would be interesting to stream the view to the TV in my Living room while I’m working from home at the moment. It is also a way to check the weather without having to get up in the morning and open the blind.

I have a Chromecast in the back of both TVs, so using those seemed like it would be the easiest option.

Chromecast

Chromecasts support a number of different media types, but for video we have 2 common codecs that will work across all the currently available device types:

  • H264
  • VP8

And we have 3 options to deliver the video stream:

  • HLS
  • DASH
  • SmoothStreaming

These all work in basically the same way: they chop the video up into short segments and generate a playlist that points to the segments in order. The consumer downloads each segment, and when it reaches the end of the list it downloads the playlist again, which will by then hold the next set of segments.
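
As an aside, a live HLS playlist is just a small text file that gets rewritten as new segments arrive. It looks something like this (the segment names here are illustrative, nginx numbers them for you):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:12
#EXT-X-TARGETDURATION:5
#EXTINF:5.000,
stream-12.ts
#EXTINF:5.000,
stream-13.ts
#EXTINF:5.000,
stream-14.ts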

There is a plugin for Nginx that supports both HLS and DASH, which looked like a good place to start.

NGINX

I’m running this whole stack on a Raspberry Pi 4 running Raspbian Buster.

$ sudo apt-get install nginx libnginx-mod-rtmp ffmpeg

Once the packages are installed, the following needs to be added to the end of the /etc/nginx/nginx.conf file. This sets up an rtmp listener that we can stream the video to, which will then be turned into both HLS and DASH streams to be consumed.

...
rtmp {
  server {
    listen 1935; # Listen on standard RTMP port
    chunk_size 4000;

    application show {
      live on;
      # Turn on HLS
      hls on;
      hls_type live;
      hls_path /var/www/html/hls/;
      hls_fragment 5s;
      hls_playlist_length 20s;
      
      # Turn on DASH      
      dash on;
      dash_path /var/www/html/dash/;
      dash_fragment 5s;
      dash_playlist_length 20s;

      # disable consuming the stream from nginx as rtmp
      deny play all;
    }
  }
}

The playlists and video segments get written to /var/www/html/hls and /var/www/html/dash respectively. Because they will be short-lived and replaced very regularly, it’s a bad idea to write these to an SD card as they will just cause excessive flash wear.

To get round this I’ve mounted tmpfs filesystems at those points with the following entries in /etc/fstab:

tmpfs	/var/www/html/dash	tmpfs	defaults,noatime,size=50m
tmpfs	/var/www/html/hls	tmpfs	defaults,noatime,size=50m
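
The mount points need to exist before the filesystems can be mounted, so something like this will create them and mount everything listed in fstab:

$ sudo mkdir -p /var/www/html/hls /var/www/html/dash
$ sudo mount -a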

Now we have the playlists and segments being generated in a sensible way, we need to serve them up. I added the following to the /etc/nginx/sites-enabled/default file:

server {
  listen 8080;
  listen [::]:8080;

  sendfile off;
  tcp_nopush on;
  directio 512;
  default_type application/octet-stream;

  location / {
    add_header 'Cache-Control' 'no-cache';
    add_header 'Access-Control-Allow-Origin' '*' always;
    add_header 'Access-Control-Allow-Credentials' 'true';
    add_header 'Access-Control-Expose-Headers' 'Content-Length';

    if ($request_method = 'OPTIONS') {
      add_header 'Access-Control-Allow-Origin' '*';
      add_header 'Access-Control-Allow-Credentials' 'true';
      add_header 'Access-Control-Max-Age' 1728000;
      add_header 'Content-Type' 'text/plain charset=UTF-8';
      add_header 'Content-Length' 0;
      return 204;
    }

    types {
      application/dash+xml mpd;
      application/vnd.apple.mpegurl m3u8;
      video/mp2t ts;
    }

    root /var/www/html/;
  }
}

Now that we have a system to stream the content in an acceptable format, we need to get the video from the camera into nginx. We can use ffmpeg to do this.

ffmpeg -re -rtsp_transport tcp -i rtsp://192.168.1.104:554/live/ch1 -vcodec libx264 -vprofile baseline -acodec aac -strict -2 -f flv rtmp://localhost/show/stream

This reads from the RTSP stream rtsp://192.168.1.104:554/live/ch1 and streams it to rtmp://localhost/show/stream. The show part is the name of the application declared in the rtmp section of nginx.conf, and stream will be the name of the HLS or DASH stream. In this case that gives the following:

  • HLS -> http://192.168.1.98:8080/hls/stream.m3u8
  • DASH -> http://192.168.1.98:8080/dash/stream.mpd
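
You can sanity-check that nginx is actually generating the playlist before pointing a Chromecast at it:

$ curl -s http://192.168.1.98:8080/hls/stream.m3u8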

If you change the end of the `rtmp://localhost/show/XXXX` URL you can create multiple streams from different sources with different names (just make sure the tmpfs mounts have enough space for all the streams).
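
For example, a second camera (the 192.168.1.105 address and the frontdoor name here are made up for illustration) could be pushed to its own stream:

$ ffmpeg -re -rtsp_transport tcp -i rtsp://192.168.1.105:554/live/ch1 -vcodec libx264 -vprofile baseline -acodec aac -strict -2 -f flv rtmp://localhost/show/frontdoor

which would then be served at http://192.168.1.98:8080/hls/frontdoor.m3u8 and http://192.168.1.98:8080/dash/frontdoor.mpd.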

Testing

Node-RED Chromecast flow

I’ve been using the Node-RED Chromecast node to test the streams. The DASH stream is working pretty well, but the HLS one is a bit more fragile for some reason. Latency is currently about 20-30 seconds, which appears mainly to be determined by the size and number of chunks in the playlist; things start to fall apart if I wind the fragment size down any lower than 5s or the playlist length below 20s.

Next

Now it’s basically working, the next steps are to add support for Camera devices to my Google Assistant Node-RED service so I can request the stream via voice and have it show on my Google Home Hub as well. I’m also building a standalone Google Smart Home Action just for camera feeds, using an all-Google stack, as a learning exercise in Firebase.

At the moment the stream is only available from inside my network; I’ll probably proxy it to my external web server as well and add authentication. The Google Assistant can be given a Bearer auth token along with the URL, which means I’ll be able to view the stream on my phone while out. While not important for this stream, it would be for other security camera type applications.

Node-RED Google Home Smart Home Action Generally Available

I started this post back in November 2017. It’s been a long slog, but we are finally here. We had a false start back in January 2019, when I got the bulk of the code working and thought it would take a few weeks to get it certified and released; unfortunately that wasn’t the case. There was another at the beginning of this year, when the Action got approved but still required you to be a member of a Google Group to be able to sign up. But all that is behind us now.

A Node-RED flow with lots of Google Home Assistant nodes

You can find the full docs for how to install and configure the Action here, but the short version is:

  • Create an account here
  • Create some devices using the wizard
  • Link the NR-GAB Action to your Google Account in the Google Home app, it should have the following icon.
  • Install the node-red-contrib-googlehome node in Node-RED
  • Drag the googlehome-in node on to the canvas and start building flows that can be triggered by the Google Assistant (there is a minimal sketch below).
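
As a minimal sketch (assuming a switch-type device, and that the message carries payload.command and payload.params in the same shape as the camera example earlier; the command and params names follow Google’s standard OnOff trait), a function node wired to the googlehome-in node might look like:

// Sketch: handle an OnOff command from the googlehome-in node.
if (msg.payload.command === "action.devices.commands.OnOff") {
    // params.on is true for "turn on" and false for "turn off"
    msg.payload = msg.payload.params.on ? "on" : "off";
    return msg; // wire this on to e.g. an MQTT out node that drives the device
}
return null; // ignore any other commands in this sketch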

As always the doc probably needs a little bit more work, so I’ll keep updating it as folks run into issues with it. If you have any questions the best place to ask them is in the #google-home-assistant channel on the Node-RED Slack team.

Next up, I have a working version of the Google Assistant Local Control API support that I will release as soon as Google open it up for general availability. This sends commands directly to Node-RED from the smart speaker device, which reduces the latency in triggering actions.

Node-RED Google Home Smart Home Action

Google Home

Following on from my Alexa Home Skill for Node-RED, it’s time to see about showing some love to the Google Home users (OK, I’ve been slowly chipping away at this for ages, but I’ve finally found a bit of time).

One of the nice things about Google Assistant is that it works all over the place, I can use it via the text interface if I’m somewhere and can’t talk, or even from the car via Android Auto.


Google offer a pretty similar API for controlling Smart Home devices to the one offered by Amazon for Alexa, so the implementation of this was very similar. The biggest difference is that there is no requirement to use something like Amazon’s Lambda to interface with the service, so it’s just a single web endpoint.

I’ve taken pretty much the same approach as with the Alexa version in that I have a Web Site where you can sign up for an account and then define virtual devices with specific names and characteristics.

Virtual devices

Google support a lot more device types and characteristics than Amazon do with Alexa at the moment, but to start with I’m just supporting Sockets/Lights/Switches and Thermostats. I intend to add more later as I work out the best way to surface the data.

The other big change is that the Google Assistant supports asynchronously updating the device state, and the ability for the Assistant backend to query the state of a device. To support this I’m going to allow the response node to be configured with a specific device and to accept input that has not come from an input node.
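
For example (the exact payload fields the response node will accept are an assumption on my part, so treat this as a sketch), a function node feeding the response node an asynchronous state update for a light might look like:

// Sketch: report a state change for the device configured on the response node.
// The field names follow Google's OnOff trait; whether the response node
// accepts exactly this shape is an assumption.
msg.payload = {
    online: true,  // the device is reachable
    on: true       // and currently switched on
};
return msg;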

The node is currently being beta tested; if you are interested, post in #google-home-assistant on the Node-RED Slack and I can add you to the ACL for the beta.

Google Assistant Node-RED Node

I’ll do another post when the node has finished testing and has been accepted by Google.

Google Home

Having got hold of an Amazon Echo Dot just over a week ago, I was eager to get my hands on a Google Home to compare. Luckily I had a colleague in the US on release day who very kindly brought me one back.

Google Home

It looks like the developer program will not be opening up until December, which is just as well as I need to get my Alexa project finished first.

My initial impression from a couple of hours of playing is that it’s very similar to the Echo, but it knows a lot more about me out of the box (due to Google already knowing all my secrets). I really do like the Chromecast integration for things like YouTube; I need to try the Echo with my FireTV stick to see if it can do anything similar. It’s still early days for the Google Home, and it has only been released in the US, so it wouldn’t be totally fair to compare them too closely just yet.

I’ll keep this brief for now until I can get into the developer tools. It’s going to be a fun time working out just what I can get these two devices to do.