I have a little cheap WiFi camera I’ve been meaning to do something with for a while. The on-board web access doesn’t really work any more because it only supports Flash or a Java applet to view the stream.
But the camera supports the ONVIF standard, so it also offers an rtsp:// feed that I can point Linux apps like mplayer at to see the stream.
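For example, pointing mplayer at the camera’s feed (the RTSP path varies between makes and models; this is the one my camera uses):
$ mplayer rtsp://192.168.1.104:554/live/ch1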
The camera is currently sat on my window sill looking out over the valley which is a pretty good view.

I thought it would be interesting to stream the view to the TV in my living room while I’m working from home at the moment. It’s also a way to check the weather without having to get up in the morning and open the blind.
I have a Chromecast in the back of both TVs, so using those seemed like it would be the easiest option.
Chromecast
Chromecasts support a number of different media types, but for video there are two common codecs that will work across all the currently available device types:
- H.264
- VP8
And there are three options to deliver the video stream:
- HLS
- DASH
- SmoothStreaming
These all work in basically the same way: they chop the video up into short segments and generate a playlist that points to the segments in order. The consumer downloads each segment, and when it reaches the end of the list it downloads the playlist again, which by then holds the next set of segments.
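As an illustration, a live HLS playlist is just a small text file that gets rewritten as new segments arrive, something like this (the segment names here are illustrative):
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:5.000,
stream-42.ts
#EXTINF:5.000,
stream-43.ts
#EXTINF:5.000,
stream-44.ts
#EXTINF:5.000,
stream-45.ts
The media sequence number ticks up as old segments drop off the front of the list, so with 5s fragments and a 20s playlist a consumer polling the list always sees the most recent four segments.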
There is an RTMP plugin for Nginx that can generate both HLS and DASH, which looked like a good place to start.
NGINX
I’m running this whole stack on a Raspberry Pi 4 running Raspbian Buster.
$ sudo apt-get install nginx libnginx-mod-rtmp ffmpeg
Once the packages are installed, the following needs to be added to the end of the /etc/nginx/nginx.conf file. This sets up an RTMP listener that we can stream the video to, which will then be turned into both an HLS and a DASH stream to be consumed.
...
rtmp {
    server {
        listen 1935; # Listen on standard RTMP port
        chunk_size 4000;

        application show {
            live on;

            # Turn on HLS
            hls on;
            hls_type live;
            hls_path /var/www/html/hls/;
            hls_fragment 5s;
            hls_playlist_length 20s;

            # Turn on DASH
            dash on;
            dash_path /var/www/html/dash/;
            dash_fragment 5s;
            dash_playlist_length 20s;

            # Disable consuming the stream from nginx as RTMP
            deny play all;
        }
    }
}
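After editing the file it’s worth checking the syntax and reloading nginx to pick up the new rtmp block:
$ sudo nginx -t
$ sudo systemctl reload nginx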
The playlists and video segments get written to /var/www/html/hls and /var/www/html/dash respectively. Because they are short-lived and replaced very regularly, it’s a bad idea to write them to an SD card, as that will just cause excessive flash wear.
To get round this I’ve mounted tmpfs filesystems at those points, with the following entries in /etc/fstab:
tmpfs /var/www/html/dash tmpfs defaults,noatime,size=50m
tmpfs /var/www/html/hls tmpfs defaults,noatime,size=50m
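The mount points need to exist before anything can be mounted on them, and the new entries can be mounted by hand rather than waiting for a reboot:
$ sudo mkdir -p /var/www/html/hls /var/www/html/dash
$ sudo mount /var/www/html/hls
$ sudo mount /var/www/html/dash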
Now that we have the playlists and segments being generated in a sensible way, we need to serve them up. I added the following to the /etc/nginx/sites-enabled/default file:
server {
    listen 8080;
    listen [::]:8080;

    sendfile off;
    tcp_nopush on;
    directio 512;
    default_type application/octet-stream;

    location / {
        add_header 'Cache-Control' 'no-cache';
        add_header 'Access-Control-Allow-Origin' '*' always;
        add_header 'Access-Control-Allow-Credentials' 'true';
        add_header 'Access-Control-Expose-Headers' 'Content-Length';

        # Answer CORS pre-flight requests without touching the files
        if ($request_method = 'OPTIONS') {
            add_header 'Access-Control-Allow-Origin' '*';
            add_header 'Access-Control-Allow-Credentials' 'true';
            add_header 'Access-Control-Max-Age' 1728000;
            add_header 'Content-Type' 'text/plain charset=UTF-8';
            add_header 'Content-Length' 0;
            return 204;
        }

        # MIME types for the playlists and segments
        types {
            application/dash+xml mpd;
            application/vnd.apple.mpegurl m3u8;
            video/mp2t ts;
        }

        root /var/www/html/;
    }
}
Now that we have a system to serve the content in an acceptable format, we need to get the video from the camera into nginx. We can use ffmpeg to do this:
ffmpeg -re -rtsp_transport tcp -i rtsp://192.168.1.104:554/live/ch1 -vcodec libx264 -vprofile baseline -acodec aac -strict -2 -f flv rtmp://localhost/show/stream
This reads from the RTSP stream rtsp://192.168.1.104:554/live/ch1 and streams it into rtmp://localhost/show/stream. The show part is the name of the application declared in the rtmp section of nginx.conf, and stream is the name of the HLS or DASH stream. In this case that gives the following:
- HLS -> http://192.168.1.98:8080/hls/stream.m3u8
- DASH -> http://192.168.1.98:8080/dash/stream.mpd
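A quick sanity check that nginx is serving both playlists (while ffmpeg is running) is to request them directly:
$ curl -I http://192.168.1.98:8080/hls/stream.m3u8
$ curl -I http://192.168.1.98:8080/dash/stream.mpd
Both should come back with a 200 and the MIME types set up in the server block above.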
If you change the end of the `rtmp://localhost/show/XXXX` URL you can create multiple streams from different sources with different names (just make sure the tmpfs mounts have enough space for all the streams).
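The ffmpeg process has to keep running for the streams to exist, so rather than leaving it in a terminal I’d wrap it in a systemd unit, roughly like this (the unit name and settings are my own sketch, not part of the setup above):
[Unit]
Description=Restream camera RTSP feed to nginx-rtmp
After=network-online.target nginx.service

[Service]
# Same ffmpeg command as above; restart if the camera drops the connection
ExecStart=/usr/bin/ffmpeg -re -rtsp_transport tcp -i rtsp://192.168.1.104:554/live/ch1 -vcodec libx264 -vprofile baseline -acodec aac -strict -2 -f flv rtmp://localhost/show/stream
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
Saved as something like /etc/systemd/system/camera-stream.service and enabled with sudo systemctl enable --now camera-stream.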
Testing

I’ve been using the Node-RED Chromecast node to test the streams. The DASH stream is working pretty well, but HLS is a bit more fragile for some reason. Latency is currently about 20-30 seconds, which appears mainly to be determined by the size and number of fragments in the playlist; if I wind the fragment size down below 5s, or the playlist length below 20s, playback starts to break down.
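The streams can also be sanity-checked without a Chromecast; ffplay (which ships in the ffmpeg package) will open the HLS playlist directly:
$ ffplay http://192.168.1.98:8080/hls/stream.m3u8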
Next
Now it’s basically working, the next steps are to add support for camera devices to my Google Assistant Node-RED service, so I can request the stream via voice and have it show on my Google Home Hub as well. I’m also building a standalone Google Smart Home Action just for camera feeds, using an all-Google stack, as a learning exercise in Firebase.
At the moment the stream is only available from inside my network. I’ll probably proxy it to my external web server as well and add authentication. The Google Assistant can be given a bearer auth token along with the URL, which means I’ll be able to view the stream on my phone while out. While not important for this stream, it would be for other security-camera applications.
Comments

“I like this solution a lot and am working to implement it with some Foscam IP cams. I would like to only restream when a browser is connected; otherwise it’s wasted network traffic continually pulling the RTSP streams. Do you have any clever ways to pull the stream only when needed?”

No, I’m running it on a dedicated wireless network so it’s not a problem.