Here's how to set up a smooth streaming video webcam from a Raspberry Pi. In my case I have a Raspberry Pi Zero W with the Pi camera, which streams a feed from a window.
Prerequisites:
- Raspberry Pi (Zero, Zero W, B, whatever - they can all handle the heavy H.264 encoding, unsure about the A)
- Raspberry Pi camera
- raspivid installed and working (enabled with raspi-config). Not covered here, but a quick sanity check is shown after this list.
- Network connection to the Raspberry Pi
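If you want to confirm the camera side is working first, something like the following should do it (assuming the standard Raspbian camera tools are installed; the test filename is arbitrary):

vcgencmd get_camera             # should report supported=1 detected=1
raspivid -t 2000 -o test.h264   # records ~2 seconds of H.264 to test.h264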
What we'll be doing is:
- Compiling FFMPEG from source
- Compiling NGINX with the rtmp module
- Creating a simple index file to load a JavaScript HLS library
- Creating a script to start the streaming
- Systemd units to keep everything going
In part 2, I'll show how to stream this via an external proxy.
FFMPEG
You'll now need to compile ffmpeg on the Pi:

# Install the libx264 dev tools
sudo apt-get install build-essential libx264-dev
# Get a copy of the latest ffmpeg
git clone git://source.ffmpeg.org/ffmpeg.git
cd ffmpeg/
# Configure ffmpeg with libx264 and non-free codecs
./configure --enable-gpl --enable-nonfree --enable-libx264
# Make and install
make && sudo make install
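Before moving on, it's worth checking that the build actually picked up libx264 (paths assume ffmpeg's default /usr/local install prefix):

/usr/local/bin/ffmpeg -version
/usr/local/bin/ffmpeg -encoders 2>/dev/null | grep -i 264   # libx264 should appear in the list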
Nginx with rtmp module
Since Nginx doesn't natively support the RTMP protocol, you have to compile the module in. Here's how:
# Download and install development tools
sudo apt-get install build-essential libpcre3 libpcre3-dev libssl-dev
# Get the rtmp module
git clone git://github.com/arut/nginx-rtmp-module.git
# Get the nginx source (at the time this was 1.14.1, review your versions...)
wget http://nginx.org/download/nginx-1.14.1.tar.gz
# Extract the nginx source and go into the directory
tar xvzf nginx-1.14.1.tar.gz && cd nginx-1.14.1
# Configure nginx with defaults + ssl + rtmp module
./configure --with-http_ssl_module --add-module=../nginx-rtmp-module --with-cc-opt=-Wno-error
# Compile and install
make && sudo make install
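To confirm the rtmp module was compiled in (the path assumes nginx's default /usr/local/nginx prefix), check the configure arguments baked into the binary:

/usr/local/nginx/sbin/nginx -V 2>&1 | grep -o 'nginx-rtmp-module'   # prints the module name if it was built in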
Next, create a directory for nginx to store the stream content in:
sudo mkdir /webcam
sudo chown nobody: /webcam
Create an nginx configuration that uses the rtmp module in /usr/local/nginx/conf/nginx.conf:
worker_processes 4;
pid /run/nginx.pid;
error_log logs/error.log debug;

events {
    worker_connections 512;
}

http {
    include mime.types;
    #default_type application/octet-stream;
    sendfile off;
    keepalive_timeout 65;

    server {
        listen 80;
        root /webcam/;

        location / {
            rewrite ^/webcam/(.*) /$1; # Allows http://site/ or http://site/webcam/
            index index.html;
            add_header Cache-Control no-cache;
            add_header 'Access-Control-Allow-Origin' '*';
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
                text/html html;
            }
        }
    }
}

rtmp {
    server {
        listen 1935;
        ping 30s;
        notify_method get;

        application video {
            live on;          # Enable live streaming
            meta copy;
            hls on;           # Enable HLS output
            hls_path /webcam; # Where to write HLS files
        }
    }
}
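Before relying on systemd, it's worth syntax-checking the configuration by hand; with the default prefix, nginx reads /usr/local/nginx/conf/nginx.conf automatically:

sudo /usr/local/nginx/sbin/nginx -t   # should report that the configuration test is successful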
To summarize the above:
- Users will connect to http://host/webcam/index.html and this will load a JavaScript library that loads the stream (next section)
- The RTMP stream will be published from ffmpeg to rtmp://host/video/streamname, and the resulting HLS output will be served out at http://host/webcam/streamname.m3u8 (a quick check is shown below)
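Once the stream is being published (the script in the following sections), a quick way to check that handoff is to request the HLS playlist directly; raspberrypi.local and the stream name "stream" match the values used later in this post:

curl http://raspberrypi.local/webcam/stream.m3u8   # should return an .m3u8 playlist referencing .ts segments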
HTML file to load stream
Create a /webcam/index.html file that loads a public HLS streaming JavaScript library - this will give you the streaming video interface. Make sure to update the URL to where you are hosting your stream:

<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<video autoplay="" controls="" id="video"></video>
<script>
if (Hls.isSupported()) {
    var video = document.getElementById('video');
    var hls = new Hls();
    // bind them together
    hls.attachMedia(video);
    hls.on(Hls.Events.MEDIA_ATTACHED, function () {
        console.log("video and hls.js are now bound together !");
        hls.loadSource("http://raspberrypi.local/webcam/stream.m3u8");
        hls.on(Hls.Events.MANIFEST_PARSED, function (event, data) {
            console.log("manifest loaded, found " + data.levels.length + " quality level");
        });
        video.play();
    });
}
</script>
Script to start the stream
The next part is to create a script that will launch raspivid to do the encoding, then feed that into ffmpeg to do the RTMP handling and point that output at nginx. I've put this script in /home/pi/webcam.sh (remember to chmod a+x this file):
#!/bin/bash
/usr/bin/raspivid -o - -t 0 -b 1000000 -w 1280 -h 720 -g 50 | \
    /usr/local/bin/ffmpeg -i - -vcodec copy -map 0:0 -strict experimental \
    -f flv rtmp://127.0.0.1/video/stream
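Before wiring this into systemd, you can run the script once by hand (with nginx already running) and check that the HLS files appear; the ten-second wait is just an arbitrary settling time:

/home/pi/webcam.sh &      # start the pipeline in the background
sleep 10; ls -l /webcam   # stream.m3u8 and stream-*.ts segments should appear
kill %1                   # stop the test run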
Quick summary of those commands:
- Start raspivid
- output to STDOUT (-o -)
- Do it forever (-t 0)
- Set the bitrate to 1000000 bps, close enough to 1 Mbit for jazz (-b 1000000)
- Set the width and height of the video to 1280x720 (-w 1280 -h 720); a lower-bandwidth variant of the whole pipeline is sketched after this list
- Set the GOP length to 50 frames, which means the image is completely refreshed every 50 frames (-g 50)
- Pipe this output into ffmpeg (| /usr/local/bin/ffmpeg)
- Tell ffmpeg to get its input from STDIN (-i -)
- Set the video codec to copy, so the camera's H.264 is passed through without re-encoding (-vcodec copy)
- Map stream 0 of the input to stream 0 of the output (-map 0:0)
- Enable some experimental features (-strict experimental)
- Set the output format to FLV (-f flv)
- Send the output to the nginx server with the rtmp module (rtmp://127.0.0.1/video/stream)
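If the Pi or your network can't keep up with 720p, the same pipeline can be tuned down; the resolution, bitrate and framerate below are just example values to illustrate, not part of the original setup:

/usr/bin/raspivid -o - -t 0 -b 600000 -w 854 -h 480 -fps 25 -g 50 | \
    /usr/local/bin/ffmpeg -i - -vcodec copy -map 0:0 -strict experimental \
    -f flv rtmp://127.0.0.1/video/stream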
Use Systemd to keep everything going
Create a systemd service that will start the webcam after nginx has started, in /etc/systemd/system/webcam.service:
[Unit]
Description=webcam
After=nginx.service
After=systemd-user-sessions.service
After=rc-local.service
Before=getty.target

[Service]
ExecStart=/home/pi/webcam.sh
Type=simple
Restart=always
User=pi
Group=pi

[Install]
WantedBy=multi-user.target
Create an nginx systemd service to start the local install of nginx in /etc/systemd/system/nginx.service:
[Unit]
Description=The NGINX HTTP and reverse proxy server
After=syslog.target network.target remote-fs.target nss-lookup.target

[Service]
Type=forking
PIDFile=/run/nginx.pid
ExecStartPre=/usr/local/nginx/sbin/nginx -t
ExecStart=/usr/local/nginx/sbin/nginx
ExecReload=/usr/local/nginx/sbin/nginx -s reload
ExecStop=/bin/kill -s QUIT $MAINPID
PrivateTmp=true

[Install]
WantedBy=multi-user.target
Start everything up
sudo systemctl daemon-reload
sudo systemctl enable nginx.service
sudo systemctl enable webcam.service
sudo systemctl start nginx.service
sudo systemctl start webcam.service
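If either service doesn't come up, the usual systemd tooling applies; the unit names match the files created above:

systemctl status nginx.service webcam.service   # both should be active (running)
sudo journalctl -u webcam.service -f            # follow the raspivid/ffmpeg output for errors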
[Embedded sample video: a clip captured from the camera and hosted elsewhere; not a live stream.]