I like this time of year: a chance to reflect on the last 12 months and take stock of what has been accomplished. One measure of success I like to use is the longevity of a solution, and a time-lapse comparison taken 6 months apart has become my go-to example.
To elaborate on this achievement: earlier this year I set up a homebrew CCTV solution using an array of Raspberry Pis with cameras and a CentOS VM acting as the PVR host. A surplus Pi Zero W was pointed at the hills and used as a time-lapse experiment.
The real achievement is that since its conception in early June, it has been stable enough to run in the background, capturing footage for such an occasion.
So here I present my latest time-lapse, a split-screen video showing the difference between a June day and a December day:
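(The post doesn’t describe how the comparison was stitched together, but for the curious, a side-by-side video like this can be produced with ffmpeg’s hstack filter; the file names below are just placeholders.)

```
# Place the June and December renders side by side in one frame (placeholder file names)
ffmpeg -i june.mp4 -i december.mp4 -filter_complex "hstack=inputs=2" -c:v libx264 side-by-side.mp4
```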
Following the setup of a CentOS CCTV server, I’ve been using Raspberry Pis as video sources. But what if a Raspberry Pi were perfectly placed for a CCTV camera, yet already in use as a media player?
Linux has a reputation for versatility, so this should be an achievable task. The barrier is the operating system already installed, in this case LibreElec, an OS with the tagline “Just enough OS for Kodi”. That makes it more of a challenge than a usual Debian install.
The team at LibreElec saw this type of thing coming, and included the Docker service as a Kodi addon to allow the curious tinkerer to add more than Kodi to a Pi.
If you have a LibreElec-based Pi in an opportune position to add a camera, here is how to add MJPEG streaming capabilities…
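As a sketch of the idea, once the Docker add-on is installed from the LibreElec repository you can start a streaming container over SSH. The image name below is a placeholder rather than anything LibreElec ships, and /dev/video0 assumes the camera appears as a standard V4L2 device:

```
# Run an MJPEG streamer container, passing the camera device through to it.
# "example/mjpg-streamer" is a placeholder image name - substitute one you trust.
docker run -d \
  --name cctv-stream \
  --restart unless-stopped \
  --device /dev/video0 \
  -p 8080:8080 \
  example/mjpg-streamer

# The stream should then be reachable at http://<pi-address>:8080/
```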
If you’ve ever searched for Raspberry Pi projects involving a camera, the results will certainly include Motioneye OS, an easy-to-use, self-contained operating system that is truly (write, then) plug and play.
Looking for a CCTV project earlier this year, I too was drawn in by it, and with my small surplus of RPi spares it was the cheapest choice: a couple of RPi 3B+ boards for video and a Zero W for time-lapse image capture. All processing was self-contained on each Pi, with the captured data passed via SMB to a Windows file share.
This worked, but it had a couple of problems that prevented it from being trustworthy. Firstly, video recording would stop after a few days of uptime, leaving only empty files. Secondly, the time-lapse camera seemed to reset every few minutes, producing whited-out captures while the camera’s exposure recalibrated and ruining the time-lapse video.
Looking wider, there was also a performance issue. With Motioneye OS in its default state, managing all features itself, the frame rate seemed to max out at 15 fps even on the Pi 3B+. Forums suggest this is because the Motioneye daemon handles all the image processing in software, putting a strain on the Pi’s modest CPU.
The goal is to move the processing and IO responsibilities to my server, which is far more capable than the then-latest available RPi, and since CentOS is my go-to Linux distribution, that is what I’ll be using.
The gateway that makes this possible is an option in Motioneye OS called Fast Network Camera. When enabled, it relieves the Pi of all processing duties and simply streams the camera capture as best it can via MJPEG.
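As a quick sanity check that a Pi is streaming (the address and port here are placeholders; Motioneye OS typically serves the MJPEG stream on port 8081, but confirm against your device’s settings), you can point a player straight at it:

```
# View the Pi's raw MJPEG stream - substitute your Pi's IP and streaming port
ffplay http://192.168.1.50:8081/
```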
Here’s how to set up Motioneye on a CentOS server to act as a central data hub for a network of RPi Motioneye OS cameras.
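As a rough sketch of what that install involves (package names and file paths are from memory of the motionEye documentation and may differ between CentOS releases, so treat this as a guide rather than a verbatim recipe; ffmpeg in particular usually needs a third-party repository):

```
# Dependencies: motion does the capture work, motionEye is the Python front end
sudo yum install -y epel-release
sudo yum install -y motion ffmpeg python-pip python-devel libcurl-devel libjpeg-turbo-devel

# motionEye itself is installed as a Python package
sudo pip install motioneye

# Seed the configuration and media directories from the bundled samples
sudo mkdir -p /etc/motioneye /var/lib/motioneye
sudo cp /usr/share/motioneye/extra/motioneye.conf.sample /etc/motioneye/motioneye.conf

# Run it as a systemd service, then browse to http://<server>:8765
sudo cp /usr/share/motioneye/extra/motioneye.systemd-unit-local /etc/systemd/system/motioneye.service
sudo systemctl daemon-reload
sudo systemctl enable --now motioneye
```

From the web interface, each Pi running in Fast Network Camera mode can then be added as a network camera using its MJPEG stream URL.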