Sending CCTV to the cloud

I finally got round to adding another critical element to my Zoneminder CCTV system.

The one thing that my system lacked was cloud storage for video clips. When something is detected by the cameras, it’s important to get the footage offsite; otherwise it’s vulnerable to being destroyed. Burglars actively look for, and destroy, anything that might be a recording device, and it’s a selling point of systems such as Arlo and Nest that footage is uploaded to their cloud storage.

I needed an equivalent for my system. In my last post I described how I set up Cloud Sync on my Synology NAS to act as a gateway between my Linux server and Onedrive, and that gives me an easy way to sync my Zoneminder videos.

The task, then, is to copy a video file from Zoneminder to Onedrive.

Within my Onedrive synced folder I’ve created a directory “cctv” to store the clips that I want to upload to Onedrive. I’m taking the low-resolution feeds from the cameras into Zoneminder, so I’ve created a “low” subdirectory – I’ll come back to the high-resolution videos another time.

I only need to send clips of interest – i.e. where motion has been detected – and, with my object recognition motion detection, these appear in Zoneminder with note text describing what was detected.

I can then use filters in Zoneminder to look for the text “detected” and run a script against each match. The script receives the event ID as a parameter, so I’m using that to make a call to the Zoneminder API to get the full event details.

The zmcapture.sh script is simply a wrapper for a Python script:

#!/bin/bash
/opt/local/bin/zmupload.py "$1" >> /var/local/zoneminder/zmupload2.txt

In the Python script, I start off by setting some variables – the Zoneminder credentials, the root URL for the Zoneminder API and my target directory:

username = '######'
password = '#######'

url = 'http://localhost/zm/api/'
output_dir = '/mnt/onedrive/cctv/low'
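
As I note at the end of the post, these parameters really belong outside the script. A minimal sketch of how that might look with Python’s configparser – the file location, section name and key names here are my own assumptions, not anything Zoneminder defines:

```python
from configparser import ConfigParser

def load_settings(path):
    """Read the upload settings from an INI-style file.

    Expected layout (section and key names are illustrative):
    [zoneminder]
    username = ...
    password = ...
    url = http://localhost/zm/api/
    output_dir = /mnt/onedrive/cctv/low
    """
    config = ConfigParser()
    config.read(path)
    zm = config['zoneminder']
    return zm['username'], zm['password'], zm['url'], zm['output_dir']
```

The config file (e.g. /etc/zmupload.conf) can then be locked down with file permissions instead of leaving credentials readable in the script.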

I then grab the event ID from the script parameter:

eid = sys.argv[1]

I have Zoneminder authorisation enabled, so to use the API the script must log in and store the resulting cookies for subsequent API calls. The Python requests library makes that simple:

login = {'user': username, 'pass': password}
r = requests.post(url + 'host/login.json', data=login)
cookies = r.cookies

I can now get the event details, parsing the start time into a Python datetime object:

r = requests.get('{}events/{}.json'.format(url, eid), cookies=cookies)

data = r.json()

path = data['event']['Storage']['Path']
videofile = data['event']['Event']['DefaultVideo']
monitor = data['event']['Monitor']['Name']

start_time = datetime.strptime(data['event']['Event']['StartTime'], '%Y-%m-%d %H:%M:%S')
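
The format string matches Zoneminder’s “YYYY-MM-DD HH:MM:SS” timestamps. A quick illustration with a made-up value:

```python
from datetime import datetime

# Illustrative timestamp in the same format as the StartTime field
ts = datetime.strptime('2020-02-11 14:30:05', '%Y-%m-%d %H:%M:%S')
```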

The path variable is the root directory of the Zoneminder storage.

Below that, there is a subdirectory for each monitor, a subdirectory for each date and finally a subdirectory for each event.

For example, the video file for event 123456 from the front garden camera will be something like:

/cctv/zoneminder/events/Front Garden/2020-02-11/123456/123456.mp4

I can build that up from the event details:

date_path = datetime.strftime(start_time, '%Y-%m-%d')
input_file = "{}/{}/{}/{}/{}".format(path, monitor, date_path, eid, videofile)
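
Plugging in the example event from above (all values here are illustrative) shows the path being assembled:

```python
# Illustrative values matching the example event above
path = '/cctv/zoneminder/events'
monitor = 'Front Garden'
date_path = '2020-02-11'
eid = '123456'
videofile = '123456.mp4'

input_file = "{}/{}/{}/{}/{}".format(path, monitor, date_path, eid, videofile)
# -> /cctv/zoneminder/events/Front Garden/2020-02-11/123456/123456.mp4
```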

For the output file I want a simpler structure: I’m putting everything into a single directory, with filenames that include the monitor name and a timestamp. There shouldn’t be enough files to warrant anything more complex – I’m only copying motion events, and I’m only going to store them for a limited period of time.

So, my output file can be built up like this:

output_file = "{}/{}-{}.mp4".format(output_dir, monitor, datetime.strftime(start_time, '%Y-%m-%d-%H-%M-%S'))
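
For the same example event, that gives a flat, self-describing filename:

```python
from datetime import datetime

# Illustrative values matching the example event above
output_dir = '/mnt/onedrive/cctv/low'
monitor = 'Front Garden'
start_time = datetime(2020, 2, 11, 14, 30, 5)

output_file = "{}/{}-{}.mp4".format(output_dir, monitor, datetime.strftime(start_time, '%Y-%m-%d-%H-%M-%S'))
# -> /mnt/onedrive/cctv/low/Front Garden-2020-02-11-14-30-05.mp4
```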

And finally, I just need to copy the file:

copyfile(input_file, output_file)

In full, the script is:

#!/usr/bin/python3

import sys
import requests
from datetime import datetime
from shutil import copyfile

username = '######'
password = '#######'
url = 'http://localhost/zm/api/'
output_dir = '/mnt/onedrive/cctv/low'

eid = sys.argv[1]

login = {'user': username, 'pass': password}
r = requests.post(url + 'host/login.json', data=login)
cookies = r.cookies
r = requests.get('{}events/{}.json'.format(url, eid), cookies=cookies)

data = r.json()

path = data['event']['Storage']['Path']
videofile = data['event']['Event']['DefaultVideo']
monitor = data['event']['Monitor']['Name']
start_time = datetime.strptime(data['event']['Event']['StartTime'], '%Y-%m-%d %H:%M:%S')
date_path = datetime.strftime(start_time, '%Y-%m-%d')

input_file = "{}/{}/{}/{}/{}".format(path, monitor, date_path, eid, videofile)
output_file = "{}/{}-{}.mp4".format(output_dir, monitor, datetime.strftime(start_time, '%Y-%m-%d-%H-%M-%S'))

copyfile(input_file, output_file)

As always, I’ve left out the error handling for brevity. A real script should check the return codes of the API calls and the file copy. It would also be better to store parameters such as the URL and the credentials outside the script itself.
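
As a sketch of what that might look like for the copy step – the function name is my own, and a fuller version would do the same for the API responses:

```python
import logging
from shutil import copyfile

def safe_copy(src, dst):
    """Copy a video file, logging a failure instead of crashing the filter run."""
    try:
        copyfile(src, dst)
        return True
    except OSError as exc:
        logging.error("Failed to copy %s to %s: %s", src, dst, exc)
        return False
```

The filter keeps running if a single copy fails, and the log line tells me which event needs re-uploading.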

Zoneminder filters run every 20 seconds by default, so there is clearly going to be a delay between the event being detected and the script running but I don’t think it’s significant enough to worry about.

I only want to store events from the last seven days on Onedrive, and that’s easily handled by a cron job running on the Linux server:

0 1 * * * find /mnt/onedrive/cctv/ -type f -mtime +7 -exec rm -f {} \;

There is a simpler way to do all of this. Given that the video filename is based on the event ID, I could’ve done the copy directly in the wrapper script using the find command:

find /cctv/zoneminder/events -name "$1-video.mp4" -exec cp {} /mnt/onedrive/cctv/low/ \;

I’m doing it in Python so that I can pull in the name of the monitor and use that for the target filename. Plus it was a chance to play with the Zoneminder API…
