[
  {
    "id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
    "name": "string",
    "interface": "ONVIF",
    "state": "RUNNING",
    "uptime": 0,
    "fps": 0,
    "gop": 0,
    "bitrate": 0,
    "jitter": 0,
    "pic_size": "1920x1080",
    "video_codec": "h264",
    "audio_codec": "aac",
    "delivery_policy": "string",
    "restarts": 0
  }
]
Get device by ID. The ID is a UUID.
id (required) | string <uuid> | Camera ID
{
  "id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
  "name": "string",
  "interface": "ONVIF",
  "state": "RUNNING",
  "uptime": 0,
  "fps": 0,
  "gop": 0,
  "bitrate": 0,
  "jitter": 0,
  "pic_size": "1920x1080",
  "video_codec": "h264",
  "audio_codec": "aac",
  "delivery_policy": "string",
  "restarts": 0
}
Get device by ID. The ID is a UUID.
id (required) | string <uuid> | Sensor ID
{
  "id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
  "name": "string",
  "interface": "Hikvision",
  "state": "RUNNING",
  "uptime": 0,
  "restarts": 0
}
Get device by ID. The ID is a UUID.
id (required) | string <uuid> | Gateway ID
{
  "id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
  "name": "string",
  "interface": "Lenel",
  "state": "RUNNING",
  "uptime": 0,
  "restarts": 0
}
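Since the endpoints above take a UUID path parameter, a client can validate the ID before issuing the request. A minimal C sketch of that check and of building the request URL; note that the `/devices/%s` path and the base URL are hypothetical placeholders, not taken from this document:

```c
#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Check the canonical 8-4-4-4-12 textual UUID form (36 characters). */
static int looks_like_uuid(const char *s)
{
    if (strlen(s) != 36) return 0;
    for (int i = 0; i < 36; i++) {
        int is_dash = (i == 8 || i == 13 || i == 18 || i == 23);
        if (is_dash ? s[i] != '-' : !isxdigit((unsigned char)s[i])) return 0;
    }
    return 1;
}

/* Build a "get device by ID" URL; the path segment is an assumption. */
static int device_url(char *out, size_t n, const char *base, const char *id)
{
    if (!looks_like_uuid(id)) return -1;   /* the server expects a UUID */
    snprintf(out, n, "%s/devices/%s", base, id);
    return 0;
}
```

Rejecting malformed IDs client-side avoids a round trip for a request the server would refuse anyway.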
VAE stands for Video Analytics Engine.
Some basics:
We use JSON as the format for metadata exchange, with full separation between engine reports and visualization commands. This allows the visualization to be changed as needed without any engine modifications.
Analytical metadata is tied to a video frame using a timestamp.
Avoid sending empty or useless metadata, because it takes up storage space: if there are no detections, the metadata is an empty string.
All coordinates (e.g., x-y locations of bounding boxes, dimensions, etc.) are normalized across 16 bits, i.e., 0-65535 corresponds to 0-100%, and are relative to the top-left corner of the image.
Each target is reported by a "bounding box" with (x, y) at the top-left corner of the detected base and (w, h) as its dimensions.
Analytical metadata contains a 'visual' attribute if there is something to show on the screen: bounding boxes, text, polygons, or polylines.
Analytical metadata contains a 'meta' attribute if there are detections and it is necessary to create an event (alert), to store the metadata in the metastorage for history and later search, or to produce some report in the future.
All 'meta' attribute values are strings unless otherwise specified (exceptions: snaps.default is a boolean; meta.box is an array of ints).
The 'meta' attribute consists of mandatory UUID v1 target id(s) and one or more classifiers, each with a mandatory name and a 'val' attribute whose value "0" or "1" stands for false/true (absent or present).
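The 16-bit coordinate convention above can be sketched in C as follows. The exact rounding behaviour is an assumption; only the 0..65535 range and the top-left origin come from this document:

```c
#include <stdint.h>

/* Map a pixel coordinate into the normalized 0..65535 range
   (0..100% of the frame dimension), relative to the top-left corner. */
static uint16_t px_to_norm(int px, int frame_dim)
{
    return (uint16_t)((int64_t)px * 65535 / frame_dim);
}

/* Inverse mapping, back to pixels (may lose up to one pixel to rounding). */
static int norm_to_px(uint16_t norm, int frame_dim)
{
    return (int)((int64_t)norm * frame_dim / 65535);
}
```

For a 1920-pixel-wide frame, the horizontal center (x = 960) normalizes to 32767, i.e. roughly 50% of the 16-bit range.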
Additional attributes become mandatory when an event needs to be created; see the 'meta' section in the structure below.
YAML (a superset of JSON) is used here for clarity ONLY, as JSON and YAML are mostly interchangeable. The general structure in detail, with inline comments, using the Intrusion Detection engine:
type: result                      # mandatory
ver: 6                            # mandatory, current format version
pts: '20221116150647.234425'      # mandatory, Unix timestamp with microseconds in human-readable form
kind: ID                          # mandatory, VAE engine name (ID - Intrusion Detection)
# ----- "status report" data -----
status:
  err_code: 100500                # optional, present only if an error happened
  err_msg: some error message     # optional, present only if an error happened
  quality: 100                    # VAE engine "results quality estimate". Example: 100% if FPS is high and detections are of high confidence, 50% if FPS/resolution is lower and/or confidence is low, etc.
list:                             # visual part, used only for player metadata visualization
  target-id1:                     # engine's unique id of some target
    trig: true                    # 'alert' if "true", 'info' otherwise (in short: red or green bounding boxes)
    visual:                       # special "visual" attribute for visualisation
      opt:                        # options
        def:                      # engine's possible visualization names
          - "Bounding box"        # used for bounding box visualisation
          - Classification        # used for textual classification visualisation
          - Status                # used for status visualisation (e.g. "Learning the scene")
          - Trail                 # used for object trail visualisation
          - "Triggered rule"      # used e.g. to show the zone where a triggered rule happened
        pp: [x, y]                # projection point
        box: [l, t, w, h]         # not for visualization but for the element's position (Left/Top/Width/Height in the 0..65535/0..65535 coordinate system)
      "Bounding box":             # name from the opt.def array above
        - box: [x1, y1, x2, y2]   # may be "box" / "polygon", coords for visualisation
      Classification:             # name from the opt.def array above, used for textual classification visualisation
        - text: vehicle
        - text: "0.55 sqm"        # also see the example below for a different 'text' format that references meta keys as variables
        - text: "1.55 m"
        - text: "1.27 km/h"
  target-id1_schema:              # used to show the zone where a triggered rule happened, or some more complex schema
    trig: true                    # 'alert'
    visual:
      "Triggered rule":           # name from the opt.def array above
        - polygon: [              # "polygon" or "pline" for polygons or polylines respectively
            x1,
            y1,
            x2,
            y2,
            ...,
            xn,
            yn
          ]
  target-id2:                     # next engine's unique target id
    .........
# ----- "meta" is a collection of targets and their estimated classifiers in a standardized/flat structure -----
meta:
  - id: some-unique-target-id1    # UUID v1 (time-based) for the target
    box: [l, t, w, h]             # Left/Top/Width/Height for bounding box in the 0..65535/0..65535 coordinate system
    snaps:                        # optional, if there is a need to store snapshot(s)
      - type: ID-rule             # ID will produce "ID-snap", LPR will produce "LPR-snap"
        default: true
        snapshot: "....."         # snapshot encoded in Base64
        tags:
          mime: image/jpeg        # snapshot mime-type
          orig-size: "[w,h]"      # original video size, pixels
          snap-zone: "[l,t,w,h]"  # snapshot cut-out position and size, pixels
          schema: "{...json...}"  # analytical schema in JSON format, as described for the visual attribute above
    cls:
      SCHEMA-VER:                 # SPECIAL CLASSIFIER, engine's schema version. May be omitted for now
        val: "B123D6F7"
      ID-area:                    # sample area classifier
        val: "0.55"               # floating-point value as a string
        unit: sqm
      ID-height:                  # sample height classifier
        val: "1.55"               # floating-point value as a string
        unit: m
      ID-speed:                   # sample speed classifier
        val: "1.27"               # floating-point value as a string
        unit: km/h
      ID-person:                  # sample person classifier
        val: "0"                  # false, not a person
      ID-vehicle:                 # sample vehicle classifier
        val: "1"                  # true, it is a vehicle
      ID-rule:
        val: vehicle              # equal to the object classification
        src: rule                 # source name
        rule: presence            # rule name
        zone: Entrance            # zone name
        inited_at:                # initial event timestamp in ms; MUST exactly match the time from the UUID v1 for this target (some-unique-target-id1)
        created:                  # event update timestamp in ms; MUST equal 'inited_at' for a new event
      message:                    # SPECIAL CLASSIFIER, generates events from "rule" / "watchlist" / etc.
        text: text-of-event       # text of the event, e.g. "'Entrance' wire crossed by 'vehicle'"
{
"kind":"ID",
"vae":"vca",
"ver":6,
"type":"result",
"list":{
"2.1010":{
"trig":true,
"visual":{
"opt":{
"def":[
"Bounding box",
"Counter",
"Classification",
"Status",
"Trail",
"Triggered rule"
],
"pp":[
36360,
55482
],
"box":[
23041,
31627,
26638,
23855
]
},
"Bounding box":[
{
"box":[
23041,
31627,
26638,
23855
]
}
],
"Classification":[
{
"text":"vehicle"
},
{
"text":"${val[0]} ${unit[0]}",
"val": ["0.55"],
"unit": ["sqm"]
},
{
"text":"${v[0]} ${u[0]}",
"v": ["1.55"],
"u": ["m"]
},
{
"text":"${v[0]} ${u[0]}",
"v": ["1.27"],
"u": ["km/h"]
}
]
}
},
"2.1011":{
"trig":true,
"visual":{
"opt":{
"def":[
"Bounding box",
"Counter",
"Classification",
"Status",
"Trail",
"Triggered rule"
],
"pp":[
15048,
35223
],
"box":[
12786,
29209,
4525,
6014
]
},
"Bounding box":[
{
"box":[
12786,
29209,
4525,
6014
]
}
]
}
},
"2.1110_schema":{
"trig":true,
"visual":{
"Triggered rule":[
{
"polygon":[
0,
0,
0,
65535,
65125,
65125,
65535,
273,
33791,
1229
]
}
]
}
}
},
"meta":[
{
"id":"9a9ffc7f-674e-11ed-8015-0242ac110003",
"box":[
12786,
29209,
4525,
6014
],
"cls":{
"ID-area":{
"val": "0.55",
"unit": "sqm"
},
"ID-height":{
"val": "1.55",
"unit": "m"
},
"ID-speed":{
"val": "1.27",
"unit": "km/h"
},
"ID-person":{
"val": "0"
},
"ID-vehicle":{
"val": "1"
},
"ID-rule":{
"val": "vehicle",
"src": "rule",
"rule": "Zone 0-Presence",
"zone": "Zone 0",
"inited_at": "1668782281763",
"created": "1668782281763"
},
"message":{
"text": "Intrusion Detection. 'Zone 0-Presence' triggered by 'vehicle'"
},
"SCHEMA-VER":{
"val":"B123D6F7"
}
}
}
],
"pts":"20221118143909.029553"
}
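The 'Classification' entries above use a 'text' template that references sibling keys as variables, e.g. "${v[0]} ${u[0]}" with v = ["1.27"] and u = ["km/h"]. A minimal sketch of how a player might expand that form; the helper and its exact substitution rules are assumptions, not part of the format specification:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* A named variable: "v" -> {"1.27"}, "u" -> {"km/h"}, etc. */
struct var { const char *name; const char **vals; };

/* Expand "${name[index]}" references in tmpl into out (must be large
   enough); unresolved references expand to nothing. */
static void expand(const char *tmpl, const struct var *vars, size_t nvars,
                   char *out, size_t outsz)
{
    size_t o = 0;
    while (*tmpl && o + 1 < outsz) {
        if (tmpl[0] == '$' && tmpl[1] == '{') {
            const char *end = strchr(tmpl, '}');
            const char *br  = strchr(tmpl, '[');
            if (end && br && br < end) {
                size_t nlen = (size_t)(br - (tmpl + 2));   /* name length */
                int idx = atoi(br + 1);                    /* array index */
                for (size_t i = 0; i < nvars; i++) {
                    if (strlen(vars[i].name) == nlen &&
                        strncmp(vars[i].name, tmpl + 2, nlen) == 0) {
                        o += (size_t)snprintf(out + o, outsz - o,
                                              "%s", vars[i].vals[idx]);
                        break;
                    }
                }
                tmpl = end + 1;   /* skip past the "${...}" reference */
                continue;
            }
        }
        out[o++] = *tmpl++;       /* ordinary character, copy verbatim */
    }
    out[o] = '\0';
}
```

With the example data above, expanding "${v[0]} ${u[0]}" yields "1.27 km/h", the same text a non-templated Classification entry would carry.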
{
"witness": [
"c14bbbbc-5059-11ed-a47e-15de91f04e1f"
],
"metadata": {
"kind": "External motion detector",
"meta": [
{
"cls": {
"message": {
"text": "Motion detected"
}
}
}
]
}
}
Build the following test_uuid.c with gcc -o test_uuid test_uuid.c -luuid. Tested on Ubuntu 20.04.
#include <stdio.h>
#include <sys/time.h>   /* struct timeval */
#include <uuid/uuid.h>

int main(int argc, char* argv[])
{
    uuid_t uuid;
    char ext_id[37];                      /* 36 characters + terminating NUL */
    struct timeval tv_inited_at;

    (void)uuid_generate_time_safe(uuid);  /* time-based UUID v1 */
    uuid_unparse_lower(uuid, ext_id);     /* textual form for the 'id' field */
    (void)uuid_time(uuid, &tv_inited_at); /* recover the embedded timestamp */

    /* 'inited_at' in milliseconds, e.g. 1668782281763 */
    printf("id: %s inited_at: %ld%03ld\n", ext_id,
           (long)tv_inited_at.tv_sec, (long)tv_inited_at.tv_usec / 1000);
    return 0;
}
Event
witness | Array of strings <uuid> |
metadata | object, see "Analytics metadata explanations" |

{
  "witness": [
    "497f6eca-6276-4993-bfeb-53cbbbba6f08"
  ],
  "metadata": { }
}
Events are streamed continuously as multipart messages until the client disconnects or a timeout occurs. If no events were registered, a heartbeat message (an empty JSON) is generated every 10 seconds. If receiving an event/heartbeat times out or the network disconnects, perform the request again.
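The heartbeat/event distinction can be sketched as follows. This assumes the heartbeat body is an empty JSON object, possibly padded with whitespace; the multipart transport and reconnect loop themselves are left out:

```c
#include <ctype.h>

/* Return 1 if the message body is a heartbeat (an empty JSON object),
   0 if it is a real event that should be parsed and handled. */
static int is_heartbeat(const char *body)
{
    while (isspace((unsigned char)*body)) body++;
    if (*body++ != '{') return 0;
    while (isspace((unsigned char)*body)) body++;
    if (*body++ != '}') return 0;
    while (isspace((unsigned char)*body)) body++;
    return *body == '\0';
}
```

A client would call this on each received part: heartbeats merely reset the receive timeout (anything over ~10 seconds of silence means the connection is dead and the request should be reissued), while non-heartbeat bodies are dispatched to the event handler.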