External Metadata
Any external system can send analytics metadata to Macula servers. Such sources may be individual cameras with edge analytics or third-party analytics engines; although they can operate independently, it usually makes sense to send the metadata along with the analyzed video stream to the Macula server. The server can then overlay the metadata in both live and archive playback, use it in Event & Action rules, and let you view and search the resulting events by their recognition attributes.
A simple example of such an integration is an ANPR camera sending license plate recognition results together with the video stream.
At this point, Macula servers support two metadata classes: LPR and FR (license plate and facial recognition).
Prerequisites
You will need to configure the external analytics source to send the metadata in JSON format over HTTP/HTTPS. The only accepted payload is a JSON object containing an array named "objects", whose items have the following properties:
x and y - relative coordinates, [0; 1], float
width and height - relative object size, [0; 1], float
classId - internal Macula class identifier, string (4 = FR, 17 = LPR)
id - object ID, string; used to distinguish between objects and to ensure correct moving-object drawing across frames; if not specified and there are multiple objects, they will not be drawn correctly
className - user-defined class name, optional, string
accuracy - recognition accuracy, optional, [0; 1], float
value - recognition value (plate number or subject name/surname), mandatory, string
attributes - comma-delimited value attributes, e.g., car color, make, model, year etc., optional, string
If the data are even partially incorrect, the server will return an error. If everything is correct, expect an HTTP/1.1 204 No Content response.
If classId is not equal to 4 or 17, the metadata will be accepted and displayed as overlay but you will be unable to search it.
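The properties above can be assembled and posted with a short Python sketch. Note that the endpoint URL, port, and path below are hypothetical placeholders, not part of the documented API; substitute the address configured for your Macula server's metadata listener.

```python
import json

# Hypothetical endpoint (assumption): replace with the address of your
# Macula server's external metadata listener.
MACULA_URL = "http://macula-server:8080/external-metadata"

def make_object(x, y, width, height, class_id, value,
                obj_id=None, class_name=None, accuracy=None, attributes=None):
    """Build one metadata object following the documented schema."""
    for v in (x, y, width, height):
        if not 0.0 <= v <= 1.0:
            raise ValueError("coordinates and sizes must be in [0; 1]")
    obj = {"x": x, "y": y, "width": width, "height": height,
           "classId": str(class_id),  # classId is a string: "4" = FR, "17" = LPR
           "value": value}            # mandatory recognition value
    if obj_id is not None:
        obj["id"] = obj_id            # needed to track objects across frames
    if class_name is not None:
        obj["className"] = class_name
    if accuracy is not None:
        obj["accuracy"] = accuracy    # float in [0; 1]
    if attributes is not None:
        obj["attributes"] = attributes  # comma-delimited string
    return obj

payload = {"objects": [
    make_object(0.25, 0.09, 0.2, 0.17, 17, "GZ1729",
                obj_id="LPR0013", attributes="silver,Honda,sedan,2007"),
]}
body = json.dumps(payload).encode("utf-8")

# To actually send it (the server responds with HTTP 204 on success):
# import urllib.request
# req = urllib.request.Request(MACULA_URL, data=body,
#                              headers={"Content-Type": "application/json"})
# urllib.request.urlopen(req)  # expect status 204, no body
```

The POST itself is left commented out so the sketch stays runnable offline; only the payload construction reflects the documented schema.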
Example
{
  "objects": [
    {
      "x": 0.15,
      "y": 0.4,
      "width": 0.5,
      "height": 0.6,
      "id": "01",
      "classId": "4",
      "className": "Face",
      "value": "Guy Julius Caesar",
      "accuracy": 0.95
    },
    {
      "x": 0.25,
      "y": 0.09,
      "width": 0.2,
      "height": 0.17,
      "id": "LPR0013",
      "classId": "17",
      "value": "GZ1729",
      "attributes": "silver,Honda,sedan,2007"
    }
  ]
}
Configuration in Macula Console
The Macula servers accept metadata automatically starting from software version 1.21.0.
The metadata are partially stored and displayed in the archive, represented by a wide semi-transparent line on the archive timeline. Specifically, all bounding boxes (colorful rectangles) are stored in the video archive, while the rest of the data - recognition results, attributes, etc. - are stored in a separate database. Without the database, you will only be able to see the data overlay in playback, without event values and properties.
In Macula Console, you can change the database used for storing metadata. By default, a built-in database (SQLite) is enabled for all clean installations. To change the database settings (limits etc.), go to the Configuration section > choose Servers on the left > select the External databases tab > click Change > select a database and click the Edit icon.
To check the DB configuration, in the Configuration section > Servers, scroll horizontally and check the Recognition history DB column. If there is no such column, add it to the displayed columns by editing the item grid using the Edit columns button in the upper right corner.

To verify that the data are recorded into the database, switch to the Monitoring section > choose Servers on the left > check the Recognition history DB field. If there is no such column, add it to the displayed column list by configuring the table using the Edit columns button in the upper right corner.

External Metadata Display in Macula Monitor
The metadata are displayed as a video overlay - colorful bounding boxes with parameters - in live and regular/instant playback, as well as in the 1x1 view in the dedicated external service tabs. The object value is displayed in the middle of the rectangle. An (X) in the corner means that the metadata source is external; it is accompanied by other parameters (e.g., className).
You can perform object-based search in the dedicated tabs (LPR, FR) using the panel on the right-hand side:
Search interval: start and end of the search period
Plate: enter full or partial value
Attributes: one or more attributes to search for
Tag: Macula tag, if configured
If there is no configured VA (video analytics) database, there will be no dedicated tab and you will be unable to perform value-based search.
Attribute search can be performed in two modes:
OR: use commas or spaces between attributes to include results that have ANY of the listed attributes
AND: use + between attributes to display results that have ALL attributes
For example, entering "black sedan" will display all black cars and all sedans, while searching for "black+sedan" will only output black sedans.