SAF apps
There are two ways of creating applications using SAF: 1) Deploying a pipeline via the JSON API and 2) Creating custom C++ applications.
The JSON API allows users to run arbitrary DAGs of cameras and operators using a JSON specification. The pipeline is specified with a pipeline_name and a list of operators. An operator is specified by its operator_name, its operator_type, a map of the operator's parameters, and the names of its inputs. The parameters that an operator takes can be found in the Create(const FactoryParamsType& params) method of the operator subclasses. An input name should have the form "<src operator name>:<src sink name>". If the source sink name is output (the default for many operators), you can omit ":<src sink name>" and just list the source operator name under inputs.
Here is an example JSON spec:
{
  "pipeline_name": "FrameWriterExample",
  "operators": [{
      "operator_name": "MyCamera",
      "operator_type": "Camera",
      "parameters": {
        "camera_name": "GST_TEST"
      }
    },
    {
      "operator_name": "MyWriter",
      "operator_type": "FrameWriter",
      "parameters": {
        "format": "binary",
        "output_dir": "/tmp/frame_writer/"
      },
      "inputs": {
        "input": "MyCamera:output"
      }
    }
  ]
}
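Because the camera operator's sink in this example uses the default name output, the writer's inputs entry could equivalently use the shorthand described above (shown here only to illustrate that rule):

"inputs": {
  "input": "MyCamera"
}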
Deploy the pipeline with the pipeline app:
/path/to/apps/pipeline/pipeline -p /path/to/pipelines/pipeline.json
This program takes an optional -n/--dry-run flag that will load the pipeline, but not run it. This mode is useful for testing to make sure that the JSON pipeline file is well-formed and that SAF is able to instantiate all of the operators.
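For example, to validate the pipeline file above without running it:

/path/to/apps/pipeline/pipeline -p /path/to/pipelines/pipeline.json --dry-run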
If you build SAF with the -DUSE_GRAPHVIZ flag to CMake, then the pipeline app will support the optional -g/--graph flag. This will use GraphViz to visualize the pipeline as a graph (DAG) in an OpenCV window.
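For example (assuming the CMake option is enabled with =ON; check the project's CMakeLists.txt for the exact form):

cmake -DUSE_GRAPHVIZ=ON /path/to/saf
/path/to/apps/pipeline/pipeline -p /path/to/pipelines/pipeline.json -g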
Applications can also be written in C++ by adding a directory to /path/to/saf/apps/. The existing applications, whose command-line usage is listed below, can be used as examples; a rough sketch of a custom app follows the listings. Building a custom application can be useful when there is processing to be done outside of an operator.
/path/to/apps/simple
    --camera <camera-id>
    [--display]

/path/to/apps/classifier
    --camera <camera-id>
    [--display]
    -m <model-id>

/path/to/apps/detector
    --camera <camera-id>
    --detector_type [opencv-face|mtcnn-face|yolo|frcnn|ssd|ncs-yolo]
    -m <model-id>
    --detector_confidence_threshold <number>
    --detector_targets <csv-labels>
    [--display]

/path/to/apps/tracker
    -c <camera-id>
    --detector_type [opencv-face|mtcnn-face|yolo|frcnn|ssd|ncs-yolo]
    -m <model-id>
    --detector_confidence_threshold <number>
    --detector_targets <csv-labels>
    [--display]
    --tracker_type [dlib]
    --detector_idle_duration <secs>
    --tracker_calibration_duration <secs>
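Below is a minimal sketch of what such a custom app might look like. The specific header and API names used here (saf.h, CameraManager::GetInstance(), GetCamera(), GetStream(), Subscribe(), PopFrame(), the "original_image" frame field) are assumptions modeled on the bundled apps, not a verified interface; consult /path/to/saf/apps/simple for the actual calls. The point is simply that frames can be pulled from a camera's stream and processed directly in main(), outside of any operator.

// Sketch of a custom SAF app: read frames from a camera and process them
// in main() rather than inside an operator. API names are assumptions
// modeled on the bundled apps (see apps/simple for the real interface).
#include <cstdlib>
#include <iostream>
#include <string>

#include <opencv2/core/core.hpp>

#include "saf.h"  // assumed umbrella header used by the bundled apps

int main(int argc, char* argv[]) {
  if (argc < 2) {
    std::cerr << "usage: " << argv[0] << " <camera-id>" << std::endl;
    return EXIT_FAILURE;
  }
  std::string camera_name = argv[1];

  // The bundled apps also perform global initialization (logging, GStreamer,
  // SAF context) before using any camera; that is omitted here for brevity.

  // Look up the camera by name (assumed CameraManager singleton).
  auto camera = CameraManager::GetInstance().GetCamera(camera_name);
  camera->Start();

  // Subscribe to the camera's output stream and pop frames directly;
  // this is the "processing outside of an operator" case.
  auto reader = camera->GetStream()->Subscribe();
  for (int i = 0; i < 100; ++i) {
    auto frame = reader->PopFrame();
    if (frame == nullptr) continue;
    // "original_image" is the frame field the bundled apps read (assumption).
    const cv::Mat& image = frame->GetValue<cv::Mat>("original_image");
    std::cout << "frame " << i << ": " << image.cols << "x" << image.rows
              << std::endl;
  }

  reader->UnSubscribe();
  camera->Stop();
  return EXIT_SUCCESS;
}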