2D Hand Keypoint Demo

We provide a demo script to test a single image or video with hand detectors and top-down pose estimators. We assume that you have already installed mmdet with version >= 3.0.
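As a quick sanity check of the environment (a minimal sketch, assuming both packages are importable in your current Python environment):

```python
# Minimal environment check: both packages should import, and mmdet should be >= 3.0.
import mmdet
import mmpose

print('mmdet:', mmdet.__version__)    # expected >= 3.0
print('mmpose:', mmpose.__version__)
```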

Hand Box Model Preparation: The pre-trained hand box estimation model can be found in the mmdet model zoo.

2D Hand Image Demo

python demo/topdown_demo_with_mmdet.py \
    ${MMDET_CONFIG_FILE} ${MMDET_CHECKPOINT_FILE} \
    ${MMPOSE_CONFIG_FILE} ${MMPOSE_CHECKPOINT_FILE} \
    --input ${INPUT_PATH} [--output-root ${OUTPUT_DIR}] \
    [--show] [--device ${GPU_ID or CPU}] [--save-predictions] \
    [--draw-heatmap ${DRAW_HEATMAP}] [--radius ${KPT_RADIUS}] \
    [--kpt-thr ${KPT_SCORE_THR}] [--bbox-thr ${BBOX_SCORE_THR}]

The pre-trained hand pose estimation model can be downloaded from the model zoo. Take the onehand10k model as an example:

python demo/topdown_demo_with_mmdet.py \
    demo/mmdetection_cfg/rtmdet_nano_320-8xb32_hand.py \
    https://download.openmmlab.com/mmpose/v1/projects/rtmposev1/rtmdet_nano_8xb32-300e_hand-267f9c8f.pth \
    configs/hand_2d_keypoint/rtmpose/hand5/rtmpose-m_8xb256-210e_hand5-256x256.py \
    https://download.openmmlab.com/mmpose/v1/projects/rtmposev1/rtmpose-m_simcc-hand5_pt-aic-coco_210e-256x256-74fb594_20230320.pth \
    --input tests/data/onehand10k/9.jpg \
    --show --draw-heatmap

Visualization result:


If you use a heatmap-based model and set the argument --draw-heatmap, the predicted heatmap will be visualized together with the keypoints.

To save visualized results on disk:

python demo/topdown_demo_with_mmdet.py \
    demo/mmdetection_cfg/rtmdet_nano_320-8xb32_hand.py \
    https://download.openmmlab.com/mmpose/v1/projects/rtmposev1/rtmdet_nano_8xb32-300e_hand-267f9c8f.pth \
    configs/hand_2d_keypoint/rtmpose/hand5/rtmpose-m_8xb256-210e_hand5-256x256.py \
    https://download.openmmlab.com/mmpose/v1/projects/rtmposev1/rtmpose-m_simcc-hand5_pt-aic-coco_210e-256x256-74fb594_20230320.pth \
    --input tests/data/onehand10k/9.jpg \
    --output-root vis_results --show --draw-heatmap

To save the predicted results on disk, please specify --save-predictions.
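If you want to post-process the saved predictions, a minimal sketch for loading them is shown below. The output filename (here assumed to be vis_results/results_9.json for the image above) and the exact JSON schema depend on your mmpose version, so inspect the file before relying on specific keys:

```python
# A minimal sketch for loading predictions saved with --save-predictions.
# The filename 'vis_results/results_9.json' is an assumed example; check your
# --output-root directory for the actual file produced for your input.
import json

with open('vis_results/results_9.json') as f:
    preds = json.load(f)

# Print the top-level structure first, since the schema may differ across versions.
if isinstance(preds, dict):
    print('keys:', list(preds.keys()))
else:
    print('number of entries:', len(preds))
```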

To run demos on CPU:

python demo/topdown_demo_with_mmdet.py \
    demo/mmdetection_cfg/rtmdet_nano_320-8xb32_hand.py \
    https://download.openmmlab.com/mmpose/v1/projects/rtmposev1/rtmdet_nano_8xb32-300e_hand-267f9c8f.pth \
    configs/hand_2d_keypoint/rtmpose/hand5/rtmpose-m_8xb256-210e_hand5-256x256.py \
    https://download.openmmlab.com/mmpose/v1/projects/rtmposev1/rtmpose-m_simcc-hand5_pt-aic-coco_210e-256x256-74fb594_20230320.pth \
    --input tests/data/onehand10k/9.jpg \
    --show --draw-heatmap  --device cpu

2D Hand Keypoints Video Demo

Videos share the same interface as images. The only difference is that the ${INPUT_PATH} for videos can be either a local path or a URL link to a video file.

python demo/topdown_demo_with_mmdet.py \
    demo/mmdetection_cfg/rtmdet_nano_320-8xb32_hand.py \
    https://download.openmmlab.com/mmpose/v1/projects/rtmposev1/rtmdet_nano_8xb32-300e_hand-267f9c8f.pth \
    configs/hand_2d_keypoint/rtmpose/hand5/rtmpose-m_8xb256-210e_hand5-256x256.py \
    https://download.openmmlab.com/mmpose/v1/projects/rtmposev1/rtmpose-m_simcc-hand5_pt-aic-coco_210e-256x256-74fb594_20230320.pth \
    --input data/tests_data_nvgesture_sk_color.avi \
    --output-root vis_results --kpt-thr 0.1


The original video can be downloaded from GitHub.

2D Hand Keypoints Demo with Inferencer

The Inferencer provides a convenient interface for inference, allowing customization using model aliases instead of configuration files and checkpoint paths. It supports various input formats, including image paths, video paths, image folder paths, and webcams. Below is an example command:

python demo/inferencer_demo.py tests/data/onehand10k \
    --pose2d hand --vis-out-dir vis_results/onehand10k \
    --bbox-thr 0.5 --kpt-thr 0.05

This command infers all images located in tests/data/onehand10k and saves the visualization results in the vis_results/onehand10k directory.


In addition, the Inferencer supports saving predicted poses. For more information, please refer to the inferencer document.
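If you prefer calling the Inferencer from Python rather than the command line, a minimal sketch (assuming mmpose >= 1.0, where MMPoseInferencer accepts the same 'hand' alias) looks like this:

```python
# A minimal sketch of the Inferencer Python API; assumes mmpose >= 1.0.
from mmpose.apis import MMPoseInferencer

# 'hand' is the same model alias passed to --pose2d on the command line.
inferencer = MMPoseInferencer(pose2d='hand')

# The call returns a generator; iterating over it runs inference on each image
# in the folder and writes visualizations to vis_out_dir.
result_generator = inferencer('tests/data/onehand10k',
                              vis_out_dir='vis_results/onehand10k')
for result in result_generator:
    print(result['predictions'])  # per-image list of predicted instances
```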

Speed Up Inference

For 2D hand keypoint estimation models, you can speed up inference by disabling test-time flip augmentation in the config file. For example, set model.test_cfg.flip_test=False in onehand10k_hrnetv2.
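A minimal sketch of the corresponding edit (mmpose configs are Python files; only the test_cfg override is shown, assuming the rest of the model definition is inherited from a base config and merged):

```python
# In the pose estimator config (e.g. the onehand10k_hrnetv2 config file),
# disable test-time flip augmentation to skip the second (flipped) forward pass.
# If your config does not inherit from a base config, edit flip_test in the
# existing model.test_cfg dict directly instead of adding a new one.
model = dict(
    test_cfg=dict(
        flip_test=False,
    ))
```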