SAMPart3D: Segment Any Part in 3D Objects

🔧 Setup

Installation

Please refer to INSTALL.md.

Preparation for training

  1. Download the pretrained PTv3-object backbone from https://huggingface.co/yhyang-myron/SAMPart3D/tree/main (a download sketch follows this list).

  2. Data preprocessing

    We use Blender to render multi-view RGB and depth images of a 3D mesh (.glb). First, install Blender:

    wget https://download.blender.org/release/Blender4.0/blender-4.0.0-linux-x64.tar.xz
    tar -xf blender-4.0.0-linux-x64.tar.xz
    rm blender-4.0.0-linux-x64.tar.xz

    Then render RGB and depth:

    cd tools
    ${PATH_TO_BLENDER} -b -P blender_render_16views.py ${MESH_PATH} ${TYPES} ${OUTPUT_PATH}

    For example:

    blender-4.0.0-linux-x64/blender -b -P blender_render_16views.py mesh_root/knight.glb glb data_root/knight
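
For step 1, the checkpoint can also be fetched programmatically. A minimal sketch using the huggingface_hub client, assuming the weights file is published as ptv3-object.pth, matching the directory layout shown below (verify the exact filename on the Hugging Face page):

    # Download the pretrained PTv3-object backbone into ./ckpt.
    # The filename "ptv3-object.pth" is an assumption; check the repo page.
    from huggingface_hub import hf_hub_download  # pip install huggingface_hub

    ckpt_path = hf_hub_download(
        repo_id="yhyang-myron/SAMPart3D",
        filename="ptv3-object.pth",
        local_dir="ckpt",
    )
    print(f"Checkpoint saved to {ckpt_path}")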

🚀 Running SAMPart3D

1. Train

Set the rendering data_root, mesh_root, and backbone_weight_path in configs/sampart3d/sampart3d-trainmlp-render16views.py, following the layout below:

SAMPart3D
|-- ckpt
    |-- ptv3-object.pth
|-- mesh_root
    |-- knight.glb
|-- data_root
    |-- knight
        |-- meta.json
        |-- render_0000.webp
        |-- depth_0000.exr
        ...
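
The three config fields then point at this layout. A minimal sketch of the relevant entries (field names come from the step above; the values and exact syntax are assumptions, so match them to your actual paths):

    # configs/sampart3d/sampart3d-trainmlp-render16views.py (illustrative excerpt)
    backbone_weight_path = "ckpt/ptv3-object.pth"  # pretrained PTv3-object weights
    mesh_root = "mesh_root"                        # directory holding the .glb meshes
    data_root = "data_root"                        # Blender renderings (RGB/depth/meta)

Then launch training: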
export CUDA_VISIBLE_DEVICES=${CUDA_VISIBLE_DEVICES}
sh scripts/train.sh -g ${NUM_GPU} -d ${DATASET_NAME} -c ${CONFIG_NAME} -n ${EXP_NAME} -o ${OBJECT_UID}

For example:

sh scripts/train.sh -g 1 -d sampart3d -c sampart3d-trainmlp-render16views -n knight -o knight

The mesh segmentation results will be saved in exp/${DATASET_NAME}/${EXP_NAME}/results/, and visualizations of the point clouds and meshes will be saved in exp/${DATASET_NAME}/${EXP_NAME}/vis_pcd/.

2. Test more scales with pretrained MLPs

After training, the checkpoint for the target mesh is saved in exp/${DATASET_NAME}/${EXP_NAME}/model/. To try more scales, modify val_scales_list in exp/${DATASET_NAME}/${EXP_NAME}/config.py and load the saved weights directly.
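
A sketch of the scale edit, assuming val_scales_list is a plain Python list of scale values (the numbers below are purely illustrative, not the repo's defaults):

    # exp/${DATASET_NAME}/${EXP_NAME}/config.py (illustrative excerpt)
    # val_scales_list selects the granularity scales to evaluate;
    # these values are assumptions for illustration.
    val_scales_list = [0.0, 0.5, 1.0, 1.5, 2.0]

Then run the evaluation: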

export CUDA_VISIBLE_DEVICES=${CUDA_VISIBLE_DEVICES}
sh scripts/eval.sh -g ${NUM_GPU} -d ${DATASET_NAME} -n ${EXP_NAME} -w ${WEIGHT_NAME}

For example:

sh scripts/eval.sh -g 1 -d sampart3d -n knight -w 5000

3. Highlight 3D segments on multi-view renderings

Set render_dir, mesh_path, results_dir, and save_dir in tools/highlight_parts.py, then run:

python tools/highlight_parts.py
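
A sketch of those settings, reusing the knight example from above (save_dir is a hypothetical output location):

    # tools/highlight_parts.py (illustrative excerpt)
    render_dir = "data_root/knight"               # multi-view renderings from Blender
    mesh_path = "mesh_root/knight.glb"            # source mesh
    results_dir = "exp/sampart3d/knight/results"  # segmentation results from training
    save_dir = "exp/sampart3d/knight/highlight"   # hypothetical output directory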

📚 Dataset: PartObjaverse-Tiny

Please refer to PartObjaverse-Tiny.md.

Acknowledgement

SAMPart3D is inspired by the following repos: garfield, PointTransformerV3, Pointcept, FeatUp, dinov2, segment-anything, PartSLIP2.

Many thanks to the authors for sharing their code.

Citation

If you find SAMPart3D useful in your project, please cite our work. :)

@article{yang2024sampart3d,
  title={SAMPart3D: Segment Any Part in 3D Objects},
  author={Yang, Yunhan and Huang, Yukun and Guo, Yuan-Chen and Lu, Liangjun and Wu, Xiaoyang and Lam, Edmund Y and Cao, Yan-Pei and Liu, Xihui},
  journal={arXiv preprint arXiv:2411.07184},
  year={2024}
}
