Triton Model Navigator

The NVIDIA Triton Inference Server provides a robust and configurable solution for deploying and managing AI models. The Triton Model Navigator automates the process of deploying models on the Triton Inference Server: it selects the most promising model format and configuration that match the provided constraints and helps optimize performance.
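
As a rough illustration of the workflow described above, the sketch below runs an optimization pass on a toy PyTorch model. It is a minimal sketch assuming the Python API available in recent upstream releases of Model Navigator (entry points such as nav.torch.optimize and nav.package.save); older, CLI-based releases expose similar functionality through the model-navigator command line instead, so treat the names and arguments here as assumptions and defer to the documentation below.

```python
# Minimal sketch (assumes a recent Model Navigator Python API; names such as
# nav.torch.optimize and nav.package.save may differ in older releases).
import torch
import model_navigator as nav

# Toy PyTorch model and a small set of sample inputs used for format
# conversion, correctness checks, and profiling.
model = torch.nn.Sequential(
    torch.nn.Linear(8, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 2),
)
dataloader = [torch.randn(1, 8) for _ in range(16)]

# Search over candidate model formats and configurations, profile them, and
# return a package describing the most promising candidates.
package = nav.torch.optimize(model=model, dataloader=dataloader)

# Persist the result so it can later be deployed to a Triton model repository.
nav.package.save(package, "linear.nav")
```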

Documentation
