
Brusta logo

Brusta

  • Language-agnostic PyTorch model serving
  • Serve JIT-compiled PyTorch models in a production environment

Requirements

  • docker == 18.09.1
  • wget == 1.20.1
  • your JIT-traced PyTorch model (if you are not familiar with JIT tracing, please refer to the JIT Tutorial; a minimal tracing sketch follows this list)
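
The snippet below is a minimal, hypothetical sketch of producing a JIT-traced model; the TinyModel class, the 3-dimensional input, and the model.pt filename are illustrative assumptions, not part of Brusta itself.

```python
import torch
import torch.nn as nn

class TinyModel(nn.Module):
    """Toy model taking a 3-dimensional input, matching the request example below."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(3, 1)

    def forward(self, x):
        return self.fc(x)

model = TinyModel().eval()
example_input = torch.rand(1, 3)                # dummy input with the expected shape
traced = torch.jit.trace(model, example_input)  # record the forward pass as a TorchScript graph
traced.save("model.pt")                         # serialized artifact to serve
```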

Process Flow

  1. run the bridge

Request Example

Send a request to the model server as follows (assuming your input dimension is 3):

curl -X POST -d '{"input":[1.0, 1.0, 1.0]}' localhost:8080/model/predict
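
For reference, here is a sketch of an equivalent Python client, assuming the server is reachable at localhost:8080 and returns a JSON body; it uses the third-party requests library.

```python
import requests

# Same 3-dimensional input as the curl example above.
response = requests.post(
    "http://localhost:8080/model/predict",
    json={"input": [1.0, 1.0, 1.0]},
)
print(response.json())  # prediction returned by the model server
```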
