
Otoroshi LLM Extension in action - Cloud APIM Serverless

This API demonstrates how to use the Otoroshi LLM Extension in a Cloud APIM Serverless project. The Otoroshi LLM Extension provides tools and plugins that make it easy to use LLMs in your projects, no matter which provider you use (on-premise, cloud, OpenAI, Anthropic, Mistral, etc.).

The API hosting this demo is located at

https://${environment.HOST}

Source code

The source code for this project is available on Cloud APIM's GitHub. You can fork it and modify it for your own needs.

Demo steps

The project is organized around several steps, one for each possible use case (a non-exhaustive list). The first two steps show how to use Otoroshi to consume an LLM API and expose it to your applications in a consistent way, no matter which provider is used. The remaining steps show how an LLM API can help you configure your API gateway using natural language.
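As a sketch of the idea behind the first two steps: because Otoroshi exposes every backing provider through one consistent API shape, a consumer application only ever builds a single payload format. The helper name, model name, and endpoint shape below are illustrative assumptions, not the project's actual routes.

```python
import json

def build_chat_request(prompt, model="gpt-4o-mini"):
    """Build an OpenAI-style chat completion payload.

    Since the gateway presents one consistent API regardless of the
    provider behind it (OpenAI, Anthropic, Mistral, ...), this is the
    only request format a consumer application needs to produce.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("Hello, who are you?")
print(json.dumps(payload))
# The consumer would then POST this payload to the route exposed by
# Otoroshi, e.g. https://<your-route-domain>/chat (hypothetical path).
```

Switching providers is then a gateway configuration change, not an application change.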

API Spec

The OpenAPI spec of this API is available here, and a viewer for it is available here.

Conference

This project is based on a talk given by Mathieu at Breizhcamp 2024.

Otoroshi managed instances

If you want to achieve the same thing with your own Otoroshi instance (on Cloud APIM or on-premise), you can load the following configuration file in the Otoroshi resource loader. Do not forget to change the domain names in the routes to match yours, and to add an OpenAI token in the local vault, named openai-token.
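As a minimal, hedged sketch of what such a route could look like (the entity fields, domain, and plugin identifier below are assumptions for illustration; refer to the actual configuration file on Cloud APIM's GitHub for the real one), note how the secret is referenced from the local vault rather than inlined:

```json
{
  "kind": "Route",
  "name": "llm-demo-route",
  "frontend": { "domains": ["llm.your-domain.example"] },
  "backend": {
    "targets": [{ "hostname": "api.openai.com", "port": 443, "tls": true }]
  },
  "plugins": [
    {
      "plugin": "cp:your.llm.proxy.PluginId",
      "config": { "token": "${vault://local/openai-token}" }
    }
  ]
}
```

Keeping the token behind a `${vault://local/openai-token}` reference means the configuration file can be shared or versioned without leaking the secret.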