cpp Backend

Overview

This service utilizes llama.cpp and whisper.cpp to provide an API compatible with OpenAI's endpoints. It enables you to run large language models and speech recognition models locally, offering features like text completion and audio transcription through familiar API calls.

Setup instructions

  1. Clone the repository
git clone https://github.com/EmanuelJr/cpp-backend.git
cd cpp-backend
  2. Set up your environment

This script creates a virtual environment, figures out the build type, and installs the required dependencies.

./setup.sh
  3. Set your API key (optional)

To protect your API, create an API key and set it as an environment variable:

export API_KEY=your-api-key-here
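
Clients then need to present this key on each request. The scheme the server expects is not documented here, so the snippet below is only a sketch assuming a conventional OpenAI-style bearer token; confirm the exact header in the Swagger docs once the server is running.

# Sketch: assumes a Bearer-token scheme (verify against the API docs)
curl http://127.0.0.1:8000/health \
  -H "Authorization: Bearer $API_KEY"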

Usage

Running the server

  1. Start the server
python3 run.py

By default, the server runs on http://127.0.0.1:8000.

You can also pass a host and port:

python3 run.py 127.0.0.1:8080
  2. Verify the server is running

Visit http://127.0.0.1:8000/health in your browser or use curl:

curl http://127.0.0.1:8000/health

You should receive a response indicating the server is operational.

API Endpoints

Interactive API documentation (Swagger UI) is available at http://127.0.0.1:8000/docs.
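
Because the API mirrors OpenAI's, requests shaped like OpenAI's should work. The snippets below are sketches only: the /v1/... paths, the model field values, and the bearer-token header are assumptions based on OpenAI's conventions, so verify the exact routes and parameters in the Swagger docs.

# Chat completion (text generation via llama.cpp); path and model name assumed from OpenAI's API
curl http://127.0.0.1:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{
    "model": "your-model-name",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

# Audio transcription (via whisper.cpp); path and form fields assumed from OpenAI's API
curl http://127.0.0.1:8000/v1/audio/transcriptions \
  -H "Authorization: Bearer $API_KEY" \
  -F file=@sample.wav \
  -F model=your-whisper-model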

Acknowledgments
