A tool that automatically converts Hugging Face Spaces, ModelScope Studios, and Gradio ChatBot apps into free APIs. It supports virtually any space that exposes a chatbot, and currently works out of the box with many model spaces such as GPT4Free, ChatGPT, Llama 2, Vicuna, MPT-30B, Falcon, ChatGLM, Qwen, and more.
English | 中文
Online Demo: https://weaigc.github.io/gradio-chatbot
Due to the current high demand on the ChatGPT space on Hugging Face, responses are noticeably delayed. If you have your own ChatGPT account, it is recommended to use gpt-web instead.
- Experience a free ChatGPT.
```bash
npx gradio-chatbot
# or
npm install -g gradio-chatbot
# call the default model
chatbot
```
- Experience Llama2.
```bash
chatbot 2
# or
chatbot https://huggingface.co/spaces/huggingface-projects/llama-2-13b-chat
```
For more usage information, type:

```bash
chatbot help
```
You can also build and run the server with Docker:

```bash
docker build . -t gradio-server
docker run --rm -it -p 8000:8000 gradio-server
```
You can use npm or yarn to install gradio-chatbot. Node version 18 or higher is required.
```bash
npm install gradio-chatbot
# or
yarn add gradio-chatbot
```
Three usage modes are currently supported.
Refer to Quickstart.
For ease of use, two forms of interface are provided:

- Streaming output: simply request http://localhost:8000/api/conversation?model=0&text=hello (a minimal client sketch follows this list).
- Non-streaming output: the calling convention is the same as the ChatGPT API; see the OpenAI-compatible examples further below.
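As a rough illustration of the streaming form, here is a minimal client sketch for Node.js 18+ (which provides a global fetch). The endpoint and query parameters come from the URL above; how the server frames its chunks is not specified here, so the sketch simply prints the raw text as it arrives.

```ts
// Minimal sketch: consume the streaming endpoint and print raw text chunks.
// Assumes the API server from this README is running on localhost:8000.
async function streamConversation(text: string): Promise<void> {
  const url = `http://localhost:8000/api/conversation?model=0&text=${encodeURIComponent(text)}`;
  const res = await fetch(url);
  if (!res.ok || !res.body) {
    throw new Error(`Request failed with status ${res.status}`);
  }

  const reader = res.body.getReader();
  const decoder = new TextDecoder();

  // Print each chunk as soon as it arrives.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    process.stdout.write(decoder.decode(value, { stream: true }));
  }
}

streamConversation('hello').catch(console.error);
```

The NPM module can also be called directly from your own code; the following is an example of such a call.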
```ts
import { GradioChatBot } from 'gradio-chatbot';

const bot = new GradioChatBot();

async function start() {
  const message = await bot.chat('hello', {
    onMessage(partialMsg) {
      console.log('stream output:', partialMsg);
    }
  });
  console.log('message', message);
}

start();
```
You can also pass the URL of the space you want to convert, such as https://huggingface.co/spaces/h2oai/h2ogpt-chatbot.
```ts
import { GradioChatBot } from 'gradio-chatbot';

// Use a custom ChatBot space
const bot = new GradioChatBot({
  url: 'https://huggingface.co/spaces/h2oai/h2ogpt-chatbot',
  fnIndex: 35,
});

async function start() {
  console.log(await bot.chat('Hello'));
}

start();
```
In addition, the NPM package has built-in support for 10 popular spaces from Hugging Face Spaces and ModelScope Studios, which you can access directly by model index. Please refer to the Model List for details.
```ts
import { GradioChatBot } from 'gradio-chatbot';

const bot = new GradioChatBot('1');

async function start() {
  console.log(await bot.chat('Tell me about ravens.'));
}

start();
```
For more examples, please see the Examples directory.
Note: Some models on Hugging Face may collect the information you input. If you have data-security concerns, it is recommended not to use them; running a self-hosted model is a better choice.
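To illustrate the self-hosted option, here is a hedged sketch that points the NPM module at a locally running Gradio chat app instead of a public space. The URL http://127.0.0.1:7860 (Gradio's default local port) is an assumption; adjust it to your own deployment, and note that compatibility depends on the app exposing a chatbot component.

```ts
import { GradioChatBot } from 'gradio-chatbot';

// Point the bot at a locally hosted Gradio chat app rather than a public space.
// http://127.0.0.1:7860 is Gradio's default local port (assumption); change as needed.
const bot = new GradioChatBot({
  url: 'http://127.0.0.1:7860',
});

async function start() {
  console.log(await bot.chat('Hello'));
}

start();
```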
The API server can also be called with the official OpenAI SDKs. For example, in Python:

```python
import openai

openai.api_key = "dummy"
openai.api_base = "http://127.0.0.1:8080/v1"

# create a chat completion
chat_completion = openai.ChatCompletion.create(model="10", messages=[{"role": "user", "content": "Hello"}])

# print the completion
print(chat_completion.choices[0].message.content)
```
For more usage instructions, please refer to https://github.com/openai/openai-python
The same server can be used from Node.js:

```ts
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'dummy',
  baseURL: 'http://127.0.0.1:8080/v1',
});

async function main() {
  const stream = await openai.chat.completions.create({
    model: '10',
    messages: [{ role: 'user', content: 'Hello' }],
    stream: true,
  });
  for await (const part of stream) {
    process.stdout.write(part.choices[0]?.delta?.content || '');
  }
}

main();
```
For more usage instructions, please refer to https://github.com/openai/openai-node
See the API Documentation.
Contributions of more useful models are welcome; please suggest them in the issue section.
See CHANGELOG.md
- This package supports `node >= 18`.
- Huge thanks to @gradio/client
- OpenAI for creating ChatGPT 🔥
Apache 2.0 © LICENSE.