CustomLLM config to leverage watsonx LLMs with continue.dev.
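To illustrate what a custom-LLM hookup like this involves, here is a minimal TypeScript sketch of a provider object that streams completions from a stubbed watsonx.ai call. The interface and field names are assumptions for illustration only, not continue.dev's actual config API, and `callWatsonx` is a hypothetical stand-in for a real watsonx.ai text-generation request.

```typescript
// Illustrative shape of a custom LLM provider. The interface and field
// names here are assumptions, not continue.dev's real API surface.
interface CustomLLM {
  title: string;
  model: string;
  streamCompletion(prompt: string): AsyncGenerator<string>;
}

// Hypothetical stub standing in for an actual watsonx.ai API call.
async function callWatsonx(prompt: string): Promise<string[]> {
  return ["Hello", " from", " watsonx"];
}

const watsonxLLM: CustomLLM = {
  title: "watsonx",
  model: "ibm/granite-13b-chat-v2", // example model id
  async *streamCompletion(prompt: string) {
    // Yield each chunk back to the editor as it arrives.
    for (const chunk of await callWatsonx(prompt)) {
      yield chunk;
    }
  },
};

// Usage: collect the streamed chunks into a full completion.
async function main() {
  let out = "";
  for await (const chunk of watsonxLLM.streamCompletion("Say hi")) {
    out += chunk;
  }
  console.log(out); // prints "Hello from watsonx"
}
main();
```

The async-generator shape matters: the editor can render tokens as they stream in rather than waiting for the whole completion.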
Deploying watsonx.ai on AWS for AI Assistant Solutions
A simple, unified NPM-based interface for interacting with multiple Large Language Model (LLM) APIs, including OpenAI, AI21 Studio, Anthropic, Cloudflare AI, Cohere, DeepInfra, Fireworks AI, Friendli AI, Google Gemini, Goose AI, Groq, Hugging Face, Mistral AI, Monster API, Octo AI, Perplexity, Reka AI, watsonx.ai, and LLaMA.cpp.
A news scraper and sentiment analyzer powered by watsonx.ai
FastAPI integration with watsonx Assistant as a custom extension