Merge pull request #106 from aurelio-labs/james/16-release
chore: 16 release
jamescalam committed Jan 14, 2024
2 parents 311e909 + 4779030 commit b4ad4d7
Showing 8 changed files with 89 additions and 55 deletions.
28 changes: 27 additions & 1 deletion README.md
@@ -15,6 +15,8 @@

Semantic Router is a superfast decision-making layer for your LLMs and agents. Rather than waiting for slow LLM generations to make tool-use decisions, we use the magic of semantic vector space to make those decisions — _routing_ our requests using _semantic_ meaning.

---

## Quickstart

To get started with _semantic-router_ we install it like so:
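The README's own install command sits in the collapsed part of this diff. As a sketch, the pinned version matching this release (the same pin used in the notebooks changed below) would be:

```
!pip install -qU semantic-router==0.0.16
```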
@@ -114,4 +116,28 @@ rl("I'm interested in learning about llama 2").name

In this case, no decision could be made as we had no matches — so our route layer returned `None`!
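For readers skimming the diff, here is a minimal end-to-end sketch of the quickstart the README walks through, ending with the unmatched query above. The import paths and the choice of `OpenAIEncoder` are assumptions based on the library's documented interface; the notebooks linked below are authoritative.

```python
import os

from semantic_router import Route
from semantic_router.encoders import OpenAIEncoder  # assumed encoder choice
from semantic_router.layer import RouteLayer        # assumed import path

os.environ["OPENAI_API_KEY"] = "<YOUR_API_KEY>"  # placeholder

# Static routes are defined purely by example utterances.
chitchat = Route(
    name="chitchat",
    utterances=["how's the weather today?", "lovely weather we're having"],
)
politics = Route(
    name="politics",
    utterances=["don't you just love the president?", "public policy is fascinating"],
)

rl = RouteLayer(encoder=OpenAIEncoder(), routes=[chitchat, politics])

print(rl("how's the weather today?").name)  # -> 'chitchat'
# A query with no semantic match falls through and the layer returns None:
print(rl("I'm interested in learning about llama 2").name)  # -> None
```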

## 📚 [Resources](https://github.com/aurelio-labs/semantic-router/tree/main/docs)
---

## 📚 Resources

### Docs

| Notebook | Description |
| -------- | ----------- |
| [Introduction](https://github.com/aurelio-labs/semantic-router/blob/main/docs/00-introduction.ipynb) | Introduction to Semantic Router and static routes |
| [Dynamic Routes](https://github.com/aurelio-labs/semantic-router/blob/main/docs/02-dynamic-routes.ipynb) | Dynamic routes for parameter generation and function calls |
| [Save/Load Layers](https://github.com/aurelio-labs/semantic-router/blob/main/docs/01-save-load-from-file.ipynb) | How to save and load `RouteLayer` from file |
| [Local Execution](https://github.com/aurelio-labs/semantic-router/blob/main/docs/05-local-execution.ipynb) | Fully local Semantic Router with dynamic routes — *local models such as Mistral 7B outperform GPT-3.5 in most tests* |
| [LangChain Integration](https://github.com/aurelio-labs/semantic-router/blob/main/docs/03-basic-langchain-agent.ipynb) | How to integrate Semantic Router with LangChain Agents |

### Online Course

**COMING SOON**

### Community

Julian Horsey, [Semantic Router superfast decision layer for LLMs and AI agents](https://www.geeky-gadgets.com/semantic-router-superfast-decision-layer-for-llms-and-ai-agents/), Geeky Gadgets

azhar, [Beyond Basic Chatbots: How Semantic Router is Changing the Game](https://medium.com/ai-insights-cobet/beyond-basic-chatbots-how-semantic-router-is-changing-the-game-783dd959a32d), AI Insights @ Medium

Daniel Avila, [Semantic Router: Enhancing Control in LLM Conversations](https://blog.codegpt.co/semantic-router-enhancing-control-in-llm-conversations-68ce905c8d33), CodeGPT @ Medium
2 changes: 1 addition & 1 deletion docs/00-introduction.ipynb
@@ -41,7 +41,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install -qU semantic-router==0.0.15"
"!pip install -qU semantic-router==0.0.16"
]
},
{
2 changes: 1 addition & 1 deletion docs/01-save-load-from-file.ipynb
@@ -36,7 +36,7 @@
"metadata": {},
"outputs": [],
"source": [
"!pip install -qU semantic-router==0.0.15"
"!pip install -qU semantic-router==0.0.16"
]
},
{
88 changes: 45 additions & 43 deletions docs/02-dynamic-routes.ipynb
@@ -26,7 +26,9 @@
"source": [
"In semantic-router there are two types of routes that can be chosen. Both routes belong to the `Route` object, the only difference between them is that _static_ routes return a `Route.name` when chosen, whereas _dynamic_ routes use an LLM call to produce parameter input values.\n",
"\n",
"For example, a _static_ route will tell us if a query is talking about mathematics by returning the route name (which could be `\"math\"` for example). A _dynamic_ route can generate additional values, so it may decide a query is talking about maths, but it can also generate Python code that we can later execute to answer the user's query, this output may look like `\"math\", \"import math; output = math.sqrt(64)`."
"For example, a _static_ route will tell us if a query is talking about mathematics by returning the route name (which could be `\"math\"` for example). A _dynamic_ route can generate additional values, so it may decide a query is talking about maths, but it can also generate Python code that we can later execute to answer the user's query, this output may look like `\"math\", \"import math; output = math.sqrt(64)`.\n",
"\n",
"***⚠️ Note: We have a fully local version of dynamic routes available at [docs/05-local-execution.ipynb](https://github.com/aurelio-labs/semantic-router/blob/main/docs/05-local-execution.ipynb). The local 05 version tends to outperform the OpenAI version we demo in this notebook, so we'd recommend trying [05](https://github.com/aurelio-labs/semantic-router/blob/main/docs/05-local-execution.ipynb)!***"
]
},
{
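To make the static-versus-dynamic distinction in the markdown cell above concrete, here is a hedged sketch of the `get_time` dynamic route that the later cells in this diff exercise. The `get_schema` helper path and the `function_schema` keyword are assumptions inferred from this notebook's outputs; the rendered notebook is the authoritative reference.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

from semantic_router import Route
from semantic_router.utils.function_call import get_schema  # assumed helper path


def get_time(timezone: str) -> str:
    """Return the current HH:MM time in the given IANA timezone."""
    return datetime.now(ZoneInfo(timezone)).strftime("%H:%M")


# A dynamic route carries a function schema so the LLM can fill in parameters
# such as `timezone` from the user's query.
time_route = Route(
    name="get_time",
    utterances=[
        "what time is it in new york",
        "tell me the time in osaka",
    ],
    function_schema=get_schema(get_time),  # assumed keyword name
)
```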
@@ -46,7 +48,7 @@
},
"outputs": [],
"source": [
"!pip install -qU semantic-router==0.0.15"
"!pip install -qU semantic-router==0.0.16"
]
},
{
@@ -114,16 +116,16 @@
"cell_type": "code",
"execution_count": 3,
"metadata": {
"id": "BI9AiDspur0y",
"outputId": "27329a54-3f16-44a5-ac20-13a6b26afb97",
"colab": {
"base_uri": "https://localhost:8080/"
}
},
"id": "BI9AiDspur0y",
"outputId": "27329a54-3f16-44a5-ac20-13a6b26afb97"
},
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"output_type": "stream",
"text": [
"\u001b[32m2024-01-08 11:12:24 INFO semantic_router.utils.logger Initializing RouteLayer\u001b[0m\n"
]
@@ -163,22 +165,22 @@
"cell_type": "code",
"execution_count": 4,
"metadata": {
"id": "_rNREh7gur0y",
"outputId": "f3a1dc0b-d760-4efb-b634-d3547011dcb7",
"colab": {
"base_uri": "https://localhost:8080/"
}
},
"id": "_rNREh7gur0y",
"outputId": "f3a1dc0b-d760-4efb-b634-d3547011dcb7"
},
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"RouteChoice(name='chitchat', function_call=None)"
]
},
"execution_count": 4,
"metadata": {},
"execution_count": 4
"output_type": "execute_result"
}
],
"source": [
@@ -233,26 +235,26 @@
"cell_type": "code",
"execution_count": 6,
"metadata": {
"id": "YyFKV8jMur0z",
"outputId": "29cf80f4-552c-47bb-fbf9-019f5dfdf00a",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 35
}
},
"id": "YyFKV8jMur0z",
"outputId": "29cf80f4-552c-47bb-fbf9-019f5dfdf00a"
},
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"'06:13'"
],
"application/vnd.google.colaboratory.intrinsic+json": {
"type": "string"
}
},
"text/plain": [
"'06:13'"
]
},
"execution_count": 6,
"metadata": {},
"execution_count": 6
"output_type": "execute_result"
}
],
"source": [
@@ -272,15 +274,14 @@
"cell_type": "code",
"execution_count": 7,
"metadata": {
"id": "tOjuhp5Xur0z",
"outputId": "ca88a3ea-d70a-4950-be9a-63fab699de3b",
"colab": {
"base_uri": "https://localhost:8080/"
}
},
"id": "tOjuhp5Xur0z",
"outputId": "ca88a3ea-d70a-4950-be9a-63fab699de3b"
},
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"{'name': 'get_time',\n",
@@ -289,8 +290,9 @@
" 'output': \"<class 'str'>\"}"
]
},
"execution_count": 7,
"metadata": {},
"execution_count": 7
"output_type": "execute_result"
}
],
"source": [
@@ -341,16 +343,16 @@
"cell_type": "code",
"execution_count": 9,
"metadata": {
"id": "-0vY8PRXur0z",
"outputId": "db01e14c-eab3-4f93-f4c2-e30f508c8b5d",
"colab": {
"base_uri": "https://localhost:8080/"
}
},
"id": "-0vY8PRXur0z",
"outputId": "db01e14c-eab3-4f93-f4c2-e30f508c8b5d"
},
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"output_type": "stream",
"text": [
"\u001b[32m2024-01-08 11:15:26 INFO semantic_router.utils.logger Adding `get_time` route\u001b[0m\n"
]
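The log line above records the dynamic route being added to the layer. Continuing the sketch from earlier, a usage example might look like the following; `rl.add(...)` and the `function_call` field are inferred from the outputs shown in this diff, so treat the exact signatures as assumptions.

```python
rl.add(time_route)  # expected log: "Adding `get_time` route"

out = rl("what is the time in new york city?")
print(out.name)           # -> 'get_time'
print(out.function_call)  # e.g. {'timezone': 'America/New_York'}

# The extracted arguments can be passed straight to the target function:
print(get_time(**out.function_call))  # e.g. '06:16'
```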
@@ -373,33 +375,33 @@
"cell_type": "code",
"execution_count": 11,
"metadata": {
"id": "Wfb68M0-ur0z",
"outputId": "79923883-2a4d-4744-f8ce-e818cb5f14c3",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 53
}
},
"id": "Wfb68M0-ur0z",
"outputId": "79923883-2a4d-4744-f8ce-e818cb5f14c3"
},
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"output_type": "stream",
"text": [
"\u001b[32m2024-01-08 11:16:24 INFO semantic_router.utils.logger Extracting function input...\u001b[0m\n"
]
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"'06:16'"
],
"application/vnd.google.colaboratory.intrinsic+json": {
"type": "string"
}
},
"text/plain": [
"'06:16'"
]
},
"execution_count": 11,
"metadata": {},
"execution_count": 11
"output_type": "execute_result"
}
],
"source": [
@@ -427,6 +429,9 @@
}
],
"metadata": {
"colab": {
"provenance": []
},
"kernelspec": {
"display_name": "decision-layer",
"language": "python",
@@ -443,11 +448,8 @@
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.5"
},
"colab": {
"provenance": []
}
},
"nbformat": 4,
"nbformat_minor": 0
}
}
2 changes: 1 addition & 1 deletion docs/03-basic-langchain-agent.ipynb
@@ -78,7 +78,7 @@
],
"source": [
"!pip install -qU \\\n",
" semantic-router==0.0.15 \\\n",
" semantic-router==0.0.16 \\\n",
" langchain==0.0.352 \\\n",
" openai==1.6.1"
]
5 changes: 0 additions & 5 deletions docs/04-chat-history.ipynb
@@ -21,11 +21,6 @@
"Applying semantic-router to the most recent interaction in a conversation can work for many cases but it misses scenarios where information provided in the latest interaction."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": []
},
{
"cell_type": "code",
"execution_count": 1,
12 changes: 11 additions & 1 deletion docs/05-local-execution.ipynb
@@ -1,11 +1,21 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "e92c26d9",
"metadata": {},
"source": [
"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/aurelio-labs/semantic-router/blob/main/docs/05-local-execution.ipynb) [![Open nbviewer](https://raw.githubusercontent.com/pinecone-io/examples/master/assets/nbviewer-shield.svg)](https://nbviewer.org/github/aurelio-labs/semantic-router/blob/main/docs/05-local-execution.ipynb)"
]
},
{
"cell_type": "markdown",
"id": "ee50410e-3f98-4d9c-8838-b38aebd6ce77",
"metadata": {},
"source": [
"# Local execution with `llama.cpp` and HuggingFace Encoder\n",
"# Local Dynamic Routes\n",
"\n",
"## Fully local Semantic Router with `llama.cpp` and HuggingFace Encoder\n",
"\n",
"There are many reasons users might choose to roll their own LLMs rather than use a third-party service. Whether it's due to cost, privacy or compliance, Semantic Router supports the use of \"local\" LLMs through `llama.cpp`.\n",
"\n",
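A heavily hedged sketch of the fully local setup this notebook introduces. The `HuggingFaceEncoder` and `LlamaCppLLM` class names, their parameters, and the GGUF model path are assumptions inferred from the notebook title; docs/05-local-execution.ipynb is the authoritative reference.

```python
from llama_cpp import Llama

from semantic_router import Route
from semantic_router.encoders import HuggingFaceEncoder   # assumed encoder class
from semantic_router.layer import RouteLayer
from semantic_router.llms.llamacpp import LlamaCppLLM     # assumed wrapper path

# Local embedding model in place of a hosted encoder API.
encoder = HuggingFaceEncoder()

# Local GGUF model served through llama.cpp (model path is a placeholder).
_llama = Llama(model_path="./mistral-7b-instruct-v0.2.Q4_0.gguf", n_ctx=2048)
llm = LlamaCppLLM(name="Mistral-7B-Instruct-v0.2", llm=_llama, max_tokens=None)

chitchat = Route(name="chitchat", utterances=["how's the weather today?"])
rl = RouteLayer(encoder=encoder, routes=[chitchat], llm=llm)

print(rl("lovely weather we're having").name)  # -> 'chitchat'
```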
5 changes: 3 additions & 2 deletions pyproject.toml
@@ -1,14 +1,15 @@
[tool.poetry]
name = "semantic-router"
version = "0.0.15"
version = "0.0.16"
description = "Super fast semantic router for AI decision making"
authors = [
"James Briggs <[email protected]>",
"Siraj Aizlewood <[email protected]>",
"Simonas Jakubonis <[email protected]>",
"Luca Mannini <[email protected]>",
"Bogdan Buduroiu <[email protected]>",
"Ismail Ashraq <[email protected]>"
"Ismail Ashraq <[email protected]>",
"Daniel Griffin <[email protected]>"
]
readme = "README.md"
packages = [{include = "semantic_router"}]
