CoffiFilter

☕️ CoffiFilter is a tool to help manage your LLM agents' outbound tool usage.

What is CoffiFilter?

  • A tool to help manage your LLM agents' outbound tool usage.
  • Track and manage the tools that your agents are using.
  • Switch tools on and off for your agents without having to redeploy them (see the sketch after this list).
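
Toggling a tool is just a matter of flipping its flag in Redis. Below is a minimal sketch using the redis-py client, assuming the key name is the same string the tool is registered under (e.g. "summarize_tool"); the exact key layout is an assumption, not documented behaviour.

import redis

# Connection details mirror the init() example further down.
r = redis.Redis(
    host="your-redishost.redis-cloud.com",
    port=11552,
    db=0,
    password="your-redispassword",
)

r.set("summarize_tool", "false")  # the agent can no longer call this tool
r.set("summarize_tool", "true")   # switch it back on, no redeploy needed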

Features

  • Connect to your own Redis server.
  • Drop-in decorator for LangChain functions.

Requirements

  • Python 3.6 or later
  • Redis server

Installation

You can install CoffiFilter via pip from PyPI:

$ pip install coffifilter

Usage

CoffiFilter can wrap your LangChain tools to help manage their usage.

import coffifilter
from langchain.agents import AgentExecutor
from langchain_community.tools import YouTubeSearchTool

coffifilter.init(
    redis_host="your-redishost.redis-cloud.com",
    redis_port=11552,
    redis_db=0,
    redis_password="your-redispassword",
)

youtube_tool = coffifilter.wrap_langchain_tool(YouTubeSearchTool())

...

tools = [..., youtube_tool]

# Create an agent executor by passing in the agent and tools
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
agent_executor.invoke({"input": "can you find videos on langchain"})

CoffiFilter can also be used as a decorator for your LangChain functions.

import uuid

import coffifilter
import requests
from langchain_core.callbacks import Callbacks
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

coffifilter.init(
    redis_host="your-redishost.redis-cloud.com",
    redis_port=11552,
    redis_db=0,
    redis_password="your-redispassword",
)

@tool
@coffifilter.coffi_filter("summarize_tool")
def summarize_tool(url: str, callbacks: Callbacks = None):
    """Summarize a website."""
    text = requests.get(url).text
    summary_chain = (
        ChatPromptTemplate.from_template(
            "Summarize the following text:\n<TEXT {uid}>\n" "{text}" "\n</TEXT {uid}>"
        ).partial(uid=lambda: uuid.uuid4())
        | ChatOpenAI(model="gpt-4o")
        | StrOutputParser()
    ).with_config(run_name="Summarize Text")
    return summary_chain.invoke(
        {"text": text},
        {"callbacks": callbacks},
    )

The current design checks the Redis server for the tool's status before executing the wrapped function: it looks up the key for that tool and checks whether its value is "true" or "false". If the key is not found, the tool defaults to off.

The decorator will raise a ValueError if the tool is off.

For example, if the tool is off, the following error is raised:

ValueError(f"Tool '{filter_string}' is not enabled")
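
Below is a minimal sketch of that check as a decorator, assuming a redis-py client and assuming the Redis key is simply the filter string passed in; neither detail is taken from the actual implementation.

import functools

import redis

# Hypothetical client; connection details mirror the init() example above.
redis_client = redis.Redis(
    host="your-redishost.redis-cloud.com",
    port=11552,
    db=0,
    password="your-redispassword",
    decode_responses=True,
)


def coffi_filter_sketch(filter_string: str):
    """Illustrative stand-in for coffifilter.coffi_filter, not the real source."""

    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            value = redis_client.get(filter_string)  # None if the key is missing
            if value != "true":  # a missing key therefore defaults to off
                raise ValueError(f"Tool '{filter_string}' is not enabled")
            return func(*args, **kwargs)

        return wrapper

    return decorator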

Coming Soon

  • Better documentation.
  • Better LangChain integration.
  • Better error handling.
  • Local-first approach; avoiding the Redis server when it is not needed.
  • User tracking and IFTTT tool usage.

Contributing

Contributions are very welcome. To learn more, see the Contributor Guide.

License

Distributed under the terms of the GPL 3.0 license, CoffiFilter is free and open source software.

Issues

If you encounter any problems, please file an issue along with a detailed description.

Credits

This project was generated from @cjolowicz's Hypermodern Python Cookiecutter template.
