💸🤑 Announcing our Bounty Program: Help the Julep community fix bugs and ship features and get paid. More details here.
Start your project with conversation history, support for any LLM, agentic workflows, integrations & more.
Explore the docs »
Report Bug · Request Feature · Join Our Discord · X · LinkedIn
We've built a lot of AI apps and understand how difficult it is to evaluate hundreds of tools, techniques, and models, and then make them work well together.
The Problems
- The barrier to making LLM apps with memory, knowledge & tools is too high.
- Agentic behaviour is hard to control when done through multi-agent frameworks.
Features

- Statefulness By Design: Manages conversation history by default. Use the simple `remember` & `recall` flags to tune whether to save or retrieve conversation history.
- Support for Users & Agents: Allows creating different user <-> agent interactions like `One Agent <-> Many Users`; `Many Agents <-> One User` etc. Read more here.
- Built-in RAG: Add, delete & update documents to give the LLM context about the user or an agent depending on your use case. Read more here.
- 90+ tools built-in: Connect your AI app to 90+ third-party applications using Composio natively. `toolset.handle_tool_calls(julep_client, session.id, response)` will call and handle your tools for you! See example.
- Local-first: Julep comes ready to be deployed to production using Docker Compose. Support for k8s coming soon!
- Switch LLMs on the fly: Update the agent to switch between LLMs from OpenAI, Anthropic or Ollama, all while preserving state.
- *Assign Tasks to Agents: Define agentic workflows with one or more steps to be executed asynchronously, without worrying about timeouts or multiplying hallucinations. Work in progress.

(*) Coming soon!
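The "Switch LLMs on the fly" feature above can be sketched as a one-call helper. Note that `client.agents.update` and its signature are an assumption here (the quickstart below only shows `agents.create`); check the SDK docs for the exact method name:

```python
def switch_model(client, agent_id, new_model):
    """Point an existing agent at a different LLM while preserving its state.

    ASSUMPTION: `client.agents.update(agent_id=..., model=...)` -- verify the
    real method name and signature against the Julep SDK documentation.
    """
    return client.agents.update(agent_id=agent_id, model=new_model)
```

Because sessions and memories are scoped to the agent, not the model, swapping the model this way should leave the conversation state untouched.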
You can view the different features of Julep in action in the guide docs.
Our hosted platform is in Beta!
To get access:
- Head over to https://platform.julep.ai
- Generate your `JULEP_API_KEY` and add it to `.env`
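A minimal `.env` for the quickstart below might look like this (both values are placeholders; the hosted API URL shown is an assumption — copy the real one from the platform dashboard):

```shell
# Write placeholder Julep credentials to .env
# (replace the values with your own; the URL here is an assumption)
cat > .env <<'EOF'
JULEP_API_KEY=your-api-key-here
JULEP_API_URL=https://api.julep.ai
EOF
```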
Head over to docs on self-hosting to see how to run Julep locally!
```shell
pip install julep
```
```python
from julep import Client
from pprint import pprint
import textwrap
import os

base_url = os.environ.get("JULEP_API_URL")
api_key = os.environ.get("JULEP_API_KEY")

client = Client(api_key=api_key, base_url=base_url)
```
An Agent is the object to which LLM settings, such as model and temperature, along with tools, are scoped.
```python
agent = client.agents.create(
    name="Jessica",
    model="gpt-4",
    tools=[],  # Tools defined here
)
```
A User is the object that represents the user of the application. Memories are formed and saved for each user, and many users can talk to one agent.
```python
user = client.users.create(
    name="Anon",
    about="Average nerdy techbro/girl spending 8 hours a day on a laptop",
)
```
A "user" and an "agent" communicate in a "session". The system prompt goes here, and the session stores the conversation history and its summary. The session paradigm allows many users to interact with one agent while keeping each conversation's history and memories separate.
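The one-agent-many-users pairing described above can be sketched as a helper that reuses the same `client.sessions.create` call from the quickstart. The helper name is ours, not part of the SDK:

```python
def sessions_for_users(client, agent_id, user_ids, situation):
    """Create one session per user against a single shared agent.

    Each session keeps its own conversation history and memories,
    so one agent can serve many users without mixing contexts.
    """
    return [
        client.sessions.create(
            user_id=uid, agent_id=agent_id, situation=situation
        )
        for uid in user_ids
    ]
```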
```python
situation_prompt = """You are Jessica. You're a stuck up Cali teenager.
You basically complain about everything. You live in Bel-Air, Los Angeles and drag yourself to Curtis High School when you must.
"""

session = client.sessions.create(
    user_id=user.id, agent_id=agent.id, situation=situation_prompt
)
```
`session.chat` controls the communication between the "agent" and the "user". It has two important arguments:

- `recall`: Retrieves the previous conversations and memories.
- `remember`: Saves the current conversation turn into the memory store.

To keep the session stateful, both need to be `True`.
```python
user_msg = "hey. what do u think of starbucks"

response = client.sessions.chat(
    session_id=session.id,
    messages=[
        {
            "role": "user",
            "content": user_msg,
            "name": "Anon",
        }
    ],
    recall=True,
    remember=True,
)

print("\n".join(textwrap.wrap(response.response[0][0].content, width=100)))
```
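Because the turn above is sent with `remember=True`, a later call in the same session can draw on it. A minimal sketch reusing the same `client.sessions.chat` signature (the `follow_up` helper name is ours):

```python
def follow_up(client, session_id, text, username="Anon"):
    """Send another turn in the same session.

    With recall=True the agent retrieves earlier turns and memories,
    and with remember=True this turn is saved for future recall.
    """
    return client.sessions.chat(
        session_id=session_id,
        messages=[{"role": "user", "content": text, "name": username}],
        recall=True,
        remember=True,
    )
```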
To use the API directly, or to take a look at request & response formats, authentication, available endpoints and more, please refer to the API Documentation.
You can also use the Postman Collection for reference.
To install the Python SDK, run:

```shell
pip install julep
```
For more information on using the Python SDK, please refer to the Python SDK documentation.
To install the TypeScript SDK using `npm`, run:

```shell
npm install @julep/sdk
```
For more information on using the TypeScript SDK, please refer to the TypeScript SDK documentation.
Check out the self-hosting guide to host the platform yourself.
If you want to deploy Julep to production, let's hop on a call!
We'll help you customise the platform and get set up with:
- Multi-tenancy
- Reverse proxy along with authentication and authorisation
- Self-hosted LLMs
- & more
We welcome contributions from the community to help improve and expand the Julep AI platform. See CONTRIBUTING.md for details.
Julep AI is released under the Apache 2.0 License. By using, contributing to, or distributing the Julep AI platform, you agree to the terms and conditions of this license.
If you have any questions, need assistance, or want to get in touch with the Julep AI team, please use the following channels:
- Discord: Join our community forum to discuss ideas, ask questions, and get help from other Julep AI users and the development team.
- GitHub Issues: For technical issues, bug reports, and feature requests, please open an issue on the Julep AI GitHub repository.
- Email Support: If you need direct assistance from our support team, send an email to [email protected], and we'll get back to you as soon as possible.
- Follow for updates on X & LinkedIn
- Hop on a call: We wanna know what you're building and how we can tweak and tune Julep to help you build your next AI app.