A prompt is essentially a string, but it should behave somewhat differently from a standard string:

- 📏 **Length & Slicing**: a prompt string should measure its length in tokens, not characters, and slicing should be done accordingly.
- 👨 **Role & Concatenation**: prompt strings should have designated roles (e.g., `system`, `user`, `assistant`) and should be concatenated in a specific manner.
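The token-based length idea can be sketched with a toy `str` subclass. This is purely illustrative: `TokenStr` and its whitespace "tokenizer" are stand-ins, since the real library counts length with a proper tokenizer, not `str.split()`.

```python
# Illustrative sketch only: a str subclass whose length is counted in
# whitespace-delimited "tokens" instead of characters. prompt-string
# uses a real tokenizer; this just demonstrates the idea.
class TokenStr(str):
    def __len__(self):
        # Count "tokens" (here: whitespace-separated words).
        return len(self.split())

s = TokenStr("you're a helpful assistant")
print(len(s))           # 4 "tokens"
print(str.__len__(s))   # 26 characters, still reachable via str
```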
`prompt-string` provides two types:

- `P` for prompt: inherits from `str`. Length, slicing, and concatenation are modified, and new attributes like `.role` are supported.

  ```python
  p = P("You're a helpful assistant")
  ```

- `PC` for prompt chain: acts like `list[P]`. Links a series of prompts and supports `.messages(...)`.

  ```python
  pc = p1 / p2 / p3
  ```
```shell
pip install prompt-string
```
```python
from prompt_string import P

prompt = P("you're a helpful assistant.")

print("Total token size:", len(prompt))
print("Decoded result of the second token:", prompt[2])
print("The decoded result of first three tokens:", prompt[:3])
```
`P` supports some `str` native methods that still return a `P` object:

- `.format`
- `.replace`
```python
prompt = P("you're a helpful assistant. {temp}")

print(len(prompt.format(temp="End of instructions")))
print(len(prompt.replace("{temp}", "")))
```
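One way such type-preserving methods can be implemented is by wrapping the `str` originals so they re-wrap their result. This is a sketch of the technique, not `prompt-string`'s actual code; `MiniP` is a hypothetical class name.

```python
# Sketch (assumption, not prompt-string's implementation): wrap str
# methods so they return the subclass and carry metadata along.
class MiniP(str):
    def __new__(cls, text, role=None):
        obj = super().__new__(cls, text)
        obj.role = role
        return obj

    def format(self, *args, **kwargs):
        # Delegate to str.format, then re-wrap, preserving the role.
        return MiniP(str.format(self, *args, **kwargs), role=self.role)

    def replace(self, old, new, count=-1):
        return MiniP(str.replace(self, old, new, count), role=self.role)

p = MiniP("hello {name}", role="system")
q = p.format(name="world")
print(type(q).__name__, q.role)  # MiniP system
```

Without the wrappers, `str.format` would return a plain `str` and the role attribute would be lost.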
🧐 Raise an issue if you think other methods should be supported
```python
from prompt_string import P

sp = P("you're a helpful assistant.", role="system")
up = P("How are you?", role="user")

print(sp.role, up.role, (sp + up).role)
print(sp + up)
print(sp.message())
```
- `role` can be `None` or `str` for `P`.
- For a single prompt, like `sp`, the role is a `str` (e.g. `system`) or `None`.
- `sp + up` concatenates two prompt strings and generates a new `P`, whose role is updated if the latter one has a role.
  - For example, `sp + up`'s role is `user`; `sp + P('Hi')`'s role is `system`.
- `.message(...)` returns a JSON object of this prompt.
```python
pc = sp / up

print(pc.roles)
print(pc.messages())
```
For concatenated prompts, like `sp / up`, the type is converted to `PC` (prompt chain). `PC` has the following:

- `.roles`, a list of roles. For example, `(sp / up).roles` is `['system', 'user']`.
- `.messages(...)` packs the prompts into OpenAI-compatible messages JSON, which you can pass directly to `client.chat.completions.create(messages=...)`. `.messages` assumes the first role is `user`, then proceeds in user-assistant order; when a prompt has a role, that role is used. Check `pc.infer_role` for the final roles in the messages.
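The role-inference behavior described above might look like the sketch below. This is an assumption about the rule (explicit roles kept, missing roles filled by alternating user/assistant from position zero), not the library's actual `infer_role` code; `infer_roles` is a hypothetical helper.

```python
# Assumed role-inference rule: explicit roles are kept as-is;
# prompts without a role alternate user/assistant by position.
def infer_roles(roles):
    inferred = []
    for i, role in enumerate(roles):
        if role is not None:
            inferred.append(role)
        else:
            inferred.append("user" if i % 2 == 0 else "assistant")
    return inferred

print(infer_roles([None, None, None]))      # ['user', 'assistant', 'user']
print(infer_roles(["system", None, None]))  # ['system', 'assistant', 'user']
```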
- `P` inherits from `str`. Therefore, aside from the features mentioned above, its other behaviors are just like those of a `str` in Python.
- `prompt-string` won't add OpenAI or other AI SDKs as dependencies; it is simply a toolkit for prompts.
- `prompt-string` is super light and fast, with no heavy processes running behind the scenes.