My Personal Usage

I set up the shell script scripts/find-role so that it does not actually set the environment variable ROLE after I interactively pick a role prompt; it only prints the corresponding export command. I can then actually export ROLE by pressing enter again.

Therefore:

  • I don't overwrite any existing ROLE variable (I might have set it before, and I don't want to lose the previous value).
  • I don't fill my environment with variables.
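For illustration, here is a minimal sketch of how a script like find-role can behave, assuming the role prompts are stored one per line in a CSV-style file and fzf is installed (this is not my actual scripts/find-role, just the idea):

```bash
#!/usr/bin/env bash
# Minimal sketch (not the real scripts/find-role): pick a role line with fzf
# and print an export command instead of setting ROLE directly.

# hypothetical location of the role prompts, one CSV-style line per role:
# Category,"Label","Prompt text"
ROLES_FILE="${ROLES_FILE:-$HOME/ai-system-roles/roles.csv}"

# fzf lists the role lines; the preview window shows the highlighted line wrapped
selected=$(fzf --preview 'echo {}' --preview-window=wrap < "$ROLES_FILE")

# only print the export command; nothing in the current environment is touched
[ -n "$selected" ] && echo "export ROLE='${selected}'"
```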

I use these prompts/roles mainly for software development tasks:

  1. GitHub Copilot Chat: Tweak Copilot's suggestions by selecting some lines of code in your editor. Call find-role, pick a prompt, and copy it. Paste the prompt into the GitHub Copilot Chat extension for VS Code and press enter. GitHub Copilot will explain, rewrite, or otherwise work with the code lines in #selection, according to your prompt.
  2. llm CLI tool: I often use these prompts together with Simon Willison's llm command-line tool. I can call llm with
    `llm -m gpt-4o "$ROLE 'prompt text'"` (see the sketch after this list).
    This is a great, simple way to interact with AI systems from the command line, and for me it was the basis for more complex scripts.
  3. Interacting with Llama models: The Perplexity API endpoints offer access to fine-tuned variants of Meta's Llama family of models. Sometimes I use my explore_perplexity_api.py script to interact with those Llama LLMs. I call explore_perplexity_api.py with the --role option. See the example below, or see my perplexity-api-search for details.
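To make the llm usage concrete, here is a small sketch; the question and the model name are placeholders, and the exact options depend on your own llm installation and configuration:

```bash
# assumes ROLE was exported beforehand (e.g. via the find-role workflow shown below)
export ROLE='Education,"CS Bootcamp Instructor","From now on, act as an instructor..."'

# prepend the role prompt to the actual question; -m selects the model
llm -m gpt-4o "$ROLE Explain Big-O notation with a short Python example."
```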

The ROLE environment variable then ends up set to the role that I want to use. Here I use the role labeled `Education,"CS Bootcamp Instructor"`. This can significantly change the output of the AI system:

Example of a Terminal Session

```bash
# EXAMPLE OF A TERMINAL SESSION
# with the 2 scripts "find-role" and (optionally) "explore_perplexity_api.py".
# Open two terminal windows side by side, and then:

# Terminal 1: Create fzf preview window for selecting a role, interactively
. ~/ai-system-roles/find-role
# ... or
. ~/bin/find-role

# the script will print/echo the role definition as an export command
# but will NOT actually set the environment variable:
echo export ROLE='Education,"CS Bootcamp Instructor","From now on, act as an instructor in a computer science bootcamp, teaching algorithms to beginners. You will provide code examples using the Python programming language. First, start briefly explaining what the user asked for, and continue giving simple examples. Later, wait for my prompt for additional questions. Then you explain and give code examples. Whenever possible include corresponding visualizations as ASCII art."';

# Terminal 2: copy and paste that output of the find-role script here,
# modify command as needed:
export ROLE='Education,"CS Bootcamp Instructor","From now on, act as an instructor...';

# call the script that does the heavy lifting (Query LLM and save output to a file)
export PERPLEXITY_API_KEY=your-api-key-here
./explore_perplexity_api.py --prompt "Explain to a non-programmer what a REST-API is" \
   --slug rest-api --role "$ROLE" --role-slug cs-instructor

# output will be saved to a file named final-output/rest-api-<MODELNAME>.md
```

My personal setup

The --slug and --role-slug arguments

The --slug argument to ./explore_perplexity_api.py is just a label I use to keep track of my interactions with the AI systems and the prompts I've used. The value of --slug becomes part of the output filename.
Use any word or phrase you want as a slug, but make sure it is unique to the prompt you're using. The slug should not contain spaces or special characters. The same rule applies to the --role-slug argument, which you can use to test responses to different prompts.
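For example (the two ROLE_* variables here are hypothetical), --role-slug lets me run the same question with two different role prompts for comparison:

```bash
# same question and --slug, two different role prompts, labeled via --role-slug
./explore_perplexity_api.py --prompt "Explain to a non-programmer what a REST-API is" \
   --slug rest-api --role "$ROLE_INSTRUCTOR" --role-slug cs-instructor
./explore_perplexity_api.py --prompt "Explain to a non-programmer what a REST-API is" \
   --slug rest-api --role "$ROLE_JOURNALIST" --role-slug tech-journalist
```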

Why all this? It enhances reproducibility and traceability of the AI-generated responses. That is useful when I take a look at the output files, perhaps many months later.

More Prompts and Personas

Some good collections, presented with both good Prompt Design and GUI Design in mind, and with a lot of additional information:

There are many, many more such collections on the web.