GoLLM

This repository contains endpoints for various Terrarium LLM workflows.

Getting Started

Running the API

cd into the repository root
run: docker build -t gollm .
run: docker run -p 8000:8000 -e OPENAI_API_KEY=$OPENAI_API_KEY gollm
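
Once the container is up, the API listens on http://localhost:8000 (the port mapped above). A quick reachability check, assuming nothing about which routes the service exposes:

    # Confirms the server answers HTTP on the mapped port; a 404 from the root
    # path still indicates the API is running.
    curl -i http://localhost:8000/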

AMR configuration from paper and AMR

Once the API has been started, the /configure endpoint consumes a JSON body with the structure:

    {
        research_paper: str,
        amr: obj
    }

The API will return a model configuration candidate with the structure:

    {
        response: obj
    }

where `response` contains the AMR populated with configuration values.

Note: This is a work in progress; it is unoptimized and is currently being used as a test case for integrating LLM features with Terrarium.
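
A minimal sketch of calling /configure with curl, assuming the endpoint accepts an HTTP POST (the method, the placeholder paper text, and the AMR skeleton are illustrative assumptions, not confirmed by this README):

    # Hypothetical request; POST is an assumption, and the AMR skeleton below is a
    # placeholder. Substitute a real AMR document and the full paper text.
    curl -X POST http://localhost:8000/configure \
      -H "Content-Type: application/json" \
      -d '{
            "research_paper": "Full text of the paper describing the model...",
            "amr": {"header": {"name": "Example model"}, "model": {}}
          }'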

AMR model card from paper

Once the API has been started, the /model_card endpoint consumes a JSON body with the structure:

    {
        research_paper: str
    }

The API will return a model card in JSON format:

    {
        response: obj
    }

where `response` contains the generated model card.

Note: This is a work in progress.
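
A minimal sketch of calling /model_card with curl, again assuming an HTTP POST with the JSON body described above (the method and the placeholder paper text are assumptions):

    # Hypothetical request; POST and the placeholder paper text are assumptions.
    curl -X POST http://localhost:8000/model_card \
      -H "Content-Type: application/json" \
      -d '{"research_paper": "Full text of the paper to summarize into a model card..."}'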


License

Apache License 2.0