The LLM-Based Automation Agent executes plain-English tasks using a Large Language Model (LLM) and integrates into a Continuous Integration (CI) pipeline. The agent handles both structured and unstructured tasks while enforcing security and compliance constraints.
- Accepts plain-English task descriptions via an API.
- Parses, interprets, and executes tasks using GPT-4o-Mini.
- Ensures security constraints (e.g., no external file access, no data deletion).
- Supports multi-step operations and structured automation workflows.
- Provides verifiable results via a dedicated endpoint.
- Fully containerized with Docker and Podman compatibility.
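At its core, interpreting a plain-English task is a single call to GPT-4o-Mini through the AI Proxy. The sketch below is a minimal illustration, assuming an OpenAI-compatible chat-completions endpoint; the `AIPROXY_URL` default, the `plan_task` helper, and the system prompt are assumptions, not the project's actual implementation.

```python
import os

import requests

# Assumed OpenAI-compatible proxy endpoint; the real AI Proxy URL may differ.
AIPROXY_URL = os.environ.get(
    "AIPROXY_URL",
    "https://aiproxy.example.com/openai/v1/chat/completions",
)

def plan_task(task: str) -> str:
    """Ask GPT-4o-Mini (via the AI Proxy) to break a plain-English task into steps."""
    response = requests.post(
        AIPROXY_URL,
        headers={"Authorization": f"Bearer {os.environ['AIPROXY_TOKEN']}"},
        json={
            "model": "gpt-4o-mini",
            "messages": [
                {
                    "role": "system",
                    "content": "Turn the task into concrete steps. Never access "
                               "files outside /data and never delete files.",
                },
                {"role": "user", "content": task},
            ],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```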
The task-execution endpoint accepts a task described in natural language.
- Success Response: `200 OK`
- Task Error Response: `400 Bad Request`
- Agent Error Response: `500 Internal Server Error`
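Calling the task endpoint from a client might look like the sketch below; the `/run` path and `task` query parameter are assumed for illustration and may differ from the deployed API.

```python
import requests

# Hypothetical endpoint path and parameter name, inferred from the description above.
resp = requests.post(
    "http://localhost:8000/run",
    params={"task": "Count the dates in a file under /data and store the result"},
    timeout=60,
)
print(resp.status_code)  # 200 on success, 400 for task errors, 500 for agent errors
print(resp.text)
```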
The file-read endpoint retrieves the content of a specified file to verify output correctness.
- Success Response: `200 OK` with file content.
- File Not Found Response: `404 Not Found`
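Output verification can then be scripted against the file-read endpoint; the `/read` path, `path` parameter, and file name below are likewise assumptions used only for illustration.

```python
import requests

# Hypothetical endpoint path, parameter name, and file path.
resp = requests.get(
    "http://localhost:8000/read",
    params={"path": "/data/output.txt"},
    timeout=30,
)
if resp.status_code == 200:
    print(resp.text)      # file content returned on success
elif resp.status_code == 404:
    print("File not found")
```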
Prerequisites:
- Python 3.8+
- Docker / Podman
- AI Proxy Token (environment variable: `AIPROXY_TOKEN`)
To install and run locally:

```bash
git clone https://github.com/your-username/llm-based-automation-agent.git
cd llm-based-automation-agent
pip install -r requirements.txt
export AIPROXY_TOKEN=your_token_here
python app.py
```

To build and run with Docker instead:

```bash
docker build -t llm-automation-agent .
docker run --rm -e AIPROXY_TOKEN=$AIPROXY_TOKEN -p 8000:8000 llm-automation-agent
```

The agent supports tasks such as:
- Install and run the provided scripts to generate the required data.
- Format Markdown files with a pinned `prettier` release.
- Count dates in a dates file and store the results.
- Sort data.
- Extract data from log files.
- Index Markdown files based on H1 headings.
- Extract the email address from an email file.
- Extract numbers and text from an image.
- Find the most similar comments using embeddings.
- Run queries against an SQLite database.
- Prevent access outside `/data` (see the guard sketch after this list).
- Prevent file deletion.
- Handle data fetching, Git commits, SQL queries, web scraping, image compression, audio transcription, Markdown-to-HTML conversion, and CSV filtering.
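One way to enforce the two security constraints above is a single path guard that every file operation goes through. The sketch below is illustrative rather than the project's actual code; the `ensure_safe_path` name and the use of `PermissionError` are assumptions.

```python
from pathlib import Path

DATA_ROOT = Path("/data").resolve()

def ensure_safe_path(path: str, deleting: bool = False) -> Path:
    """Reject deletions and any path that resolves outside /data."""
    if deleting:
        raise PermissionError("File deletion is not permitted")
    resolved = Path(path).resolve()
    # Ancestor check keeps Python 3.8 compatibility (Path.is_relative_to needs 3.9+).
    if resolved != DATA_ROOT and DATA_ROOT not in resolved.parents:
        raise PermissionError(f"Access outside {DATA_ROOT} is not permitted: {resolved}")
    return resolved
```

Every task handler that reads or writes a file would call such a guard before touching the filesystem.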
To run the published image with Podman:

```bash
podman run --rm -e AIPROXY_TOKEN=$AIPROXY_TOKEN -p 8000:8000 sandeepstele/llmauto
```

The Docker image for this project is hosted on Docker Hub: `sandeepstele/llmauto`.
To contribute:
- Fork the repository.
- Create a feature branch.
- Commit changes with meaningful messages.
- Open a pull request.
Developed by Sandeep S.
This project is based on initial development by ANdIeCOOl.
This project is licensed under the MIT License. See LICENSE for details.