Merged
2 changes: 2 additions & 0 deletions .gitignore
@@ -39,6 +39,7 @@
!/drawio/
!/mermaid/
!/adguardhome/
!/novita/
!/ollama/

# Step 5: Inside each software dir, ignore everything (including dotfiles)
@@ -88,6 +89,7 @@
!/drawio/agent-harness/
!/mermaid/agent-harness/
!/adguardhome/agent-harness/
!/novita/agent-harness/
!/ollama/agent-harness/

# Step 7: Ignore build artifacts within allowed dirs
165 changes: 165 additions & 0 deletions novita/agent-harness/cli_anything/novita/README.md
@@ -0,0 +1,165 @@
# Novita CLI

A CLI harness for **Novita AI**, an OpenAI-compatible API service for models such as DeepSeek, GLM, and MiniMax.

## Prerequisites

- Python 3.10+
- `requests` (HTTP client)
- `click` (CLI framework)
- Novita API key

Optional (for interactive REPL):
- `prompt_toolkit`

## Install Dependencies

```bash
pip install requests click prompt_toolkit
```

## Get an API Key

1. Go to [novita.ai](https://novita.ai) and sign up
2. Navigate to Settings → API Keys
3. Create an API key (format: `sk-xxx`)
4. Configure it:

```bash
# Option 1: Config file (recommended)
cli-anything-novita config set api_key "sk-xxx"

# Option 2: Environment variable
export NOVITA_API_KEY="sk-xxx"
```
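Both configuration routes resolve to the same key at runtime. A minimal sketch of the resolution logic (the env-var-first precedence and the `config.json` filename are illustrative assumptions, not documented CLI behaviour):

```python
import json
import os
from pathlib import Path

def resolve_api_key(config_path=None):
    """Resolve the API key: environment variable first, then the config file.

    Precedence and the config.json location are assumptions mirroring the
    session-file path used elsewhere in this harness.
    """
    key = os.environ.get("NOVITA_API_KEY")
    if key:
        return key
    path = Path(config_path or Path.home() / ".cli-anything-novita" / "config.json")
    if path.exists():
        return json.loads(path.read_text()).get("api_key")
    return None
```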

## How to Run

All commands are run from the `agent-harness/` directory or via the installed command.

### One-shot Commands

```bash
# Show help
cli-anything-novita --help

# Chat with model
cli-anything-novita chat --prompt "What is AI?" --model deepseek/deepseek-v3.2

# Streaming chat
cli-anything-novita stream --prompt "Write a poem about code"

# Test connectivity
cli-anything-novita test --model deepseek/deepseek-v3.2

# List available models
cli-anything-novita models

# JSON output for agent consumption
cli-anything-novita --json chat --prompt "Hello" --model deepseek/deepseek-v3.2
```
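Under the hood, an OpenAI-compatible `chat` call boils down to a single POST. A sketch of the request construction (the base URL is an assumption; confirm it against the Novita docs before use):

```python
import json

# Assumed base URL -- verify against the Novita API documentation.
NOVITA_BASE_URL = "https://api.novita.ai/v3/openai"

def build_chat_request(prompt, model="deepseek/deepseek-v3.2",
                       temperature=0.7, max_tokens=None, api_key="sk-xxx"):
    """Return (url, headers, body_json) for an OpenAI-style chat completion."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    if max_tokens is not None:
        body["max_tokens"] = max_tokens
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    return f"{NOVITA_BASE_URL}/chat/completions", headers, json.dumps(body)

# To send: requests.post(url, headers=headers, data=body_json, timeout=60)
```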

### Interactive REPL

```bash
cli-anything-novita
```

Inside the REPL, type `help` for all available commands.

## Command Reference

### Chat

```bash
chat --prompt <text> [--model <id>] [--temperature <0.0-1.0>] [--max-tokens <n>]
stream --prompt <text> [--model <id>] [--temperature <0.0-1.0>] [--max-tokens <n>]
```

### Session

```bash
session status
session clear
session history [--limit N]
```
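The counts reported by `session status` derive from the on-disk session file, which `core/session.py` (later in this PR) stores as JSON with `messages` and `history` keys. A standalone sketch of that file shape and the derived status:

```python
import json
import os
import tempfile

# Shape of ~/.cli-anything-novita/session.json as written by ChatSession
session = {
    "messages": [
        {"role": "user", "content": "What is AI?"},
        {"role": "assistant", "content": "AI is..."},
    ],
    "history": [],
}

path = os.path.join(tempfile.mkdtemp(), "session.json")
with open(path, "w") as f:
    json.dump(session, f, indent=2)

# Roughly what `session status` reports:
with open(path) as f:
    data = json.load(f)
status = {
    "message_count": len(data["messages"]),
    "history_count": len(data["history"]),
}
print(status)  # {'message_count': 2, 'history_count': 0}
```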

### Config

```bash
config set api_key "sk-xxx"
config set default_model "deepseek/deepseek-v3.2"
config get [key]
config delete <key>
config path
```

### Utility

```bash
test [--model <id>] # Test API connectivity
models # List available models
```

## JSON Mode

Add `--json` before the subcommand for machine-readable output:

```bash
cli-anything-novita --json chat --prompt "Hello"
cli-anything-novita --json session status
```
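An agent can wrap these invocations and parse stdout directly. A minimal sketch (the response schema shown is hypothetical; inspect real `--json` output before relying on field names):

```python
import json
import subprocess

def run_json(args):
    """Invoke the CLI with --json and parse stdout (requires the CLI on PATH)."""
    proc = subprocess.run(
        ["cli-anything-novita", "--json", *args],
        capture_output=True, text=True, check=True,
    )
    return json.loads(proc.stdout)

# Hypothetical response shape -- verify against real output:
sample = '{"ok": true, "response": "Hello!"}'
parsed = json.loads(sample)
print(parsed["response"])  # Hello!
```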

## Default Models

Model IDs use a `/` separator between provider and model name (not a `-`):

- `deepseek/deepseek-v3.2` (default)
- `zai-org/glm-5`
- `minimax/minimax-m2.5`

## Running Tests

```bash
cd agent-harness

# Unit tests (mock HTTP, no API key needed)
python3 -m pytest cli_anything/novita/tests/test_core.py -v

# E2E tests (requires NOVITA_API_KEY)
NOVITA_API_KEY=sk-xxx python3 -m pytest cli_anything/novita/tests/test_full_e2e.py -v

# All tests
python3 -m pytest cli_anything/novita/tests/ -v
```

## Example Workflow

```bash
# Configure API key
cli-anything-novita config set api_key "sk-xxx"

# Chat with DeepSeek model
cli-anything-novita chat --prompt "Explain quantum computing" --model deepseek/deepseek-v3.2

# Stream response
cli-anything-novita stream --prompt "Write a Python function to calculate factorial"

# Test connectivity
cli-anything-novita test --model deepseek/deepseek-v3.2

# List available models
cli-anything-novita models
```

## Supported Models

| Model ID | Provider | Description |
|----------|----------|-------------|
| `deepseek/deepseek-v3.2` | DeepSeek | DeepSeek V3.2 model |
| `zai-org/glm-5` | Zhipu AI | GLM-5 model |
| `minimax/minimax-m2.5` | MiniMax | MiniMax M2.5 model |

## License

MIT License
4 changes: 4 additions & 0 deletions novita/agent-harness/cli_anything/novita/__init__.py
@@ -0,0 +1,4 @@
"""Novita CLI harness - OpenAI-compatible AI API client."""
from __future__ import annotations

__version__ = "1.0.0"
5 changes: 5 additions & 0 deletions novita/agent-harness/cli_anything/novita/__main__.py
@@ -0,0 +1,5 @@
"""Allow running as python3 -m cli_anything.novita."""
from cli_anything.novita.novita_cli import main

if __name__ == "__main__":
    main()
17 changes: 17 additions & 0 deletions novita/agent-harness/cli_anything/novita/core/__init__.py
@@ -0,0 +1,17 @@
"""Novita CLI core modules."""

from cli_anything.novita.core.session import ChatSession
from cli_anything.novita.utils.novita_backend import (
    chat_completion,
    chat_completion_stream,
    run_full_workflow,
    list_models,
)

__all__ = [
    "ChatSession",
    "chat_completion",
    "chat_completion_stream",
    "run_full_workflow",
    "list_models",
]
98 changes: 98 additions & 0 deletions novita/agent-harness/cli_anything/novita/core/session.py
@@ -0,0 +1,98 @@
"""Lightweight session for chat history management."""

from __future__ import annotations

import json
import os
from datetime import datetime
from pathlib import Path


def _locked_save_json(path, data, **dump_kwargs) -> None:
    """Rewrite JSON in place while holding an exclusive file lock (best effort)."""
    try:
        f = open(path, "r+")
    except FileNotFoundError:
        os.makedirs(os.path.dirname(os.path.abspath(path)), exist_ok=True)
        f = open(path, "w")
    with f:
        _locked = False
        try:
            import fcntl
            fcntl.flock(f.fileno(), fcntl.LOCK_EX)
            _locked = True
        except (ImportError, OSError):
            pass  # locking unavailable (e.g. non-POSIX); fall back to unlocked write
        try:
            f.seek(0)
            f.truncate()
            json.dump(data, f, **dump_kwargs)
            f.flush()
        finally:
            if _locked:
                import fcntl
                fcntl.flock(f.fileno(), fcntl.LOCK_UN)


class ChatSession:
    """Lightweight session for chat history management."""

    def __init__(self, session_file: str | None = None):
        self.session_file = session_file or str(
            Path.home() / ".cli-anything-novita" / "session.json"
        )
        self.messages = []
        self.history = []
        self.max_history = 50
        self.modified = False
        if os.path.exists(self.session_file):
            try:
                with open(self.session_file, "r") as f:
                    data = json.load(f)
                self.messages = data.get("messages", [])
                self.history = data.get("history", [])
            except (json.JSONDecodeError, IOError):
                self.messages = []
                self.history = []

    def add_user_message(self, content: str):
        self.messages.append({"role": "user", "content": content})
        self.modified = True
        self._save()

    def add_assistant_message(self, content: str):
        self.messages.append({"role": "assistant", "content": content})
        self.modified = True
        self._save()

    def get_messages(self):
        return self.messages.copy()

    def clear(self):
        self.messages = []
        self.history = []
        self.modified = True
        self._save()

    def status(self):
        return {
            "message_count": len(self.messages),
            "history_count": len(self.history),
            "modified": self.modified,
            "session_file": self.session_file,
        }

    def _save(self):
        _locked_save_json(
            self.session_file,
            {"messages": self.messages, "history": self.history},
            indent=2,
        )
        self.modified = False

    def save_history(self, command: str, result: dict):
        self.history.append(
            {"command": command, "result": result, "timestamp": str(datetime.now())}
        )
        if len(self.history) > self.max_history:
            self.history = self.history[-self.max_history:]
        self._save()
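
The truncate-under-flock pattern used by `_locked_save_json` can be exercised standalone. A minimal sketch, assuming a POSIX system (like the original, it degrades to an unlocked write when locking fails):

```python
import fcntl  # POSIX-only; the harness catches ImportError where it's missing
import json
import os
import tempfile

def locked_write(path, data):
    """Rewrite `path` as JSON while holding an exclusive flock."""
    try:
        f = open(path, "r+")
    except FileNotFoundError:
        f = open(path, "w")
    with f:  # closing the file also releases the lock
        try:
            fcntl.flock(f.fileno(), fcntl.LOCK_EX)
        except OSError:
            pass  # locking unsupported on this filesystem
        f.seek(0)
        f.truncate()
        json.dump(data, f)

path = os.path.join(tempfile.mkdtemp(), "session.json")
locked_write(path, {"messages": [], "history": []})
```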