36 changes: 36 additions & 0 deletions .github/workflows/release.yml
@@ -0,0 +1,36 @@
name: Release

on:
  push:
    tags:
      - '*'

permissions:
  contents: write

jobs:
  release:
    name: Create GitHub Release
    runs-on: ubuntu-latest
    timeout-minutes: 5
    if: startsWith(github.ref, 'refs/tags')

    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Verify tag is on main branch
        run: |
          git fetch origin main
          if ! git merge-base --is-ancestor ${{ github.sha }} origin/main; then
            echo "Error: Tag is not on the main branch"
            exit 1
          fi

      - name: Create Release
        uses: softprops/action-gh-release@v2
        with:
          generate_release_notes: true
          draft: false
          prerelease: false
93 changes: 23 additions & 70 deletions README.md
@@ -10,103 +10,56 @@
<a href="https://github.com/project-lighter/sparkwheel/blob/main/LICENSE"><img alt="License" src="https://img.shields.io/badge/License-Apache%202.0-blue.svg"></a>
<a href="https://project-lighter.github.io/sparkwheel"><img alt="Documentation" src="https://img.shields.io/badge/docs-latest-olive"></a>
</p>
<br/>

<p align="center">⚙️ YAML configuration meets Python 🐍</p>
<h3 align="center">YAML configuration meets Python</h3>
<p align="center">Define Python objects in YAML. Reference, compose, and instantiate them effortlessly.</p>
<br/>

## What is Sparkwheel?
## Quick Start

Stop hardcoding parameters. Define complex Python objects in clean YAML files, compose them naturally, and instantiate with one line.
```bash
pip install sparkwheel
```

```yaml
# config.yaml
dataset:
  num_classes: 10
  batch_size: 32

model:
  _target_: torch.nn.Linear
  in_features: 784
  out_features: "%dataset::num_classes" # Reference other values
  out_features: "%dataset::num_classes" # Reference

dataset:
  num_classes: 10
training:
  steps_per_epoch: "$10000 // @dataset::batch_size" # Expression
```

```python
from sparkwheel import Config

config = Config()
config.update("config.yaml")
model = config.resolve("model") # Actual torch.nn.Linear(784, 10) instance!
```

## Key Features

- **Declarative Object Creation** - Instantiate any Python class from YAML with `_target_`
- **Smart References** - `@` for resolved values, `%` for raw YAML
- **Composition by Default** - Configs merge naturally (dicts merge, lists extend)
- **Explicit Operators** - `=` to replace, `~` to delete when needed
- **Python Expressions** - Compute values dynamically with `$` prefix
- **Schema Validation** - Type-check configs with Python dataclasses
- **CLI Overrides** - Override any value from command line

## Installation

```bash
pip install sparkwheel
model = config.resolve("model") # Actual torch.nn.Linear(784, 10)
```

**[→ Get Started in 5 Minutes](https://project-lighter.github.io/sparkwheel/getting-started/quickstart/)**
## Features

## Coming from Hydra/OmegaConf?

Sparkwheel builds on similar ideas but adds powerful features:

| Feature | Hydra/OmegaConf | Sparkwheel |
|---------|-----------------|------------|
| Config composition | Explicit (`+`, `++`) | **By default** (dicts merge, lists extend) |
| Replace semantics | Default | Explicit with `=` operator |
| Delete keys | Not idempotent | Idempotent `~` operator |
| References | OmegaConf interpolation | `@` (resolved) + `%` (raw YAML) |
| Python expressions | Limited | Full Python with `$` |
| Schema validation | Structured Configs | Python dataclasses |
| List extension | Lists replace | **Lists extend by default** |

**Composition by default** means configs merge naturally without operators:
```yaml
# base.yaml
model:
  hidden_size: 256
  dropout: 0.1

# experiment.yaml
model:
  hidden_size: 512 # Override
  # dropout inherited
```

## Documentation
- **Declarative Objects** - Instantiate any Python class with `_target_`
- **Smart References** - `@` for resolved values, `%` for raw YAML
- **Composition by Default** - Dicts merge, lists extend automatically
- **Explicit Control** - `=` to replace, `~` to delete
- **Python Expressions** - Dynamic values with `$`
- **Schema Validation** - Type-check with dataclasses
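
A minimal sketch of the composition, `=`, and `~` operators listed above (file names and keys are illustrative, not taken from the project's examples):

```yaml
# base.yaml
optimizer:
  name: adam
  lr: 0.001
scheduler:
  name: cosine
  warmup: 500
debug: true

# prod.yaml - applied on top of base.yaml
optimizer:
  lr: 0.01      # composes by default: merged into optimizer, name is kept
=scheduler:     # = replaces the whole scheduler dict instead of merging
  name: constant
~debug: null    # ~ deletes debug from the base (errors if the key is missing)
```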

- [Full Documentation](https://project-lighter.github.io/sparkwheel/)
- [Quick Start Guide](https://project-lighter.github.io/sparkwheel/getting-started/quickstart/)
- [Core Concepts](https://project-lighter.github.io/sparkwheel/user-guide/basics/)
- [API Reference](https://project-lighter.github.io/sparkwheel/reference/)
**[Get Started](https://project-lighter.github.io/sparkwheel/getting-started/quickstart/)** · **[Documentation](https://project-lighter.github.io/sparkwheel/)** · **[Quick Reference](https://project-lighter.github.io/sparkwheel/user-guide/quick-reference/)**

## Community

- [Discord Server](https://discord.gg/zJcnp6KrUp) - Chat with the community
- [YouTube Channel](https://www.youtube.com/channel/UCef1oTpv2QEBrD2pZtrdk1Q) - Tutorials and demos
- [GitHub Issues](https://github.com/project-lighter/sparkwheel/issues) - Bug reports and feature requests

## Contributing

We welcome contributions! See [CONTRIBUTING.md](CONTRIBUTING.md) for development setup and guidelines.
- [Discord](https://discord.gg/zJcnp6KrUp) · [YouTube](https://www.youtube.com/channel/UCef1oTpv2QEBrD2pZtrdk1Q) · [Issues](https://github.com/project-lighter/sparkwheel/issues)

## About

Sparkwheel is a hard fork of [MONAI Bundle](https://github.com/Project-MONAI/MONAI/tree/dev/monai/bundle)'s configuration system, refined and expanded for general-purpose use. We're deeply grateful to the MONAI team for their excellent foundation.

Sparkwheel powers [Lighter](https://project-lighter.github.io/lighter/), our configuration-driven deep learning framework built on PyTorch Lightning.

## License

Apache License 2.0 - See [LICENSE](LICENSE) for details.
Sparkwheel is a hard fork of [MONAI Bundle](https://github.com/Project-MONAI/MONAI/tree/dev/monai/bundle)'s config system, with the goal of making a more general-purpose configuration library for Python projects. It combines the best of MONAI Bundle and [Hydra](http://hydra.cc/)/[OmegaConf](https://omegaconf.readthedocs.io/), while introducing new features and improvements not found in either.
2 changes: 1 addition & 1 deletion docs/index.md
@@ -199,7 +199,7 @@ Sparkwheel has two types of references with distinct purposes:
- **Composition-by-default** - Configs merge/extend naturally, no operators needed for common case
- **List extension** - Lists extend by default (unique vs Hydra!)
- **`=` replace operator** - Explicit control when you need replacement
- **`~` delete operator** - Remove inherited keys cleanly (idempotent!)
- **`~` delete operator** - Remove inherited keys explicitly
- **Python expressions with `$`** - Compute values dynamically
- **Dataclass validation** - Type-safe configs without boilerplate
- **Dual reference system** - `@` for resolved values, `%` for raw YAML
2 changes: 0 additions & 2 deletions docs/user-guide/advanced.md
@@ -221,8 +221,6 @@ config.update({"~plugins": [0, 2]}) # Remove list items
config.update({"~dataloaders": ["train", "test"]}) # Remove dict keys
```

**Note:** The `~` directive is idempotent - it doesn't error if the key doesn't exist, enabling reusable configs.

### Programmatic Updates

Apply operators programmatically:
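
For instance, a short sketch (mirroring the `update()` calls shown in the operators guide; the key names are illustrative):

```python
from sparkwheel import Config

config = Config()
config.update("base.yaml")

config.update({"model": {"hidden_size": 1024}})  # compose: merged into model
config.update({"=optimizer": {"type": "sgd"}})   # replace the whole optimizer
config.update({"~training::old_param": None})    # delete a nested key
```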
2 changes: 1 addition & 1 deletion docs/user-guide/cli.md
@@ -111,7 +111,7 @@ Three operators for fine-grained control:
|----------|--------|----------|---------|
| **Compose** (default) | `key=value` | Merges dicts, extends lists | `model::lr=0.001` |
| **Replace** | `=key=value` | Completely replaces value | `=model={'_target_': 'ResNet'}` |
| **Delete** | `~key` | Removes key (idempotent) | `~debug` |
| **Delete** | `~key` | Removes key (errors if missing) | `~debug` |

!!! info "Type Inference"
    Values are automatically typed using `ast.literal_eval()`:
82 changes: 43 additions & 39 deletions docs/user-guide/operators.md
@@ -126,11 +126,14 @@ Remove keys or list items with `~key`:
### Delete Entire Keys

```yaml
# Remove keys (idempotent - no error if missing!)
# Remove keys explicitly
~old_param: null
~debug_settings: null
```

!!! warning "Key Must Exist"
    The delete operator will raise an error if the key doesn't exist. This helps catch typos and configuration mistakes.

### Delete Dict Keys

Use path notation for nested keys:
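
For example (a brief sketch; the key names are illustrative):

```yaml
# Delete nested keys using :: path notation
~model::dropout: null
~training::old_param: null
```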
@@ -214,28 +217,6 @@ dataloaders:

**Why?** Path notation is designed for dict keys, not list indices. The batch syntax normalizes indices and applies deletions from the highest index to the lowest, so earlier removals don't shift the positions of items still to be deleted.
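
For example (indices are illustrative):

```yaml
# Remove the items at indices 0 and 2 from the plugins list
~plugins: [0, 2]
```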

### Idempotent Delete

Delete operations don't error if the key doesn't exist:

```yaml
# production.yaml - Remove debug settings if they exist
~debug_mode: null
~dev_logger: null
~test_data: null
# No errors if these don't exist!
```

This enables **reusable configs** that work with multiple bases:

```yaml
# production.yaml works with ANY base config
~debug_settings: null
~verbose_logging: null
database:
  pool_size: 100
```

## Combining Operators

Mix composition, replace, and delete:
@@ -298,7 +279,7 @@ config.update({"model": {"hidden_size": 1024}})
# Replace explicitly
config.update({"=optimizer": {"type": "sgd", "lr": 0.1}})

# Delete keys (idempotent)
# Delete keys
config.update({
"~training::old_param": None,
"~model::dropout": None
@@ -454,17 +435,40 @@ model:

### Write Reusable Configs

Use idempotent delete for portable configs:
!!! warning "Delete Requires Key Existence"
    The delete operator (`~`) is **strict** - it raises an error if the key doesn't exist. This helps catch typos and configuration mistakes.

When writing configs that should work with different base configurations, you have a few options:

**Option 1: Document required keys**
```yaml
# production.yaml - works with ANY base!
~debug_mode: null # Remove if exists
~verbose_logging: null # No error if missing
# production.yaml
# Requires: base config must have debug_mode and verbose_logging
~debug_mode: null
~verbose_logging: null
database:
  pool_size: 100
  ssl: true
```

**Option 2: Use composition order**
```yaml
# production.yaml - override instead of delete
debug_mode: false # Overrides if exists, sets if not
verbose_logging: false
database:
  pool_size: 100
  ssl: true
```

**Option 3: Conditional deletion with lists**
```yaml
# Delete multiple optional keys - fails only if ALL are missing
~: [debug_mode, verbose_logging] # At least one must exist
database:
  pool_size: 100
```

## Common Mistakes

### Using `=` When Not Needed
@@ -519,17 +523,17 @@ plugins: [cache]
|---------|-------|------------|
| Dict merge default | Yes ✅ | Yes ✅ |
| List extend default | No ❌ | **Yes** ✅ |
| Operators in YAML | No ❌ | Yes ✅ (`=`, `~`) |
| Operator count | 4 (`+`, `++`, `~`) | **2** (`=`, `~`) ✅ |
| Delete dict keys | No ❌ | Yes |
| Delete list items | No ❌ | Yes |
| Idempotent delete | N/A | Yes ✅ |

Sparkwheel goes beyond Hydra with:
- Full composition-first philosophy (dicts **and** lists)
- Operators directly in YAML files
- Just 2 simple operators
- Delete operations for fine-grained control
| Operators in YAML | CLI-only | **Yes** ✅ (YAML + CLI) |
| Operator count | 4 (`=`, `+`, `++`, `~`) | **2** (`=`, `~`) ✅ |
| Delete dict keys | CLI-only (`~foo.bar`) | **Yes** ✅ (YAML + CLI) |
| Delete list items | No ❌ | **Yes** ✅ (by index) |

Sparkwheel differs from Hydra:
- **Full composition philosophy**: Both dicts AND lists compose by default
- **Operators in YAML files**: Not just CLI overrides
- **Simpler operator set**: Just 2 operators (`=`, `~`) vs 4 (`=`, `+`, `++`, `~`)
- **List deletion**: Delete items by index with `~plugins: [0, 2]`
- **Flexible delete**: Use `~` anywhere (YAML, CLI, programmatic)

## Next Steps
