Merged

Changes from 6 commits
36 changes: 36 additions & 0 deletions .github/workflows/release.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,36 @@
name: Release

on:
push:
tags:
- '*'

permissions:
contents: write

jobs:
release:
name: Create GitHub Release
runs-on: ubuntu-latest
timeout-minutes: 5
if: startsWith(github.ref, 'refs/tags')

steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0

- name: Verify tag is on main branch
run: |
git fetch origin main
if ! git merge-base --is-ancestor ${{ github.sha }} origin/main; then
echo "Error: Tag is not on the main branch"
exit 1
fi

- name: Create Release
uses: softprops/action-gh-release@v2
with:
generate_release_notes: true
draft: false
prerelease: false
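The tag-verification step above can be reproduced locally. The sketch below builds a throwaway repository and runs the same `git merge-base --is-ancestor` check the workflow uses; it assumes `git` (≥ 2.28 for `init -b`) is on PATH, and the commit messages and branch names are illustrative only.

```python
# Sketch of what the "Verify tag is on main branch" step checks, reproduced
# against a throwaway repo (assumes `git` is available on PATH).
import subprocess
import tempfile

def git(*args: str, cwd: str) -> subprocess.CompletedProcess:
    return subprocess.run(["git", *args], cwd=cwd, capture_output=True, text=True)

repo = tempfile.mkdtemp()
git("init", "-q", "-b", "main", cwd=repo)
git("-c", "user.email=ci@example.com", "-c", "user.name=ci",
    "commit", "-q", "--allow-empty", "-m", "on main", cwd=repo)
tagged_sha = git("rev-parse", "HEAD", cwd=repo).stdout.strip()

# A commit reachable from main passes `merge-base --is-ancestor` (exit 0),
# so the workflow step would proceed to create the release.
on_main = git("merge-base", "--is-ancestor", tagged_sha, "main", cwd=repo).returncode == 0

# A commit only on a side branch fails the check (non-zero exit), so the
# step would print the error and exit 1, skipping the release.
git("checkout", "-q", "-b", "feature", cwd=repo)
git("-c", "user.email=ci@example.com", "-c", "user.name=ci",
    "commit", "-q", "--allow-empty", "-m", "off main", cwd=repo)
off_main = git("merge-base", "--is-ancestor", "HEAD", "main", cwd=repo).returncode == 0
```

A tag pointing at a commit on `main` passes the check and the release proceeds; a tag pointing at a side-branch commit fails the step and the job aborts.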
4 changes: 3 additions & 1 deletion README.md
@@ -65,7 +65,9 @@ Sparkwheel builds on similar ideas but adds powerful features:
|---------|-----------------|------------|
| Config composition | Explicit (`+`, `++`) | **By default** (dicts merge, lists extend) |
| Replace semantics | Default | Explicit with `=` operator |
| Delete keys | Not idempotent | Idempotent `~` operator |
| Delete keys | CLI-only `~` operator | `~` in **YAML and CLI** |
| Delete list items | No ❌ | Yes ✅ (by index) |
| Delete dict keys | CLI-only (`~foo.bar`) | Yes ✅ (YAML + CLI) |
| References | OmegaConf interpolation | `@` (resolved) + `%` (raw YAML) |
| Python expressions | Limited | Full Python with `$` |
| Schema validation | Structured Configs | Python dataclasses |
2 changes: 1 addition & 1 deletion docs/index.md
@@ -199,7 +199,7 @@ Sparkwheel has two types of references with distinct purposes:
- **Composition-by-default** - Configs merge/extend naturally, no operators needed for common case
- **List extension** - Lists extend by default (unique vs Hydra!)
- **`=` replace operator** - Explicit control when you need replacement
- **`~` delete operator** - Remove inherited keys cleanly (idempotent!)
- **`~` delete operator** - Remove inherited keys explicitly
- **Python expressions with `$`** - Compute values dynamically
- **Dataclass validation** - Type-safe configs without boilerplate
- **Dual reference system** - `@` for resolved values, `%` for raw YAML
2 changes: 0 additions & 2 deletions docs/user-guide/advanced.md
@@ -221,8 +221,6 @@ config.update({"~plugins": [0, 2]}) # Remove list items
config.update({"~dataloaders": ["train", "test"]}) # Remove dict keys
```

**Note:** The `~` directive is idempotent - it doesn't error if the key doesn't exist, enabling reusable configs.

### Programmatic Updates

Apply operators programmatically:
2 changes: 1 addition & 1 deletion docs/user-guide/cli.md
@@ -111,7 +111,7 @@ Three operators for fine-grained control:
|----------|--------|----------|---------|
| **Compose** (default) | `key=value` | Merges dicts, extends lists | `model::lr=0.001` |
| **Replace** | `=key=value` | Completely replaces value | `=model={'_target_': 'ResNet'}` |
| **Delete** | `~key` | Removes key (idempotent) | `~debug` |
| **Delete** | `~key` | Removes key (errors if missing) | `~debug` |

!!! info "Type Inference"
Values are automatically typed using `ast.literal_eval()`:
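The `ast.literal_eval()` typing mentioned in the admonition above can be sketched with a hypothetical helper (the actual `parse_overrides` logic in sparkwheel may handle fallbacks differently):

```python
# Hypothetical sketch of CLI value typing via ast.literal_eval (illustrative;
# not the actual sparkwheel implementation).
import ast
from typing import Any

def infer_type(raw: str) -> Any:
    try:
        return ast.literal_eval(raw)  # numbers, bools, lists, dicts, quoted strings
    except (ValueError, SyntaxError):
        return raw  # bare words like `resnet` stay strings

values = [infer_type(v) for v in ("0.001", "True", "true", "[1, 2]", "resnet")]
```

Note that Python-style `True` parses as a bool, while lowercase `true` is not a Python literal and stays a string.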
51 changes: 16 additions & 35 deletions docs/user-guide/operators.md
@@ -126,11 +126,14 @@ Remove keys or list items with `~key`:
### Delete Entire Keys

```yaml
# Remove keys (idempotent - no error if missing!)
# Remove keys explicitly
~old_param: null
~debug_settings: null
```

!!! warning "Key Must Exist"
The delete operator will raise an error if the key doesn't exist. This helps catch typos and configuration mistakes.
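
The new must-exist behavior can be illustrated with a minimal stand-in using plain dicts (illustrative only — the real implementation goes through `Config.update` and raises a richer error via `build_missing_key_error`):

```python
# Minimal stand-in for strict delete semantics (not the actual sparkwheel code).
def apply_delete(data: dict, key: str) -> dict:
    """Delete `key` from a copy of `data`, erroring if the key is absent."""
    if key not in data:
        raise KeyError(f"cannot delete missing key {key!r}; available: {sorted(data)}")
    remaining = dict(data)
    del remaining[key]
    return remaining

cfg = {"debug": True, "model": {"lr": 0.001}}
cfg = apply_delete(cfg, "debug")  # first delete succeeds

try:
    apply_delete(cfg, "debug")    # deleting again now raises (no longer idempotent)
    deleted_twice = True
except KeyError:
    deleted_twice = False
```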

### Delete Dict Keys

Use path notation for nested keys:
@@ -214,28 +217,6 @@

**Why?** Path notation is designed for dict keys, not list indices. The batch syntax normalizes indices and applies deletions from the highest index to the lowest, so earlier removals don't shift the positions of later ones.

### Idempotent Delete

Delete operations don't error if the key doesn't exist:

```yaml
# production.yaml - Remove debug settings if they exist
~debug_mode: null
~dev_logger: null
~test_data: null
# No errors if these don't exist!
```

This enables **reusable configs** that work with multiple bases:

```yaml
# production.yaml works with ANY base config
~debug_settings: null
~verbose_logging: null
database:
pool_size: 100
```

## Combining Operators

Mix composition, replace, and delete:
@@ -298,7 +279,7 @@ config.update({"model": {"hidden_size": 1024}})
# Replace explicitly
config.update({"=optimizer": {"type": "sgd", "lr": 0.1}})

# Delete keys (idempotent)
# Delete keys
config.update({
"~training::old_param": None,
"~model::dropout": None
@@ -519,17 +500,17 @@ plugins: [cache]
|---------|-------|------------|
| Dict merge default | Yes ✅ | Yes ✅ |
| List extend default | No ❌ | **Yes** ✅ |
| Operators in YAML | No ❌ | Yes ✅ (`=`, `~`) |
| Operator count | 4 (`+`, `++`, `~`) | **2** (`=`, `~`) ✅ |
| Delete dict keys | No ❌ | Yes |
| Delete list items | No ❌ | Yes |
| Idempotent delete | N/A | Yes ✅ |

Sparkwheel goes beyond Hydra with:
- Full composition-first philosophy (dicts **and** lists)
- Operators directly in YAML files
- Just 2 simple operators
- Delete operations for fine-grained control
| Operators in YAML | CLI-only | **Yes** ✅ (YAML + CLI) |
| Operator count | 4 (`=`, `+`, `++`, `~`) | **2** (`=`, `~`) ✅ |
| Delete dict keys | CLI-only (`~foo.bar`) | **Yes** ✅ (YAML + CLI) |
| Delete list items | No ❌ | **Yes** ✅ (by index) |

Sparkwheel differs from Hydra in several ways:
- **Full composition philosophy**: Both dicts AND lists compose by default
- **Operators in YAML files**: Not just CLI overrides
- **Simpler operator set**: Just 2 operators (`=`, `~`) vs 4 (`=`, `+`, `++`, `~`)
- **List deletion**: Delete items by index with `~plugins: [0, 2]`
- **Flexible delete**: Use `~` anywhere (YAML, CLI, programmatic)

## Next Steps

1 change: 0 additions & 1 deletion pyproject.toml
@@ -77,7 +77,6 @@ show_traceback = true

allow_redefinition = false
check_untyped_defs = true
disallow_any_generics = true
disallow_incomplete_defs = true
ignore_missing_imports = true
implicit_reexport = false
6 changes: 2 additions & 4 deletions src/sparkwheel/__init__.py
@@ -5,7 +5,6 @@
"""

from .config import Config, parse_overrides
from .errors import enable_colors
from .items import Component, Expression, Instantiable, Item
from .operators import apply_operators, validate_operators
from .resolver import Resolver
@@ -19,7 +18,7 @@
EvaluationError,
FrozenConfigError,
InstantiationError,
SourceLocation,
Location,
TargetNotFoundError,
)

@@ -39,7 +38,6 @@
"validate_operators",
"validate",
"validator",
"enable_colors",
"RESOLVED_REF_KEY",
"RAW_REF_KEY",
"ID_SEP_KEY",
@@ -55,5 +53,5 @@
"EvaluationError",
"FrozenConfigError",
"ValidationError",
"SourceLocation",
"Location",
]
60 changes: 41 additions & 19 deletions src/sparkwheel/config.py
@@ -97,15 +97,15 @@
from typing import Any

from .loader import Loader
from .metadata import MetadataRegistry
from .operators import _validate_delete_operator, apply_operators, validate_operators
from .locations import LocationRegistry
from .operators import MergeContext, _validate_delete_operator, apply_operators, validate_operators
from .parser import Parser
from .path_utils import split_id
from .preprocessor import Preprocessor
from .resolver import Resolver
from .utils import PathLike, look_up_option, optional_import
from .utils.constants import ID_SEP_KEY, REMOVE_KEY, REPLACE_KEY
from .utils.exceptions import ConfigKeyError
from .utils.exceptions import ConfigKeyError, build_missing_key_error

__all__ = ["Config", "parse_overrides"]

@@ -183,7 +183,7 @@
>>> config = Config(schema=MySchema).update("config.yaml")
"""
self._data: dict[str, Any] = data or {} # Start with provided data or empty
self._metadata = MetadataRegistry()
self._locations = LocationRegistry()
self._resolver = Resolver()
self._is_parsed = False
self._frozen = False # Set via freeze() method later
@@ -329,7 +329,7 @@
"""
from .schema import validate as validate_schema

validate_schema(self._data, schema, metadata=self._metadata)
validate_schema(self._data, schema, metadata=self._locations)

def freeze(self) -> None:
"""Freeze config to prevent further modifications.
@@ -359,6 +359,20 @@
"""
return self._frozen

@property
def locations(self) -> LocationRegistry:
"""Get the location registry for this config.
Returns:
LocationRegistry tracking file locations of config keys
Example:
>>> config = Config().update("config.yaml")
>>> location = config.locations.get("model::lr")
>>> print(f"{location.filepath}:{location.line}")
"""
return self._locations

def update(self, source: PathLike | dict[str, Any] | "Config" | str) -> "Config":
"""Update configuration with changes from another source.
@@ -376,7 +390,7 @@
Operators:
- key=value - Compose (default): merge dict or extend list
- =key=value - Replace operator: completely replace value
- ~key - Remove operator: delete key (idempotent)
- ~key - Remove operator: delete key (errors if missing)
Examples:
>>> # Update from file
@@ -426,7 +440,7 @@
# Eagerly expand raw references (%) immediately after update
# This matches MONAI's behavior and allows safe pruning with delete operator (~)
# Must happen BEFORE validation so schema sees final structure, not raw ref strings
self._data = self._preprocessor.process_raw_refs(self._data, self._data, id="")
self._data = self._preprocessor.process_raw_refs(self._data, self._data, id="", locations=self._locations)

# Validate after raw ref expansion if schema exists
# This validates the final structure, not intermediate raw reference strings
@@ -436,7 +450,7 @@
validate_schema(
self._data,
self._schema,
metadata=self._metadata,
metadata=self._locations,
allow_missing=self._allow_missing,
strict=self._strict,
)
@@ -445,8 +459,9 @@

def _update_from_config(self, source: "Config") -> None:
"""Update from another Config instance."""
self._data = apply_operators(self._data, source._data)
self._metadata.merge(source._metadata)
context = MergeContext(locations=source.locations)
self._data = apply_operators(self._data, source._data, context=context)
self._locations.merge(source.locations)
self._invalidate_resolution()

def _uses_nested_paths(self, source: dict[str, Any]) -> bool:
@@ -466,17 +481,22 @@
self.set(actual_key, value)

elif key.startswith(REMOVE_KEY):
# Delete operator: ~key (idempotent)
# Delete operator: ~key
actual_key = key[1:]
_validate_delete_operator(actual_key, value)

if actual_key in self:
self._delete_nested_key(actual_key)
if actual_key not in self:
# Try to find source location for the key being deleted
source_location = self._locations.get(actual_key) if self._locations else None
available_keys: list[str] = list(self._data.keys()) if isinstance(self._data, dict) else []
raise build_missing_key_error(actual_key, available_keys, source_location)

self._delete_nested_key(actual_key)

else:
# Default: compose (merge dict or extend list)
if key in self and isinstance(self[key], dict) and isinstance(value, dict):
merged = apply_operators(self[key], value)
context = MergeContext(locations=self._locations, current_path=key)
merged = apply_operators(self[key], value, context=context)
self.set(key, merged)
elif key in self and isinstance(self[key], list) and isinstance(value, list):
self.set(key, self[key] + value)
@@ -501,7 +521,8 @@
def _apply_structural_update(self, source: dict[str, Any]) -> None:
"""Apply structural update with operators."""
validate_operators(source)
self._data = apply_operators(self._data, source)
context = MergeContext(locations=self._locations)
self._data = apply_operators(self._data, source, context=context)
self._invalidate_resolution()

def _update_from_file(self, source: PathLike) -> None:
@@ -512,12 +533,13 @@
# Check if loaded data uses :: path syntax
if self._uses_nested_paths(new_data):
# Expand nested paths using path updates
self._metadata.merge(new_metadata)
self._locations.merge(new_metadata)

self._apply_path_updates(new_data)
else:
# Normal structural update
self._data = apply_operators(self._data, new_data)
self._metadata.merge(new_metadata)
context = MergeContext(locations=new_metadata)
self._data = apply_operators(self._data, new_data, context=context)
self._locations.merge(new_metadata)

self._invalidate_resolution()

@@ -637,7 +659,7 @@
self._data = self._preprocessor.process(self._data, self._data, id="")

# Stage 2: Parse config tree to create Items
parser = Parser(globals=self._globals, metadata=self._metadata)
parser = Parser(globals=self._globals, metadata=self._locations)
items = parser.parse(self._data)

# Stage 3: Add items to resolver
6 changes: 0 additions & 6 deletions src/sparkwheel/errors/__init__.py
@@ -1,13 +1,7 @@
from .context import format_available_keys, format_resolution_chain
from .formatters import enable_colors, format_code, format_error, format_suggestion
from .suggestions import format_suggestions, get_suggestions, levenshtein_distance

__all__ = [
# Formatters
"enable_colors",
"format_error",
"format_suggestion",
"format_code",
# Suggestions
"levenshtein_distance",
"get_suggestions",