Changes from 14 commits (of 22 total)
a2b74b9
docs(schema): add migration docs
Godzilla675 Feb 5, 2026
7245fb4
feat(migration): core model and schemas
Godzilla675 Feb 5, 2026
2eb1b4d
feat(migration): builder and validation
Godzilla675 Feb 5, 2026
a94a183
feat(schema): structural derivation bindings
Godzilla675 Feb 5, 2026
14d335d
feat(migration): selector macros and syntax
Godzilla675 Feb 5, 2026
efe5765
test(migration): add migration coverage
Godzilla675 Feb 5, 2026
e24d7a2
Fix MigrationValidator: add Wrap path validation and migration-aware …
Godzilla675 Feb 5, 2026
17f403c
Apply scalafmt formatting to MigrationValidator
Godzilla675 Feb 5, 2026
620bd24
Add branch coverage tests and fix scalafmt
Godzilla675 Feb 5, 2026
f99eb13
Add 413 migration coverage tests to meet 80% branch coverage minimum
Godzilla675 Feb 6, 2026
f65031e
Fix scalafmt formatting for MigrationBuilderSyntax.scala
Godzilla675 Feb 6, 2026
89a5820
fix: scalafmt scala213 dialect, fix failing test, add coverage tests …
Godzilla675 Feb 6, 2026
8c5cee6
fix: JS compat, coercion assertions, add modifyAtPathRec and navigate…
Godzilla675 Feb 6, 2026
18908b5
fix: add targeted branch coverage tests for 3.3.x threshold
Godzilla675 Feb 6, 2026
97786d2
Address code review feedback
Godzilla675 Feb 6, 2026
94ff8e1
Revert TransformValue context change - transform must eval against fi…
Godzilla675 Feb 6, 2026
756d930
Polish migration docs: fix method names, add descriptions, improve ex…
Godzilla675 Feb 6, 2026
d603ef7
feat: add compile-time field tracking via TrackedMigrationBuilder (Sc…
Godzilla675 Feb 6, 2026
c3f3518
docs: add compile-time field tracking section to migration reference
Godzilla675 Feb 6, 2026
dbae175
refactor: remove ~30 duplicate tests across coverage spec files
Godzilla675 Feb 6, 2026
db0a83e
fix: resolve 7 bugs in migration system
Godzilla675 Feb 6, 2026
f70ecb2
fix: address PR audit - ChangeType, Join/Split validation
Godzilla675 Feb 7, 2026
3 changes: 3 additions & 0 deletions .scalafmt.conf
@@ -24,6 +24,9 @@ rewriteTokens = {
}

fileOverride {
  "glob:**/scala-2/**" {
    runner.dialect = scala213
  }
  "glob:**/scala-3/**" {
    runner.dialect = scala3
  }
222 changes: 222 additions & 0 deletions docs/reference/migration.md
@@ -0,0 +1,222 @@
# Schema Migration

Schema migration provides a pure, algebraic system for transforming data between schema versions.

## Overview

The migration system enables:
- **Type-safe migrations**: Define transformations between typed schemas
- **Dynamic migrations**: Operate on untyped `DynamicValue` for flexibility
- **Reversibility**: All migrations can be structurally reversed
- **Serialization**: Migrations are pure data that can be serialized and stored
- **Path-aware errors**: Detailed error messages with exact location information

## Core Types

### Migration[A, B]

A typed migration from schema `A` to schema `B`:

```scala
import zio.blocks.schema._
import zio.blocks.schema.migration._

// Needed when (de)serializing DynamicMigration / MigrationAction / DynamicSchemaExpr
import zio.blocks.schema.migration.MigrationSchemas._

case class PersonV1(name: String, age: Int)
case class PersonV2(fullName: String, age: Int, country: String)

object PersonV1 { implicit val schema: Schema[PersonV1] = Schema.derived }
object PersonV2 { implicit val schema: Schema[PersonV2] = Schema.derived }

val migration: Migration[PersonV1, PersonV2] =
  Migration
    .newBuilder[PersonV1, PersonV2]
    .renameField(MigrationBuilder.paths.field("name"), MigrationBuilder.paths.field("fullName"))
    .addField(MigrationBuilder.paths.field("country"), "US")
    .buildPartial
```

### DynamicMigration

An untyped, serializable migration operating on `DynamicValue`:

```scala
val dynamicMigration = migration.dynamicMigration

import zio.blocks.chunk.Chunk

// Apply to DynamicValue directly
val oldValue: DynamicValue = DynamicValue.Record(Chunk(
  "name" -> DynamicValue.Primitive(PrimitiveValue.String("John")),
  "age" -> DynamicValue.Primitive(PrimitiveValue.Int(30))
))

val newValue: Either[MigrationError, DynamicValue] = dynamicMigration(oldValue)
```

### MigrationAction

Individual migration actions are represented as an algebraic data type:

| Action | Description |
|--------|-------------|
| `AddField` | Add a new field with a default value |
| `DropField` | Remove a field |
| `RenameField` | Rename a field |
| `TransformValue` | Transform a value using an expression |
| `Mandate` | Make an optional field mandatory |
| `Optionalize` | Make a mandatory field optional |
| `ChangeType` | Convert between primitive types |
| `Join` | Combine multiple fields into one |
| `Split` | Split one field into multiple |
| `RenameCase` | Rename a case in a variant/enum |
| `TransformCase` | Transform within a specific case |
| `TransformElements` | Transform all elements in a sequence |
| `TransformKeys` | Transform all keys in a map |
| `TransformValues` | Transform all values in a map |
| `Identity` | No-op action |
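
Because every action is plain data, a migration can be inspected, serialized, and structurally reversed. The following self-contained sketch models a tiny subset of such an ADT to illustrate the idea; the names and shapes are deliberately simplified and are not the library's actual definitions, which carry paths and expressions rather than bare strings:

```scala
// Simplified sketch of migration actions as pure data (NOT the real API).
sealed trait Action
case class AddField(name: String, default: String)  extends Action
case class DropField(name: String, default: String) extends Action
case class RenameField(from: String, to: String)    extends Action

// Structural reversal: each action has a well-defined inverse. Note that
// DropField must carry a default so its reverse (AddField) can restore data.
def reverseAction(a: Action): Action = a match {
  case AddField(n, d)    => DropField(n, d)
  case DropField(n, d)   => AddField(n, d)
  case RenameField(f, t) => RenameField(t, f)
}

// A migration is then just a sequence of actions, reversed by reversing
// the order and inverting each step.
val actions  = List(RenameField("name", "fullName"), AddField("country", "US"))
val reversed = actions.reverse.map(reverseAction)
```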

### DynamicSchemaExpr

Serializable expressions for value transformations:

```scala
// Literal value
val lit = DynamicSchemaExpr.Literal(DynamicValue.Primitive(PrimitiveValue.Int(42)))

// Path extraction
val path = DynamicSchemaExpr.Path(DynamicOptic.root.field("name"))

// Arithmetic
val doubled = DynamicSchemaExpr.Arithmetic(
  path,
  DynamicSchemaExpr.Literal(DynamicValue.Primitive(PrimitiveValue.Int(2))),
  DynamicSchemaExpr.ArithmeticOperator.Multiply
)

// String operations
val concat = DynamicSchemaExpr.StringConcat(expr1, expr2)
val length = DynamicSchemaExpr.StringLength(stringExpr)

// Type coercion
val coerced = DynamicSchemaExpr.CoercePrimitive(intExpr, "String")
```

## MigrationBuilder API

The builder provides a fluent API for constructing migrations:

```scala
Migration
  .newBuilder[OldType, NewType]
  // Record operations
  .addField(path, defaultExpr)
  .dropField(path, defaultForReverse)
  .renameField(fromPath, toPath)
  .transformField(path, transform, reverseTransform)
  .mandateField(path, default)
  .optionalizeField(path)
  .changeFieldType(path, converter, reverseConverter)
  .joinFields(targetPath, sourcePaths, combiner, splitter)
  .splitField(sourcePath, targetPaths, splitter, combiner)
  // Enum operations
  .renameCaseAt(path, from, to)
  .transformCaseAt(path, caseName, nestedActions)
  // Collection operations
  .transformElementsAt(path, transform, reverseTransform)
  .transformKeysAt(path, transform, reverseTransform)
  .transformValuesAt(path, transform, reverseTransform)
  // Build
  .build        // Full validation
  .buildPartial // Skip validation
```

## Type-Safe Selector Syntax

For more ergonomic, type-safe paths, import the selector syntax extensions:

```scala
import zio.blocks.schema.migration.MigrationBuilderSyntax._

val migration: Migration[PersonV1, PersonV2] =
  Migration
    .newBuilder[PersonV1, PersonV2]
    .renameField(_.name, _.fullName)
    .addField(_.country, "US")
    .buildPartial
```

Selector syntax supports optic-like projections such as:
- `.when[T]`, `.each`, `.eachKey`, `.eachValue`, `.wrapped[T]`, `.at(i)`, `.atIndices(is*)`, `.atKey(k)`, `.atKeys(ks*)`

## Path Helpers

Use the `paths` object for constructing paths:

```scala
import MigrationBuilder.paths

paths.field("name") // Single field
paths.field("address", "street") // Nested field
paths.elements // Sequence elements
paths.mapKeys // Map keys
paths.mapValues // Map values
```

## Reversibility

All migrations can be reversed:

```scala
val forward: Migration[A, B] = ...
val backward: Migration[B, A] = forward.reverse

// Law: forward ++ backward should be identity (structurally)
```
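
The law can be illustrated with a deliberately simplified model, using plain `Map[String, Any]` records and a hypothetical `Mig` wrapper rather than the library's types:

```scala
// Hypothetical model: a migration is a pair of functions over untyped records,
// and reversal just swaps them (simplified; not the library's implementation).
type Rec = Map[String, Any]

case class Mig(forward: Rec => Rec, backward: Rec => Rec) {
  def reverse: Mig = Mig(backward, forward)
}

// A field rename and its structural inverse.
def rename(from: String, to: String): Mig = Mig(
  r => r - from + (to -> r(from)),
  r => r - to + (from -> r(to))
)

val m  = rename("name", "fullName")
val v1 = Map[String, Any]("name" -> "John", "age" -> 30)

// Applying forward then reverse recovers the original value.
val roundTripped = m.reverse.forward(m.forward(v1))
// roundTripped == v1
```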

## Composition

Migrations can be composed:

```scala
val v1ToV2: Migration[V1, V2] = ...
val v2ToV3: Migration[V2, V3] = ...

val v1ToV3: Migration[V1, V3] = v1ToV2 ++ v2ToV3
// or
val v1ToV3: Migration[V1, V3] = v1ToV2.andThen(v2ToV3)
```

## Error Handling

Migrations return `Either[MigrationError, DynamicValue]`:

```scala
migration.apply(value) match {
  case Right(newValue) => // Success
  case Left(errors) =>
    errors.errors.foreach { error =>
      println(s"At ${error.path}: ${error.message}")
    }
}
```

Error types include:
- `FieldNotFound` - A required field was not found in the source value
- `FieldAlreadyExists` - A field already exists when trying to add it
- `NotARecord` - Expected a record but found a different kind of value
- `NotAVariant` - Expected a variant but found a different kind of value
- `TypeConversionFailed` - Primitive type conversion failed
- `DefaultValueMissing` - Default value not resolved
- `PathNavigationFailed` - Cannot navigate the path
- `ActionFailed` - General action failure

## Best Practices

1. **Use `buildPartial` during development**, switch to `build` for production validation
2. **Provide meaningful reverse transforms** for `TransformValue` actions
3. **Keep migrations small and focused** - compose multiple simple migrations
4. **Test both forward and reverse** directions
5. **Store migrations alongside schema versions** for reproducibility
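
As a sketch of the last point, one simple registry shape keeps one migration step per version and upgrades any older document by folding over the chain (a hypothetical helper on plain maps, not a library API):

```scala
// Hypothetical version registry: steps(n) migrates a document from
// version n to version n + 1 (plain maps stand in for DynamicValue).
type Doc = Map[String, Any]

val steps: Map[Int, Doc => Doc] = Map(
  1 -> ((d: Doc) => d - "name" + ("fullName" -> d("name"))), // v1 -> v2: rename
  2 -> ((d: Doc) => d + ("country" -> "US"))                 // v2 -> v3: add field
)

// Upgrade by composing each step from the stored version to the current one.
def upgrade(doc: Doc, from: Int, to: Int): Doc =
  (from until to).foldLeft(doc)((d, v) => steps(v)(d))

val v1 = Map[String, Any]("name" -> "John", "age" -> 30)
val v3 = upgrade(v1, 1, 3)
// v3 == Map("age" -> 30, "fullName" -> "John", "country" -> "US")
```

Storing the steps as serialized `DynamicMigration` values alongside each schema version makes this reproducible across deployments.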
7 changes: 4 additions & 3 deletions docs/reference/schema.md
@@ -221,9 +221,10 @@ Having the schema for `DynamicValue` allows seamless encoding/decoding between `

```scala
import zio.blocks.schema._
import zio.blocks.chunk.Chunk

// Records have unquoted keys
val record = DynamicValue.Record(Vector(
val record = DynamicValue.Record(Chunk(
"name" -> DynamicValue.Primitive(PrimitiveValue.String("Alice")),
"age" -> DynamicValue.Primitive(PrimitiveValue.Int(30))
))
@@ -234,7 +235,7 @@ println(record)
// }

// Maps have quoted string keys
val map = DynamicValue.Map(Vector(
val map = DynamicValue.Map(Chunk(
DynamicValue.Primitive(PrimitiveValue.String("key")) ->
DynamicValue.Primitive(PrimitiveValue.String("value"))
))
@@ -244,7 +245,7 @@ println(map)
// }

// Variants use @ metadata
val variant = DynamicValue.Variant("Some", DynamicValue.Record(Vector(
val variant = DynamicValue.Variant("Some", DynamicValue.Record(Chunk(
"value" -> DynamicValue.Primitive(PrimitiveValue.Int(42))
)))
println(variant)
6 changes: 4 additions & 2 deletions docs/reference/validation.md
@@ -320,8 +320,10 @@ object Person {
// Create a DynamicSchema for validation
val dynamicSchema: DynamicSchema = Schema[Person].toDynamicSchema

import zio.blocks.chunk.Chunk

// Create a DynamicValue to validate
val value = DynamicValue.Record(Vector(
val value = DynamicValue.Record(Chunk(
"name" -> DynamicValue.Primitive(PrimitiveValue.String("Alice")),
"age" -> DynamicValue.Primitive(PrimitiveValue.Int(30))
))
@@ -345,7 +347,7 @@ val dynamicSchema: DynamicSchema = Schema[Person].toDynamicSchema
val validatingSchema: Schema[DynamicValue] = dynamicSchema.toSchema

// Now any decoding through this schema will validate structure
val invalidValue = DynamicValue.Record(Vector(
val invalidValue = DynamicValue.Record(Chunk(
"name" -> DynamicValue.Primitive(PrimitiveValue.Int(42)) // wrong type!
))

@@ -12,6 +12,8 @@ import scala.collection.immutable.ArraySeq
import scala.util.Try

object ToonTestUtils {
  private def normalizeNewlines(s: String): String =
    s.replace("\r\n", "\n").replace("\r", "\n")
  def roundTrip[A](value: A, expectedToon: String)(implicit schema: Schema[A]): TestResult =
    roundTrip(value, expectedToon, getOrDeriveCodec(schema))

@@ -47,7 +49,7 @@ object ToonTestUtils {
val encodedBySchema3 = output.toByteArray
val encodedBySchema4 = codec.encode(value, writerConfig)
val encodedBySchema5 = codec.encodeToString(value, writerConfig).getBytes(UTF_8)
assert(new String(encodedBySchema1, UTF_8))(equalTo(expectedToon)) &&
assert(normalizeNewlines(new String(encodedBySchema1, UTF_8)))(equalTo(normalizeNewlines(expectedToon))) &&
assert(ArraySeq.unsafeWrapArray(encodedBySchema1))(equalTo(ArraySeq.unsafeWrapArray(encodedBySchema2))) &&
assert(ArraySeq.unsafeWrapArray(encodedBySchema1))(equalTo(ArraySeq.unsafeWrapArray(encodedBySchema3))) &&
assert(ArraySeq.unsafeWrapArray(encodedBySchema1))(equalTo(ArraySeq.unsafeWrapArray(encodedBySchema4))) &&
@@ -110,7 +112,7 @@ object ToonTestUtils {
): TestResult = {
val codec = ToonBinaryCodec.dynamicValueCodec
val result = codec.encodeToString(value, writerConfig)
assert(result)(equalTo(expectedToon))
assert(normalizeNewlines(result))(equalTo(normalizeNewlines(expectedToon)))
}

def decodeError[A](invalidToon: String, error: String)(implicit schema: Schema[A]): TestResult =
@@ -181,7 +183,7 @@ object ToonTestUtils {
val encodedBySchema3 = output.toByteArray
val encodedBySchema4 = codec.encode(value, writerConfig)
val encodedBySchema5 = codec.encodeToString(value, writerConfig).getBytes(UTF_8)
assert(new String(encodedBySchema1, UTF_8))(equalTo(expectedToon)) &&
assert(normalizeNewlines(new String(encodedBySchema1, UTF_8)))(equalTo(normalizeNewlines(expectedToon))) &&
assert(ArraySeq.unsafeWrapArray(encodedBySchema1))(equalTo(ArraySeq.unsafeWrapArray(encodedBySchema2))) &&
assert(ArraySeq.unsafeWrapArray(encodedBySchema1))(equalTo(ArraySeq.unsafeWrapArray(encodedBySchema3))) &&
assert(ArraySeq.unsafeWrapArray(encodedBySchema1))(equalTo(ArraySeq.unsafeWrapArray(encodedBySchema4))) &&
@@ -11,6 +11,17 @@ import scala.reflect.NameTransformer

trait SchemaCompanionVersionSpecific {
def derived[A]: Schema[A] = macro SchemaCompanionVersionSpecific.derived[A]

  /**
   * Derive a schema for a structural type. This is only supported in Scala 3
   * due to the requirement for:
   *   - Refinement types (structural records):
   *     `{ val name: String; val age: Int }`
   *   - Union types (structural enums): `Type1 | Type2`
   *
   * In Scala 2, use regular case classes with `Schema.derived[A]` instead.
   */
  def structural[A]: Schema[A] = macro SchemaCompanionVersionSpecific.structural[A]
}

private object SchemaCompanionVersionSpecific {
@@ -936,4 +947,11 @@
// c.info(c.enclosingPosition, s"Generated schema:\n${showCode(schemaBlock)}", force = true)
c.Expr[Schema[A]](schemaBlock)
}

  def structural[A: c.WeakTypeTag](c: blackbox.Context): c.Expr[Schema[A]] =
    CommonMacroOps.fail(c)(
      "Schema.structural is only supported in Scala 3. " +
        "Structural types (refinement types and union types) are a Scala 3 feature. " +
        "In Scala 2, please use regular case classes with Schema.derived[A] instead."
    )
}