diff --git a/.claude/skills/docs-data-type-ref/SKILL.md b/.claude/skills/docs-data-type-ref/SKILL.md index 7730cdc6e9..7ddbe11189 100644 --- a/.claude/skills/docs-data-type-ref/SKILL.md +++ b/.claude/skills/docs-data-type-ref/SKILL.md @@ -243,6 +243,56 @@ Rules for this section: - The bolded description must be a short plain-English description of what that specific `App` demonstrates — not the object name rephrased. - Keep the two numbered steps (clone, run individually) in that order; do not add or remove steps. - If no example `App` objects were written (rare), omit this section entirely. +- When the full example source is also **embedded earlier in the document** via `SourceFile.print`, + the `([source](...))` link in this section serves as a convenient shortcut to the GitHub file; + there is no need to embed the source again here. + +### Embedding Example Files with `SourceFile` + +When the documentation needs to show a **full example file** from the `schema-examples` project +(written in Step 3), **do not copy-paste the code inline**. Instead, use `mdoc:passthrough` with +the `SourceFile.print` helper to include it by reference. This keeps the doc and the example in +sync — any change to the example file automatically appears in the rendered docs on the next +mdoc build. + +Use this pattern: + +````markdown +```scala mdoc:passthrough +import docs.SourceFile + +SourceFile.print("schema-examples/src/main/scala//.scala") +``` +```` + +**Important:** Import as `import docs.SourceFile` and call `SourceFile.print(...)` — do NOT use +`import docs.SourceFile._` with bare `print(...)` because `print` conflicts with `Predef.print` +inside mdoc sessions. + +`SourceFile.print(path)` reads the file at mdoc compile time and emits a fenced code block with +the file path shown as the title. The path is relative to the repository root (the helper tries +`../` first, then ``). 
+ +**When to use `SourceFile.print`:** +- Showing a complete, runnable `App` example from `schema-examples/` +- Showing a large, self-contained example that would be unwieldy to maintain in two places + +**When NOT to use it — use regular mdoc blocks instead:** +- Short inline snippets (< 20 lines) that illustrate a single method or concept +- Code that needs `mdoc` evaluated output (e.g., `// res0: Int = 42`) +- Code that is documentation-specific and doesn't exist as a standalone file + +**Optional parameters:** +- `lines = Seq((from, to))` — include only specific line ranges (1-indexed): + ````markdown + ```scala mdoc:passthrough + import docs.SourceFile + + SourceFile.print("schema-examples/src/main/scala/into/IntoNumericExample.scala", lines = Seq((10, 25))) + ``` + ```` +- `showLineNumbers = true` — render with line numbers in the output +- `showTitle = false` — suppress the file path title ### Compile-Checked Code Blocks with mdoc @@ -335,7 +385,31 @@ object IntoSchemaEvolutionExample extends App { } ``` -## Step 4: Integrate +## Step 4: Lint Check (Mandatory Before Integration) + +After creating all example files, stage them in git first, then ensure all Scala files pass the CI formatting gate: + +```bash +git add schema-examples/src/main/scala/**/*.scala +sbt fmtChanged +``` + +If any files were reformatted, commit the changes immediately: + +```bash +git add -A +git commit -m "docs(): apply scalafmt to examples" +``` + +Verify the CI lint gate locally: + +```bash +sbt check +``` + +**Success criterion:** zero formatting violations reported. + +## Step 5: Integrate See the **`docs-integrate`** skill for the complete integration checklist (sidebars.js, index.md, cross-references, link verification). @@ -343,7 +417,7 @@ cross-references, link verification). Additional note for reference pages: if creating a new file, place it in the appropriate `docs/reference/` subdirectory based on where it logically belongs. 
-## Step 5: Review and Verify Compilation
+## Step 6: Review and Verify Compilation
 
 After writing, re-read the document and verify:
 - All method signatures match the actual source code
diff --git a/.claude/skills/docs-document-pr/SKILL.md b/.claude/skills/docs-document-pr/SKILL.md
index 05538c47aa..ba6a7c0ac7 100644
--- a/.claude/skills/docs-document-pr/SKILL.md
+++ b/.claude/skills/docs-document-pr/SKILL.md
@@ -272,6 +272,34 @@ Once documentation is written, tell the user:
 
 ---
 
+## Phase 6: Verify Lint (If Examples Created)
+
+If documentation involved creating or modifying `.scala` example files in `schema-examples/`, stage them in git first, then verify that all Scala code passes the CI formatting gate before reporting completion:
+
+```bash
+git add schema-examples/src/main/scala/**/*.scala
+sbt fmtChanged
+```
+
+If any files were reformatted, commit the changes:
+
+```bash
+git add -A
+git commit -m "docs(): apply scalafmt to examples"
+```
+
+Then verify the CI lint gate locally:
+
+```bash
+sbt check
+```
+
+**Success criterion:** zero formatting violations reported.
+
+**If no `.scala` files were created or modified**, skip this phase.
+
+---
+
 ## Implementation Checklist
 
 When you invoke this skill:
@@ -285,6 +313,7 @@ When you invoke this skill:
 - [ ] **Phase 3c:** If subsection → manually edit existing page, consult `docs-writing-style` and `docs-mdoc-conventions` skills
 - [ ] **Phase 4:** If new page → invoke `docs-integrate` skill to update sidebar
 - [ ] **Phase 5:** Report findings and file paths to user
+- [ ] **Phase 6:** If `.scala` examples were created, run `sbt fmtChanged` and `sbt check` to verify lint compliance
 
 ---
 
diff --git a/.claude/skills/docs-how-to-guide/SKILL.md b/.claude/skills/docs-how-to-guide/SKILL.md
index a852a48dc9..774051dd2f 100644
--- a/.claude/skills/docs-how-to-guide/SKILL.md
+++ b/.claude/skills/docs-how-to-guide/SKILL.md
@@ -420,6 +420,30 @@ sbt "schema-examples/compile"
 
 If any example fails to compile, fix it before proceeding.
 The examples must compile successfully.
 
+### 4f. Lint Check (Mandatory Before Integration)
+
+After all examples compile, stage them in git first, then run Scalafmt to ensure all Scala files pass the CI formatting gate:
+
+```bash
+git add schema-examples/src/main/scala/**/*.scala
+sbt fmtChanged
+```
+
+If any files were reformatted, commit the changes immediately:
+
+```bash
+git add -A
+git commit -m "docs(): apply scalafmt to examples"
+```
+
+Verify the CI lint gate locally:
+
+```bash
+sbt check
+```
+
+**Success criterion:** zero formatting violations reported.
+
 ---
 
 ## Step 5: Integrate
diff --git a/.claude/skills/docs-mdoc-conventions/SKILL.md b/.claude/skills/docs-mdoc-conventions/SKILL.md
index e30983b53b..bef1cd2a6d 100644
--- a/.claude/skills/docs-mdoc-conventions/SKILL.md
+++ b/.claude/skills/docs-mdoc-conventions/SKILL.md
@@ -106,6 +106,59 @@ modifiers are used more than in reference pages:
 
 ---
 
+## Tabbed Scala 2 / Scala 3 Examples
+
+When a section shows syntax that differs between Scala 2 and Scala 3, use Docusaurus tabs
+instead of sequential prose blocks. This lets readers pick their version once and have all
+tab groups on the page sync together.
+
+### Required MDX imports
+
+Add these two lines at the top of any `.md` file that uses tabs (right after the closing
+`---` of the frontmatter, before any prose):
+
+```mdx
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
+```
+
+### Tab structure
+
+````mdx
+<Tabs groupId="scala-version" defaultValue="scala2">
+<TabItem value="scala2" label="Scala 2">
+
+```scala mdoc:compile-only
+// Scala 2 syntax here
+```
+
+</TabItem>
+<TabItem value="scala3" label="Scala 3">
+
+```scala mdoc:compile-only
+// Scala 3 syntax here
+```
+
+</TabItem>
+</Tabs>
+````
+
+### Rules
+
+- Always use `groupId="scala-version"` — this syncs all tab groups on the page when the
+  reader picks a version.
+- Always use `defaultValue="scala2"` — Scala 2 is shown first by default.
+- Blank lines inside `<TabItem>` are required for mdoc to process fenced code blocks
+  correctly.
+- `mdoc:compile-only` is the correct modifier for code inside tabs (same as everywhere + else). +- mdoc passes JSX components through unchanged — only fenced `scala mdoc:*` blocks are + rewritten. +- Do **not** use tabs for examples that are identical in both versions — only use them + when the syntax genuinely differs. + +--- + ## Docusaurus Admonitions Use Docusaurus admonition syntax for callouts: diff --git a/.claude/skills/docs-writing-style/SKILL.md b/.claude/skills/docs-writing-style/SKILL.md index 7c81fdcc56..6b0eb82979 100644 --- a/.claude/skills/docs-writing-style/SKILL.md +++ b/.claude/skills/docs-writing-style/SKILL.md @@ -82,7 +82,12 @@ Apply these conventions consistently in all prose, section headings, and inline ## Scala Version -All code in documentation and companion example files **must use Scala 2.13.x syntax**. When in -doubt, check the companion example files — they are the source of truth for syntax style. +All code in documentation and companion example files **defaults to Scala 2.13.x syntax**. +When in doubt, check the companion example files — they are the source of truth for syntax style. + +When a section shows syntax that genuinely differs between Scala 2 and Scala 3 (e.g., `using` +vs `implicit`, native union types vs backtick infix), use tabbed code blocks instead of +sequential prose. See `docs-mdoc-conventions` for the exact tab structure. Scala 2 is always +the default tab (`defaultValue="scala2"`). 
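
A short illustration can make the rule concrete. The following fragment is a sketch, not part of the diff: the `describe` signature is hypothetical, and the tab markup follows the structure `docs-mdoc-conventions` prescribes (`groupId="scala-version"`, Scala 2 as the default tab):

````mdx
<Tabs groupId="scala-version" defaultValue="scala2">
<TabItem value="scala2" label="Scala 2">

```scala mdoc:compile-only
// Scala 2 spelling: implicit parameter
def describe[A](value: A)(implicit ord: Ordering[A]): String = value.toString
```

</TabItem>
<TabItem value="scala3" label="Scala 3">

```scala mdoc:compile-only
// Scala 3 spelling: using clause
def describe[A](value: A)(using ord: Ordering[A]): String = value.toString
```

</TabItem>
</Tabs>
````

The two blocks are semantically identical; only the parameter-clause keyword differs, which is exactly the kind of divergence that warrants tabs.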
--- diff --git a/build.sbt b/build.sbt index eb1c13e89b..7dd63f3f81 100644 --- a/build.sbt +++ b/build.sbt @@ -31,6 +31,10 @@ com.github.sbt.git.SbtGit.useReadableConsoleGit addCommandAlias("build", "; fmt; coverage; root/test; coverageReport") addCommandAlias("fmt", "all root/scalafmtSbt root/scalafmtAll") addCommandAlias("fmtCheck", "all root/scalafmtSbtCheck root/scalafmtCheckAll") +addCommandAlias( + "fmtChanged", + "; set scalafmtFilter in ThisBuild := \"diff-ref=main\"; scalafmtAll; set scalafmtFilter in ThisBuild := \"\"" +) addCommandAlias("check", "; scalafmtSbtCheck; scalafmtCheckAll") addCommandAlias("mimaChecks", "all schemaJVM/mimaReportBinaryIssues") addCommandAlias( diff --git a/docs/index.md b/docs/index.md index 2ddc490bbe..d21bb11e2d 100644 --- a/docs/index.md +++ b/docs/index.md @@ -536,6 +536,7 @@ ZIO Blocks supports **Scala 2.13** and **Scala 3.x** with full source compatibil ### Core Schema Concepts - [Schema](./reference/schema.md) - Core schema definitions and derivation +- [Allows](./reference/allows.md) - Compile-time structural grammar constraints - [Reflect](./reference/reflect.md) - Structural reflection API - [Binding](./reference/binding.md) - Runtime constructors and deconstructors - [BindingResolver](./reference/binding-resolver.md) - Binding lookup and schema rebinding diff --git a/docs/reference/allows.md b/docs/reference/allows.md index 341ddc45e8..e2a408e510 100644 --- a/docs/reference/allows.md +++ b/docs/reference/allows.md @@ -3,34 +3,90 @@ id: allows title: "Allows" --- -`Allows[A, S]` is a compile-time capability token that proves, at the call site, that type `A` satisfies the structural grammar `S`. +import Tabs from '@theme/Tabs'; +import TabItem from '@theme/TabItem'; + +`Allows[A, S]` is a compile-time capability token that proves, at the call site, that type `A` satisfies the structural grammar `S`. 
A capability token is a compile-time phantom proof value — it carries no runtime data and exists solely to pass evidence through the type system that a structural constraint has been satisfied. `Allows` does **not** require or use `Schema[A]`. It inspects the Scala type structure of `A` directly at compile time, using nothing but the Scala type system. Any `Schema[A]` that appears alongside `Allows` in examples is the library author's own separate constraint — it is not imposed by `Allows` itself. -## Motivation +```scala +sealed abstract class Allows[A, S <: Allows.Structural] +``` -ZIO Blocks (ZIO Schema 2) gives library authors a powerful way to build data-oriented DSLs. A library can accept `A: Schema` and use the schema at runtime to serialize, deserialize, query, or transform values of `A`. But `Allows` is useful even without a Schema — it can enforce structural preconditions on *any* generic function. +## Overview -The gap is **structural preconditions**. Many generic functions only make sense for a subset of types: +The gap `Allows` fills is **structural preconditions** at the call site, at compile time, with precise error messages. Structural preconditions are constraints on the shape of a type's fields (e.g., "all fields must be scalars"), unlike runtime checks which happen during execution and produce exceptions or errors. -- A CSV serializer requires flat records of scalars. -- An RDBMS layer cannot handle nested records as column values. -- An event bus expects a sealed trait of flat record cases. -- A JSON document store allows arbitrarily nested records but not `DynamicValue` leaves. +## Motivation -Today, these constraints can only be checked at runtime, producing confusing errors deep inside library internals. +ZIO Blocks gives library authors a powerful way to build data-oriented DSLs. A library can accept `A: Schema` and use the schema at runtime to serialize, deserialize, query, or transform values of `A`. 
A data-oriented DSL is a generic API built around a data description (Schema) rather than a fixed interface, allowing one function to serialize, validate, or transform any conforming type. Many generic functions have **structural preconditions** that don't require a schema. -`Allows[A, S]` closes this gap: the constraint is verified at the **call site**, at compile time, with precise, path-aware error messages and concrete fix suggestions. +Consider these real-world scenarios: + +- A CSV serializer requires flat records of scalars — nested records should fail at the call site, not deep inside the serializer. +- An RDBMS layer cannot handle nested records as column values — the error should name the problematic field. +- An event bus expects a sealed trait of flat record cases — violations should be caught before publishing. +- A JSON document store allows arbitrarily nested records but not `DynamicValue` leaves — the schema validation should be precise. DynamicValue is the schema-less escape hatch that can hold arbitrary data — a DynamicValue leaf bypasses compile-time checking entirely, making it impossible for the compiler to enforce any structural grammar. + +Without `Allows`, these constraints can only be checked at runtime, producing confusing errors deep inside library internals. With `Allows[A, S]`, the constraint is verified at the **call site**, at compile time, with precise, path-aware error messages and concrete fix suggestions. ## The Upper Bound Semantics -`Allows[A, S]` is an upper bound. A type `A` that uses only a strict subset of what `S` permits also satisfies it — just as `A <: Foo` does not require that `A` uses every method of `Foo`. +`Allows[A, S]` is an upper bound. A type `A` that uses only a strict subset of what `S` permits also satisfies it — just as `A <: Foo` does not require that `A` uses every method of `Foo`. 
+Upper bound semantics is the right choice because a lower bound would require using every
+shape (impractical), exact matching would require naming every shape used (too rigid),
+whereas an upper bound says "your type may use any of these shapes" — a permission, not a
+mandate.
 
-```scala
-// Allows[UserRow, Record[Primitive | Optional[Primitive]]] is satisfied even if
-// UserRow has no Option fields — the Optional branch is simply never needed.
+```scala mdoc:compile-only
+import zio.blocks.schema.comptime.Allows
+import Allows._
+
+// Both satisfy Record[Primitive | Optional[Primitive]] — the upper bound
+
+case class UserRow(name: String, age: Int)
+// UserRow satisfies the grammar: all fields are Primitive
+
+case class UserRowOpt(name: String, age: Int, email: Option[String])
+// UserRowOpt also satisfies the grammar: all fields are Primitive or Optional[Primitive]
+
+val ev1: Allows[UserRow, Record[Primitive | Optional[Primitive]]] = implicitly
+val ev2: Allows[UserRowOpt, Record[Primitive | Optional[Primitive]]] = implicitly
 ```
+
+## Creating Instances
+
+`Allows[A, S]` is not instantiated directly. Instead, you summon an evidence value at the point where you need the constraint. The macro automatically verifies the constraint at compile time.
+
+<Tabs groupId="scala-version" defaultValue="scala2">
+<TabItem value="scala2" label="Scala 2">
+
+```scala mdoc:compile-only
+import zio.blocks.schema.comptime.Allows
+import Allows._
+
+def toJson[A](doc: A)(implicit ev: Allows[A, Record[Primitive]]): String = ???
+
+// Or summon at the call site:
+val evidence = implicitly[Allows[Int, Primitive]]
+```
+
+</TabItem>
+<TabItem value="scala3" label="Scala 3">
+
+```scala mdoc:compile-only
+import zio.blocks.schema.comptime.Allows
+import Allows._
+
+def toJson[A](doc: A)(using Allows[A, Record[Primitive]]): String = ???
+
+// Calling the function:
+case class Person(name: String, age: Int)
+val json = toJson(Person("Alice", 30)) // Compiles if Person satisfies Record[Primitive]
+```
+
+</TabItem>
+</Tabs>
+
+The constraint is checked once, at the call site.
If the type `A` does not satisfy `S`, you get a compile-time error with a precise message showing exactly which field violates the grammar. + ## Grammar Nodes All grammar nodes extend `Allows.Structural`. @@ -53,22 +109,29 @@ All grammar nodes extend `Allows.Structural`. | `Primitive.UUID` | `java.util.UUID` only | | `Primitive.Currency` | `java.util.Currency` only | | `Primitive.Instant` / `LocalDate` / `LocalDateTime` / … | Each specific `java.time.*` type | +| | | | `Record[A]` | A case class / product type whose every field satisfies `A`. Vacuously true for zero-field records. Sealed traits and enums are **automatically unwrapped**: each case is checked individually, so no `Variant` node is needed. | | `Sequence[A]` | Any collection (`List`, `Vector`, `Set`, `Array`, `Chunk`, …) whose element type satisfies `A` | -| `Sequence.List[A]` | `scala.collection.immutable.List` only, element type satisfies `A` | -| `Sequence.Vector[A]` | `scala.collection.immutable.Vector` only, element type satisfies `A` | -| `Sequence.Set[A]` | `scala.collection.immutable.Set` only, element type satisfies `A` | -| `Sequence.Array[A]` | `scala.Array` only, element type satisfies `A` | -| `Sequence.Chunk[A]` | `zio.blocks.chunk.Chunk` only, element type satisfies `A` | -| `IsType[A]` | Exact nominal type match: satisfied only when the checked type is exactly `A` (`=:=`) | | `Map[K, V]` | `Map`, `HashMap`, … whose key satisfies `K` and value satisfies `V` | | `Optional[A]` | `Option[X]` where the inner type `X` satisfies `A` | | `Wrapped[A]` | A ZIO Prelude `Newtype`/`Subtype` wrapper whose underlying type satisfies `A` | -| `Dynamic` | `DynamicValue` — the schema-less escape hatch | +| | | | `Self` | Recursive self-reference back to the entire enclosing `Allows[A, S]` grammar | +| `Dynamic` | `DynamicValue` — the schema-less escape hatch | +| `IsType[A]` | Exact nominal type match: satisfied only when the checked type is exactly `A` (`=:=`) | | `` `\|` `` | Union of two grammar nodes: 
`A \| B`. In Scala 2 write `` A `\|` B `` in infix position. | -Every specific `Primitive.Xxx` node also satisfies the catch-all `Primitive`. This means a type annotated with `Primitive.Int` is valid wherever `Primitive` or `Primitive | Primitive.Long` is required. +Every specific `Primitive.Xxx` node also satisfies the top-level `Primitive` node (which matches any of the 30 primitive types). This means a type annotated with `Primitive.Int` is valid wherever `Primitive` or `Primitive | Primitive.Long` is required. + +## Core Operations + +`Allows[A, S]` is a **proof token**, not an ordinary value. It carries zero public methods that you call directly. Instead, you use it in three ways: + +1. **As a constraint in function signatures** — Declare `Allows[A, S]` as an implicit/using parameter to require that callers pass only types satisfying the grammar. +2. **To summon evidence** — Use `implicitly[Allows[A, S]]` (Scala 2) or `summon[Allows[A, S]]` (Scala 3) at a call site to check the constraint and get an error message if it fails. +3. **In type aliases** — Define type aliases like `type FlatRecord = Allows[_, Record[Primitive | Optional[Primitive]]]` to name constraints and reuse them across functions. + +The macro that powers `Allows` checks the constraint **at compile time** and emits nothing but a reference to a single private singleton at runtime, so there is zero per-call-site overhead. ## Specific Primitives @@ -124,7 +187,7 @@ def toJson[A](doc: A)(using Allows[A, Json]): String = ??? `Self` recurses back to `Json` at every nested position, so `List[String]` satisfies `Sequence[JsonPrimitive | Self]` (String is JsonPrimitive), `List[Author]` satisfies it too (Author satisfies `Record[JsonPrimitive | Self]` via Self), and top-level arrays work directly. 
-A type with a UUID or Instant field fails at compile time:
+A type with a UUID or Instant field fails at compile time with this error:
 
 ```
 [error] Schema shape violation at WithUUID.id: found Primitive(java.util.UUID),
@@ -136,28 +199,37 @@ A type with a UUID or Instant field fails at compile time:
 ## Union Types in the Grammar
 
 Union types express "or" in the grammar.
 
-**Scala 3** uses native union type syntax:
+<Tabs groupId="scala-version" defaultValue="scala2">
+<TabItem value="scala2" label="Scala 2">
+
+Uses the infix operator `` Primitive `|` Optional[Primitive] `` from `Allows`:
 
 ```scala mdoc:compile-only
 import zio.blocks.schema.comptime.Allows
 import Allows._
 
-def writeCsv[A](rows: Seq[A])(using
-  Allows[A, Record[Primitive | Optional[Primitive]]]
+def writeCsv[A](rows: Seq[A])(implicit
+  ev: Allows[A, Record[Primitive | Optional[Primitive]]]
 ): Unit = ???
 ```
 
-**Scala 2** uses the `` `\|` `` infix operator from `Allows`:
+</TabItem>
+<TabItem value="scala3" label="Scala 3">
 
-```scala
+Uses native union type syntax:
+
+```scala mdoc:compile-only
 import zio.blocks.schema.comptime.Allows
 import Allows._
 
-def writeCsv[A](rows: Seq[A])(implicit
-  ev: Allows[A, Record[Primitive | Optional[Primitive]]]
+def writeCsv[A](rows: Seq[A])(using
+  Allows[A, Record[Primitive | Optional[Primitive]]]
 ): Unit = ???
 ```
 
+</TabItem>
+</Tabs>
+
 Both spellings compile and produce the same semantic behavior. The grammar is identical — the only difference is how the union type is expressed.
 
 ## Use Cases
@@ -180,7 +252,7 @@ def insert[A: Schema](value: A)(using
 ): String = ???
 ```
 
-If a user passes a type with nested records, they get a precise compile-time error:
+If a user passes a type with nested records, they get a precise compile-time error like this:
 
 ```
 [error] Schema shape violation at UserWithAddress.address: found Record(Address),
@@ -202,7 +274,7 @@ def publish[A: Schema](event: A)(using
 ): Unit = ???
``` -If a case of the sealed trait has a nested record field, the error names that case and field: +If a case of the sealed trait has a nested record field, the error names that case and field like this: ``` [error] Schema shape violation at DomainEvent.OrderPlaced.items.: @@ -221,7 +293,7 @@ import Allows._ type JsonDocument = Record[Primitive | Self | Optional[Primitive | Self] | Sequence[Primitive | Self] | Allows.Map[Primitive, Primitive | Self]] -def toJson[A: Schema](doc: A)(implicit ev: Allows[A, JsonDocument]): String = ??? +def toJson[A: Schema](doc: A)(using Allows[A, JsonDocument]): String = ??? ``` This grammar allows: @@ -354,7 +426,7 @@ val ev2: Allows[List[String], Allows.Sequence[Allows.IsType[String]]] = implicit **Non-recursive types** satisfy `Self`-containing grammars without issue: if no field ever recurses back to the root type, the `Self` position is never reached, and the constraint is vacuously satisfied. -**Mutual recursion** between two or more distinct types is a compile-time error: +**Mutual recursion** between two or more distinct types is a compile-time error reported as: ``` [error] Mutually recursive types are not supported by Allows. @@ -363,11 +435,15 @@ val ev2: Allows[List[String], Allows.Sequence[Allows.IsType[String]]] = implicit ## `Wrapped[A]` and Newtypes -The `Wrapped[A]` node matches ZIO Prelude `Newtype` and `Subtype` wrappers. The underlying type must satisfy `A`. +The `Wrapped[A]` node matches ZIO Prelude `Newtype` and `Subtype` wrappers. The underlying type must satisfy `A`. 
Here's an example: -```scala -// ZIO Prelude Newtype pattern: +```scala mdoc:compile-only import zio.prelude.Newtype +import zio.blocks.schema.Schema +import zio.blocks.schema.comptime.Allows +import Allows._ + +// ZIO Prelude Newtype pattern: object ProductCode extends Newtype[String] type ProductCode = ProductCode.Type @@ -378,9 +454,9 @@ given Schema[ProductCode] = val ev: Allows[ProductCode, Wrapped[Primitive]] = implicitly ``` -**Scala 3 opaque types** are resolved to their underlying type by the macro (they are transparent), so `opaque type UserId = UUID` satisfies `Primitive` (not `Wrapped[Primitive]`): +**Scala 3 opaque types** are resolved to their underlying type by the macro (they are transparent), so an opaque alias like this satisfies `Primitive` directly: -```scala +```scala mdoc:compile-only opaque type UserId = java.util.UUID // UserId satisfies Allows[UserId, Primitive] — resolved to UUID (a primitive) ``` @@ -415,9 +491,7 @@ When a type does not satisfy the grammar, the macro reports: 3. **What was required**: `Primitive | Sequence[Primitive]` 4. **A hint** where applicable -Multiple violations are reported in a single compilation pass — the user sees all problems at once. - -Example: +Multiple violations are reported in a single compilation pass — the user sees all problems at once, for example: ``` [error] Schema shape violation at UserWithAddress.address: found Record(Address), @@ -455,3 +529,124 @@ val ev: Allows[EmptyEvent.type, Record[Primitive]] = implicitly // vacuously tr | Derivation keyword | `Schema.derived` implicit | `Schema.derived` or `derives Schema` | Both Scala versions produce the same macro behavior and the same error messages. + +## Integration with Schema + +`Allows` and `Schema` are complementary but independent: + +- **`Schema[A]`** describes what an `A` looks like at runtime — how to serialize, deserialize, introspect, or transform it. It requires explicit derivation and handles the full type signature. 
+- **`Allows[A, S]`** describes what an `A` *may* look like at compile time — a structural grammar that `A` must satisfy. It requires no schema and uses only the Scala type system. + +You can use `Allows` **without** `Schema`: + +```scala mdoc:compile-only +import zio.blocks.schema.comptime.Allows +import Allows._ + +// Pure shape constraint, no Schema required +def writeCsv[A](rows: Seq[A])(using Allows[A, Record[Primitive | Optional[Primitive]]]): Unit = ??? +``` + +Or combine them when runtime encoding **and** shape validation are both needed: + +```scala mdoc:compile-only +import zio.blocks.schema.Schema +import zio.blocks.schema.comptime.Allows +import Allows._ + +// Shape constraint + runtime encoding +def writeCsv[A: Schema](rows: Seq[A])(using + Allows[A, Record[Primitive | Optional[Primitive]]] +): Unit = ??? +``` + +When combined, `Allows` enforces the structural guarantee that `Schema` can use — for example, a CSV serializer can assume that every field is a primitive or optional primitive and skip defensive type checks. + +See [Schema](./schema.md) for more on runtime encoding and decoding with schemas. + +## Running the Examples + +All code from this guide is available as runnable examples in the `schema-examples` module. + +**1. Clone the repository and navigate to the project:** + +```bash +git clone https://github.com/zio/zio-blocks.git +cd zio-blocks +``` + +**2. 
Run individual examples with sbt:** + +**CSV serializer with flat record compile-time constraints** +([source](https://github.com/zio/zio-blocks/blob/main/schema-examples/src/main/scala/comptime/AllowsCsvExample.scala)) + +```bash +sbt "schema-examples/runMain comptime.AllowsCsvExample" +``` + +```scala mdoc:passthrough +import docs.SourceFile + +SourceFile.print("schema-examples/src/main/scala/comptime/AllowsCsvExample.scala") +``` + +**Event bus with sealed trait auto-unwrap and nested hierarchies** +([source](https://github.com/zio/zio-blocks/blob/main/schema-examples/src/main/scala/comptime/AllowsEventBusExample.scala)) + +```bash +sbt "schema-examples/runMain comptime.AllowsEventBusExample" +``` + +```scala mdoc:passthrough +import docs.SourceFile + +SourceFile.print("schema-examples/src/main/scala/comptime/AllowsEventBusExample.scala") +``` + +**GraphQL / tree structures using Self for recursive grammars** +([source](https://github.com/zio/zio-blocks/blob/main/schema-examples/src/main/scala/comptime/AllowsGraphQLTreeExample.scala)) + +```bash +sbt "schema-examples/runMain comptime.AllowsGraphQLTreeExample" +``` + +```scala mdoc:passthrough +import docs.SourceFile + +SourceFile.print("schema-examples/src/main/scala/comptime/AllowsGraphQLTreeExample.scala") +``` + +**Sealed trait auto-unwrap with nested hierarchies and case objects** +([source](https://github.com/zio/zio-blocks/blob/main/schema-examples/src/main/scala/comptime/AllowsSealedTraitExample.scala)) + +```bash +sbt "schema-examples/runMain comptime.AllowsSealedTraitExample" +``` + +```scala mdoc:passthrough +import docs.SourceFile + +SourceFile.print("schema-examples/src/main/scala/comptime/AllowsSealedTraitExample.scala") +``` + +**RDBMS library with CREATE TABLE and INSERT using flat record constraints** (compile-only) +([source](https://github.com/zio/zio-blocks/blob/main/schema-examples/src/main/scala/comptime/RdbmsExample.scala)) + +Demonstrates how Allows constraints are verified at compile time 
— the code below shows valid examples that compile successfully, and includes comments showing which patterns would be rejected: + +```scala mdoc:passthrough +import docs.SourceFile + +SourceFile.print("schema-examples/src/main/scala/comptime/RdbmsExample.scala") +``` + +**JSON document store with specific primitives and recursive Self grammar** (compile-only) +([source](https://github.com/zio/zio-blocks/blob/main/schema-examples/src/main/scala/comptime/DocumentStoreExample.scala)) + +Demonstrates how Allows enforces recursive schema constraints at compile time: + +```scala mdoc:passthrough +import docs.SourceFile + +SourceFile.print("schema-examples/src/main/scala/comptime/DocumentStoreExample.scala") +``` diff --git a/docs/sidebars.js b/docs/sidebars.js index 8dd7de0297..4ec40a2d0c 100644 --- a/docs/sidebars.js +++ b/docs/sidebars.js @@ -7,6 +7,7 @@ const sidebars = { link: { type: "doc", id: "index" }, items: [ "reference/schema", + "reference/allows", "reference/reflect", "reference/binding", "reference/binding-resolver", diff --git a/schema-examples/src/main/scala/comptime/AllowsCsvExample.scala b/schema-examples/src/main/scala/comptime/AllowsCsvExample.scala new file mode 100644 index 0000000000..8dd6c63c4e --- /dev/null +++ b/schema-examples/src/main/scala/comptime/AllowsCsvExample.scala @@ -0,0 +1,98 @@ +package comptime + +import zio.blocks.schema._ +import zio.blocks.schema.comptime.Allows +import Allows.{Primitive, Record, `|`} +import Allows.{Optional => AOptional} +import util.ShowExpr.show + +// --------------------------------------------------------------------------- +// CSV serializer example using Allows[A, S] compile-time shape constraints +// +// A CSV row is a flat record: every field must be a primitive scalar or an +// optional primitive (for nullable columns). Nested records, sequences, and +// maps are all rejected at compile time. 
+// ---------------------------------------------------------------------------
+
+// Compatible: flat record of primitives and optional primitives
+case class Employee(name: String, department: String, salary: BigDecimal, active: Boolean)
+object Employee { implicit val schema: Schema[Employee] = Schema.derived }
+
+case class SensorReading(sensorId: String, timestamp: Long, value: Double, unit: Option[String])
+object SensorReading { implicit val schema: Schema[SensorReading] = Schema.derived }
+
+object CsvSerializer {
+
+  type FlatRow = Primitive | AOptional[Primitive]
+
+  /** Serialize a sequence of flat records to CSV format. */
+  def toCsv[A](rows: Seq[A])(implicit schema: Schema[A], ev: Allows[A, Record[FlatRow]]): String = {
+    val reflect = schema.reflect.asRecord.get
+    val header = reflect.fields.map(_.name).mkString(",")
+    val lines = rows.map { row =>
+      val dv = schema.toDynamicValue(row)
+      dv match {
+        case DynamicValue.Record(fields) =>
+          fields.map { case (_, v) => csvEscape(dvToString(v)) }.mkString(",")
+        case _ => ""
+      }
+    }
+    (header +: lines).mkString("\n")
+  }
+
+  private def dvToString(dv: DynamicValue): String = dv match {
+    case DynamicValue.Primitive(PrimitiveValue.String(s)) => s
+    case DynamicValue.Primitive(PrimitiveValue.Boolean(b)) => b.toString
+    case DynamicValue.Primitive(PrimitiveValue.Int(n)) => n.toString
+    case DynamicValue.Primitive(PrimitiveValue.Long(n)) => n.toString
+    case DynamicValue.Primitive(PrimitiveValue.Double(n)) => n.toString
+    case DynamicValue.Primitive(PrimitiveValue.Float(n)) => n.toString
+    case DynamicValue.Primitive(PrimitiveValue.BigDecimal(n)) => n.toString
+    case DynamicValue.Primitive(v) => v.toString
+    case DynamicValue.Null => ""
+    case DynamicValue.Variant(tag, inner) if tag == "Some" => dvToString(inner)
+    case DynamicValue.Variant(tag, _) if tag == "None" => ""
+    case DynamicValue.Record(fields) =>
+      fields.headOption.map { case (_, v) => dvToString(v) }.getOrElse("")
+    case other => other.toString
+  }
+
+  private def csvEscape(s: String): String =
+    if (s.contains(",") || s.contains("\"") || s.contains("\n"))
+      "\"" + s.replace("\"", "\"\"") + "\""
+    else s
+}
+
+// ---------------------------------------------------------------------------
+// Demonstration
+// ---------------------------------------------------------------------------
+
+object AllowsCsvExample extends App {
+
+  // Flat records of primitives — compiles fine
+  val employees = Seq(
+    Employee("Alice", "Engineering", BigDecimal("120000.00"), true),
+    Employee("Bob", "Marketing", BigDecimal("95000.50"), true),
+    Employee("Carol", "Engineering", BigDecimal("115000.00"), false)
+  )
+
+  // CSV output for a flat record of primitives
+  show(CsvSerializer.toCsv(employees))
+
+  // Flat record with optional fields — also compiles
+  val readings = Seq(
+    SensorReading("temp-01", 1709712000L, 23.5, Some("celsius")),
+    SensorReading("temp-02", 1709712060L, 72.1, None)
+  )
+
+  // Optional fields become empty CSV cells when None
+  show(CsvSerializer.toCsv(readings))
+
+  // The following would NOT compile — uncomment to see the error:
+  //
+  // case class Nested(name: String, address: Address)
+  // object Nested { implicit val schema: Schema[Nested] = Schema.derived }
+  // CsvSerializer.toCsv(Seq(Nested("Alice", Address("1 Main St", "NY", "10001"))))
+  // [error] Schema shape violation at Nested.address: found Record(Address),
+  //         required Primitive | Optional[Primitive]
+}
diff --git a/schema-examples/src/main/scala/comptime/AllowsEventBusExample.scala b/schema-examples/src/main/scala/comptime/AllowsEventBusExample.scala
new file mode 100644
index 0000000000..fe44f3bfd5
--- /dev/null
+++ b/schema-examples/src/main/scala/comptime/AllowsEventBusExample.scala
@@ -0,0 +1,96 @@
+package comptime
+
+import zio.blocks.schema._
+import zio.blocks.schema.comptime.Allows
+import Allows.{Primitive, Record, Sequence, `|`}
+import Allows.{Optional => AOptional}
+import util.ShowExpr.show
+
+// ---------------------------------------------------------------------------
+// Event bus / message broker example using Allows[A, S]
+//
+// Published events are typically sealed traits of flat record cases. Sealed
+// traits are automatically unwrapped by the Allows macro — each case is
+// checked individually against the grammar. No Variant node is needed.
+//
+// This example also shows nested sealed traits (auto-unwrap is recursive).
+// ---------------------------------------------------------------------------
+
+// Domain events — a sealed trait hierarchy
+sealed trait AccountEvent
+case class AccountOpened(accountId: String, owner: String, initialBalance: BigDecimal) extends AccountEvent
+case class FundsDeposited(accountId: String, amount: BigDecimal) extends AccountEvent
+case class FundsWithdrawn(accountId: String, amount: BigDecimal) extends AccountEvent
+case class AccountClosed(accountId: String, reason: Option[String]) extends AccountEvent
+object AccountEvent { implicit val schema: Schema[AccountEvent] = Schema.derived }
+
+// Nested sealed trait — InventoryEvent has a sub-hierarchy
+sealed trait InventoryEvent
+case class ItemAdded(sku: String, quantity: Int) extends InventoryEvent
+case class ItemRemoved(sku: String, quantity: Int) extends InventoryEvent
+
+sealed trait InventoryAlert extends InventoryEvent
+case class LowStock(sku: String, remaining: Int) extends InventoryAlert
+case class OutOfStock(sku: String) extends InventoryAlert
+
+object InventoryEvent { implicit val schema: Schema[InventoryEvent] = Schema.derived }
+
+// Event with sequence fields (e.g. tags or batch items)
+sealed trait BatchEvent
+case class BatchImport(batchId: String, itemIds: List[String]) extends BatchEvent
+case class BatchComplete(batchId: String, count: Int) extends BatchEvent
+object BatchEvent { implicit val schema: Schema[BatchEvent] = Schema.derived }
+
+object EventBus {
+
+  type EventShape = Primitive | AOptional[Primitive]
+
+  /**
+   * Publish a domain event. All cases of the sealed trait must be flat records.
+   */
+  def publish[A](event: A)(implicit schema: Schema[A], ev: Allows[A, Record[EventShape]]): String = {
+    val dv = schema.toDynamicValue(event)
+    val (typeName, payload) = dv match {
+      case DynamicValue.Variant(name, inner) => (name, inner.toJson.toString)
+      case _ => (schema.reflect.typeId.name, dv.toJson.toString)
+    }
+    s"PUBLISH topic=${schema.reflect.typeId.name} type=$typeName payload=$payload"
+  }
+
+  /**
+   * Publish events that may contain sequence fields (e.g. batch operations).
+   */
+  def publishBatch[A](event: A)(implicit
+    schema: Schema[A],
+    ev: Allows[A, Record[Primitive | Sequence[Primitive]]]
+  ): String = {
+    val dv = schema.toDynamicValue(event)
+    val (typeName, payload) = dv match {
+      case DynamicValue.Variant(name, inner) => (name, inner.toJson.toString)
+      case _ => (schema.reflect.typeId.name, dv.toJson.toString)
+    }
+    s"PUBLISH topic=${schema.reflect.typeId.name} type=$typeName payload=$payload"
+  }
+}
+
+// ---------------------------------------------------------------------------
+// Demonstration
+// ---------------------------------------------------------------------------
+
+object AllowsEventBusExample extends App {
+
+  // Flat sealed trait — all cases are records of primitives/optionals
+  show(EventBus.publish[AccountEvent](AccountOpened("acc-001", "Alice", BigDecimal("1000.00"))))
+  show(EventBus.publish[AccountEvent](FundsDeposited("acc-001", BigDecimal("500.00"))))
+  show(EventBus.publish[AccountEvent](AccountClosed("acc-001", Some("customer request"))))
+
+  // Nested sealed trait — auto-unwrap is recursive
+  // InventoryAlert extends InventoryEvent; both are unwrapped
+  show(EventBus.publish[InventoryEvent](ItemAdded("SKU-100", 50)))
+  show(EventBus.publish[InventoryEvent](LowStock("SKU-100", 3)))
+  show(EventBus.publish[InventoryEvent](OutOfStock("SKU-100")))
+
+  // Events with sequence fields use a wider grammar
+  show(EventBus.publishBatch[BatchEvent](BatchImport("batch-42", List("item-1", "item-2", "item-3"))))
+  show(EventBus.publishBatch[BatchEvent](BatchComplete("batch-42", 3)))
+}
diff --git a/schema-examples/src/main/scala/comptime/AllowsGraphQLTreeExample.scala b/schema-examples/src/main/scala/comptime/AllowsGraphQLTreeExample.scala
new file mode 100644
index 0000000000..60b940b54e
--- /dev/null
+++ b/schema-examples/src/main/scala/comptime/AllowsGraphQLTreeExample.scala
@@ -0,0 +1,114 @@
+package comptime
+
+import zio.blocks.schema._
+import zio.blocks.schema.comptime.Allows
+import Allows.{Primitive, Record, Sequence, `|`}
+import Allows.{Optional => AOptional, Self => ASelf}
+import util.ShowExpr.show
+
+// ---------------------------------------------------------------------------
+// GraphQL / tree structure example using Self for recursive grammars
+//
+// Self refers back to the entire enclosing Allows[A, S] grammar, allowing
+// the constraint to describe recursive data structures like trees, linked
+// lists, and nested menus.
+//
+// Non-recursive types also satisfy Self-containing grammars — the Self
+// position is never reached, so the constraint is vacuously satisfied.
+// ---------------------------------------------------------------------------
+
+// Recursive tree: children reference the same type
+case class TreeNode(value: Int, children: List[TreeNode])
+object TreeNode { implicit val schema: Schema[TreeNode] = Schema.derived }
+
+// Recursive category hierarchy (common in e-commerce, CMS, etc.)
+case class NavCategory(name: String, slug: String, subcategories: List[NavCategory])
+object NavCategory { implicit val schema: Schema[NavCategory] = Schema.derived }
+
+// Linked list via Optional[Self]
+case class Chain(label: String, next: Option[Chain])
+object Chain { implicit val schema: Schema[Chain] = Schema.derived }
+
+// Non-recursive type — satisfies Self-containing grammars vacuously
+case class FlatNode(id: Int, label: String)
+object FlatNode { implicit val schema: Schema[FlatNode] = Schema.derived }
+
+object GraphQL {
+
+  type TreeShape = Primitive | Sequence[ASelf] | AOptional[ASelf]
+
+  /** Generate a simplified GraphQL type definition for a recursive type. */
+  def graphqlType[A](implicit schema: Schema[A], ev: Allows[A, Record[TreeShape]]): String = {
+    val reflect = schema.reflect.asRecord.get
+    val fields = reflect.fields.map { f =>
+      s"  ${f.name}: ${gqlType(resolve(f.value), schema.reflect.typeId.name)}"
+    }
+    s"type ${schema.reflect.typeId.name} {\n${fields.mkString("\n")}\n}"
+  }
+
+  /** Unwrap Deferred to get the actual Reflect node. */
+  private def resolve(r: Reflect.Bound[_]): Reflect.Bound[_] = r match {
+    case d: Reflect.Deferred[_, _] => resolve(d.value.asInstanceOf[Reflect.Bound[_]])
+    case other => other
+  }
+
+  private def gqlType(r: Reflect.Bound[_], selfName: String): String = r match {
+    case _: Reflect.Sequence[_, _, _] => s"[$selfName]"
+    case p: Reflect.Primitive[_, _] =>
+      p.primitiveType match {
+        case PrimitiveType.Int(_) => "Int"
+        case PrimitiveType.Long(_) => "Int"
+        case PrimitiveType.Float(_) => "Float"
+        case PrimitiveType.Double(_) => "Float"
+        case PrimitiveType.String(_) => "String"
+        case PrimitiveType.Boolean(_) => "Boolean"
+        case _ => "String"
+      }
+    case _ => selfName
+  }
+}
+
+// ---------------------------------------------------------------------------
+// Demonstration
+// ---------------------------------------------------------------------------
+
+object AllowsGraphQLTreeExample extends App {
+
+  // Recursive tree with Sequence[Self]
+  show(GraphQL.graphqlType[TreeNode])
+
+  // Recursive categories — same grammar, different domain
+  show(GraphQL.graphqlType[NavCategory])
+
+  // Linked list via Optional[Self]
+  show(GraphQL.graphqlType[Chain])
+
+  // Non-recursive type also satisfies the grammar (vacuously — Self is never reached)
+  show(GraphQL.graphqlType[FlatNode])
+
+  // Show that recursive data actually works at runtime
+  val tree = TreeNode(
+    1,
+    List(
+      TreeNode(2, List(TreeNode(4, Nil), TreeNode(5, Nil))),
+      TreeNode(3, Nil)
+    )
+  )
+  show(Schema[TreeNode].toDynamicValue(tree).toJson.toString)
+
+  val nav = NavCategory(
+    "Electronics",
+    "electronics",
+    List(
+      NavCategory("Phones", "phones", Nil),
+      NavCategory(
+        "Laptops",
+        "laptops",
+        List(
+          NavCategory("Gaming", "gaming", Nil)
+        )
+      )
+    )
+  )
+  show(Schema[NavCategory].toDynamicValue(nav).toJson.toString)
+}
diff --git a/schema-examples/src/main/scala/comptime/AllowsSealedTraitExample.scala b/schema-examples/src/main/scala/comptime/AllowsSealedTraitExample.scala
new file mode 100644
index 0000000000..4c613388a3
--- /dev/null
+++ b/schema-examples/src/main/scala/comptime/AllowsSealedTraitExample.scala
@@ -0,0 +1,87 @@
+package comptime
+
+import zio.blocks.schema._
+import zio.blocks.schema.comptime.Allows
+import Allows.{Primitive, Record}
+import util.ShowExpr.show
+
+// ---------------------------------------------------------------------------
+// Sealed trait auto-unwrap example
+//
+// Sealed traits and enums are automatically unwrapped by the Allows macro.
+// Each case is checked individually against the grammar — no Variant node
+// is needed.
+//
+// Auto-unwrap is recursive: if a case is itself a sealed trait, its cases
+// are unwrapped too, to any depth.
+//
+// Zero-field records (case objects) are vacuously true for any Record[A]
+// constraint.
+// ---------------------------------------------------------------------------
+
+// Simple sealed trait with case classes and a case object
+sealed trait Shape
+case class Circle(radius: Double) extends Shape
+case class Rectangle(width: Double, height: Double) extends Shape
+case object Point extends Shape
+object Shape { implicit val schema: Schema[Shape] = Schema.derived }
+
+// Nested sealed trait hierarchy — two levels deep
+sealed trait Expr
+sealed trait BinaryOp extends Expr
+case class Add(left: Double, right: Double) extends BinaryOp
+case class Multiply(left: Double, right: Double) extends BinaryOp
+case class Literal(value: Double) extends Expr
+case object Zero extends Expr
+object Expr { implicit val schema: Schema[Expr] = Schema.derived }
+
+// All-singleton enum (all case objects)
+sealed trait Color
+case object Red extends Color
+case object Green extends Color
+case object Blue extends Color
+object Color { implicit val schema: Schema[Color] = Schema.derived }
+
+object SealedTraitValidator {
+
+  /** Validate that a value's type has a flat record structure. */
+  def validate[A](value: A)(implicit schema: Schema[A], ev: Allows[A, Record[Primitive]]): String = {
+    val dv = schema.toDynamicValue(value)
+    dv match {
+      case DynamicValue.Variant(caseName, inner) =>
+        s"Valid variant case '$caseName': ${inner.toJson}"
+      case DynamicValue.Record(fields) =>
+        s"Valid record with ${fields.size} field(s): ${fields.map(_._1).mkString(", ")}"
+      case _ =>
+        s"Valid: ${dv.toJson}"
+    }
+  }
+}
+
+// ---------------------------------------------------------------------------
+// Demonstration
+// ---------------------------------------------------------------------------
+
+object AllowsSealedTraitExample extends App {
+
+  // Simple sealed trait — all cases checked against Record[Primitive]
+  // Circle: Record(radius: Double) — satisfies Record[Primitive]
+  // Rectangle: Record(width: Double, height: Double) — satisfies Record[Primitive]
+  // Point: zero-field case object — vacuously true
+  show(SealedTraitValidator.validate[Shape](Circle(3.14)))
+  show(SealedTraitValidator.validate[Shape](Rectangle(4.0, 5.0)))
+  show(SealedTraitValidator.validate[Shape](Point))
+
+  // Nested sealed trait — auto-unwrap is recursive
+  // BinaryOp is itself sealed with Add and Multiply
+  // All leaf cases have only Double fields — satisfies Record[Primitive]
+  show(SealedTraitValidator.validate[Expr](Add(1.0, 2.0)))
+  show(SealedTraitValidator.validate[Expr](Multiply(3.0, 4.0)))
+  show(SealedTraitValidator.validate[Expr](Literal(42.0)))
+  show(SealedTraitValidator.validate[Expr](Zero))
+
+  // All-singleton enum — every case is a zero-field record (vacuously true)
+  show(SealedTraitValidator.validate[Color](Red))
+  show(SealedTraitValidator.validate[Color](Green))
+  show(SealedTraitValidator.validate[Color](Blue))
+}
diff --git a/zio-blocks-docs/src/main/scala/SourceFile.scala b/zio-blocks-docs/src/main/scala/SourceFile.scala
new file mode 100644
index 0000000000..9131330bca
--- /dev/null
+++ b/zio-blocks-docs/src/main/scala/SourceFile.scala
@@ -0,0 +1,56 @@
+package docs
+
+import scala.io.Source
+import scala.util.Using
+import scala.util.control.NonFatal
+
+object SourceFile {
+
+  def read(path: String, lines: Seq[(Int, Int)]): String = {
+    def openSource(path: String): Source =
+      try {
+        Source.fromFile("../" + path)
+      } catch {
+        case NonFatal(_) => Source.fromFile(path)
+      }
+
+    Using.resource(openSource(path)) { source =>
+      val allLines = source.getLines().toVector
+      if (lines.isEmpty) {
+        allLines.mkString("\n")
+      } else {
+        val chunks = for {
+          (from, to) <- lines
+        } yield allLines
+          .slice(from - 1, to)
+          .mkString("\n")
+
+        chunks.mkString("\n\n")
+      }
+    }
+  }
+
+  def fileExtension(path: String): String = {
+    val javaPath = java.nio.file.Paths.get(path)
+    val fileExtension =
+      javaPath.getFileName.toString
+        .split('.')
+        .lastOption
+        .getOrElse("")
+    fileExtension
+  }
+
+  def print(
+    path: String,
+    lines: Seq[(Int, Int)] = Seq.empty,
+    showTitle: Boolean = true,
+    showLineNumbers: Boolean = false
+  ): Unit = {
+    val title = if (showTitle) s""" title="$path"""" else ""
+    val showLines = if (showLineNumbers) " showLineNumbers" else ""
+    println(s"""```${fileExtension(path)}$title$showLines""")
+    println(read(path, lines))
+    println("```")
+  }
+
+}