Feat/generate ubproject and schema validation #394
@@ -17,6 +17,7 @@
from pathlib import Path

from sphinx.application import Sphinx
from sphinx.config import Config
from sphinx_needs import logging
from sphinx_needs.data import NeedsView, SphinxNeedsData
from sphinx_needs.need_item import NeedItem

@@ -29,6 +30,7 @@
    ProhibitedWordCheck as ProhibitedWordCheck,
    ScoreNeedType as ScoreNeedType,
)
from src.extensions.score_metamodel.sn_schemas import write_sn_schemas
from src.extensions.score_metamodel.yaml_parser import (
    default_options as default_options,
    load_metamodel_data as load_metamodel_data,

@@ -248,6 +250,24 @@ def setup(app: Sphinx) -> dict[str, str | bool]:
    config_setdefault(app.config, "needs_reproducible_json", True)
    config_setdefault(app.config, "needs_json_remove_defaults", True)
|
||
| def _write_schemas_with_error_handling(app: Sphinx, config: Config) -> None: | ||
| """Generate schemas.json from the metamodel and register it with sphinx-needs. | ||
|
|
||
| This enables sphinx-needs 6 schema validation: required fields, regex | ||
| patterns on option values, and (eventually) link target type checks. | ||
| Use config-inited event to defer file writing until config is ready, | ||
| with error handling to prevent boot crashes on file write failures. | ||
| """ | ||
| try: | ||
| write_sn_schemas(app, metamodel) | ||
| except Exception as e: | ||
| logger.warning( | ||
| f"Failed to write schemas.json: {e}. " | ||
| "Schema validation will be unavailable." | ||
| ) | ||
|
Member: AI review note:
    _ = app.connect("config-inited", _write_schemas_with_error_handling, priority=499)

    # sphinx-collections runs on default prio 500.
    # We need to populate the sphinx-collections config before that happens.
    # --> 499
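The priority comment above can be illustrated with a toy dispatcher. This is a sketch of the ordering idea only, not of Sphinx internals; all names here are made up:

```python
# Toy model of Sphinx event dispatch: listeners fire in ascending
# priority order, so priority 499 runs just before the default 500.
listeners: list[tuple[int, str]] = []

def connect(name: str, priority: int = 500) -> None:
    listeners.append((priority, name))

connect("sphinx-collections")           # registers at the default priority 500
connect("write_schemas", priority=499)  # must populate config before that

order = [name for _, name in sorted(listeners)]
print(order)  # ['write_schemas', 'sphinx-collections']
```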
@@ -978,6 +978,12 @@ needs_extra_links:
  partially_verifies:
    incoming: partially_verified_by
    outgoing: partially_verifies

  # Decision Records
  affects:
    incoming: affected by
    outgoing: affects
Contributor: Why is the metamodel extended here? Was that a gap before? Why didn't we find that before then?

Member: I was just reviewing that. It was a gap. Why we didn't find that ourselves is a very valid question.

Contributor (Author): In fact, this comes from here: https://github.com/eclipse-score/docs-as-code/pull/394/changes/BASE..5f1263b1fccd9acde0d608e29e672f448e53979e#diff-0ac6046ac75bc866580d21188af43573b945253e760280edbf05da368090304bR874. The introduced transformation script was not happy about not finding the definition of the optional link.

AlexanderLanin marked this conversation as resolved.
##############################################################
# Graph Checks
# The graph checks focus on the relation of the needs and their attributes.
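For context, the YAML link definition added above maps onto the sphinx-needs `needs_extra_links` config roughly like this. This is a hand-written sketch mirroring the hunk, not the project's actual generated config:

```python
# Sketch: how a link defined in the metamodel YAML surfaces in conf.py.
# sphinx-needs expects a list of dicts with option/incoming/outgoing keys.
needs_extra_links = [
    {
        "option": "affects",        # usable as :affects: in a need directive
        "incoming": "affected by",  # label shown on the target need
        "outgoing": "affects",      # label shown on the source need
    },
]
```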
@@ -0,0 +1,328 @@
# *******************************************************************************
# Copyright (c) 2025 Contributors to the Eclipse Foundation
#
# See the NOTICE file(s) distributed with this work for additional
# information regarding copyright ownership.
#
# This program and the accompanying materials are made available under the
# terms of the Apache License Version 2.0 which is available at
# https://www.apache.org/licenses/LICENSE-2.0
#
# SPDX-License-Identifier: Apache-2.0
# *******************************************************************************
"""Transforms the YAML metamodel into sphinx-needs >6 JSON schema definitions.

Reads need types from the parsed metamodel (MetaModelData) and generates a
``schemas.json`` file that sphinx-needs uses to validate each need against
the S-CORE metamodel rules (required fields, regex patterns, link constraints).

Schema structure per need type (sphinx-needs schema format):
- ``select``: matches needs by their ``type`` field
- ``validate.local``: validates the need's own properties (patterns, required)
- ``validate.network``: validates properties of linked needs
"""

import json
from pathlib import Path
from typing import Any

from sphinx.application import Sphinx
from sphinx.config import Config
from sphinx_needs import logging

from src.extensions.score_metamodel.metamodel_types import ScoreNeedType
from src.extensions.score_metamodel.yaml_parser import MetaModelData

# Fields whose values are lists in sphinx-needs (e.g. tags: ["safety", "security"]).
# These need an "array of strings" JSON schema instead of a plain "string" schema.
SN_ARRAY_FIELDS = {
    "tags",
    "sections",
}

# Fields to skip during schema generation.
IGNORE_FIELDS = {
    "content",  # not yet available in ubCode
}

LOGGER = logging.get_logger(__name__)
def write_sn_schemas(app: Sphinx, metamodel: MetaModelData) -> None:
    """Build sphinx-needs schema definitions from the metamodel and write to JSON.

    Iterates over all need types, builds a schema for each one via
    ``_build_need_type_schema``, and writes the result to
    ``<confdir>/schemas.json``.
    """
    config: Config = app.config
    schemas: list[dict[str, Any]] = []

    for need_type in metamodel.needs_types:
        schema = _build_need_type_schema(need_type)
        if schema is not None:
            schemas.append(schema)

    schema_definitions: dict[str, Any] = {"schemas": schemas}

    # Write the complete schema definitions to a JSON file in confdir
    schemas_output_path = Path(app.confdir) / "schemas.json"
    with open(schemas_output_path, "w", encoding="utf-8") as f:
        json.dump(schema_definitions, f, indent=2, ensure_ascii=False)

    # Tell sphinx-needs to load the schema from the JSON file
    config.needs_schema_definitions_from_json = "schemas.json"
def _classify_links(
    links: dict[str, Any], type_name: str, mandatory: bool
) -> tuple[dict[str, str], dict[str, list[str]]]:
    """Classify link values into regex patterns vs. target type names.

    In the metamodel YAML, a link value can be either:
    - A regex (starts with "^"), e.g. "^logic_arc_int(_op)*__.+$"
      -> validated locally (the link ID must match the pattern)
    - A plain type name, e.g. "comp"
      -> validated via network (the linked need must have that type)
    Multiple values are comma-separated, e.g. "comp, sw_unit".

    Returns:
        A tuple of (regexes, targets) dicts, keyed by field name.
        ``targets`` maps each field to a list of all allowed type names.
    """
    label = "mandatory" if mandatory else "optional"
    regexes: dict[str, str] = {}
    targets: dict[str, list[str]] = {}

    for field, value in links.items():
        # Guard: ensure value is a string (i.e., called before postprocess_need_links).
        # postprocess_need_links mutates link values into lists of ScoreNeedType
        # objects, which would break the string operations below.
        if not isinstance(value, str):
            raise TypeError(
                f"_classify_links must be called before postprocess_need_links. "
                f"Expected string for {label} link '{field}' in type '{type_name}', "
                f"but got {type(value).__name__}. This indicates a Sphinx phase "
                "ordering issue."
            )

        link_values = [v.strip() for v in value.split(",")]
        for link_value in link_values:
            if link_value.startswith("^"):
                if field in regexes:
                    LOGGER.error(
                        f"Multiple regex patterns for {label} link field "
                        f"'{field}' in need type '{type_name}'. "
                        "Only the first one will be used in the schema."
                    )
                regexes[field] = link_value
Member: AI review note: Bug in log message: it says "Only the first one will be used", but the code overwrites `regexes[field]` with each new match, so the last pattern actually wins.
            else:
                if field not in targets:
                    targets[field] = []
                targets[field].append(link_value)

    return regexes, targets
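The regex-vs-target split implemented by `_classify_links` can be sketched standalone. This is a simplified illustration (single value string, no field names, no error reporting), not the function from the diff:

```python
def classify(value: str) -> tuple[list[str], list[str]]:
    """Split a comma-separated link spec into (regexes, plain target types)."""
    regexes: list[str] = []
    targets: list[str] = []
    for part in (v.strip() for v in value.split(",")):
        # As in the metamodel YAML: a leading "^" marks a regex,
        # anything else is a need type name.
        (regexes if part.startswith("^") else targets).append(part)
    return regexes, targets

regexes, targets = classify("std_wp, ^std_req__aspice_40__iic.*$")
print(regexes)  # ['^std_req__aspice_40__iic.*$']
print(targets)  # ['std_wp']
```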
def _build_local_validator(
    mandatory_fields: dict[str, str],
    optional_fields: dict[str, str],
    mandatory_links_regexes: dict[str, str],
    optional_links_regexes: dict[str, str],
    mandatory_links_targets: dict[str, list[str]] | None = None,
) -> dict[str, Any]:
    """Build the local validator dict for a need type's schema.

    The local validator checks the need's own properties:
    - Mandatory fields must be present and match their regex pattern.
    - Optional fields, if present, must match their regex pattern.
    - Mandatory links must have at least one entry.
    """
    properties: dict[str, Any] = {}
    required: list[str] = []

    # Mandatory fields: must be present AND match the regex pattern
    for field, pattern in mandatory_fields.items():
        if field in IGNORE_FIELDS:
            continue
        required.append(field)
        properties[field] = get_field_pattern_schema(field, pattern, is_optional=False)

    # Optional fields: if present, must match the regex pattern.
    # Allow empty strings to align with Python checker behavior.
    # Skip fields already handled as mandatory to avoid pattern/required conflicts
    # when a field appears in both mandatory_options and optional_options
    # (e.g. global base options merged into every type's optional_options).
    for field, pattern in optional_fields.items():
        if field in IGNORE_FIELDS:
            continue
        if field in mandatory_fields:
            continue
        properties[field] = get_field_pattern_schema(field, pattern, is_optional=True)

    # Mandatory links (regex): must have at least one entry
    # TODO: regex pattern matching on link IDs is not yet enabled
    for field in mandatory_links_regexes:
        properties[field] = {"type": "array", "minItems": 1}
        required.append(field)

    # Mandatory links (plain target types): must have at least one entry.
    # The type of the linked need is checked via validate.network, but the
    # list length constraint belongs in the local validator.
    # Skip fields already handled by mandatory_links_regexes (mixed regex + plain).
    for field in mandatory_links_targets or {}:
        if field not in properties:
            properties[field] = {"type": "array", "minItems": 1}
            required.append(field)

    # Optional links (regex): allowed but not required
    # TODO: regex pattern matching on link IDs is not yet enabled
    for field in optional_links_regexes:
        properties[field] = {"type": "array"}

    return {
        "properties": properties,
        "required": required,
    }
def _build_need_type_schema(need_type: ScoreNeedType) -> dict[str, Any] | None:
    """Build a sphinx-needs schema entry for a single need type.

    Returns ``None`` if the need type has no constraints (no mandatory/optional
    fields or links), meaning no schema validation is needed.

    The returned dict has the sphinx-needs schema structure:
    - ``select``: matches needs by their ``type`` field
    - ``validate.local``: validates the need's own properties
    - ``validate.network``: validates linked needs' types
    """
    mandatory_fields = need_type.get("mandatory_options", {})
    optional_fields = need_type.get("optional_options", {})
    mandatory_links = need_type.get("mandatory_links", {})
    optional_links = need_type.get("optional_links", {})

    # Skip need types that have no constraints at all
    if not (mandatory_fields or optional_fields or mandatory_links or optional_links):
        return None

    type_name = need_type["directive"]

    # Classify link values as regex patterns vs. target type names.
    # Note: links are still plain strings here (before postprocess_need_links).
    mandatory_links_regexes, mandatory_links_targets = _classify_links(
        mandatory_links, type_name, mandatory=True
    )
    optional_links_regexes, _ = _classify_links(
        optional_links, type_name, mandatory=False
    )

    # Build validate.network for link fields with plain type targets.
    # The network schema uses sphinx-needs' ValidateSchemaType format:
    # each entry's ``items.local`` is a JSON Schema applied to each linked need.
    network: dict[str, Any] = {}

    def add_network_entry(field: str, target_types: list[str]) -> None:
        type_constraint: dict[str, Any] = (
            {"enum": target_types}
            if len(target_types) > 1
            else {"const": target_types[0]}
        )
        network[field] = {
            "type": "array",
            "items": {
                "local": {
                    "properties": {"type": type_constraint},
                    "required": ["type"],
                }
            },
        }

    # Only add network entries for *mandatory* links with exclusively plain
    # type targets. Two categories are intentionally excluded:
    #
    # 1. Mixed regex+plain fields (e.g.
    #    "complies: std_wp, ^std_req__aspice_40__iic.*$"):
    #    The items schema would incorrectly require ALL linked needs to match
    #    the plain type, while some legitimately match the regex instead.
    #
    # 2. Optional links: The Python validate_links() in check_options.py treats
    #    optional link type violations as informational (treat_as_info=True),
    #    but schemas use a single severity ("violation") per need type.
    #    Including optional links would escalate info-level issues to errors.
    #    Optional link types are validated by the Python check instead.
    for field, target_types in mandatory_links_targets.items():
        if field not in mandatory_links_regexes:
            add_network_entry(field, target_types)

    type_schema: dict[str, Any] = {
        "id": f"need-type-{type_name}",
        "severity": "violation",
        "message": "Need does not conform to S-CORE metamodel",
        # Selector: only apply this schema to needs with matching type
        "select": {
            "properties": {"type": {"const": type_name}},
            "required": ["type"],
        },
        "validate": {
            "local": _build_local_validator(
                mandatory_fields,
                optional_fields,
                mandatory_links_regexes,
                optional_links_regexes,
                mandatory_links_targets,
            ),
        },
    }
    if network:
        type_schema["validate"]["network"] = network

    return type_schema
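Put together, the schema entry emitted for a hypothetical `comp_req` type whose single mandatory `satisfies` link must target `feat_req` needs would look roughly like this. The dict below is hand-written to mirror the code above; the type and field names are invented for illustration:

```python
import json

# Sketch of one generated schemas.json entry (hypothetical "comp_req" type).
entry = {
    "id": "need-type-comp_req",
    "severity": "violation",
    "message": "Need does not conform to S-CORE metamodel",
    # select: apply this schema only to needs of type "comp_req"
    "select": {
        "properties": {"type": {"const": "comp_req"}},
        "required": ["type"],
    },
    "validate": {
        # local: the need itself must have at least one "satisfies" link
        "local": {
            "properties": {"satisfies": {"type": "array", "minItems": 1}},
            "required": ["satisfies"],
        },
        # network: each linked need must itself be of type "feat_req"
        "network": {
            "satisfies": {
                "type": "array",
                "items": {
                    "local": {
                        "properties": {"type": {"const": "feat_req"}},
                        "required": ["type"],
                    }
                },
            }
        },
    },
}

print(json.dumps({"schemas": [entry]}, indent=2)[:30])
```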
def get_field_pattern_schema(
    field: str, pattern: str, is_optional: bool = False
) -> dict[str, Any]:
    """Return the appropriate JSON schema for a field's regex pattern.

    Array-valued fields (like ``tags``) get an array-of-strings schema;
    scalar fields get a plain string schema.

    For optional fields, the schema allows empty strings to align with the
    Python metamodel checker's behavior (which treats empty strings as absent).
    """
    if field in SN_ARRAY_FIELDS:
        return get_array_pattern_schema(pattern, is_optional=is_optional)
    return get_pattern_schema(pattern, is_optional=is_optional)


def get_pattern_schema(pattern: str, is_optional: bool = False) -> dict[str, Any]:
    """Return a JSON schema that validates a string against a regex pattern.

    For optional fields, allows either an empty string OR a string matching
    the pattern, matching the Python checker's behavior where empty strings
    are treated as "absent" and not validated.
    """
    if is_optional:
        # Allow empty strings for optional fields (Python checker treats "" as absent)
        # Use regex alternation to match either empty string or the original pattern
        return {
            "type": "string",
            "pattern": f"^$|{pattern}",
        }
    return {
        "type": "string",
        "pattern": pattern,
    }


def get_array_pattern_schema(pattern: str, is_optional: bool = False) -> dict[str, Any]:
    """Return a JSON schema that validates an array where each item matches a regex.

    For optional fields, allows empty strings in the array to align with the
    Python checker's behavior.
    """
    return {
        "type": "array",
        "items": get_pattern_schema(pattern, is_optional=is_optional),
    }
Reviewer: Even more files getting generated into the workspace? This clashes with my proposal here: eclipse-score/score#2826. Technically, this is just a draft, so not really relevant for this PR though. Just want to mention it.

Author: docs/needs_metamodel_generated.toml can definitely be moved to some build dir. I'll fix that as part of this PR. docs/schemas.json is maybe required in the workspace. I can check if it is possible for ubproject.toml to reference something from within the build dir.