Note: The contribution policy is currently under development.
Except in rare circumstances, changes to the phlex repository are proposed and considered through GitHub pull requests (PRs).
Note:
- Both external contributors and framework developers are expected to abide by this contribution policy.
- Even if you are a contributor with merge privileges, your PR should be reviewed by someone else before being merged into the main branch.
Phlex supports comprehensive code coverage measurement using CMake's built-in coverage support:
# Configure with coverage enabled
cmake -DCMAKE_BUILD_TYPE=Coverage -DENABLE_COVERAGE=ON /path/to/phlex/source
# Build and run tests
cmake --build . -j $(nproc)
ctest -j $(nproc)
# Generate coverage reports
ninja coverage-xml # XML report for CI/Codecov
ninja coverage-html # HTML report for local viewing
ninja coverage # Basic coverage info via ctest
ninja coverage-clean # Clean coverage data files
The coverage system:
- Uses GCC's built-in gcov for instrumentation
- Generates reports with gcovr (XML) and lcov (HTML)
- Automatically filters to project sources only
- Handles known GCC coverage bugs gracefully
- Integrates with VS Code coverage extensions
For convenience, the scripts/coverage.sh script automates the entire coverage workflow:
# Complete workflow (recommended)
./scripts/coverage.sh all
# Individual commands
./scripts/coverage.sh setup # Configure and build with coverage
./scripts/coverage.sh test # Run tests with coverage
./scripts/coverage.sh xml # Generate XML report
./scripts/coverage.sh html # Generate HTML report
./scripts/coverage.sh view # Open HTML report in browser
./scripts/coverage.sh summary # Show coverage summary
./scripts/coverage.sh upload # Upload to Codecov
# Chain multiple commands
./scripts/coverage.sh setup test html view
Important: Coverage data becomes stale when source files change. Always rebuild before generating reports:
./scripts/coverage.sh setup test html # Rebuild → test → generate HTML
The script automatically:
- Detects standalone vs. multi-project build configurations
- Sources environment setup scripts (setup-env.sh)
- Detects stale instrumentation and rebuilds when needed
- Handles generated source files via temporary symlink trees
- Normalizes coverage paths for Codecov compatibility
The coverage system handles generated C/C++ files (e.g., from code generators) by:
- Creating a temporary symlink tree at .coverage-generated/ in the repository root
- Mapping build-directory paths to repository-relative paths using --path-map
- Normalizing XML paths with normalize_coverage_xml.py before upload
This ensures Codecov can match coverage data to the correct files in your repository.
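The symlink-tree idea can be sketched in isolation. The gen.h file and directory layout below are invented for illustration; the real script creates and cleans up these links automatically:

```shell
# Sketch of the symlink-tree approach for generated sources (names invented).
mkdir -p build/generated .coverage-generated
echo 'int generated_fn(void);' > build/generated/gen.h   # stand-in generated file
ln -sfn "$PWD/build/generated/gen.h" .coverage-generated/gen.h
# Report tooling can now use the repository-relative path
# .coverage-generated/gen.h instead of the build-directory path,
# which is what Codecov needs to match files in the repository.
ls -l .coverage-generated/gen.h
```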
After generating HTML coverage, the script copies lcov.info to the workspace root for VS Code extensions like Coverage Gutters. Install the extension and open any source file to see coverage indicators in the gutter.
To upload coverage manually:
# Set your token (choose one method)
export CODECOV_TOKEN='your-token'
echo 'your-token' > ~/.codecov_token && chmod 600 ~/.codecov_token
# Upload
./scripts/coverage.sh xml upload
The upload command uses the Codecov CLI and automatically normalizes paths for repository compatibility.
The project automatically runs coverage analysis on every PR and push to main/develop branches. The workflow:
- Builds with -DCMAKE_BUILD_TYPE=Coverage -DENABLE_COVERAGE=ON
- Runs all tests via ctest
- Generates XML coverage report using cmake --build . --target coverage-xml
- Normalizes paths for generated files using normalize_coverage_xml.py
- Uploads to Codecov via codecov/codecov-action@v4
Coverage reports are uploaded to Codecov for tracking and PR integration, with automatic comments on PRs showing coverage changes.
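For orientation, the upload step of such a workflow might look like the following GitHub Actions fragment. This is illustrative only (step name, report path, and options are assumptions); the repository's actual workflow file under .github/workflows/ is authoritative:

```yaml
# Illustrative fragment - not the project's actual workflow file.
- name: Upload coverage to Codecov
  uses: codecov/codecov-action@v4
  with:
    files: coverage.xml              # normalized report (path is an assumption)
    token: ${{ secrets.CODECOV_TOKEN }}
    fail_ci_if_error: true
```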
For clang-tidy workflow behavior, local command examples, and configuration details, see CLANG_TIDY_CONFIGURATION.md.
The .github/copilot-instructions.md contains various "ground rules" to be observed by GitHub Copilot for every session. They are intended to be useful for everyone, but you can override or augment them yourself by creating a <workspace>/.github/copilot-instructions.md file. If this file exists, its contents will be merged with—but take precedence over—the repository level instructions.
The instructions in this file were created not by hand, but by asking Copilot to produce instructions for itself to achieve a goal, such as, "Please update the instructions file to ensure adherence to markdown-lint rules when generating Markdown files." This way (hopefully), the instructions are more likely to have the intended effect on the AI since they were generated by the AI itself.
If you wish to add your own personalized instructions, or to adjust or augment the repository-level instructions via PR, best results are achieved by following this method rather than writing the instructions by hand, except in the most trivial cases.
The tool pre-commit offers a framework for setting up hooks that are run on staged files before committing, ensuring correct formatting and catching some errors before they end up in a commit.
To set up this tool locally, first install pre-commit (or prek, a faster re-implementation in rust) using your system package manager or uv/pip(x), e.g. uv tool install pre-commit.
To set up the hooks in the Phlex repository, run the pre-commit install command once in the repository. The hooks should then run automatically prior to each commit.
To skip a check temporarily, run SKIP=ruff-format git commit -m "...". To skip all hooks, you can use the --no-verify flag to git commit.
To manually run hooks over all files, run pre-commit run --all-files.
The pre-commit hooks are configured in .pre-commit-config.yaml.
For further reference, see the extensive pre-commit documentation.
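As a sketch of what such a configuration looks like, a minimal .pre-commit-config.yaml might resemble the following. The repositories, revisions, and hook ids here are common choices and not necessarily the ones Phlex uses; consult the repository's actual .pre-commit-config.yaml:

```yaml
# Illustrative .pre-commit-config.yaml - the repository's own file is authoritative.
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.6.0                 # revision is an example, pin what you audit
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.6.9                 # revision is an example
    hooks:
      - id: ruff-format         # matches the SKIP=ruff-format example above
```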