
Commit

Prepare v0.2.1 (#23)
* Repo readme edits (#20)

* Readme updated

* pyproject.toml updated

* Switched to source-based layout

* precommit and ci updated

* precommit and devrequirements created fresh

---------

Co-authored-by: mkuehbach <[email protected]>

* Fixed linting (#21)

Co-authored-by: mkuehbach <[email protected]>

* Relocate files and fix #18 (#22)

* Relocate files and fix #18

* Adding FAIRmat docs corporate to fix #16

* Relocated mkdocs

* Cleaned unnecessary readme.md

---------

Co-authored-by: mkuehbach <[email protected]>

* Reverted unintentional change of build docs

* Added programmatic tests (#24)

* Added programmatic tests

* Add debug configuration

* Various fixes

* Fixed unknown enum for service

* Fixed `map_to_bool`: if it gets a value that is itself a bool, the `if src_val` check will do nothing, so such values are now handled explicitly

* Bumped pynxtools to 0.7.0

---------

Co-authored-by: mkuehbach <[email protected]>

---------

Co-authored-by: mkuehbach <[email protected]>
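The `map_to_bool` fix described in the commit message can be illustrated with a minimal sketch (hypothetical code; the function name is taken from the message, but the accepted string values and overall structure are assumptions, not the actual reader implementation):

```python
def map_to_bool(src_val):
    """Map a source value to bool, handling values that are already bool.

    Sketch of the issue: a bare `if src_val:` check falls through for
    inputs that are already bool, so such values must be returned directly.
    """
    if isinstance(src_val, bool):
        # The fix: a value that is itself a bool is passed through as-is.
        return src_val
    if isinstance(src_val, str):
        # Assumed set of truthy strings, for illustration only.
        return src_val.strip().lower() in ("true", "yes", "1", "on")
    return bool(src_val)


print(map_to_bool(True))     # -> True
print(map_to_bool("yes"))    # -> True
print(map_to_bool("false"))  # -> False
```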
mkuehbach and atomprobe-tc authored Sep 11, 2024
1 parent adcfd67 commit f086312
Showing 40 changed files with 713 additions and 349 deletions.
30 changes: 22 additions & 8 deletions .github/workflows/build_docs.yml
@@ -1,27 +1,41 @@
name: build_docs
on:
push:
branches: [main]
branches:
- main # Triggers deployment on push to the main branch

env:
UV_SYSTEM_PYTHON: true
permissions:
contents: write
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Checkout Repository
uses: actions/checkout@v4
- name: Configure Git Credentials
run: |
git config user.name github-actions[bot]
git config user.email 41898282+github-actions[bot]@users.noreply.github.com
- uses: actions/setup-python@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: 3.x
- run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV
- uses: actions/cache@v3
python-version: '3.x'

- name: Cache mkdocs-material environment
uses: actions/cache@v3
with:
key: mkdocs-material-${{ env.cache_id }}
path: .cache
restore-keys: |
mkdocs-material-
- run: pip install ".[docs]"
- run: mkdocs gh-deploy --force
- name: Install Dependencies
run: |
curl -LsSf https://astral.sh/uv/install.sh | sh
uv pip install ".[docs]"
- name: Build and Deploy
run: |
mkdocs gh-deploy --force --remote-branch gh-pages
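The `cache_id` step in the workflow above keys the mkdocs-material cache on the UTC ISO week number, so a fresh cache is built at most once per week. The key can be reproduced locally as a quick sketch:

```shell
# Compute the weekly cache key used by the build_docs workflow:
# date '+%V' prints the zero-padded ISO week number (01-53).
cache_id="mkdocs-material-$(date --utc '+%V')"
echo "$cache_id"
```

Note that `date --utc` is GNU date syntax; on BSD/macOS the equivalent short flag is `date -u`.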
37 changes: 26 additions & 11 deletions .github/workflows/publish.yml
@@ -1,30 +1,45 @@
# This workflow will upload a Python Package using Twine when a release is created
# For more information see: https://help.github.com/en/actions/language-and-framework-guides/using-python-with-github-actions#publishing-to-package-registries

# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.
name: Upload Python Package

on: workflow_dispatch
# release:
# types: [published]
env:
UV_SYSTEM_PYTHON: true

jobs:
deploy:
name: Upload release to PyPI
runs-on: ubuntu-latest

environment:
name: pypi
url: https://pypi.org/p/pynxtools-apm
permissions:
id-token: write
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
fetch-depth: 0
submodules: recursive
# submodules: recursive
- name: Set up Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: "3.x"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install build
curl -LsSf https://astral.sh/uv/install.sh | sh
uv pip install build
- name: Build package
run: python -m build
- name: Publish package
uses: pypa/gh-action-pypi-publish@27b31702a0e7fc50959f5ad993c78deac1bdfc29
with:
user: __token__
password: ${{ secrets.PYPI_API_TOKEN }}
- name: Publish package distributions to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
# 27b31702a0e7fc50959f5ad993c78deac1bdfc29
# with:
# user: __token__
# password: ${{ secrets.PYPI_API_TOKEN }}
37 changes: 20 additions & 17 deletions .github/workflows/pylint.yml
@@ -2,32 +2,35 @@ name: linting

on: [push]

env:
UV_SYSTEM_PYTHON: true
jobs:
linting:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
python_version: ["3.8", "3.9", "3.10", "3.11", "3.12"]

steps:
- uses: actions/checkout@v2
- name: Set up Python 3.10
uses: actions/setup-python@v2
- uses: actions/checkout@v4
with:
fetch-depth: 0
submodules: recursive
- name: Set up Python ${{ matrix.python_version }}
uses: actions/setup-python@v5
with:
python-version: "3.10"
python-version: ${{ matrix.python_version }}
- name: Install dependencies
run: |
git submodule sync --recursive
git submodule update --init --recursive --jobs=4
python -m pip install --upgrade pip
- name: Install package
run: |
python -m pip install --no-deps .
- name: Install requirements
run: |
python -m pip install -r dev-requirements.txt
- name: ruff
curl -LsSf https://astral.sh/uv/install.sh | sh
uv pip install ".[dev,docs]"
- name: ruff check
run: |
ruff pynxtools_apm tests
ruff check src/pynxtools_apm tests
- name: ruff formatting
run: |
ruff format --check pynxtools_apm tests
ruff format --check src/pynxtools_apm tests
- name: mypy
run: |
mypy pynxtools_apm tests
mypy src/pynxtools_apm tests
23 changes: 18 additions & 5 deletions .github/workflows/pytest.yml
@@ -1,3 +1,6 @@
# This workflow will install Python dependencies, run tests and lint with a single version of Python
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions

name: pytest

on:
@@ -6,6 +9,9 @@ on:
pull_request:
branches: [main]

env:
UV_SYSTEM_PYTHON: true

jobs:
pytest:
runs-on: ubuntu-latest
@@ -15,20 +21,27 @@ jobs:
python_version: ["3.8", "3.9", "3.10", "3.11", "3.12"]

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
fetch-depth: 0
submodules: recursive
- name: Set up Python ${{ matrix.python_version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python_version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
curl -LsSf https://astral.sh/uv/install.sh | sh
uv pip install coverage coveralls
- name: Install package
run: |
pip install ".[dev]"
uv pip install ".[dev]"
- name: Test with pytest
run: |
pytest tests
coverage run -m pytest -sv --show-capture=no tests
- name: Submit to coveralls
continue-on-error: true
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
coveralls --service=github
2 changes: 1 addition & 1 deletion .precommit-config.yaml → .pre-commit-config.yaml
@@ -1,7 +1,7 @@
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.3.4
rev: v0.5.5
hooks:
# Run the linter.
- id: ruff
25 changes: 25 additions & 0 deletions .vscode/launch.json
@@ -0,0 +1,25 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"name": "pynx-apm debug",
"type": "python",
"request": "launch",
"cwd": "${workspaceFolder}",
"program": "../.py3.12.4/bin/dataconverter",
"args": [//"convert",
"tests/data/eln/eln_data.yaml",
"tests/data/eln/apm.oasis.specific.yaml",
"tests/data/apt/Si.apt",
"tests/data/rng/87D_1.rng",
"--reader",
"apm",
"--nxdl",
"NXapm",
"--output=tests/prod/apt.Si.apt.nxs"]
}
]
}
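The debug configuration above corresponds roughly to the following command-line call (a sketch: it assumes the package is installed so that the `dataconverter` entry point is on the PATH, and that it is run from the repository root where the test fixtures live):

```shell
dataconverter \
  tests/data/eln/eln_data.yaml \
  tests/data/eln/apm.oasis.specific.yaml \
  tests/data/apt/Si.apt \
  tests/data/rng/87D_1.rng \
  --reader apm --nxdl NXapm \
  --output=tests/prod/apt.Si.apt.nxs
```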
31 changes: 12 additions & 19 deletions README.md
@@ -7,9 +7,9 @@
![](https://img.shields.io/pypi/v/pynxtools-apm)
![](https://coveralls.io/repos/github/FAIRmat-NFDI/pynxtools-apm/badge.svg?branch=main)

# A parser and normalizer for atom probe tomography and field-ion microscopy data
# Parse and normalize atom probe tomography and field-ion microscopy data

# Installation
## Installation
It is recommended to use Python 3.11 with a dedicated virtual environment for this package.
Learn how to manage [python versions](https://github.com/pyenv/pyenv) and
[virtual environments](https://realpython.com/python-virtual-environments-a-primer/).
@@ -19,9 +19,11 @@ This package is a reader plugin for [`pynxtools`](https://github.com/FAIRmat-NFD
pip install pynxtools[apm]
```

for the latest development version.
for the latest release version from [PyPI](https://pypi.org/project/pynxtools-apm/).

# Purpose
If you are interested in the newest version, we recommend working with a development installation instead.

## Purpose
This reader plugin for [`pynxtools`](https://github.com/FAIRmat-NFDI/pynxtools) is used to translate diverse file formats from the scientific community and technology partners
within the field of atom probe tomography and field-ion microscopy into a standardized representation using the
[NeXus](https://www.nexusformat.org/) application definition [NXapm](https://fairmat-nfdi.github.io/nexus_definitions/classes/contributed_definitions/NXapm.html#nxapm).
@@ -30,14 +32,12 @@ within the field of atom probe tomography and field-ion microscopy into a standa
This plugin supports the majority of the file formats that are currently used for atom probe.
A detailed summary is available in the [reference section of the documentation](https://fairmat-nfdi.github.io/pynxtools-apm).

# Getting started
## Getting started
[A getting started tutorial](https://github.com/FAIRmat-NFDI/pynxtools-apm/tree/main/examples) is offered that guides you
how to use the apm reader for converting your data to NeXus using a Jupyter notebook. That notebook details also the commands how to convert data via command line calls. Note that not every combination of input from a supported file format and other, typically electronic lab notebook, input for the parser allows filling the required and recommended fields and attributes of the NXapm application definition.
Therefore, you may need to provide an ELN file that contains the missing values in order for the
validation step of the APM reader to pass.
on how to use the apm reader for converting your data to NeXus using a Jupyter notebook or command line calls. Note that not every combination of input from a supported file format and other input, such as from an electronic lab notebook, allows filling the required and recommended fields and their attributes of the NXapm application definition. Therefore, you may need to provide an ELN file that contains the missing values in order for the validation step of the APM reader to pass.

# Contributing
We are continuously working on adding parsers for other data formats, technology partners, and atom probers.
## Contributing
We are continuously working on improving the collection of parsers and their functionalities.
If you would like to implement a parser for your data, feel free to get in contact.

## Development install
Expand All @@ -47,19 +47,12 @@ Install the package with its dependencies:
git clone https://github.com/FAIRmat-NFDI/pynxtools-apm.git --branch main --recursive pynxtools_apm
cd pynxtools_apm
python -m pip install --upgrade pip
python -m pip install -e .
python -m pip install -e ".[dev,docs]"
```

<!---There is also a [pre-commit hook](https://pre-commit.com/#intro) available
which formats the code and checks the linting before actually commiting.
It can be installed with
```shell
pre-commit install
```
from the root of this repository.

## Development Notes-->
The last line installs a [pre-commit hook](https://pre-commit.com/#intro) which
automatically formats, lints, and type-checks the code before committing.

## Test this software
Especially relevant for developers, there exists a basic test framework written in
