Merge pull request #45 from softwareunderground/t21-main
Part I of the T21 hackathon - versioning and autodeploy to PyPI/conda-forge
prisae committed Apr 19, 2021
2 parents ea4e3ae + 6d02421 commit 2e5bd0e
Showing 27 changed files with 560 additions and 262 deletions.
56 changes: 56 additions & 0 deletions .github/workflows/docs.yml
@@ -0,0 +1,56 @@
name: docs

# Controls when the action will run.
on:
  # Triggers the workflow on pushes to the main branch and on published releases
  push:
    branches:
      - main
  release:
    types:
      - published

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

jobs:
  docs:

    runs-on: ubuntu-latest

    env:
      DISPLAY: ':99.0'
      PYVISTA_OFF_SCREEN: 'True'
      ALLOW_PLOTTING: true
      SHELLOPTS: 'errexit:pipefail'

    steps:
      - uses: actions/checkout@v2
      - name: Setup Headless Display
        run: |
          sudo apt-get install libgl1-mesa-glx
          sudo apt-get install -y xvfb
          Xvfb :99 -screen 0 1024x768x24 > /dev/null 2>&1 &
          sleep 3
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.8'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements_dev.txt
          pip install -e .
      - name: Generate Sphinx Docs
        working-directory: docs/
        run: make html

      - name: Publish generated content to GitHub Pages
        uses: tsunematsu21/[email protected]
        with:
          dir: docs/build/html
          branch: gh-pages
          token: ${{ secrets.ACCESS_TOKEN }}
160 changes: 108 additions & 52 deletions .github/workflows/main.yml
@@ -1,64 +1,59 @@
# This is a basic workflow to help you get started with Actions
name: linux

name: subsurface CI

# Controls when the action will run.
on:
  # Triggers the workflow on push or pull request events for the listed branches
  push:
    branches: [ main, mig_dev, main_candidate, t21-main ]
  pull_request:
    branches: [ main, t21-main ]
  release:
    types:
      - published

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "build"
  pytest:
    runs-on: ubuntu-latest
    env:
      DISPLAY: ':99.0'
      PYVISTA_OFF_SCREEN: 'True'
      ALLOW_PLOTTING: true
      SHELLOPTS: 'errexit:pipefail'
    # The type of runner that the job will run on

    steps:
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.8'

      - uses: actions/checkout@v2
      - name: Setup Headless Display
        run: |
          sudo apt-get install libgl1-mesa-glx
          sudo apt-get install -y xvfb
          Xvfb :99 -screen 0 1024x768x24 > /dev/null 2>&1 &
          sleep 3
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install -r optional_requirements.txt
      - name: Test with pytest
        run: |
          pip install pytest
          pip install pytest-cov
          pytest
  docs:
    runs-on: ubuntu-latest
    name: ${{ matrix.case.os }} py${{ matrix.case.python-version }} ${{ matrix.case.name }}
    runs-on: ${{ matrix.case.os }}-latest

    strategy:
      fail-fast: false
      matrix:
        os: [ubuntu, ]  # macos, windows]  # Only Linux currently.
        case:
          - python-version: 3.8
            name: basic
            os: ubuntu
          # - python-version: 3.9
          #   name: basic
          #   os: ubuntu

    env:
      DISPLAY: ':99.0'
      PYVISTA_OFF_SCREEN: 'True'
      ALLOW_PLOTTING: true
      SHELLOPTS: 'errexit:pipefail'
      OS: ${{ matrix.case.os }}
      PYTHON: ${{ matrix.case.python-version }}

    steps:

      # Cancel any previous run of the test job; [pin v0.8.0 (2021-02-13)]
      - name: Cancel Previous Runs
        uses: styfle/cancel-workflow-action@3d86a7cc43670094ac248017207be0295edbc31d
        with:
          access_token: ${{ github.token }}

      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.case.python-version }}

      - uses: actions/checkout@v2
      - name: Setup Headless Display
        run: |
@@ -67,24 +62,85 @@ jobs:
          Xvfb :99 -screen 0 1024x768x24 > /dev/null 2>&1 &
          sleep 3
      - name: Set up Python
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements_dev.txt
      - name: Test with pytest
        run: pytest --cov=subsurface

  # # # # DEPLOY # # #
  deploy:
    needs: pytest
    name: Deploy to PyPI
    runs-on: ubuntu-latest
    # Only from the origin repository, not forks; only main and tags.
    if: github.repository_owner == 'softwareunderground' && (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/'))

    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE
      - name: Checkout
        uses: actions/checkout@v2
        with:
          # Need to fetch more than the last commit so that setuptools_scm can
          # create the correct version string. If the number of commits since
          # the last release is greater than this, the version will still be
          # wrong. Increase if necessary.
          fetch-depth: 100
          # The GitHub token is preserved by default but this job doesn't need
          # to be able to push to GitHub.
          persist-credentials: false

      # Need the tags so that setuptools_scm can form a valid version number
      - name: Fetch git tags
        run: git fetch origin 'refs/tags/*:refs/tags/*'

      - name: Setup Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.8'
          python-version: "3.8"

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install wheel setuptools_scm
          pip install -r requirements.txt
          pip install -r optional_requirements.txt
          pip install -r dev_requirements.txt
          pip install -e .
      - name: Generate Sphinx Docs
        working-directory: docs/
        run: make html
      - name: Publish generated content to GitHub Pages
        uses: tsunematsu21/[email protected]
      - name: Build source and wheel distributions
        if: github.ref == 'refs/heads/main'
        run: |
          # Change setuptools-scm local_scheme to "no-local-version" so the
          # local part of the version isn't included, making the version string
          # compatible with Test PyPI.
          sed --in-place "s/'root'/'local_scheme':'no-local-version','root'/g" setup.py
      - name: Build source and wheel distributions
        run: |
          # Build source and wheel packages
          python setup.py sdist
          python setup.py bdist_wheel
          echo ""
          echo "Generated files:"
          ls -lh dist/
      - name: Publish to Test PyPI
        if: success()
        # Hash corresponds to v1.4.1
        uses: pypa/gh-action-pypi-publish@54b39fb9371c0b3a6f9f14bb8a67394defc7a806
        with:
          dir: docs/build/html
          branch: gh-pages
          token: ${{ secrets.ACCESS_TOKEN }}
          user: __token__
          password: ${{ secrets.TEST_PYPI_PASSWORD }}
          repository_url: https://test.pypi.org/legacy/
          # Allow existing releases on test PyPI without errors.
          # NOT TO BE USED in PyPI!
          skip_existing: true

      - name: Publish to PyPI
        # Only for releases
        if: success() && github.event_name == 'release'
        # Hash corresponds to v1.4.1
        uses: pypa/gh-action-pypi-publish@54b39fb9371c0b3a6f9f14bb8a67394defc7a806
        with:
          user: __token__
          password: ${{ secrets.PYPI_PASSWORD }}
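
The deploy job relies on setuptools_scm to derive the package version from git tags; that is why it fetches a deep history and the tags, and why the sed call inserts 'local_scheme': 'no-local-version' in front of the 'root' key before uploading to Test PyPI. The sketch below shows the kind of setup.py this implies. It is an assumption for illustration, not a copy of the repository's file, although the write_to target matches the subsurface/_version.py entry added to .gitignore below.

```python
# Hypothetical setup.py sketch using setuptools_scm; the keyword values are
# illustrative assumptions, not the repository's actual configuration.
from setuptools import setup, find_packages

setup(
    name="subsurface",
    packages=find_packages(exclude=("tests", "docs", "examples")),
    # setuptools_scm derives the version string from the latest git tag;
    # the deploy job's sed inserts 'local_scheme': 'no-local-version'
    # immediately before the 'root' key for the Test PyPI upload.
    use_scm_version={
        'root': '.',
        'relative_to': __file__,
        'write_to': 'subsurface/_version.py',
    },
    setup_requires=['setuptools_scm'],
)
```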
4 changes: 3 additions & 1 deletion .gitignore
@@ -1,3 +1,5 @@
subsurface/_version.py

# vscode
.vscode/*

@@ -116,4 +118,4 @@ venv.bak/
!/docs/source/_templates/
/data/

.idea
.idea
37 changes: 17 additions & 20 deletions DevelopersGuide.md
@@ -1,24 +1,21 @@
Before a release.
NOTES
-----

TODO: Re-work and automate this. Steps that should be necessary:


**Important.** Due to the use of ``setuptools_scm``, everything is by default
added to the wheel on PyPI. Documents that should not be in a release have to
be excluded by adding them to the ``MANIFEST.in``.


Making a release
----------------
# set version number in setup.py, also in the config file of the documentation and init of the package
- [ ] setup.py
- [ ] subsurface._version
> Note: in the config for sphinx~ this is taken from subsurface._version
Github release
--------------
# add new tag
$ git tag X.X -m "Add X.X tag for PyPI"
# push git tag
$ git push --tags origin master

PyPi release
------------
# First create the dist
python3 setup.py sdist bdist_wheel

# Second upload the distributions
twine upload dist/*

Create a release on GitHub. This will run the tests and then automatically
deploy the package to PyPI, from where conda-forge will pick it up as well.
If everything works, it should be available on PyPI within minutes of the
tests passing, and on conda-forge within an hour or two.


### Type of commits:
11 changes: 11 additions & 0 deletions MANIFEST.in
@@ -0,0 +1,11 @@
prune docs
prune examples
prune tests
prune .github
exclude CONTRIBUTING.md
exclude DevelopersGuide.md
exclude MANIFEST.in
exclude requirements.txt
exclude requirements_dev.txt
exclude requirements_opt.txt
exclude .gitignore
48 changes: 27 additions & 21 deletions README.md
@@ -14,27 +14,33 @@ The difference between data levels is **not** which data they store but which da

**Human**

\=================================/'
\===============================/ ' \
\==========geo_format=========/ ' \ -> Additional context/meta information about the data
\===========================/' ' \
\=======geo_object========/ ' ' \ -> Elements that represent some
\=======================/ ' ' / geological concept. E.g: faults, seismic
\=====================/' ' ' ' /
\======element======/' ' ' ' / -> type of geometric object: PointSet,
\=================/' ' ' ' / TriSurf, LineSet, Tetramesh
\=primary_struct/ '' / - > Set of arrays that define a geometric object:
\=============/ ' ' / e.g. *StructuredData* **UnstructuredData**
\============/'' /
\DF/Xarray/ ' '/ -> Label numpy.arrays
\=======/'' /
\array/' / -> Memory allocation
\===/ /
\=//
'

\‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾‾/\
\= = = = = = = = = = = = = = /. \ -> Additional context/meta information about the data
\= = = = geo_format= = = = /. . \
\= = = = = = = = = = = = /. . . \ -> Elements that represent some
\= = = geo_object= = = /. . . . \ geological concept. E.g: faults, seismic
\= = = = = = = = = = /. . . . ./
\= = element = = = /. . . . / -> type of geometric object: PointSet,
\= = = = = = = = /. . . ./ TriSurf, LineSet, Tetramesh
\primary_struct/. . . / -> Set of arrays that define a geometric object:
\= = = = = = /. . ./ e.g. *StructuredData* **UnstructuredData**
\DF/Xarray /. . / -> Label numpy.arrays
\= = = = /. ./
\array /. / -> Memory allocation
\= = /./
\= //
\/


**Computer**

## Documentation (WIP)

An early version of the documentation can be found here:

https://softwareunderground.github.io/subsurface/

## Installation

`pip install subsurface`
@@ -49,9 +55,9 @@ We are changing things. Help us figure it out!

#### Original statement

The goal of this project is to support other subsurface geoscience and
engineering projects with a set of classes for common subsurface data entities,
such as seismic and GPR datasets, log curves, and so on. The current plan is to
use `xarray` under the hood, with `pint` for units support and `cartopy` for CRS and map support.

It's early days, everything might change. Help us figure it out!
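
As a rough, hypothetical illustration of that plan (plain xarray and pint only, not subsurface's own API), labelling a NumPy array with named dimensions, coordinates, and units could look like this:

```python
# Generic xarray/pint sketch of labelled arrays with units; this is NOT the
# subsurface API, just an illustration of the underlying idea.
import numpy as np
import pint
import xarray as xr

ureg = pint.UnitRegistry()

# A small "well log": measured depth vs. a property, labelled with units.
depth = np.linspace(0.0, 1500.0, 4)          # metres
gamma = np.array([45.0, 80.0, 62.0, 55.0])   # gamma-ray readings

log = xr.DataArray(
    gamma,
    dims="md",
    coords={"md": depth},
    name="gamma_ray",
    attrs={"units": "gAPI", "md_units": "m"},
)

# pint can then carry the units through computations on the raw values.
md_quantity = log["md"].values * ureg.meter
print(log)
print(md_quantity.to("ft")[:2])
```

The same idea scales from a single log curve to structured and unstructured meshes by adding more dimensions and coordinate arrays.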
2 changes: 0 additions & 2 deletions dev_requirements.txt

This file was deleted.

Binary file removed docs/logos/subsurface_place_holder.png
Binary file not shown.
Binary file added docs/source/_static/logos/favicon.ico
Binary file not shown.
