Commit

Merge branch 'dev' into etl-gui-changes
milo-hyben committed Sep 26, 2023
2 parents 8e22771 + d6f9bb4 commit 33826ef
Showing 27 changed files with 111 additions and 87 deletions.
6 changes: 5 additions & 1 deletion .bumpversion.cfg
@@ -1,5 +1,5 @@
[bumpversion]
current_version = 6.2.0
current_version = 6.2.1
commit = True
tag = False
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>[A-z0-9-]+)
@@ -15,3 +15,7 @@ replace = version='{new_version}',
[bumpversion:file:deploy/python/version.txt]
search = {current_version}
replace = {new_version}

[bumpversion:file:web/package.json]
search = "version": "{current_version}",
replace = "version": "{new_version}",
17 changes: 13 additions & 4 deletions .github/workflows/deploy.yaml
@@ -7,6 +7,10 @@ on:
- main
- dev

permissions:
id-token: write
contents: read

jobs:
deploy:
runs-on: ubuntu-latest
@@ -23,11 +27,16 @@ jobs:
steps:
- uses: actions/checkout@v3

- name: "gcloud setup"
uses: google-github-actions/setup-gcloud@v1
- id: "google-cloud-auth"
name: "Authenticate to Google Cloud"
uses: "google-github-actions/auth@v1"
with:
project_id: sample-metadata
service_account_key: ${{ secrets.GCP_SERVER_DEPLOY_KEY }}
workload_identity_provider: "projects/774248915715/locations/global/workloadIdentityPools/gh-deploy-pool/providers/gh-provider"
service_account: "[email protected]"

- id: "google-cloud-sdk-setup"
name: "Set up Cloud SDK"
uses: google-github-actions/setup-gcloud@v1

- name: "gcloud docker auth"
run: |
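
The replaced `setup-gcloud` step authenticated with a long-lived service-account key (`GCP_SERVER_DEPLOY_KEY`); the new `google-github-actions/auth` step uses Workload Identity Federation instead, which is why the workflow now needs the `id-token: write` permission to mint an OIDC token. A hypothetical sanity-check step (not part of this commit) could confirm the federated identity before any images are pushed:

```bash
# Illustrative only: verify the active credential and project after the
# auth and setup-gcloud steps, before running docker push / deploy commands.
gcloud auth list --filter=status:ACTIVE --format="value(account)"
gcloud config list --format="value(core.project)"
```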
4 changes: 2 additions & 2 deletions README.md
@@ -1,6 +1,6 @@
# Sample Metadata

[![codecov](https://codecov.io/gh/populationgenomics/sample-metadata/branch/dev/graph/badge.svg?token=OI3XZYR9HK)](https://codecov.io/gh/populationgenomics/sample-metadata)
[![codecov](https://codecov.io/gh/populationgenomics/metamist/branch/dev/graph/badge.svg?token=OI3XZYR9HK)](https://codecov.io/gh/populationgenomics/metamist)

Metamist is a database that stores **de-identified** -omics metadata.

@@ -390,7 +390,7 @@ will build the docker container and supply it to regenerate_api.py.

```bash
# SM_DOCKER is a known env variable to regenerate_api.py
export SM_DOCKER="cpg/sample-metadata-server:dev"
export SM_DOCKER="cpg/metamist-server:dev"
docker build --build-arg SM_ENVIRONMENT=local -t $SM_DOCKER -f deploy/api/Dockerfile .
python regenerate_api.py
```
2 changes: 1 addition & 1 deletion api/server.py
@@ -20,7 +20,7 @@
from api.settings import PROFILE_REQUESTS, SKIP_DATABASE_CONNECTION

# This tag is automatically updated by bump2version
_VERSION = '6.2.0'
_VERSION = '6.2.1'

logger = get_logger()

8 changes: 4 additions & 4 deletions db/backup/README.md
@@ -33,7 +33,7 @@ The full recovery process is detailed below. Follow the recovery stages that are
3. Clone this repo
> ```bash
> git clone https://github.com/populationgenomics/sample-metadata.git
> git clone https://github.com/populationgenomics/metamist.git
> ```
4. Navigate to the appropriate directory
@@ -85,7 +85,7 @@ The full recovery process is detailed below. Follow the recovery stages that are
### Daily Backup
A cron job runs a [backup script](https://github.com/populationgenomics/sample-metadata/blob/dev/db/backup/backup.py) daily. The script outputs a folder that is uploaded to GCS in the [cpg-sm-backups](https://console.cloud.google.com/storage/browser/cpg-sm-backups;tab=objects?forceOnBucketsSortingFiltering=false&project=sample-metadata&prefix=&forceOnObjectsSortingFiltering=false) bucket.
A cron job runs a [backup script](https://github.com/populationgenomics/metamist/blob/dev/db/backup/backup.py) daily. The script outputs a folder that is uploaded to GCS in the [cpg-sm-backups](https://console.cloud.google.com/storage/browser/cpg-sm-backups;tab=objects?forceOnBucketsSortingFiltering=false&project=sample-metadata&prefix=&forceOnObjectsSortingFiltering=false) bucket.
All backups will be retained for 30 days in the event that they are deleted.
Setting up
@@ -141,7 +141,7 @@ IMPORTANT: Do not run the validation script in a production environment. In orde
2. Clone this repo
> ```bash
> git clone https://github.com/populationgenomics/sample-metadata.git
> git clone https://github.com/populationgenomics/metamist.git
> ```
3. Navigate to the directory
@@ -196,4 +196,4 @@ To test our monitoring and alerting policy, once a year our database backups wil
Further, alongside the procedure to validate the [database restoration](#running-the-validation-script), the SM API will be validated.
1. Update the configuration to point to the new VM as the production VM.
2. Run the test script, currently under construction [#35](https://github.com/populationgenomics/sample-metadata/pull/35)
2. Run the test script, currently under construction [#35](https://github.com/populationgenomics/metamist/pull/35)
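
Since the daily-backup section above points at the `cpg-sm-backups` bucket, a quick, hedged way to confirm that recent backups have landed (assuming `gsutil` access and that backup folders sit at the bucket root) is:

```bash
# List the last few objects/folders in the backup bucket to confirm the
# cron job is still producing daily backups.
gsutil ls gs://cpg-sm-backups/ | tail -n 5
```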
2 changes: 1 addition & 1 deletion db/python/tables/project.py
@@ -409,7 +409,7 @@ async def create_project(
return project_id

async def update_project(self, project_name: str, update: dict, author: str):
"""Update a sample-metadata project"""
"""Update a metamist project"""
await self.check_project_creator_permissions(author)

meta = update.get('meta')
2 changes: 1 addition & 1 deletion deploy/python/version.txt
@@ -1 +1 @@
6.2.0
6.2.1
10 changes: 5 additions & 5 deletions metamist/parser/generic_metadata_parser.py
@@ -4,20 +4,20 @@
import re
import shlex
from functools import reduce
from typing import Dict, List, Optional, Any, Tuple, Union
from typing import Any, Dict, List, Optional, Tuple, Union

import click

from metamist.parser.generic_parser import (
GenericParser,
GroupedRow,
ParsedSequencingGroup,
ParsedAnalysis,
ParsedAssay,
ParsedSequencingGroup,
# noqa
SingleRow,
run_as_sync,
ParsedAnalysis,
) # noqa
)

__DOC = """
Parse CSV / TSV manifest of arbitrary format.
@@ -697,7 +697,7 @@ async def get_analyses_from_sequencing_group(
@click.option(
'--project',
required=True,
help='The sample-metadata project ($DATASET) to import manifest into',
help='The metamist project ($DATASET) to import manifest into',
)
@click.option('--sample-name-column', required=True)
@click.option(
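
For context, the click options shown above (`--project`, `--sample-name-column`, ...) drive the CLI described in `__DOC`. The invocation below is purely illustrative: the module entry point, manifest name, and column value are assumptions, and the real command likely needs additional options that are collapsed in this view:

```bash
# Hypothetical example only -- options beyond --project and
# --sample-name-column are not visible in this diff.
python -m metamist.parser.generic_metadata_parser \
    --project "$DATASET" \
    --sample-name-column "Individual ID" \
    manifest.tsv
```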
2 changes: 1 addition & 1 deletion metamist/parser/generic_parser.py
@@ -426,7 +426,7 @@ def __init__( # pylint: disable=too-many-arguments
self.ignore_extra_keys = ignore_extra_keys

if not project:
raise ValueError('sample-metadata project is required')
raise ValueError('A metamist project is required')

self.project = project

4 changes: 2 additions & 2 deletions metamist/parser/sample_file_map_parser.py
@@ -1,7 +1,7 @@
#!/usr/bin/env python3
# pylint: disable=too-many-instance-attributes,too-many-locals,unused-argument,wrong-import-order,unused-argument
from typing import List
import logging
from typing import List

import click

@@ -120,7 +120,7 @@ def get_sample_id(self, row: SingleRow) -> str:
@click.command(help=__DOC)
@click.option(
'--project',
help='The sample-metadata project to import manifest into',
help='The metamist project to import manifest into',
)
@click.option('--default-sample-type', default='blood')
@click.option('--default-sequence-type', default='wgs')
4 changes: 2 additions & 2 deletions metamist_infrastructure/README.md
@@ -18,12 +18,12 @@ The `setup.py` in this directory does some directory magic, it adds:
You can install from git using:

```bash
pip install git+https://github.com/populationgenomics/sample-metadata.git@main#subdirectory=metamist_infrastructure
pip install git+https://github.com/populationgenomics/metamist.git@main#subdirectory=metamist_infrastructure
```

Or add the following to your `requirements.txt`:

```text
# other requirements here
metamist-infrastructure @ git+https://github.com/populationgenomics/sample-metadata.git@main#subdirectory=metamist_infrastructure
metamist-infrastructure @ git+https://github.com/populationgenomics/metamist.git@main#subdirectory=metamist_infrastructure
```
2 changes: 1 addition & 1 deletion metamist_infrastructure/setup.py
@@ -19,7 +19,7 @@
description='Metamist infrastructure plugin for cpg-infrastructure',
long_description=readme,
long_description_content_type='text/markdown',
url=f'https://github.com/populationgenomics/sample-metadata',
url='https://github.com/populationgenomics/metamist',
license='MIT',
packages=[
'metamist_infrastructure',
15 changes: 7 additions & 8 deletions scripts/arbitrary_sm.py
@@ -1,7 +1,7 @@
#!/usr/bin/env python
# pylint: disable=no-member,consider-using-with
"""
Run the sample-metadata API through the analysis runner
Run the metamist API through the analysis runner
in a very generic, customisable way!
For example:
@@ -11,13 +11,12 @@
--json '{"project": "acute-care", "external_id": "<external-id>"}'
"""
from typing import List
import os.path
import logging
import subprocess

import argparse
import json
import logging
import os.path
import subprocess
from typing import List

from metamist import apis
from metamist.model_utils import file_type
@@ -39,7 +38,7 @@ def run_sm(
api = getattr(apis, api_class_name)
api_instance = api()

# the latest sample-metadata API wants an IOBase, so let's
# the latest metamist API wants an IOBase, so let's
# scan through the params, open and substitute the files
openapi_types = getattr(api_instance, f'{method_name}_endpoint').__dict__[
'openapi_types'
@@ -90,7 +89,7 @@ def from_args(args):
def main(args=None):
"""Main function, parses sys.argv"""

parser = argparse.ArgumentParser('Arbitrary sample-metadata script')
parser = argparse.ArgumentParser('Arbitrary metamist script')
parser.add_argument('api_name')
parser.add_argument('method_name')
parser.add_argument(
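
Tying the pieces of `arbitrary_sm.py` together, a hypothetical invocation passes an API class name, a method name, and a JSON payload; the class and method below are placeholders rather than names confirmed by this diff:

```bash
# Placeholder API class / method names -- check metamist.apis for the
# real ones. File-typed parameters in the JSON payload are opened and
# substituted by run_sm before the call is made.
python scripts/arbitrary_sm.py SampleApi get_sample_by_external_id \
    --json '{"project": "acute-care", "external_id": "<external-id>"}'
```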
3 changes: 2 additions & 1 deletion scripts/back_populate_library_type.py
@@ -6,6 +6,7 @@
"""
import logging
import re

import click

from metamist.apis import AssayApi
@@ -45,7 +46,7 @@
@click.option(
'--project',
required=True,
help='The sample-metadata project ($DATASET)',
help='The metamist project ($DATASET)',
)
@click.option(
'-d',
3 changes: 2 additions & 1 deletion scripts/back_populate_sequences.py
@@ -10,6 +10,7 @@
The assay ID would be HTJMHDSX3_4_220817_FD123456
"""
import logging

import click

from metamist.apis import AssayApi
@@ -24,7 +25,7 @@
@click.option(
'--project',
required=True,
help='The sample-metadata project ($DATASET)',
help='The metamist project ($DATASET)',
)
def main(project: str):
"""Back populate external_ids for existing assays"""
(Diffs for the remaining changed files are not loaded in this view.)