Commit 4ef393a

Merge pull request #19 from crobby/100rebase

Rebasing the v1.0.0 branch

tmckayus authored and Chad Roberts committed Feb 25, 2021
2 parents 34d089e + a76daac
Showing 16 changed files with 100 additions and 22 deletions.
11 changes: 11 additions & 0 deletions .github/ISSUE_TEMPLATE/major-release.md
@@ -0,0 +1,11 @@
---
name: Major release
about: Create a new major release
title: New major release
assignees: 'sesheta'
labels: bot
---

Hey, Kebechet!

Create a new major release, please.
11 changes: 11 additions & 0 deletions .github/ISSUE_TEMPLATE/minor-release.md
@@ -0,0 +1,11 @@
---
name: Minor release
about: Create a new minor release
title: New minor release
assignees: 'sesheta'
labels: bot
---

Hey, Kebechet!

Create a new minor release, please.
11 changes: 11 additions & 0 deletions .github/ISSUE_TEMPLATE/patch-release.md
@@ -0,0 +1,11 @@
---
name: Patch release
about: Create a new patch release
title: New patch release
assignees: 'sesheta'
labels: bot
---

Hey, Kebechet!

Create a new patch release, please.
12 changes: 12 additions & 0 deletions .thoth.yaml
@@ -0,0 +1,12 @@
managers:
  - name: version
    configuration:
      maintainers:
        - crobby
        - vpavlin
        - lavlas
        - nakfour
      assignees:
        - sesheta
      labels: [bot]
      changelog_file: true
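
The three issue templates above and this `.thoth.yaml` together enable bot-driven releases: opening an issue titled "New major/minor/patch release" with the `bot` label asks Kebechet's version manager to bump the version and update the changelog. As a hedged illustration only (not part of this commit; it assumes the GitHub `gh` CLI is installed, authenticated, and run from a clone of this repository), a maintainer could file such a request from the command line instead of the web template:

```sh
# Editor's sketch: open a patch-release request issue from the CLI.
# The title and label mirror .github/ISSUE_TEMPLATE/patch-release.md above;
# using gh rather than the web template is an assumption, not part of this PR.
gh issue create \
  --title "New patch release" \
  --label bot \
  --body "Hey, Kebechet!

Create a new patch release, please."
```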
18 changes: 18 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,18 @@

## Release 1.0.1 (2021-02-18T21:17:04)
### Features
* issue template for release via bots (#322)
* include thoth configuration file for kebechet (#323)
* Change ODH dashboard type to ClusterIP (#321)
* upgrade s2i notebook images for jupyterhub (#304)
* Update README.md (#307)
* Owners List Updates (#306)
* Update grafana to version 3.8.1 (#301)
* Update peak operator file to include odh-manifests repo url (#299)
* Adding changes to kfdef (#298)
* Updated jupyterhub imagestream image to 0.1.5 (#296)
### Bug Fixes
* Update JH image to v0.2.0 to fix cert issues and support groups (#312)
### Improvements
* Simplify test for dashboard pods, hopefully eliminating flake (#305)
* Use airflowui secure route for tests (#300)
2 changes: 1 addition & 1 deletion OWNERS
@@ -1,5 +1,6 @@
# Each list is sorted alphabetically, additions should maintain that order
approvers:
- anishasthana
- crobby
- lavlas
- nakfour
@@ -10,5 +11,4 @@ reviewers:
- crobby
- lavlas
- nakfour
- VedantMahabaleshwarkar
- vpavlin
@@ -16,7 +16,7 @@ spec:
openshift.io/imported-from: quay.io/thoth-station/s2i-minimal-notebook
from:
kind: DockerImage
-name: quay.io/thoth-station/s2i-minimal-notebook:v0.0.4
-name: "v0.0.4"
+name: quay.io/thoth-station/s2i-minimal-notebook:v0.0.7
+name: "v0.0.7"
referencePolicy:
type: Source
@@ -16,7 +16,7 @@ spec:
openshift.io/imported-from: quay.io/thoth-station/s2i-scipy-notebook
from:
kind: DockerImage
-name: quay.io/thoth-station/s2i-scipy-notebook:v0.0.1
-name: "v0.0.1"
+name: quay.io/thoth-station/s2i-scipy-notebook:v0.0.2
+name: "v0.0.2"
referencePolicy:
type: Source
@@ -16,7 +16,7 @@ spec:
openshift.io/imported-from: quay.io/thoth-station/s2i-tensorflow-notebook
from:
kind: DockerImage
-name: quay.io/thoth-station/s2i-tensorflow-notebook:v0.0.1
-name: "v0.0.1"
+name: quay.io/thoth-station/s2i-tensorflow-notebook:v0.0.2
+name: "v0.0.2"
referencePolicy:
type: Source
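
The three hunks above bump the minimal, scipy, and tensorflow notebook ImageStream tags. For anyone applying this branch, one hedged way to confirm the new tags were imported (not part of this diff; the namespace `odh` is an assumption, adjust to wherever the manifests were deployed):

```sh
# Editor's sketch, not part of this PR: list the notebook imagestreams and
# their tags after deployment. The namespace is an assumption.
oc get imagestreams -n odh | grep s2i-
```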
2 changes: 1 addition & 1 deletion odh-dashboard/base/deployment.yaml
@@ -15,7 +15,7 @@ spec:
serviceAccount: odh-dashboard
containers:
- name: odh-dashboard
-image: quay.io/opendatahub/odh-dashboard:v1.0
+image: quay.io/modh/odh-dashboard:latest
imagePullPolicy: Always
ports:
- containerPort: 8080
1 change: 0 additions & 1 deletion odh-dashboard/base/service.yaml
@@ -5,7 +5,6 @@ metadata:
spec:
selector:
deployment: odh-dashboard
-type: LoadBalancer
ports:
- protocol: TCP
targetPort: 8080
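
Dropping `type: LoadBalancer` means the Service falls back to the Kubernetes default, `ClusterIP`, which matches the changelog entry "Change ODH dashboard type to ClusterIP". A hedged way to verify the resulting type (not part of this diff; the Service name `odh-dashboard` and the namespace are assumptions, since the metadata is not shown in this hunk):

```sh
# Editor's sketch, not part of this PR: print the dashboard Service type.
# Service name and namespace are assumed; adjust to your deployment.
oc get service odh-dashboard -n odh -o jsonpath='{.spec.type}{"\n"}'
```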
14 changes: 7 additions & 7 deletions odhargo/README.md
@@ -1,14 +1,14 @@
# ODH Argo

-![ODH Argo version](https://img.shields.io/badge/ODH_Argo_version-v2.12.5-yellow.svg) <!-- v2.12.5 -->
+![ODH Argo Workflows version](https://img.shields.io/badge/ODH_Argo_version-v2.12.5-yellow.svg) <!-- v2.12.5 -->
![Upstream version](https://img.shields.io/github/v/release/argoproj/argo?label=Upstream%20release)

-ODH Argo component installs Argo that is namespace bound and not cluster wide. There are two pods running after installation
+ODH Argo component installs Argo Workflows that is namespace bound and not cluster wide. There are two pods running after installation

1. Argo Server
2. Argo Controller

-This Argo installation uses the "k8sapi" executor to work on Openshift.
+This Argo Workflows installation uses the "k8sapi" executor to work on Openshift.

### Folders

@@ -18,11 +18,11 @@ This Argo installation uses the "k8sapi" executor to work on Openshift.

### Argo Server

-This installation creates a route to the Argo portal. To access the portal go to `Routes` and click on the `Argo Server` route.
+This installation creates a route to the Argo Workflows portal. To access the portal go to `Routes` and click on the `Argo Server` route.

### Installation

-To install Argo add the following to the `kfctl` yaml file.
+To install Argo Workflows add the following to the `kfctl` yaml file.

```yaml
- kustomizeConfig:
@@ -49,12 +49,12 @@ metadata:
spec: ...
```
-or submit it via [Argo CLI](https://github.com/argoproj/argo/releases):
+or submit it via [Argo Workflows CLI](https://github.com/argoproj/argo/releases):
```sh
argo submit odhargo/base/test-workflow.yaml
```

### Known issues

-- Argo UI raises 2 "Forbidden" notifications on initial page load. This is just a cosmetic issue and doesn't effect functionality. [argoproj/argo#4885](https://github.com/argoproj/argo/issues/4885)
+- Argo Workflows UI raises 2 "Forbidden" notifications on initial page load. This is just a cosmetic issue and doesn't effect functionality. [argoproj/argo#4885](https://github.com/argoproj/argo/issues/4885)
4 changes: 4 additions & 0 deletions tests/README.md
@@ -33,6 +33,10 @@ If you'd like to run the tests against an instance that already has Open Data Hu
you set `SKIP_INSTALL=true` and that will cause the test run
to skip the installation process and will only run the tests. example: `make run SKIP_INSTALL=true`

If you'd like to run the tests against an instance that already has a KfDef created,
you set `SKIP_KFDEF_INSTALL=true` and that will cause the test run
to skip the step of creating the default KfDef. example: `make run SKIP_KFDEF_INSTALL=true`

If you'd like to run a single test instead of all tests, you can
set the TESTS_REGEX variable `TESTS_REGEX=<name of the test to run>`. That will
only run the test that you specify instead of all of the tests. example: `make run TESTS_REGEX=grafana`
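
These options can also be combined; a hedged example (not part of this diff, and it assumes the Makefile forwards all three variables to the test scripts the same way the individual examples above do):

```sh
# Editor's sketch, not part of this PR: reuse an existing ODH install and an
# operator-created KfDef, and run only the JupyterHub test.
# TESTS_REGEX=jupyterhub is an assumed test name based on tests/basictests/jupyterhub.sh.
make run SKIP_INSTALL=true SKIP_KFDEF_INSTALL=true TESTS_REGEX=jupyterhub
```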
2 changes: 1 addition & 1 deletion tests/basictests/jupyterhub.sh
@@ -11,7 +11,7 @@ JH_LOGIN_PASS=${OPENSHIFT_PASS:-"admin"} #Password used to login to JH
OPENSHIFT_LOGIN_PROVIDER=${OPENSHIFT_LOGIN_PROVIDER:-"htpasswd-provider"} #OpenShift OAuth provider used for login
JH_AS_ADMIN=${JH_AS_ADMIN:-"true"} #Expect the user to be Admin in JupyterHub

-JUPYTER_IMAGES=(s2i-minimal-notebook:v0.0.4 s2i-scipy-notebook:v0.0.1 s2i-tensorflow-notebook:v0.0.1 s2i-spark-minimal-notebook:py36-spark2.4.5-hadoop2.7.3)
+JUPYTER_IMAGES=(s2i-minimal-notebook:v0.0.7 s2i-scipy-notebook:v0.0.2 s2i-tensorflow-notebook:v0.0.2 s2i-spark-minimal-notebook:py36-spark2.4.5-hadoop2.7.3)
JUPYTER_NOTEBOOK_FILES=(basic.ipynb basic.ipynb tensorflow.ipynb spark.ipynb)

os::test::junit::declare_suite_start "$MY_SCRIPT"
17 changes: 12 additions & 5 deletions tests/scripts/install.sh
@@ -68,13 +68,20 @@ if [ -z "${OPENSHIFT_USER}" ] || [ -z "${OPENSHIFT_PASS}" ]; then
export OPENSHIFT_PASS=admin
fi

echo "Creating the following KfDef"
cat ./kfctl_openshift.yaml > ${ARTIFACT_DIR}/kfctl_openshift.yaml
oc apply -f ./kfctl_openshift.yaml
kfctl_result=$?
if [ "$kfctl_result" -ne 0 ]; then
if ! [ -z "${SKIP_KFDEF_INSTALL}" ]; then
## SKIP_KFDEF_INSTALL is useful in an instance where the
## operator install comes with an init container to handle
## the KfDef creation
echo "Relying on existing KfDef because SKIP_KFDEF_INSTALL was set"
else
echo "Creating the following KfDef"
cat ./kfctl_openshift.yaml > ${ARTIFACT_DIR}/kfctl_openshift.yaml
oc apply -f ./kfctl_openshift.yaml
kfctl_result=$?
if [ "$kfctl_result" -ne 0 ]; then
echo "The installation failed"
exit $kfctl_result
fi
fi
set +x
popd
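
One stylistic note on the new guard: `! [ -z "${SKIP_KFDEF_INSTALL}" ]` is a double negative ("not zero-length"). A behaviorally equivalent form using `-n`, shown only as an editor's sketch and not as part of the commit:

```sh
# Editor's sketch, not part of this PR: [ -n STRING ] is true exactly when
# STRING is non-empty, i.e. the same condition as ! [ -z STRING ].
if [ -n "${SKIP_KFDEF_INSTALL}" ]; then
  echo "Relying on existing KfDef because SKIP_KFDEF_INSTALL was set"
else
  echo "Creating the following KfDef"
  cat ./kfctl_openshift.yaml > "${ARTIFACT_DIR}/kfctl_openshift.yaml"
  oc apply -f ./kfctl_openshift.yaml
  kfctl_result=$?
  if [ "$kfctl_result" -ne 0 ]; then
    echo "The installation failed"
    exit "$kfctl_result"
  fi
fi
```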
5 changes: 5 additions & 0 deletions version.py
@@ -0,0 +1,5 @@
#!/usr/bin/env python3
"""This file is just for the release bots of opendatahub manifests"""


__version__ = "1.0.1"
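
A quick way to confirm the version string the release bots will read, run from the repository root (editor's sketch, not part of the commit):

```sh
# Editor's sketch: print the version declared in version.py.
python3 -c "from version import __version__; print(__version__)"
```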
