Improve github CI, and helm default docker version (#1415)
Add a Docker build to the GitHub CI, even though the resulting image isn't pushed yet, and
stop using "latest" as the default container version: defaulting to "latest" is clearly wrong,
and the latest tag hasn't been updated in a long time.

travis-ci.org has also stopped building this repository, so we should focus on GitHub
workflows. Ideally we'd move the Docker container build into GitHub as well, so that all
CI is managed in one place.

Signed-off-by: Tom Hellier <[email protected]>
TomHellier authored Dec 2, 2021
1 parent 3a0e804 commit 537bab5
Showing 11 changed files with 79 additions and 70 deletions.
29 changes: 26 additions & 3 deletions .github/workflows/main.yaml
@@ -47,6 +47,12 @@ jobs:
with:
fetch-depth: 2

- name: Build Spark-Operator Docker Image
run: |
tag=$(git describe --tags --dirty)_v3.1.1
docker build -t gcr.io/spark-operator/spark-operator:${tag} .
docker build -t gcr.io/spark-operator/spark-operator:local .
- name: Install Helm
uses: azure/setup-helm@v1
with:
@@ -68,8 +74,25 @@ jobs:
- name: Detect CRDs drift between chart and manifest
run: make detect-crds-drift

- name: Create kind cluster
uses: helm/[email protected]
- name: setup minikube
uses: manusa/[email protected]
with:
minikube version: "v1.24.0"
kubernetes version: "v1.20.8"
start args: --memory 6g --cpus=2 --addons ingress
github token: ${{ inputs.github-token }}

- name: Run chart-testing (install)
run: ct install
run: |
tag=$(git describe --tags --dirty)_v3.1.1
minikube image load gcr.io/spark-operator/spark-operator:local
ct install
# The integration tests are currently broken see: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/issues/1416
# - name: Run chart-testing (integration test)
# run: make it-test

- name: Setup tmate session
if: failure()
uses: mxschmitt/action-tmate@v3
timeout-minutes: 15
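The image tag used by the new build steps comes from `git describe`. On a clean checkout sitting exactly at a tag, `git describe --tags --dirty` prints just the tag name, and the `_v3.1.1` suffix pins the bundled Spark version. A minimal sketch (the repository and tag name here are hypothetical):

```shell
# Sketch of the tag derivation used in the new CI steps (hypothetical tag).
set -eu
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m "init"
git tag v1.1.15
# Clean tree at the tag: describe prints the tag, --dirty appends nothing.
tag="$(git describe --tags --dirty)_v3.1.1"
echo "$tag"   # prints: v1.1.15_v3.1.1
```

On commits after the tag, `git describe` instead emits `<tag>-<n>-g<sha>`, so every CI build gets a distinct image tag.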
38 changes: 34 additions & 4 deletions .github/workflows/release.yaml
@@ -28,6 +28,24 @@ jobs:
with:
python-version: 3.7

# TODO: Maintainer of repository to follow:
# https://github.com/docker/login-action#google-container-registry-gcr to add credentials so
# we can push from github actions
# - name: log in to container registry
# uses: docker/login-action@v1
# with:
# registry: gcr.io
# username: ${{ secrets.DOCKER_USERNAME }}
# password: ${{ secrets.DOCKER_PASSWORD }}

- name: Build Spark-Operator Docker Image
run: |
tag=$(git describe --tags --dirty)_v3.1.1
docker build -t gcr.io/spark-operator/spark-operator:${tag} .
docker build -t gcr.io/spark-operator/spark-operator:local .
echo "Ideally, we'd release the docker container at this point, but the maintainer of this repo needs to approve..."
echo "docker push gcr.io/spark-operator/spark-operator:${tag}"
- name: Set up chart-testing
uses: helm/[email protected]

@@ -42,15 +60,27 @@ jobs:
- name: Run chart-testing (lint)
run: ct lint

- name: Create kind cluster
uses: helm/[email protected]
if: steps.list-changed.outputs.changed == 'true'
- name: setup minikube
uses: manusa/[email protected]
with:
minikube version: "v1.24.0"
kubernetes version: "v1.20.8"
start args: --memory 6g --cpus=2 --addons ingress
github token: ${{ inputs.github-token }}

- name: Run chart-testing (install)
run: ct install
run: |
tag=$(git describe --tags --dirty)_v3.1.1
minikube image load gcr.io/spark-operator/spark-operator:local
ct install
- name: Run chart-releaser
uses: helm/[email protected]
env:
CR_TOKEN: "${{ secrets.GITHUB_TOKEN }}"
CR_RELEASE_NAME_TEMPLATE: "spark-operator-chart-{{ .Version }}"

- name: Setup tmate session
if: failure()
uses: mxschmitt/action-tmate@v3
timeout-minutes: 15
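Following the docker/login-action documentation referenced in the TODO above, a GCR login step would look roughly like this; the secret name is an assumption, and the password is a service-account JSON key, with `_json_key` as the literal username GCR expects:

```yaml
# Hypothetical step, per docker/login-action's GCR instructions;
# GCR_JSON_KEY is an assumed secret holding a service-account JSON key.
- name: Log in to GCR
  uses: docker/login-action@v1
  with:
    registry: gcr.io
    username: _json_key
    password: ${{ secrets.GCR_JSON_KEY }}
```

Once a maintainer adds the credentials, the commented-out `docker push` in the build step above could be enabled.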
24 changes: 0 additions & 24 deletions .travis.gofmt.sh

This file was deleted.

29 changes: 0 additions & 29 deletions .travis.yml

This file was deleted.

14 changes: 10 additions & 4 deletions Makefile
@@ -46,8 +46,14 @@ helm-docs:
helm-docs -c ./charts

fmt-check: clean
@echo "running fmt check"
./.travis.gofmt.sh
@echo "running fmt check"; \
if [ -n "$$(go fmt ./...)" ]; \
then \
echo "Go code is not formatted, please run 'go fmt ./...'." >&2; \
exit 1; \
else \
echo "Go code is formatted"; \
fi

detect-crds-drift:
diff -q charts/spark-operator-chart/crds manifest/crds --exclude=kustomization.yaml
@@ -62,9 +68,9 @@ test: clean
go test -v ./... -covermode=atomic


it-test: clean all
it-test: clean
@echo "running unit tests"
go test -v ./test/e2e/ --kubeconfig "$HOME/.kube/config" --operator-image=gcr.io/spark-operator/spark-operator:v1beta2-1.3.0-3.1.1
go test -v ./test/e2e/ --kubeconfig "$(HOME)/.kube/config" --operator-image=gcr.io/spark-operator/spark-operator:local

vet:
@echo "running go vet"
1 change: 0 additions & 1 deletion README.md
@@ -1,4 +1,3 @@
[![Build Status](https://travis-ci.org/GoogleCloudPlatform/spark-on-k8s-operator.svg?branch=master)](https://travis-ci.org/GoogleCloudPlatform/spark-on-k8s-operator.svg?branch=master)
[![Go Report Card](https://goreportcard.com/badge/github.com/GoogleCloudPlatform/spark-on-k8s-operator)](https://goreportcard.com/report/github.com/GoogleCloudPlatform/spark-on-k8s-operator)

**This is not an officially supported Google product.**
1 change: 1 addition & 0 deletions charts/spark-operator-chart/.helmignore
@@ -0,0 +1 @@
ci/
4 changes: 2 additions & 2 deletions charts/spark-operator-chart/Chart.yaml
@@ -1,8 +1,8 @@
apiVersion: v2
name: spark-operator
description: A Helm chart for Spark on Kubernetes operator
version: 1.1.14
appVersion: v1beta2-1.3.0-3.1.1
version: 1.1.15
appVersion: v1beta2-1.3.1-3.1.1
keywords:
- spark
home: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator
2 changes: 2 additions & 0 deletions charts/spark-operator-chart/ci/ci-values.yaml
@@ -0,0 +1,2 @@
image:
tag: "local"
4 changes: 2 additions & 2 deletions charts/spark-operator-chart/values.yaml
@@ -11,8 +11,8 @@ image:
repository: gcr.io/spark-operator/spark-operator
# -- Image pull policy
pullPolicy: IfNotPresent
# -- Overrides the image tag whose default is the chart appVersion.
tag: "latest"
# -- if set, override the image tag whose default is the chart appVersion.
tag: ""

# -- Image pull secrets
imagePullSecrets: []
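With an empty `tag` value, the image tag falls back to the chart's `appVersion` through Helm's usual default pattern. The deployment template itself is not part of this diff, but it likely resolves the image reference along these lines (a sketch, not the chart's exact template):

```yaml
# Hypothetical template fragment; the real chart template may differ.
image: "{{ .Values.image.repository }}:{{ .Values.image.tag | default .Chart.AppVersion }}"
```

This is why bumping `appVersion` in Chart.yaml is now enough to move users to a new operator image, while CI can still pin `tag: "local"` via ci-values.yaml.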
3 changes: 2 additions & 1 deletion test/e2e/README.md
@@ -12,7 +12,8 @@ Prerequisites:
e2e tests are written as Go test. All go test techniques apply (e.g. picking what to run, timeout length). Let's say I want to run all tests in "test/e2e/":

```bash
$ go test -v ./test/e2e/ --kubeconfig "$HOME/.kube/config" --operator-image=gcr.io/spark-operator/spark-operator:v1beta2-1.3.0-3.1.1
$ docker build -t gcr.io/spark-operator/spark-operator:local .
$ go test -v ./test/e2e/ --kubeconfig "$HOME/.kube/config" --operator-image=gcr.io/spark-operator/spark-operator:local
```
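Since these are standard Go tests, the usual `go test` flags apply; for example, a single test can be selected with `-run` and given a longer timeout (the test name here is hypothetical):

```bash
$ go test -v ./test/e2e/ -run 'TestSparkPi' -timeout 30m \
    --kubeconfig "$HOME/.kube/config" --operator-image=gcr.io/spark-operator/spark-operator:local
```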

### Available Tests
