# Kubedoop Operator for Apache Airflow

[Build](https://github.com/zncdatadev/airflow-operator/actions/workflows/publish.yml)
[License: Apache 2.0](https://opensource.org/licenses/Apache-2.0)
[Go Report Card](https://goreportcard.com/report/github.com/zncdatadev/airflow-operator)
[Artifact Hub](https://artifacthub.io/packages/helm/kubedoop/airflow-operator)

This is a Kubernetes operator to manage [Apache Airflow](https://airflow.apache.org/) ensembles.

It's part of the Kubedoop Data Platform, a modular open source data platform built on Kubernetes that provides Kubernetes-native deployment and management of popular open source data apps such as Apache Kafka, Apache Doris, Trino, and Apache Spark, all working together seamlessly. Based on Kubernetes, it runs everywhere – on premises or in the cloud.

## Quick Start

### Add the Helm repository

> Please make sure your Helm version is v3.0.0+.

```bash
helm repo add kubedoop https://zncdatadev.github.io/kubedoop-helm-charts/
```
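
To confirm the repository was added, refresh the local index and search it (the chart list depends on what the kubedoop repository currently publishes):

```bash
# Refresh the local Helm index and list the charts available in the kubedoop repo
helm repo update
helm search repo kubedoop
```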

### Install the required dependencies

```bash
helm install commons-operator kubedoop/commons-operator
helm install listener-operator kubedoop/listener-operator
helm install secret-operator kubedoop/secret-operator
```
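
A quick way to check that the three operators came up, assuming they were installed into your current namespace as in the commands above:

```bash
# List the Helm releases and watch the operator pods start
helm list
kubectl get pods --watch
```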

### Install airflow-operator

```bash
helm install airflow-operator kubedoop/airflow-operator
```
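
Once the chart is installed, the operator's CustomResourceDefinitions should be registered in the cluster. A rough check (the exact CRD names depend on the operator version):

```bash
# Airflow-related CRDs should show up in this list
kubectl get crd | grep -i airflow
```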

### Deploy an Airflow cluster

```bash
kubectl apply -f config/samples
```
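
The samples create an Airflow cluster in the current namespace. You can follow the rollout with standard kubectl commands (resource kinds and pod names depend on the samples you applied):

```bash
# Inspect the custom resources created from the samples
kubectl get -f config/samples

# Watch the Airflow pods (webserver, scheduler, workers, ...) come up
kubectl get pods --watch
```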

## Kubedoop Data Platform Operators

These are the operators that are currently part of the Kubedoop Data Platform:

- [Kubedoop Operator for Apache Airflow](https://github.com/zncdatadev/airflow-operator)
- [Kubedoop Operator for Apache DolphinScheduler](https://github.com/zncdatadev/dolphinscheduler-operator)
- [Kubedoop Operator for Apache Doris](https://github.com/zncdatadev/doris-operator)
- [Kubedoop Operator for Apache Hadoop HDFS](https://github.com/zncdatadev/hdfs-operator)
- [Kubedoop Operator for Apache HBase](https://github.com/zncdatadev/hbase-operator)
- [Kubedoop Operator for Apache Hive](https://github.com/zncdatadev/hive-operator)
- [Kubedoop Operator for Apache Kafka](https://github.com/zncdatadev/kafka-operator)
- [Kubedoop Operator for Apache Kyuubi](https://github.com/zncdatadev/kyuubi-operator)
- [Kubedoop Operator for Apache NiFi](https://github.com/zncdatadev/nifi-operator)
- [Kubedoop Operator for Apache Spark](https://github.com/zncdatadev/spark-k8s-operator)
- [Kubedoop Operator for Apache Superset](https://github.com/zncdatadev/superset-operator)
- [Kubedoop Operator for Trino](https://github.com/zncdatadev/trino-operator)
- [Kubedoop Operator for Apache ZooKeeper](https://github.com/zncdatadev/zookeeper-operator)

And our internal operators:

- [Commons Operator](https://github.com/zncdatadev/commons-operator)
- [Listener Operator](https://github.com/zncdatadev/listener-operator)
- [Secret Operator](https://github.com/zncdatadev/secret-operator)

## Contributing

If you'd like to contribute to Kubedoop, please refer to our [Contributing Guide](https://kubedoop.dev/docs/developer-manual/collaboration) for more information. We welcome contributions of all kinds, including but not limited to code, documentation, and use cases.

## License

Kubedoop is under the Apache 2.0 license. See the [LICENSE](./LICENSE) file for details.