In order to test major changes in a prod-like environment, we would like to set up a staging instance. This instance will be used on an ad hoc basis for testing PRs and similar changes; we do not maintain it as a continuously running instance.
Requirements:
Create a staging Airflow instance, staging-hubble
Create a bash script that uploads all schemas and datasets from prod (the last 30 days' worth of data) to the staging-hubble project in BigQuery. This script should run as part of a setup DAG in the staging-hubble Airflow instance.
Add a deploy option for the staging instance in the stellar-etl-airflow GitHub Action
Create a bash script to clear all schemas and datasets in the staging-hubble BigQuery project. This will again be run as part of a DAG/task in the staging-hubble Airflow instance
Note that we should use env variables/profiles identical to prod's to mimic its behavior entirely, unless that is not possible.
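The setup script from the requirements above could be sketched roughly as follows. All project, dataset, table, and partition-column names here are placeholders, not confirmed values, and the `DRY_RUN` guard (which just prints the `bq` commands) is an illustrative convention, not part of the requirements:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: copy prod schemas plus the last 30 days of data
# into the staging-hubble BigQuery project. Names below are assumptions.
set -euo pipefail

PROD_PROJECT="${PROD_PROJECT:-hubble-prod}"      # assumed prod project name
STAGING_PROJECT="${STAGING_PROJECT:-staging-hubble}"
DATASET="${DATASET:-crypto_stellar}"             # assumed dataset name
DRY_RUN="${DRY_RUN:-1}"                          # 1 = print commands instead of running them

run() { if [ "$DRY_RUN" = "1" ]; then echo "$*"; else "$@"; fi; }

setup_staging() {
  # Recreate the dataset in staging, then materialize each table via a
  # filtered query job so both the schema and recent rows come across.
  run bq mk --force --dataset "${STAGING_PROJECT}:${DATASET}"
  for table in "$@"; do
    run bq query --use_legacy_sql=false \
      --destination_table "${STAGING_PROJECT}:${DATASET}.${table}" \
      "SELECT * FROM \`${PROD_PROJECT}.${DATASET}.${table}\` WHERE batch_run_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)"
  done
}

# Illustrative table names only; the real list would be enumerated from prod.
setup_staging history_ledgers history_transactions
```

Wrapped in a BashOperator (or similar) task, this would form the body of the setup DAG; running it with `DRY_RUN=0` and prod-matching env variables would perform the actual copy.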
Workflow:
The staging Airflow instance is always running, while the DAGs are off
Developer runs the setup DAG in staging Airflow
Developer deploys their PR to staging Airflow and tests
Once testing is done, the developer runs the clean DAG in staging Airflow and turns off all the DAGs
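The clean step at the end of the workflow (the teardown script from the requirements) might look something like the sketch below. The project and dataset names are placeholders, and in real use the dataset list would be enumerated with `bq ls` rather than hard-coded:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: drop every dataset in the staging-hubble project.
# Names are assumptions; DRY_RUN=1 just prints the commands.
set -euo pipefail

STAGING_PROJECT="${STAGING_PROJECT:-staging-hubble}"
DRY_RUN="${DRY_RUN:-1}"

run() { if [ "$DRY_RUN" = "1" ]; then echo "$*"; else "$@"; fi; }

clear_staging() {
  # bq rm -r deletes the tables inside the dataset; -f skips the prompt,
  # which matters when this runs unattended inside a DAG task.
  for ds in "$@"; do
    run bq rm -r -f --dataset "${STAGING_PROJECT}:${ds}"
  done
}

# Illustrative dataset name only.
clear_staging crypto_stellar
```

Running this as the clean DAG leaves the staging project empty, so the next setup run starts from a known-blank state.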
How staging is different from Test env:
Test uses data from testnet, while staging uses a subset of prod data
Test is a continuously running environment that we use to test minor or major changes, and it helps identify issues earlier. Staging will mimic the prod environment and be used when major changes need to be tested