fix: adjusting deploy on foundation docs, cleaning files and 5-appinfra docs #65

Merged
1 change: 1 addition & 0 deletions 0-bootstrap/README.md
@@ -262,6 +262,7 @@ Using GitHub Actions requires manual creation of the GitHub repositories used in
git add .
git commit -m 'Initialize bootstrap repo'
git push --set-upstream origin plan
cd ..
```

1. Continue with the instructions in the [1-org](../1-org/README.md) step.
3 changes: 3 additions & 0 deletions 1-org/README.md
@@ -293,4 +293,7 @@ Before executing the next stages, unset the `GOOGLE_IMPERSONATE_SERVICE_ACCOUNT`

```bash
unset GOOGLE_IMPERSONATE_SERVICE_ACCOUNT

cd ../..
```
1. Proceed to the [2-environments](../2-environments/README.md) step.
2 changes: 1 addition & 1 deletion 1-org/envs/shared/ml_key_rings.tf
@@ -21,7 +21,7 @@ module "kms_keyring" {
keyring_admins = [
"serviceAccount:${local.projects_step_terraform_service_account_email}"
]
project_id = module.org_kms.project_id
project_id = module.common_kms.project_id
keyring_regions = var.keyring_regions
keyring_name = var.keyring_name
}
14 changes: 10 additions & 4 deletions 2-environments/README.md
@@ -171,7 +171,7 @@ Run `terraform output cloudbuild_project_id` in the `0-bootstrap` folder to get
git push origin production
```

### `N.B.` Read this before continuing further
### Read this before continuing further

A logging project will be created in every environment (`development`, `non-production`, `production`) when running this code. Each logging project contains a storage bucket used for project logging within its respective environment, and the `cloud-storage-analytics@google.com` group needs permissions on that bucket. Because the foundation enforces more restrictive security measures, a domain restriction organization policy constraint is in place, which prevents the cloud-storage-analytics group from being added to any IAM policy. For this Terraform code to execute without error, you must manually disable this constraint on the affected environment folder before applying, as described in the steps below.
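For illustration only — the exact constraint values and folder IDs depend on your organization — temporarily resetting the domain restriction constraint on an environment folder with `gcloud` might look like the sketch below; `FOLDER_ID` and `YOUR_CUSTOMER_IDS` are placeholders.

```bash
# Hypothetical sketch: relax the domain restriction constraint on one folder,
# run the Terraform apply for that environment, then restore the original policy.
gcloud resource-manager org-policies delete \
  iam.allowedPolicyMemberDomains \
  --folder="${FOLDER_ID}"

# ... apply the environment ...

# Afterwards, re-apply the constraint with the allowed values for your organization.
gcloud resource-manager org-policies allow \
  iam.allowedPolicyMemberDomains \
  YOUR_CUSTOMER_IDS --folder="${FOLDER_ID}"
```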

@@ -405,7 +405,8 @@ To use the `validate` option of the `tf-wrapper.sh` script, please follow the [i
export GOOGLE_IMPERSONATE_SERVICE_ACCOUNT=$(terraform -chdir="../0-bootstrap/" output -raw environment_step_terraform_service_account_email)
echo ${GOOGLE_IMPERSONATE_SERVICE_ACCOUNT}
```
1. Ensure you [disable The Orginization Policy](#do-this-before-you-push-development-non-production--production) on the `development` folder before continuing further

1. Ensure you [disable The Organization Policy](#read-this-before-continuing-further) on the `development` folder before continuing further.

1. Run `init` and `plan` and review output for environment development.

@@ -426,7 +427,7 @@ To use the `validate` option of the `tf-wrapper.sh` script, please follow the [i
./tf-wrapper.sh apply development
```

1. Ensure you [disable The Orginization Policy](#do-this-before-you-push-development-non-production--production) on the `non-production` folder before continuing further
1. Ensure you [disable The Organization Policy](#read-this-before-continuing-further) on the `non-production` folder before continuing further.

1. Run `init` and `plan` and review output for environment non-production.

@@ -446,7 +447,8 @@ To use the `validate` option of the `tf-wrapper.sh` script, please follow the [i
```bash
./tf-wrapper.sh apply non-production
```
1. Ensure you [disable The Orginization Policy](#do-this-before-you-push-development-non-production--production) on the `non-production` folder before continuing further

1. Ensure you [disable The Organization Policy](#read-this-before-continuing-further) on the `non-production` folder before continuing further.

1. Run `init` and `plan` and review output for environment production.

@@ -473,4 +475,8 @@ Before executing the next stages, unset the `GOOGLE_IMPERSONATE_SERVICE_ACCOUNT`

```bash
unset GOOGLE_IMPERSONATE_SERVICE_ACCOUNT

cd ../..
```

1. You can now move to the instructions in the network step. To use the [Dual Shared VPC](https://cloud.google.com/architecture/security-foundations/networking#vpcsharedvpc-id7-1-shared-vpc-) network mode go to [3-networks-dual-svpc](../3-networks-dual-svpc/README.md).
2 changes: 2 additions & 0 deletions 3-networks-dual-svpc/README.md
@@ -418,3 +418,5 @@ Before executing the next stages, unset the `GOOGLE_IMPERSONATE_SERVICE_ACCOUNT`
```bash
unset GOOGLE_IMPERSONATE_SERVICE_ACCOUNT
```

1. You can now move to the instructions in the [4-projects](../4-projects/README.md) step.
3 changes: 3 additions & 0 deletions 4-projects/README.md
@@ -358,4 +358,7 @@ Before executing the next stages, unset the `GOOGLE_IMPERSONATE_SERVICE_ACCOUNT`

```bash
unset GOOGLE_IMPERSONATE_SERVICE_ACCOUNT

cd ../..
```
1. You can now move to the instructions in the [5-app-infra](../5-app-infra/README.md) step.
118 changes: 74 additions & 44 deletions 5-app-infra/README.md
@@ -416,6 +416,7 @@ The pipeline also listens for changes made to `plan`, `development`, `non-produc

```bash
cd service-catalog/
git checkout -b main
cp -RT ../terraform-google-enterprise-genai/5-app-infra/source_repos/service-catalog/ .
git add img
git commit -m "Add img directory"
@@ -442,11 +443,23 @@ The pipeline also listens for changes made to `plan`, `development`, `non-produc

#### Artifacts Application

1. The next instructions assume that you are at the same level of the `terraform-google-enterprise-genai` folder. Change into `5-app-infra` folder, copy the Terraform wrapper script and ensure it can be executed.
1. Create `ml-artifact-publish` directory at the same level as `terraform-google-enterprise-genai`.

```bash
mkdir ml-artifact-publish
```

1. Navigate into the new directory and copy the contents of the GenAI repository into it.
All subsequent steps assume you are running them from the `ml-artifact-publish` directory.
If you run them from another directory, adjust your copy paths accordingly.

```bash
cd terraform-google-enterprise-genai/5-app-infra/projects/artifact-publish/
cp ../../../build/tf-wrapper.sh .
cd ml-artifact-publish/

cp -RT ../terraform-google-enterprise-genai/5-app-infra/projects/artifact-publish/ .
cp -R ../terraform-google-enterprise-genai/5-app-infra/modules/ ./modules
cp ../terraform-google-enterprise-genai/build/cloudbuild-tf-* .
cp ../terraform-google-enterprise-genai/build/tf-wrapper.sh .
chmod 755 ./tf-wrapper.sh
```

@@ -461,7 +474,7 @@ The pipeline also listens for changes made to `plan`, `development`, `non-produc
1. Use `terraform output` to get the project backend bucket value from 0-bootstrap.

```bash
export remote_state_bucket=$(terraform -chdir="../../../0-bootstrap/" output -raw projects_gcs_bucket_tfstate)
export remote_state_bucket=$(terraform -chdir="../terraform-google-enterprise-genai/0-bootstrap/" output -raw projects_gcs_bucket_tfstate)
echo "remote_state_bucket = ${remote_state_bucket}"
sed -i "s/REMOTE_STATE_BUCKET/${remote_state_bucket}/" ./common.auto.tfvars
```
@@ -474,10 +487,10 @@ The pipeline also listens for changes made to `plan`, `development`, `non-produc
member="user:$(gcloud auth list --filter="status=ACTIVE" --format="value(account)")"
echo ${member}

project_id=$(terraform -chdir="../../../4-projects/ml_business_unit/shared/" output -raw cloudbuild_project_id)
project_id=$(terraform -chdir="../terraform-google-enterprise-genai/4-projects/ml_business_unit/shared/" output -raw cloudbuild_project_id)
echo ${project_id}

terraform_sa=$(terraform -chdir="../../../4-projects/ml_business_unit/shared/" output -json terraform_service_accounts | jq '."ml-artifact-publish"' --raw-output)
terraform_sa=$(terraform -chdir="../terraform-google-enterprise-genai/4-projects/ml_business_unit/shared/" output -json terraform_service_accounts | jq '."ml-artifact-publish"' --raw-output)
echo ${terraform_sa}

gcloud iam service-accounts add-iam-policy-binding ${terraform_sa} --project ${project_id} --member="${member}" --role="roles/iam.serviceAccountTokenCreator"
Expand All @@ -486,7 +499,7 @@ The pipeline also listens for changes made to `plan`, `development`, `non-produc
1. Update `backend.tf` with your bucket from the infra pipeline output.

```bash
export backend_bucket=$(terraform -chdir="../../../4-projects/ml_business_unit/shared/" output -json state_buckets | jq '."ml-artifact-publish"' --raw-output)
export backend_bucket=$(terraform -chdir="../terraform-google-enterprise-genai/4-projects/ml_business_unit/shared/" output -json state_buckets | jq '."ml-artifact-publish"' --raw-output)
echo "backend_bucket = ${backend_bucket}"

for i in `find -name 'backend.tf'`; do sed -i "s/UPDATE_APP_INFRA_BUCKET/${backend_bucket}/" $i; done
@@ -500,10 +513,10 @@ To use the `validate` option of the `tf-wrapper.sh` script, please follow the [i
1. Use `terraform output` to get the Infra Pipeline Project ID from 4-projects output.

```bash
export INFRA_PIPELINE_PROJECT_ID=$(terraform -chdir="../../../4-projects/ml_business_unit/shared/" output -raw cloudbuild_project_id)
export INFRA_PIPELINE_PROJECT_ID=$(terraform -chdir="../terraform-google-enterprise-genai/4-projects/ml_business_unit/shared/" output -raw cloudbuild_project_id)
echo ${INFRA_PIPELINE_PROJECT_ID}

export GOOGLE_IMPERSONATE_SERVICE_ACCOUNT=$(terraform -chdir="../../../4-projects/ml_business_unit/shared/" output -json terraform_service_accounts | jq '."ml-artifact-publish"' --raw-output)
export GOOGLE_IMPERSONATE_SERVICE_ACCOUNT=$(terraform -chdir="../terraform-google-enterprise-genai/4-projects/ml_business_unit/shared/" output -json terraform_service_accounts | jq '."ml-artifact-publish"' --raw-output)
echo ${GOOGLE_IMPERSONATE_SERVICE_ACCOUNT}
```

@@ -517,7 +530,7 @@ To use the `validate` option of the `tf-wrapper.sh` script, please follow the [i
1. Run `validate` and check for violations.

```bash
./tf-wrapper.sh validate shared $(pwd)/../policy-library ${INFRA_PIPELINE_PROJECT_ID}
./tf-wrapper.sh validate shared $(pwd)/../terraform-google-enterprise-genai/policy-library ${INFRA_PIPELINE_PROJECT_ID}
```

1. Run `apply` shared.
@@ -534,14 +547,12 @@ After executing this stage, unset the `GOOGLE_IMPERSONATE_SERVICE_ACCOUNT` envir
unset GOOGLE_IMPERSONATE_SERVICE_ACCOUNT
```

1. `cd` out of the `artifact-publish`.
1. `cd` out of the repository.

```bash
cd
cd ..
```

1. Navigate to the project that was output from `${ARTIFACT_PROJECT_ID}` in Google's Cloud Console to view the first run of images being built.

#### Configuring Cloud Source Repository of Artifact Application

1. The next instructions assume that you are at the same level as the `terraform-google-enterprise-genai` folder.
@@ -586,62 +597,73 @@ unset GOOGLE_IMPERSONATE_SERVICE_ACCOUNT

#### Service Catalog Configuration

1. The next instructions assume that you are at the same level of the `terraform-google-enterprise-genai` folder. Change into `5-app-infra` folder, copy the Terraform wrapper script and ensure it can be executed.

1. Create `ml-service-catalog` directory at the same level as `terraform-google-enterprise-genai`.

```bash
cd terraform-google-enterprise-genai/5-app-infra/projects/service-catalog/
cp ../../../build/tf-wrapper.sh .
mkdir ml-service-catalog
```

1. Navigate into the new directory and copy the contents of the GenAI repository into it.
All subsequent steps assume you are running them from the `ml-service-catalog` directory.
If you run them from another directory, adjust your copy paths accordingly.

```bash
cd ml-service-catalog

cp -RT ../terraform-google-enterprise-genai/5-app-infra/projects/service-catalog/ .
cp -R ../terraform-google-enterprise-genai/5-app-infra/modules/ ./modules
cp ../terraform-google-enterprise-genai/build/cloudbuild-tf-* .
cp ../terraform-google-enterprise-genai/build/tf-wrapper.sh .
chmod 755 ./tf-wrapper.sh
```

1. Rename `common.auto.example.tfvars` files to `common.auto.tfvars`.
1. Rename `common.auto.example.tfvars` to `common.auto.tfvars`.

```bash
mv common.auto.example.tfvars common.auto.tfvars
```

1. Update `common.auto.tfvars` file with values from your environment.

1. Use `terraform output` to get the project backend bucket value from 0-bootstrap.
1. Update the `common.auto.tfvars` file with values from your environment and 0-bootstrap. See any of the ML business unit environment [README.md](./ml_business_unit/production/README.md) files for additional information on the values in the `common.auto.tfvars` file.

```bash
export remote_state_bucket=$(terraform -chdir="../../../0-bootstrap/" output -raw projects_gcs_bucket_tfstate)
export remote_state_bucket=$(terraform -chdir="../terraform-google-enterprise-genai/0-bootstrap/" output -raw projects_gcs_bucket_tfstate)
echo "remote_state_bucket = ${remote_state_bucket}"
sed -i "s/REMOTE_STATE_BUCKET/${remote_state_bucket}/" ./common.auto.tfvars
```

1. Provide the user that will be running `./tf-wrapper.sh` the Service Account Token Creator role to the ml Terraform service account.

1. Provide the user permissions to run the terraform locally with the `serviceAccountTokenCreator` permission.
1. Update `backend.tf` with your bucket from the infra pipeline output.

```bash
member="user:$(gcloud auth list --filter="status=ACTIVE" --format="value(account)")"
echo ${member}

project_id=$(terraform -chdir="../../../4-projects/ml_business_unit/shared/" output -raw cloudbuild_project_id)
echo ${project_id}

terraform_sa=$(terraform -chdir="../../../4-projects/ml_business_unit/shared/" output -json terraform_service_accounts | jq '."ml-service-catalog"' --raw-output)
echo ${terraform_sa}
export backend_bucket=$(terraform -chdir="../terraform-google-enterprise-genai/4-projects/ml_business_unit/shared/" output -json state_buckets | jq '."ml-service-catalog"' --raw-output)
echo "backend_bucket = ${backend_bucket}"

gcloud iam service-accounts add-iam-policy-binding ${terraform_sa} --project ${project_id} --member="${member}" --role="roles/iam.serviceAccountTokenCreator"
for i in `find -name 'backend.tf'`; do sed -i "s/UPDATE_APP_INFRA_BUCKET/${backend_bucket}/" $i; done
```

1. Update the `log_bucket` variable with the value of the `logs_export_storage_bucket_name`.

```bash
export log_bucket=$(terraform -chdir="../gcp-org/envs/shared" output -raw logs_export_storage_bucket_name)
```bash
export log_bucket=$(terraform -chdir="../terraform-google-enterprise-genai/1-org/envs/shared" output -raw logs_export_storage_bucket_name)
echo "log_bucket = ${log_bucket}"
sed -i "s/REPLACE_LOG_BUCKET/${log_bucket}/" ./common.auto.tfvars
```

1. Update `backend.tf` with your bucket from the infra pipeline output.
1. Provide the user permissions to run the terraform locally with the `serviceAccountTokenCreator` permission.

```bash
export backend_bucket=$(terraform -chdir="../../../4-projects/ml_business_unit/shared/" output -json state_buckets | jq '."ml-service-catalog"' --raw-output)
echo "backend_bucket = ${backend_bucket}"
(cd ../terraform-google-enterprise-genai/4-projects && ./tf-wrapper.sh init shared)

for i in `find -name 'backend.tf'`; do sed -i "s/UPDATE_APP_INFRA_BUCKET/${backend_bucket}/" $i; done
member="user:$(gcloud auth list --filter="status=ACTIVE" --format="value(account)")"
echo ${member}

project_id=$(terraform -chdir="../terraform-google-enterprise-genai/4-projects/ml_business_unit/shared/" output -raw cloudbuild_project_id)
echo ${project_id}

terraform_sa=$(terraform -chdir="../terraform-google-enterprise-genai/4-projects/ml_business_unit/shared/" output -json terraform_service_accounts | jq '."ml-service-catalog"' --raw-output)
echo ${terraform_sa}

gcloud iam service-accounts add-iam-policy-binding ${terraform_sa} --project ${project_id} --member="${member}" --role="roles/iam.serviceAccountTokenCreator"
```

We will now deploy each of our environments (development/production/non-production) using this script.
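For orientation, a minimal sketch of the wrapper cycle these steps walk through for the `shared` environment is shown below (assuming `INFRA_PIPELINE_PROJECT_ID` has been exported as in the next step; the detailed steps that follow show each command in context):

```bash
# Sketch only: init, plan, validate and apply the shared environment with the wrapper.
./tf-wrapper.sh init shared
./tf-wrapper.sh plan shared
./tf-wrapper.sh validate shared "$(pwd)/../terraform-google-enterprise-genai/policy-library" "${INFRA_PIPELINE_PROJECT_ID}"
./tf-wrapper.sh apply shared
```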
@@ -652,10 +674,10 @@ To use the `validate` option of the `tf-wrapper.sh` script, please follow the [i
1. Use `terraform output` to get the Infra Pipeline Project ID from 4-projects output.

```bash
export INFRA_PIPELINE_PROJECT_ID=$(terraform -chdir="../../../4-projects/ml_business_unit/shared/" output -raw cloudbuild_project_id)
export INFRA_PIPELINE_PROJECT_ID=$(terraform -chdir="../terraform-google-enterprise-genai/4-projects/ml_business_unit/shared/" output -raw cloudbuild_project_id)
echo ${INFRA_PIPELINE_PROJECT_ID}

export GOOGLE_IMPERSONATE_SERVICE_ACCOUNT=$(terraform -chdir="../../../4-projects/ml_business_unit/shared/" output -json terraform_service_accounts | jq '."ml-service-catalog"' --raw-output)
export GOOGLE_IMPERSONATE_SERVICE_ACCOUNT=$(terraform -chdir="../terraform-google-enterprise-genai/4-projects/ml_business_unit/shared/" output -json terraform_service_accounts | jq '."ml-service-catalog"' --raw-output)
echo ${GOOGLE_IMPERSONATE_SERVICE_ACCOUNT}
```

@@ -669,7 +691,7 @@ To use the `validate` option of the `tf-wrapper.sh` script, please follow the [i
1. Run `validate` and check for violations.

```bash
./tf-wrapper.sh validate shared $(pwd)/../policy-library ${INFRA_PIPELINE_PROJECT_ID}
./tf-wrapper.sh validate shared $(pwd)/../terraform-google-enterprise-genai/policy-library ${INFRA_PIPELINE_PROJECT_ID}
```

1. Run `apply` shared.
@@ -686,6 +708,12 @@ After executing this stage, unset the `GOOGLE_IMPERSONATE_SERVICE_ACCOUNT` envir
unset GOOGLE_IMPERSONATE_SERVICE_ACCOUNT
```

1. `cd` out of the repository.

```bash
cd ..
```

#### Configuring Cloud Source Repository of Service Catalog Solutions Pipeline

1. The next instructions assume that you are at the same level as the `terraform-google-enterprise-genai` folder.
@@ -729,4 +757,6 @@ After executing this stage, unset the `GOOGLE_IMPERSONATE_SERVICE_ACCOUNT` envir
cd ..
```

1. Navigate to the project that was output from `${ARTIFACT_PROJECT_ID}` in Google's Cloud Console to view the first run of images being built.
1. Navigate to the project that was output from `${SERVICE_CATALOG_PROJECT_ID}` in Google's Cloud Console to view the first run of images being built.

https://console.cloud.google.com/cloud-build/builds;region=us-central1?orgonly=true&project=${SERVICE_CATALOG_PROJECT_ID}&supportedpurview=project
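Alternatively, assuming `SERVICE_CATALOG_PROJECT_ID` is exported in your shell and the builds run in `us-central1`, a rough CLI equivalent is:

```bash
# Hypothetical sketch: list recent Cloud Build runs in the Service Catalog project.
gcloud builds list --project="${SERVICE_CATALOG_PROJECT_ID}" --region=us-central1 --limit=5
```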
21 changes: 0 additions & 21 deletions 5-app-infra/modules/service_catalog/locals.tf
@@ -19,25 +19,4 @@ locals {
current_user_domain = split("@", local.current_user_email)[1]
current_member = strcontains(local.current_user_domain, "iam.gserviceaccount.com") ? "serviceAccount:${local.current_user_email}" : "user:${local.current_user_email}"
log_bucket_prefix = "bkt"
bucket_permissions = {

"roles/storage.admin" = [
google_service_account.trigger_sa.member,
],
"roles/storage.legacyObjectReader" = [
"serviceAccount:${var.machine_learning_project_number}@cloudbuild.gserviceaccount.com",
],
}

bucket_roles = flatten([
for role in keys(local.bucket_permissions) : [
for sa in local.bucket_permissions[role] :
{
role = role
acct = sa
}
]
])
}


7 changes: 3 additions & 4 deletions 5-app-infra/modules/service_catalog/main.tf
@@ -53,10 +53,9 @@ resource "google_storage_bucket" "bucket" {
}

resource "google_storage_bucket_iam_member" "bucket_role" {
for_each = { for gcs in local.bucket_roles : "${gcs.role}-${gcs.acct}" => gcs }
bucket = google_storage_bucket.bucket.name
role = each.value.role
member = each.value.acct
bucket = google_storage_bucket.bucket.name
role = "roles/storage.admin"
member = google_service_account.trigger_sa.member
}

resource "google_sourcerepo_repository_iam_member" "read" {
@@ -4,7 +4,6 @@
| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| instance\_region | The region where the compute instance will be created. A subnetwork must exist in the instance region. | `string` | n/a | yes |
| log\_bucket | Log bucket to be used by Service Catalog Bucket | `string` | n/a | yes |
| remote\_state\_bucket | Backend bucket to load remote state information from previous steps. | `string` | n/a | yes |

## Outputs