This repo is part of a multi-part guide that shows how to configure and deploy the example.com reference architecture described in the Google Cloud security foundations guide. The following table lists the parts of the guide.
| Part | Description |
| --- | --- |
| 0-bootstrap | Bootstraps a Google Cloud organization, creating all the required resources and permissions to start using the Cloud Foundation Toolkit (CFT). This step also configures a CI/CD pipeline for foundations code in subsequent stages. |
| 1-org | Sets up top-level shared folders, monitoring and networking projects, and organization-level logging, and sets baseline security settings through organizational policy. |
| 2-environments (this file) | Sets up development, non-production, and production environments within the Google Cloud organization that you've created. |
| 3-networks-dual-svpc | Sets up base and restricted shared VPCs with default DNS, NAT (optional), Private Service networking, VPC Service Controls, on-premises Dedicated Interconnect, and baseline firewall rules for each environment. It also sets up the global DNS hub. |
| 3-networks-hub-and-spoke | Sets up base and restricted shared VPCs with all the default configuration found in step 3-networks-dual-svpc, but with the architecture based on the hub-and-spoke network model. It also sets up the global DNS hub. |
| 4-projects | Sets up a folder structure, projects, and an application infrastructure pipeline for applications, which are connected as service projects to the shared VPC created in the previous stage. |
| 5-app-infra | Deploys the Service Catalog Pipeline and the Custom Artifacts Pipeline. |
For an overview of the architecture and the parts, see the terraform-google-enterprise-genai README.
The purpose of this step is to set up development, non-production, and production environments within the Google Cloud organization that you've created.
- 0-bootstrap executed successfully.
- 1-org executed successfully.
- Cloud Identity / Google Workspace group for monitoring admins.
- Membership in the monitoring admins group for the user running Terraform.
Please refer to troubleshooting if you run into issues during this step.
To enable Assured Workloads in the production folder, edit the `main.tf` file and set `assured_workload_configuration.enable` to `true`.
See the `env_baseline` module README.md file for additional information on the values that can be configured for the workload.
Assured Workloads is a paid service. FedRAMP Moderate workloads can be deployed at no additional charge on top of Google Cloud product and service usage. For other compliance regimes, see Assured Workloads pricing.
If you enable Assured Workloads and later need to delete the workload, you must first manually delete the resources under it. Use the Google Cloud console to identify the resources to be deleted.
Note: If you are using macOS, replace `cp -RT` with `cp -R` in the relevant commands. The `-T` flag is needed for Linux, but causes problems for macOS.
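If you would rather not hand-edit each command, the right flag can be picked at runtime. A minimal sketch (the `CP_CMD` variable name is our own, not part of the guide):

```shell
# Select the copy command based on the OS: GNU cp supports -T
# (treat the destination as a normal directory); BSD/macOS cp does not.
if [ "$(uname -s)" = "Darwin" ]; then
  CP_CMD="cp -R"
else
  CP_CMD="cp -RT"
fi
echo "Using: ${CP_CMD}"
```

You can then run, for example, `$CP_CMD ../terraform-google-enterprise-genai/2-environments/ .` instead of spelling out the flag per platform.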
- Clone the `gcp-environments` repo based on the Terraform output from the `0-bootstrap` step. Clone the repo at the same level as the `terraform-google-enterprise-genai` folder; the following instructions assume this layout. Run `terraform output cloudbuild_project_id` in the `0-bootstrap` folder to get the Cloud Build project ID.

  ```bash
  export CLOUD_BUILD_PROJECT_ID=$(terraform -chdir="terraform-google-enterprise-genai/0-bootstrap/" output -raw cloudbuild_project_id)
  echo ${CLOUD_BUILD_PROJECT_ID}
  gcloud source repos clone gcp-environments --project=${CLOUD_BUILD_PROJECT_ID}
  ```
- Navigate into the repo, change to the non-main branch, and copy the contents of the foundation into the new repo. All subsequent steps assume you are running them from the `gcp-environments` directory. If you run them from another directory, adjust your copy paths accordingly.

  ```bash
  cd gcp-environments
  git checkout -b plan
  cp -RT ../terraform-google-enterprise-genai/2-environments/ .
  cp ../terraform-google-enterprise-genai/build/cloudbuild-tf-* .
  cp ../terraform-google-enterprise-genai/build/tf-wrapper.sh .
  chmod 755 ./tf-wrapper.sh
  ```
- Rename `terraform.example.tfvars` to `terraform.tfvars`.

  ```bash
  mv terraform.example.tfvars terraform.tfvars
  ```
- Update the file with values from your environment and bootstrap (you can re-run `terraform output` in the `0-bootstrap` directory to find these values). See any of the envs folder README.md files for additional information on the values in the `terraform.tfvars` file.

  ```bash
  export backend_bucket=$(terraform -chdir="../terraform-google-enterprise-genai/0-bootstrap/" output -raw gcs_bucket_tfstate)
  echo "remote_state_bucket = ${backend_bucket}"
  sed -i "s/REMOTE_STATE_BUCKET/${backend_bucket}/" terraform.tfvars
  ```
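As an aside, `sed -i` also differs between GNU (Linux) and BSD (macOS) sed: BSD sed requires an explicit, possibly empty, backup suffix after `-i`. A small portable wrapper, offered as a sketch (the `portable_sed_inplace` name is our own):

```shell
# portable_sed_inplace EXPR FILE: in-place sed that works with both GNU
# and BSD/macOS sed. GNU sed accepts --version; BSD sed does not, so we
# use that to pick the right -i form.
portable_sed_inplace() {
  if sed --version >/dev/null 2>&1; then
    sed -i "$1" "$2"       # GNU sed
  else
    sed -i '' "$1" "$2"    # BSD/macOS sed
  fi
}
# usage (equivalent to the step above):
# portable_sed_inplace "s/REMOTE_STATE_BUCKET/${backend_bucket}/" terraform.tfvars
```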
- Commit changes.

  ```bash
  git add .
  git commit -m 'Initialize environments repo'
  ```
- Push your plan branch to trigger a plan for all environments. Because the plan branch is not a named environment branch, pushing your plan branch triggers terraform plan but not terraform apply.

  ```bash
  git push --set-upstream origin plan
  ```
- Review the plan output in your Cloud Build project: https://console.cloud.google.com/cloud-build/builds;region=DEFAULT_REGION?project=YOUR_CLOUD_BUILD_PROJECT_ID
- Merge changes to the development branch. Because this is a named environment branch, pushing to this branch triggers both terraform plan and terraform apply.

  ```bash
  git checkout -b development
  git push origin development
  ```
- Review the apply output in your Cloud Build project: https://console.cloud.google.com/cloud-build/builds;region=DEFAULT_REGION?project=YOUR_CLOUD_BUILD_PROJECT_ID
- Merge changes to the non-production branch. Because this is a named environment branch, pushing to this branch triggers both terraform plan and terraform apply. Review the apply output in your Cloud Build project: https://console.cloud.google.com/cloud-build/builds;region=DEFAULT_REGION?project=YOUR_CLOUD_BUILD_PROJECT_ID

  ```bash
  git checkout -b non-production
  git push origin non-production
  ```
- Merge changes to the production branch. Because this is a named environment branch, pushing to this branch triggers both terraform plan and terraform apply. Review the apply output in your Cloud Build project: https://console.cloud.google.com/cloud-build/builds;region=DEFAULT_REGION?project=YOUR_CLOUD_BUILD_PROJECT_ID

  ```bash
  git checkout -b production
  git push origin production
  ```
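The branch-to-pipeline mapping described above can be summarized in a small helper. This is a sketch under the assumption that exactly the three named environment branches trigger an apply (the `triggered_actions` function name is our own):

```shell
# Named environment branches trigger plan + apply; any other branch
# (for example "plan") triggers plan only.
triggered_actions() {
  case "$1" in
    development|non-production|production) echo "plan apply" ;;
    *) echo "plan" ;;
  esac
}
```

For example, `triggered_actions plan` prints `plan`, while `triggered_actions production` prints `plan apply`.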
A logging project is created in every environment (`development`, `non-production`, `production`) when running this code. Each project contains a storage bucket for project logging within its respective environment, which requires granting the `[email protected]` group permissions on the storage bucket. Because the foundation enforces more restrictive security measures, a domain restriction organization policy constraint is in place, and this constraint prevents the `cloud-storage-analytics` group from being added to any IAM policy. For this Terraform code to apply without error, manual intervention is required: you must disable the constraint, assign the permission on the bucket, and then enforce the constraint again. This step-by-step guide presents two options (Option 1 and Option 2); execute only one of them.
The first and recommended option is making the changes with the `gcloud` CLI, as described in Option 1. Option 2 is an alternative to the `gcloud` CLI that relies on the Google Cloud console.
You will repeat this procedure for each environment (`development`, `non-production`, and `production`).
- Configure the following variable with the path to your `gcp-environments` repository. Make sure your git is checked out to the `development` branch by running `git checkout development` in `GCP_ENVIRONMENTS_PATH`.

  ```bash
  export GCP_ENVIRONMENTS_PATH=INSERT_YOUR_PATH_HERE
  (cd $GCP_ENVIRONMENTS_PATH && git checkout development && ./tf-wrapper.sh init development)
  ```
- Retrieve the bucket name and project ID from the Terraform outputs.

  ```bash
  export ENV_LOG_BUCKET_NAME=$(terraform -chdir="$GCP_ENVIRONMENTS_PATH/envs/development" output -raw env_log_bucket_name)
  export ENV_LOG_PROJECT_ID=$(terraform -chdir="$GCP_ENVIRONMENTS_PATH/envs/development" output -raw env_log_project_id)
  ```
- Validate the variable values.

  ```bash
  echo env_log_project_id=$ENV_LOG_PROJECT_ID
  echo env_log_bucket_name=$ENV_LOG_BUCKET_NAME
  ```
- Reset your org policy for the logging project by running the following command.

  ```bash
  gcloud org-policies reset iam.allowedPolicyMemberDomains --project=$ENV_LOG_PROJECT_ID
  ```
- Assign the `roles/storage.objectCreator` role to the `[email protected]` group.

  ```bash
  gcloud storage buckets add-iam-policy-binding gs://$ENV_LOG_BUCKET_NAME --member="group:[email protected]" --role="roles/storage.objectCreator"
  ```
Note: You might receive an error saying that this violates an organization policy. This can happen because of propagation time for the change made to the organization policy (propagation typically takes 2 minutes, but can take 7 minutes or longer). If this happens, wait a few minutes and try again.
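Rather than retrying by hand while the policy change propagates, the binding can be wrapped in a simple retry loop. A sketch (the `retry` helper and `RETRY_DELAY` variable are our own, not part of the guide):

```shell
# retry MAX_ATTEMPTS CMD...: run CMD until it succeeds, sleeping
# RETRY_DELAY seconds (default 60) between attempts, to ride out
# org-policy propagation delays.
retry() {
  max_attempts=$1; shift
  attempt=1
  until "$@"; do
    [ "$attempt" -ge "$max_attempts" ] && return 1
    attempt=$((attempt + 1))
    sleep "${RETRY_DELAY:-60}"
  done
}
# usage:
# retry 10 gcloud storage buckets add-iam-policy-binding "gs://$ENV_LOG_BUCKET_NAME" \
#   --member="group:[email protected]" --role="roles/storage.objectCreator"
```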
- Delete the change made to the organization policy in the first step; this makes the project inherit its parent's policies.

  ```bash
  gcloud org-policies delete iam.allowedPolicyMemberDomains --project=$ENV_LOG_PROJECT_ID
  ```
- Configure the following variable with the path to your `gcp-environments` repository. Make sure your git is checked out to the `non-production` branch by running `git checkout non-production` in `GCP_ENVIRONMENTS_PATH`.

  ```bash
  export GCP_ENVIRONMENTS_PATH=INSERT_YOUR_PATH_HERE
  (cd $GCP_ENVIRONMENTS_PATH && git checkout non-production && ./tf-wrapper.sh init non-production)
  ```
- Retrieve the bucket name and project ID from the Terraform outputs.

  ```bash
  export ENV_LOG_BUCKET_NAME=$(terraform -chdir="$GCP_ENVIRONMENTS_PATH/envs/non-production" output -raw env_log_bucket_name)
  export ENV_LOG_PROJECT_ID=$(terraform -chdir="$GCP_ENVIRONMENTS_PATH/envs/non-production" output -raw env_log_project_id)
  ```
- Validate the variable values.

  ```bash
  echo env_log_project_id=$ENV_LOG_PROJECT_ID
  echo env_log_bucket_name=$ENV_LOG_BUCKET_NAME
  ```
- Reset your org policy for the logging project by running the following command.

  ```bash
  gcloud org-policies reset iam.allowedPolicyMemberDomains --project=$ENV_LOG_PROJECT_ID
  ```
- Assign the `roles/storage.objectCreator` role to the `[email protected]` group.

  ```bash
  gcloud storage buckets add-iam-policy-binding gs://$ENV_LOG_BUCKET_NAME --member="group:[email protected]" --role="roles/storage.objectCreator"
  ```
Note: You might receive an error saying that this violates an organization policy. This can happen because of propagation time for the change made to the organization policy (propagation typically takes 2 minutes, but can take 7 minutes or longer). If this happens, wait a few minutes and try again.
- Delete the change made to the organization policy in the first step; this makes the project inherit its parent's policies.

  ```bash
  gcloud org-policies delete iam.allowedPolicyMemberDomains --project=$ENV_LOG_PROJECT_ID
  ```
- Configure the following variable with the path to your `gcp-environments` repository. Make sure your git is checked out to the `production` branch by running `git checkout production` in `GCP_ENVIRONMENTS_PATH`.

  ```bash
  export GCP_ENVIRONMENTS_PATH=INSERT_YOUR_PATH_HERE
  (cd $GCP_ENVIRONMENTS_PATH && git checkout production && ./tf-wrapper.sh init production)
  ```
- Retrieve the bucket name and project ID from the Terraform outputs.

  ```bash
  export ENV_LOG_BUCKET_NAME=$(terraform -chdir="$GCP_ENVIRONMENTS_PATH/envs/production" output -raw env_log_bucket_name)
  export ENV_LOG_PROJECT_ID=$(terraform -chdir="$GCP_ENVIRONMENTS_PATH/envs/production" output -raw env_log_project_id)
  ```
- Validate the variable values.

  ```bash
  echo env_log_project_id=$ENV_LOG_PROJECT_ID
  echo env_log_bucket_name=$ENV_LOG_BUCKET_NAME
  ```
- Reset your org policy for the logging project by running the following command.

  ```bash
  gcloud org-policies reset iam.allowedPolicyMemberDomains --project=$ENV_LOG_PROJECT_ID
  ```
- Assign the `roles/storage.objectCreator` role to the `[email protected]` group.

  ```bash
  gcloud storage buckets add-iam-policy-binding gs://$ENV_LOG_BUCKET_NAME --member="group:[email protected]" --role="roles/storage.objectCreator"
  ```
Note: You might receive an error saying that this violates an organization policy. This can happen because of propagation time for the change made to the organization policy (propagation typically takes 2 minutes, but can take 7 minutes or longer). If this happens, wait a few minutes and try again.
- Delete the change made to the organization policy in the first step; this makes the project inherit its parent's policies.

  ```bash
  gcloud org-policies delete iam.allowedPolicyMemberDomains --project=$ENV_LOG_PROJECT_ID
  ```
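Because the same sequence is repeated for `development`, `non-production`, and `production`, it could be driven by a loop. The sketch below only prints the per-environment commands (a dry run) so you can review them before executing; the `print_option1_commands` function name is our own, and the `ENV_LOG_*` values must still be re-exported per environment after each checkout, as in the steps above.

```shell
# print_option1_commands: dry-run listing of the Option 1 per-environment
# steps (checkout/init, reset policy, grant role, delete policy override).
print_option1_commands() {
  for env in development non-production production; do
    echo "# --- ${env} ---"
    echo "(cd \$GCP_ENVIRONMENTS_PATH && git checkout ${env} && ./tf-wrapper.sh init ${env})"
    echo 'gcloud org-policies reset iam.allowedPolicyMemberDomains --project=$ENV_LOG_PROJECT_ID'
    echo 'gcloud storage buckets add-iam-policy-binding gs://$ENV_LOG_BUCKET_NAME --member="group:[email protected]" --role="roles/storage.objectCreator"'
    echo 'gcloud org-policies delete iam.allowedPolicyMemberDomains --project=$ENV_LOG_PROJECT_ID'
  done
}
```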
Proceed with these steps only if you did not choose Option 1.
- In `ml_logging.tf`, locate the following lines and uncomment them:

  ```hcl
  resource "google_storage_bucket_iam_member" "bucket_logging" {
    bucket = google_storage_bucket.log_bucket.name
    role   = "roles/storage.objectCreator"
    member = "group:[email protected]"
  }
  ```
- Under IAM & Admin, select Organization Policies. Search for "Domain Restricted Sharing".
- Select 'Manage Policy'. This directs you to the Domain Restricted Sharing Edit Policy page. It will be set to 'Inherit parent's policy'. Change this to 'Google-managed default'.
- Follow the instructions for checking out the `development`, `non-production`, and `production` branches. Once the environments Terraform code has successfully applied, edit the policy again, select 'Inherit parent's policy', and click SET POLICY.
After making these modifications, you can follow the README.md procedure for the `2-environments` step on the foundation; make sure you change the organization policy back after running the foundation steps.
- You can now move to the instructions in the network step. To use the Dual Shared VPC network mode, go to 3-networks-dual-svpc.
See the `0-bootstrap` README-Jenkins.md.

See the `0-bootstrap` README-GitHub.md.
- The following instructions assume that you are at the same level as the `terraform-google-enterprise-genai` folder. Change into the `2-environments` folder, copy the Terraform wrapper script, and ensure it can be executed.

  ```bash
  cd terraform-google-enterprise-genai/2-environments
  cp ../build/tf-wrapper.sh .
  chmod 755 ./tf-wrapper.sh
  ```
- Rename `terraform.example.tfvars` to `terraform.tfvars`.

  ```bash
  mv terraform.example.tfvars terraform.tfvars
  ```
- Update the file with values from your environment and 0-bootstrap output. See any of the envs folder README.md files for additional information on the values in the `terraform.tfvars` file.
- Use `terraform output` to get the backend bucket value from the 0-bootstrap output.

  ```bash
  export backend_bucket=$(terraform -chdir="../0-bootstrap/" output -raw gcs_bucket_tfstate)
  echo "remote_state_bucket = ${backend_bucket}"
  sed -i "s/REMOTE_STATE_BUCKET/${backend_bucket}/" ./terraform.tfvars
  ```
We will now deploy each of our environments (development, non-production, and production) using this script. When using Cloud Build or Jenkins as your CI/CD tool, each environment corresponds to a branch in the repository for the 2-environments step, and only the corresponding environment is applied.
To use the `validate` option of the `tf-wrapper.sh` script, please follow the instructions to install the terraform-tools component.
- Use `terraform output` to get the Cloud Build project ID and the environment step Terraform service account from the 0-bootstrap output. The `GOOGLE_IMPERSONATE_SERVICE_ACCOUNT` environment variable will be set to the Terraform service account to enable impersonation.

  ```bash
  export CLOUD_BUILD_PROJECT_ID=$(terraform -chdir="../0-bootstrap/" output -raw cloudbuild_project_id)
  echo ${CLOUD_BUILD_PROJECT_ID}
  export GOOGLE_IMPERSONATE_SERVICE_ACCOUNT=$(terraform -chdir="../0-bootstrap/" output -raw environment_step_terraform_service_account_email)
  echo ${GOOGLE_IMPERSONATE_SERVICE_ACCOUNT}
  ```
- Ensure you disable the organization policy on the `development` folder before continuing further.
- Run `init` and `plan` and review the output for environment development.

  ```bash
  ./tf-wrapper.sh init development
  ./tf-wrapper.sh plan development
  ```
- Run `validate` and check for violations.

  ```bash
  ./tf-wrapper.sh validate development $(pwd)/../policy-library ${CLOUD_BUILD_PROJECT_ID}
  ```
- Run `apply` development.

  ```bash
  ./tf-wrapper.sh apply development
  ```
- Ensure you disable the organization policy on the `non-production` folder before continuing further.
- Run `init` and `plan` and review the output for environment non-production.

  ```bash
  ./tf-wrapper.sh init non-production
  ./tf-wrapper.sh plan non-production
  ```
- Run `validate` and check for violations.

  ```bash
  ./tf-wrapper.sh validate non-production $(pwd)/../policy-library ${CLOUD_BUILD_PROJECT_ID}
  ```
- Run `apply` non-production.

  ```bash
  ./tf-wrapper.sh apply non-production
  ```
- Ensure you disable the organization policy on the `production` folder before continuing further.
- Run `init` and `plan` and review the output for environment production.

  ```bash
  ./tf-wrapper.sh init production
  ./tf-wrapper.sh plan production
  ```
- Run `validate` and check for violations.

  ```bash
  ./tf-wrapper.sh validate production $(pwd)/../policy-library ${CLOUD_BUILD_PROJECT_ID}
  ```
- Run `apply` production.

  ```bash
  ./tf-wrapper.sh apply production
  ```
If you received any errors or made any changes to the Terraform config or `terraform.tfvars`, you must re-run `./tf-wrapper.sh plan <env>` before running `./tf-wrapper.sh apply <env>`.
Before executing the next stages, unset the `GOOGLE_IMPERSONATE_SERVICE_ACCOUNT` environment variable.

```bash
unset GOOGLE_IMPERSONATE_SERVICE_ACCOUNT
cd ../..
```
- You can now move to the instructions in the network step. To use the Dual Shared VPC network mode, go to 3-networks-dual-svpc.