This module allows you to create log exports at the project, folder, or organization level.
The resources/services/activations/deletions that this module will create/trigger are:
- An aggregated log export at the project, folder, or organization level
- A Service account (logsink writer)
- A Destination (Cloud Storage bucket, Cloud Pub/Sub topic, BigQuery dataset)
You can go to the examples folder to see all the use cases; however, basic usage of the module in your own main.tf file looks like this:
module "logsink" {
source = "terraform-google-modules/log-export/google"
name = "my-logsink"
folder = "2165468435"
filter = "severity >= ERROR"
include_children = true
pubsub = {
name = "my-logsink-pubsub"
project = "my-pubsub-project"
}
}
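The other destination types follow the same shape. As an illustrative sketch only, a project-level sink writing to a Cloud Storage bucket might look like the block below; the `project` and `storage` attributes are assumptions by analogy with the example above, so check variables.tf for the attributes the module actually accepts:

```hcl
# A sketch only: the `project` and `storage` attributes are assumed by
# analogy with the folder/pubsub example above. Verify the real attribute
# names in variables.tf before using this.
module "logsink_storage" {
  source = "terraform-google-modules/log-export/google"

  name    = "my-logsink"
  project = "my-logs-project"
  filter  = "severity >= WARNING"

  storage = {
    name    = "my-logsink-bucket"
    project = "my-storage-project"
  }
}
```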
This module has the following requirements:
- Terraform 0.10.x
- terraform-provider-google plugin v1.8.0
In order to execute this module you must have a Service Account with the following roles:

- `roles/logging.configWriter` on the logsink's project, folder, or organization (to create the logsink)
- `roles/iam.admin` on the destination project (to grant write permissions for the logsink service account)
- `roles/serviceusage.admin` on the destination project (to enable the destination API)
To use a Google Cloud Pub/Sub topic as the destination:

- `roles/pubsub.admin` on the destination project (to create a Pub/Sub topic)

To integrate the logsink with Splunk, you'll also need a topic subscriber (service account):

- `roles/serviceAccount.admin` on the destination project (to create a service account for the logsink subscriber)

To use a Google Cloud Storage bucket as the destination:

- `roles/storage.admin` on the destination project (to create a storage bucket)

To use a BigQuery dataset as the destination:

- `roles/bigquery.dataEditor` on the destination project (to create a BigQuery dataset)
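If you manage these grants with Terraform itself, a minimal sketch using the `google_project_iam_member` resource could look like the following. The project IDs and the service account email are placeholder assumptions, the Pub/Sub destination is used as the example, and for a folder- or organization-level sink the `roles/logging.configWriter` grant would be made on the folder or organization instead of a project:

```hcl
# Placeholder identity for the service account that runs Terraform.
variable "runner_sa" {
  default = "terraform@my-admin-project.iam.gserviceaccount.com"
}

# Needed to create the logsink (a project-level sink is shown here).
resource "google_project_iam_member" "logging_config_writer" {
  project = "my-logs-project"
  role    = "roles/logging.configWriter"
  member  = "serviceAccount:${var.runner_sa}"
}

# Needed to create the Pub/Sub topic in the destination project.
resource "google_project_iam_member" "pubsub_admin" {
  project = "my-pubsub-project"
  role    = "roles/pubsub.admin"
  member  = "serviceAccount:${var.runner_sa}"
}
```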
In order to operate with the Service Account you must activate the following APIs on the base project where the Service Account was created:
- Cloud Resource Manager API - cloudresourcemanager.googleapis.com
- Cloud Billing API - cloudbilling.googleapis.com
- Identity and Access Management API - iam.googleapis.com
- Service Usage API - serviceusage.googleapis.com
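These APIs can be activated in the Cloud Console, or, if you prefer to manage them with Terraform, with a sketch along these lines (the project ID is a placeholder, and this assumes the `google_project_service` resource is available in your provider version):

```hcl
variable "required_apis" {
  type = "list"

  default = [
    "cloudresourcemanager.googleapis.com",
    "cloudbilling.googleapis.com",
    "iam.googleapis.com",
    "serviceusage.googleapis.com",
  ]
}

# "my-admin-project" is a placeholder for the base project where the
# Service Account was created.
resource "google_project_service" "required" {
  count   = "${length(var.required_apis)}"
  project = "my-admin-project"
  service = "${element(var.required_apis, count.index)}"
}
```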
Be sure you have the correct Terraform version (0.10.x); you can download the binary from the Terraform releases page.
Then perform the following commands:
- `terraform init` to get the plugins
- `terraform plan` to see the infrastructure plan
- `terraform apply` to apply the infrastructure build
- `terraform destroy` to destroy the built infrastructure
Please refer to the variables.tf file for the required and optional variables.
Please refer to the outputs.tf file for the outputs that you can get with the `terraform output` command.
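As an illustration, an output can also be re-exported from the calling configuration; the output name used below is hypothetical, so substitute a real name from outputs.tf:

```hcl
# "service_account" is a hypothetical output name; see outputs.tf for the
# outputs this module actually exposes.
output "logsink_writer_identity" {
  value = "${module.logsink.service_account}"
}
```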
The project has the following folders and files:
- /: root folder
- /examples: examples for using this module
- /scripts: shell scripts for specific tasks on the module
- /test: folders with files for testing the module (see the Testing section of this file)
- /main.tf: main file for this module, contains all the resources to create
- /variables.tf: all the variables for the module
- /outputs.tf: the outputs of the module
- /readme.MD: this file
The integration tests for this module are built with bats 0.4.0; basically, the tests check the following:
- Perform `terraform init` command
- Perform `terraform get` command
- Perform `terraform plan` command and check that it will create n resources, modify 0 resources, and delete 0 resources
- Perform `terraform apply -auto-approve` command and check that it has created the n resources, modified 0 resources, and deleted 0 resources
- Perform several `gcloud` commands and check that the infrastructure is in the desired state
- Perform `terraform destroy -force` command and check that it has destroyed the n resources
You can use the following command to run the integration test in the folder /test/integration/gcloud-test:

```
. launch.sh
```