QoL updates ✅ #24

Merged · 9 commits · Nov 4, 2024
67 changes: 0 additions & 67 deletions .github/workflows/deployment.yaml

This file was deleted.

73 changes: 73 additions & 0 deletions .github/workflows/deployment_apply.yaml
@@ -0,0 +1,73 @@
name: Terraform Apply
run-name: Terraform apply ${{ inputs.workspace }} by @${{ github.actor }}

on:
  workflow_dispatch:
    inputs:
      workspace:
        description: 'The workspace to terraform against'
        required: true
        type: choice
        options:
          - ""
          - prod

concurrency:
  group: ${{ github.event.inputs.workspace }}-terraform
  cancel-in-progress: false

permissions:
  id-token: write
  contents: read

env:
  workspace: ${{ github.event.inputs.workspace }}
  terraform_action: apply

jobs:
  terraform:
    name: Run Terraform
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash
        # this may need to be updated if you change the directory you are working with
        # ./terraform/implementation/dev || ./terraform/implementation/prod for example
        # this practice is recommended to keep the terraform code organized while reducing the risk of conflicts
        working-directory: ./terraform/implementation/ecs
    steps:
      - name: Check Out Changes
        uses: actions/checkout@v4

      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v3

      - name: configure aws credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: ${{ secrets.AWS_ROLE_ARN }}
          role-session-name: githubDeploymentWorkflow
          aws-region: ${{ vars.AWS_REGION }}

      - name: Terraform
        env:
          ACTION: ${{ env.terraform_action }}
          BUCKET: ${{ secrets.TFSTATE_BUCKET }}
          DYNAMODB_TABLE: ${{ secrets.TFSTATE_DYNAMODB_TABLE }}
          OWNER: ${{ vars.OWNER }}
          PROJECT: ${{ vars.PROJECT }}
          REGION: ${{ vars.AWS_REGION }}
          WORKSPACE: ${{ env.workspace }}
        shell: bash
        run: |
          echo "owner = \"$OWNER\"" >> $WORKSPACE.tfvars
          echo "project = \"$PROJECT\"" >> $WORKSPACE.tfvars
          echo "region = \"$REGION\"" >> $WORKSPACE.tfvars
          terraform init \
            -var-file="$WORKSPACE.tfvars" \
            -backend-config "bucket=$BUCKET" \
            -backend-config "dynamodb_table=$DYNAMODB_TABLE" \
            -backend-config "region=$REGION" \
            || (echo "terraform init failed, exiting..." && exit 1)
          terraform workspace select "$WORKSPACE"
          terraform apply -auto-approve -var-file="$WORKSPACE.tfvars"
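
For reference, here is a minimal local sketch of the same flow the `Terraform` step runs. Every exported value is a placeholder standing in for the workflow's secrets and repository variables (`TFSTATE_BUCKET`, `TFSTATE_DYNAMODB_TABLE`, `AWS_REGION`, `OWNER`, `PROJECT`, and the `workspace` input); adjust them to your own environment.

```bash
# run from ./terraform/implementation/ecs (or your environment's copy of it);
# all exported values below are placeholders, not real bucket/table/owner names
export WORKSPACE="prod"
export BUCKET="example-tfstate-bucket"
export DYNAMODB_TABLE="example-tfstate-lock-table"
export REGION="us-east-1"

# mirror the tfvars lines the workflow writes
echo "owner = \"example-owner\"" >> "$WORKSPACE.tfvars"
echo "project = \"dibbs\""       >> "$WORKSPACE.tfvars"
echo "region = \"$REGION\""      >> "$WORKSPACE.tfvars"

terraform init \
  -backend-config "bucket=$BUCKET" \
  -backend-config "dynamodb_table=$DYNAMODB_TABLE" \
  -backend-config "region=$REGION"
terraform workspace select "$WORKSPACE" || terraform workspace new "$WORKSPACE"
terraform apply -var-file="$WORKSPACE.tfvars"   # review the plan locally; the workflow adds -auto-approve
```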
73 changes: 73 additions & 0 deletions .github/workflows/deployment_plan.yaml
@@ -0,0 +1,73 @@
name: Terraform Plan
run-name: Terraform plan ${{ inputs.workspace }} by @${{ github.actor }}

on:
  workflow_dispatch:
    inputs:
      workspace:
        description: 'The workspace to terraform against'
        required: true
        type: choice
        options:
          - ""
          - prod

concurrency:
  group: ${{ github.event.inputs.workspace }}-terraform
  cancel-in-progress: false

permissions:
  id-token: write
  contents: read

env:
  workspace: ${{ github.event.inputs.workspace }}
  terraform_action: plan

jobs:
  terraform:
    name: Run Terraform
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash
        # this may need to be updated if you change the directory you are working with
        # ./terraform/implementation/dev || ./terraform/implementation/prod for example
        # this practice is recommended to keep the terraform code organized while reducing the risk of conflicts
        working-directory: ./terraform/implementation/ecs
    steps:
      - name: Check Out Changes
        uses: actions/checkout@v4

      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v3

      - name: configure aws credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: ${{ secrets.AWS_ROLE_ARN }}
          role-session-name: githubDeploymentWorkflow
          aws-region: ${{ vars.AWS_REGION }}

      - name: Terraform
        env:
          ACTION: ${{ env.terraform_action }}
          BUCKET: ${{ secrets.TFSTATE_BUCKET }}
          DYNAMODB_TABLE: ${{ secrets.TFSTATE_DYNAMODB_TABLE }}
          OWNER: ${{ vars.OWNER }}
          PROJECT: ${{ vars.PROJECT }}
          REGION: ${{ vars.AWS_REGION }}
          WORKSPACE: ${{ env.workspace }}
        shell: bash
        run: |
          echo "owner = \"$OWNER\"" >> $WORKSPACE.tfvars
          echo "project = \"$PROJECT\"" >> $WORKSPACE.tfvars
          echo "region = \"$REGION\"" >> $WORKSPACE.tfvars
          terraform init \
            -var-file="$WORKSPACE.tfvars" \
            -backend-config "bucket=$BUCKET" \
            -backend-config "dynamodb_table=$DYNAMODB_TABLE" \
            -backend-config "region=$REGION" \
            || (echo "terraform init failed, exiting..." && exit 1)
          terraform workspace select "$WORKSPACE"
          terraform plan -var-file="$WORKSPACE.tfvars"
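
Since both workflows are `workflow_dispatch`-only, they can also be kicked off from a terminal with the GitHub CLI. A small sketch, assuming `gh` is installed and authenticated and the workflow file is present on the default branch:

```bash
# trigger the Terraform Plan workflow against the prod workspace
gh workflow run deployment_plan.yaml -f workspace=prod

# check on the run it started
gh run list --workflow=deployment_plan.yaml --limit 1
```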
7 changes: 4 additions & 3 deletions README.md
@@ -193,9 +193,10 @@ The setup.sh script will create the following files:

## 4.6 Run Terraform Code In Your Designated Environment
<em><strong>4.6.1. Run ECS Module Locally</em></strong>
- * To run your ECS Module Changes in your local terminal, navigate to _terraform/implementation/ecs/_ and run the following command: `cd /terraform/implementation`.
+ * It is highly recommended to create a new directory per environment that is launched; to do so, run `cp -r terraform/implementation/ecs terraform/implementation/{insertEnvironmentName}`.
+ * To run your ECS Module changes in your local terminal, navigate to your working directory: `cd terraform/implementation/ecs/` or `cd terraform/implementation/{insertEnvironmentName}`.
* In your terminal, run the deploy script for your designated environment, `./deploy.sh -e {insertEnvironmentName}` (see the sketch after this list).\
- &nbsp;&nbsp;&nbsp;&nbsp;<u><em><strong>Note</em></strong></u>: The _-e_ tag stands for environment and you can specify `dev`, `stage`, `prod`
+ &nbsp;&nbsp;&nbsp;&nbsp;<u><em><strong>Note</em></strong></u>: The _-e_ flag stands for environment; you can specify `dev`, `stage`, `prod` (matching your environment naming convention),
&nbsp;&nbsp;&nbsp;&nbsp;or whatever environment your team desires.
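
For illustration, a minimal sketch of the local flow described above, using a hypothetical environment named `dev` (substitute your own environment name for the directory and the `-e` value):

```bash
# copy the ECS implementation into a per-environment directory, then deploy
cp -r terraform/implementation/ecs terraform/implementation/dev
cd terraform/implementation/dev
./deploy.sh -e dev
```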

[Return to Table of Contents](#table-of-contents).
@@ -226,7 +227,7 @@ The setup.sh script will create the following files:

## 4.10 Update Variables
<em><strong>4.10.1. Update Other Default Variables</em></strong>
- * Navigate to the _defaults.tfvars_ file `cd terraform/implementation/ecs/default.tfvars`.
+ * Navigate to the directory containing the _defaults.tfvars_ file: `cd terraform/implementation/ecs/` or `cd terraform/implementation/{insertEnvironmentName}`.
* In this _defaults.tfvars_ file, you can update and override any other default values.
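
As a sketch of 4.10.1, assuming the variable names listed in the ECS module's inputs table (the values shown are hypothetical overrides, not recommendations):

```bash
cd terraform/implementation/ecs        # or terraform/implementation/{insertEnvironmentName}
grep -E '^(owner|project|region)' defaults.tfvars   # inspect the current values, if present
"${EDITOR:-vi}" defaults.tfvars                     # edit in place, e.g. set region = "us-east-2"
terraform fmt defaults.tfvars                       # optional: normalize formatting after editing
```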

<em><strong>4.10.2. Test and Validate Your Changes</em></strong>
11 changes: 5 additions & 6 deletions terraform/implementation/ecs/README.md
@@ -28,18 +28,17 @@

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
- | <a name="input_availability_zones"></a> [availability\_zones](#input\_availability\_zones) | The availability zones to use | `list(string)` | <pre>[<br/> "us-east-1a",<br/> "us-east-1b",<br/> "us-east-1c"<br/>]</pre> | no |
+ | <a name="input_availability_zones"></a> [availability\_zones](#input\_availability\_zones) | The availability zones to use | `list(string)` | <pre>[<br> "us-east-1a",<br> "us-east-1b",<br> "us-east-1c"<br>]</pre> | no |
| <a name="input_ecr_viewer_database_schema"></a> [ecr\_viewer\_database\_schema](#input\_ecr\_viewer\_database\_schema) | The database schema used for the eCR data tables | `string` | `"core"` | no |
| <a name="input_ecr_viewer_database_type"></a> [ecr\_viewer\_database\_type](#input\_ecr\_viewer\_database\_type) | The SQL variant used for the eCR data tables | `string` | `"postgres"` | no |
| <a name="input_ecs_alb_sg"></a> [ecs\_alb\_sg](#input\_ecs\_alb\_sg) | The security group for the Application Load Balancer | `string` | `"ecs-albsg"` | no |
| <a name="input_enable_nat_gateway"></a> [enable\_nat\_gateway](#input\_enable\_nat\_gateway) | Enable NAT Gateway | `bool` | `true` | no |
| <a name="input_internal"></a> [internal](#input\_internal) | Flag to determine whether several AWS resources are public (intended for external access over the public internet) or private (only accessible within your AWS VPC or via other means, such as a transit gateway). | `bool` | `true` | no |
| <a name="input_owner"></a> [owner](#input\_owner) | The owner of the infrastructure | `string` | `"skylight"` | no |
| <a name="input_phdi_version"></a> [phdi\_version](#input\_phdi\_version) | PHDI container image version | `string` | `"v1.4.4"` | no |
- | <a name="input_private_subnets"></a> [private\_subnets](#input\_private\_subnets) | The private subnets | `list(string)` | <pre>[<br/> "176.24.1.0/24",<br/> "176.24.3.0/24"<br/>]</pre> | no |
- | <a name="input_project"></a> [project](#input\_project) | The project name | `string` | `"dibbs-ce"` | no |
- | <a name="input_public_subnets"></a> [public\_subnets](#input\_public\_subnets) | The public subnets | `list(string)` | <pre>[<br/> "176.24.2.0/24",<br/> "176.24.4.0/24"<br/>]</pre> | no |
+ | <a name="input_private_subnets"></a> [private\_subnets](#input\_private\_subnets) | The private subnets | `list(string)` | <pre>[<br> "176.24.1.0/24",<br> "176.24.3.0/24"<br>]</pre> | no |
+ | <a name="input_project"></a> [project](#input\_project) | The project name | `string` | `"dibbs"` | no |
+ | <a name="input_public_subnets"></a> [public\_subnets](#input\_public\_subnets) | The public subnets | `list(string)` | <pre>[<br> "176.24.2.0/24",<br> "176.24.4.0/24"<br>]</pre> | no |
| <a name="input_region"></a> [region](#input\_region) | AWS region | `string` | `"us-east-1"` | no |
| <a name="input_single_nat_gateway"></a> [single\_nat\_gateway](#input\_single\_nat\_gateway) | Single NAT Gateway | `bool` | `true` | no |
| <a name="input_vpc"></a> [vpc](#input\_vpc) | The name of the VPC | `string` | `"ecs-vpc"` | no |
| <a name="input_vpc_cidr"></a> [vpc\_cidr](#input\_vpc\_cidr) | The CIDR block for the VPC | `string` | `"176.24.0.0/16"` | no |
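
For list-typed inputs like those above, a `.tfvars` entry looks like the following. This is a sketch only; `prod.tfvars` is the workspace file the deployment workflows generate, and the values simply repeat the documented defaults:

```bash
# append list-typed overrides to a workspace tfvars file (the file name is an assumption)
cat >> prod.tfvars <<'EOF'
availability_zones = ["us-east-1a", "us-east-1b", "us-east-1c"]
private_subnets    = ["176.24.1.0/24", "176.24.3.0/24"]
public_subnets     = ["176.24.2.0/24", "176.24.4.0/24"]
EOF
```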
