Assuming multiple roles for a Terraform deployment #424

Closed
lpossamai opened this issue Apr 25, 2022 · 16 comments
Labels: effort/medium, feature-request, p2

Comments

@lpossamai

Hello,

I have a question: how can I use configure-aws-credentials to assume multiple roles, so that my Terraform provider.tf file can apply all the necessary changes to multiple accounts?

Example: In my PROD workspace, I need to deploy to TEST and DEV workspaces. In my provider.tf file I have the following:

provider "aws" {
  region = "ap-southeast-2"
  assume_role {
    role_arn = local.role_arns[terraform.workspace]
  }
}

provider "aws" {
  alias  = "test"
  region = "ap-southeast-2"
  assume_role {
    role_arn = local.role_arns.test
  }
}

provider "aws" {
  alias  = "staging"
  region = "ap-southeast-2"
  assume_role {
    role_arn = local.role_arns.staging
  }
}

In my Github Actions workflow I have the following:

steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          role-to-assume: ${{ env.iam_role_to_assume_prod }}
          aws-region: ${{ env.AWS_REGION }}

But that gives me an error, because GitHub doesn't have permission to assume the other two roles, staging and test.

Error: NoCredentialProviders: no valid providers in chain. Deprecated.
	For verbose messaging see aws.Config.CredentialsChainVerboseErrors

Is there a workaround for this? Any suggestions are welcome.

Thanks!

@alicancakil

Facing the same issue and would love to find a solution.

@nooperpudd

I can provide a suitable solution for multiple regions & multiple accounts.
Here is the repo link: http://github.com/startuplcoud/infra-multi-account-region-startup-kit/
I still need to update the documentation with many more details, though.

@HanwhaARudolph

HanwhaARudolph commented Jul 1, 2022

This is how I manage it in my pipeline:

      - name: Terraform Validate
        working-directory: ./ProvisionAWSGlobal
        id: validate
        run: terraform validate -no-color
        env:
          AWS_ACCESS_KEY_ID: "${{ secrets.APPID }}"
          AWS_SECRET_ACCESS_KEY: "${{ secrets.APPSECRET }}"
        continue-on-error: true 

And in my terraform:

provider "aws" {
  alias  = "base"
  region = var.deploy_region
  default_tags {
    tags = {
      managed_by = "Terraform"
    }
  }
}

provider "aws" {
  alias  = "other"
  region = var.deploy_region
  assume_role {
    role_arn     = "arn:aws:iam::${var.management_account_id}:role/${var.rolename}"
  }
  default_tags {
    tags = {
      managed_by = "Terraform"
    }
  }
}
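
Resources can then pin to whichever provider alias they need. A minimal usage sketch (the bucket names are illustrative):

# Created with the runner's base credentials via the "base" alias.
resource "aws_s3_bucket" "local_logs" {
  provider = aws.base
  bucket   = "example-local-logs" # illustrative name
}

# Created in the management account via the "other" alias.
resource "aws_s3_bucket" "central_logs" {
  provider = aws.other
  bucket   = "example-central-logs" # illustrative name
}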

@kono2021

We do Assume Role twice to manage multiple-provider situations like this one.

[GithubAction] -----------------------> [prod_role] -----------------------> [staging_role] 
                 assume role with                     assume role with
             configure-aws-credentials              terraform assume_role

[GithubAction] -----------------------> [prod_role] -----------------------> [test_role] 
                 assume role with                     assume role with
             configure-aws-credentials              terraform assume_role
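
For this chain to work, prod_role needs sts:AssumeRole permission on the target roles, and each target role must trust prod_role. A hedged sketch of that trust relationship in Terraform (the account ID and role names are illustrative):

# Trust policy letting prod_role (assumed by configure-aws-credentials)
# chain into staging_role (assumed by the Terraform provider).
data "aws_iam_policy_document" "staging_trust" {
  statement {
    actions = ["sts:AssumeRole"]
    principals {
      type        = "AWS"
      identifiers = ["arn:aws:iam::111111111111:role/prod_role"] # illustrative ARN
    }
  }
}

resource "aws_iam_role" "staging_role" {
  name               = "staging_role"
  assume_role_policy = data.aws_iam_policy_document.staging_trust.json
}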

@lpossamai
Author

> We do Assume Role twice to manage multiple-provider situations like this one.

If you could provide example code, that would be awesome.

@peterwoodworth added the needs-triage label on Oct 1, 2022
@CyberViking949

Hello, I recently encountered this same issue. Is there any update on a fix?

@Constantin07

@CyberViking949 This advice worked for me to assume multiple roles #636 (comment)

@CyberViking949

> @CyberViking949 This advice worked for me to assume multiple roles #636 (comment)

Thanks @Constantin07, however this requires setting up static access keys. The whole reason I was leveraging this action was to use the GitHub OIDC provider in AWS, so I'm assuming a role in an identity account in order to assume roles in prod/dev accounts, all using ephemeral tokens.

Action assume role --> Identity role (this action) --> Backend role for S3 state files
                                                   --> Child role for plan/apply

The backend role is assumed properly and state is pulled. However, plan/apply is not using the role defined in the provider and is instead using the role from the identity account.
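
For reference, a minimal sketch of that layout (bucket, key, region, and ARNs are all illustrative):

terraform {
  backend "s3" {
    bucket   = "example-tf-state"       # illustrative
    key      = "prod/terraform.tfstate" # illustrative
    region   = "us-east-1"
    # Backend role, assumed from the identity role: this part works.
    role_arn = "arn:aws:iam::111111111111:role/backend-role"
  }
}

provider "aws" {
  region = "us-east-1"
  assume_role {
    # Child role that plan/apply should run as (illustrative ARN).
    role_arn = "arn:aws:iam::222222222222:role/child-role"
  }
}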

@peterwoodworth added the p2, effort/medium, and feature-request labels and removed the needs-triage label on Feb 21, 2023
@mcblair

mcblair commented Mar 19, 2023

I'm standing on the shoulders of giants with this, but here is something that I whipped up to meet my use case: https://github.com/marketplace/actions/configure-aws-profile

@peterwoodworth
Contributor

Thanks for sharing this @mcblair, this is excellent. I'm going to close this issue in favor of #112, as I suspect that once #112 is implemented it will work for this use case. Let me know if you disagree and I can reopen this issue.


@lpossamai
Author

Hi @peterwoodworth ,

I disagree that #112 will fix this issue. #112 uses profiles, not IAM roles. Depending on your setup, that would result in a very long pipeline config file and lots and lots of GitHub Secrets to configure, which isn't practical.

If we take the #112 example:

- name: Add Dev profile credentials to ~/.aws/credentials
  env:
    DEV_AWS_ACCESS_KEY_ID: ${{ secrets.DEV_AWS_ACCESS_KEY_ID }}
    DEV_AWS_SECRET_ACCESS_KEY: ${{ secrets.DEV_AWS_SECRET_ACCESS_KEY }}
  run: |
    aws configure set aws_access_key_id $DEV_AWS_ACCESS_KEY_ID --profile my-app-name-dev
    aws configure set aws_secret_access_key $DEV_AWS_SECRET_ACCESS_KEY --profile my-app-name-dev

- name: Add Staging profile credentials to ~/.aws/credentials
  env:
    STAGING_AWS_ACCESS_KEY_ID: ${{ secrets.STAGING_AWS_ACCESS_KEY_ID }}
    STAGING_AWS_SECRET_ACCESS_KEY: ${{ secrets.STAGING_AWS_SECRET_ACCESS_KEY }}
  run: |
    aws configure set aws_access_key_id $STAGING_AWS_ACCESS_KEY_ID --profile my-app-name-staging
    aws configure set aws_secret_access_key $STAGING_AWS_SECRET_ACCESS_KEY --profile my-app-name-staging

- name: Add Prod profile credentials to ~/.aws/credentials
  env:
    PROD_AWS_ACCESS_KEY_ID: ${{ secrets.PROD_AWS_ACCESS_KEY_ID }}
    PROD_AWS_SECRET_ACCESS_KEY: ${{ secrets.PROD_AWS_SECRET_ACCESS_KEY }}
  run: |
    aws configure set aws_access_key_id $PROD_AWS_ACCESS_KEY_ID --profile my-app-name-prod
    aws configure set aws_secret_access_key $PROD_AWS_SECRET_ACCESS_KEY --profile my-app-name-prod
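
For completeness, Terraform would then need one provider block per profile, which is where the sprawl comes from (a sketch using the profile names above; the region is illustrative):

provider "aws" {
  alias   = "dev"
  region  = "ap-southeast-2"
  profile = "my-app-name-dev"
}

provider "aws" {
  alias   = "staging"
  region  = "ap-southeast-2"
  profile = "my-app-name-staging"
}

provider "aws" {
  alias   = "prod"
  region  = "ap-southeast-2"
  profile = "my-app-name-prod"
}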

What I propose is a way to support authenticating to multiple AWS accounts using IAM roles.
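
In the meantime, a hedged workaround sketch using only standard AWS CLI commands (the role ARNs and profile names are illustrative): after this action assumes a base role via OIDC, chain into the other roles with sts assume-role and save each one as a named profile for Terraform to consume:

      - name: Assume additional roles into named profiles
        run: |
          for env in test staging; do
            # Chain from the base credentials into the per-environment role (illustrative ARN).
            creds=$(aws sts assume-role \
              --role-arn "arn:aws:iam::123456789012:role/${env}-deploy" \
              --role-session-name "github-${env}" \
              --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
              --output text)
            read -r key secret token <<< "$creds"
            aws configure set aws_access_key_id "$key" --profile "$env"
            aws configure set aws_secret_access_key "$secret" --profile "$env"
            aws configure set aws_session_token "$token" --profile "$env"
          done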

@peterwoodworth
Contributor

Thanks @lpossamai, I see why profiles don't solve this for you.

I'm curious to know more about how exactly you're using this action within your workflow, and what exactly you're doing in Terraform. I'm unfamiliar with Terraform: is there a single command, run in one step, that needs to assume multiple roles at once in order to work?

@peterwoodworth reopened this on Jul 3, 2023
@lpossamai
Author

Hi @peterwoodworth , thanks for your prompt reply.

TBH, I have changed the way I use Terraform and authenticate with AWS, so this issue no longer affects me and I can't replicate it anymore. Looking at this further, I realize now that the limitation I was facing is not something that needs to be, or can be, fixed by the maintainers of aws-actions/configure-aws-credentials; it should be addressed at the Terraform level.


A little background for further reference.

Before the change I made, I was using GitHub Actions to deploy my infrastructure to AWS with Terraform. Sample code:

// terraform/elb/main.tf
resource "aws_lb" "alb" {
  count                      = terraform.workspace == "test" || terraform.workspace == "staging" ? 1 : 0
  name                       = "example-${terraform.workspace}-alb"
  internal                   = false
  load_balancer_type         = "application"
  security_groups            = [aws_security_group.alb[count.index].id]
  subnets                    = data.terraform_remote_state.network.outputs.public_subnets
  idle_timeout               = 300
  enable_deletion_protection = true
  enable_http2               = true
  preserve_host_header       = true
  drop_invalid_header_fields = true

  access_logs {
    bucket  = module.alb_log_bucket[count.index].s3_bucket_id
    prefix  = terraform.workspace
    enabled = true
  }

  tags = merge({
    Environment = terraform.workspace
  }, var.tags)
}

The GitHub workflow for that particular folder looked like this:

jobs:
  ELB-TEST:
    name: "ELB-TEST"
    runs-on: ubuntu-latest
    environment: test
    env:
      TF_VAR_iam_role_to_assume_test: ${{ secrets.iam_role_to_assume_test }}
      ENVIRONMENT: test
    defaults:
      run:
        working-directory: ${{ env.WORKING_DIRECTORY }}

    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          role-to-assume: ${{ env.TF_VAR_iam_role_to_assume_test }}
          role-session-name: github-ELB-test
          aws-region: ${{ env.AWS_REGION }}

      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2

      - name: Terraform Format
        id: fmt
        run: terraform fmt -check -recursive

      - name: Terraform Init
        id: init
        run: |
          terraform init -backend-config="role_arn=$TF_VAR_iam_role_to_terraform_backend"

      - name: Terraform Validate
        id: validate
        run: |
          terraform validate -no-color
        env:
          TF_WORKSPACE: test
          TF_IN_AUTOMATION: true

      - name: Terraform Plan
        id: plan
        if: github.event_name == 'pull_request'
        run: terraform plan -input=false -out=tf_plan_out_${{ env.ENVIRONMENT }}_${{ env.TF_MODULE_NAME }}.tfplan
        continue-on-error: false
        env:
          TF_WORKSPACE: test
          TF_IN_AUTOMATION: true

      - name: Terraform Apply
        if: github.ref == 'refs/heads/main' && github.event_name == 'push'
        run: terraform apply -input=false -auto-approve tf_plan_out_${{ env.ENVIRONMENT }}_${{ env.TF_MODULE_NAME }}.tfplan
        env:
          TF_WORKSPACE: test
          TF_IN_AUTOMATION: true

  ELB-STAGING:
    name: "ELB-STAGING"
    runs-on: ubuntu-latest
    needs: ELB-TEST
    environment: staging
    env:
      TF_VAR_iam_role_to_assume_staging: ${{ secrets.iam_role_to_assume_staging }}
      ENVIRONMENT: staging
    defaults:
      run:
        working-directory: ${{ env.WORKING_DIRECTORY }}

    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          role-to-assume: ${{ env.TF_VAR_iam_role_to_assume_staging }}
          role-session-name: github-ELB-staging
          aws-region: ${{ env.AWS_REGION }}

      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v2

      - name: Terraform Format
        id: fmt
        run: terraform fmt -check -recursive

      - name: Terraform Init
        id: init
        run: |
          terraform init -backend-config="role_arn=$TF_VAR_iam_role_to_terraform_backend"

      - name: Terraform Validate
        id: validate
        run: |
          terraform validate -no-color
        env:
          TF_WORKSPACE: staging
          TF_IN_AUTOMATION: true

      - name: Terraform Plan
        id: plan
        if: github.event_name == 'pull_request'
        run: terraform plan -input=false -out=tf_plan_out_${{ env.ENVIRONMENT }}_${{ env.TF_MODULE_NAME }}.tfplan
        continue-on-error: false
        env:
          TF_WORKSPACE: staging
          TF_IN_AUTOMATION: true

      - name: Terraform Apply
        if: github.ref == 'refs/heads/main' && github.event_name == 'push'
        run: terraform apply -input=false -auto-approve tf_plan_out_${{ env.ENVIRONMENT }}_${{ env.TF_MODULE_NAME }}.tfplan
        env:
          TF_WORKSPACE: staging
          TF_IN_AUTOMATION: true

So, not great: I would need a job for each of my environments and for each of my terraform/** folders/modules. And on top of that, deploying to multiple accounts in the same PR wouldn't be possible.

What I ended up doing was:

  1. Moved my TF backend to a Shared-Services AWS Account
  2. Implemented Terragrunt in my repository to help keep the code DRY
  3. Implemented Terrateam as my new CI solution

This allows me to deploy to multiple accounts in the same PR using provider = aws.alias. You can check this diagram to understand the concept.
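
For anyone landing here, the aliased-provider pattern looks like this (role ARN, bucket name, and module path are illustrative):

provider "aws" {
  alias  = "test"
  region = "ap-southeast-2"
  assume_role {
    role_arn = "arn:aws:iam::111111111111:role/test-deploy" # illustrative
  }
}

# Pin a resource to the aliased provider:
resource "aws_s3_bucket" "test_artifacts" {
  provider = aws.test
  bucket   = "example-test-artifacts" # illustrative
}

# Or pass the alias into a module:
module "network" {
  source = "./modules/network" # illustrative path
  providers = {
    aws = aws.test
  }
}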

Safe to close this issue now. Thanks!


@benabineri

benabineri commented Jul 4, 2023

Sorry, I didn't read all the comments before replying. #112 covers my requirements.
