S3 module not accepting given lifecycle rule #313

Open

alishah730 opened this issue Feb 13, 2025 · 2 comments

@alishah730

Description

Please provide a clear and concise description of the issue you are encountering, and a reproduction of your configuration (see the examples/* directory for references that you can copy+paste and tailor to match your configs if you are unable to copy your exact configuration). The reproduction MUST be executable by running terraform init && terraform apply without any further changes.

If your request is for a new feature, please use the Feature request template.

  • ✋ I have searched the open/closed issues and my issue is not listed.

⚠️ Note

Before you submit an issue, please perform the following first:

  1. Remove the local .terraform directory (ONLY if state is stored remotely, which is hopefully the best practice you are already following!): rm -rf .terraform/
  2. Re-initialize the project root to pull down modules: terraform init
  3. Re-attempt your terraform plan or apply and check if the issue still persists

Versions

  • Module version [Required]: version = "4.6.0"

  • Terraform version: Terraform v1.10.5

  • Provider version(s): provider registry.terraform.io/hashicorp/aws v5.86.1

Reproduction Code [Required]

terraform {
  required_version = ">= 1.0"

  required_providers {
    aws = {
      #checkov:skip=CKV_TF_1
      source  = "hashicorp/aws"
      version = "5.86.1"
    }
  }
}

# This provider is to deploy all regional resources like Lambda functions, VPCs etc.
provider "aws" {
  region = "us-east-1"
}

module "s3_provisioning_tfstate_bucket" {
  #checkov:skip=CKV_TF_1
  source  = "terraform-aws-modules/s3-bucket/aws"
  version = "4.6.0"
  bucket  = "ali-module-s3-test" # Namespace, AWS A/C Id & Region are added to the bucket name to make it unique
  acl     = "private"

  control_object_ownership = true
  object_ownership         = "ObjectWriter"
  force_destroy            = true

  block_public_acls       = true
  block_public_policy     = true
  ignore_public_acls      = true
  restrict_public_buckets = true

  versioning = {
    enabled = true
  }

  server_side_encryption_configuration = {
    rule = {
      apply_server_side_encryption_by_default = {
        # kms_master_key_id = module.kms.key_arn
        sse_algorithm = "AES256" #"aws:kms"
      }
    }
  }

  lifecycle_rule = [
    {
      id      = "provisionSfnExecutionLogs"
      enabled = true

      filter = {
        prefix = "provisionSfnExecutionLogs/"
      }

      expiration = {
        days                         = 7
        expired_object_delete_marker = true
      }

      noncurrent_version_expiration = {
        newer_noncurrent_versions = 1
        days                      = 7
      }
    }
  ]
}


Steps to reproduce the behavior:

terraform init
terraform plan
terraform apply

Expected behavior

It should create an S3 bucket as per the given definition.

Actual behavior

It gives an error in the S3 lifecycle rule:

Error: Provider produced inconsistent result after apply
│ 
│ When applying changes to module.s3_ali_state_bucket.aws_s3_bucket_lifecycle_configuration.this[0], provider "provider[\"registry.terraform.io/hashicorp/aws\"]" produced an unexpected new value:
│ .rule[0].expiration[0].expired_object_delete_marker: was cty.True, but now cty.False.
│ 
│ This is a bug in the provider, which should be reported in the provider's own issue tracker.
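
A likely cause, offered here as an assessment rather than a confirmed diagnosis: the S3 API does not accept ExpiredObjectDeleteMarker combined with Days (or Date) in the same Expiration action, and the provider appears to drop the flag on read-back, which would match the "was cty.True, but now cty.False" message above. A minimal sketch of the same rule without the conflict, assuming the delete-marker cleanup can be dropped or moved into a rule of its own:

lifecycle_rule = [
  {
    id      = "provisionSfnExecutionLogs"
    enabled = true

    filter = {
      prefix = "provisionSfnExecutionLogs/"
    }

    expiration = {
      # days and expired_object_delete_marker cannot share one expiration
      # block, so only the age-based expiry is kept here
      days = 7
    }

    noncurrent_version_expiration = {
      newer_noncurrent_versions = 1
      days                      = 7
    }
  }
]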

Terminal Output Screenshot(s)

After terraform apply:

[Screenshot of the error output]

Additional context

@sharovmerk

Same here.

This issue has been automatically marked as stale because it has been open 30 days
with no activity. Remove stale label or comment or this issue will be closed in 10 days

@github-actions github-actions bot added the stale label Mar 21, 2025