IaC-Library-Edify-1: Concepts/Principles files #2317

Merged · 6 commits · Jan 14, 2025
@@ -1,17 +1,17 @@
# Be Judicious With New Features
# Be Judicious with New Features

New OpenTofu features can streamline module authoring and provide more features, but may also require that consumers adopt newer OpenTofu versions. This requirement can pose challenges for organizations that cannot upgrade OpenTofu versions promptly, but want to keep using the latest version of our modules.
Introducing new OpenTofu features can enhance module development and expand functionality. However, adopting these features may require users to upgrade to newer OpenTofu versions. This requirement can create challenges for organizations that are unable to update OpenTofu promptly but still wish to utilize the latest module versions.

Modules in the Library often depend on each other. All dependent modules must be updated to require a newer OpenTofu version if a dependent module update requires a newer OpenTofu version.
Modules within the library often have interdependencies. If a module update introduces a dependency on a newer OpenTofu version, **all related modules must be updated** to reflect this version requirement.
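
For illustration, a module that adopts a feature from a newer OpenTofu release would typically declare that requirement with `required_version`, and the constraint then propagates to every module that depends on it. A minimal sketch (the version number is hypothetical):

```terraform
terraform {
  # Hypothetical constraint: this module uses a feature first available in
  # OpenTofu 1.8, so consumers (and dependent modules) must run 1.8 or newer.
  required_version = ">= 1.8.0"
}
```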

## How to Decide to Use Newer Features
## Guidelines for adopting new features

Consider the following when deciding whether to adopt a new feature:
Evaluate the following factors when determining whether to adopt a new feature:

### Age of Feature
### Feature stability and maturity

Older features are more likely to be compatible with existing consumer environments. While it is unnecessary to avoid newer features altogether, prioritizing well-established features ensures broader compatibility.
Well-established features typically offer greater compatibility with existing module-consumer environments. While incorporating newer features is not discouraged, prioritizing stable and widely adopted features helps maintain consistency and reliability.

### Impact on Module Consumers
### Impact on module users

Requiring upgrades is sometimes justified. For example, the [moved](https://opentofu.org/docs/v1.6/language/modules/develop/refactoring/#moved-block-syntax) block enables seamless upgrades even when resource addresses change. If upgrading provides greater benefits than manual interventions, adopting newer versions can be a practical choice.
Upgrading OpenTofu versions may sometimes be necessary. For instance, the [`moved`](https://opentofu.org/docs/v1.6/language/modules/develop/refactoring/#moved-block-syntax) block allows seamless upgrades even when resource addresses change. When the benefits of upgrading outweigh the effort of manual adjustments, adopting newer versions is recommended.
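
As an illustration of the `moved` block mentioned above, the following sketch shows how a module could rename a resource without forcing consumers to migrate state manually (the resource names are hypothetical):

```terraform
# Hypothetical rename: the resource address changed in a module update, and the
# moved block lets existing state follow the new address on the next apply.
moved {
  from = aws_s3_bucket.logs
  to   = aws_s3_bucket.access_logs
}
```
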
28 changes: 13 additions & 15 deletions docs/2.0/docs/library/concepts/principles/control-provider-usage.md
@@ -1,17 +1,17 @@
# Control Provider Usage

There are two main methods for controlling the provider used in OpenTofu/Terraform operations:
There are two primary methods for managing the provider used in OpenTofu/Terraform operations:

1. Setting required provider versions.
2. Committing the `.terraform.lock.hcl` file.
1. Specifying required provider versions.
2. Committing the `.terraform.lock.hcl` file to version control.

## Required provider versions

## Required Provider Versions
It is advisable to follow the [OpenTofu recommendations](https://opentofu.org/docs/language/providers/requirements/#best-practices-for-provider-versions) for specifying minimum provider versions for any providers used in modules developed as part of the library, ensuring compatibility between installed provider versions and the features in your modules.

Generally speaking, follow [OpenTofu recommendations](https://opentofu.org/docs/language/providers/requirements/#best-practices-for-provider-versions) regarding specifying the minimum provider version for any providers used by modules authored as part of the library.

This recommendation is useful guidance for ensuring that the feature set used as part of modules being authored for the library are available in provider versions end users install.

You can do that by using a `required_provider` configuration block like the following:
Specify these versions using a `required_providers` configuration block, as demonstrated below:

```terraform
terraform {
@@ -23,19 +23,17 @@ terraform {
}
}
```
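
Because the middle of the block above is collapsed in this diff, here is a minimal sketch of what a complete `required_providers` block generally looks like; the provider and version shown are illustrative, not taken from the module:

```terraform
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      # Minimum version only; no upper bound, per the guidance below.
      version = ">= 5.0.0"
    }
  }
}
```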
Following the guidance to avoid setting maximum provider versions is critical, particularly for modules that other modules depend on.

Note the guidance against setting maximum provider versions. This is especially important for modules that end up as dependencies of other modules.
Since end users install only one provider version, conflicting provider version constraints across module dependencies can make the modules unusable.

Because only one version of a provider is ultimately installed by end users, conflicting provider versions in module dependencies can result in modules being unusable.

An exception to the general rule of avoiding pinning maximum provider versions in modules is to prevent a module from pulling in breaking changes from a future version of a provider.
The exception to this rule is when setting a maximum provider version is necessary to prevent a module from inadvertently adopting breaking changes in future provider versions.
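
Where that exception applies, the pin is typically expressed as an upper bound on the version constraint, for example (versions are hypothetical):

```terraform
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      # Hypothetical upper bound: stay on 5.x until the module has been
      # verified against the breaking changes introduced in 6.0.
      version = ">= 5.0.0, < 6.0.0"
    }
  }
}
```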

## Committing `.terraform.lock.hcl` File

When running `tofu init` in a directory with `.tf` files, a [`.terraform.lock.hcl`](https://opentofu.org/docs/language/files/dependency-lock) file will be automatically generated if it doesn't exist.

This file shouldn't be committed in module repositories, but should be committed to repositories where those modules are referenced to provision live infrastructure.
When you run `tofu init` in a directory with `.tf` files, a [`.terraform.lock.hcl`](https://opentofu.org/docs/language/files/dependency-lock) file is automatically created if one does not already exist.

When using Terragrunt, note how [Terragrunt handles lock files](https://terragrunt.gruntwork.io/docs/features/lock-file-handling/).
This file should not be committed in module repositories. However, committing it in repositories used to provision live infrastructure is recommended.

Keep this behavior in mind when deciding how to handle `.terraform.lock.hcl` files in modules.
For Terragrunt users, review [Terragrunt’s approach to lock file management](https://terragrunt.gruntwork.io/docs/features/lock-file-handling/) to determine the best way to handle `.terraform.lock.hcl` files in modules.
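
For context, a generated `.terraform.lock.hcl` records the exact provider versions and checksums selected during `tofu init`. A trimmed, hypothetical entry might look like this:

```hcl
# Hypothetical lock file entry; versions, constraints, and hashes are generated
# by `tofu init` and should not be edited by hand.
provider "registry.opentofu.org/hashicorp/aws" {
  version     = "5.31.0"
  constraints = ">= 5.0.0"
  hashes = [
    "h1:exampleexampleexampleexampleexampleexampleex=",
  ]
}
```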
20 changes: 11 additions & 9 deletions docs/2.0/docs/library/concepts/principles/overview.md
@@ -2,21 +2,23 @@ import DocCardList from '@theme/DocCardList';

# Overview

Authoring a large library of modules like this requires that certain principles be decided and documented for iteration.
Developing and maintaining a comprehensive library of modules requires clearly defined and documented principles to support ongoing iteration and improvement.

These principles help guide decision making when multiple valid approaches exist for tackling a problem.
These principles serve as a framework for decision-making when multiple valid solutions are available for addressing a problem.

## High Level Principles
## High-level principles

These are some documented high level principles that should be followed in order to encourage continuous increase in the quality of modules in the library.
The following high-level principles are intended to continuously improve the quality and reliability of the modules within this library.

### If Isn't Tested It's Broken
### If it isn't tested, it's broken

The [Terratest](https://github.com/gruntwork-io/terratest) testing library was created in order to provide a way to efficiently test IAC modules in an easy way.
All modules in the library should have tests associated with them, and whenever possible, should have tests that test all the most important behaviors.
The [Terratest](https://github.com/gruntwork-io/terratest) testing library was developed to provide an efficient and effective method for testing Infrastructure as Code (IaC) modules.

It can be expensive to maintain this many tests, but it is more expensive not to, when dealing with surface area this large.
All modules in this library **must** have associated tests. Wherever feasible, these tests should cover all critical functionalities and behaviors.

While maintaining a comprehensive suite of tests can be resource-intensive, neglecting testing introduces significantly higher risks and costs due to the library's extensive surface area.

As a best practice, it is **more important that a module has basic testing coverage** than that it is tested exhaustively.

As a matter of general practice, it is more important that a module is tested at all than that it is tested comprehensively

<DocCardList />
51 changes: 21 additions & 30 deletions docs/2.0/docs/library/concepts/principles/quality-in-depth.md
@@ -1,51 +1,42 @@
# Quality in Depth

Similar to the notion of [Defense in Depth](https://en.wikipedia.org/wiki/Defense_in_depth_(computing)), quality in depth is a concept relating to how multiple layers of quality assurance can be used to ensure that modules continuously improve in quality over time.
Inspired by the concept of [Defense in Depth](https://en.wikipedia.org/wiki/Defense_in_depth_(computing)), quality in depth refers to implementing multiple layers of quality assurance to ensure continuous improvement in the quality of modules.

## Quality Checks We Use
## Quality checks we use

These are some of the standard quality checks that are frequently used to ensure that modules are of high quality.
These are the standard quality checks we employ to maintain high-quality modules.

### Automated Testing
### Automated testing

The most important tool in our arsenal for ensuring high quality modules is automated testing. Nothing will catch more bugs than actually attempting to provision infrastructure using a module, verifying that the infrastructure is provisioned as expected, and then tearing it down.
Automated testing is the most effective method for ensuring the quality of modules. It involves provisioning infrastructure using a module, verifying it works as expected, and tearing it down.

To support this, we use the [Terratest](https://github.com/gruntwork-io/terratest) library, which is an open source Go library maintained by Gruntwork that makes it easier to automate this process.
We rely on [Terratest](https://github.com/gruntwork-io/terratest), an open-source Go library maintained by Gruntwork, to facilitate this process. Terratest enables local testing and CI pipeline tests against live cloud environments to verify that all library modules function as intended.

These tests can be run locally, and are run in CI pipelines against live cloud environments to ensure that every module in the library works as expected.
### Pre-commit hooks

### Pre-commit Hooks
Pre-commit hooks enable module authors to identify and address issues early in the development process. They are also enforced during CI runs to maintain compliance and consistency.

Prior to committing any software, module authors leverage a suite of pre-commit hooks to ensure that quality is introduced as early as possible. These hooks are run again in CI to ensure that authors did in-fact run them locally.
For details on available hooks and repository-specific configurations, refer to the documentation in the [pre-commit repository](https://github.com/gruntwork-io/pre-commit?tab=readme-ov-file#pre-commit-hooks).

For a list of hooks available for authors, and for information on the hooks leveraged for a particular repository in the library, see the documentation in the [pre-commit](https://github.com/gruntwork-io/pre-commit?tab=readme-ov-file#pre-commit-hooks) repository.
### Security scanning

### Security Scanning
While ensuring security often involves good practices and sound judgment, static analysis tools can identify potential module vulnerabilities.

While making modules secure is often a practice of exercising good judgement and following best practices, there are some tools that can help identify security vulnerabilities in modules through static analysis.
- [Terrascan](https://github.com/tenable/terrascan) is used in CI pipelines to detect vulnerabilities through static analysis.
- [Steampipe](https://github.com/turbot/steampipe) performs live test cloud environment scans to detect security risks not captured by static analysis. These live test scans validate CIS compliance of modules like the [cis-service-catalog](https://github.com/gruntwork-io/terraform-aws-cis-service-catalog).

The tool used to achieve this in CI is [Terrascan](https://github.com/tenable/terrascan). This is an open source tool that is run continuously to ensure that our modules do not have any security vulnerabilities that are easily detectable.
### Automated documentation generation

In addition to static analysis, we also use the [Steampipe](https://github.com/turbot/steampipe) tool to scan live test cloud environments for security vulnerabilities that might not be easily detectable through static analysis. This tool is particularly useful in ensuring the CIS compliance of the [cis-service-catalog](https://github.com/gruntwork-io/terraform-aws-cis-service-catalog) provided to automate the provisioning of CIS compliant AWS accounts.
While well-written, human-authored documentation best captures the intent and technical details of a module, automated documentation generation helps keep reference material accurate and up to date. Gruntwork uses custom tooling to supplement manually written documentation with automatically generated details, available in the [Library Reference](/library/reference).

### Automated Documentation Generation
## Quality checks we don't use

Generally, the best documentation is that which is written by a person and accurately conveys not only the technical details of a module, but also the intent behind the module. However, it is also useful to have automated documentation generation to ensure that the documentation is always up to date.
Not every quality check is practical or valuable enough to justify its implementation. Each quality check incurs a cost, and we aim to maintain a high signal-to-noise ratio by using the most impactful methods within available resources.

To achieve that goal, some custom tooling is used to automatically generate documentation for modules in the library in the context of the manually written documentation to express intent. You can see this documentation by navigating to the [Library Reference](/library/reference)
### Infrastructure cost

## Quality Checks We Don't Use
Tools like [Infracost](https://github.com/infracost/infracost) can help assess the cost of live infrastructure. However, it is less relevant to a module library and is not currently part of our quality checks.

It is impossible to use every conceivable quality check, and some quality checks are more valuable than others. Every quality check has a cost associated with it, and it is important to be judicious in the quality checks that are used to ensure a high signal to noise ratio. That being said, there is also limited time and resources to implement new quality checks on the entire library.
### Further exploration

Gruntwork strives to use the most valuable quality checks that result in high quality modules with the resources available to us.

These are some quality checks that we are aware of, but don't use. They may not be in use at the moment because they are not valuable enough to justify the cost of implementing them, or because they are not a good fit for the library. Regardless, they are referenced here so that they can be considered in the future, and so that you can evaluate whether they are a good fit for your own modules.

### Infrastructure Cost

A useful tool for evaluating the cost of infrastructure is [Infracost](https://github.com/infracost/infracost). This is a useful tool, but more practically useful for live infrastructure than a library of modules.

### More to be Discovered

Quality checks that are _not_ used are harder to think of than those that _are_ used. If you have any suggestions for quality checks that we should consider, please let us know by sending a pull request to this document.
Identifying quality checks we have not yet adopted is inherently harder than cataloging the ones we use. If you have suggestions for additional quality checks we should consider, please contribute by submitting a pull request to this document.