diff --git a/README.md b/README.md index 5671233..2ea5bc2 100644 --- a/README.md +++ b/README.md @@ -94,7 +94,7 @@ provider "conduktor" { } # register an external user bob with PLATFORM.userView permission -resource "conduktor_user_v2" "bob" { +resource "conduktor_console_user_v2" "bob" { name = "bob@mycompany.io" spec { firstname = "Bob" @@ -109,7 +109,7 @@ resource "conduktor_user_v2" "bob" { } # create a group with Bob as a member -resource "conduktor_group_v2" "qa" { +resource "conduktor_console_group_v2" "qa" { name = "qa" spec { display_name = "QA team" @@ -153,10 +153,10 @@ Using environment variables `CDK_API_TOKEN` or `CDK_API_KEY`. Use local user (usually admin) credentials pair. This will login against the API and use an ephemeral access token to make API calls. -Using HCL `admin_email`/`admin_password` attributes ```hcl provider "conduktor" { - admin_email = "admin@my-org.com" + admin_user = "admin@my-org.com" admin_password = "admin-password" } ``` diff --git a/docs/index.md b/docs/index.md index 6275d4f..aa5264f 100644 --- a/docs/index.md +++ b/docs/index.md @@ -109,7 +109,7 @@ resource "conduktor_gateway_service_account_v2" "gateway_sa" { ### Optional -- `admin_password` (String, Sensitive) The password of the admin user. May be set using environment variable `CDK_CONSOLE_PASSWORD` or `CDK_ADMIN_PASSWORD` for Console, `CDK_GATEWAY_PASSWORD` or `CDK_ADMIN_PASSWORD` for Gateway. Required if admin_email is set. If not provided, the API token will be used to authenticater. +- `admin_password` (String, Sensitive) The password of the admin user. May be set using environment variable `CDK_CONSOLE_PASSWORD` or `CDK_ADMIN_PASSWORD` for Console, `CDK_GATEWAY_PASSWORD` or `CDK_ADMIN_PASSWORD` for Gateway. Required if admin_user is set. If not provided, the API token will be used to authenticate. - `admin_user` (String) The login credentials of the admin user. May be set using environment variable `CDK_CONSOLE_USER`, `CDK_ADMIN_EMAIL` or `CDK_ADMIN_USER` for Console, `CDK_GATEWAY_USER` or `CDK_ADMIN_USER` for Gateway. Required if admin_password is set. If not provided and `mode` is Console, the API token will be used to authenticate. - `api_token` (String, Sensitive) The API token to authenticate with the Conduktor Console API. May be set using environment variable `CDK_API_TOKEN` or `CDK_API_KEY`. If not provided, admin_user and admin_password will be used to authenticate. See [documentation](https://docs.conduktor.io/platform/reference/api-reference/#generate-an-api-key) for more information. Not used if `mode` is Gateway. - `base_url` (String) The URL of either Conduktor Console or Gateway, depending on the `mode`. May be set using environment variable `CDK_CONSOLE_BASE_URL` or `CDK_BASE_URL` for Console, `CDK_GATEWAY_BASE_URL` or `CDK_BASE_URL` for Gateway. Required either here or in the environment. diff --git a/docs/resources/group_v2.md b/docs/resources/console_group_v2.md similarity index 91% rename from docs/resources/group_v2.md rename to docs/resources/console_group_v2.md index 15ff0ab..5ba2594 100644 --- a/docs/resources/group_v2.md +++ b/docs/resources/console_group_v2.md @@ -1,12 +1,12 @@ --- -page_title: "Conduktor : conduktor_group_v2 " +page_title: "Conduktor : conduktor_console_group_v2 " subcategory: "iam/v2" description: |- Resource for managing Conduktor groups. This resource allows you to create, read, update and delete groups in Conduktor.
--- -# conduktor_group_v2 +# conduktor_console_group_v2 Resource for managing Conduktor groups. This resource allows you to create, read, update and delete groups in Conduktor. @@ -15,7 +15,7 @@ This resource allows you to create, read, update and delete groups in Conduktor. ### Simple group without members or permissions ```terraform -resource "conduktor_group_v2" "example" { +resource "conduktor_console_group_v2" "example" { name = "simple-group" spec { display_name = "Simple Group" @@ -26,7 +26,7 @@ resource "conduktor_group_v2" "example" { ### Complex group with members, external reference and permissions ```terraform -resource "conduktor_user_v2" "user1" { +resource "conduktor_console_user_v2" "user1" { name = "user1@company.com" spec { firstname = "User" @@ -35,13 +35,13 @@ resource "conduktor_user_v2" "user1" { } } -resource "conduktor_group_v2" "example" { +resource "conduktor_console_group_v2" "example" { name = "complex-group" spec { display_name = "Complex group" description = "Complex group description" external_groups = ["sso-group1"] - members = [conduktor_user_v2.user1.name] + members = [conduktor_console_user_v2.user1.name] permissions = [ { resource_type = "PLATFORM" diff --git a/docs/resources/kafka_cluster_v2.md b/docs/resources/console_kafka_cluster_v2.md similarity index 96% rename from docs/resources/kafka_cluster_v2.md rename to docs/resources/console_kafka_cluster_v2.md index 1822e7b..daf9776 100644 --- a/docs/resources/kafka_cluster_v2.md +++ b/docs/resources/console_kafka_cluster_v2.md @@ -1,12 +1,12 @@ --- -page_title: "Conduktor : conduktor_kafka_cluster_v2 " +page_title: "Conduktor : conduktor_console_kafka_cluster_v2 " subcategory: "console/v2" description: |- Resource for managing Conduktor Kafka cluster definition with optional Schema registry. This resource allows you to create, read, update and delete Kafka cluster and Schema registry definitions in Conduktor. --- -# conduktor_kafka_cluster_v2 +# conduktor_console_kafka_cluster_v2 Resource for managing Conduktor Kafka cluster and Schema registry definitions. This resource allows you to create, read, update and delete Kafka clusters and Schema registry definitions in Conduktor. @@ -16,7 +16,7 @@ This resource allows you to create, read, update and delete Kafka clusters and S ### Simple Kafka cluster without Schema registry This example creates a simple Kafka cluster definition without authentication resource and without Schema Registry. ```terraform -resource "conduktor_kafka_cluster_v2" "simple" { +resource "conduktor_console_kafka_cluster_v2" "simple" { name = "simple-cluster" spec { display_name = "Simple kafka Cluster" @@ -32,7 +32,7 @@ resource "conduktor_kafka_cluster_v2" "simple" { This example creates a Confluent Kafka cluster and Schema Registry definition resource. The Schema Registry authentication uses mTLS. ```terraform -resource "conduktor_kafka_cluster_v2" "confluent" { +resource "conduktor_console_kafka_cluster_v2" "confluent" { name = "confluent-cluster" labels = { "env" = "staging" @@ -84,7 +84,7 @@ EOT This example creates an Aiven Kafka cluster and Schema Registry definition resource. The Schema Registry authentication uses basic auth. 
```terraform -resource "conduktor_kafka_cluster_v2" "aiven" { +resource "conduktor_console_kafka_cluster_v2" "aiven" { name = "aiven-cluster" labels = { "env" = "test" @@ -122,7 +122,7 @@ resource "conduktor_kafka_cluster_v2" "aiven" { ### AWS MSK with Glue Schema registry This example creates an AWS MSK Kafka Cluster and a Glue Schema Registry definition resource. ```terraform -resource "conduktor_kafka_cluster_v2" "aws_msk" { +resource "conduktor_console_kafka_cluster_v2" "aws_msk" { name = "aws-cluster" labels = { "env" = "prod" @@ -157,7 +157,7 @@ resource "conduktor_kafka_cluster_v2" "aws_msk" { This example creates a Conduktor Gateway Kafka Cluster and Schema Registry definition resource. The Schema Registry authentication uses a bearer token. ```terraform -resource "conduktor_kafka_cluster_v2" "gateway" { +resource "conduktor_console_kafka_cluster_v2" "gateway" { name = "gateway-cluster" labels = { "env" = "prod" diff --git a/docs/resources/kafka_connect_v2.md b/docs/resources/console_kafka_connect_v2.md similarity index 86% rename from docs/resources/kafka_connect_v2.md rename to docs/resources/console_kafka_connect_v2.md index 77612da..e37809b 100644 --- a/docs/resources/kafka_connect_v2.md +++ b/docs/resources/console_kafka_connect_v2.md @@ -1,12 +1,12 @@ --- -page_title: "Conduktor : conduktor_kafka_connect_v2 " +page_title: "Conduktor : conduktor_console_kafka_connect_v2 " subcategory: "console/v2" description: |- Resource for managing Conduktor Kafka Connect servers definition linked to an existing Kafka cluster definition inside Conduktor Console. This resource allows you to create, read, update and delete Kafka Connect servers connections from Conduktor Console. --- -# conduktor_kafka_connect_v2 +# conduktor_console_kafka_connect_v2 Resource for managing Conduktor Kafka Connect servers definition linked to an existing Kafka cluster definition inside Conduktor Console. This resource allows you to create, read, update and delete Kafka Connect servers connections from Conduktor Console. @@ -16,7 +16,7 @@ This resource allows you to create, read, update and delete Kafka Connect server ### Simple Kafka Connect server This example creates a simple Kafka Connect server connection without any authentication. ```terraform -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" @@ -24,9 +24,9 @@ resource "conduktor_kafka_cluster_v2" "minimal" { } } -resource "conduktor_kafka_connect_v2" "simple" { +resource "conduktor_console_kafka_connect_v2" "simple" { name = "simple-connect" - cluster = conduktor_kafka_cluster_v2.minimal.name + cluster = conduktor_console_kafka_cluster_v2.minimal.name spec { display_name = "Simple Connect Server" urls = "http://localhost:8083" @@ -37,7 +37,7 @@ resource "conduktor_kafka_connect_v2" "simple" { ### Basic Kafka Connect server This example creates a complex Kafka Connect server connection with basic authentication. 
```terraform -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" @@ -45,9 +45,9 @@ resource "conduktor_kafka_cluster_v2" "minimal" { } } -resource "conduktor_kafka_connect_v2" "basic" { +resource "conduktor_console_kafka_connect_v2" "basic" { name = "basic-connect" - cluster = conduktor_kafka_cluster_v2.minimal.name + cluster = conduktor_console_kafka_cluster_v2.minimal.name labels = { description = "This is a complex connect using basic authentication" documentation = "https://docs.mycompany.com/complex-connect" @@ -73,7 +73,7 @@ resource "conduktor_kafka_connect_v2" "basic" { ### Bearer token Kafka Connect server This example creates a complex Kafka Connect server connection with bearer token authentication. ```terraform -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" @@ -81,9 +81,9 @@ resource "conduktor_kafka_cluster_v2" "minimal" { } } -resource "conduktor_kafka_connect_v2" "bearer" { +resource "conduktor_console_kafka_connect_v2" "bearer" { name = "bearer-connect" - cluster = conduktor_kafka_cluster_v2.minimal.name + cluster = conduktor_console_kafka_cluster_v2.minimal.name labels = { description = "This is a complex connect using bearer token authentication" documentation = "https://docs.mycompany.com/complex-connect" @@ -108,7 +108,7 @@ resource "conduktor_kafka_connect_v2" "bearer" { ### mTLS Kafka Connect server This example creates a complex Kafka Connect server connection with mTLS authentication. ```terraform -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" @@ -116,9 +116,9 @@ resource "conduktor_kafka_cluster_v2" "minimal" { } } -resource "conduktor_kafka_connect_v2" "mtls" { +resource "conduktor_console_kafka_connect_v2" "mtls" { name = "mtls-connect" - cluster = conduktor_kafka_cluster_v2.minimal.name + cluster = conduktor_console_kafka_cluster_v2.minimal.name labels = { description = "This is a complex connect using mTLS authentication" documentation = "https://docs.mycompany.com/complex-connect" @@ -210,12 +210,12 @@ The import ID is constructed as follows: `< cluster_id >/< connect_id >`. For example, using an [`import` block](https://developer.hashicorp.com/terraform/language/import) : ```terraform import { - to = conduktor_kafka_connect_v2.example + to = conduktor_console_kafka_connect_v2.example id = "mini-cluster/import-connect" # Import "import-connect" Connect server for "mini-cluster" Kafka cluster } ``` Using the `terraform import` command: ```shell -terraform import conduktor_kafka_connect_v2.example mini-cluster/import-connect +terraform import conduktor_console_kafka_connect_v2.example mini-cluster/import-connect ``` diff --git a/docs/resources/user_v2.md b/docs/resources/console_user_v2.md similarity index 93% rename from docs/resources/user_v2.md rename to docs/resources/console_user_v2.md index 523e120..5105a82 100644 --- a/docs/resources/user_v2.md +++ b/docs/resources/console_user_v2.md @@ -1,12 +1,12 @@ --- -page_title: "Conduktor : conduktor_user_v2 " +page_title: "Conduktor : conduktor_console_user_v2 " subcategory: "iam/v2" description: |- Resource for managing Conduktor users. This resource allows you to create, read, update and delete users in Conduktor. 
--- -# conduktor_user_v2 +# conduktor_console_user_v2 Resource for managing Conduktor users. This resource allows you to create, read, update and delete users in Conduktor. @@ -15,7 +15,7 @@ This resource allows you to create, read, update and delete users in Conduktor. ### Simple user without permissions ```terraform -resource "conduktor_user_v2" "example" { +resource "conduktor_console_user_v2" "example" { name = "bob@company.io" spec { firstname = "Bob" @@ -26,7 +26,7 @@ resource "conduktor_user_v2" "example" { ### Complex user with permissions ```terraform -resource "conduktor_user_v2" "example" { +resource "conduktor_console_user_v2" "example" { name = "bob@company.io" spec { firstname = "Bob" diff --git a/examples/resources/conduktor_group_v2/complex.tf b/examples/resources/conduktor_console_group_v2/complex.tf similarity index 80% rename from examples/resources/conduktor_group_v2/complex.tf rename to examples/resources/conduktor_console_group_v2/complex.tf index 40c2a96..1d88581 100644 --- a/examples/resources/conduktor_group_v2/complex.tf +++ b/examples/resources/conduktor_console_group_v2/complex.tf @@ -1,4 +1,4 @@ -resource "conduktor_user_v2" "user1" { +resource "conduktor_console_user_v2" "user1" { name = "user1@company.com" spec { firstname = "User" @@ -7,13 +7,13 @@ resource "conduktor_user_v2" "user1" { } } -resource "conduktor_group_v2" "example" { +resource "conduktor_console_group_v2" "example" { name = "complex-group" spec { display_name = "Complex group" description = "Complex group description" external_groups = ["sso-group1"] - members = [conduktor_user_v2.user1.name] + members = [conduktor_console_user_v2.user1.name] permissions = [ { resource_type = "PLATFORM" diff --git a/examples/resources/conduktor_group_v2/simple.tf b/examples/resources/conduktor_console_group_v2/simple.tf similarity index 70% rename from examples/resources/conduktor_group_v2/simple.tf rename to examples/resources/conduktor_console_group_v2/simple.tf index 478ed39..db85c5b 100644 --- a/examples/resources/conduktor_group_v2/simple.tf +++ b/examples/resources/conduktor_console_group_v2/simple.tf @@ -1,4 +1,4 @@ -resource "conduktor_group_v2" "example" { +resource "conduktor_console_group_v2" "example" { name = "simple-group" spec { display_name = "Simple Group" diff --git a/examples/resources/conduktor_kafka_cluster_v2/aiven.tf b/examples/resources/conduktor_console_kafka_cluster_v2/aiven.tf similarity index 94% rename from examples/resources/conduktor_kafka_cluster_v2/aiven.tf rename to examples/resources/conduktor_console_kafka_cluster_v2/aiven.tf index 5dc9308..253bd5a 100644 --- a/examples/resources/conduktor_kafka_cluster_v2/aiven.tf +++ b/examples/resources/conduktor_console_kafka_cluster_v2/aiven.tf @@ -1,4 +1,4 @@ -resource "conduktor_kafka_cluster_v2" "aiven" { +resource "conduktor_console_kafka_cluster_v2" "aiven" { name = "aiven-cluster" labels = { "env" = "test" diff --git a/examples/resources/conduktor_kafka_cluster_v2/aws_msk.tf b/examples/resources/conduktor_console_kafka_cluster_v2/aws_msk.tf similarity index 95% rename from examples/resources/conduktor_kafka_cluster_v2/aws_msk.tf rename to examples/resources/conduktor_console_kafka_cluster_v2/aws_msk.tf index b2fd81f..6cbe359 100644 --- a/examples/resources/conduktor_kafka_cluster_v2/aws_msk.tf +++ b/examples/resources/conduktor_console_kafka_cluster_v2/aws_msk.tf @@ -1,4 +1,4 @@ -resource "conduktor_kafka_cluster_v2" "aws_msk" { +resource "conduktor_console_kafka_cluster_v2" "aws_msk" { name = "aws-cluster" labels = { "env" = 
"prod" diff --git a/examples/resources/conduktor_kafka_cluster_v2/confluent.tf b/examples/resources/conduktor_console_kafka_cluster_v2/confluent.tf similarity index 96% rename from examples/resources/conduktor_kafka_cluster_v2/confluent.tf rename to examples/resources/conduktor_console_kafka_cluster_v2/confluent.tf index fa7ccd2..801e281 100644 --- a/examples/resources/conduktor_kafka_cluster_v2/confluent.tf +++ b/examples/resources/conduktor_console_kafka_cluster_v2/confluent.tf @@ -1,4 +1,4 @@ -resource "conduktor_kafka_cluster_v2" "confluent" { +resource "conduktor_console_kafka_cluster_v2" "confluent" { name = "confluent-cluster" labels = { "env" = "staging" diff --git a/examples/resources/conduktor_kafka_cluster_v2/gateway.tf b/examples/resources/conduktor_console_kafka_cluster_v2/gateway.tf similarity index 94% rename from examples/resources/conduktor_kafka_cluster_v2/gateway.tf rename to examples/resources/conduktor_console_kafka_cluster_v2/gateway.tf index 4e95166..de4580e 100644 --- a/examples/resources/conduktor_kafka_cluster_v2/gateway.tf +++ b/examples/resources/conduktor_console_kafka_cluster_v2/gateway.tf @@ -1,4 +1,4 @@ -resource "conduktor_kafka_cluster_v2" "gateway" { +resource "conduktor_console_kafka_cluster_v2" "gateway" { name = "gateway-cluster" labels = { "env" = "prod" diff --git a/examples/resources/conduktor_kafka_cluster_v2/simple.tf b/examples/resources/conduktor_console_kafka_cluster_v2/simple.tf similarity index 83% rename from examples/resources/conduktor_kafka_cluster_v2/simple.tf rename to examples/resources/conduktor_console_kafka_cluster_v2/simple.tf index 21608a0..dc1f025 100644 --- a/examples/resources/conduktor_kafka_cluster_v2/simple.tf +++ b/examples/resources/conduktor_console_kafka_cluster_v2/simple.tf @@ -1,4 +1,4 @@ -resource "conduktor_kafka_cluster_v2" "simple" { +resource "conduktor_console_kafka_cluster_v2" "simple" { name = "simple-cluster" spec { display_name = "Simple kafka Cluster" diff --git a/examples/resources/conduktor_kafka_connect_v2/basicAuth.tf b/examples/resources/conduktor_console_kafka_connect_v2/basicAuth.tf similarity index 79% rename from examples/resources/conduktor_kafka_connect_v2/basicAuth.tf rename to examples/resources/conduktor_console_kafka_connect_v2/basicAuth.tf index b0ed008..97cff67 100644 --- a/examples/resources/conduktor_kafka_connect_v2/basicAuth.tf +++ b/examples/resources/conduktor_console_kafka_connect_v2/basicAuth.tf @@ -1,4 +1,4 @@ -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" @@ -6,9 +6,9 @@ resource "conduktor_kafka_cluster_v2" "minimal" { } } -resource "conduktor_kafka_connect_v2" "basic" { +resource "conduktor_console_kafka_connect_v2" "basic" { name = "basic-connect" - cluster = conduktor_kafka_cluster_v2.minimal.name + cluster = conduktor_console_kafka_cluster_v2.minimal.name labels = { description = "This is a complex connect using basic authentication" documentation = "https://docs.mycompany.com/complex-connect" diff --git a/examples/resources/conduktor_kafka_connect_v2/bearerToken.tf b/examples/resources/conduktor_console_kafka_connect_v2/bearerToken.tf similarity index 78% rename from examples/resources/conduktor_kafka_connect_v2/bearerToken.tf rename to examples/resources/conduktor_console_kafka_connect_v2/bearerToken.tf index d194662..769e7a1 100644 --- a/examples/resources/conduktor_kafka_connect_v2/bearerToken.tf +++ 
b/examples/resources/conduktor_console_kafka_connect_v2/bearerToken.tf @@ -1,4 +1,4 @@ -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" @@ -6,9 +6,9 @@ resource "conduktor_kafka_cluster_v2" "minimal" { } } -resource "conduktor_kafka_connect_v2" "bearer" { +resource "conduktor_console_kafka_connect_v2" "bearer" { name = "bearer-connect" - cluster = conduktor_kafka_cluster_v2.minimal.name + cluster = conduktor_console_kafka_cluster_v2.minimal.name labels = { description = "This is a complex connect using bearer token authentication" documentation = "https://docs.mycompany.com/complex-connect" diff --git a/examples/resources/conduktor_kafka_connect_v2/import.tf b/examples/resources/conduktor_console_kafka_connect_v2/import.tf similarity index 70% rename from examples/resources/conduktor_kafka_connect_v2/import.tf rename to examples/resources/conduktor_console_kafka_connect_v2/import.tf index 5ae2d8a..5e5f389 100644 --- a/examples/resources/conduktor_kafka_connect_v2/import.tf +++ b/examples/resources/conduktor_console_kafka_connect_v2/import.tf @@ -1,4 +1,4 @@ import { - to = conduktor_kafka_connect_v2.example + to = conduktor_console_kafka_connect_v2.example id = "mini-cluster/import-connect" # Import "import-connect" Connect server for "mini-cluster" Kafka cluster } diff --git a/examples/resources/conduktor_kafka_connect_v2/mtls.tf b/examples/resources/conduktor_console_kafka_connect_v2/mtls.tf similarity index 85% rename from examples/resources/conduktor_kafka_connect_v2/mtls.tf rename to examples/resources/conduktor_console_kafka_connect_v2/mtls.tf index 53961b6..d6f2e8b 100644 --- a/examples/resources/conduktor_kafka_connect_v2/mtls.tf +++ b/examples/resources/conduktor_console_kafka_connect_v2/mtls.tf @@ -1,4 +1,4 @@ -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" @@ -6,9 +6,9 @@ resource "conduktor_kafka_cluster_v2" "minimal" { } } -resource "conduktor_kafka_connect_v2" "mtls" { +resource "conduktor_console_kafka_connect_v2" "mtls" { name = "mtls-connect" - cluster = conduktor_kafka_cluster_v2.minimal.name + cluster = conduktor_console_kafka_cluster_v2.minimal.name labels = { description = "This is a complex connect using mTLS authentication" documentation = "https://docs.mycompany.com/complex-connect" diff --git a/examples/resources/conduktor_kafka_connect_v2/simple.tf b/examples/resources/conduktor_console_kafka_connect_v2/simple.tf similarity index 59% rename from examples/resources/conduktor_kafka_connect_v2/simple.tf rename to examples/resources/conduktor_console_kafka_connect_v2/simple.tf index da637f2..bb15f69 100644 --- a/examples/resources/conduktor_kafka_connect_v2/simple.tf +++ b/examples/resources/conduktor_console_kafka_connect_v2/simple.tf @@ -1,4 +1,4 @@ -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" @@ -6,9 +6,9 @@ resource "conduktor_kafka_cluster_v2" "minimal" { } } -resource "conduktor_kafka_connect_v2" "simple" { +resource "conduktor_console_kafka_connect_v2" "simple" { name = "simple-connect" - cluster = conduktor_kafka_cluster_v2.minimal.name + cluster = conduktor_console_kafka_cluster_v2.minimal.name spec { display_name = "Simple Connect Server" urls = "http://localhost:8083" diff --git 
a/examples/resources/conduktor_user_v2/complex.tf b/examples/resources/conduktor_console_user_v2/complex.tf similarity index 90% rename from examples/resources/conduktor_user_v2/complex.tf rename to examples/resources/conduktor_console_user_v2/complex.tf index bf3d17b..26c9746 100644 --- a/examples/resources/conduktor_user_v2/complex.tf +++ b/examples/resources/conduktor_console_user_v2/complex.tf @@ -1,4 +1,4 @@ -resource "conduktor_user_v2" "example" { +resource "conduktor_console_user_v2" "example" { name = "bob@company.io" spec { firstname = "Bob" diff --git a/examples/resources/conduktor_user_v2/simple.tf b/examples/resources/conduktor_console_user_v2/simple.tf similarity index 63% rename from examples/resources/conduktor_user_v2/simple.tf rename to examples/resources/conduktor_console_user_v2/simple.tf index 7080927..5cfe3d7 100644 --- a/examples/resources/conduktor_user_v2/simple.tf +++ b/examples/resources/conduktor_console_user_v2/simple.tf @@ -1,4 +1,4 @@ -resource "conduktor_user_v2" "example" { +resource "conduktor_console_user_v2" "example" { name = "bob@company.io" spec { firstname = "Bob" diff --git a/internal/mapper/group_v2/group_v2_resource_mapper.go b/internal/mapper/console_group_v2/console_group_v2_resource_mapper.go similarity index 66% rename from internal/mapper/group_v2/group_v2_resource_mapper.go rename to internal/mapper/console_group_v2/console_group_v2_resource_mapper.go index 42bf13b..41b3d95 100644 --- a/internal/mapper/group_v2/group_v2_resource_mapper.go +++ b/internal/mapper/console_group_v2/console_group_v2_resource_mapper.go @@ -1,44 +1,44 @@ -package group_v2 +package console_group_v2 import ( "context" mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper" - "github.com/conduktor/terraform-provider-conduktor/internal/model" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" schema "github.com/conduktor/terraform-provider-conduktor/internal/schema" - groups "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_group_v2" + groups "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_group_v2" "github.com/hashicorp/terraform-plugin-framework/attr" "github.com/hashicorp/terraform-plugin-framework/types" "github.com/hashicorp/terraform-plugin-framework/types/basetypes" ) -func TFToInternalModel(ctx context.Context, r *groups.GroupV2Model) (model.GroupConsoleResource, error) { +func TFToInternalModel(ctx context.Context, r *groups.ConsoleGroupV2Model) (console.GroupConsoleResource, error) { // ExternalGroups externalGroups, diag := schema.SetValueToStringArray(ctx, r.Spec.ExternalGroups) if diag.HasError() { - return model.GroupConsoleResource{}, mapper.WrapDiagError(diag, "externalGroups", mapper.FromTerraform) + return console.GroupConsoleResource{}, mapper.WrapDiagError(diag, "externalGroups", mapper.FromTerraform) } // Members members, diag := schema.SetValueToStringArray(ctx, r.Spec.Members) if diag.HasError() { - return model.GroupConsoleResource{}, mapper.WrapDiagError(diag, "members", mapper.FromTerraform) + return console.GroupConsoleResource{}, mapper.WrapDiagError(diag, "members", mapper.FromTerraform) } membersFromExternalGroups, diag := schema.SetValueToStringArray(ctx, r.Spec.MembersFromExternalGroups) if diag.HasError() { - return model.GroupConsoleResource{}, mapper.WrapDiagError(diag, "membersFromExternalGroups", mapper.FromTerraform) + return console.GroupConsoleResource{}, mapper.WrapDiagError(diag, "membersFromExternalGroups", 
mapper.FromTerraform) } // Permissions permissions, err := schema.SetValueToPermissionArray(ctx, schema.GROUPS, r.Spec.Permissions) if err != nil { - return model.GroupConsoleResource{}, err + return console.GroupConsoleResource{}, err } - return model.NewGroupConsoleResource( + return console.NewGroupConsoleResource( r.Name.ValueString(), - model.GroupConsoleSpec{ + console.GroupConsoleSpec{ DisplayName: r.Spec.DisplayName.ValueString(), Description: r.Spec.Description.ValueString(), ExternalGroups: externalGroups, @@ -49,25 +49,25 @@ func TFToInternalModel(ctx context.Context, r *groups.GroupV2Model) (model.Group ), nil } -func InternalModelToTerraform(ctx context.Context, r *model.GroupConsoleResource) (groups.GroupV2Model, error) { +func InternalModelToTerraform(ctx context.Context, r *console.GroupConsoleResource) (groups.ConsoleGroupV2Model, error) { externalGroupsList, diag := schema.StringArrayToSetValue(r.Spec.ExternalGroups) if diag.HasError() { - return groups.GroupV2Model{}, mapper.WrapDiagError(diag, "external_groups", mapper.IntoTerraform) + return groups.ConsoleGroupV2Model{}, mapper.WrapDiagError(diag, "external_groups", mapper.IntoTerraform) } membersList, diag := schema.StringArrayToSetValue(r.Spec.Members) if diag.HasError() { - return groups.GroupV2Model{}, mapper.WrapDiagError(diag, "members", mapper.IntoTerraform) + return groups.ConsoleGroupV2Model{}, mapper.WrapDiagError(diag, "members", mapper.IntoTerraform) } membersFromExternalGroupsList, diag := schema.StringArrayToSetValue(r.Spec.MembersFromExternalGroups) if diag.HasError() { - return groups.GroupV2Model{}, mapper.WrapDiagError(diag, "members_from_external_groups", mapper.IntoTerraform) + return groups.ConsoleGroupV2Model{}, mapper.WrapDiagError(diag, "members_from_external_groups", mapper.IntoTerraform) } permissionsList, err := schema.PermissionArrayToSetValue(ctx, schema.GROUPS, r.Spec.Permissions) if err != nil { - return groups.GroupV2Model{}, err + return groups.ConsoleGroupV2Model{}, err } specValue, diag := groups.NewSpecValue( @@ -89,10 +89,10 @@ func InternalModelToTerraform(ctx context.Context, r *model.GroupConsoleResource }, ) if diag.HasError() { - return groups.GroupV2Model{}, mapper.WrapDiagError(diag, "spec", mapper.IntoTerraform) + return groups.ConsoleGroupV2Model{}, mapper.WrapDiagError(diag, "spec", mapper.IntoTerraform) } - return groups.GroupV2Model{ + return groups.ConsoleGroupV2Model{ Name: types.StringValue(r.Metadata.Name), Spec: specValue, }, nil diff --git a/internal/mapper/group_v2/group_v2_resource_mapper_test.go b/internal/mapper/console_group_v2/console_group_v2_resource_mapper_test.go similarity index 93% rename from internal/mapper/group_v2/group_v2_resource_mapper_test.go rename to internal/mapper/console_group_v2/console_group_v2_resource_mapper_test.go index aa6d9a2..2909db6 100644 --- a/internal/mapper/group_v2/group_v2_resource_mapper_test.go +++ b/internal/mapper/console_group_v2/console_group_v2_resource_mapper_test.go @@ -1,4 +1,4 @@ -package group_v2 +package console_group_v2 import ( "context" @@ -6,6 +6,7 @@ import ( ctlresource "github.com/conduktor/ctl/resource" "github.com/conduktor/terraform-provider-conduktor/internal/model" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" "github.com/conduktor/terraform-provider-conduktor/internal/test" "github.com/google/go-cmp/cmp" "github.com/google/go-cmp/cmp/cmpopts" @@ -17,7 +18,7 @@ func TestGroupV2ModelMapping(t *testing.T) { ctx := context.Background() - jsonGroupV2Resource := 
[]byte(test.TestAccTestdata(t, "group_v2_api.json")) + jsonGroupV2Resource := []byte(test.TestAccTestdata(t, "console_group_v2_api.json")) ctlResource := ctlresource.Resource{} err := ctlResource.UnmarshalJSON(jsonGroupV2Resource) @@ -32,7 +33,7 @@ func TestGroupV2ModelMapping(t *testing.T) { assert.Equal(t, jsonGroupV2Resource, ctlResource.Json) // convert into internal model - internal, err := model.NewGroupConsoleResourceFromClientResource(ctlResource) + internal, err := console.NewGroupConsoleResourceFromClientResource(ctlResource) if err != nil { t.Fatal(err) return diff --git a/internal/mapper/kafka_cluster_v2/internal_to_tf_mapper.go b/internal/mapper/console_kafka_cluster_v2/internal_to_tf_mapper.go similarity index 94% rename from internal/mapper/kafka_cluster_v2/internal_to_tf_mapper.go rename to internal/mapper/console_kafka_cluster_v2/internal_to_tf_mapper.go index f9fc276..37f3a1c 100644 --- a/internal/mapper/kafka_cluster_v2/internal_to_tf_mapper.go +++ b/internal/mapper/console_kafka_cluster_v2/internal_to_tf_mapper.go @@ -1,36 +1,37 @@ -package kafka_cluster_v2 +package console_kafka_cluster_v2 import ( "context" mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper" "github.com/conduktor/terraform-provider-conduktor/internal/model" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" schemaUtils "github.com/conduktor/terraform-provider-conduktor/internal/schema" - schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_kafka_cluster_v2" + schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_kafka_cluster_v2" "github.com/conduktor/terraform-provider-conduktor/internal/schema/validation" "github.com/hashicorp/terraform-plugin-framework/types" "github.com/hashicorp/terraform-plugin-framework/types/basetypes" ) -func InternalModelToTerraform(ctx context.Context, r *model.KafkaClusterResource) (schema.KafkaClusterV2Model, error) { +func InternalModelToTerraform(ctx context.Context, r *console.KafkaClusterResource) (schema.ConsoleKafkaClusterV2Model, error) { labels, diag := schemaUtils.StringMapToMapValue(ctx, r.Metadata.Labels) if diag.HasError() { - return schema.KafkaClusterV2Model{}, mapper.WrapDiagError(diag, "labels", mapper.IntoTerraform) + return schema.ConsoleKafkaClusterV2Model{}, mapper.WrapDiagError(diag, "labels", mapper.IntoTerraform) } specValue, err := specInternalModelToTerraform(ctx, &r.Spec) if err != nil { - return schema.KafkaClusterV2Model{}, err + return schema.ConsoleKafkaClusterV2Model{}, err } - return schema.KafkaClusterV2Model{ + return schema.ConsoleKafkaClusterV2Model{ Name: types.StringValue(r.Metadata.Name), Labels: labels, Spec: specValue, }, nil } -func specInternalModelToTerraform(ctx context.Context, r *model.KafkaClusterSpec) (schema.SpecValue, error) { +func specInternalModelToTerraform(ctx context.Context, r *console.KafkaClusterSpec) (schema.SpecValue, error) { unknownSpecObjectValue, diag := schema.NewSpecValueUnknown().ToObjectValue(ctx) if diag.HasError() { @@ -78,7 +79,7 @@ func specInternalModelToTerraform(ctx context.Context, r *model.KafkaClusterSpec return value, nil } -func kafkaFlavorInternalModelToTerraform(ctx context.Context, r *model.KafkaFlavor) (schema.KafkaFlavorValue, error) { +func kafkaFlavorInternalModelToTerraform(ctx context.Context, r *console.KafkaFlavor) (schema.KafkaFlavorValue, error) { if r == nil || (r.Aiven == nil && r.Confluent == nil && r.Gateway == nil) { return 
schema.NewKafkaFlavorValueNull(), nil } diff --git a/internal/mapper/kafka_cluster_v2/mapper_test.go b/internal/mapper/console_kafka_cluster_v2/mapper_test.go similarity index 92% rename from internal/mapper/kafka_cluster_v2/mapper_test.go rename to internal/mapper/console_kafka_cluster_v2/mapper_test.go index 4bdd494..ab4d5d3 100644 --- a/internal/mapper/kafka_cluster_v2/mapper_test.go +++ b/internal/mapper/console_kafka_cluster_v2/mapper_test.go @@ -1,4 +1,4 @@ -package kafka_cluster_v2 +package console_kafka_cluster_v2 import ( "context" @@ -6,6 +6,7 @@ import ( ctlresource "github.com/conduktor/ctl/resource" "github.com/conduktor/terraform-provider-conduktor/internal/model" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" "github.com/conduktor/terraform-provider-conduktor/internal/test" "github.com/google/go-cmp/cmp" "github.com/google/go-cmp/cmp/cmpopts" @@ -17,7 +18,7 @@ func TestKafkaClusterV2ModelMapping(t *testing.T) { ctx := context.Background() - jsonKafkaClusterV2Resource := []byte(test.TestAccTestdata(t, "kafka_cluster_v2_confluent_api.json")) + jsonKafkaClusterV2Resource := []byte(test.TestAccTestdata(t, "console_kafka_cluster_v2_confluent_api.json")) ctlResource := ctlresource.Resource{} err := ctlResource.UnmarshalJSON(jsonKafkaClusterV2Resource) @@ -32,7 +33,7 @@ func TestKafkaClusterV2ModelMapping(t *testing.T) { assert.Equal(t, jsonKafkaClusterV2Resource, ctlResource.Json) // convert into internal model - internal, err := model.NewKafkaClusterResourceFromClientResource(ctlResource) + internal, err := console.NewKafkaClusterResourceFromClientResource(ctlResource) if err != nil { t.Fatal(err) return @@ -122,7 +123,7 @@ func TestKafkaClusterV2ModelMapping(t *testing.T) { func TestAWSKafkaClusterV2ModelMapping(t *testing.T) { ctx := context.Background() - jsonKafkaClusterV2Resource := []byte(test.TestAccTestdata(t, "kafka_cluster_v2_aws_api.json")) + jsonKafkaClusterV2Resource := []byte(test.TestAccTestdata(t, "console_kafka_cluster_v2_aws_api.json")) ctlResource := ctlresource.Resource{} err := ctlResource.UnmarshalJSON(jsonKafkaClusterV2Resource) @@ -136,7 +137,7 @@ func TestAWSKafkaClusterV2ModelMapping(t *testing.T) { assert.Equal(t, jsonKafkaClusterV2Resource, ctlResource.Json) // convert into internal model - internal, err := model.NewKafkaClusterResourceFromClientResource(ctlResource) + internal, err := console.NewKafkaClusterResourceFromClientResource(ctlResource) if err != nil { t.Fatal(err) return @@ -158,7 +159,7 @@ func TestAWSKafkaClusterV2ModelMapping(t *testing.T) { assert.Equal(t, "default", internal.Spec.SchemaRegistry.Glue.RegistryName) assert.Equal(t, "XXXXXXXXXX", internal.Spec.SchemaRegistry.Glue.Security.Credentials.AccessKeyId) assert.Equal(t, "YYYYYYYYYY", internal.Spec.SchemaRegistry.Glue.Security.Credentials.SecretKey) - assert.Equal(t, (*model.KafkaFlavor)(nil), internal.Spec.KafkaFlavor) + assert.Equal(t, (*console.KafkaFlavor)(nil), internal.Spec.KafkaFlavor) // convert to terraform model tfModel, err := InternalModelToTerraform(ctx, &internal) @@ -200,7 +201,7 @@ func TestAWSKafkaClusterV2ModelMapping(t *testing.T) { func TestMinimalKafkaClusterV2ModelMapping(t *testing.T) { ctx := context.Background() - jsonKafkaClusterV2Resource := []byte(test.TestAccTestdata(t, "kafka_cluster_v2_minimal_api.json")) + jsonKafkaClusterV2Resource := []byte(test.TestAccTestdata(t, "console_kafka_cluster_v2_minimal_api.json")) ctlResource := ctlresource.Resource{} err := ctlResource.UnmarshalJSON(jsonKafkaClusterV2Resource) @@ 
-214,7 +215,7 @@ func TestMinimalKafkaClusterV2ModelMapping(t *testing.T) { assert.Equal(t, jsonKafkaClusterV2Resource, ctlResource.Json) // convert into internal model - internal, err := model.NewKafkaClusterResourceFromClientResource(ctlResource) + internal, err := console.NewKafkaClusterResourceFromClientResource(ctlResource) if err != nil { t.Fatal(err) return @@ -228,7 +229,7 @@ func TestMinimalKafkaClusterV2ModelMapping(t *testing.T) { assert.Equal(t, "", internal.Spec.Icon) assert.Equal(t, map[string]string(nil), internal.Spec.Properties) assert.Equal(t, (*model.SchemaRegistry)(nil), internal.Spec.SchemaRegistry) - assert.Equal(t, (*model.KafkaFlavor)(nil), internal.Spec.KafkaFlavor) + assert.Equal(t, (*console.KafkaFlavor)(nil), internal.Spec.KafkaFlavor) // convert to terraform model tfModel, err := InternalModelToTerraform(ctx, &internal) diff --git a/internal/mapper/kafka_cluster_v2/tf_to_internal_mapper.go b/internal/mapper/console_kafka_cluster_v2/tf_to_internal_mapper.go similarity index 89% rename from internal/mapper/kafka_cluster_v2/tf_to_internal_mapper.go rename to internal/mapper/console_kafka_cluster_v2/tf_to_internal_mapper.go index 2d14f85..39fd8f2 100644 --- a/internal/mapper/kafka_cluster_v2/tf_to_internal_mapper.go +++ b/internal/mapper/console_kafka_cluster_v2/tf_to_internal_mapper.go @@ -1,52 +1,53 @@ -package kafka_cluster_v2 +package console_kafka_cluster_v2 import ( "context" "fmt" mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper" "github.com/conduktor/terraform-provider-conduktor/internal/model" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" schemaUtils "github.com/conduktor/terraform-provider-conduktor/internal/schema" - schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_kafka_cluster_v2" + schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_kafka_cluster_v2" "github.com/conduktor/terraform-provider-conduktor/internal/schema/validation" "github.com/hashicorp/terraform-plugin-framework/types/basetypes" ) -func TFToInternalModel(ctx context.Context, r *schema.KafkaClusterV2Model) (model.KafkaClusterResource, error) { +func TFToInternalModel(ctx context.Context, r *schema.ConsoleKafkaClusterV2Model) (console.KafkaClusterResource, error) { labels, diag := schemaUtils.MapValueToStringMap(ctx, r.Labels) if diag.HasError() { - return model.KafkaClusterResource{}, mapper.WrapDiagError(diag, "labels", mapper.FromTerraform) + return console.KafkaClusterResource{}, mapper.WrapDiagError(diag, "labels", mapper.FromTerraform) } spec, err := specTFToInternalModel(ctx, &r.Spec) if err != nil { - return model.KafkaClusterResource{}, err + return console.KafkaClusterResource{}, err } - return model.NewKafkaClusterResource( + return console.NewKafkaClusterResource( r.Name.ValueString(), labels, spec, ), nil } -func specTFToInternalModel(ctx context.Context, r *schema.SpecValue) (model.KafkaClusterSpec, error) { +func specTFToInternalModel(ctx context.Context, r *schema.SpecValue) (console.KafkaClusterSpec, error) { properties, diag := schemaUtils.MapValueToStringMap(ctx, r.Properties) if diag.HasError() { - return model.KafkaClusterSpec{}, mapper.WrapDiagError(diag, "properties", mapper.FromTerraform) + return console.KafkaClusterSpec{}, mapper.WrapDiagError(diag, "properties", mapper.FromTerraform) } kafkaFlavor, err := kafkaFlavorTFToInternalModel(ctx, &r.KafkaFlavor) if err != nil { - return model.KafkaClusterSpec{}, err + return 
console.KafkaClusterSpec{}, err } schemaRegistry, err := schemaRegistryTFToInternalModel(ctx, &r.SchemaRegistry) if err != nil { - return model.KafkaClusterSpec{}, err + return console.KafkaClusterSpec{}, err } - return model.KafkaClusterSpec{ + return console.KafkaClusterSpec{ DisplayName: r.DisplayName.ValueString(), BootstrapServers: r.BootstrapServers.ValueString(), Color: r.Color.ValueString(), @@ -58,7 +59,7 @@ func specTFToInternalModel(ctx context.Context, r *schema.SpecValue) (model.Kafk }, nil } -func kafkaFlavorTFToInternalModel(ctx context.Context, r *basetypes.ObjectValue) (*model.KafkaFlavor, error) { +func kafkaFlavorTFToInternalModel(ctx context.Context, r *basetypes.ObjectValue) (*console.KafkaFlavor, error) { if r.IsNull() { return nil, nil } @@ -71,8 +72,8 @@ func kafkaFlavorTFToInternalModel(ctx context.Context, r *basetypes.ObjectValue) flavorType := kafkaFlavorValue.KafkaFlavorType.ValueString() switch flavorType { case validation.ConfluentKafkaFlavor: - return &model.KafkaFlavor{ - Confluent: &model.Confluent{ + return &console.KafkaFlavor{ + Confluent: &console.Confluent{ Type: flavorType, Key: kafkaFlavorValue.Key.ValueString(), Secret: kafkaFlavorValue.Secret.ValueString(), @@ -81,8 +82,8 @@ func kafkaFlavorTFToInternalModel(ctx context.Context, r *basetypes.ObjectValue) }, }, nil case validation.AivenKafkaFlavor: - return &model.KafkaFlavor{ - Aiven: &model.Aiven{ + return &console.KafkaFlavor{ + Aiven: &console.Aiven{ Type: flavorType, ApiToken: kafkaFlavorValue.ApiToken.ValueString(), Project: kafkaFlavorValue.Project.ValueString(), @@ -90,8 +91,8 @@ func kafkaFlavorTFToInternalModel(ctx context.Context, r *basetypes.ObjectValue) }, }, nil case validation.GatewayKafkaFlavor: - return &model.KafkaFlavor{ - Gateway: &model.Gateway{ + return &console.KafkaFlavor{ + Gateway: &console.Gateway{ Type: flavorType, Url: kafkaFlavorValue.Url.ValueString(), User: kafkaFlavorValue.User.ValueString(), diff --git a/internal/mapper/kafka_connect_v2/internal_to_tf_mapper.go b/internal/mapper/console_kafka_connect_v2/internal_to_tf_mapper.go similarity index 82% rename from internal/mapper/kafka_connect_v2/internal_to_tf_mapper.go rename to internal/mapper/console_kafka_connect_v2/internal_to_tf_mapper.go index 765ed51..18114a4 100644 --- a/internal/mapper/kafka_connect_v2/internal_to_tf_mapper.go +++ b/internal/mapper/console_kafka_connect_v2/internal_to_tf_mapper.go @@ -1,29 +1,29 @@ -package kafka_connect_v2 +package console_kafka_connect_v2 import ( "context" mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper" - "github.com/conduktor/terraform-provider-conduktor/internal/model" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" schemaUtils "github.com/conduktor/terraform-provider-conduktor/internal/schema" - schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_kafka_connect_v2" + schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_kafka_connect_v2" "github.com/conduktor/terraform-provider-conduktor/internal/schema/validation" "github.com/hashicorp/terraform-plugin-framework/types" "github.com/hashicorp/terraform-plugin-framework/types/basetypes" ) -func InternalModelToTerraform(ctx context.Context, r *model.KafkaConnectResource) (schema.KafkaConnectV2Model, error) { +func InternalModelToTerraform(ctx context.Context, r *console.KafkaConnectResource) (schema.ConsoleKafkaConnectV2Model, error) { labels, diag := schemaUtils.StringMapToMapValue(ctx, 
r.Metadata.Labels) if diag.HasError() { - return schema.KafkaConnectV2Model{}, mapper.WrapDiagError(diag, "labels", mapper.IntoTerraform) + return schema.ConsoleKafkaConnectV2Model{}, mapper.WrapDiagError(diag, "labels", mapper.IntoTerraform) } specValue, err := specInternalModelToTerraform(ctx, &r.Spec) if err != nil { - return schema.KafkaConnectV2Model{}, err + return schema.ConsoleKafkaConnectV2Model{}, err } - return schema.KafkaConnectV2Model{ + return schema.ConsoleKafkaConnectV2Model{ Name: types.StringValue(r.Metadata.Name), Cluster: types.StringValue(r.Metadata.Cluster), Labels: labels, @@ -31,7 +31,7 @@ func InternalModelToTerraform(ctx context.Context, r *model.KafkaConnectResource }, nil } -func specInternalModelToTerraform(ctx context.Context, r *model.KafkaConnectSpec) (schema.SpecValue, error) { +func specInternalModelToTerraform(ctx context.Context, r *console.KafkaConnectSpec) (schema.SpecValue, error) { unknownSpecObjectValue, diag := schema.NewSpecValueUnknown().ToObjectValue(ctx) if diag.HasError() { @@ -67,7 +67,7 @@ func specInternalModelToTerraform(ctx context.Context, r *model.KafkaConnectSpec return value, nil } -func securityInternalModelToTerraform(ctx context.Context, r *model.KafkaConnectSecurity) (schema.SecurityValue, error) { +func securityInternalModelToTerraform(ctx context.Context, r *console.KafkaConnectSecurity) (schema.SecurityValue, error) { if r == nil { return schema.NewSecurityValueNull(), nil } diff --git a/internal/mapper/kafka_connect_v2/mapper_test.go b/internal/mapper/console_kafka_connect_v2/mapper_test.go similarity index 92% rename from internal/mapper/kafka_connect_v2/mapper_test.go rename to internal/mapper/console_kafka_connect_v2/mapper_test.go index f92a875..fa02b92 100644 --- a/internal/mapper/kafka_connect_v2/mapper_test.go +++ b/internal/mapper/console_kafka_connect_v2/mapper_test.go @@ -1,11 +1,11 @@ -package kafka_connect_v2 +package console_kafka_connect_v2 import ( "context" "testing" ctlresource "github.com/conduktor/ctl/resource" - "github.com/conduktor/terraform-provider-conduktor/internal/model" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" "github.com/conduktor/terraform-provider-conduktor/internal/test" "github.com/google/go-cmp/cmp" "github.com/google/go-cmp/cmp/cmpopts" @@ -17,7 +17,7 @@ func TestKafkaConnectV2ModelMapping(t *testing.T) { ctx := context.Background() - jsonKafkaConnectV2Resource := []byte(test.TestAccTestdata(t, "kafka_connect_v2_api.json")) + jsonKafkaConnectV2Resource := []byte(test.TestAccTestdata(t, "console_kafka_connect_v2_api.json")) ctlResource := ctlresource.Resource{} err := ctlResource.UnmarshalJSON(jsonKafkaConnectV2Resource) @@ -32,7 +32,7 @@ func TestKafkaConnectV2ModelMapping(t *testing.T) { assert.Equal(t, jsonKafkaConnectV2Resource, ctlResource.Json) // convert into internal model - internal, err := model.NewKafkaConnectResourceFromClientResource(ctlResource) + internal, err := console.NewKafkaConnectResourceFromClientResource(ctlResource) if err != nil { t.Fatal(err) return diff --git a/internal/mapper/kafka_connect_v2/tf_to_internal_mapper.go b/internal/mapper/console_kafka_connect_v2/tf_to_internal_mapper.go similarity index 61% rename from internal/mapper/kafka_connect_v2/tf_to_internal_mapper.go rename to internal/mapper/console_kafka_connect_v2/tf_to_internal_mapper.go index 3d97a35..ffb4dad 100644 --- a/internal/mapper/kafka_connect_v2/tf_to_internal_mapper.go +++ b/internal/mapper/console_kafka_connect_v2/tf_to_internal_mapper.go @@ -1,28 
+1,28 @@ -package kafka_connect_v2 +package console_kafka_connect_v2 import ( "context" "fmt" mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper" - "github.com/conduktor/terraform-provider-conduktor/internal/model" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" schemaUtils "github.com/conduktor/terraform-provider-conduktor/internal/schema" - schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_kafka_connect_v2" + schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_kafka_connect_v2" "github.com/conduktor/terraform-provider-conduktor/internal/schema/validation" ) -func TFToInternalModel(ctx context.Context, r *schema.KafkaConnectV2Model) (model.KafkaConnectResource, error) { +func TFToInternalModel(ctx context.Context, r *schema.ConsoleKafkaConnectV2Model) (console.KafkaConnectResource, error) { labels, diag := schemaUtils.MapValueToStringMap(ctx, r.Labels) if diag.HasError() { - return model.KafkaConnectResource{}, mapper.WrapDiagError(diag, "labels", mapper.FromTerraform) + return console.KafkaConnectResource{}, mapper.WrapDiagError(diag, "labels", mapper.FromTerraform) } spec, err := specTFToInternalModel(ctx, &r.Spec) if err != nil { - return model.KafkaConnectResource{}, err + return console.KafkaConnectResource{}, err } - return model.NewKafkaConnectResource( + return console.NewKafkaConnectResource( r.Name.ValueString(), r.Cluster.ValueString(), labels, @@ -30,26 +30,26 @@ func TFToInternalModel(ctx context.Context, r *schema.KafkaConnectV2Model) (mode ), nil } -func specTFToInternalModel(ctx context.Context, r *schema.SpecValue) (model.KafkaConnectSpec, error) { +func specTFToInternalModel(ctx context.Context, r *schema.SpecValue) (console.KafkaConnectSpec, error) { headers, diag := schemaUtils.MapValueToStringMap(ctx, r.Headers) if diag.HasError() { - return model.KafkaConnectSpec{}, mapper.WrapDiagError(diag, "headers", mapper.FromTerraform) + return console.KafkaConnectSpec{}, mapper.WrapDiagError(diag, "headers", mapper.FromTerraform) } var securityValue = schema.NewSecurityValueNull() if !r.Security.IsNull() { securityValue, diag = schema.NewSecurityValue(r.Security.AttributeTypes(ctx), r.Security.Attributes()) if diag.HasError() { - return model.KafkaConnectSpec{}, mapper.WrapDiagError(diag, "security", mapper.FromTerraform) + return console.KafkaConnectSpec{}, mapper.WrapDiagError(diag, "security", mapper.FromTerraform) } } security, err := securityTFToInternalModel(ctx, &securityValue) if err != nil { - return model.KafkaConnectSpec{}, err + return console.KafkaConnectSpec{}, err } - return model.KafkaConnectSpec{ + return console.KafkaConnectSpec{ DisplayName: r.DisplayName.ValueString(), Urls: r.Urls.ValueString(), IgnoreUntrustedCertificate: r.IgnoreUntrustedCertificate.ValueBool(), @@ -58,7 +58,7 @@ func specTFToInternalModel(ctx context.Context, r *schema.SpecValue) (model.Kafk }, nil } -func securityTFToInternalModel(_ context.Context, r *schema.SecurityValue) (*model.KafkaConnectSecurity, error) { +func securityTFToInternalModel(_ context.Context, r *schema.SecurityValue) (*console.KafkaConnectSecurity, error) { if r.IsNull() { return nil, nil } @@ -66,29 +66,29 @@ func securityTFToInternalModel(_ context.Context, r *schema.SecurityValue) (*mod securityType := r.SecurityType.ValueString() switch securityType { case validation.BasicAuthKafkaConnectSecurity: - return &model.KafkaConnectSecurity{ - BasicAuth: &model.KafkaConnectBasicAuth{ + 
return &console.KafkaConnectSecurity{ + BasicAuth: &console.KafkaConnectBasicAuth{ Type: securityType, Username: r.Username.ValueString(), Password: r.Password.ValueString(), }, }, nil case validation.BearerTokenKafkaConnectSecurity: - return &model.KafkaConnectSecurity{ - BearerToken: &model.KafkaConnectBearerToken{ + return &console.KafkaConnectSecurity{ + BearerToken: &console.KafkaConnectBearerToken{ Type: securityType, Token: r.Token.ValueString(), }, }, nil case validation.SSLAuthKafkaConnectSecurity: - return &model.KafkaConnectSecurity{ - SSLAuth: &model.KafkaConnectSSLAuth{ + return &console.KafkaConnectSecurity{ + SSLAuth: &console.KafkaConnectSSLAuth{ Type: securityType, Key: r.Key.ValueString(), CertificateChain: r.CertificateChain.ValueString(), }, }, nil default: - return &model.KafkaConnectSecurity{}, mapper.WrapError(fmt.Errorf("unsupported SecurityType: %s", securityType), "security", mapper.FromTerraform) + return &console.KafkaConnectSecurity{}, mapper.WrapError(fmt.Errorf("unsupported SecurityType: %s", securityType), "security", mapper.FromTerraform) } } diff --git a/internal/mapper/user_v2/user_v2_resource_mapper.go b/internal/mapper/console_user_v2/user_v2_resource_mapper.go similarity index 66% rename from internal/mapper/user_v2/user_v2_resource_mapper.go rename to internal/mapper/console_user_v2/user_v2_resource_mapper.go index 22f72ff..c35e0dd 100644 --- a/internal/mapper/user_v2/user_v2_resource_mapper.go +++ b/internal/mapper/console_user_v2/user_v2_resource_mapper.go @@ -1,26 +1,26 @@ -package user_v2 +package console_user_v2 import ( "context" mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper" - "github.com/conduktor/terraform-provider-conduktor/internal/model" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" schema "github.com/conduktor/terraform-provider-conduktor/internal/schema" - users "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_user_v2" + users "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_user_v2" "github.com/hashicorp/terraform-plugin-framework/attr" "github.com/hashicorp/terraform-plugin-framework/types" "github.com/hashicorp/terraform-plugin-framework/types/basetypes" ) -func TFToInternalModel(ctx context.Context, r *users.UserV2Model) (model.UserConsoleResource, error) { +func TFToInternalModel(ctx context.Context, r *users.ConsoleUserV2Model) (console.UserConsoleResource, error) { permissions, err := schema.SetValueToPermissionArray(ctx, schema.USERS, r.Spec.Permissions) if err != nil { - return model.UserConsoleResource{}, err + return console.UserConsoleResource{}, err } - return model.NewUserConsoleResource( + return console.NewUserConsoleResource( r.Name.ValueString(), - model.UserConsoleSpec{ + console.UserConsoleSpec{ FirstName: r.Spec.Firstname.ValueString(), LastName: r.Spec.Lastname.ValueString(), Permissions: permissions, @@ -28,10 +28,10 @@ func TFToInternalModel(ctx context.Context, r *users.UserV2Model) (model.UserCon ), nil } -func InternalModelToTerraform(ctx context.Context, r *model.UserConsoleResource) (users.UserV2Model, error) { +func InternalModelToTerraform(ctx context.Context, r *console.UserConsoleResource) (users.ConsoleUserV2Model, error) { permissionsList, err := schema.PermissionArrayToSetValue(ctx, schema.USERS, r.Spec.Permissions) if err != nil { - return users.UserV2Model{}, err + return users.ConsoleUserV2Model{}, err } specValue, diag := users.NewSpecValue( @@ -47,10 +47,10 @@ func 
InternalModelToTerraform(ctx context.Context, r *model.UserConsoleResource) }, ) if diag.HasError() { - return users.UserV2Model{}, mapper.WrapDiagError(diag, "spec", mapper.IntoTerraform) + return users.ConsoleUserV2Model{}, mapper.WrapDiagError(diag, "spec", mapper.IntoTerraform) } - return users.UserV2Model{ + return users.ConsoleUserV2Model{ Name: types.StringValue(r.Metadata.Name), Spec: specValue, }, nil diff --git a/internal/mapper/user_v2/user_v2_resource_mapper_test.go b/internal/mapper/console_user_v2/user_v2_resource_mapper_test.go similarity index 93% rename from internal/mapper/user_v2/user_v2_resource_mapper_test.go rename to internal/mapper/console_user_v2/user_v2_resource_mapper_test.go index 61356b6..b8156bf 100644 --- a/internal/mapper/user_v2/user_v2_resource_mapper_test.go +++ b/internal/mapper/console_user_v2/user_v2_resource_mapper_test.go @@ -1,4 +1,4 @@ -package user_v2 +package console_user_v2 import ( "context" @@ -6,6 +6,7 @@ import ( ctlresource "github.com/conduktor/ctl/resource" "github.com/conduktor/terraform-provider-conduktor/internal/model" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" "github.com/conduktor/terraform-provider-conduktor/internal/test" "github.com/google/go-cmp/cmp" "github.com/google/go-cmp/cmp/cmpopts" @@ -17,7 +18,7 @@ func TestUserV2ModelMapping(t *testing.T) { ctx := context.Background() - jsonUserV2Resource := []byte(test.TestAccTestdata(t, "user_v2_api.json")) + jsonUserV2Resource := []byte(test.TestAccTestdata(t, "console_user_v2_api.json")) ctlResource := ctlresource.Resource{} err := ctlResource.UnmarshalJSON(jsonUserV2Resource) @@ -32,7 +33,7 @@ func TestUserV2ModelMapping(t *testing.T) { assert.Equal(t, jsonUserV2Resource, ctlResource.Json) // convert into internal model - internal, err := model.NewUserConsoleResourceFromClientResource(ctlResource) + internal, err := console.NewUserConsoleResourceFromClientResource(ctlResource) if err != nil { t.Fatal(err) return diff --git a/internal/mapper/gateway_service_account_v2/gateway_service_account_v2_resource_mapper.go b/internal/mapper/gateway_service_account_v2/gateway_service_account_v2_resource_mapper.go index 8472a98..828e98b 100644 --- a/internal/mapper/gateway_service_account_v2/gateway_service_account_v2_resource_mapper.go +++ b/internal/mapper/gateway_service_account_v2/gateway_service_account_v2_resource_mapper.go @@ -5,7 +5,7 @@ import ( "fmt" mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper" - "github.com/conduktor/terraform-provider-conduktor/internal/model" + gateway "github.com/conduktor/terraform-provider-conduktor/internal/model/gateway" schema "github.com/conduktor/terraform-provider-conduktor/internal/schema" gwserviceaccounts "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_gateway_service_account_v2" "github.com/hashicorp/terraform-plugin-framework/attr" @@ -13,31 +13,31 @@ import ( "github.com/hashicorp/terraform-plugin-framework/types/basetypes" ) -func TFToInternalModel(ctx context.Context, r *gwserviceaccounts.GatewayServiceAccountV2Model) (model.GatewayServiceAccountResource, error) { +func TFToInternalModel(ctx context.Context, r *gwserviceaccounts.GatewayServiceAccountV2Model) (gateway.GatewayServiceAccountResource, error) { externalNames, diag := schema.SetValueToStringArray(ctx, r.Spec.ExternalNames) if diag.HasError() { - return model.GatewayServiceAccountResource{}, mapper.WrapDiagError(diag, "external_names", mapper.FromTerraform) + return 
gateway.GatewayServiceAccountResource{}, mapper.WrapDiagError(diag, "external_names", mapper.FromTerraform) } if len(externalNames) > 0 { if r.Spec.SpecType.ValueString() != "EXTERNAL" { - return model.GatewayServiceAccountResource{}, fmt.Errorf("external_names only configurable when spec.type = EXTERNAL") + return gateway.GatewayServiceAccountResource{}, fmt.Errorf("external_names only configurable when spec.type = EXTERNAL") } } - return model.NewGatewayServiceAccountResource( - model.GatewayServiceAccountMetadata{ + return gateway.NewGatewayServiceAccountResource( + gateway.GatewayServiceAccountMetadata{ Name: r.Name.ValueString(), VCluster: r.Vcluster.ValueString(), }, - model.GatewayServiceAccountSpec{ + gateway.GatewayServiceAccountSpec{ Type: r.Spec.SpecType.ValueString(), ExternalNames: externalNames, }, ), nil } -func InternalModelToTerraform(ctx context.Context, r *model.GatewayServiceAccountResource) (gwserviceaccounts.GatewayServiceAccountV2Model, error) { +func InternalModelToTerraform(ctx context.Context, r *gateway.GatewayServiceAccountResource) (gwserviceaccounts.GatewayServiceAccountV2Model, error) { // Configuring default value for vcluster if r.Metadata.VCluster == "" { r.Metadata.VCluster = "passthrough" diff --git a/internal/mapper/gateway_service_account_v2/gateway_service_account_v2_resource_mapper_test.go b/internal/mapper/gateway_service_account_v2/gateway_service_account_v2_resource_mapper_test.go index 22e965f..aa557cd 100644 --- a/internal/mapper/gateway_service_account_v2/gateway_service_account_v2_resource_mapper_test.go +++ b/internal/mapper/gateway_service_account_v2/gateway_service_account_v2_resource_mapper_test.go @@ -5,7 +5,7 @@ import ( "testing" ctlresource "github.com/conduktor/ctl/resource" - "github.com/conduktor/terraform-provider-conduktor/internal/model" + gateway "github.com/conduktor/terraform-provider-conduktor/internal/model/gateway" "github.com/conduktor/terraform-provider-conduktor/internal/test" "github.com/google/go-cmp/cmp" "github.com/google/go-cmp/cmp/cmpopts" @@ -32,7 +32,7 @@ func TestGatewayServiceAccountV2ModelMapping(t *testing.T) { assert.Equal(t, jsonServiceAccountV2Resource, ctlResource.Json) // convert into internal model - internal, err := model.NewGatewayServiceAccountResourceFromClientResource(ctlResource) + internal, err := gateway.NewGatewayServiceAccountResourceFromClientResource(ctlResource) if err != nil { t.Fatal(err) return diff --git a/internal/model/group_v2.go b/internal/model/console/group_v2.go similarity index 75% rename from internal/model/group_v2.go rename to internal/model/console/group_v2.go index 8cc4610..24a74f2 100644 --- a/internal/model/group_v2.go +++ b/internal/model/console/group_v2.go @@ -1,10 +1,11 @@ -package model +package console import ( "encoding/json" "fmt" ctlresource "github.com/conduktor/ctl/resource" + model "github.com/conduktor/terraform-provider-conduktor/internal/model" jsoniter "github.com/json-iterator/go" ) @@ -20,12 +21,12 @@ func (r GroupConsoleMetadata) String() string { } type GroupConsoleSpec struct { - Description string `json:"description,omitempty"` - DisplayName string `json:"displayName"` - ExternalGroups []string `json:"externalGroups"` - Members []string `json:"members"` - MembersFromExternalGroups []string `json:"membersFromExternalGroups"` - Permissions []Permission `json:"permissions"` + Description string `json:"description,omitempty"` + DisplayName string `json:"displayName"` + ExternalGroups []string `json:"externalGroups"` + Members []string `json:"members"` + 
MembersFromExternalGroups []string `json:"membersFromExternalGroups"` + Permissions []model.Permission `json:"permissions"` } type GroupConsoleResource struct { @@ -47,7 +48,7 @@ func NewGroupConsoleResource(name string, spec GroupConsoleSpec) GroupConsoleRes } func (r *GroupConsoleResource) ToClientResource() (ctlresource.Resource, error) { - return toClientResource(r) + return model.ToClientResource(r) } func (r *GroupConsoleResource) FromClientResource(cliResource ctlresource.Resource) error { diff --git a/internal/model/kafka_cluster_v2.go b/internal/model/console/kafka_cluster_v2.go similarity index 82% rename from internal/model/kafka_cluster_v2.go rename to internal/model/console/kafka_cluster_v2.go index 0c77050..dde3350 100644 --- a/internal/model/kafka_cluster_v2.go +++ b/internal/model/console/kafka_cluster_v2.go @@ -1,9 +1,10 @@ -package model +package console import ( "encoding/json" "fmt" ctlresource "github.com/conduktor/ctl/resource" + model "github.com/conduktor/terraform-provider-conduktor/internal/model" jsoniter "github.com/json-iterator/go" ) @@ -20,14 +21,14 @@ func (r KafkaClusterMetadata) String() string { } type KafkaClusterSpec struct { - BootstrapServers string `json:"bootstrapServers"` - Color string `json:"color,omitempty"` - DisplayName string `json:"displayName"` - Icon string `json:"icon,omitempty"` - IgnoreUntrustedCertificate bool `json:"ignoreUntrustedCertificate"` - KafkaFlavor *KafkaFlavor `json:"kafkaFlavor,omitempty"` - Properties map[string]string `json:"properties,omitempty"` - SchemaRegistry *SchemaRegistry `json:"schemaRegistry,omitempty"` + BootstrapServers string `json:"bootstrapServers"` + Color string `json:"color,omitempty"` + DisplayName string `json:"displayName"` + Icon string `json:"icon,omitempty"` + IgnoreUntrustedCertificate bool `json:"ignoreUntrustedCertificate"` + KafkaFlavor *KafkaFlavor `json:"kafkaFlavor,omitempty"` + Properties map[string]string `json:"properties,omitempty"` + SchemaRegistry *model.SchemaRegistry `json:"schemaRegistry,omitempty"` } type KafkaFlavor struct { @@ -37,7 +38,7 @@ type KafkaFlavor struct { } func (dst *KafkaFlavor) UnmarshalJSON(data []byte) error { - var disc Discriminable + var disc model.Discriminable err := json.Unmarshal(data, &disc) if err != nil { return err @@ -127,7 +128,7 @@ func NewKafkaClusterResource(name string, labels map[string]string, spec KafkaCl } func (r *KafkaClusterResource) ToClientResource() (ctlresource.Resource, error) { - return toClientResource(r) + return model.ToClientResource(r) } func (r *KafkaClusterResource) FromClientResource(cliResource ctlresource.Resource) error { diff --git a/internal/model/kafka_connect_v2.go b/internal/model/console/kafka_connect_v2.go similarity index 96% rename from internal/model/kafka_connect_v2.go rename to internal/model/console/kafka_connect_v2.go index b0b09c8..4f596b4 100644 --- a/internal/model/kafka_connect_v2.go +++ b/internal/model/console/kafka_connect_v2.go @@ -1,9 +1,10 @@ -package model +package console import ( "encoding/json" "fmt" ctlresource "github.com/conduktor/ctl/resource" + model "github.com/conduktor/terraform-provider-conduktor/internal/model" jsoniter "github.com/json-iterator/go" ) @@ -35,7 +36,7 @@ type KafkaConnectSecurity struct { } func (s *KafkaConnectSecurity) UnmarshalJSON(bytes []byte) error { - var disc Discriminable + var disc model.Discriminable err := json.Unmarshal(bytes, &disc) if err != nil { return err @@ -118,7 +119,7 @@ func NewKafkaConnectResource(name string, cluster string, labels 
map[string]stri } func (r *KafkaConnectResource) ToClientResource() (ctlresource.Resource, error) { - return toClientResource(r) + return model.ToClientResource(r) } func (r *KafkaConnectResource) FromClientResource(cliResource ctlresource.Resource) error { diff --git a/internal/model/user_v2.go b/internal/model/console/user_v2.go similarity index 84% rename from internal/model/user_v2.go rename to internal/model/console/user_v2.go index 55310b8..f8e35f3 100644 --- a/internal/model/user_v2.go +++ b/internal/model/console/user_v2.go @@ -1,10 +1,11 @@ -package model +package console import ( "encoding/json" "fmt" ctlresource "github.com/conduktor/ctl/resource" + model "github.com/conduktor/terraform-provider-conduktor/internal/model" jsoniter "github.com/json-iterator/go" ) @@ -20,9 +21,9 @@ func (r UserConsoleMetadata) String() string { } type UserConsoleSpec struct { - FirstName string `json:"firstName,omitempty"` - LastName string `json:"lastName,omitempty"` - Permissions []Permission `json:"permissions"` + FirstName string `json:"firstName,omitempty"` + LastName string `json:"lastName,omitempty"` + Permissions []model.Permission `json:"permissions"` } type UserConsoleResource struct { @@ -44,7 +45,7 @@ func NewUserConsoleResource(name string, spec UserConsoleSpec) UserConsoleResour } func (r *UserConsoleResource) ToClientResource() (ctlresource.Resource, error) { - return toClientResource(r) + return model.ToClientResource(r) } func (r *UserConsoleResource) FromClientResource(cliResource ctlresource.Resource) error { diff --git a/internal/model/gateway_service_account_v2.go b/internal/model/gateway/service_account_v2.go similarity index 94% rename from internal/model/gateway_service_account_v2.go rename to internal/model/gateway/service_account_v2.go index 8fe9fb9..4d1decc 100644 --- a/internal/model/gateway_service_account_v2.go +++ b/internal/model/gateway/service_account_v2.go @@ -1,8 +1,9 @@ -package model +package gateway import ( "encoding/json" "fmt" + model "github.com/conduktor/terraform-provider-conduktor/internal/model" ctlresource "github.com/conduktor/ctl/resource" jsoniter "github.com/json-iterator/go" @@ -42,7 +43,7 @@ func NewGatewayServiceAccountResource(metadata GatewayServiceAccountMetadata, sp } func (r *GatewayServiceAccountResource) ToClientResource() (ctlresource.Resource, error) { - return toClientResource(r) + return model.ToClientResource(r) } func (r *GatewayServiceAccountResource) FromClientResource(cliResource ctlresource.Resource) error { diff --git a/internal/model/provider_data.go b/internal/model/provider_data.go deleted file mode 100644 index 6408a2f..0000000 --- a/internal/model/provider_data.go +++ /dev/null @@ -1,8 +0,0 @@ -package model - -import "github.com/conduktor/terraform-provider-conduktor/internal/client" - -type ProviderData struct { - Mode *client.Mode - Client *client.Client -} diff --git a/internal/model/utils.go b/internal/model/utils.go index c1f4fa0..7845fc6 100644 --- a/internal/model/utils.go +++ b/internal/model/utils.go @@ -10,7 +10,7 @@ type Discriminable struct { Type string `json:"type"` } -func toClientResource(o interface{}) (ctlresource.Resource, error) { +func ToClientResource(o interface{}) (ctlresource.Resource, error) { jsonData, err := json.Marshal(o) if err != nil { return ctlresource.Resource{}, err diff --git a/internal/provider/group_v2_resource.go b/internal/provider/console_group_v2_resource.go similarity index 92% rename from internal/provider/group_v2_resource.go rename to 
internal/provider/console_group_v2_resource.go index d8ca057..bbdc70b 100644 --- a/internal/provider/group_v2_resource.go +++ b/internal/provider/console_group_v2_resource.go @@ -5,9 +5,9 @@ import ( "fmt" "github.com/conduktor/terraform-provider-conduktor/internal/client" - mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper/group_v2" - "github.com/conduktor/terraform-provider-conduktor/internal/model" - schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_group_v2" + mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper/console_group_v2" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" + schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_group_v2" "github.com/hashicorp/terraform-plugin-framework/path" "github.com/hashicorp/terraform-plugin-framework/resource" "github.com/hashicorp/terraform-plugin-log/tflog" @@ -30,11 +30,11 @@ type GroupV2Resource struct { } func (r *GroupV2Resource) Metadata(ctx context.Context, req resource.MetadataRequest, resp *resource.MetadataResponse) { - resp.TypeName = req.ProviderTypeName + "_group_v2" + resp.TypeName = req.ProviderTypeName + "_console_group_v2" } func (r *GroupV2Resource) Schema(ctx context.Context, req resource.SchemaRequest, resp *resource.SchemaResponse) { - resp.Schema = schema.GroupV2ResourceSchema(ctx) + resp.Schema = schema.ConsoleGroupV2ResourceSchema(ctx) } func (r *GroupV2Resource) Configure(ctx context.Context, req resource.ConfigureRequest, resp *resource.ConfigureResponse) { @@ -68,7 +68,7 @@ func (r *GroupV2Resource) Configure(ctx context.Context, req resource.ConfigureR } func (r *GroupV2Resource) Create(ctx context.Context, req resource.CreateRequest, resp *resource.CreateResponse) { - var data schema.GroupV2Model + var data schema.ConsoleGroupV2Model // Read Terraform plan data into the model resp.Diagnostics.Append(req.Plan.Get(ctx, &data)...) @@ -95,7 +95,7 @@ func (r *GroupV2Resource) Create(ctx context.Context, req resource.CreateRequest tflog.Debug(ctx, fmt.Sprintf("Group created with result: %s", apply.UpsertResult)) - var consoleRes = model.GroupConsoleResource{} + var consoleRes = console.GroupConsoleResource{} err = consoleRes.FromRawJsonInterface(apply.Resource) if err != nil { resp.Diagnostics.AddError("Unmarshall Error", fmt.Sprintf("Response resource can't be cast as group : %v, got error: %s", apply.Resource, err)) @@ -114,7 +114,7 @@ func (r *GroupV2Resource) Create(ctx context.Context, req resource.CreateRequest } func (r *GroupV2Resource) Read(ctx context.Context, req resource.ReadRequest, resp *resource.ReadResponse) { - var data schema.GroupV2Model + var data schema.ConsoleGroupV2Model // Read Terraform prior state data into the model resp.Diagnostics.Append(req.State.Get(ctx, &data)...) 
@@ -136,7 +136,7 @@ func (r *GroupV2Resource) Read(ctx context.Context, req resource.ReadRequest, re return } - var consoleRes = model.GroupConsoleResource{} + var consoleRes = console.GroupConsoleResource{} err = jsoniter.Unmarshal(get, &consoleRes) if err != nil { resp.Diagnostics.AddError("Parsing Error", fmt.Sprintf("Unable to read group, got error: %s", err)) @@ -155,7 +155,7 @@ func (r *GroupV2Resource) Read(ctx context.Context, req resource.ReadRequest, re } func (r *GroupV2Resource) Update(ctx context.Context, req resource.UpdateRequest, resp *resource.UpdateResponse) { - var data schema.GroupV2Model + var data schema.ConsoleGroupV2Model // Read Terraform plan data into the model resp.Diagnostics.Append(req.Plan.Get(ctx, &data)...) @@ -181,7 +181,7 @@ func (r *GroupV2Resource) Update(ctx context.Context, req resource.UpdateRequest } tflog.Debug(ctx, fmt.Sprintf("Group updated with result: %s", apply)) - var consoleRes = model.GroupConsoleResource{} + var consoleRes = console.GroupConsoleResource{} err = consoleRes.FromRawJsonInterface(apply.Resource) if err != nil { resp.Diagnostics.AddError("Unmarshall Error", fmt.Sprintf("Response resource can't be cast as group : %v, got error: %s", apply.Resource, err)) @@ -199,7 +199,7 @@ func (r *GroupV2Resource) Update(ctx context.Context, req resource.UpdateRequest } func (r *GroupV2Resource) Delete(ctx context.Context, req resource.DeleteRequest, resp *resource.DeleteResponse) { - var data schema.GroupV2Model + var data schema.ConsoleGroupV2Model // Read Terraform prior state data into the model resp.Diagnostics.Append(req.State.Get(ctx, &data)...) diff --git a/internal/provider/group_v2_resource_test.go b/internal/provider/console_group_v2_resource_test.go similarity index 61% rename from internal/provider/group_v2_resource_test.go rename to internal/provider/console_group_v2_resource_test.go index 40f93c3..60a55fa 100644 --- a/internal/provider/group_v2_resource_test.go +++ b/internal/provider/console_group_v2_resource_test.go @@ -10,14 +10,14 @@ import ( func TestAccGroupV2Resource(t *testing.T) { test.CheckEnterpriseEnabled(t) - resourceRef := "conduktor_group_v2.test" + resourceRef := "conduktor_console_group_v2.test" resource.Test(t, resource.TestCase{ PreCheck: func() { test.TestAccPreCheck(t) }, ProtoV6ProviderFactories: testAccProtoV6ProviderFactories, Steps: []resource.TestStep{ // Create and Read testing { - Config: providerConfigConsole + test.TestAccTestdata(t, "group_v2_resource_create.tf"), + Config: providerConfigConsole + test.TestAccTestdata(t, "console_group_v2_resource_create.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(resourceRef, "name", "sales"), resource.TestCheckResourceAttr(resourceRef, "spec.display_name", "Sales Department"), @@ -41,7 +41,7 @@ func TestAccGroupV2Resource(t *testing.T) { }, // Update and Read testing { - Config: providerConfigConsole + test.TestAccTestdata(t, "group_v2_resource_update.tf"), + Config: providerConfigConsole + test.TestAccTestdata(t, "console_group_v2_resource_update.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(resourceRef, "name", "sales"), resource.TestCheckResourceAttr(resourceRef, "spec.display_name", "New Sales Department"), @@ -75,14 +75,14 @@ func TestAccGroupV2Minimal(t *testing.T) { Steps: []resource.TestStep{ // Create and Read from minimal example { - Config: providerConfigConsole + test.TestAccTestdata(t, "group_v2_resource_minimal.tf"), + Config: providerConfigConsole + 
test.TestAccTestdata(t, "console_group_v2_resource_minimal.tf"), Check: resource.ComposeAggregateTestCheckFunc( - resource.TestCheckResourceAttr("conduktor_group_v2.minimal", "name", "minimal"), - resource.TestCheckResourceAttr("conduktor_group_v2.minimal", "spec.display_name", "Minimal"), - resource.TestCheckResourceAttr("conduktor_group_v2.minimal", "spec.external_groups.#", "0"), - resource.TestCheckResourceAttr("conduktor_group_v2.minimal", "spec.members.#", "0"), - resource.TestCheckResourceAttr("conduktor_group_v2.minimal", "spec.members_from_external_groups.#", "0"), - resource.TestCheckResourceAttr("conduktor_group_v2.minimal", "spec.permissions.#", "0"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.minimal", "name", "minimal"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.minimal", "spec.display_name", "Minimal"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.minimal", "spec.external_groups.#", "0"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.minimal", "spec.members.#", "0"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.minimal", "spec.members_from_external_groups.#", "0"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.minimal", "spec.permissions.#", "0"), ), }, // Delete testing automatically occurs in TestCase @@ -99,30 +99,30 @@ func TestAccGroupV2ExampleResource(t *testing.T) { Steps: []resource.TestStep{ // Create and Read from simple example { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_group_v2", "simple.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_group_v2", "simple.tf"), Check: resource.ComposeAggregateTestCheckFunc( - resource.TestCheckResourceAttr("conduktor_group_v2.example", "name", "simple-group"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.display_name", "Simple Group"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.description", "Simple group description"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.external_groups.#", "0"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.members.#", "0"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.members_from_external_groups.#", "0"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.permissions.#", "0"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "name", "simple-group"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.display_name", "Simple Group"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.description", "Simple group description"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.external_groups.#", "0"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.members.#", "0"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.members_from_external_groups.#", "0"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.permissions.#", "0"), ), }, // Create and Read from complex example { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_group_v2", "complex.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_group_v2", "complex.tf"), Check: resource.ComposeAggregateTestCheckFunc( - resource.TestCheckResourceAttr("conduktor_group_v2.example", "name", 
"complex-group"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.display_name", "Complex group"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.description", "Complex group description"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.external_groups.#", "1"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.external_groups.0", "sso-group1"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.members.#", "1"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.members.0", "user1@company.com"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.members_from_external_groups.#", "0"), - resource.TestCheckResourceAttr("conduktor_group_v2.example", "spec.permissions.#", "2"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "name", "complex-group"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.display_name", "Complex group"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.description", "Complex group description"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.external_groups.#", "1"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.external_groups.0", "sso-group1"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.members.#", "1"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.members.0", "user1@company.com"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.members_from_external_groups.#", "0"), + resource.TestCheckResourceAttr("conduktor_console_group_v2.example", "spec.permissions.#", "2"), ), }, }, diff --git a/internal/provider/kafka_cluster_v2_resource.go b/internal/provider/console_kafka_cluster_v2_resource.go similarity index 92% rename from internal/provider/kafka_cluster_v2_resource.go rename to internal/provider/console_kafka_cluster_v2_resource.go index a040250..70c1db4 100644 --- a/internal/provider/kafka_cluster_v2_resource.go +++ b/internal/provider/console_kafka_cluster_v2_resource.go @@ -4,9 +4,9 @@ import ( "context" "fmt" "github.com/conduktor/terraform-provider-conduktor/internal/client" - mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper/kafka_cluster_v2" - "github.com/conduktor/terraform-provider-conduktor/internal/model" - schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_kafka_cluster_v2" + mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper/console_kafka_cluster_v2" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" + schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_kafka_cluster_v2" "github.com/hashicorp/terraform-plugin-framework/path" "github.com/hashicorp/terraform-plugin-framework/resource" "github.com/hashicorp/terraform-plugin-log/tflog" @@ -29,11 +29,11 @@ type KafkaClusterV2Resource struct { } func (r *KafkaClusterV2Resource) Metadata(_ context.Context, req resource.MetadataRequest, resp *resource.MetadataResponse) { - resp.TypeName = req.ProviderTypeName + "_kafka_cluster_v2" + resp.TypeName = req.ProviderTypeName + "_console_kafka_cluster_v2" } func (r *KafkaClusterV2Resource) Schema(ctx context.Context, _ resource.SchemaRequest, resp *resource.SchemaResponse) { - resp.Schema = schema.KafkaClusterV2ResourceSchema(ctx) + resp.Schema = 
schema.ConsoleKafkaClusterV2ResourceSchema(ctx) } func (r *KafkaClusterV2Resource) Configure(_ context.Context, req resource.ConfigureRequest, resp *resource.ConfigureResponse) { @@ -67,7 +67,7 @@ func (r *KafkaClusterV2Resource) Configure(_ context.Context, req resource.Confi } func (r *KafkaClusterV2Resource) Create(ctx context.Context, req resource.CreateRequest, resp *resource.CreateResponse) { - var data schema.KafkaClusterV2Model + var data schema.ConsoleKafkaClusterV2Model // Read Terraform plan data into the model resp.Diagnostics.Append(req.Plan.Get(ctx, &data)...) @@ -94,7 +94,7 @@ func (r *KafkaClusterV2Resource) Create(ctx context.Context, req resource.Create tflog.Debug(ctx, fmt.Sprintf("Kafka cluster created with result: %s", apply.UpsertResult)) - var consoleRes = model.KafkaClusterResource{} + var consoleRes = console.KafkaClusterResource{} err = consoleRes.FromRawJsonInterface(apply.Resource) if err != nil { resp.Diagnostics.AddError("Unmarshall Error", fmt.Sprintf("Response resource can't be cast as kafka cluster : %v, got error: %s", apply.Resource, err)) @@ -113,7 +113,7 @@ func (r *KafkaClusterV2Resource) Create(ctx context.Context, req resource.Create } func (r *KafkaClusterV2Resource) Read(ctx context.Context, req resource.ReadRequest, resp *resource.ReadResponse) { - var data schema.KafkaClusterV2Model + var data schema.ConsoleKafkaClusterV2Model // Read Terraform prior state data into the model resp.Diagnostics.Append(req.State.Get(ctx, &data)...) @@ -135,7 +135,7 @@ func (r *KafkaClusterV2Resource) Read(ctx context.Context, req resource.ReadRequ return } - var consoleRes = model.KafkaClusterResource{} + var consoleRes = console.KafkaClusterResource{} err = jsoniter.Unmarshal(get, &consoleRes) if err != nil { resp.Diagnostics.AddError("Parsing Error", fmt.Sprintf("Unable to read kafka cluster, got error: %s", err)) @@ -154,7 +154,7 @@ func (r *KafkaClusterV2Resource) Read(ctx context.Context, req resource.ReadRequ } func (r *KafkaClusterV2Resource) Update(ctx context.Context, req resource.UpdateRequest, resp *resource.UpdateResponse) { - var data schema.KafkaClusterV2Model + var data schema.ConsoleKafkaClusterV2Model // Read Terraform plan data into the model resp.Diagnostics.Append(req.Plan.Get(ctx, &data)...) @@ -180,7 +180,7 @@ func (r *KafkaClusterV2Resource) Update(ctx context.Context, req resource.Update } tflog.Debug(ctx, fmt.Sprintf("Kafka cluster updated with result: %s", apply)) - var consoleRes = model.KafkaClusterResource{} + var consoleRes = console.KafkaClusterResource{} err = consoleRes.FromRawJsonInterface(apply.Resource) if err != nil { resp.Diagnostics.AddError("Unmarshall Error", fmt.Sprintf("Response resource can't be cast as kafka cluster : %v, got error: %s", apply.Resource, err)) @@ -198,7 +198,7 @@ func (r *KafkaClusterV2Resource) Update(ctx context.Context, req resource.Update } func (r *KafkaClusterV2Resource) Delete(ctx context.Context, req resource.DeleteRequest, resp *resource.DeleteResponse) { - var data schema.KafkaClusterV2Model + var data schema.ConsoleKafkaClusterV2Model // Read Terraform prior state data into the model resp.Diagnostics.Append(req.State.Get(ctx, &data)...) 
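Note (not part of the patch): the hunks above only change the Terraform type name and the generated schema symbols; the resource behaviour is untouched. As a quick illustration of what the rename means at the plugin-framework level, here is a minimal unit-test sketch. It assumes it sits in the `internal/provider` package of this repository (package name `provider`, a hypothetical new test file) and relies only on the `Metadata` signature shown above.

```go
package provider

import (
	"context"
	"testing"

	"github.com/hashicorp/terraform-plugin-framework/resource"
)

// Illustrative only: checks the Terraform type name produced by the renamed
// Metadata implementation above, assuming this file lives in internal/provider.
func TestConsoleKafkaClusterV2TypeName(t *testing.T) {
	r := &KafkaClusterV2Resource{}
	resp := resource.MetadataResponse{}

	// Metadata concatenates the provider type name with the new suffix.
	r.Metadata(context.Background(), resource.MetadataRequest{ProviderTypeName: "conduktor"}, &resp)

	if resp.TypeName != "conduktor_console_kafka_cluster_v2" {
		t.Fatalf("unexpected type name: %s", resp.TypeName)
	}
}
```

Because this is a type rename rather than a schema change, existing configurations would simply switch to the new `conduktor_console_kafka_cluster_v2` name; state migration is outside the scope of this sketch.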
diff --git a/internal/provider/kafka_cluster_v2_resource_test.go b/internal/provider/console_kafka_cluster_v2_resource_test.go similarity index 92% rename from internal/provider/kafka_cluster_v2_resource_test.go rename to internal/provider/console_kafka_cluster_v2_resource_test.go index 62fb394..329713c 100644 --- a/internal/provider/kafka_cluster_v2_resource_test.go +++ b/internal/provider/console_kafka_cluster_v2_resource_test.go @@ -9,14 +9,14 @@ import ( ) func TestAccKafkaClusterV2Resource(t *testing.T) { - resourceRef := "conduktor_kafka_cluster_v2.test" + resourceRef := "conduktor_console_kafka_cluster_v2.test" resource.Test(t, resource.TestCase{ PreCheck: func() { test.TestAccPreCheck(t) }, ProtoV6ProviderFactories: testAccProtoV6ProviderFactories, Steps: []resource.TestStep{ // Create and Read testing { - Config: providerConfigConsole + test.TestAccTestdata(t, "kafka_cluster_v2_resource_create.tf"), + Config: providerConfigConsole + test.TestAccTestdata(t, "console_kafka_cluster_v2_resource_create.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(resourceRef, "name", "test-cluster"), resource.TestCheckResourceAttr(resourceRef, "labels.%", "1"), @@ -51,7 +51,7 @@ func TestAccKafkaClusterV2Resource(t *testing.T) { }, // Update and Read testing { - Config: providerConfigConsole + test.TestAccTestdata(t, "kafka_cluster_v2_resource_update.tf"), + Config: providerConfigConsole + test.TestAccTestdata(t, "console_kafka_cluster_v2_resource_update.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(resourceRef, "name", "test-cluster"), resource.TestCheckResourceAttr(resourceRef, "labels.%", "2"), @@ -84,14 +84,14 @@ func TestAccKafkaClusterV2Resource(t *testing.T) { func TestAccKafkaClusterV2Minimal(t *testing.T) { test.CheckEnterpriseEnabled(t) - resourceRef := "conduktor_kafka_cluster_v2.minimal" + resourceRef := "conduktor_console_kafka_cluster_v2.minimal" resource.Test(t, resource.TestCase{ PreCheck: func() { test.TestAccPreCheck(t) }, ProtoV6ProviderFactories: testAccProtoV6ProviderFactories, Steps: []resource.TestStep{ // Create and Read from minimal example { - Config: providerConfigConsole + test.TestAccTestdata(t, "kafka_cluster_v2_resource_minimal.tf"), + Config: providerConfigConsole + test.TestAccTestdata(t, "console_kafka_cluster_v2_resource_minimal.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(resourceRef, "name", "mini-cluster"), resource.TestCheckResourceAttr(resourceRef, "spec.display_name", "Minimal Cluster"), @@ -109,11 +109,11 @@ func TestAccKafkaClusterV2Minimal(t *testing.T) { func TestAccKafkaClusterV2ExampleResource(t *testing.T) { test.CheckEnterpriseEnabled(t) - var simpleResourceRef = "conduktor_kafka_cluster_v2.simple" - var gatewayResourceRef = "conduktor_kafka_cluster_v2.gateway" - var aivenResourceRef = "conduktor_kafka_cluster_v2.aiven" - var awsResourceRef = "conduktor_kafka_cluster_v2.aws_msk" - var confluentResourceRef = "conduktor_kafka_cluster_v2.confluent" + var simpleResourceRef = "conduktor_console_kafka_cluster_v2.simple" + var gatewayResourceRef = "conduktor_console_kafka_cluster_v2.gateway" + var aivenResourceRef = "conduktor_console_kafka_cluster_v2.aiven" + var awsResourceRef = "conduktor_console_kafka_cluster_v2.aws_msk" + var confluentResourceRef = "conduktor_console_kafka_cluster_v2.confluent" resource.Test(t, resource.TestCase{ PreCheck: func() { test.TestAccPreCheck(t) }, @@ -122,7 +122,7 @@ func TestAccKafkaClusterV2ExampleResource(t 
*testing.T) { Steps: []resource.TestStep{ // Create and Read from simple example { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_kafka_cluster_v2", "simple.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_kafka_cluster_v2", "simple.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(simpleResourceRef, "name", "simple-cluster"), resource.TestCheckResourceAttr(simpleResourceRef, "labels.%", "0"), @@ -133,7 +133,7 @@ func TestAccKafkaClusterV2ExampleResource(t *testing.T) { ), }, { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_kafka_cluster_v2", "gateway.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_kafka_cluster_v2", "gateway.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(gatewayResourceRef, "name", "gateway-cluster"), resource.TestCheckResourceAttr(gatewayResourceRef, "labels.%", "1"), @@ -154,7 +154,7 @@ func TestAccKafkaClusterV2ExampleResource(t *testing.T) { ), }, { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_kafka_cluster_v2", "aiven.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_kafka_cluster_v2", "aiven.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(aivenResourceRef, "name", "aiven-cluster"), resource.TestCheckResourceAttr(aivenResourceRef, "labels.%", "1"), @@ -175,7 +175,7 @@ func TestAccKafkaClusterV2ExampleResource(t *testing.T) { ), }, { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_kafka_cluster_v2", "aws_msk.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_kafka_cluster_v2", "aws_msk.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(awsResourceRef, "name", "aws-cluster"), resource.TestCheckResourceAttr(awsResourceRef, "labels.%", "1"), @@ -192,7 +192,7 @@ func TestAccKafkaClusterV2ExampleResource(t *testing.T) { ), }, { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_kafka_cluster_v2", "confluent.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_kafka_cluster_v2", "confluent.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(confluentResourceRef, "name", "confluent-cluster"), resource.TestCheckResourceAttr(confluentResourceRef, "labels.%", "1"), diff --git a/internal/provider/kafka_connect_v2_resource.go b/internal/provider/console_kafka_connect_v2_resource.go similarity index 93% rename from internal/provider/kafka_connect_v2_resource.go rename to internal/provider/console_kafka_connect_v2_resource.go index de368fd..ff85e03 100644 --- a/internal/provider/kafka_connect_v2_resource.go +++ b/internal/provider/console_kafka_connect_v2_resource.go @@ -4,9 +4,9 @@ import ( "context" "fmt" "github.com/conduktor/terraform-provider-conduktor/internal/client" - mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper/kafka_connect_v2" - "github.com/conduktor/terraform-provider-conduktor/internal/model" - schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_kafka_connect_v2" + mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper/console_kafka_connect_v2" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" + schema 
"github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_kafka_connect_v2" "github.com/hashicorp/terraform-plugin-framework/path" "github.com/hashicorp/terraform-plugin-framework/resource" "github.com/hashicorp/terraform-plugin-log/tflog" @@ -36,11 +36,11 @@ type KafkaConnectV2Resource struct { } func (r *KafkaConnectV2Resource) Metadata(_ context.Context, req resource.MetadataRequest, resp *resource.MetadataResponse) { - resp.TypeName = req.ProviderTypeName + "_kafka_connect_v2" + resp.TypeName = req.ProviderTypeName + "_console_kafka_connect_v2" } func (r *KafkaConnectV2Resource) Schema(ctx context.Context, _ resource.SchemaRequest, resp *resource.SchemaResponse) { - resp.Schema = schema.KafkaConnectV2ResourceSchema(ctx) + resp.Schema = schema.ConsoleKafkaConnectV2ResourceSchema(ctx) } func (r *KafkaConnectV2Resource) Configure(_ context.Context, req resource.ConfigureRequest, resp *resource.ConfigureResponse) { @@ -74,7 +74,7 @@ func (r *KafkaConnectV2Resource) Configure(_ context.Context, req resource.Confi } func (r *KafkaConnectV2Resource) Create(ctx context.Context, req resource.CreateRequest, resp *resource.CreateResponse) { - var data schema.KafkaConnectV2Model + var data schema.ConsoleKafkaConnectV2Model // Read Terraform plan data into the model resp.Diagnostics.Append(req.Plan.Get(ctx, &data)...) @@ -101,7 +101,7 @@ func (r *KafkaConnectV2Resource) Create(ctx context.Context, req resource.Create tflog.Debug(ctx, fmt.Sprintf("Kafka connect server created with result: %s", apply.UpsertResult)) - var consoleRes = model.KafkaConnectResource{} + var consoleRes = console.KafkaConnectResource{} err = consoleRes.FromRawJsonInterface(apply.Resource) if err != nil { resp.Diagnostics.AddError("Unmarshall Error", fmt.Sprintf("Response resource can't be cast as kafka connect server : %v, got error: %s", apply.Resource, err)) @@ -120,7 +120,7 @@ func (r *KafkaConnectV2Resource) Create(ctx context.Context, req resource.Create } func (r *KafkaConnectV2Resource) Read(ctx context.Context, req resource.ReadRequest, resp *resource.ReadResponse) { - var data schema.KafkaConnectV2Model + var data schema.ConsoleKafkaConnectV2Model // Read Terraform prior state data into the model resp.Diagnostics.Append(req.State.Get(ctx, &data)...) @@ -142,7 +142,7 @@ func (r *KafkaConnectV2Resource) Read(ctx context.Context, req resource.ReadRequ return } - var consoleRes = model.KafkaConnectResource{} + var consoleRes = console.KafkaConnectResource{} err = jsoniter.Unmarshal(get, &consoleRes) if err != nil { resp.Diagnostics.AddError("Parsing Error", fmt.Sprintf("Unable to read kafka connect server, got error: %s", err)) @@ -161,7 +161,7 @@ func (r *KafkaConnectV2Resource) Read(ctx context.Context, req resource.ReadRequ } func (r *KafkaConnectV2Resource) Update(ctx context.Context, req resource.UpdateRequest, resp *resource.UpdateResponse) { - var data schema.KafkaConnectV2Model + var data schema.ConsoleKafkaConnectV2Model // Read Terraform plan data into the model resp.Diagnostics.Append(req.Plan.Get(ctx, &data)...) 
@@ -187,7 +187,7 @@ func (r *KafkaConnectV2Resource) Update(ctx context.Context, req resource.Update } tflog.Debug(ctx, fmt.Sprintf("Kafka connect server updated with result: %s", apply)) - var consoleRes = model.KafkaConnectResource{} + var consoleRes = console.KafkaConnectResource{} err = consoleRes.FromRawJsonInterface(apply.Resource) if err != nil { resp.Diagnostics.AddError("Unmarshall Error", fmt.Sprintf("Response resource can't be cast as kafka connect server : %v, got error: %s", apply.Resource, err)) @@ -205,7 +205,7 @@ func (r *KafkaConnectV2Resource) Update(ctx context.Context, req resource.Update } func (r *KafkaConnectV2Resource) Delete(ctx context.Context, req resource.DeleteRequest, resp *resource.DeleteResponse) { - var data schema.KafkaConnectV2Model + var data schema.ConsoleKafkaConnectV2Model // Read Terraform prior state data into the model resp.Diagnostics.Append(req.State.Get(ctx, &data)...) diff --git a/internal/provider/kafka_connect_v2_resource_test.go b/internal/provider/console_kafka_connect_v2_resource_test.go similarity index 91% rename from internal/provider/kafka_connect_v2_resource_test.go rename to internal/provider/console_kafka_connect_v2_resource_test.go index d435dc4..c7d40ca 100644 --- a/internal/provider/kafka_connect_v2_resource_test.go +++ b/internal/provider/console_kafka_connect_v2_resource_test.go @@ -9,14 +9,14 @@ import ( ) func TestAccKafkaConnectV2Resource(t *testing.T) { - resourceRef := "conduktor_kafka_connect_v2.test" + resourceRef := "conduktor_console_kafka_connect_v2.test" resource.Test(t, resource.TestCase{ PreCheck: func() { test.TestAccPreCheck(t) }, ProtoV6ProviderFactories: testAccProtoV6ProviderFactories, Steps: []resource.TestStep{ // Create and Read testing { - Config: providerConfigConsole + test.TestAccTestdata(t, "kafka_connect_v2_resource_create.tf"), + Config: providerConfigConsole + test.TestAccTestdata(t, "console_kafka_connect_v2_resource_create.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(resourceRef, "name", "test-connect"), resource.TestCheckResourceAttr(resourceRef, "cluster", "mini-cluster"), @@ -42,7 +42,7 @@ func TestAccKafkaConnectV2Resource(t *testing.T) { }, // Update and Read testing { - Config: providerConfigConsole + test.TestAccTestdata(t, "kafka_connect_v2_resource_update.tf"), + Config: providerConfigConsole + test.TestAccTestdata(t, "console_kafka_connect_v2_resource_update.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(resourceRef, "name", "test-connect"), resource.TestCheckResourceAttr(resourceRef, "cluster", "mini-cluster"), @@ -68,14 +68,14 @@ func TestAccKafkaConnectV2Resource(t *testing.T) { func TestAccKafkaConnectV2Minimal(t *testing.T) { test.CheckEnterpriseEnabled(t) - resourceRef := "conduktor_kafka_connect_v2.minimal" + resourceRef := "conduktor_console_kafka_connect_v2.minimal" resource.Test(t, resource.TestCase{ PreCheck: func() { test.TestAccPreCheck(t) }, ProtoV6ProviderFactories: testAccProtoV6ProviderFactories, Steps: []resource.TestStep{ // Create and Read from minimal example { - Config: providerConfigConsole + test.TestAccTestdata(t, "kafka_connect_v2_resource_minimal.tf"), + Config: providerConfigConsole + test.TestAccTestdata(t, "console_kafka_connect_v2_resource_minimal.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(resourceRef, "name", "minimal-connect"), resource.TestCheckResourceAttr(resourceRef, "cluster", "mini-cluster"), @@ -95,10 +95,10 @@ func 
TestAccKafkaConnectV2Minimal(t *testing.T) { func TestAccKafkaConnectV2ExampleResource(t *testing.T) { test.CheckEnterpriseEnabled(t) - var simpleResourceRef = "conduktor_kafka_connect_v2.simple" - var mtlsResourceRef = "conduktor_kafka_connect_v2.mtls" - var basicResourceRef = "conduktor_kafka_connect_v2.basic" - var bearerResourceRef = "conduktor_kafka_connect_v2.bearer" + var simpleResourceRef = "conduktor_console_kafka_connect_v2.simple" + var mtlsResourceRef = "conduktor_console_kafka_connect_v2.mtls" + var basicResourceRef = "conduktor_console_kafka_connect_v2.basic" + var bearerResourceRef = "conduktor_console_kafka_connect_v2.bearer" resource.Test(t, resource.TestCase{ PreCheck: func() { test.TestAccPreCheck(t) }, @@ -107,7 +107,7 @@ func TestAccKafkaConnectV2ExampleResource(t *testing.T) { Steps: []resource.TestStep{ // Create and Read from simple example { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_kafka_connect_v2", "simple.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_kafka_connect_v2", "simple.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(simpleResourceRef, "name", "simple-connect"), resource.TestCheckResourceAttr(simpleResourceRef, "cluster", "mini-cluster"), @@ -119,7 +119,7 @@ func TestAccKafkaConnectV2ExampleResource(t *testing.T) { ), }, { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_kafka_connect_v2", "mtls.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_kafka_connect_v2", "mtls.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(mtlsResourceRef, "name", "mtls-connect"), resource.TestCheckResourceAttr(mtlsResourceRef, "cluster", "mini-cluster"), @@ -137,7 +137,7 @@ func TestAccKafkaConnectV2ExampleResource(t *testing.T) { ), }, { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_kafka_connect_v2", "basicAuth.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_kafka_connect_v2", "basicAuth.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(basicResourceRef, "name", "basic-connect"), resource.TestCheckResourceAttr(basicResourceRef, "cluster", "mini-cluster"), @@ -157,7 +157,7 @@ func TestAccKafkaConnectV2ExampleResource(t *testing.T) { ), }, { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_kafka_connect_v2", "bearerToken.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_kafka_connect_v2", "bearerToken.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(bearerResourceRef, "name", "bearer-connect"), resource.TestCheckResourceAttr(bearerResourceRef, "cluster", "mini-cluster"), diff --git a/internal/provider/user_v2_resource.go b/internal/provider/console_user_v2_resource.go similarity index 93% rename from internal/provider/user_v2_resource.go rename to internal/provider/console_user_v2_resource.go index 3cf22ed..706571b 100644 --- a/internal/provider/user_v2_resource.go +++ b/internal/provider/console_user_v2_resource.go @@ -6,9 +6,9 @@ import ( "fmt" "github.com/conduktor/terraform-provider-conduktor/internal/client" - mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper/user_v2" - "github.com/conduktor/terraform-provider-conduktor/internal/model" - schema 
"github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_user_v2" + mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper/console_user_v2" + console "github.com/conduktor/terraform-provider-conduktor/internal/model/console" + schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_user_v2" "github.com/hashicorp/terraform-plugin-framework/path" "github.com/hashicorp/terraform-plugin-framework/resource" "github.com/hashicorp/terraform-plugin-log/tflog" @@ -30,11 +30,11 @@ type UserV2Resource struct { } func (r *UserV2Resource) Metadata(ctx context.Context, req resource.MetadataRequest, resp *resource.MetadataResponse) { - resp.TypeName = req.ProviderTypeName + "_user_v2" + resp.TypeName = req.ProviderTypeName + "_console_user_v2" } func (r *UserV2Resource) Schema(ctx context.Context, req resource.SchemaRequest, resp *resource.SchemaResponse) { - resp.Schema = schema.UserV2ResourceSchema(ctx) + resp.Schema = schema.ConsoleUserV2ResourceSchema(ctx) } func (r *UserV2Resource) Configure(ctx context.Context, req resource.ConfigureRequest, resp *resource.ConfigureResponse) { @@ -68,7 +68,7 @@ func (r *UserV2Resource) Configure(ctx context.Context, req resource.ConfigureRe } func (r *UserV2Resource) Create(ctx context.Context, req resource.CreateRequest, resp *resource.CreateResponse) { - var data schema.UserV2Model + var data schema.ConsoleUserV2Model // Read Terraform plan data into the model resp.Diagnostics.Append(req.Plan.Get(ctx, &data)...) @@ -94,7 +94,7 @@ func (r *UserV2Resource) Create(ctx context.Context, req resource.CreateRequest, tflog.Debug(ctx, fmt.Sprintf("User created with result: %s", apply)) - var consoleRes model.UserConsoleResource + var consoleRes console.UserConsoleResource err = consoleRes.FromRawJsonInterface(apply.Resource) if err != nil { resp.Diagnostics.AddError("Unmarshall Error", fmt.Sprintf("Response resource can't be cast as group : %v, got error: %s", apply.Resource, err)) @@ -113,7 +113,7 @@ func (r *UserV2Resource) Create(ctx context.Context, req resource.CreateRequest, } func (r *UserV2Resource) Read(ctx context.Context, req resource.ReadRequest, resp *resource.ReadResponse) { - var data schema.UserV2Model + var data schema.ConsoleUserV2Model // Read Terraform prior state data into the model resp.Diagnostics.Append(req.State.Get(ctx, &data)...) @@ -135,7 +135,7 @@ func (r *UserV2Resource) Read(ctx context.Context, req resource.ReadRequest, res return } - var consoleRes = model.UserConsoleResource{} + var consoleRes = console.UserConsoleResource{} err = json.Unmarshal(get, &consoleRes) if err != nil { resp.Diagnostics.AddError("Client Error", fmt.Sprintf("Unable to read user, got error: %s", err)) @@ -154,7 +154,7 @@ func (r *UserV2Resource) Read(ctx context.Context, req resource.ReadRequest, res } func (r *UserV2Resource) Update(ctx context.Context, req resource.UpdateRequest, resp *resource.UpdateResponse) { - var data schema.UserV2Model + var data schema.ConsoleUserV2Model // Read Terraform plan data into the model resp.Diagnostics.Append(req.Plan.Get(ctx, &data)...) 
@@ -180,7 +180,7 @@ func (r *UserV2Resource) Update(ctx context.Context, req resource.UpdateRequest, } tflog.Debug(ctx, fmt.Sprintf("User updated with result: %s", apply)) - var consoleRes model.UserConsoleResource + var consoleRes console.UserConsoleResource err = consoleRes.FromRawJsonInterface(apply.Resource) if err != nil { resp.Diagnostics.AddError("Unmarshall Error", fmt.Sprintf("Response resource can't be cast as group : %v, got error: %s", apply.Resource, err)) @@ -199,7 +199,7 @@ func (r *UserV2Resource) Update(ctx context.Context, req resource.UpdateRequest, } func (r *UserV2Resource) Delete(ctx context.Context, req resource.DeleteRequest, resp *resource.DeleteResponse) { - var data schema.UserV2Model + var data schema.ConsoleUserV2Model // Read Terraform prior state data into the model resp.Diagnostics.Append(req.State.Get(ctx, &data)...) diff --git a/internal/provider/user_v2_resource_test.go b/internal/provider/console_user_v2_resource_test.go similarity index 61% rename from internal/provider/user_v2_resource_test.go rename to internal/provider/console_user_v2_resource_test.go index 6b47aca..d39ab25 100644 --- a/internal/provider/user_v2_resource_test.go +++ b/internal/provider/console_user_v2_resource_test.go @@ -9,7 +9,7 @@ import ( ) func TestAccUserV2Resource(t *testing.T) { - resourceRef := "conduktor_user_v2.test" + resourceRef := "conduktor_console_user_v2.test" resource.Test(t, resource.TestCase{ PreCheck: func() { test.TestAccPreCheck(t) }, ProtoV6ProviderFactories: testAccProtoV6ProviderFactories, @@ -17,7 +17,7 @@ func TestAccUserV2Resource(t *testing.T) { Steps: []resource.TestStep{ // Create and Read testing { - Config: providerConfigConsole + test.TestAccTestdata(t, "user_v2_resource_create.tf"), + Config: providerConfigConsole + test.TestAccTestdata(t, "console_user_v2_resource_create.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(resourceRef, "name", "pam.beesly@dunder.mifflin.com"), resource.TestCheckResourceAttr(resourceRef, "spec.firstname", "Pam"), @@ -40,7 +40,7 @@ func TestAccUserV2Resource(t *testing.T) { }, // Update and Read testing { - Config: providerConfigConsole + test.TestAccTestdata(t, "user_v2_resource_update.tf"), + Config: providerConfigConsole + test.TestAccTestdata(t, "console_user_v2_resource_update.tf"), Check: resource.ComposeAggregateTestCheckFunc( resource.TestCheckResourceAttr(resourceRef, "name", "pam.beesly@dunder.mifflin.com"), resource.TestCheckResourceAttr(resourceRef, "spec.firstname", "Pam"), @@ -67,10 +67,10 @@ func TestAccUserV2Minimal(t *testing.T) { Steps: []resource.TestStep{ // Create and Read from minimal example { - Config: providerConfigConsole + test.TestAccTestdata(t, "user_v2_resource_minimal.tf"), + Config: providerConfigConsole + test.TestAccTestdata(t, "console_user_v2_resource_minimal.tf"), Check: resource.ComposeAggregateTestCheckFunc( - resource.TestCheckResourceAttr("conduktor_user_v2.minimal", "name", "angela.martin@dunder-mifflin.com"), - resource.TestCheckResourceAttr("conduktor_user_v2.minimal", "spec.permissions.#", "0"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.minimal", "name", "angela.martin@dunder-mifflin.com"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.minimal", "spec.permissions.#", "0"), ), }, // Delete testing automatically occurs in TestCase @@ -86,29 +86,29 @@ func TestAccUserV2ExampleResource(t *testing.T) { Steps: []resource.TestStep{ // Create and Read from example { - Config: providerConfigConsole + 
test.TestAccExample(t, "resources", "conduktor_user_v2", "simple.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_user_v2", "simple.tf"), Check: resource.ComposeAggregateTestCheckFunc( - resource.TestCheckResourceAttr("conduktor_user_v2.example", "name", "bob@company.io"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.firstname", "Bob"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.lastname", "Smith"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.permissions.#", "0"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "name", "bob@company.io"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.firstname", "Bob"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.lastname", "Smith"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.permissions.#", "0"), ), }, // Create and Read from example { - Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_user_v2", "complex.tf"), + Config: providerConfigConsole + test.TestAccExample(t, "resources", "conduktor_console_user_v2", "complex.tf"), Check: resource.ComposeAggregateTestCheckFunc( - resource.TestCheckResourceAttr("conduktor_user_v2.example", "name", "bob@company.io"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.firstname", "Bob"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.lastname", "Smith"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.permissions.#", "2"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.permissions.0.resource_type", "TOPIC"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.permissions.0.name", "test-topic"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.permissions.0.pattern_type", "LITERAL"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.permissions.0.cluster", "*"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.permissions.0.permissions.#", "3"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.permissions.1.resource_type", "PLATFORM"), - resource.TestCheckResourceAttr("conduktor_user_v2.example", "spec.permissions.1.permissions.#", "3"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "name", "bob@company.io"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.firstname", "Bob"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.lastname", "Smith"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.permissions.#", "2"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.permissions.0.resource_type", "TOPIC"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.permissions.0.name", "test-topic"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.permissions.0.pattern_type", "LITERAL"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.permissions.0.cluster", "*"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.permissions.0.permissions.#", "3"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.permissions.1.resource_type", "PLATFORM"), + resource.TestCheckResourceAttr("conduktor_console_user_v2.example", "spec.permissions.1.permissions.#", 
"3"), ), }, }, diff --git a/internal/provider/gateway_service_account_v2_resource.go b/internal/provider/gateway_service_account_v2_resource.go index aefa202..1b43a78 100644 --- a/internal/provider/gateway_service_account_v2_resource.go +++ b/internal/provider/gateway_service_account_v2_resource.go @@ -7,7 +7,7 @@ import ( "github.com/conduktor/terraform-provider-conduktor/internal/client" mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper/gateway_service_account_v2" - "github.com/conduktor/terraform-provider-conduktor/internal/model" + gateway "github.com/conduktor/terraform-provider-conduktor/internal/model/gateway" schema "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_gateway_service_account_v2" "github.com/hashicorp/terraform-plugin-framework/path" "github.com/hashicorp/terraform-plugin-framework/resource" @@ -94,7 +94,7 @@ func (r *GatewayServiceAccountV2Resource) Create(ctx context.Context, req resour tflog.Debug(ctx, fmt.Sprintf("Service account created with result: %s", apply)) - var gatewayRes model.GatewayServiceAccountResource + var gatewayRes gateway.GatewayServiceAccountResource err = gatewayRes.FromRawJsonInterface(apply.Resource) if err != nil { resp.Diagnostics.AddError("Unmarshall Error", fmt.Sprintf("Response resource can't be cast as service account : %v, got error: %s", apply.Resource, err)) @@ -141,7 +141,7 @@ func (r *GatewayServiceAccountV2Resource) Read(ctx context.Context, req resource return } - var gatewayRes = []model.GatewayServiceAccountResource{} + var gatewayRes = []gateway.GatewayServiceAccountResource{} err = json.Unmarshal(get, &gatewayRes) if err != nil || len(gatewayRes) < 1 { resp.Diagnostics.AddError("Client Error", fmt.Sprintf("Unable to read service account, got error: %s", err)) @@ -186,7 +186,7 @@ func (r *GatewayServiceAccountV2Resource) Update(ctx context.Context, req resour } tflog.Debug(ctx, fmt.Sprintf("Service account updated with result: %s", apply)) - var gatewayRes model.GatewayServiceAccountResource + var gatewayRes gateway.GatewayServiceAccountResource err = gatewayRes.FromRawJsonInterface(apply.Resource) if err != nil { resp.Diagnostics.AddError("Unmarshall Error", fmt.Sprintf("Response resource can't be cast as service account : %v, got error: %s", apply.Resource, err)) @@ -216,7 +216,7 @@ func (r *GatewayServiceAccountV2Resource) Delete(ctx context.Context, req resour return } - deleteRes := model.GatewayServiceAccountMetadata{ + deleteRes := gateway.GatewayServiceAccountMetadata{ Name: data.Name.ValueString(), VCluster: data.Vcluster.ValueString(), } diff --git a/internal/schema/provider_conduktor/conduktor_provider_gen.go b/internal/schema/provider_conduktor/conduktor_provider_gen.go index 069482e..4cd746a 100644 --- a/internal/schema/provider_conduktor/conduktor_provider_gen.go +++ b/internal/schema/provider_conduktor/conduktor_provider_gen.go @@ -18,8 +18,8 @@ func ConduktorProviderSchema(ctx context.Context) schema.Schema { "admin_password": schema.StringAttribute{ Optional: true, Sensitive: true, - Description: "The password of the admin user. May be set using environment variable `CDK_CONSOLE_PASSWORD` or `CDK_ADMIN_PASSWORD` for Console, `CDK_GATEWAY_PASSWORD` or `CDK_ADMIN_PASSWORD` for Gateway. Required if admin_email is set. If not provided, the API token will be used to authenticater.", - MarkdownDescription: "The password of the admin user. 
May be set using environment variable `CDK_CONSOLE_PASSWORD` or `CDK_ADMIN_PASSWORD` for Console, `CDK_GATEWAY_PASSWORD` or `CDK_ADMIN_PASSWORD` for Gateway. Required if admin_email is set. If not provided, the API token will be used to authenticater.", + Description: "The password of the admin user. May be set using environment variable `CDK_CONSOLE_PASSWORD` or `CDK_ADMIN_PASSWORD` for Console, `CDK_GATEWAY_PASSWORD` or `CDK_ADMIN_PASSWORD` for Gateway. Required if admin_user is set. If not provided, the API token will be used to authenticater.", + MarkdownDescription: "The password of the admin user. May be set using environment variable `CDK_CONSOLE_PASSWORD` or `CDK_ADMIN_PASSWORD` for Console, `CDK_GATEWAY_PASSWORD` or `CDK_ADMIN_PASSWORD` for Gateway. Required if admin_user is set. If not provided, the API token will be used to authenticater.", }, "admin_user": schema.StringAttribute{ Optional: true, diff --git a/internal/schema/resource_group_v2/group_v2_resource_gen.go b/internal/schema/resource_console_group_v2/console_group_v2_resource_gen.go similarity index 99% rename from internal/schema/resource_group_v2/group_v2_resource_gen.go rename to internal/schema/resource_console_group_v2/console_group_v2_resource_gen.go index 2dbba70..5175168 100644 --- a/internal/schema/resource_group_v2/group_v2_resource_gen.go +++ b/internal/schema/resource_console_group_v2/console_group_v2_resource_gen.go @@ -1,6 +1,6 @@ // Code generated by terraform-plugin-framework-generator DO NOT EDIT. -package resource_group_v2 +package resource_console_group_v2 import ( "context" @@ -23,7 +23,7 @@ import ( "github.com/hashicorp/terraform-plugin-framework/resource/schema" ) -func GroupV2ResourceSchema(ctx context.Context) schema.Schema { +func ConsoleGroupV2ResourceSchema(ctx context.Context) schema.Schema { return schema.Schema{ Attributes: map[string]schema.Attribute{ "name": schema.StringAttribute{ @@ -140,7 +140,7 @@ func GroupV2ResourceSchema(ctx context.Context) schema.Schema { } } -type GroupV2Model struct { +type ConsoleGroupV2Model struct { Name types.String `tfsdk:"name"` Spec SpecValue `tfsdk:"spec"` } diff --git a/internal/schema/resource_kafka_cluster_v2/kafka_cluster_v2_resource_gen.go b/internal/schema/resource_console_kafka_cluster_v2/console_kafka_cluster_v2_resource_gen.go similarity index 99% rename from internal/schema/resource_kafka_cluster_v2/kafka_cluster_v2_resource_gen.go rename to internal/schema/resource_console_kafka_cluster_v2/console_kafka_cluster_v2_resource_gen.go index 24efd13..daa3a7c 100644 --- a/internal/schema/resource_kafka_cluster_v2/kafka_cluster_v2_resource_gen.go +++ b/internal/schema/resource_console_kafka_cluster_v2/console_kafka_cluster_v2_resource_gen.go @@ -1,6 +1,6 @@ // Code generated by terraform-plugin-framework-generator DO NOT EDIT. 
-package resource_kafka_cluster_v2 +package resource_console_kafka_cluster_v2 import ( "context" @@ -23,7 +23,7 @@ import ( "github.com/hashicorp/terraform-plugin-framework/resource/schema" ) -func KafkaClusterV2ResourceSchema(ctx context.Context) schema.Schema { +func ConsoleKafkaClusterV2ResourceSchema(ctx context.Context) schema.Schema { return schema.Schema{ Attributes: map[string]schema.Attribute{ "labels": schema.MapAttribute{ @@ -327,7 +327,7 @@ func KafkaClusterV2ResourceSchema(ctx context.Context) schema.Schema { } } -type KafkaClusterV2Model struct { +type ConsoleKafkaClusterV2Model struct { Labels types.Map `tfsdk:"labels"` Name types.String `tfsdk:"name"` Spec SpecValue `tfsdk:"spec"` diff --git a/internal/schema/resource_kafka_connect_v2/kafka_connect_v2_resource_gen.go b/internal/schema/resource_console_kafka_connect_v2/console_kafka_connect_v2_resource_gen.go similarity index 99% rename from internal/schema/resource_kafka_connect_v2/kafka_connect_v2_resource_gen.go rename to internal/schema/resource_console_kafka_connect_v2/console_kafka_connect_v2_resource_gen.go index 8f0371d..d9cb8fb 100644 --- a/internal/schema/resource_kafka_connect_v2/kafka_connect_v2_resource_gen.go +++ b/internal/schema/resource_console_kafka_connect_v2/console_kafka_connect_v2_resource_gen.go @@ -1,6 +1,6 @@ // Code generated by terraform-plugin-framework-generator DO NOT EDIT. -package resource_kafka_connect_v2 +package resource_console_kafka_connect_v2 import ( "context" @@ -22,7 +22,7 @@ import ( "github.com/hashicorp/terraform-plugin-framework/resource/schema" ) -func KafkaConnectV2ResourceSchema(ctx context.Context) schema.Schema { +func ConsoleKafkaConnectV2ResourceSchema(ctx context.Context) schema.Schema { return schema.Schema{ Attributes: map[string]schema.Attribute{ "cluster": schema.StringAttribute{ @@ -140,7 +140,7 @@ func KafkaConnectV2ResourceSchema(ctx context.Context) schema.Schema { } } -type KafkaConnectV2Model struct { +type ConsoleKafkaConnectV2Model struct { Cluster types.String `tfsdk:"cluster"` Labels types.Map `tfsdk:"labels"` Name types.String `tfsdk:"name"` diff --git a/internal/schema/resource_user_v2/user_v2_resource_gen.go b/internal/schema/resource_console_user_v2/console_user_v2_resource_gen.go similarity index 99% rename from internal/schema/resource_user_v2/user_v2_resource_gen.go rename to internal/schema/resource_console_user_v2/console_user_v2_resource_gen.go index 3c93216..d1b8ea6 100644 --- a/internal/schema/resource_user_v2/user_v2_resource_gen.go +++ b/internal/schema/resource_console_user_v2/console_user_v2_resource_gen.go @@ -1,6 +1,6 @@ // Code generated by terraform-plugin-framework-generator DO NOT EDIT. 
-package resource_user_v2 +package resource_console_user_v2 import ( "context" @@ -22,7 +22,7 @@ import ( "github.com/hashicorp/terraform-plugin-framework/resource/schema" ) -func UserV2ResourceSchema(ctx context.Context) schema.Schema { +func ConsoleUserV2ResourceSchema(ctx context.Context) schema.Schema { return schema.Schema{ Attributes: map[string]schema.Attribute{ "name": schema.StringAttribute{ @@ -116,7 +116,7 @@ func UserV2ResourceSchema(ctx context.Context) schema.Schema { } } -type UserV2Model struct { +type ConsoleUserV2Model struct { Name types.String `tfsdk:"name"` Spec SpecValue `tfsdk:"spec"` } diff --git a/internal/schema/schema_utils.go b/internal/schema/schema_utils.go index 61578cc..94444e1 100644 --- a/internal/schema/schema_utils.go +++ b/internal/schema/schema_utils.go @@ -7,8 +7,8 @@ import ( mapper "github.com/conduktor/terraform-provider-conduktor/internal/mapper" "github.com/conduktor/terraform-provider-conduktor/internal/model" - groups "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_group_v2" - users "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_user_v2" + groups "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_group_v2" + users "github.com/conduktor/terraform-provider-conduktor/internal/schema/resource_console_user_v2" "github.com/hashicorp/terraform-plugin-framework/attr" "github.com/hashicorp/terraform-plugin-framework/diag" "github.com/hashicorp/terraform-plugin-framework/types" diff --git a/internal/testdata/group_v2_api.json b/internal/testdata/console_group_v2_api.json similarity index 64% rename from internal/testdata/group_v2_api.json rename to internal/testdata/console_group_v2_api.json index aa33292..0b75731 100644 --- a/internal/testdata/group_v2_api.json +++ b/internal/testdata/console_group_v2_api.json @@ -7,7 +7,9 @@ "spec": { "displayName": "Sales Department", "description": "Sales Department Group", - "externalGroups": ["sales"], + "externalGroups": [ + "sales" + ], "members": [ "jim.halpert@dunder.mifflin.com", "dwight.schrute@dunder.mifflin.com" @@ -16,33 +18,48 @@ "permissions": [ { "resourceType": "PLATFORM", - "permissions": ["groupView", "clusterConnectionsManage"] + "permissions": [ + "groupView", + "clusterConnectionsManage" + ] }, { "resourceType": "CLUSTER", "name": "scranton", - "permissions": ["clusterViewBroker", "clusterEditBroker"] + "permissions": [ + "clusterViewBroker", + "clusterEditBroker" + ] }, { "resourceType": "TOPIC", "name": "sales-*", "patternType": "PREFIXED", "cluster": "scranton", - "permissions": ["topicViewConfig", "topicConsume", "topicProduce"] + "permissions": [ + "topicViewConfig", + "topicConsume", + "topicProduce" + ] }, { "resourceType": "SUBJECT", "name": "sales-*", "patternType": "PREFIXED", "cluster": "scranton", - "permissions": ["subjectView", "subjectEditCompatibility"] + "permissions": [ + "subjectView", + "subjectEditCompatibility" + ] }, { "resourceType": "CONSUMER_GROUP", "name": "sales-*", "patternType": "PREFIXED", "cluster": "scranton", - "permissions": ["consumerGroupView"] + "permissions": [ + "consumerGroupView" + ] }, { "resourceType": "KAFKA_CONNECT", @@ -50,13 +67,18 @@ "patternType": "PREFIXED", "kafkaConnect": "scranton", "cluster": "scranton", - "permissions": ["subjectView", "kafkaConnectorDelete"] + "permissions": [ + "subjectView", + "kafkaConnectorDelete" + ] }, { "resourceType": "KSQLDB", "name": "sales-ksqldb", "cluster": "scranton", - "permissions": ["ksqldbAccess"] + "permissions": [ + 
"ksqldbAccess" + ] } ] } diff --git a/internal/testdata/group_v2_resource_create.tf b/internal/testdata/console_group_v2_resource_create.tf similarity index 87% rename from internal/testdata/group_v2_resource_create.tf rename to internal/testdata/console_group_v2_resource_create.tf index 0849fa4..771362a 100644 --- a/internal/testdata/group_v2_resource_create.tf +++ b/internal/testdata/console_group_v2_resource_create.tf @@ -1,5 +1,5 @@ -resource "conduktor_group_v2" "test" { +resource "conduktor_console_group_v2" "test" { name = "sales" spec { display_name = "Sales Department" diff --git a/internal/testdata/group_v2_resource_minimal.tf b/internal/testdata/console_group_v2_resource_minimal.tf similarity index 57% rename from internal/testdata/group_v2_resource_minimal.tf rename to internal/testdata/console_group_v2_resource_minimal.tf index ad465fe..067869b 100644 --- a/internal/testdata/group_v2_resource_minimal.tf +++ b/internal/testdata/console_group_v2_resource_minimal.tf @@ -1,5 +1,5 @@ -resource "conduktor_group_v2" "minimal" { +resource "conduktor_console_group_v2" "minimal" { name = "minimal" spec { display_name = "Minimal" diff --git a/internal/testdata/group_v2_resource_update.tf b/internal/testdata/console_group_v2_resource_update.tf similarity index 83% rename from internal/testdata/group_v2_resource_update.tf rename to internal/testdata/console_group_v2_resource_update.tf index 3db8fa6..353fd29 100644 --- a/internal/testdata/group_v2_resource_update.tf +++ b/internal/testdata/console_group_v2_resource_update.tf @@ -1,5 +1,5 @@ -resource "conduktor_user_v2" "coworkers1" { +resource "conduktor_console_user_v2" "coworkers1" { name = "michael.scott@dunder.mifflin.com" spec { firstname = "Michael" @@ -8,7 +8,7 @@ resource "conduktor_user_v2" "coworkers1" { } } -resource "conduktor_group_v2" "test" { +resource "conduktor_console_group_v2" "test" { name = "sales" spec { display_name = "New Sales Department" @@ -29,5 +29,5 @@ resource "conduktor_group_v2" "test" { } ] } - depends_on = [conduktor_user_v2.coworkers1] + depends_on = [conduktor_console_user_v2.coworkers1] } diff --git a/internal/testdata/kafka_cluster_v2_aws_api.json b/internal/testdata/console_kafka_cluster_v2_aws_api.json similarity index 99% rename from internal/testdata/kafka_cluster_v2_aws_api.json rename to internal/testdata/console_kafka_cluster_v2_aws_api.json index 354321d..7d57046 100644 --- a/internal/testdata/kafka_cluster_v2_aws_api.json +++ b/internal/testdata/console_kafka_cluster_v2_aws_api.json @@ -14,7 +14,7 @@ "sasl.client.callback.handler.class": "io.conduktor.aws.IAMClientCallbackHandler", "aws_access_key_id": "XXXXXXXXXX", "aws_secret_access_key": "YYYYYYYYYY" - }, + }, "schemaRegistry": { "type": "Glue", "region": "eu-west-1", diff --git a/internal/testdata/kafka_cluster_v2_confluent_api.json b/internal/testdata/console_kafka_cluster_v2_confluent_api.json similarity index 99% rename from internal/testdata/kafka_cluster_v2_confluent_api.json rename to internal/testdata/console_kafka_cluster_v2_confluent_api.json index dae8f0a..620287e 100644 --- a/internal/testdata/kafka_cluster_v2_confluent_api.json +++ b/internal/testdata/console_kafka_cluster_v2_confluent_api.json @@ -23,7 +23,7 @@ "sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"admin\" password=\"admin-secret\";", "sasl.mechanism": "PLAIN", "security.protocol": "SASL_SSL" - }, + }, "schemaRegistry": { "type": "ConfluentLike", "url": "http://localhost:8080", diff --git 
a/internal/testdata/kafka_cluster_v2_minimal_api.json b/internal/testdata/console_kafka_cluster_v2_minimal_api.json similarity index 100% rename from internal/testdata/kafka_cluster_v2_minimal_api.json rename to internal/testdata/console_kafka_cluster_v2_minimal_api.json diff --git a/internal/testdata/kafka_cluster_v2_resource_create.tf b/internal/testdata/console_kafka_cluster_v2_resource_create.tf similarity index 95% rename from internal/testdata/kafka_cluster_v2_resource_create.tf rename to internal/testdata/console_kafka_cluster_v2_resource_create.tf index 382e54f..1455d93 100644 --- a/internal/testdata/kafka_cluster_v2_resource_create.tf +++ b/internal/testdata/console_kafka_cluster_v2_resource_create.tf @@ -1,5 +1,5 @@ -resource "conduktor_kafka_cluster_v2" "test" { +resource "conduktor_console_kafka_cluster_v2" "test" { name = "test-cluster" labels = { "env" = "test" diff --git a/internal/testdata/kafka_cluster_v2_resource_minimal.tf b/internal/testdata/console_kafka_cluster_v2_resource_minimal.tf similarity index 67% rename from internal/testdata/kafka_cluster_v2_resource_minimal.tf rename to internal/testdata/console_kafka_cluster_v2_resource_minimal.tf index 3a80ee2..5810c4d 100644 --- a/internal/testdata/kafka_cluster_v2_resource_minimal.tf +++ b/internal/testdata/console_kafka_cluster_v2_resource_minimal.tf @@ -1,5 +1,5 @@ -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" diff --git a/internal/testdata/kafka_cluster_v2_resource_update.tf b/internal/testdata/console_kafka_cluster_v2_resource_update.tf similarity index 94% rename from internal/testdata/kafka_cluster_v2_resource_update.tf rename to internal/testdata/console_kafka_cluster_v2_resource_update.tf index f5babd6..389ee78 100644 --- a/internal/testdata/kafka_cluster_v2_resource_update.tf +++ b/internal/testdata/console_kafka_cluster_v2_resource_update.tf @@ -1,5 +1,5 @@ -resource "conduktor_kafka_cluster_v2" "test" { +resource "conduktor_console_kafka_cluster_v2" "test" { name = "test-cluster" labels = { "env" = "test" diff --git a/internal/testdata/kafka_connect_v2_api.json b/internal/testdata/console_kafka_connect_v2_api.json similarity index 100% rename from internal/testdata/kafka_connect_v2_api.json rename to internal/testdata/console_kafka_connect_v2_api.json diff --git a/internal/testdata/kafka_connect_v2_resource_create.tf b/internal/testdata/console_kafka_connect_v2_resource_create.tf similarity index 73% rename from internal/testdata/kafka_connect_v2_resource_create.tf rename to internal/testdata/console_kafka_connect_v2_resource_create.tf index 130d795..648c7de 100644 --- a/internal/testdata/kafka_connect_v2_resource_create.tf +++ b/internal/testdata/console_kafka_connect_v2_resource_create.tf @@ -1,5 +1,5 @@ -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" @@ -7,9 +7,9 @@ resource "conduktor_kafka_cluster_v2" "minimal" { } } -resource "conduktor_kafka_connect_v2" "test" { +resource "conduktor_console_kafka_connect_v2" "test" { name = "test-connect" - cluster = conduktor_kafka_cluster_v2.minimal.name + cluster = conduktor_console_kafka_cluster_v2.minimal.name labels = { env = "test" } diff --git a/internal/testdata/kafka_connect_v2_resource_minimal.tf b/internal/testdata/console_kafka_connect_v2_resource_minimal.tf similarity index 58% rename from 
internal/testdata/kafka_connect_v2_resource_minimal.tf rename to internal/testdata/console_kafka_connect_v2_resource_minimal.tf index 1cae818..fc78092 100644 --- a/internal/testdata/kafka_connect_v2_resource_minimal.tf +++ b/internal/testdata/console_kafka_connect_v2_resource_minimal.tf @@ -1,5 +1,5 @@ -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" @@ -7,9 +7,9 @@ resource "conduktor_kafka_cluster_v2" "minimal" { } } -resource "conduktor_kafka_connect_v2" "minimal" { +resource "conduktor_console_kafka_connect_v2" "minimal" { name = "minimal-connect" - cluster = conduktor_kafka_cluster_v2.minimal.name + cluster = conduktor_console_kafka_cluster_v2.minimal.name spec { display_name = "Minimal Connect" urls = "http://localhost:8083" diff --git a/internal/testdata/kafka_connect_v2_resource_update.tf b/internal/testdata/console_kafka_connect_v2_resource_update.tf similarity index 76% rename from internal/testdata/kafka_connect_v2_resource_update.tf rename to internal/testdata/console_kafka_connect_v2_resource_update.tf index 3e21a39..45206e8 100644 --- a/internal/testdata/kafka_connect_v2_resource_update.tf +++ b/internal/testdata/console_kafka_connect_v2_resource_update.tf @@ -1,5 +1,5 @@ -resource "conduktor_kafka_cluster_v2" "minimal" { +resource "conduktor_console_kafka_cluster_v2" "minimal" { name = "mini-cluster" spec { display_name = "Minimal Cluster" @@ -7,9 +7,9 @@ resource "conduktor_kafka_cluster_v2" "minimal" { } } -resource "conduktor_kafka_connect_v2" "test" { +resource "conduktor_console_kafka_connect_v2" "test" { name = "test-connect" - cluster = conduktor_kafka_cluster_v2.minimal.name + cluster = conduktor_console_kafka_cluster_v2.minimal.name labels = { env = "test" security = "C1" diff --git a/internal/testdata/user_v2_api.json b/internal/testdata/console_user_v2_api.json similarity index 63% rename from internal/testdata/user_v2_api.json rename to internal/testdata/console_user_v2_api.json index 1dffa2d..71474c2 100644 --- a/internal/testdata/user_v2_api.json +++ b/internal/testdata/console_user_v2_api.json @@ -10,33 +10,48 @@ "permissions": [ { "resourceType": "PLATFORM", - "permissions": ["userView", "clusterConnectionsManage"] + "permissions": [ + "userView", + "clusterConnectionsManage" + ] }, { "resourceType": "CLUSTER", "name": "scranton", - "permissions": ["clusterViewBroker", "clusterEditBroker"] + "permissions": [ + "clusterViewBroker", + "clusterEditBroker" + ] }, { "resourceType": "TOPIC", "name": "sales-*", "patternType": "PREFIXED", "cluster": "scranton", - "permissions": ["topicViewConfig", "topicConsume", "topicProduce"] + "permissions": [ + "topicViewConfig", + "topicConsume", + "topicProduce" + ] }, { "resourceType": "SUBJECT", "name": "sales-*", "patternType": "PREFIXED", "cluster": "scranton", - "permissions": ["subjectView", "subjectEditCompatibility"] + "permissions": [ + "subjectView", + "subjectEditCompatibility" + ] }, { "resourceType": "CONSUMER_GROUP", "name": "sales-*", "patternType": "PREFIXED", "cluster": "scranton", - "permissions": ["consumerGroupView"] + "permissions": [ + "consumerGroupView" + ] }, { "resourceType": "KAFKA_CONNECT", @@ -44,13 +59,18 @@ "patternType": "PREFIXED", "kafkaConnect": "scranton", "cluster": "scranton", - "permissions": ["subjectView", "kafkaConnectorDelete"] + "permissions": [ + "subjectView", + "kafkaConnectorDelete" + ] }, { "resourceType": "KSQLDB", "name": "sales-ksqldb", "cluster": 
"scranton", - "permissions": ["ksqldbAccess"] + "permissions": [ + "ksqldbAccess" + ] } ] } diff --git a/internal/testdata/user_v2_resource_create.tf b/internal/testdata/console_user_v2_resource_create.tf similarity index 88% rename from internal/testdata/user_v2_resource_create.tf rename to internal/testdata/console_user_v2_resource_create.tf index 1b64f6b..c335178 100644 --- a/internal/testdata/user_v2_resource_create.tf +++ b/internal/testdata/console_user_v2_resource_create.tf @@ -1,5 +1,5 @@ -resource "conduktor_user_v2" "test" { +resource "conduktor_console_user_v2" "test" { name = "pam.beesly@dunder.mifflin.com" spec { firstname = "Pam" diff --git a/internal/testdata/user_v2_resource_minimal.tf b/internal/testdata/console_user_v2_resource_minimal.tf similarity index 55% rename from internal/testdata/user_v2_resource_minimal.tf rename to internal/testdata/console_user_v2_resource_minimal.tf index 0161b7e..e0109e2 100644 --- a/internal/testdata/user_v2_resource_minimal.tf +++ b/internal/testdata/console_user_v2_resource_minimal.tf @@ -1,5 +1,5 @@ -resource "conduktor_user_v2" "minimal" { +resource "conduktor_console_user_v2" "minimal" { name = "angela.martin@dunder-mifflin.com" spec { } diff --git a/internal/testdata/user_v2_resource_update.tf b/internal/testdata/console_user_v2_resource_update.tf similarity index 91% rename from internal/testdata/user_v2_resource_update.tf rename to internal/testdata/console_user_v2_resource_update.tf index 4de2437..92541f3 100644 --- a/internal/testdata/user_v2_resource_update.tf +++ b/internal/testdata/console_user_v2_resource_update.tf @@ -1,5 +1,5 @@ -resource "conduktor_user_v2" "test" { +resource "conduktor_console_user_v2" "test" { name = "pam.beesly@dunder.mifflin.com" spec { firstname = "Pam" diff --git a/provider_code_spec.json b/provider_code_spec.json index a3dd389..492e8b8 100644 --- a/provider_code_spec.json +++ b/provider_code_spec.json @@ -50,7 +50,7 @@ { "name": "admin_password", "string": { - "description": "The password of the admin user. May be set using environment variable `CDK_CONSOLE_PASSWORD` or `CDK_ADMIN_PASSWORD` for Console, `CDK_GATEWAY_PASSWORD` or `CDK_ADMIN_PASSWORD` for Gateway. Required if admin_email is set. If not provided, the API token will be used to authenticater.", + "description": "The password of the admin user. May be set using environment variable `CDK_CONSOLE_PASSWORD` or `CDK_ADMIN_PASSWORD` for Console, `CDK_GATEWAY_PASSWORD` or `CDK_ADMIN_PASSWORD` for Gateway. Required if admin_user is set. 
If not provided, the API token will be used to authenticater.", "optional_required": "optional", "sensitive": true } @@ -89,175 +89,7 @@ "datasources": [], "resources": [ { - "name": "user_v2", - "schema": { - "attributes": [ - { - "name": "name", - "string": { - "description": "User email, must be unique, act as ID for import", - "computed_optional_required": "required", - "plan_modifiers": [ - { - "custom": { - "imports": [ - { - "path": "github.com/hashicorp/terraform-plugin-framework/resource/schema/stringplanmodifier" - } - ], - "schema_definition": "stringplanmodifier.RequiresReplace()" - } - } - ], - "validators": [ - { - "custom": { - "imports": [ - { - "path": "regexp" - }, - { - "path": "github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator" - } - ], - "schema_definition": "stringvalidator.RegexMatches(regexp.MustCompile(\"^([\\\\w\\\\-_.]*[^.])@([\\\\w-]+\\\\.)+[\\\\w-]{2,4}$\"), \"\")" - } - } - ] - } - } - ], - "blocks": [ - { - "name": "spec", - "single_nested": { - "attributes": [ - { - "name": "firstname", - "string": { - "description": "User firstname", - "computed_optional_required": "optional" - } - }, - { - "name": "lastname", - "string": { - "description": "User lastname", - "computed_optional_required": "optional" - } - }, - { - "name": "permissions", - "set_nested": { - "description": "Set of all user permissions", - "computed_optional_required": "computed_optional", - "nested_object": { - "attributes": [ - { - "name": "name", - "string": { - "description": "Name of the resource to apply permission could be a topic, a cluster, a consumer group, etc. depending on resource_type", - "computed_optional_required": "optional" - } - }, - { - "name": "resource_type", - "string": { - "description": "Type of the resource to apply permission on valid values are: CLUSTER, CONSUMER_GROUP, KAFKA_CONNECT, KSQLDB, PLATFORM, SUBJECT, TOPIC", - "computed_optional_required": "required", - "validators": [ - { - "custom": { - "imports": [ - { - "path": "github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator" - }, - { - "path": "github.com/conduktor/terraform-provider-conduktor/internal/schema/validation" - } - ], - "schema_definition": "stringvalidator.OneOf(validation.ValidPermissionTypes...)" - } - } - ] - } - }, - { - "name": "cluster", - "string": { - "description": "Name of the cluster to apply permission, only required if resource_type is TOPIC, SUBJECT, CONSUMER_GROUP, KAFKA_CONNECT, KSQLDB", - "computed_optional_required": "optional" - } - }, - { - "name": "kafka_connect", - "string": { - "description": "Name of the Kafka Connect to apply permission, only required if resource_type is KAFKA_CONNECT", - "computed_optional_required": "optional" - } - }, - { - "name": "pattern_type", - "string": { - "description": "Type of the pattern to apply permission on valid values are: LITERAL, PREFIXED", - "computed_optional_required": "optional", - "validators": [ - { - "custom": { - "imports": [ - { - "path": "github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator" - }, - { - "path": "github.com/conduktor/terraform-provider-conduktor/internal/schema/validation" - } - ], - "schema_definition": "stringvalidator.OneOf(validation.ValidPermissionPatternTypes...)" - } - } - ] - } - }, - { - "name": "permissions", - "set": { - "description": "Set of all permissions to apply on the resource. 
See https://docs.conduktor.io/platform/reference/resource-reference/console/#permissions for more details", - "computed_optional_required": "required", - "element_type": { - "string": {} - }, - "validators": [ - { - "custom": { - "imports": [ - { - "path": "github.com/hashicorp/terraform-plugin-framework-validators/setvalidator" - }, - { - "path": "github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator" - }, - { - "path": "github.com/conduktor/terraform-provider-conduktor/internal/schema/validation" - } - ], - "schema_definition": "setvalidator.ValueStringsAre(stringvalidator.OneOf(validation.ValidPermissions ...))" - } - } - ] - } - } - ] - } - } - } - ] - } - } - ] - } - }, - { - "name": "group_v2", + "name": "console_group_v2", "schema": { "attributes": [ { @@ -487,51 +319,13 @@ } }, { - "name": "generic", + "name": "console_kafka_cluster_v2", "schema": { "attributes": [ - { - "name": "kind", - "string": { - "description": "resource kind", - "computed_optional_required": "required", - "plan_modifiers": [ - { - "custom": { - "imports": [ - { - "path": "github.com/hashicorp/terraform-plugin-framework/resource/schema/stringplanmodifier" - } - ], - "schema_definition": "stringplanmodifier.RequiresReplace()" - } - } - ] - } - }, - { - "name": "version", - "string": { - "description": "resource version", - "computed_optional_required": "required", - "plan_modifiers": [ - { - "custom": { - "imports": [ - { - "path": "github.com/hashicorp/terraform-plugin-framework/resource/schema/stringplanmodifier" - } - ], - "schema_definition": "stringplanmodifier.RequiresReplace()" - } - } - ] - } - }, { "name": "name", "string": { - "description": "resource name", + "description": "Kafka cluster name, must be unique, acts as an ID for import", "computed_optional_required": "required", "plan_modifiers": [ { @@ -544,92 +338,40 @@ "schema_definition": "stringplanmodifier.RequiresReplace()" } } - ] - } - }, - { - "name": "cluster", - "string": { - "description": "resource parent cluster (if any)", - "computed_optional_required": "optional", - "plan_modifiers": [ + ], + "validators": [ { "custom": { "imports": [ { - "path": "github.com/hashicorp/terraform-plugin-framework/resource/schema/stringplanmodifier" + "path": "regexp" + }, + { + "path": "github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator" } ], - "schema_definition": "stringplanmodifier.RequiresReplace()" + "schema_definition": "stringvalidator.RegexMatches(regexp.MustCompile(\"^[0-9a-z\\\\_\\\\-.]+$\"), \"\")" } } ] } }, { - "name": "manifest", - "string": { - "description": "resource manifest in yaml format. 
See [reference documentation](https://docs.conduktor.io/platform/reference/resource-reference/console/#manifests) for more details", - "computed_optional_required": "required" + "name": "labels", + "map": { + "description": "Kafka cluster labels", + "computed_optional_required": "computed_optional", + "element_type": { + "string": {} + } } } - ] - } - }, - { - "name": "kafka_cluster_v2", - "schema": { - "attributes": [ + ], + "blocks": [ { - "name": "name", - "string": { - "description": "Kafka cluster name, must be unique, acts as an ID for import", - "computed_optional_required": "required", - "plan_modifiers": [ - { - "custom": { - "imports": [ - { - "path": "github.com/hashicorp/terraform-plugin-framework/resource/schema/stringplanmodifier" - } - ], - "schema_definition": "stringplanmodifier.RequiresReplace()" - } - } - ], - "validators": [ - { - "custom": { - "imports": [ - { - "path": "regexp" - }, - { - "path": "github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator" - } - ], - "schema_definition": "stringvalidator.RegexMatches(regexp.MustCompile(\"^[0-9a-z\\\\_\\\\-.]+$\"), \"\")" - } - } - ] - } - }, - { - "name": "labels", - "map": { - "description": "Kafka cluster labels", - "computed_optional_required": "computed_optional", - "element_type": { - "string": {} - } - } - } - ], - "blocks": [ - { - "name": "spec", - "single_nested": { - "attributes": [ + "name": "spec", + "single_nested": { + "attributes": [ { "name": "display_name", "string": { @@ -1053,7 +795,7 @@ } }, { - "name": "kafka_connect_v2", + "name": "console_kafka_connect_v2", "schema": { "attributes": [ { @@ -1256,6 +998,174 @@ ] } }, + { + "name": "console_user_v2", + "schema": { + "attributes": [ + { + "name": "name", + "string": { + "description": "User email, must be unique, act as ID for import", + "computed_optional_required": "required", + "plan_modifiers": [ + { + "custom": { + "imports": [ + { + "path": "github.com/hashicorp/terraform-plugin-framework/resource/schema/stringplanmodifier" + } + ], + "schema_definition": "stringplanmodifier.RequiresReplace()" + } + } + ], + "validators": [ + { + "custom": { + "imports": [ + { + "path": "regexp" + }, + { + "path": "github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator" + } + ], + "schema_definition": "stringvalidator.RegexMatches(regexp.MustCompile(\"^([\\\\w\\\\-_.]*[^.])@([\\\\w-]+\\\\.)+[\\\\w-]{2,4}$\"), \"\")" + } + } + ] + } + } + ], + "blocks": [ + { + "name": "spec", + "single_nested": { + "attributes": [ + { + "name": "firstname", + "string": { + "description": "User firstname", + "computed_optional_required": "optional" + } + }, + { + "name": "lastname", + "string": { + "description": "User lastname", + "computed_optional_required": "optional" + } + }, + { + "name": "permissions", + "set_nested": { + "description": "Set of all user permissions", + "computed_optional_required": "computed_optional", + "nested_object": { + "attributes": [ + { + "name": "name", + "string": { + "description": "Name of the resource to apply permission could be a topic, a cluster, a consumer group, etc. 
depending on resource_type", + "computed_optional_required": "optional" + } + }, + { + "name": "resource_type", + "string": { + "description": "Type of the resource to apply permission on valid values are: CLUSTER, CONSUMER_GROUP, KAFKA_CONNECT, KSQLDB, PLATFORM, SUBJECT, TOPIC", + "computed_optional_required": "required", + "validators": [ + { + "custom": { + "imports": [ + { + "path": "github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator" + }, + { + "path": "github.com/conduktor/terraform-provider-conduktor/internal/schema/validation" + } + ], + "schema_definition": "stringvalidator.OneOf(validation.ValidPermissionTypes...)" + } + } + ] + } + }, + { + "name": "cluster", + "string": { + "description": "Name of the cluster to apply permission, only required if resource_type is TOPIC, SUBJECT, CONSUMER_GROUP, KAFKA_CONNECT, KSQLDB", + "computed_optional_required": "optional" + } + }, + { + "name": "kafka_connect", + "string": { + "description": "Name of the Kafka Connect to apply permission, only required if resource_type is KAFKA_CONNECT", + "computed_optional_required": "optional" + } + }, + { + "name": "pattern_type", + "string": { + "description": "Type of the pattern to apply permission on valid values are: LITERAL, PREFIXED", + "computed_optional_required": "optional", + "validators": [ + { + "custom": { + "imports": [ + { + "path": "github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator" + }, + { + "path": "github.com/conduktor/terraform-provider-conduktor/internal/schema/validation" + } + ], + "schema_definition": "stringvalidator.OneOf(validation.ValidPermissionPatternTypes...)" + } + } + ] + } + }, + { + "name": "permissions", + "set": { + "description": "Set of all permissions to apply on the resource. 
See https://docs.conduktor.io/platform/reference/resource-reference/console/#permissions for more details", + "computed_optional_required": "required", + "element_type": { + "string": {} + }, + "validators": [ + { + "custom": { + "imports": [ + { + "path": "github.com/hashicorp/terraform-plugin-framework-validators/setvalidator" + }, + { + "path": "github.com/hashicorp/terraform-plugin-framework-validators/stringvalidator" + }, + { + "path": "github.com/conduktor/terraform-provider-conduktor/internal/schema/validation" + } + ], + "schema_definition": "setvalidator.ValueStringsAre(stringvalidator.OneOf(validation.ValidPermissions ...))" + } + } + ] + } + } + ] + } + } + } + ] + } + } + ] + } + }, { "name": "gateway_service_account_v2", "schema": { @@ -1409,6 +1319,96 @@ } ] } + }, + { + "name": "generic", + "schema": { + "attributes": [ + { + "name": "kind", + "string": { + "description": "resource kind", + "computed_optional_required": "required", + "plan_modifiers": [ + { + "custom": { + "imports": [ + { + "path": "github.com/hashicorp/terraform-plugin-framework/resource/schema/stringplanmodifier" + } + ], + "schema_definition": "stringplanmodifier.RequiresReplace()" + } + } + ] + } + }, + { + "name": "version", + "string": { + "description": "resource version", + "computed_optional_required": "required", + "plan_modifiers": [ + { + "custom": { + "imports": [ + { + "path": "github.com/hashicorp/terraform-plugin-framework/resource/schema/stringplanmodifier" + } + ], + "schema_definition": "stringplanmodifier.RequiresReplace()" + } + } + ] + } + }, + { + "name": "name", + "string": { + "description": "resource name", + "computed_optional_required": "required", + "plan_modifiers": [ + { + "custom": { + "imports": [ + { + "path": "github.com/hashicorp/terraform-plugin-framework/resource/schema/stringplanmodifier" + } + ], + "schema_definition": "stringplanmodifier.RequiresReplace()" + } + } + ] + } + }, + { + "name": "cluster", + "string": { + "description": "resource parent cluster (if any)", + "computed_optional_required": "optional", + "plan_modifiers": [ + { + "custom": { + "imports": [ + { + "path": "github.com/hashicorp/terraform-plugin-framework/resource/schema/stringplanmodifier" + } + ], + "schema_definition": "stringplanmodifier.RequiresReplace()" + } + } + ] + } + }, + { + "name": "manifest", + "string": { + "description": "resource manifest in yaml format. See [reference documentation](https://docs.conduktor.io/platform/reference/resource-reference/console/#manifests) for more details", + "computed_optional_required": "required" + } + } + ] + } } ], "version": "0.1" diff --git a/templates/resources/group_v2.md.tmpl b/templates/resources/console_group_v2.md.tmpl similarity index 70% rename from templates/resources/group_v2.md.tmpl rename to templates/resources/console_group_v2.md.tmpl index d525ed7..c652f88 100644 --- a/templates/resources/group_v2.md.tmpl +++ b/templates/resources/console_group_v2.md.tmpl @@ -1,5 +1,5 @@ --- -page_title: "Conduktor : conduktor_group_v2 " +page_title: "Conduktor : conduktor_console_group_v2 " subcategory: "iam/v2" description: |- Resource for managing Conduktor groups. @@ -14,10 +14,10 @@ This resource allows you to create, read, update and delete groups in Conduktor. 
## Example Usage ### Simple group without members or permissions -{{tffile "examples/resources/conduktor_group_v2/simple.tf"}} +{{tffile "examples/resources/conduktor_console_group_v2/simple.tf"}} ### Complex group with members, external reference and permissions -{{tffile "examples/resources/conduktor_group_v2/complex.tf"}} +{{tffile "examples/resources/conduktor_console_group_v2/complex.tf"}} {{ .SchemaMarkdown }} diff --git a/templates/resources/kafka_cluster_v2.md.tmpl b/templates/resources/console_kafka_cluster_v2.md.tmpl similarity index 75% rename from templates/resources/kafka_cluster_v2.md.tmpl rename to templates/resources/console_kafka_cluster_v2.md.tmpl index 3316df9..84c2629 100644 --- a/templates/resources/kafka_cluster_v2.md.tmpl +++ b/templates/resources/console_kafka_cluster_v2.md.tmpl @@ -1,5 +1,5 @@ --- -page_title: "Conduktor : conduktor_kafka_cluster_v2 " +page_title: "Conduktor : conduktor_console_kafka_cluster_v2 " subcategory: "console/v2" description: |- Resource for managing Conduktor Kafka cluster definition with optional Schema registry. @@ -15,25 +15,25 @@ This resource allows you to create, read, update and delete Kafka clusters and S ### Simple Kafka cluster without Schema registry This example creates a simple Kafka cluster definition without authentication resource and without Schema Registry. -{{tffile "examples/resources/conduktor_kafka_cluster_v2/simple.tf"}} +{{tffile "examples/resources/conduktor_console_kafka_cluster_v2/simple.tf"}} ### Confluent Kafka cluster with Schema registry This example creates a Confluent Kafka cluster and Schema Registry definition resource. The Schema Registry authentication uses mTLS. -{{tffile "examples/resources/conduktor_kafka_cluster_v2/confluent.tf"}} +{{tffile "examples/resources/conduktor_console_kafka_cluster_v2/confluent.tf"}} ### Aiven Kafka cluster with Schema registry This example creates an Aiven Kafka cluster and Schema Registry definition resource. The Schema Registry authentication uses basic auth. -{{tffile "examples/resources/conduktor_kafka_cluster_v2/aiven.tf"}} +{{tffile "examples/resources/conduktor_console_kafka_cluster_v2/aiven.tf"}} ### AWS MSK with Glue Schema registry This example creates an AWS MSK Kafka Cluster and a Glue Schema Registry definition resource. -{{tffile "examples/resources/conduktor_kafka_cluster_v2/aws_msk.tf"}} +{{tffile "examples/resources/conduktor_console_kafka_cluster_v2/aws_msk.tf"}} ### Conduktor Gateway Kafka cluster with Schema registry This example creates a Conduktor Gateway Kafka Cluster and Schema Registry definition resource. The Schema Registry authentication uses a bearer token. -{{tffile "examples/resources/conduktor_kafka_cluster_v2/gateway.tf"}} +{{tffile "examples/resources/conduktor_console_kafka_cluster_v2/gateway.tf"}} {{ .SchemaMarkdown }} diff --git a/templates/resources/kafka_connect_v2.md.tmpl b/templates/resources/console_kafka_connect_v2.md.tmpl similarity index 73% rename from templates/resources/kafka_connect_v2.md.tmpl rename to templates/resources/console_kafka_connect_v2.md.tmpl index 6f409e9..14f6d91 100644 --- a/templates/resources/kafka_connect_v2.md.tmpl +++ b/templates/resources/console_kafka_connect_v2.md.tmpl @@ -1,5 +1,5 @@ --- -page_title: "Conduktor : conduktor_kafka_connect_v2 " +page_title: "Conduktor : conduktor_console_kafka_connect_v2 " subcategory: "console/v2" description: |- Resource for managing Conduktor Kafka Connect servers definition linked to an existing Kafka cluster definition inside Conduktor Console. 
@@ -15,19 +15,19 @@ This resource allows you to create, read, update and delete Kafka Connect server ### Simple Kafka Connect server This example creates a simple Kafka Connect server connection without any authentication. -{{tffile "examples/resources/conduktor_kafka_connect_v2/simple.tf"}} +{{tffile "examples/resources/conduktor_console_kafka_connect_v2/simple.tf"}} ### Basic Kafka Connect server This example creates a complex Kafka Connect server connection with basic authentication. -{{tffile "examples/resources/conduktor_kafka_connect_v2/basicAuth.tf"}} +{{tffile "examples/resources/conduktor_console_kafka_connect_v2/basicAuth.tf"}} ### Bearer token Kafka Connect server This example creates a complex Kafka Connect server connection with bearer token authentication. -{{tffile "examples/resources/conduktor_kafka_connect_v2/bearerToken.tf"}} +{{tffile "examples/resources/conduktor_console_kafka_connect_v2/bearerToken.tf"}} ### mTLS Kafka Connect server This example creates a complex Kafka Connect server connection with mTLS authentication. -{{tffile "examples/resources/conduktor_kafka_connect_v2/mtls.tf"}} +{{tffile "examples/resources/conduktor_console_kafka_connect_v2/mtls.tf"}} {{ .SchemaMarkdown }} @@ -38,9 +38,9 @@ In order to import a Kafka Connect server connection into Conduktor, you need to The import ID is constructed as follows: `< cluster_id >/< connect_id >`. For example, using an [`import` block](https://developer.hashicorp.com/terraform/language/import) : -{{tffile "examples/resources/conduktor_kafka_connect_v2/import.tf"}} +{{tffile "examples/resources/conduktor_console_kafka_connect_v2/import.tf"}} Using the `terraform import` command: ```shell -terraform import conduktor_kafka_connect_v2.example mini-cluster/import-connect +terraform import conduktor_console_kafka_connect_v2.example mini-cluster/import-connect ``` diff --git a/templates/resources/user_v2.md.tmpl b/templates/resources/console_user_v2.md.tmpl similarity index 68% rename from templates/resources/user_v2.md.tmpl rename to templates/resources/console_user_v2.md.tmpl index 3bd4bd9..b25199e 100644 --- a/templates/resources/user_v2.md.tmpl +++ b/templates/resources/console_user_v2.md.tmpl @@ -1,5 +1,5 @@ --- -page_title: "Conduktor : conduktor_user_v2 " +page_title: "Conduktor : conduktor_console_user_v2 " subcategory: "iam/v2" description: |- Resource for managing Conduktor users. @@ -14,10 +14,10 @@ This resource allows you to create, read, update and delete users in Conduktor. ## Example Usage ### Simple user without permissions -{{tffile "examples/resources/conduktor_user_v2/simple.tf"}} +{{tffile "examples/resources/conduktor_console_user_v2/simple.tf"}} ### Complex user with permissions -{{tffile "examples/resources/conduktor_user_v2/complex.tf"}} +{{tffile "examples/resources/conduktor_console_user_v2/complex.tf"}} {{ .SchemaMarkdown }}