Uploading a custom connector to AZURE returns a 400 bad request error #481

Open
cconnorsada opened this issue Nov 1, 2024 · 5 comments
Labels
bug Something isn't working

Comments

@cconnorsada

Hello, I'm trying to upload a custom connector to AZURE to use in my Confluent Cloud Kafka cluster.

I was able to upload the connector without issue using the default (AWS). However, when I set the resource's cloud input variable to AZURE, I get the following error:

╷
│ Error: error creating Custom Connector Plugin "Debezium MongoDB CDC Connector Plugin": 400 Bad Request: error creating Connect plugin
│ 
│   with module.confluent-kafka.confluent_custom_connector_plugin.debezium_mongo,
│   on ../../../../../modules/confluent-kafka/connectors.tf line 12, in resource "confluent_custom_connector_plugin" "debezium_mongo":
│   12: resource "confluent_custom_connector_plugin" "debezium_mongo" {
│ 

Unfortunately, because my Kafka cluster is in Azure, the custom connector also has to be uploaded to Azure. I don't see any other input variables required when using Azure, so I'm not sure why it returns a bad request.

This is the Terraform code in question:

resource "confluent_custom_connector_plugin" "debezium_mongo" {
  # https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html#custom-connector-quick-start
  display_name                  = "Debezium MongoDB CDC Connector Plugin"
  documentation_link        = "https://debezium.io/documentation/reference/stable/connectors/mongodb.html"
  connector_class             = "io.debezium.connector.mongodb.MongoDbConnector"
  connector_type              = "SOURCE"
  cloud                               = "AZURE"
  sensitive_config_properties = []
  filename                          = "${path.module}/connectors/debezium-connector-mongodb-2.4.2.zip"
}

I'm not sure if this is an issue with the resource itself or if I'm missing something in my code.

Thanks for any help.

@cconnorsada changed the title from "Uploading a custom connector to AZURE return a 400 bad request error" to "Uploading a custom connector to AZURE returns a 400 bad request error" Nov 1, 2024
@linouk23
Contributor

linouk23 commented Nov 1, 2024

@cconnorsada, thank you for creating the issue!

We’ve confirmed that this is a limitation of the TF Provider, and we have created an internal Jira ticket (APIT-2636) to address it.

As a quick workaround, could you try using the Confluent CLI to create a custom connector plugin and then use terraform import?
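
A rough sketch of what I mean (the plugin name, zip file, and the ccp-xxxxxx ID are placeholders taken from your snippet, and the exact flag names can vary by CLI version, so please double-check against the Confluent CLI reference):

# Create the plugin out of band with the Confluent CLI (placeholder values)
confluent connect custom-plugin create "Debezium MongoDB CDC Connector Plugin" \
  --plugin-file debezium-connector-mongodb-2.4.2.zip \
  --connector-class io.debezium.connector.mongodb.MongoDbConnector \
  --connector-type SOURCE \
  --cloud AZURE

# Then adopt the returned plugin ID into Terraform state
terraform import confluent_custom_connector_plugin.debezium_mongo ccp-xxxxxx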

@cconnorsada
Author

Hi @linouk23,

I will try the import, though having a manual step in the cluster loop does limit our programmatic use of the Confluent Cloud platform. Looking forward to a resolution in the future!

I'll let you know how it goes once I try it on Monday. Have a good weekend. :)

@cconnorsada
Author

cconnorsada commented Nov 4, 2024

Hi @linouk23 ,

I was able to create a custom connector plugin in the Confluent Cloud UI and then use its plugin ID to create a connector for my cluster.

resource "confluent_connector" "s3_sink" {
  environment {
    id = var.environment_id
  }
  kafka_cluster {
    id = confluent_kafka_cluster.azure_kafka_cluster.id
  }

  config_sensitive = {
    "kafka.api.key"         = confluent_api_key.app-manager-kafka-api-key.id
    "kafka.api.secret"      = confluent_api_key.app-manager-kafka-api-key.secret
    "aws.access.key.id"     = aws_iam_access_key.s3_access_key.id
    "aws.secret.access.key" = aws_iam_access_key.s3_access_key.secret
  }

  config_nonsensitive = {
    "confluent.connector.type"          = "CUSTOM"
    "connector.class"                   = "io.confluent.connect.s3.S3SinkConnector"
    "name"                              = "s3_sink_connector_2"
    "kafka.auth.mode"                   = "KAFKA_API_KEY"
    "topics"                            = confluent_kafka_topic.orders.topic_name
    "output.data.format"                = "JSON"
    "quickstart"                        = "ORDERS"
    "confluent.custom.plugin.id"        = "ccp-l5nonq"
    "time.interval"                     = "DAILY"
    "flush.size"                        = "1"
    "tasks.max"                         = "1"
    "s3.bucket.name"                    = aws_s3_bucket.test_bucket.bucket
    "format.class"                      = "io.confluent.connect.s3.format.json.JsonFormat"
    "storage.class"                     = "io.confluent.connect.s3.storage.S3Storage"
    "input.data.format"                 = "JSON"
    #"confluent.custom.connection.endpoints" = "https://s3.us-east-1.amazonaws.com, https://s3.us-west-1.amazonaws.com"
    "confluent.custom.plugin.type"      = "SINK"
    "confluent.resource.connector.tier" = "2GB"
  }
}

The only thing stopping the creation from succeeding now is the custom endpoints property, which doesn't validate correctly in the provider. Without the endpoints, the creation fails because the connector cannot successfully initialize.

Once I manually add the endpoints after connector creation, the connector restarts and successfully initializes.

This is probably a separate issue though. Either way, that leaves two separate manual steps to create a connector, which is not ideal. Hope the issues are resolved shortly.

Thanks for the help. 👍

@linouk23
Contributor

linouk23 commented Nov 4, 2024

The only thing stopping the creation from succeeding now is the custom endpoints property, which doesn't validate correctly in the provider. Without the endpoints, the creation fails because the connector cannot successfully initialize.
Once I manually add the endpoints after connector creation, the connector restarts and successfully initializes.
This is probably a separate issue though. Either way, that leaves two separate manual steps to create a connector, which is not ideal. Hope the issues are resolved shortly.

@cconnorsada please open a support ticket, as this seems to be an API (backend) issue. Thank you!

@linouk23 added the bug (Something isn't working) label Nov 4, 2024
@cconnorsada
Author

Thanks for the quick reply @linouk23. This was user error and not a bug.

The endpoints needed to look like this:

"confluent.custom.connection.endpoints" = "s3.us-west-2.amazonaws.com:443:TCP;s3.us-east-1.amazonaws.com:443:TCP"

and not what I had above.
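
For anyone who hits the same thing, here's roughly how that slots into the config_nonsensitive block above (the format is host:port:PROTOCOL, with multiple endpoints separated by semicolons; the hosts here are just the S3 endpoints from my example):

  config_nonsensitive = {
    # ... other properties as above ...
    # Each endpoint is host:port:protocol; entries are semicolon-separated
    "confluent.custom.connection.endpoints" = "s3.us-west-2.amazonaws.com:443:TCP;s3.us-east-1.amazonaws.com:443:TCP"
  }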

Sorry to bother you with it. You can close this issue now, I think I have everything I need in place to get going. Looking forward to the custom connector resource being updated!
