Streaming SQL Server CDC to AWS MSK with Debezium

Asked by js5cn81o on 2023-08-02

I'm a newbie with Kafka and am currently learning how to stream changed data from MS SQL Server to Amazon MSK using the Debezium connector.

I already have a MS SQL Server instance with CDC enabled (the CDC setup is sketched after the configuration below), and an MSK cluster that I can connect to, create topics on, and produce and consume data from manually through an EC2 client. Now I'm setting up MSK Connect with the Debezium SQL Server connector as a custom plugin. Here is my MSK Connect connector configuration:

connector.class = io.debezium.connector.sqlserver.SqlServerConnector
tasks.max = 1
database.hostname = xxx
database.port = xxx
database.user = xxx
database.password = xxx
database.dbname = dbName
database.server.name = serverName
table.include.list = dbo.tableName
database.history.kafka.bootstrap.servers = xxx
database.history.kafka.topic = xxx
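
For reference, this is roughly how CDC was enabled on the source database and table beforehand, using SQL Server's standard CDC stored procedures (dbName, dbo and tableName are the same placeholders as in the connector config):

-- Enable CDC at the database level (requires sysadmin)
USE dbName;
EXEC sys.sp_cdc_enable_db;

-- Enable CDC on the table that table.include.list points to
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'tableName',
    @role_name     = NULL;  -- NULL = no gating role; set one to restrict access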

But my MSK connector keeps ending up in the Failed status. I have searched on Google, but there doesn't seem to be any guide covering this setup.

That makes me wonder whether this approach is even possible. Could someone please shed some light and point me in the right direction?

Edit: here are some logs I got from CloudWatch:

ERROR [AdminClient clientId=adminclient-1] Connection to node -2 () failed authentication due to: []: Access denied (org.apache.kafka.clients.NetworkClient:771)

INFO App info kafka.admin.client for adminclient-1 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)

INFO [AdminClient clientId=adminclient-1] Metadata update failed (org.apache.kafka.clients.admin.internals.AdminMetadataManager:235)

org.apache.kafka.connect.errors.ConnectException: Failed to connect to and describe Kafka cluster. Check worker's broker connection and security properties.

Caused by: org.apache.kafka.common.errors.SaslAuthenticationException: [4f91d358-fb7b-4f3b-8930-1b4aefce6d0b]: Access denied

[Worker-08134a52fe88cdc49] MSK Connect encountered errors and failed.

Many thanks,

Answer from yks3o0rb:

If you are using IAM role-based authentication for your MSK cluster, your bootstrap server port will be 9098.
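
For example, with IAM auth the bootstrap string (used for database.history.kafka.bootstrap.servers and for Kafka clients in general) looks roughly like the following; the broker hostnames here are placeholders, not real brokers:

b-1.myCluster.abc123.c2.kafka.us-east-1.amazonaws.com:9098,b-2.myCluster.abc123.c2.kafka.us-east-1.amazonaws.com:9098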

Along with all the properties above, you also have to include these properties in your MSK Connect connector config:

database.history.consumer.security.protocol=SASL_SSL
database.history.consumer.sasl.mechanism=AWS_MSK_IAM
database.history.consumer.sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
database.history.consumer.sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
database.history.producer.security.protocol=SASL_SSL
database.history.producer.sasl.mechanism=AWS_MSK_IAM
database.history.producer.sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
database.history.producer.sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
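
One caveat: the property names above apply to Debezium 1.x. If your custom plugin bundles Debezium 2.x, my understanding is that the database history settings were renamed to use the schema.history.internal prefix, so the equivalent block would look like this (verify against your Debezium version's documentation):

# xxx are placeholders, as in the question
schema.history.internal.kafka.bootstrap.servers=xxx
schema.history.internal.kafka.topic=xxx
schema.history.internal.consumer.security.protocol=SASL_SSL
schema.history.internal.consumer.sasl.mechanism=AWS_MSK_IAM
schema.history.internal.consumer.sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
schema.history.internal.consumer.sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
schema.history.internal.producer.security.protocol=SASL_SSL
schema.history.internal.producer.sasl.mechanism=AWS_MSK_IAM
schema.history.internal.producer.sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
schema.history.internal.producer.sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler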

Refer: https://aws.amazon.com/blogs/aws/introducing-amazon-msk-connect-stream-data-to-and-from-your-apache-kafka-clusters-using-managed-connectors/
