Unable to load AWS credentials from any provider in the chain - kinesis-kafka-connector

rkue9o1l, posted on 2021-06-07 in Kafka

I am trying to use the kinesis-kafka-connector, a Kafka Connect connector that publishes messages from Kafka to Amazon Kinesis Firehose, as described at https://github.com/awslabs/kinesis-kafka-connector, but I am getting the error shown below. I am on Cloudera CDH-6.1.0-1.cdh6.1.0.p0.770702, which ships with Kafka 2.1.2 (0.10.0.1+kafka2.1.2+6).
I have exported the AWS credentials in my current session, but that does not help:

export AWS_ACCESS_KEY_ID="XXX"
export AWS_SECRET_ACCESS_KEY="YYYYY"
export AWS_DEFAULT_REGION="sssss"
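
Since the error comes from the AWS SDK's default credentials provider chain, a quick sanity check is to run a tiny program as the same OS user and in the same environment as the Connect worker and see whether the chain resolves anything. This is only a sketch using the AWS SDK for Java v1 (the SDK that appears in the stack trace below); the class name CheckCredentials is just illustrative.

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;

public class CheckCredentials {
    public static void main(String[] args) {
        // Throws SdkClientException ("Unable to load AWS credentials from any
        // provider in the chain") when neither environment variables, JVM system
        // properties, ~/.aws/credentials, nor an instance profile supply keys.
        AWSCredentials creds = new DefaultAWSCredentialsProviderChain().getCredentials();
        System.out.println("Resolved access key id: " + creds.getAWSAccessKeyId());
    }
}

Note that if the worker is started as a service or under a different OS user (for example through Cloudera Manager), variables exported in your own shell session are not inherited by the worker JVM, which would produce exactly this error.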

My worker.properties looks like this:

bootstrap.servers=kafkanode:9092
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
# internal.value.converter=org.apache.kafka.connect.storage.StringConverter
# internal.key.converter=org.apache.kafka.connect.storage.StringConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter.schemas.enable=true
internal.key.converter.schemas.enable=true
internal.value.converter.schemas.enable=true
offset.storage.file.filename=offset.log
schemas.enable=false
# Rest API
rest.port=8096
plugin.path=/home/opc/kinesis-kafka-connector-master/target/
# rest.host.name=

My kinesis-firehose-kafka-connector.properties looks like this:

name=kafka_kinesis_sink_connector
connector.class=com.amazon.kinesis.kafka.FirehoseSinkConnector
tasks.max=1
topics=OGGTest
region=eu-central-1
batch=true
batchSize=500
batchSizeInBytes=1024
deliveryStream=kafka-s3-stream
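
For reference, the stack trace below shows the task failing at startup in FirehoseSinkTask.start -> validateDeliveryStream, which calls describeDeliveryStream on the Firehose client. Here is a minimal sketch of that call, using the region and deliveryStream values from this config and the AWS SDK for Java v1 (the class name ValidateStream is illustrative); it fails with the same exception whenever the worker JVM cannot see credentials:

import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehose;
import com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClientBuilder;
import com.amazonaws.services.kinesisfirehose.model.DescribeDeliveryStreamRequest;

public class ValidateStream {
    public static void main(String[] args) {
        // The builder resolves credentials through the default provider chain
        // (environment variables, system properties, ~/.aws/credentials, instance profile).
        AmazonKinesisFirehose firehose = AmazonKinesisFirehoseClientBuilder.standard()
                .withRegion("eu-central-1")          // region from the connector config
                .build();
        // Same call as FirehoseSinkTask.validateDeliveryStream in the stack trace.
        firehose.describeDeliveryStream(new DescribeDeliveryStreamRequest()
                .withDeliveryStreamName("kafka-s3-stream")); // deliveryStream from the config
        System.out.println("Credentials resolved and delivery stream is describable.");
    }
}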

The error stack trace is as follows:

[2019-01-26 11:32:24,446] INFO Kafka version : 2.0.0-cdh6.1.0 (org.apache.kafka.common.utils.AppInfoParser:109)
  [2019-01-26 11:32:24,446] INFO Kafka commitId : unknown (org.apache.kafka.common.utils.AppInfoParser:110)
  [2019-01-26 11:32:24,449] INFO Created connector kafka_kinesis_sink_connector (org.apache.kafka.connect.cli.ConnectStandalone:104)
  [2019-01-26 11:32:25,296] ERROR WorkerSinkTask{id=kafka_kinesis_sink_connector-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)
  com.amazonaws.SdkClientException: Unable to load AWS credentials from any provider in the chain
    at com.amazonaws.auth.AWSCredentialsProviderChain.getCredentials(AWSCredentialsProviderChain.java:131)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1164)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:762)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:724)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
    at com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient.doInvoke(AmazonKinesisFirehoseClient.java:826)
    at com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient.invoke(AmazonKinesisFirehoseClient.java:802)
    at com.amazonaws.services.kinesisfirehose.AmazonKinesisFirehoseClient.describeDeliveryStream(AmazonKinesisFirehoseClient.java:451)
    at com.amazon.kinesis.kafka.FirehoseSinkTask.validateDeliveryStream(FirehoseSinkTask.java:95)
    at com.amazon.kinesis.kafka.FirehoseSinkTask.start(FirehoseSinkTask.java:77)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:301)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:190)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
 [2019-01-26 11:32:25,299] ERROR WorkerSinkTask{id=kafka_kinesis_sink_connector-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:178)
 [2019-01-26 11:32:33,375] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:65)
 [2019-01-26 11:32:33,375] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:223)

Please advise. Thanks in advance!

mepcadol 1#

Create a ~/.aws/credentials file in the home directory of the operating system user that runs the Connect worker processes. These credentials are recognized by most AWS SDKs and the AWS CLI. You can create the credentials file with the following AWS CLI command:
aws configure
You can also create the credentials file manually with a text editor. It should contain lines in the following format:
[default]
aws_access_key_id=
aws_secret_access_key=
Note: when you create the credentials file, make sure that the user who creates it is the same user that runs the Connect worker processes and that the file is in that user's home directory. Otherwise, the connector will not be able to find the credentials.
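
To confirm that the worker's user can actually read the [default] profile from that file, you can run a small check as that user. Again, this is only a sketch with the AWS SDK for Java v1; the class name CheckProfile is illustrative:

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;

public class CheckProfile {
    public static void main(String[] args) {
        // Reads the [default] profile from ~/.aws/credentials of the current OS user.
        AWSCredentials creds = new ProfileCredentialsProvider("default").getCredentials();
        System.out.println("Loaded access key id: " + creds.getAWSAccessKeyId());
    }
}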
