Apache NiFi publishing to Kafka 0.10: SASL fails

Asked by eagi6jfj on 2021-06-06 in Kafka

I am trying to publish data from NiFi 1.7.1 to Kafka 0.10 over SASL_PLAINTEXT. We have already verified that the Kafka broker is reachable and that messages produced from the command line on the Kafka server arrive on our topic (a sketch of that check follows the log below). However, PublishKafka_0_10 still fails, with the following log:

2018-09-12 10:37:46,648 INFO [NiFi Web Server-365] o.a.n.c.s.StandardProcessScheduler Starting PublishKafka_0_10[id=ccfbf7e8-0165-1000-528f-6771c455e664]
2018-09-12 10:37:46,648 INFO [Timer-Driven Process Thread-9] o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled PublishKafka_0_10[id=ccfbf7e8-0165-1000-528f-6771c455e664] to run with 1 threads
2018-09-12 10:37:46,658 INFO [Timer-Driven Process Thread-9] o.a.k.clients.producer.ProducerConfig ProducerConfig values: 
    acks = 1
    batch.size = 16384
    block.on.buffer.full = false
    bootstrap.servers = [ourkafkaserver:9092]
    buffer.memory = 33554432
    client.id = 
    compression.type = none
    connections.max.idle.ms = 540000
    interceptor.classes = null
    key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
    linger.ms = 0
    max.block.ms = 20000
    max.in.flight.requests.per.connection = 5
    max.request.size = 1048576
    metadata.fetch.timeout.ms = 60000
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.sample.window.ms = 30000
    partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
    receive.buffer.bytes = 32768
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retries = 6
    retry.backoff.ms = 100
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = kafka
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.mechanism = GSSAPI
    security.protocol = SASL_PLAINTEXT
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    timeout.ms = 30000
    value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer

2018-09-12 10:37:46,675 INFO [Timer-Driven Process Thread-9] o.a.k.c.s.authenticator.AbstractLogin Successfully logged in.
2018-09-12 10:37:46,675 INFO [kafka-kerberos-refresh-thread-our@nifiprincipal] o.a.k.c.security.kerberos.KerberosLogin [Principal=our@nifiprincipal]: TGT refresh thread started.
2018-09-12 10:37:46,675 INFO [kafka-kerberos-refresh-thread-our@nifiprincipal] o.a.k.c.security.kerberos.KerberosLogin [Principal=our@nifiprincipal]: TGT valid starting at: Wed Sep 12 10:37:46 UTC 2018
2018-09-12 10:37:46,676 INFO [kafka-kerberos-refresh-thread-our@nifiprincipal] o.a.k.c.security.kerberos.KerberosLogin [Principal=our@nifiprincipal]: TGT expires: Thu Sep 13 11:37:46 UTC 2018
2018-09-12 10:37:46,676 INFO [kafka-kerberos-refresh-thread-our@nifiprincipal] o.a.k.c.security.kerberos.KerberosLogin [Principal=our@nifiprincipal]: TGT refresh sleeping until: Thu Sep 13 06:45:43 UTC 2018
2018-09-12 10:37:46,676 INFO [Timer-Driven Process Thread-9] o.a.kafka.common.utils.AppInfoParser Kafka version : 0.10.2.1
2018-09-12 10:37:46,676 INFO [Timer-Driven Process Thread-9] o.a.kafka.common.utils.AppInfoParser Kafka commitId : e89bffd6b2eff799
2018-09-12 10:38:26,678 ERROR [Timer-Driven Process Thread-9] o.a.n.p.kafka.pubsub.PublishKafka_0_10 PublishKafka_0_10[id=ccfbf7e8-0165-1000-528f-6771c455e664] Failed to send all message for StandardFlowFileRecord[uuid=b2470c67-4c6e-4dd6-a969-f46e1da5673f,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1536744161212-1, container=default, section=1], offset=429, length=39],offset=0,name=10269008232495292,size=39] to Kafka; routing to failure due to org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 20000 ms.: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 20000 ms.
org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 20000 ms.
2018-09-12 10:38:26,679 ERROR [Timer-Driven Process Thread-9] o.a.n.p.kafka.pubsub.PublishKafka_0_10 PublishKafka_0_10[id=ccfbf7e8-0165-1000-528f-6771c455e664] Failed to send all message for StandardFlowFileRecord[uuid=5c24d2ec-9f09-44e4-91ea-237f2bfedefa,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1536744161212-1, container=default, section=1], offset=468, length=39],offset=0,name=10269023234631434,size=39] to Kafka; routing to failure due to org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 20000 ms.: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 20000 ms.
org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 20000 ms.
2018-09-12 10:38:26,679 INFO [Timer-Driven Process Thread-9] o.a.kafka.clients.producer.KafkaProducer Closing the Kafka producer with timeoutMillis = 20000 ms.
2018-09-12 10:38:26,679 WARN [kafka-kerberos-refresh-thread-our@nifiprincipal] o.a.k.c.security.kerberos.KerberosLogin [Principal=our@nifiprincipal]: TGT renewal thread has been interrupted and will exit.
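For context, the command-line check mentioned above was roughly along these lines; the topic name, the properties file name, and the JAAS path used on the broker host are placeholders rather than our exact values:

# client-sasl.properties (placeholder name) used by the console producer
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka

# Point the console tools at a JAAS file with a KafkaClient section,
# then produce a test message to the topic.
export KAFKA_OPTS="-Djava.security.auth.login.config=/path/to/our/kerberos/jaas.conf"
bin/kafka-console-producer.sh --broker-list ourkafkaserver:9092 --topic our-topic --producer.config client-sasl.properties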

I noticed the parameter sasl.kerberos.kinit.cmd=/usr/bin/kinit. Does kinit actually have to be present at that location, or does NiFi obtain the Kerberos ticket itself through Java?
Are there any other clues as to why this is failing? We supply the jaas.conf file at startup via

java.arg.50=-Djava.security.auth.login.config=/path/to/our/kerberos/jaas.conf

in the bootstrap.conf file, and the jaas.conf contains the following:

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/path/to/our/kerberos/nifi.keytab"
  serviceName="kafka"
  principal="our@nifiprincipal";
};
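For what it is worth, the keytab and principal from this jaas.conf can be sanity-checked by hand outside NiFi (a diagnostic sketch only; we have not confirmed that NiFi itself ever shells out to kinit, which is part of the question above):

# List the entries in the keytab and try to obtain a TGT with it,
# run as the same OS user that the NiFi JVM runs as.
klist -kt /path/to/our/kerberos/nifi.keytab
kinit -kt /path/to/our/kerberos/nifi.keytab our@nifiprincipal
klist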
