When I write data to a Kafka topic using Spark direct streaming 2.1, Kafka gets shut down after a while.
import java.util.HashMap
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig, ProducerRecord}

val props = new HashMap[String, Object]()
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "ip:9092")
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
  "org.apache.kafka.common.serialization.StringSerializer")
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
  "org.apache.kafka.common.serialization.StringSerializer")

val producer = new KafkaProducer[String, String](props)
// ProducerRecord(topic, partition, key, value) — partition must be an Integer
val message = new ProducerRecord[String, String](topic, partition, key, value)
producer.send(message)
No answers yet.