Failed to find any class that implements Connector and which name matches io.confluent.connect.elasticsearch.ElasticsearchSinkConnector

nzk0hqpo · posted 2021-06-06 · in Kafka

I have MSK running on AWS, and I can produce records to it and consume records from it. I simply want to use Kafka Connect so that records going into MSK are forwarded to Elasticsearch. I did the following, but I'm not sure whether my connector is working, because I can't see any records in Elasticsearch. This is the record I'm sending:

{
  "data": {
    "RequestID": 517082653,
    "ContentTypeID": 9,
    "OrgID": 16145,
    "UserID": 4,
    "PromotionStartDateTime": "2019-12-14T16:06:21Z",
    "PromotionEndDateTime": "2019-12-14T16:16:04Z",
    "SystemStartDatetime": "2019-12-14T16:17:45.507000000Z"
  },
  "metadata": {
    "timestamp": "2019-12-29T10:37:31.502042Z",
    "record-type": "data",
    "operation": "insert",
    "partition-key-type": "schema-table",
    "schema-name": "dbo",
    "table-name": "TRFSDIQueue"
  }
}

I installed Kafka Connect like this:

wget http://packages.confluent.io/archive/5.2/confluent-5.2.0-2.11.tar.gz -P ~/Downloads/

tar -zxvf ~/Downloads/confluent-5.2.0-2.11.tar.gz -C ~/Downloads/
sudo mv ~/Downloads/confluent-5.2.0 /usr/local/confluent

vim /usr/local/confluent/etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties

and changed the URL and the topic name.
After that I just started Kafka Connect as follows:

/usr/local/confluent/bin/connect-standalone /usr/local/confluent/etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties

This gave me the output INFO Usage: ConnectStandalone worker.properties connector1.properties [connector2.properties ...] (org.apache.kafka.connect.cli.ConnectStandalone:62), so nowhere have I mentioned my broker details or zookeeper details; how, then, will it connect to MSK?
Please help me understand what I am missing. I have no schema conversion, so I did not modify schema-registry.properties .
When I try the command below, I get the following error:

/usr/local/confluent/bin/connect-standalone /home/ec2-user/kafka_2.12-2.2.1/config/connect-standalone.properties /usr/local/confluent/etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties

[2019-12-30 03:19:30,149] ERROR Failed to create job for /usr/local/confluent/etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties (org.apache.kafka.connect.cli.ConnectStandalone:108)
[2019-12-30 03:19:30,149] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:119)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.elasticsearch.ElasticsearchSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, 
name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}
        at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
        at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
        at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:116)
Caused by: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.elasticsearch.ElasticsearchSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, 
name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}
        at org.apache.kafka.connect.runtime.isolation.Plugins.newConnector(Plugins.java:175)
        at org.apache.kafka.connect.runtime.AbstractHerder.getConnector(AbstractHerder.java:382)
        at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:261)
        at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:188)
        at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:113)

nxagd54h #1

Put the bootstrap servers and the Connect-related worker properties in the etc/kafka/connect-standalone.properties file.
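For reference, a minimal standalone worker file might look like the sketch below. The broker endpoints are placeholders for your MSK bootstrap string, the plugin directory is an assumed path, and the converter settings assume plain JSON records without embedded schemas (matching the record in the question):

```properties
# Placeholder MSK endpoints -- replace with the output of
# `aws kafka get-bootstrap-brokers --cluster-arn <your-cluster-arn>`
bootstrap.servers=b-1.mycluster.abc123.kafka.us-east-1.amazonaws.com:9092,b-2.mycluster.abc123.kafka.us-east-1.amazonaws.com:9092

# Plain JSON with no embedded schema
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false

# Where standalone mode stores source offsets
offset.storage.file.filename=/tmp/connect.offsets

# Directory scanned for connector plugins (assumed location)
plugin.path=/usr/local/share/kafka/plugins
```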
To load the connector, you must set plugin.path , as discussed here:
https://docs.confluent.io/current/connect/managing/community.html
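Since the Confluent tarball already ships the Elasticsearch connector under its share/java directory, one possible fix (a sketch, assuming the install path from the question) is to point plugin.path at that directory in the worker properties:

```shell
# Verify the connector jars shipped with the tarball are present
ls /usr/local/confluent/share/java/kafka-connect-elasticsearch

# Then add this line to the worker properties file you pass as the
# first argument to connect-standalone:
#   plugin.path=/usr/local/confluent/share/java
```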
You can also download the Confluent Hub CLI to install any of the available Apache Kafka connectors (you don't need the full Confluent Platform, so you can ignore the Schema Registry):
https://docs.confluent.io/current/connect/managing/confluent-hub/client.html
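If you go the Confluent Hub CLI route, installing the connector is a single command. The version tag and paths here are illustrative:

```shell
# Install the Elasticsearch sink into a component directory and let the
# CLI update plugin.path in the given worker config
confluent-hub install confluentinc/kafka-connect-elasticsearch:latest \
  --component-dir /usr/local/share/kafka/plugins \
  --worker-configs /usr/local/confluent/etc/kafka/connect-standalone.properties
```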
Or, as answered before, use the Confluent Kafka Connect Docker image, which comes pre-loaded with the Elasticsearch connector.
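A minimal sketch of the Docker approach, assuming the cp-kafka-connect image and placeholder MSK endpoints (the CONNECT_* environment variables map onto the distributed worker's properties):

```shell
docker run -d --name connect \
  -e CONNECT_BOOTSTRAP_SERVERS=b-1.mycluster.abc123.kafka.us-east-1.amazonaws.com:9092 \
  -e CONNECT_GROUP_ID=connect-cluster \
  -e CONNECT_CONFIG_STORAGE_TOPIC=connect-configs \
  -e CONNECT_OFFSET_STORAGE_TOPIC=connect-offsets \
  -e CONNECT_STATUS_STORAGE_TOPIC=connect-status \
  -e CONNECT_KEY_CONVERTER=org.apache.kafka.connect.json.JsonConverter \
  -e CONNECT_VALUE_CONVERTER=org.apache.kafka.connect.json.JsonConverter \
  -e CONNECT_REST_ADVERTISED_HOST_NAME=connect \
  -e CONNECT_PLUGIN_PATH=/usr/share/java \
  confluentinc/cp-kafka-connect:5.2.0
```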
Also, I'd suggest not using tarballs; instead, install the Confluent packages via apt/yum:
https://docs.confluent.io/current/installation/installing_cp/index.html
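The apt route from that page looks roughly like this for the 5.2 line (a sketch for Debian/Ubuntu; adjust the repo version and package for your distro and Scala version):

```shell
# Add Confluent's package signing key and repository
wget -qO - https://packages.confluent.io/deb/5.2/archive.key | sudo apt-key add -
sudo add-apt-repository "deb [arch=amd64] https://packages.confluent.io/deb/5.2 stable main"

# The community packages are enough for Connect plus the Elasticsearch connector
sudo apt-get update && sudo apt-get install confluent-community-2.12
```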
