Kafka consumer messages

stszievb · published 2021-06-04 in Kafka
Follow (0) | Answers (2) | Views (466)

I am new to Kafka and somehow managed to get a Kafka Avro producer and consumer running. The producer produces messages, and I successfully receive them in the consumer. Here is my producer code snippet:

static async void AvroProducer()
{
    string bootstrapServers = "localhost:9092";
    string schemaRegistryUrl = "Production163:8081"; 
    string topicName = "player";
    string groupName = "avro-generic-example-group";

     var s = (RecordSchema)RecordSchema.Parse(
        @"{
            ""namespace"": ""Confluent.Kafka.Examples.AvroSpecific"",
            ""type"": ""record"",
            ""name"": ""User"",
            ""fields"": [
                {""name"": ""name"", ""type"": ""string""},
                {""name"": ""favorite_number"",  ""type"": [""int"", ""null""]},
                {""name"": ""favorite_color"", ""type"": [""string"", ""null""]}
            ]
          }"
    );

    using (var schemaRegistry = new CachedSchemaRegistryClient(new SchemaRegistryConfig { SchemaRegistryUrl = schemaRegistryUrl }))
    using (var producer =
        new ProducerBuilder<string, GenericRecord>(new ProducerConfig { BootstrapServers = bootstrapServers })
            .SetKeySerializer(new AsyncAvroSerializer<string>(schemaRegistry))
            .SetValueSerializer(new AsyncAvroSerializer<GenericRecord>(schemaRegistry))
            .Build())
    {
        Console.WriteLine($"{producer.Name} producing on {topicName}. Enter user names, q to exit.");

        int i = 0;
        string text;
        while ((text = Console.ReadLine()) != "q")
        {
            var record = new GenericRecord(s);
            record.Add("name", text);
            record.Add("favorite_number", i++);
            record.Add("favorite_color", "blue");

            // Write the outcome to the console; in the original code the
            // formatted string was built but never printed.
            producer.ProduceAsync(topicName, new Message<string, GenericRecord> { Key = text, Value = record })
                .ContinueWith(task => Console.WriteLine(task.IsFaulted
                    ? $"error producing message: {task.Exception.Message}"
                    : $"produced to: {task.Result.TopicPartitionOffset}"));
        }
    }
    Console.ReadLine();

}

As you can see in the code above, I am using a record schema, but now I am trying this schema instead:

//this is the new schema try
        var s = (RecordSchema)RecordSchema.Parse(
            @"{
                ""type"": ""record"",
                ""name"": ""TestingMsg"",
                ""doc"": ""Sample"",
                ""fields"": [
                  {
                   ""name"": ""key"",
                   ""type"": ""string""
                  },
                  {
                   ""name"": ""Time"",
                   ""type"": ""long""
                  },
                  {
                   ""name"": ""sourceSeconds"",
                   ""type"": ""long""
                  },
                  {
                   ""name"": ""serverT"",
                   ""type"": ""long""
                  },

                  {
                   ""name"": ""statusCode"",
                   ""type"": ""int""
                  }
                ]
                }"
            );

With this new schema it does not work: I am not getting any messages in the consumer. Here is the consumer:

void KafkaReader(CancellationToken cancellationToken)
    {
        Debug.Log("kafka reader started. . .");
        // Set up your Kafka connection here.

        while (_keepThreadRunning)
        {
            using (CachedSchemaRegistryClient schemaRegistry = new CachedSchemaRegistryClient(new SchemaRegistryConfig { SchemaRegistryUrl = schemaRegistryUrl }))
            using (IConsumer<string, GenericRecord> consumer = new ConsumerBuilder<string, GenericRecord>(new ConsumerConfig { BootstrapServers = bootstrapServers, GroupId = groupName })
            //using (IConsumer<string, GenericRecord> consumer = new ConsumerBuilder<string, GenericRecord>(new ConsumerConfig { BootstrapServers = bootstrapServers})
                    .SetKeyDeserializer(new AsyncAvroDeserializer<string>(schemaRegistry).AsSyncOverAsync())
                    .SetValueDeserializer(new AsyncAvroDeserializer<GenericRecord>(schemaRegistry).AsSyncOverAsync())
                    .SetErrorHandler((_, e) => Debug.Log($"Error: {e.Reason}"))
                    .Build())
            {
                Debug.Log("subscribe" );
                consumer.Subscribe(topicName);

                while (true)
                {
                while (true)
                {
                    // Consume blocks until a message arrives or the token is cancelled.
                    ConsumeResult<string, GenericRecord> consumeResult = consumer.Consume(cancellationToken);

                    // Check for null before dereferencing the result; the original
                    // code enqueued consumeResult.Value before the null check.
                    if (consumeResult != null)
                    {
                        _stringsReceived.Enqueue(consumeResult.Message.Value.ToString());
                        Debug.Log($"Key: {consumeResult.Message.Key}\nValue: {consumeResult.Message.Value}");
                    }
                    else
                    {
                        Debug.Log("consumer Result is null");
                    }

                    //yield return new WaitForSeconds(1);
                }
            }

        }

        GetComponent<KafkaServerConfigUI>().KafkaDisconnected();

        // Disconnect and clean up your connection here.

    }

Remember that I just started the default Apache Kafka Schema Registry using the batch file:

D:\ApachKafka\confluent\confluent-5.2.1\bin\windows\schema-registry-start.bat D:\ApachKafka\confluent\confluent-5.2.1\etc\schema-registry\schema-registry.properties

What am I doing wrong? Do I need to register the schema anywhere?

tzxcd3kk (Answer 1)

I see you already have an answer. My suggestion is to avoid running a python script on every schema update; you can use schema-registry-ui instead. In a nutshell, schema-registry-ui provides:
- exploring and searching schemas
- Avro evolution compatibility checks
- registration of new schemas
- Avro + table schema views
- display of the equivalent curl commands

How to get it

git clone https://github.com/Landoop/schema-registry-ui.git
cd schema-registry-ui
npm install -g bower
npm install
http-server .

Demo

http://schema-registry-ui.landoop.com/

A Docker image is also available. If a license is an option for you, try Confluent Control Center, which offers even more features.
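For reference, the compatibility check that such a UI performs maps to a single Schema Registry REST call. The following is a minimal sketch in Python; the registry URL and the `player-value` subject are assumptions taken from the question, and the request is only built here, not sent:

```python
import json
import urllib.request

def compatibility_request(registry_url, subject, schema):
    """Build (but do not send) the REST request that asks the Schema Registry
    whether a candidate schema is compatible with the latest registered
    version of a subject."""
    # The registry expects the schema itself as an escaped JSON string.
    body = json.dumps({"schema": json.dumps(schema)}).encode("utf-8")
    return urllib.request.Request(
        f"{registry_url}/compatibility/subjects/{subject}/versions/latest",
        data=body,
        headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
        method="POST",
    )

# Example (assumed registry host from the question):
req = compatibility_request("http://Production163:8081", "player-value",
                            {"type": "record", "name": "TestingMsg",
                             "fields": [{"name": "key", "type": "string"}]})
# Sending it with urllib.request.urlopen(req) would return a JSON body
# such as {"is_compatible": true}.
```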

kg7wmglp (Answer 2)

To make any change or to use a new schema, you must register that schema. I had missed this step, which is why I was not getting messages in the consumer. A short python script can register the schema for you: you give it the URL of the schema registry (starting with http://, not just hostname and port), the subject under which the schema should be registered, and the path to the schema file. Here is how I register the schema:
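The original script was not preserved in the answer; the following is a minimal sketch of such a registration script, assuming the standard Confluent Schema Registry REST API (`POST /subjects/<subject>/versions`) and a subject named `player-value` (the question's topic plus the `-value` suffix the Avro serializer uses for value schemas):

```python
import json
import sys
import urllib.request

def build_payload(schema_path):
    """Read an .avsc file and wrap it the way the registry's REST API
    expects: the schema itself must be an escaped JSON string."""
    with open(schema_path) as f:
        schema = json.load(f)
    return json.dumps({"schema": json.dumps(schema)}).encode("utf-8")

def register_schema(registry_url, subject, schema_path):
    """POST the schema to <registry_url>/subjects/<subject>/versions and
    return the registry's response (e.g. {"id": 1})."""
    req = urllib.request.Request(
        f"{registry_url}/subjects/{subject}/versions",
        data=build_payload(schema_path),
        headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # usage: python register_schema.py http://Production163:8081 player-value schema.avsc
    registry_url, subject, schema_path = sys.argv[1:4]
    print(register_schema(registry_url, subject, schema_path))
```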

Thanks, ref: Avro and Schema Registry
