
Flink consume Kafka with schema registry

I have run into a problem: the data in Kafka is Avro-formatted and managed by a schema registry server. I found it is not easy to consume this topic; the provided Kafka source does not support this, and I do not want to write a new Kafka source. Is there any way to use the provided Kafka source to consume a topic whose data is Avro with a schema registry?*Compiled by volunteers from the Flink mailing-list archive

EXCEED 2021-12-08 13:52:10
1 reply
  • I had the same problem recently.

    I ended up customizing the Avro serialization/deserialization schema to support the schema registry.

    The root cause is that, at serialization time, an Avro record written under the schema registry's wire format differs from an "original" Avro record written without it. The former prepends a 5-byte header to the actual record bytes: 1 magic byte and a 4-byte schema id, which is the unique id registered in the Kafka schema registry.

    I think Apache Flink should consider this case and support both plain Avro and schema-registry-formatted Avro.*Compiled by volunteers from the Flink mailing-list archive
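    The 5-byte header described above matches the Confluent schema-registry wire format (magic byte 0x0, then a 4-byte big-endian schema id, then the raw Avro payload). A minimal sketch of parsing it, with class and method names of my own invention, not from the thread:

    ```java
    import java.nio.ByteBuffer;

    public class ConfluentWireFormat {
        // Confluent wire format: 1 magic byte (0x0) + 4-byte big-endian schema id + Avro bytes
        static final byte MAGIC_BYTE = 0x0;

        /** Reads the schema id embedded in a schema-registry framed message. */
        public static int schemaId(byte[] message) {
            if (message.length < 5 || message[0] != MAGIC_BYTE) {
                throw new IllegalArgumentException("Not a schema-registry framed message");
            }
            return ByteBuffer.wrap(message, 1, 4).getInt();
        }

        /** Returns the raw Avro record bytes with the 5-byte header stripped. */
        public static byte[] avroPayload(byte[] message) {
            byte[] payload = new byte[message.length - 5];
            System.arraycopy(message, 5, payload, 0, payload.length);
            return payload;
        }

        public static void main(String[] args) {
            // Hypothetical framed message: magic 0x0, schema id 42, then two payload bytes
            byte[] framed = {0x0, 0, 0, 0, 42, 0x10, 0x20};
            System.out.println(schemaId(framed));            // 42
            System.out.println(avroPayload(framed).length);  // 2
        }
    }
    ```

    A custom DeserializationSchema along these lines would strip the header, look the id up in the registry, and decode the remaining bytes with the resolved Avro schema.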

    2021-12-08 14:42:43