
Flink 1.9 SQL Kafka Connector, JSON format: how to deal with non-JSON messages?

Hi community, when I write a Flink DDL statement like this:

CREATE TABLE kafka_src (
  id varchar,
  a varchar,
  b TIMESTAMP,
  c TIMESTAMP
) with (
  ...
  'format.type' = 'json',
  'format.property-version' = '1',
  'format.derive-schema' = 'true',
  'update-mode' = 'append'
);

If a message is not in JSON format, an error appears in the log. My question is: how can I deal with messages that are not in JSON format? My thought is that I could catch the exception in JsonRowDeserializationSchema's deserialize() method; is there any parameter to do this? Thanks for your reply. *From the Flink mailing list archive, compiled by volunteers
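To make the idea concrete, here is a minimal sketch for the DataStream API (the wrapper class name is made up, and the SQL DDL above does not pick this up automatically): wrap JsonRowDeserializationSchema in a custom DeserializationSchema that catches the parse exception and returns null, which the Flink Kafka consumer treats as a record to skip.

    import java.io.IOException;

    import org.apache.flink.api.common.serialization.DeserializationSchema;
    import org.apache.flink.api.common.typeinfo.TypeInformation;
    import org.apache.flink.formats.json.JsonRowDeserializationSchema;
    import org.apache.flink.types.Row;

    // Hypothetical wrapper: delegates to JsonRowDeserializationSchema and
    // swallows parse errors, so non-JSON records are dropped instead of
    // failing the job (the Kafka consumer skips records for which
    // deserialize() returns null).
    public class SkipInvalidJsonDeserializationSchema implements DeserializationSchema<Row> {

        private final JsonRowDeserializationSchema inner;

        public SkipInvalidJsonDeserializationSchema(JsonRowDeserializationSchema inner) {
            this.inner = inner;
        }

        @Override
        public Row deserialize(byte[] message) throws IOException {
            try {
                return inner.deserialize(message);
            } catch (Exception e) {
                // Message is not valid JSON (or does not match the schema): drop it.
                return null;
            }
        }

        @Override
        public boolean isEndOfStream(Row nextElement) {
            return false;
        }

        @Override
        public TypeInformation<Row> getProducedType() {
            return inner.getProducedType();
        }
    }

Such a wrapper would then be passed to a FlinkKafkaConsumer in place of the plain JSON schema; it is only a sketch of the "catch the exception" idea, not an answer for the DDL-based table source.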

EXCEED 2021-12-08 11:28:23
1 Answer
  • I'm sorry, there is no such configuration for the JSON format currently. I think it makes sense to add a configuration like 'format.ignore-parse-errors', as in the CSV format. I created FLINK-15396 [1] to track this. *From the Flink mailing list archive, compiled by volunteers

    2021-12-08 14:31:52