CREATE TABLE test (
  myField2 DECIMAL,
  myField3 STRING,
  myField1 DECIMAL
) WITH (
  'connector' = 'kafka-0.11',
  'topic' = 'test',
  'properties.group.id' = 'test',
  'scan.startup-mode' = 'group-offsets',
  'properties.bootstrap.servers' = 'xxxx',
  'properties.security.protocol' = 'SASL_PLAINTEXT',
  'properties.sasl.mechanism' = 'SCRAM-SHA-256',
  'properties.sasl.jaas.config' = 'org.apache.kafka.common.security.scram.ScramLoginModule required username=aa password=aa',
  'format' = 'json'
);

When creating the table with the SQL above, why do I get the following error?

Caused by: org.apache.flink.table.api.ValidationException: Unsupported options found for connector 'kafka-0.11'.
Unsupported options:
scan.startup-mode
Supported options:
connector
format
json.fail-on-missing-field
json.ignore-parse-errors
json.timestamp-format.standard
properties.bootstrap.servers
properties.group.id
properties.sasl.jaas.config
properties.sasl.mechanism
properties.security.protocol
property-version
scan.startup.mode
scan.startup.specific-offsets
scan.startup.timestamp-millis
sink.partitioner
topic
    at org.apache.flink.table.factories.FactoryUtil$TableFactoryHelper.validate(FactoryUtil.java:487)
    at org.apache.flink.table.factories.FactoryUtil$TableFactoryHelper.validateExcept(FactoryUtil.java:519)
    at org.apache.flink.streaming.connectors.kafka.table.KafkaDynamicTableFactoryBase.createDynamicTableSource(KafkaDynamicTableFactoryBase.java:77)
    at org.apache.flink.table.factories.FactoryUtil.createTableSource(FactoryUtil.java:122)
    ... 18 more

Flink version: 1.11.0 #Flink
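For reference, the error output lists scan.startup.mode (with dots) among the supported options, while the DDL uses scan.startup-mode. Below is a minimal sketch of the same statement with only that option key renamed, assuming every other setting (topic, brokers, SASL credentials, format) stays exactly as in the question:

-- Sketch: identical to the original DDL except that the startup option
-- uses the key 'scan.startup.mode' listed under "Supported options",
-- instead of the unsupported 'scan.startup-mode'.
CREATE TABLE test (
  myField2 DECIMAL,
  myField3 STRING,
  myField1 DECIMAL
) WITH (
  'connector' = 'kafka-0.11',
  'topic' = 'test',
  'properties.group.id' = 'test',
  'scan.startup.mode' = 'group-offsets',
  'properties.bootstrap.servers' = 'xxxx',
  'properties.security.protocol' = 'SASL_PLAINTEXT',
  'properties.sasl.mechanism' = 'SCRAM-SHA-256',
  'properties.sasl.jaas.config' = 'org.apache.kafka.common.security.scram.ScramLoginModule required username=aa password=aa',
  'format' = 'json'
);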