
SQL client can't create a table: could someone take a look at what's going on here?

Flink SQL> create table hudi_flink_test1 (
uuid varchar(20)
,name varchar(10)
,age int
,ts timestamp(3)
,partition varchar(20)
)
PARTITIONED BY (partition)
with
( 'connector' = 'hudi'
,'path' = '/user/hhive/warehouse/hudi_flink/hudi_flink_test1'
);
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.sql.parser.impl.ParseException: Creating partitioned table is only allowed for HIVE dialect.
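For reference (not part of the original thread): partition is a reserved keyword in Flink SQL, so it has to be escaped with backticks when used as a column or partition field, and the "only allowed for HIVE dialect" parse error usually points to an older Flink release whose default dialect does not yet accept PARTITIONED BY (Hudi's Flink integration generally expects Flink 1.11 or later). A minimal sketch of the same DDL with the reserved word escaped, assuming such a Flink/Hudi combination:

-- sketch only: the DDL above with the reserved word `partition` escaped
create table hudi_flink_test1 (
uuid varchar(20)
,name varchar(10)
,age int
,ts timestamp(3)
,`partition` varchar(20)
)
PARTITIONED BY (`partition`)
with
( 'connector' = 'hudi'
,'path' = '/user/hhive/warehouse/hudi_flink/hudi_flink_test1'
);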

游客lkx6h3pxq6jya 2024-10-20 20:54:31
1 reply
  • If I don't create a partitioned table, the CREATE itself succeeds, but querying the table then fails (see the note after the error log below):

    Flink SQL> create table hudi_flink_test1 (
    uuid varchar(20)
    ,name varchar(10)
    ,age int
    ,ts timestamp(3)
    )
    with
    ( 'connector' = 'hudi'
    ,'path' = 'hdfs://hadoop102:8020/user/hhive/warehouse/hudi_flink/hudi_flink_test1'
    );
    [INFO] Table has been created.

    Flink SQL> desc hudi_flink_test1;
    [ERROR] Unknown or invalid SQL statement.

    Flink SQL> select * from hudi_flink_test1;
    [ERROR] Could not execute SQL statement. Reason:
    org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in
    the classpath.

    Reason: Required context properties mismatch.

    The following properties are requested:
    connector=hudi
    path=hdfs://hadoop102:8020/user/hhive/warehouse/hudi_flink/hudi_flink_test1
    schema.0.data-type=VARCHAR(20)
    schema.0.name=uuid
    schema.1.data-type=VARCHAR(10)
    schema.1.name=name
    schema.2.data-type=INT
    schema.2.name=age
    schema.3.data-type=TIMESTAMP(3)
    schema.3.name=ts

    The following factories have been considered:
    org.apache.flink.table.sources.CsvBatchTableSourceFactory
    org.apache.flink.table.sources.CsvAppendTableSourceFactory

    2024-10-20 21:11:26
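
For reference (not part of the original thread): the NoMatchingTableFactoryException above lists only the built-in CSV factories as candidates, which usually means the Hudi table factory was never found on the SQL client's classpath, for example because the hudi-flink bundle jar is neither in $FLINK_HOME/lib nor passed when starting the client (e.g. bin/sql-client.sh embedded -j hudi-flink-bundle_<scala>-<hudi-version>.jar, the jar name being only a placeholder), or because the bundle was built against a different Flink version. The failing desc is a separate, minor issue: older SQL clients typically accept only the full DESCRIBE keyword. A sketch of the follow-up session, assuming a bundle matching the running Flink version has been added and the client restarted:

-- sketch only: rerun after restarting the SQL client with a matching hudi-flink bundle on the classpath
Flink SQL> describe hudi_flink_test1;
Flink SQL> select * from hudi_flink_test1;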

