
Flink 1.12.0 sql-client fails to connect to Hive

1. Environment
    1. Flink 1.12.0
    2. Hive 2.1.1
    3. Jars built from the release-1.12 branch; the Hadoop cluster is picked up with export HADOOP_CLASSPATH=`hadoop classpath`
    4. The Flink lib directory contains the following jars (is something still missing? see the check after this list):
        flink-csv-1.12.jar
        flink-dist_2.11-1.12.jar
        flink-json-1.12.jar
        flink-shaded-zookeeper-3.4.14.jar
        flink-table_2.11-1.12.jar
        flink-table-blink_2.11-1.12.jar
        log4j-1.2-api-2.12.1.jar
        log4j-api-2.12.1.jar
        log4j-core-2.12.1.jar
        log4j-slf4j-impl-2.12.1.jar
    5. In conf/sql-client-defaults.yaml the only change was:
        catalogs: #[] # empty list
          - name: myhive
            type: hive
            hive-conf-dir: /etc/hive/conf
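For reference, none of the jars listed above contains the Hive catalog factory (it ships in the Hive connector, not in flink-table or flink-table-blink). A quick way to confirm, as a sketch using the install path from the startup step below:

    # If this prints nothing, no Hive connector is on the SQL Client's
    # classpath and a catalog of type=hive cannot be instantiated.
    ls /tmp/flink-1.12.0/lib/ | grep -i hive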

2. Startup:

    export HADOOP_CLASSPATH=`hadoop classpath`
    /tmp/flink-1.12.0/bin/sql-client.sh embedded

3. Error:

[yujianbo@qzcs86 conf]$ /tmp/flink-1.12.0/bin/sql-client.sh embedded
Setting HBASE_CONF_DIR=/etc/hbase/conf because no HBASE_CONF_DIR was set.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/flink-1.12.0/lib/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.1-1.cdh6.3.1.p0.1470567/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
No default environment specified.
Searching for '/tmp/flink-1.12.0/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/tmp/flink-1.12.0/conf/sql-client-defaults.yaml
No session environment specified.

Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:208)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
    at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:878)
    at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:226)
    at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:196)
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.CatalogFactory' in the classpath.

Reason: Required context properties mismatch.

The following properties are requested:
hive-conf-dir=/etc/hive/conf
type=hive

The following factories have been considered:
org.apache.flink.table.catalog.GenericInMemoryCatalogFactory
    at org.apache.flink.table.factories.TableFactoryService.filterByContext(TableFactoryService.java:322)
    at org.apache.flink.table.factories.TableFactoryService.filter(TableFactoryService.java:190)
    at org.apache.flink.table.factories.TableFactoryService.findSingleInternal(TableFactoryService.java:143)
    at org.apache.flink.table.factories.TableFactoryService.find(TableFactoryService.java:113)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:383)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:634)
    at java.util.HashMap.forEach(HashMap.java:1280)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:633)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:266)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:632)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:529)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:185)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:138)
    at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:867)
    ... 3 more

*From the Flink mailing list archive, curated by volunteers
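The factory list is the telling part: the only CatalogFactory discovered was GenericInMemoryCatalogFactory, so nothing on the classpath can satisfy type=hive. That matches the lib listing above, which contains no Hive connector jar. A minimal sketch of the fix, assuming the bundled connector from the Flink 1.12 Hive docs (the 2.2.0 bundle is documented to cover Hive 2.0.0-2.2.0, which includes 2.1.1; obtaining the jar is not shown here):

    # Assumption: jar obtained per the Flink 1.12 Hive connector docs.
    # Restart the SQL Client afterwards so the new factory is discovered.
    cp flink-sql-connector-hive-2.2.0_2.11-1.12.0.jar /tmp/flink-1.12.0/lib/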

毛毛虫雨 2021-12-08 11:36:51
1 answer
  • The docs below should solve your problem; they spell out which jars are needed (a sketch follows). Official links:
    https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/connectors/hive/
    https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/connectors/hive/hive_catalog.html
    https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/connectors/hive/hive_dialect.html
    *From the Flink mailing list archive, curated by volunteers
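As a sketch of what those pages describe: besides the single bundled jar shown earlier, the connector page lists an alternative of individual jars. The jar names and parcel path below are illustrative for Scala 2.11 builds and a CDH layout, not verified against this cluster:

    # Alternative to the bundled jar: the plain connector plus Hive's own hive-exec
    cp flink-connector-hive_2.11-1.12.0.jar /tmp/flink-1.12.0/lib/
    cp /opt/cloudera/parcels/CDH/lib/hive/lib/hive-exec-2.1.1*.jar /tmp/flink-1.12.0/lib/

    # Either way, restart the SQL Client and verify the catalog is registered:
    #   Flink SQL> SHOW CATALOGS;
    # 'myhive' should now appear alongside 'default_catalog'.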

    2021-12-08 16:10:32
