
Kerberos authentication exception when connecting Flink 1.10 to Hive

I am using Flink 1.10 with the Hive integration. Because the CDH cluster that Hive runs on has Kerberos authentication enabled, the following exception is thrown when connecting to the metastore. How should this be configured or resolved?

999  [main] INFO  hive.metastore - Trying to connect to metastore with URI thrift://namenode01.htsc.com:9083
1175 [main] ERROR org.apache.thrift.transport.TSaslTransport - SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
	at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
	at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:181)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:118)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.flink.table.catalog.hive.client.HiveShimV200.getHiveMetastoreClient(HiveShimV200.java:43)
	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240)
	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71)
	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35)
	at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:188)
	at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:102)
	at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:235)
	at com.htsc.crm_realtime.fatjob.Jobs.hive.HiveMetaJob.doJob(HiveMetaJob.java:44)
	at com.htsc.crm_realtime.fatjob.Jobs.JobEntryBase.run(JobEntryBase.java:50)
	at com.htsc.crm_realtime.fatjob.Jobs.hive.HiveMetaJob.main(HiveMetaJob.java:23)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
	at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
	at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
	at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
	at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
	... 34 more

(From the volunteer-curated Flink mailing list archive.)
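For context, the failure happens inside HiveCatalog.open() while the catalog is being registered (see TableEnvironmentImpl.registerCatalog in the trace). A registration along the lines of the sketch below is what triggers the metastore connection; the catalog name, default database, Hive conf directory, and Hive version here are placeholders, not the poster's actual code:

// Minimal sketch of registering a HiveCatalog in Flink 1.10.
// Names and paths are assumptions; hiveConfDir must contain the
// hive-site.xml that points at thrift://namenode01.htsc.com:9083.
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class HiveCatalogSketch {
    public static void main(String[] args) {
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inBatchMode()
                .build();
        TableEnvironment tableEnv = TableEnvironment.create(settings);

        String catalogName = "myhive";          // placeholder
        String defaultDatabase = "default";     // placeholder
        String hiveConfDir = "/etc/hive/conf";  // placeholder
        String hiveVersion = "2.0.0";           // placeholder; HiveShimV200 in the trace suggests a Hive 2.0.x client

        HiveCatalog hiveCatalog =
                new HiveCatalog(catalogName, defaultDatabase, hiveConfDir, hiveVersion);

        // HiveCatalog.open() runs inside registerCatalog(); on a Kerberized
        // cluster this is where the SASL/GSS negotiation in the trace fails
        // if the JVM holds no valid Kerberos credentials.
        tableEnv.registerCatalog(catalogName, hiveCatalog);
        tableEnv.useCatalog(catalogName);
    }
}

The GSSException itself ("Failed to find any Kerberos tgt") usually means the client JVM has no Kerberos credentials at all, i.e. neither a ticket cache nor a keytab login was available when the catalog was opened.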

船长的小螺号 2021-12-03 10:20:06
1 answer
  • While configuring Flink to connect to Hive on a cluster with Kerberos authentication enabled, I got rid of the exception after some exploration. However, the connection now asks me to enter a Kerberos username and password. My understanding is that once the keytab file path is specified, no username or password should be needed. Could anyone point out what might be misconfigured? My configuration is:

    security.kerberos.login.use-ticket-cache: false
    security.kerberos.login.keytab: /app/flink/flink-1.10.10/kerberos/flink_test.keytab
    security.kerberos.login.principal: flink_test@HADOOP.HTSC.COM

    (From the volunteer-curated Flink mailing list archive.)
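    A quick way to check, independently of Flink, that this keytab and principal actually work is to log in with Hadoop's UserGroupInformation directly. Below is a minimal sketch that reuses the principal and keytab path from the configuration above (both are assumptions about the environment; adjust as needed). If this standalone login fails, Flink's security.kerberos.login.* settings cannot succeed either, and no username/password prompt will fix it:

    // Minimal sketch: verify the keytab/principal with Hadoop's UGI before
    // handing them to Flink. Principal and keytab path are copied from the
    // flink-conf.yaml snippet above and are assumptions about the environment.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KeytabLoginCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Throws IOException if the keytab, principal, or krb5.conf/KDC
            // setup is wrong; success means the credentials themselves are fine
            // and the problem lies in how Flink/Hive pick them up.
            UserGroupInformation.loginUserFromKeytab(
                    "flink_test@HADOOP.HTSC.COM",
                    "/app/flink/flink-1.10.10/kerberos/flink_test.keytab");

            System.out.println("Logged in as: " + UserGroupInformation.getCurrentUser());
        }
    }

    Running this with -Dsun.security.krb5.debug=true prints the full Kerberos handshake, which helps tell apart a bad keytab from a bad krb5.conf or KDC setup.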

    2021-12-03 10:48:00