A Packet-Capture Look at How Kerberos Pre-Authentication Works (Part 2)

Summary: a packet-capture look at how Kerberos pre-authentication works.

3 How to Enable and Disable Pre-Authentication

Different implementations of the Kerberos protocol ship with different default settings for the pre-authentication switch:

  • MIT Kerberos: pre-authentication is disabled by default for every principal;
  • Microsoft AD: pre-authentication is enabled by default for every principal;
  • FreeIPA: pre-authentication is enabled by default for every principal.

You can view and change the pre-authentication switch of a given principal as follows (kadmin.local must be run as root on the KDC node):

  • View the setting: kadmin.local -q "getprinc dap2@CDH.COM"
  • Disable pre-authentication: kadmin.local -q "modprinc -requires_preauth dap2"
  • Enable pre-authentication: kadmin.local -q "modprinc +requires_preauth dap2"
  • Example commands are shown in the screenshot below, followed by a scripted check:

(screenshot: example kadmin.local getprinc / modprinc commands and output)
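As a minimal sketch of the scripted check (assuming it runs as root on the KDC node; the realm and principal names are the ones used in this article and are otherwise arbitrary), you can grep the getprinc output for the REQUIRES_PRE_AUTH attribute:

## Report the pre-authentication switch for a list of principals (sketch):
for p in dap@CDH.COM dap2@CDH.COM hive/uf30-3@CDH.COM; do
  # getprinc prints an "Attributes:" line; REQUIRES_PRE_AUTH appears on it
  # when +requires_preauth is set on the principal.
  if kadmin.local -q "getprinc ${p}" 2>/dev/null | grep -q "REQUIRES_PRE_AUTH"; then
    echo "${p}: pre-authentication REQUIRED"
  else
    echo "${p}: pre-authentication not required"
  fi
done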


  • Generally speaking, an ordinary user's key is derived from a password that is usually not very complex and can be brute-forced, so requires_preauth should be enabled for ordinary user principals (pre-authentication should be enabled for user accounts, or else the security of the account's Kerberos authentication will be weakened);
  • Generally speaking, a service principal's key is randomly generated, fairly complex and hard to brute-force, so requires_preauth can be either enabled or disabled for service principals (service principals should not be forced to use preauth);
  • Once requires_preauth is enabled on a service principal, when an ordinary user sends the KDC a TGS_REQ for that service, the KDC only issues a service ticket to users whose TGT was obtained through an AS_REQ that used the pre-authentication mechanism; ordinary users whose TGT was obtained via an AS_REQ without pre-authentication get a NO PREAUTH error instead (when +requires_preauth is set on a service principal, the KDC will only issue service tickets for that service principal if the client's initial authentication was performed using pre-authentication);
  • For example, with +requires_preauth enabled on the HIVE service principal, the ordinary user dap2 (which has +requires_preauth configured) can pass Kerberos authentication and create a JDBC connection to HiveServer2, whereas the ordinary user dap (which does not have +requires_preauth configured) fails with "sun.security.krb5.KrbException: Generic error (description in e-text) (60) - NO PREAUTH". The relevant KDC logs are as follows:
## dap2 has +requires_preauth configured:
May 11 11:33:11 uf30-1 krb5kdc[6377](info): AS_REQ (1 etypes {17}) 192.168.71.70: NEEDED_PREAUTH: dap2@CDH.COM for krbtgt/CDH.COM@CDH.COM, Additional pre-authentication required
May 11 11:33:11 uf30-1 krb5kdc[6377](info): closing down fd 13
May 11 11:33:11 uf30-1 krb5kdc[6377](info): AS_REQ (1 etypes {17}) 192.168.71.70: ISSUE: authtime 1683775991, etypes {rep=17 tkt=17 ses=17}, dap2@CDH.COM for krbtgt/CDH.COM@CDH.COM
May 11 11:33:11 uf30-1 krb5kdc[6377](info): closing down fd 13
May 11 11:33:22 uf30-1 krb5kdc[6377](info): TGS_REQ (1 etypes {17}) 192.168.71.70: ISSUE: authtime 1683775991, etypes {rep=17 tkt=17 ses=17}, dap2@CDH.COM for hive/uf30-3@CDH.COM
May 11 11:33:22 uf30-1 krb5kdc[6377](info): closing down fd 13
## dap does not have +requires_preauth configured:
May 11 13:29:22 uf30-1 krb5kdc[6377](info): AS_REQ (1 etypes {17}) 192.168.71.70: ISSUE: authtime 1683782962, etypes {rep=17 tkt=17 ses=17}, dap@CDH.COM for krbtgt/CDH.COM@CDH.COM
May 11 13:29:22 uf30-1 krb5kdc[6377](info): closing down fd 13
May 11 13:29:35 uf30-1 krb5kdc[6377](info): TGS_REQ (1 etypes {17}) 192.168.71.70: NO PREAUTH: authtime 0,  dap@CDH.COM for hive/uf30-3@CDH.COM, Generic error (see e-text)
May 11 13:29:35 uf30-1 krb5kdc[6377](info): closing down fd 13

The corresponding client-side logs are as follows:

## dap does not have +requires_preauth configured:
[root@uf30-1 ~]# export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true ${HADOOP_OPTS}"
[root@uf30-1 ~]# kinit dap
Password for dap@CDH.COM: 
[root@uf30-1 ~]# beeline -u  "jdbc:hive2://uf30-3:10000/default;principal=hive/_HOST@CDH.COM" 
WARNING: Use "yarn jar" to launch YARN applications.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://uf30-3:10000/default;principal=hive/_HOST@CDH.COM
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>> KdcAccessibility: reset
>>> KdcAccessibility: reset
>>>KinitOptions cache name is /tmp/krb5cc_0
>>>DEBUG <CCacheInputStream>  client principal is dap@CDH.COM
>>>DEBUG <CCacheInputStream> server principal is krbtgt/CDH.COM@CDH.COM
>>>DEBUG <CCacheInputStream> key type: 17
>>>DEBUG <CCacheInputStream> auth time: Thu May 11 13:29:22 CST 2023
>>>DEBUG <CCacheInputStream> start time: Thu May 11 13:29:22 CST 2023
>>>DEBUG <CCacheInputStream> end time: Fri May 12 13:29:22 CST 2023
>>>DEBUG <CCacheInputStream> renew_till time: Thu May 18 13:29:22 CST 2023
>>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL;
>>>DEBUG <CCacheInputStream>  client principal is dap@CDH.COM
>>>DEBUG <CCacheInputStream> server principal is X-CACHECONF:/krb5_ccache_conf_data/fast_avail/krbtgt/CDH.COM@CDH.COM@CDH.COM
>>>DEBUG <CCacheInputStream> key type: 0
>>>DEBUG <CCacheInputStream> auth time: Thu Jan 01 08:00:00 CST 1970
>>>DEBUG <CCacheInputStream> start time: null
>>>DEBUG <CCacheInputStream> end time: Thu Jan 01 08:00:00 CST 1970
>>>DEBUG <CCacheInputStream> renew_till time: null
>>> CCacheInputStream: readFlags() 
Found ticket for dap@CDH.COM to go to krbtgt/CDH.COM@CDH.COM expiring on Fri May 12 13:29:22 CST 2023
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for dap@CDH.COM to go to krbtgt/CDH.COM@CDH.COM expiring on Fri May 12 13:29:22 CST 2023
Service ticket not found in the subject
>>> Credentials acquireServiceCreds: same realm
default etypes for default_tgs_enctypes: 17.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbKdcReq send: kdc=uf30-1 TCP:88, timeout=3000, number of retries =3, #bytes=603
>>> KDCCommunication: kdc=uf30-1 TCP:88, timeout=3000,Attempt =1, #bytes=603
>>>DEBUG: TCPClient reading 147 bytes
>>> KrbKdcReq send: #bytes read=147
>>> KdcAccessibility: remove uf30-1
>>> KDCRep: init() encoding tag is 126 req type is 13
>>>KRBError:
         cTime is Sat Apr 03 20:22:27 CST 1976 197382147000
         sTime is Thu May 11 13:29:35 CST 2023 1683782975000
         suSec is 741281
         error code is 60
         error Message is Generic error (description in e-text)
         cname is dap@CDH.COM
         sname is hive/uf30-3@CDH.COM
         msgType is 30
KrbException: Generic error (description in e-text) (60) - NO PREAUTH
        at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
        at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251)
        at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262)
        at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308)
        at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126)
        at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458)
        at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:229)
        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:184)
        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
        at java.sql.DriverManager.getConnection(DriverManager.java:664)
        at java.sql.DriverManager.getConnection(DriverManager.java:208)
        at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145)
        at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209)
        at org.apache.hive.beeline.Commands.connect(Commands.java:1617)
        at org.apache.hive.beeline.Commands.connect(Commands.java:1512)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56)
        at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1290)
        at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1329)
        at org.apache.hive.beeline.BeeLine.connectUsingArgs(BeeLine.java:864)
        at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:768)
        at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1004)
        at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:526)
        at org.apache.hive.beeline.BeeLine.main(BeeLine.java:508)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:313)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:227)
Caused by: KrbException: Identifier doesn't match expected value (906)
        at sun.security.krb5.internal.KDCRep.init(KDCRep.java:140)
        at sun.security.krb5.internal.TGSRep.init(TGSRep.java:65)
        at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60)
        at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55)
        ... 45 more
23/05/11 13:29:35 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[?:1.8.0_181]
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) [hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) [hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) [hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) [hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
        at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_181]
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) [hadoop-common-3.0.0-cdh6.3.2.jar:?]
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) [hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:229) [hive-jdbc-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:184) [hive-jdbc-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [hive-jdbc-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at java.sql.DriverManager.getConnection(DriverManager.java:664) [?:1.8.0_181]
        at java.sql.DriverManager.getConnection(DriverManager.java:208) [?:1.8.0_181]
        at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.Commands.connect(Commands.java:1617) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.Commands.connect(Commands.java:1512) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
        at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1290) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1329) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.connectUsingArgs(BeeLine.java:864) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:768) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1004) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:526) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at org.apache.hive.beeline.BeeLine.main(BeeLine.java:508) [hive-beeline-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
        at org.apache.hadoop.util.RunJar.run(RunJar.java:313) [hadoop-common-3.0.0-cdh6.3.2.jar:?]
        at org.apache.hadoop.util.RunJar.main(RunJar.java:227) [hadoop-common-3.0.0-cdh6.3.2.jar:?]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Generic error (description in e-text) (60) - NO PREAUTH)
        at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:770) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_181]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_181]
        ... 36 more
Caused by: sun.security.krb5.KrbException: Generic error (description in e-text) (60) - NO PREAUTH
        at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73) ~[?:1.8.0_181]
        at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251) ~[?:1.8.0_181]
        at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262) ~[?:1.8.0_181]
        at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308) ~[?:1.8.0_181]
        at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126) ~[?:1.8.0_181]
        at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458) ~[?:1.8.0_181]
        at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_181]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_181]
        ... 36 more
Caused by: sun.security.krb5.Asn1Exception: Identifier doesn't match expected value (906)
        at sun.security.krb5.internal.KDCRep.init(KDCRep.java:140) ~[?:1.8.0_181]
        at sun.security.krb5.internal.TGSRep.init(TGSRep.java:65) ~[?:1.8.0_181]
        at sun.security.krb5.internal.TGSRep.<init>(TGSRep.java:60) ~[?:1.8.0_181]
        at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:55) ~[?:1.8.0_181]
        at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:251) ~[?:1.8.0_181]
        at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:262) ~[?:1.8.0_181]
        at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:308) ~[?:1.8.0_181]
        at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:126) ~[?:1.8.0_181]
        at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458) ~[?:1.8.0_181]
        at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248) ~[?:1.8.0_181]
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[?:1.8.0_181]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[?:1.8.0_181]
        ... 36 more
23/05/11 13:29:35 [main]: WARN jdbc.HiveConnection: Failed to connect to uf30-3:10000
Unknown HS2 problem when communicating with Thrift server.
Error: Could not open client transport with JDBC Uri: jdbc:hive2://uf30-3:10000/default;principal=hive/_HOST@CDH.COM: GSS initiate failed (state=08S01,code=0)
Beeline version 2.1.1-cdh6.3.2 by Apache Hive
beeline> [root@uf30-1 ~]# 
## dap2 has +requires_preauth configured:
[root@uf30-1 ~]# kinit dap2 -kt dap2-liming.keytab 
[root@uf30-1 ~]# beeline -u  "jdbc:hive2://uf30-3:10000/default;principal=hive/_HOST@CDH.COM" 
WARNING: Use "yarn jar" to launch YARN applications.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://uf30-3:10000/default;principal=hive/_HOST@CDH.COM
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>> KdcAccessibility: reset
>>> KdcAccessibility: reset
>>>KinitOptions cache name is /tmp/krb5cc_0
>>>DEBUG <CCacheInputStream>  client principal is dap2@CDH.COM
>>>DEBUG <CCacheInputStream> server principal is krbtgt/CDH.COM@CDH.COM
>>>DEBUG <CCacheInputStream> key type: 17
>>>DEBUG <CCacheInputStream> auth time: Thu May 11 13:36:55 CST 2023
>>>DEBUG <CCacheInputStream> start time: Thu May 11 13:36:55 CST 2023
>>>DEBUG <CCacheInputStream> end time: Fri May 12 13:36:55 CST 2023
>>>DEBUG <CCacheInputStream> renew_till time: Thu May 18 13:36:55 CST 2023
>>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL; PRE_AUTH;
>>>DEBUG <CCacheInputStream>  client principal is dap2@CDH.COM
>>>DEBUG <CCacheInputStream> server principal is X-CACHECONF:/krb5_ccache_conf_data/fast_avail/krbtgt/CDH.COM@CDH.COM@CDH.COM
>>>DEBUG <CCacheInputStream> key type: 0
>>>DEBUG <CCacheInputStream> auth time: Thu Jan 01 08:00:00 CST 1970
>>>DEBUG <CCacheInputStream> start time: null
>>>DEBUG <CCacheInputStream> end time: Thu Jan 01 08:00:00 CST 1970
>>>DEBUG <CCacheInputStream> renew_till time: null
>>> CCacheInputStream: readFlags() 
>>>DEBUG <CCacheInputStream>  client principal is dap2@CDH.COM
>>>DEBUG <CCacheInputStream> server principal is X-CACHECONF:/krb5_ccache_conf_data/pa_type/krbtgt/CDH.COM@CDH.COM@CDH.COM
>>>DEBUG <CCacheInputStream> key type: 0
>>>DEBUG <CCacheInputStream> auth time: Thu Jan 01 08:00:00 CST 1970
>>>DEBUG <CCacheInputStream> start time: null
>>>DEBUG <CCacheInputStream> end time: Thu Jan 01 08:00:00 CST 1970
>>>DEBUG <CCacheInputStream> renew_till time: null
>>> CCacheInputStream: readFlags() 
Found ticket for dap2@CDH.COM to go to krbtgt/CDH.COM@CDH.COM expiring on Fri May 12 13:36:55 CST 2023
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for dap2@CDH.COM to go to krbtgt/CDH.COM@CDH.COM expiring on Fri May 12 13:36:55 CST 2023
Service ticket not found in the subject
>>> Credentials acquireServiceCreds: same realm
default etypes for default_tgs_enctypes: 17.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbKdcReq send: kdc=uf30-1 TCP:88, timeout=3000, number of retries =3, #bytes=605
>>> KDCCommunication: kdc=uf30-1 TCP:88, timeout=3000,Attempt =1, #bytes=605
>>>DEBUG: TCPClient reading 580 bytes
>>> KrbKdcReq send: #bytes read=580
>>> KdcAccessibility: remove uf30-1
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
>>> KrbApReq: APOptions are 00100000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting mySeqNumber to: 180313328
Created InitSecContextToken:
0000: 01 00 6E 82 02 09 30 82   02 05 A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 20 00 00 00 A3 82 01  ......... ......
0020: 32 61 82 01 2E 30 82 01   2A A0 03 02 01 05 A1 09  2a...0..*.......
0030: 1B 07 43 44 48 2E 43 4F   4D A2 19 30 17 A0 03 02  ..CDH.COM..0....
0040: 01 00 A1 10 30 0E 1B 04   68 69 76 65 1B 06 75 66  ....0...hive..uf
0050: 33 30 2D 33 A3 81 FC 30   81 F9 A0 03 02 01 11 A1  30-3...0........
0060: 03 02 01 02 A2 81 EC 04   81 E9 53 6E 8B FC 60 5C  ..........Sn..`\
0070: F4 F0 F5 47 94 2B 7E E4   63 A8 E7 FF 7B 52 B4 E5  ...G.+..c....R..
0080: BB E5 04 4C 82 D5 1F 7A   0B DC 08 9E 67 12 80 E9  ...L...z....g...
0090: B9 B7 75 73 E5 77 15 BC   08 E1 F1 06 78 01 63 B5  ..us.w......x.c.
00A0: 26 F5 20 FD 72 37 DA D2   05 E3 8A BB 36 1F E4 92  &. .r7......6...
00B0: 7D ED 00 89 77 23 AF 35   2F 9F 9A D1 27 1F 9E FC  ....w#.5/...'...
00C0: B0 EA 00 90 87 57 42 6B   0E 0C 6B 93 90 F6 10 EE  .....WBk..k.....
00D0: 85 F9 81 33 05 77 13 B9   91 1D 0C 14 0A 90 F3 36  ...3.w.........6
00E0: 35 C6 4B AD 2E DB E5 A2   CA C8 61 EE 7D F8 D6 EB  5.K.......a.....
00F0: 54 46 F9 C3 71 1D 01 E7   B3 FC 34 A7 B0 F0 A1 D3  TF..q.....4.....
0100: AD 16 B9 91 09 DD 48 BA   C8 72 36 BD C9 97 7D 51  ......H..r6....Q
0110: 05 33 25 9D DB B5 B3 32   01 40 2E 7F 09 18 5F 7D  .3%....2.@...._.
0120: 8F 54 19 B4 36 90 06 D7   FB 58 43 FC 61 25 17 17  .T..6....XC.a%..
0130: CC 06 7A D0 14 B6 08 29   0D 2D 93 BC F7 72 AD 8C  ..z....).-...r..
0140: 83 D4 C9 33 36 19 F0 12   60 FF 9B 53 E4 94 64 DE  ...36...`..S..d.
0150: 17 7C 3D A4 81 B9 30 81   B6 A0 03 02 01 11 A2 81  ..=...0.........
0160: AE 04 81 AB AF 02 AB 97   B9 B7 65 8D 20 C7 45 27  ..........e. .E'
0170: 7E C7 64 D1 3D FA 86 57   E9 C6 0C 80 80 E2 96 D0  ..d.=..W........
0180: 76 76 4F 7C CD A6 BC 15   24 A9 5D 75 C9 E9 B7 42  vvO.....$.]u...B
0190: B2 39 3A D9 16 57 48 A4   74 69 D3 3A 6E 01 E5 C0  .9:..WH.ti.:n...
01A0: 8A 8E 4B 13 31 B1 0C 3F   82 78 F2 95 94 02 11 4F  ..K.1..?.x.....O
01B0: 34 9B 65 8F AD 75 29 CB   62 4D 5A 45 AD 34 5B 33  4.e..u).bMZE.4[3
01C0: 17 F9 B6 2B 76 91 7D 82   11 53 93 CD 36 44 63 C8  ...+v....S..6Dc.
01D0: CB 73 6F 41 B8 B1 32 63   C6 2D 23 CC E0 2C 3A EE  .soA..2c.-#..,:.
01E0: 74 5A A6 3C 8A 35 B9 D7   A7 70 8E 29 73 61 BF D1  tZ.<.5...p.)sa..
01F0: 23 5E 61 7F 9C D4 93 1D   14 F6 C6 DA D1 56 DD C8  #^a..........V..
0200: D2 BB A5 B3 D6 7C 12 6A   3F 21 A0 DF 7A A6 46     .......j?!..z.F
Entered Krb5Context.initSecContext with state=STATE_IN_PROCESS
>>> EType: sun.security.krb5.internal.crypto.Aes128CtsHmacSha1EType
Krb5Context setting peerSeqNumber to: 785458870
Krb5Context.unwrap: token=[05 04 01 ff 00 0c 00 00 00 00 00 00 2e d1 26 b6 01 01 00 00 80 15 9c 2f 93 6b 06 64 c4 4c 6f 6c ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[05 04 00 ff 00 0c 00 00 00 00 00 00 0a bf 5c f0 01 01 00 00 95 87 20 ba 2a 12 c3 e6 ad d3 85 69 ]
Connected to: Apache Hive (version 2.1.1-cdh6.3.2)
Driver: Hive JDBC (version 2.1.1-cdh6.3.2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 2.1.1-cdh6.3.2 by Apache Hive
0: jdbc:hive2://uf30-3:10000/default> show databases;
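The JVM debug output above also shows the difference in the credential-cache flags: FORWARDABLE; RENEWABLE; INITIAL for dap versus FORWARDABLE; RENEWABLE; INITIAL; PRE_AUTH for dap2. As a quick shell-side sketch of the same check (the keytab name is the one used above and is otherwise arbitrary), MIT klist can print these flags directly, where the flag letter A means the ticket was obtained with pre-authentication:

## Inspect ticket flags in the credential cache on the client node:
kinit -kt dap2-liming.keytab dap2
klist -f
# In the flags column, I = initial ticket (obtained via AS_REQ) and
# A = pre-authenticated; a TGT obtained without pre-authentication
# will not carry the A flag.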
  • The TGS_REP datagram for the ordinary user dap, who does not have +requires_preauth enabled, is captured as follows:

(screenshot: TGS_REP packet capture for dap)

  • The TGS_REP datagram for the ordinary user dap2, who has +requires_preauth enabled, is captured as follows (a capture command sketch follows the screenshot):

(screenshot: TGS_REP packet capture for dap2)
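As a minimal sketch for reproducing captures like the ones above (the capture interface and output path are assumptions; Kerberos KDC traffic uses port 88 by default), the traffic can be captured on the client or KDC node and then filtered for the Kerberos protocol:

## Capture KDC traffic while reproducing the kinit/beeline steps:
tcpdump -i any -w /tmp/krb.pcap port 88
## Inspect the AS/TGS exchanges in the capture, e.g. with tshark:
tshark -r /tmp/krb.pcap -Y "kerberos" -O kerberos | less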

4 How to Troubleshoot Kerberos-Related Issues

To wrap up, here is a summary of how to troubleshoot Kerberos-related issues.

4.1 Check the Client-Side Logs

The first step is to check the client-side logs. The Kerberos log level on the client can be raised in the following ways, after which the client prints much more detailed logs (a combined shell example is sketched after this list):

  • Generic, OS-level Kerberos debugging: export KRB5_TRACE=/tmp/kinit.log (background: a user can set the environment variable KRB5_TRACE to a file name or to /dev/stdout, and Kerberos programs such as kinit, klist and kvno, as well as the Kerberos libraries libkrb5*, will start printing much more detailed traces; this is a very powerful feature and can be used to debug any program that uses the Kerberos libraries);
  • For Java programs, enable JVM Kerberos library logging: -Dsun.security.krb5.debug=true
  • For Java programs, enable JVM SPNEGO logging: -Dsun.security.spnego.debug=true
  • For Hadoop, enable Hadoop-side JAAS debugging: export HADOOP_JAAS_DEBUG=true
  • For Hadoop, pass the JVM flags through the HADOOP_OPTS environment variable: export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true ${HADOOP_OPTS}"
  • If the environment variable HADOOP_JAAS_DEBUG is set to true, the Hadoop UGI will set the "debug" flag in any JAAS configuration it creates;
  • After setting environment variables for Hadoop, you can verify them from the shell, e.g. with echo $HADOOP_OPTS.
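As a minimal sketch combining the switches above (the HiveServer2 URL and principal names are taken from the earlier example; adapt them to your environment), a client-side debugging session might look like this:

## Enable library-level and Hadoop-side Kerberos debugging on the client:
export KRB5_TRACE=/tmp/kinit.log
export HADOOP_JAAS_DEBUG=true
export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true -Dsun.security.spnego.debug=true ${HADOOP_OPTS}"
## Reproduce the problem, then read the traces:
kinit dap2
beeline -u "jdbc:hive2://uf30-3:10000/default;principal=hive/_HOST@CDH.COM"
less /tmp/kinit.log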

4.2 Check the KDC Server Logs

The next step is to check the KDC server logs, which are usually located at /var/log/krb5kdc.log.
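As a quick sketch (assuming the default log location above), filtering the KDC log for the pre-authentication related keywords that appear in the log excerpts earlier in this article (NEEDED_PREAUTH, NO PREAUTH, ISSUE) quickly shows whether requests are being rejected for missing pre-authentication:

## Follow the KDC log and keep only pre-authentication related entries:
tail -f /var/log/krb5kdc.log | grep -E "NEEDED_PREAUTH|NO PREAUTH|ISSUE"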

4.3 Verify That a Keytab File Is Still Valid

Once a principal's password is changed on the server side, every keytab file previously distributed to clients becomes invalid, so clients need to verify that their keytabs are still valid. You can inspect a keytab with klist -ekt xxx.keytab and pay particular attention to the KVNO column, which holds the key version number: newer keys carry a larger KVNO. If the KVNO of a principal in a freshly exported keytab is larger than the KVNO in the original keytab, the original keytab has indeed become invalid.
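As a minimal sketch (the keytab file name and principal are the ones used earlier in this article and are otherwise arbitrary), you can list the KVNOs and then test the keytab end to end with kinit; a stale keytab fails at the kinit step:

## List the entries and key version numbers (KVNO) in the keytab:
klist -ekt dap2-liming.keytab
## Definitive test: authenticate with the keytab; a stale keytab fails here:
kinit -kt dap2-liming.keytab dap2@CDH.COM && echo "keytab OK" || echo "keytab invalid or stale"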

5 Related Commands and Reference Links

## Related commands:
- kadmin.local -q "xst -norandkey -k dap2.keytab dap2"
- kadmin.local -q "getprinc hive/uf30-3@CDH.COM"
- kadmin.local -q "getprinc dap2@CDH.COM"
- kadmin.local -q "getprinc krbtgt/CDH.COM@CDH.COM"
- kadmin.local -q "modprinc +requires_preauth dap2@CDH.COM"
- kadmin.local -q "modprinc +requires_preauth hive/uf30-3@CDH.COM"
- kadmin.local -q "modprinc -requires_preauth dap2" 
- kadmin.local -q "addprinc -randkey +requires_hwauth xx"
- kadmin.local -q "addprinc -randkey +requires_hwauth hdfs/wyx001.steven.com@xxx.COM"
- Verify keytab contents: klist -ekt /opt/flink-1.12.2/flink.keytab
## Reference links:
- https://www.nopsec.com/blog/exploiting-kerberos-for-lateral-movement-and-privilege-escalation/
- https://forestall.io/blog/en/kerberos-protocol-security-1/
- https://unit42.paloaltonetworks.com/next-gen-kerberos-attacks/
- https://datatracker.ietf.org/doc/rfc6113/
- https://www.ietf.org/rfc/rfc4120.txt
- https://cwiki.apache.org/confluence/display/DIRxPMGT/Kerberos+PA-DATA