I'm trying to set up Kerberos authentication for HBase using this documentation (link) and have made very little progress so far.
It is Apache HBase 1.1.1, with no Cloudera involvement. The host machine runs CentOS 6.5.
I have already set up the Kerberos KDC and client by following these instructions (link). The KDC is located on the same machine as the HBase instance I'm trying to secure.
Overall, here is the current state of the environment: the keytab file is at /opt/hbase.keytab
Contents of hbase-site.xml:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///opt/hbase-data/hbase</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/opt/hbase-data/zookeeper</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.security.authentication</name>
    <value>kerberos</value>
  </property>
  <property>
    <name>hbase.security.authorization</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.coprocessor.region.classes</name>
    <value>org.apache.hadoop.hbase.security.token.TokenProvider</value>
  </property>
  <property>
    <name>hbase.master.keytab.file</name>
    <value>/opt/hbase.keytab</value>
  </property>
  <property>
    <name>hbase.master.kerberos.principal</name>
    <value>hbase/[email protected]</value>
  </property>
  <property>
    <name>hbase.regionserver.kerberos.principal</name>
    <value>hbase/[email protected]</value>
  </property>
  <property>
    <name>hbase.regionserver.keytab.file</name>
    <value>/opt/hbase.keytab</value>
  </property>
</configuration>
It is running in pseudo-distributed mode, and I didn't bother with an underlying HDFS in order to keep things as simple as possible.
However, when I start HBase with ./start-hbase, I get the following error in the region server's .log file:
2015-10-20 17:33:18,068 INFO [regionserver/xxx.mycompany.com/172.24.4.60:16201] regionserver.HRegionServer: reportForDuty to master=xxx.mycompany.com,16000,1445349909162 with port=16201, startcode=1445349910087
2015-10-20 17:33:18,071 WARN [regionserver/xxx.mycompany.com/172.24.4.60:16201] ipc.AbstractRpcClient: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2015-10-20 17:33:18,071 FATAL [regionserver/xxx.mycompany.com/172.24.4.60:16201] ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:609)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:154)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:735)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:732)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:732)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:885)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:854)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1180)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.RegionServerStatusProtos$RegionServerStatusService$BlockingStub.regionServerStartup(RegionServerStatusProtos.java:8982)
at org.apache.hadoop.hbase.regionserver.HRegionServer.reportForDuty(HRegionServer.java:2260)
at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:893)
at java.lang.Thread.run(Thread.java:745)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
... 18 more
2015-10-20 17:33:18,072 WARN [regionserver/xxx.mycompany.com/172.24.4.60:16201] regionserver.HRegionServer: error telling master we are up
com.google.protobuf.ServiceException: java.io.IOException: Could not set up IO Streams to xxx.mycompany.com/172.24.4.60:16000
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:223)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.RegionServerStatusProtos$RegionServerStatusService$BlockingStub.regionServerStartup(RegionServerStatusProtos.java:8982)
at org.apache.hadoop.hbase.regionserver.HRegionServer.reportForDuty(HRegionServer.java:2260)
at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:893)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Could not set up IO Streams to xxx.mycompany.com/172.24.4.60:16000
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:777)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:885)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:854)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1180)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
... 5 more
Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:677)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:635)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:743)
... 9 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:609)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:154)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:735)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:732)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:732)
... 9 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
... 18 more
2015-10-20 17:33:18,073 WARN [regionserver/xxx.mycompany.com/172.24.4.60:16201] regionserver.HRegionServer: reportForDuty failed; sleeping and then retrying.
I assume Kerberos itself is working, because I can do the following:
$ klist -ekt hbase.keytab
Keytab name: FILE:hbase.keytab
KVNO Timestamp Principal
---- ----------------- --------------------------------------------------------
3 10/19/15 17:11:42 hbase/[email protected] (arcfour-hmac)
3 10/19/15 17:11:42 hbase/[email protected] (des3-cbc-sha1)
3 10/19/15 17:11:42 hbase/[email protected] (des-cbc-crc)
$ kinit -kt /opt/hbase.keytab hbase/[email protected]
[userx1@gms-01 logs]$ klist
Ticket cache: FILE:/tmp/krb5cc_2369
Default principal: hbase/[email protected]
Valid starting Expires Service principal
10/20/15 17:49:32 10/21/15 03:49:32 krbtgt/[email protected]
renew until 10/27/15 16:49:32
The HBase shell produces the same exception shown above when I try to run the status command (or any other command).
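For reference, this is roughly the sequence I run, with the same keytab and principal as in the klist output above:
$ kinit -kt /opt/hbase.keytab hbase/[email protected]
$ hbase shell
hbase(main):001:0> status
The status call then fails with the same GSS initiate failed stack trace as in the region server log.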
If anyone has any suggestions or advice, please let me know.
Thanks in advance.
I faced the same problem and solved it by starting the region server separately. Make sure that you have a valid Kerberos TGT when you are starting the daemons.
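Something along these lines, for example, using the keytab and principal from the question (hbase-daemon.sh is in the HBase bin directory; the exact sequence is just how I did it):
$ kinit -kt /opt/hbase.keytab hbase/[email protected]
$ bin/hbase-daemon.sh start master
$ bin/hbase-daemon.sh start regionserver
With a fresh TGT in the ticket cache before each daemon was started, the region server was able to report for duty to the master for me.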