I hit an error when configuring secure Hadoop: org.apache.hadoop.security.AccessControlException

I am trying to set up secure Hadoop with Kerberos. I started the KDC server, generated the keytabs, and copied each one to its corresponding node. Kerberos itself works fine (verified with kinit), but when I try to start the namenode I run into a strange error. I modified core-site.xml and hdfs-site.xml.
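For context, the checks I used to confirm Kerberos works are roughly the following (the keytab path and principal match the configs below; this is a sketch of my verification, not an exact transcript):

```
# List the principals stored in the keytab the namenode will use
klist -k /usr/local/src/hadoop-1.2.1/conf/hdfs.keytab

# Obtain a TGT from the KDC using that keytab, then confirm it is cached
kinit -kt /usr/local/src/hadoop-1.2.1/conf/hdfs.keytab hdfs/master.hadoop@HADOOP
klist
```

Both commands succeed, so the keytab and the KDC appear to be fine.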

Here is my core-site.xml:

I searched for answers but could not find a suitable solution. If anyone can help, many thanks.

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://Master.Hadoop:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/hadoop/tmp</value>
        <description>A base for other temporary directories.</description>
    </property>
    <property>
        <name>hadoop.security.authentication</name>
        <value>kerberos</value> <!-- A value of "simple" would disable security. -->
    </property>
    <property>
        <name>hadoop.security.authorization</name>
        <value>true</value>
    </property>
</configuration>
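As far as I understand, setting hadoop.security.authorization to true also activates the service-level ACLs in conf/hadoop-policy.xml. The shipped defaults open every protocol to all users; an excerpt looks roughly like this (reference sketch of the stock file, not my own edits):

```xml
<!-- Excerpt of the stock conf/hadoop-policy.xml defaults (for reference) -->
<configuration>
    <property>
        <name>security.client.protocol.acl</name>
        <value>*</value> <!-- "*" = any user/group may use the client-to-namenode protocol -->
    </property>
    <property>
        <name>security.datanode.protocol.acl</name>
        <value>*</value> <!-- datanode-to-namenode protocol -->
    </property>
</configuration>
```

So I do not think the ACLs themselves are blocking anything here.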

Here is my hdfs-site.xml:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>2</value>
    </property>
    <property>
        <name>dfs.name.dir</name>
        <value>/hadoop/hdfs/name</value>
    </property>
    <property>
        <name>dfs.data.dir</name>
        <value>/hadoop/hdfs/data</value>
    </property>

    <!-- General HDFS security config -->
    <property>
        <name>dfs.block.access.token.enable</name>
        <value>true</value>
    </property>

    <!-- NameNode security config -->
    <property>
        <name>dfs.https.address</name>
        <value>master.hadoop:50470</value>
    </property>
    <property>
        <name>dfs.https.port</name>
        <value>50470</value>
    </property>
    <property>
        <name>dfs.namenode.keytab.file</name>
        <value>/usr/local/src/hadoop-1.2.1/conf/hdfs.keytab</value> <!-- path to the HDFS keytab -->
    </property>
    <property>
        <name>dfs.namenode.kerberos.principal</name>
        <value>hdfs/master.hadoop@HADOOP</value>
    </property>
    <property>
        <name>dfs.namenode.kerberos.https.principal</name>
        <value>host/master.hadoop@HADOOP</value>
    </property>

    <!-- Secondary NameNode security config -->
    <property>
        <name>dfs.secondary.https.address</name>
        <value>master.hadoop:50495</value>
    </property>
    <property>
        <name>dfs.secondary.https.port</name>
        <value>50495</value>
    </property>
    <property>
        <name>dfs.secondary.namenode.keytab.file</name>
        <value>/usr/local/src/hadoop-1.2.1/conf/hdfs.keytab</value> <!-- path to the HDFS keytab -->
    </property>
    <property>
        <name>dfs.secondary.namenode.kerberos.principal</name>
        <value>hdfs/master.hadoop@HADOOP</value>
    </property>
    <property>
        <name>dfs.secondary.namenode.kerberos.https.principal</name>
        <value>host/master.hadoop@HADOOP</value>
    </property>

    <!-- DataNode security config -->
    <property>
        <name>dfs.datanode.data.dir.perm</name>
        <value>700</value>
    </property>
    <property>
        <name>dfs.datanode.address</name>
        <value>0.0.0.0:1004</value>
    </property>
    <property>
        <name>dfs.datanode.http.address</name>
        <value>0.0.0.0:1006</value>
    </property>
    <property>
        <name>dfs.datanode.keytab.file</name>
        <value>/usr/local/src/hadoop-1.2.1/conf/hdfs.keytab</value> <!-- path to the HDFS keytab -->
    </property>
    <property>
        <name>dfs.datanode.kerberos.principal</name>
        <value>hdfs/slave1.hadoop@HADOOP</value>
    </property>
</configuration>

But when I try to start the namenode, I run into an error.

I noticed this line in the output: Login successful for user hdfs/master.hadoop@HADOOP using keytab file /usr/local/src/hadoop-1.2.1/conf/hdfs.keytab

So the Kerberos login itself succeeds. Why, then, this exception: "Authorization (hadoop.security.authorization) is enabled but authentication (hadoop.security.authentication) is configured as simple. Please configure another method like kerberos or digest."?
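My understanding is that this exception is thrown on the server side whenever an incoming connection negotiates simple authentication while the namenode enforces authorization, so the same security settings have to be visible to every node and client JVM, not just the master. In other words, each machine's core-site.xml should contain at least this fragment (a minimal sketch, mirroring the master's config above):

```xml
<!-- Minimal security settings every node/client must load; without them
     a process falls back to simple auth and the namenode rejects it -->
<property>
    <name>hadoop.security.authentication</name>
    <value>kerberos</value>
</property>
<property>
    <name>hadoop.security.authorization</name>
    <value>true</value>
</property>
```

If that is right, the exception would mean some process connecting to port 9000 is not picking up these settings.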

Here is the error log:

14/03/06 16:21:09 INFO namenode.NameNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = Master.Hadoop/192.168.128.132
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.2.1
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by 'mattf' on Mon Jul 22 15:23:09 PDT 2013
STARTUP_MSG:   java = 1.7.0_09
************************************************************/
14/03/06 16:21:10 INFO impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
14/03/06 16:21:10 INFO impl.MetricsSourceAdapter: MBean for source MetricsSystem,sub=Stats registered.
14/03/06 16:21:10 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
14/03/06 16:21:10 INFO impl.MetricsSystemImpl: NameNode metrics system started
14/03/06 16:21:10 INFO impl.MetricsSourceAdapter: MBean for source ugi registered.
14/03/06 16:21:10 INFO security.UserGroupInformation: Login successful for user hdfs/master.hadoop@HADOOP using keytab file /usr/local/src/hadoop-1.2.1/conf/hdfs.keytab
14/03/06 16:21:10 INFO impl.MetricsSourceAdapter: MBean for source jvm registered.
14/03/06 16:21:10 INFO impl.MetricsSourceAdapter: MBean for source NameNode registered.
14/03/06 16:21:10 INFO util.GSet: Computing capacity for map BlocksMap
14/03/06 16:21:10 INFO util.GSet: VM type       = 32-bit
14/03/06 16:21:10 INFO util.GSet: 2.0% max memory = 1013645312
14/03/06 16:21:10 INFO util.GSet: capacity      = 2^22 = 4194304 entries
14/03/06 16:21:10 INFO util.GSet: recommended=4194304, actual=4194304
14/03/06 16:21:10 INFO namenode.FSNamesystem: fsOwner=hdfs/master.hadoop@HADOOP
14/03/06 16:21:10 INFO namenode.FSNamesystem: supergroup=supergroup
14/03/06 16:21:10 INFO namenode.FSNamesystem: isPermissionEnabled=true
14/03/06 16:21:10 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
14/03/06 16:21:10 INFO namenode.FSNamesystem: isAccessTokenEnabled=true accessKeyUpdateInterval=600 min(s), accessTokenLifetime=600 min(s)
14/03/06 16:21:10 INFO namenode.FSNamesystem: Registered FSNamesystemStateMBean and NameNodeMXBean
14/03/06 16:21:10 INFO namenode.FSEditLog: dfs.namenode.edits.toleration.length = 0
14/03/06 16:21:10 INFO namenode.NameNode: Caching file names occuring more than 10 times 
14/03/06 16:21:10 INFO common.Storage: Start loading image file /hadoop/hdfs/name/current/fsimage
14/03/06 16:21:10 INFO common.Storage: Number of files = 9
14/03/06 16:21:10 INFO common.Storage: Number of files under construction = 0
14/03/06 16:21:10 INFO common.Storage: Image file /hadoop/hdfs/name/current/fsimage of size 993 bytes loaded in 0 seconds.
14/03/06 16:21:10 INFO namenode.FSEditLog: Start loading edits file /hadoop/hdfs/name/current/edits
14/03/06 16:21:10 INFO namenode.FSEditLog: Invalid opcode, reached end of edit log Number of transactions found: 2.  Bytes read: 41
14/03/06 16:21:10 INFO namenode.FSEditLog: Start checking end of edit log (/hadoop/hdfs/name/current/edits) ...
14/03/06 16:21:10 INFO namenode.FSEditLog: Checked the bytes after the end of edit log (/hadoop/hdfs/name/current/edits):
14/03/06 16:21:10 INFO namenode.FSEditLog:   Padding position  = 41 (-1 means padding not found)
14/03/06 16:21:10 INFO namenode.FSEditLog:   Edit log length   = 1048580
14/03/06 16:21:10 INFO namenode.FSEditLog:   Read length       = 41
14/03/06 16:21:10 INFO namenode.FSEditLog:   Corruption length = 0
14/03/06 16:21:10 INFO namenode.FSEditLog:   Toleration length = 0 (= dfs.namenode.edits.toleration.length)
14/03/06 16:21:10 INFO namenode.FSEditLog: Summary: |---------- Read=41 ----------|-- Corrupt=0 --|-- Pad=1048539 --|
14/03/06 16:21:10 INFO namenode.FSEditLog: Edits file /hadoop/hdfs/name/current/edits of size 1048580 edits # 2 loaded in 0 seconds.
14/03/06 16:21:10 INFO common.Storage: Image file /hadoop/hdfs/name/current/fsimage of size 1027 bytes saved in 0 seconds.
14/03/06 16:21:10 INFO namenode.FSEditLog: closing edit log: position=4, editlog=/hadoop/hdfs/name/current/edits
14/03/06 16:21:10 INFO namenode.FSEditLog: close success: truncate to 4, editlog=/hadoop/hdfs/name/current/edits
14/03/06 16:21:10 INFO namenode.NameCache: initialized with 0 entries 0 lookups
14/03/06 16:21:10 INFO namenode.FSNamesystem: Finished loading FSImage in 336 msecs
14/03/06 16:21:10 INFO namenode.FSNamesystem: dfs.safemode.threshold.pct          = 0.9990000128746033
14/03/06 16:21:10 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
14/03/06 16:21:10 INFO namenode.FSNamesystem: dfs.safemode.extension              = 30000
14/03/06 16:21:10 INFO namenode.FSNamesystem: Number of blocks excluded by safe block count: 0 total blocks: 4 and thus the safe blocks: 4
14/03/06 16:21:11 INFO hdfs.StateChange: STATE* Safe mode ON 
The reported blocks is only 0 but the threshold is 0.9990 and the total blocks 4. Safe mode will be turned off automatically.
14/03/06 16:21:11 INFO block.BlockTokenSecretManager: Updating block keys
14/03/06 16:21:11 INFO util.HostsFileReader: Refreshing hosts (include/exclude) list
14/03/06 16:21:11 INFO impl.MetricsSourceAdapter: MBean for source FSNamesystemMetrics registered.
14/03/06 16:21:11 INFO delegation.AbstractDelegationTokenSecretManager: Updating the current master key for generating delegation tokens
14/03/06 16:21:11 INFO delegation.AbstractDelegationTokenSecretManager: Starting expired delegation token remover thread, tokenRemoverScanInterval=60 min(s)
14/03/06 16:21:11 INFO delegation.AbstractDelegationTokenSecretManager: Updating the current master key for generating delegation tokens
14/03/06 16:21:11 INFO ipc.Server: Starting SocketReader
14/03/06 16:21:11 INFO impl.MetricsSourceAdapter: MBean for source RpcDetailedActivityForPort9000 registered.
14/03/06 16:21:11 INFO impl.MetricsSourceAdapter: MBean for source RpcActivityForPort9000 registered.
14/03/06 16:21:11 INFO namenode.NameNode: Namenode up at: master.hadoop/192.168.128.132:9000
14/03/06 16:21:11 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
14/03/06 16:21:11 INFO http.HttpServer: Added global filtersafety (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
14/03/06 16:21:11 INFO http.HttpServer: dfs.webhdfs.enabled = false
14/03/06 16:21:11 INFO http.HttpServer: Adding Kerberos (SPNEGO) filter to getDelegationToken
14/03/06 16:21:11 INFO http.HttpServer: Adding Kerberos (SPNEGO) filter to renewDelegationToken
14/03/06 16:21:11 INFO http.HttpServer: Adding Kerberos (SPNEGO) filter to cancelDelegationToken
14/03/06 16:21:11 INFO http.HttpServer: Adding Kerberos (SPNEGO) filter to fsck
14/03/06 16:21:11 INFO http.HttpServer: Adding Kerberos (SPNEGO) filter to getimage
14/03/06 16:21:11 INFO http.HttpServer: Port returned by webServer.getConnectors()[0].getLocalPort() before open() is -1. Opening the listener on 50070
14/03/06 16:21:11 INFO http.HttpServer: listener.getLocalPort() returned 50070 webServer.getConnectors()[0].getLocalPort() returned 50070
14/03/06 16:21:11 INFO http.HttpServer: Jetty bound to port 50070
14/03/06 16:21:11 INFO mortbay.log: jetty-6.1.26
14/03/06 16:21:11 INFO server.KerberosAuthenticationHandler: Login using keytab /usr/local/src/hadoop-1.2.1/conf/hdfs.keytab, for principal ${dfs.web.authentication.kerberos.principal}
14/03/06 16:21:11 INFO server.KerberosAuthenticationHandler: Initialized, principal [${dfs.web.authentication.kerberos.principal}] from keytab [/usr/local/src/hadoop-1.2.1/conf/hdfs.keytab]
14/03/06 16:21:11 WARN server.AuthenticationFilter: 'signature.secret' configuration not set, using a random value as secret
14/03/06 16:21:11 INFO mortbay.log: Started [email protected]:50070
14/03/06 16:21:11 INFO namenode.NameNode: Web-server up at: 0.0.0.0:50070
14/03/06 16:21:11 INFO ipc.Server: IPC Server Responder: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server listener on 9000: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server handler 8 on 9000: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server handler 7 on 9000: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server handler 6 on 9000: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server handler 5 on 9000: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server handler 4 on 9000: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server handler 3 on 9000: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server handler 2 on 9000: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server handler 1 on 9000: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server handler 0 on 9000: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server handler 9 on 9000: starting
14/03/06 16:21:11 INFO ipc.Server: IPC Server listener on 9000: readAndProcess threw exception org.apache.hadoop.security.AccessControlException: Authorization (hadoop.security.authorization) is enabled but authentication (hadoop.security.authentication) is configured as simple. Please configure another method like kerberos or digest.. Count of bytes read: 0
org.apache.hadoop.security.AccessControlException: Authorization (hadoop.security.authorization) is enabled but authentication (hadoop.security.authentication) is configured as simple. Please configure another method like kerberos or digest.
    at org.apache.hadoop.ipc.Server$Connection.readAndProcess(Server.java:1166)
    at org.apache.hadoop.ipc.Server$Listener.doRead(Server.java:577)
    at org.apache.hadoop.ipc.Server$Listener$Reader.run(Server.java:384)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
    at java.lang.Thread.run(Thread.java:722)
[the same readAndProcess AccessControlException and stack trace repeat every few seconds until 16:21:26]
    
by xiaoxiao 16.03.2014 / 06:12

0 answers