Is my Hadoop data file system correct, given an "Incorrect configuration" message?


I am trying to install Hadoop on my own laptop for study purposes, following the tutorialspoint tutorial. Running start-dfs.sh:

The expected output was:

10/24/14 21:37:56
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/hadoop/hadoop-
2.4.1/logs/hadoop-hadoop-namenode-localhost.out
localhost: starting datanode, logging to /home/hadoop/hadoop-
2.4.1/logs/hadoop-hadoop-datanode-localhost.out
Starting secondary namenodes [0.0.0.0]

But I get:

mike@mike-thinks:/usr/local/hadoop/sbin$ ./start-dfs.sh 
Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
Starting namenodes on []
localhost: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
localhost: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-mike-namenode-mike-thinks.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-mike-namenode-mike-thinks.out: No such file or directory
localhost: head: cannot open '/usr/local/hadoop/logs/hadoop-mike-namenode-mike-thinks.out' for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-mike-namenode-mike-thinks.out: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-mike-namenode-mike-thinks.out: No such file or directory
localhost: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
localhost: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-mike-datanode-mike-thinks.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-mike-datanode-mike-thinks.out: No such file or directory
localhost: head: cannot open '/usr/local/hadoop/logs/hadoop-mike-datanode-mike-thinks.out' for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-mike-datanode-mike-thinks.out: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-mike-datanode-mike-thinks.out: No such file or directory
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is SHA256:/s7X8QMliB6FVx5bde5AaCycprQ/B+NtcTXrInrXxJM.
Are you sure you want to continue connecting (yes/no)? no
0.0.0.0: Host key verification failed.

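The "Permission denied" errors suggest that the user mike does not own /usr/local/hadoop, so the start script cannot create the logs directory. A minimal fix, assuming the install should run as the current user (paths taken from the transcript above; adjust to your layout):

```shell
# Give the current user ownership of the Hadoop tree so the
# start scripts can create logs/ and pid files without sudo.
sudo chown -R "$USER":"$USER" /usr/local/hadoop

# Pre-create the log directory the scripts were failing to make.
mkdir -p /usr/local/hadoop/logs
```

With ownership fixed, start-dfs.sh can be run as a normal user, which also avoids the daemons ending up owned by root.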
So I tried the command with sudo:

mike@mike-thinks:/usr/local/hadoop/sbin$ sudo ./start-dfs.sh 
Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
Starting namenodes on []
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-root-namenode-mike-thinks.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-root-datanode-mike-thinks.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is SHA256:/s7X8QMliB6FVx5bde5AaCycprQ/B+NtcTXrInrXxJM.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (ECDSA) to the list of known hosts.
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-root-secondarynamenode-mike-thinks.out

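As an aside, the start scripts SSH into every listed host (including 0.0.0.0 for the secondary namenode), which is why answering "no" at the host-key prompt made the first run fail. A common setup step, sketched here assuming OpenSSH, is passwordless SSH to the local machine:

```shell
# Generate a key pair (skip if ~/.ssh/id_rsa already exists).
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Authorize the key for logins to this same machine.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# Accept the host keys once so start-dfs.sh stops prompting.
ssh-keyscan localhost 0.0.0.0 >> ~/.ssh/known_hosts
```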
The "Incorrect configuration" message left me puzzled...
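For what it's worth, the "namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured" line usually means the scripts cannot find an HDFS address in the configuration files, which is also why it prints "Starting namenodes on []" instead of "[localhost]". A minimal sketch of core-site.xml (the hdfs://localhost:9000 URI follows the tutorialspoint setup; your etc/hadoop path may differ):

```xml
<!-- /usr/local/hadoop/etc/hadoop/core-site.xml -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

If HADOOP_CONF_DIR is set in your environment, make sure it points at the directory containing this file.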

Then I tried to launch YARN:

mike@mike-thinks:/usr/local/hadoop/sbin$ sudo ./start-yarn.sh 
[sudo] password for mike: 
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-root-resourcemanager-mike-thinks.out
nice: ‘/usr/local/hadoop/bin/yarn’: Permission denied
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-root-nodemanager-mike-thinks.out
localhost: nice: ‘/usr/local/hadoop/bin/yarn’: Permission denied

I did a chmod +x on yarn:

mike@mike-thinks:/usr/local/hadoop/sbin$ sudo ./start-yarn.sh 
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-root-resourcemanager-mike-thinks.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-root-nodemanager-mike-thinks.out

But I could not access http://localhost:50070
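If the NameNode never started (as the "Starting namenodes on []" output suggests), nothing listens on 50070, so that page cannot load; note also that on Hadoop 3.x the NameNode web UI moved from 50070 to 9870. A quick check, assuming ss and curl are available:

```shell
# See whether anything is listening on the NameNode UI ports.
ss -ltn | grep -E ':(50070|9870)' || echo "no NameNode web UI is listening"

# If a port is open, fetch the page headers to confirm the UI responds.
curl -sI http://localhost:50070 || curl -sI http://localhost:9870
```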

I tried again, and now I find myself racing against my own daemons:

mike@mike-thinks:/usr/local/hadoop/sbin$ sudo ./start-yarn.sh 
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-root-resourcemanager-mike-thinks.out
localhost: nodemanager running as process 8183. Stop it first.
mike@mike-thinks:/usr/local/hadoop/sbin$ sudo kill 8183
mike@mike-thinks:/usr/local/hadoop/sbin$ sudo ./start-yarn.sh 
starting yarn daemons
resourcemanager running as process 9513. Stop it first.
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-root-nodemanager-mike-thinks.out
mike@mike-thinks:/usr/local/hadoop/sbin$ sudo kill 9513
mike@mike-thinks:/usr/local/hadoop/sbin$ sudo ./start-yarn.sh 
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-root-resourcemanager-mike-thinks.out
localhost: nodemanager running as process 10058. Stop it first.

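Rather than killing PIDs by hand, the bundled stop scripts shut both daemons down cleanly, and jps shows what is actually running. A sketch, assuming the sbin directory from the transcripts (daemons that were started with sudo may need sudo to stop as well):

```shell
cd /usr/local/hadoop/sbin

# Stop everything the start scripts may have launched.
./stop-yarn.sh
./stop-dfs.sh

# jps (from the JDK) lists running Java daemons; after a clean
# restart you would expect NameNode, DataNode, SecondaryNameNode,
# ResourceManager and NodeManager here.
jps

./start-dfs.sh
./start-yarn.sh
```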
Still, I can access the "Hadoop All applications" page at http://localhost:8088/ :

by ThePassenger 05.12.2017 / 15:18

1 answer


I (an Arch Linux user) had the same problem on Hadoop 3.0.0, but I know a few things worth checking.

So please try the following:

  1. Run the jps command and check whether a "NameNode" process exists.
     If you cannot find the NameNode, you should redo your Hadoop setup.
  2. Run telnet localhost 8088 (and telnet localhost 50070) and check the connection.
     I cannot connect to port 50070, but I can connect to port 8088... (I could not find a solution for that.)
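The two checks above can be scripted, assuming jps (from the JDK) is on the PATH; nc is used here in place of telnet:

```shell
# 1. Is a NameNode process running?
if jps | grep -q NameNode; then
  echo "NameNode is up"
else
  echo "NameNode missing - redo the HDFS setup"
fi

# 2. Are the web UI ports reachable? (9870 is the 3.x NameNode UI port.)
for port in 8088 50070 9870; do
  nc -z localhost "$port" && echo "port $port open" || echo "port $port closed"
done
```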
by Takuya Ebata 29.12.2017 / 16:58