Hadoop Pipes "Server failed to authenticate" error and fix

Problem description:

The Hadoop Pipes example from Section 3.5 of Hadoop实战 (Hadoop in Action, 2nd edition).

Contents of the makefile:
HADOOP_INSTALL=/home/xxl/hadoop-1.1.2
PLATFORM=Linux-i386-32
SSL_INSTALL=/usr/local/ssl
CC=g++
CPPFLAGS=-m32 -I$(HADOOP_INSTALL)/c++/$(PLATFORM)/include -I$(SSL_INSTALL)/include
wordcount: wordcount.cpp
    $(CC) $(CPPFLAGS) $< -Wall -L$(HADOOP_INSTALL)/c++/$(PLATFORM)/lib -lhadooppipes -lhadooputils \
        -L$(SSL_INSTALL)/lib -lcrypto -lssl -ldl -lpthread -g -O2 -o $@

The makefile above is fine as is.

Upload the executable into a bin directory on HDFS:
~/hadoop-1.1.2/bin/hadoop fs -mkdir bin
~/hadoop-1.1.2/bin/hadoop dfs -put wordcount bin

Run the wordcount program:
~/hadoop-1.1.2/bin/hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input input -output output -program bin/wordcount
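As an aside, the two -D definitions can also be placed in a job configuration file and passed with -conf instead of on the command line. A sketch of such a file (the property names are the same ones used in the command above; the file name wordcount.xml is my own choice):

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>hadoop.pipes.java.recordreader</name>
    <value>true</value>
  </property>
  <property>
    <name>hadoop.pipes.java.recordwriter</name>
    <value>true</value>
  </property>
</configuration>
```

It would then be submitted as: ~/hadoop-1.1.2/bin/hadoop pipes -conf wordcount.xml -input input -output output -program bin/wordcount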


The following errors then appeared:

xxl@xxl-pc:~/MapReduce/wordcount_cpp$ ~/hadoop-1.1.2/bin/hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input /user/xxl/input/file0* -output /user/xxl/output/outputfile -program bin/wordcount
13/10/04 22:29:21 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
13/10/04 22:29:22 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/10/04 22:29:22 WARN snappy.LoadSnappy: Snappy native library not loaded
13/10/04 22:29:22 INFO mapred.FileInputFormat: Total input paths to process : 2
13/10/04 22:29:22 INFO mapred.JobClient: Running job: job_201310041509_0017
13/10/04 22:29:23 INFO mapred.JobClient:  map 0% reduce 0%
13/10/04 22:29:32 INFO mapred.JobClient: Task Id : attempt_201310041509_0017_m_000000_0, Status : FAILED
java.io.IOException
	at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
	at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:194)
	at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
	at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:68)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
	at org.apache.hadoop.mapred.Child.main(Child.java:249)

attempt_201310041509_0017_m_000000_0: Server failed to authenticate. Exiting
13/10/04 22:29:32 INFO mapred.JobClient: Task Id : attempt_201310041509_0017_m_000001_0, Status : FAILED
[... the identical java.io.IOException stack trace repeats for attempts m_000001_0, m_000000_1, m_000001_1, m_000000_2, and m_000001_2, each ending with "Server failed to authenticate. Exiting" ...]
13/10/04 22:29:59 INFO mapred.JobClient: Job complete: job_201310041509_0017
13/10/04 22:29:59 INFO mapred.JobClient: Counters: 7
13/10/04 22:29:59 INFO mapred.JobClient:   Job Counters 
13/10/04 22:29:59 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=65416
13/10/04 22:29:59 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/10/04 22:29:59 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/10/04 22:29:59 INFO mapred.JobClient:     Launched map tasks=8
13/10/04 22:29:59 INFO mapred.JobClient:     Data-local map tasks=8
13/10/04 22:29:59 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/10/04 22:29:59 INFO mapred.JobClient:     Failed map tasks=1
13/10/04 22:29:59 INFO mapred.JobClient: Job Failed: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201310041509_0017_m_000000
Exception in thread "main" java.io.IOException: Job failed!
	at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1327)
	at org.apache.hadoop.mapred.pipes.Submitter.runJob(Submitter.java:248)
	at org.apache.hadoop.mapred.pipes.Submitter.run(Submitter.java:479)
	at org.apache.hadoop.mapred.pipes.Submitter.main(Submitter.java:494)

After hitting this error I searched Stack Overflow and a number of blogs; the two most useful references are listed below. The fix, in short, is to rebuild the libhadooppipes.a and libhadooputils.a static libraries from source and overwrite the prebuilt ones. The steps are as follows.

References:

http://www.linuxquestions.org/questions/linux-software-2/hadoop-1-0-3-pipes-server-failed-to-authenticate-4175429779/

http://guoyunsky.iteye.com/blog/1709654


1. Go to the ~/hadoop-1.1.2/src/c++/pipes directory and run:

./configure
make install

In my case, however, ./configure failed as follows:
xxl@xxl-pc:~/hadoop-1.1.2/src/c++/pipes$ ./configure 
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for gawk... no
checking for mawk... mawk
checking whether make sets $(MAKE)... yes
checking for style of include used by make... GNU
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables... 
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking dependency style of gcc... gcc3
checking how to run the C preprocessor... gcc -E
checking for grep that handles long lines and -e... /bin/grep
checking for egrep... /bin/grep -E
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking minix/config.h usability... no
checking minix/config.h presence... no
checking for minix/config.h... no
checking whether it is safe to define __EXTENSIONS__... yes
checking for special C compiler options needed for large files... no
checking for _FILE_OFFSET_BITS value needed for large files... 64
checking pthread.h usability... yes
checking pthread.h presence... yes
checking for pthread.h... yes
checking for pthread_create in -lpthread... yes
checking for HMAC_Init in -lssl... no
./configure: line 413: test: please: integer expression expected
./configure: line 416: $4: Bad file descriptor
configure: error: check
./configure: line 302: return: please: numeric argument required
./configure: line 312: exit: please: numeric argument required

The trigger is the failed check just above the errors: configure probes libssl for HMAC_Init, but HMAC_Init actually lives in libcrypto, so the check can come back "no" even when OpenSSL is installed. I had never worked with shell scripts, but going by the error messages (the line numbers refer to configure itself, not to anything here), lines 413 and 416 fall inside this function:
as_fn_error ()
{
  as_status=$1; test $as_status -eq 0 && as_status=1
  if test "$4"; then
    as_lineno=${as_lineno-"$3"} as_lineno_stack=as_lineno_stack=$as_lineno_stack
    $as_echo "$as_me:${as_lineno-$LINENO}: error: $2" >&$4
  fi
  $as_echo "$as_me: error: $2" >&2
  as_fn_exit $as_status
} # as_fn_error
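The cryptic "integer expression expected" message comes from as_fn_error being handed a word (here "please", presumably the start of a multi-word error string) where a numeric exit status was expected in $1. The failure mode is easy to reproduce in isolation:

```shell
# `test` requires an integer operand for -eq; giving it the word "please"
# (as the broken as_fn_error call effectively does) makes test itself
# error out instead of returning a clean true/false.
as_status="please"
if test "$as_status" -eq 0 2>/dev/null; then
  echo "status is numeric zero"
else
  echo "test could not evaluate '$as_status' as a number"
fi
```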

When configure hits an error it calls as_fn_exit and aborts the script. I simply commented that call out, so the script reports the error but keeps going:
#as_fn_exit $as_status
This is hardly a clean fix, but after a long time spent on it, it was the workaround I settled on.
Then re-run:
./configure
make install
The freshly generated .h headers and .a libraries show up under the ~/hadoop-1.1.2/src/c++/install directory.

2. Similarly, run the following in the ~/hadoop-1.1.2/src/c++/utils directory:

./configure
make install
Here the configure problems seen in the pipes directory did not occur.


Once the new libhadooppipes.a and libhadooputils.a static libraries and their headers are built, copy them over the include and lib directories under ~/hadoop-1.1.2/c++/Linux-i386-32/. Then (according to the two articles above) restart Hadoop and re-run the C++ job.
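The overwrite step can be scripted roughly as follows (a sketch: the refresh_pipes_libs name is mine, and the paths in the comments assume the stock hadoop-1.1.2 tarball layout):

```shell
# Copy the freshly built pipes/utils static libraries and headers over
# the prebuilt ones. src is the `make install` output directory, dst is
# the prebuilt platform directory.
refresh_pipes_libs() {
  src="$1"    # e.g. ~/hadoop-1.1.2/src/c++/install
  dst="$2"    # e.g. ~/hadoop-1.1.2/c++/Linux-i386-32
  cp "$src"/lib/*.a "$dst"/lib/ &&
  cp "$src"/include/hadoop/*.hh "$dst"/include/hadoop/
}
```

For example: refresh_pipes_libs ~/hadoop-1.1.2/src/c++/install ~/hadoop-1.1.2/c++/Linux-i386-32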

After that, the job ran to completion without errors.
