No progress for the past few days, and today I spent another half day on it, but there was finally a turn for the better: it is no longer that "connection to myubuntu (the name I gave the host) org/apache/hadoop/configuration/Configuration" error. This time, when I double-clicked the icon in the view at the bottom, the error reported was:

An internal error occurred during: "Map/Reduce location status updater".
java.lang.NoClassDefFoundError: org/codehaus/jackson/map/JsonMappingException
Searching with this message as the keyword turned up a fix (and what luck, it was an article in Chinese):
http://www.maplef.net/blog/archives/eclipse-hadoop-plugin.html
The author explains that this is a problem with 0.20.203; after switching everything to 0.20.2, I could connect to HDFS.
But the author also points out that "even with 0.20.2, there is still no way to run MapReduce programs directly through Hadoop [from Eclipse]; they have to be run manually."
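For context on the error itself: a java.lang.NoClassDefFoundError means a class that was visible at compile time cannot be found at run time — here, the plugin's classpath was missing the Jackson classes. A small self-contained sketch (the helper name is my own) of probing whether a class is visible to the current JVM:

```java
public class ClasspathProbe {
    // Returns true if the named class can be loaded at run time.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The class the Eclipse plugin failed to find:
        String jackson = "org.codehaus.jackson.map.JsonMappingException";
        System.out.println(jackson + " visible: " + isOnClasspath(jackson));
    }
}
```

If this prints false in the same JVM as the plugin, the jar containing Jackson simply is not on the classpath; the downgrade to 0.20.2 presumably side-steps this because that plugin build bundles its dependencies differently.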
So, following that article, I wrote a Makefile to run the program. Its contents:
JarFile="wordcount-0.1.jar"
MainFunc="net.maplef.hadoop.wordcount.WordCountDriver"
LocalOutDir="/tmp/output"

all: help

jar:
	jar -cvf ${JarFile} -C bin/ .

input:
	hadoop fs -rmr input
	hadoop fs -put ${input} input

run:
	hadoop jar ${JarFile} ${MainFunc} input output

clean:
	hadoop fs -rmr output

output:
	rm -rf ${LocalOutDir}
	hadoop fs -get output ${LocalOutDir}
	gedit ${LocalOutDir}/part-r-00000 &

help:
	@echo "Usage:"
	@echo " make jar - Build Jar File."
	@echo " make input input=yourinputpath - Set up Input Path"
	@echo " make clean - Clean up Output directory on HDFS."
	@echo " make run - Run your MapReduce code on Hadoop."
	@echo " make output - Download and show output file"
	@echo " make help - Show Makefile options."
	@echo " "
	@echo "Example:"
	@echo " make jar; make input input=/home/username/hadoop-test/input/*; make run; make output; make clean"
Finally, a small pat on my own back: few things in this world go smoothly, and only by persisting do you have a chance of finding the way. Compared with how my enthusiasm used to burn out after three days, this counts as progress.