While testing and learning Hadoop after installing it, I ran into the problem below. Hadoop is 2.2.0, and the operating system is Oracle Linux 6.3 64-bit.
[hadoop@hadoop01 input]$ hadoop dfs -put ./in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
13/10/24 15:12:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `in': No such file or directory
Set aside the last line, "put: `in': No such file or directory", for now; that is just a problem with the command syntax.
Let's first deal with "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable".
Note: my Hadoop environment was compiled by myself, because the OS is 64-bit and Hadoop 2.2.0 apparently only ships 32-bit binaries. For how to build a 64-bit version, see:
http://blog.csdn.net/bamuta/article/details/13506893
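For reference, the 64-bit native build is usually done with the Maven dist/native profile (a sketch only; it assumes the prerequisites listed in Hadoop's BUILDING.txt, such as protobuf 2.5.0, cmake, and the zlib/openssl development headers, are already installed; see the linked post for details):

mvn clean package -Pdist,native -DskipTests -Dtar

The resulting libhadoop.so.* files should then show up under hadoop-dist/target/hadoop-2.2.0/lib/native.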
[hadoop@hadoop01 input]$ export HADOOP_ROOT_LOGGER=DEBUG,console
[hadoop@hadoop01 input]$ hadoop dfs -put ./in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
13/10/24 16:11:31 DEBUG util.Shell: setsid exited with exit code 0
13/10/24 16:11:31 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
13/10/24 16:11:31 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
13/10/24 16:11:31 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
13/10/24 16:11:32 DEBUG security.Groups: Creating new Groups object
13/10/24 16:11:32 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
13/10/24 16:11:32 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
13/10/24 16:11:32 DEBUG util.NativeCodeLoader: java.library.path=/usr/java/jdk1.7.0_45/lib:/app/hadoop/hadoop-2.2.0/lib/native:/app/hadoop/hadoop-2.2.0/lib/native
13/10/24 16:11:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/10/24 16:11:32 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
13/10/24 16:11:32 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
13/10/24 16:11:32 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
13/10/24 16:11:32 DEBUG security.UserGroupInformation: hadoop login
13/10/24 16:11:32 DEBUG security.UserGroupInformation: hadoop login commit
13/10/24 16:11:32 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
13/10/24 16:11:32 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
13/10/24 16:11:33 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
13/10/24 16:11:33 DEBUG impl.MetricsSystemImpl: StartupProgress, NameNode startup progress
13/10/24 16:11:33 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
13/10/24 16:11:33 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@2e41d9a2
13/10/24 16:11:34 DEBUG hdfs.BlockReaderLocal: Both short-circuit local reads and UNIX domain socket are disabled.
13/10/24 16:11:34 DEBUG ipc.Client: The ping interval is 60000 ms.
13/10/24 16:11:34 DEBUG ipc.Client: Connecting to localhost/127.0.0.1:8020
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: starting, having connections 1
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop sending #0
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop got value #0
13/10/24 16:11:34 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 82ms
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop sending #1
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop got value #1
13/10/24 16:11:34 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 4ms
put: `.': No such file or directory
13/10/24 16:11:34 DEBUG ipc.Client: Stopping client
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: closed
13/10/24 16:11:34 DEBUG ipc.Client: IPC Client (2141757401) connection to localhost/127.0.0.1:8020 from hadoop: stopped, remaining connections 0
The error in the DEBUG output above is:
Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
To fix it I tried many approaches, most of them tweaks to environment variables, and got nowhere.
Finally I read the official Native Libraries documentation carefully:
http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-common/NativeLibraries.html
Either download a hadoop release, which will include a pre-built version of the native hadoop library, or build your own version of the native hadoop library. Whether you download or build, the name for the library is the same: libhadoop.so
So the library is expected to be named libhadoop.so. Checking my directory, I only had libhadoop.so.1.0.0. Looking at the officially built release, that libhadoop.so file does exist, but it is just a symlink, so I did the same:
[hadoop@hadoop01 native]$ ln -s libhadoop.so.1.0.0 libhadoop.so
[hadoop@hadoop01 native]$ ln -s libhdfs.so.0.0.0 libhdfs.so
That solved the problem.
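As an optional sanity check (a sketch, using the lib/native path from the java.library.path shown in the debug output), the symlink and the library's architecture can be inspected from that directory:

ls -l libhadoop.so           # should point at libhadoop.so.1.0.0
file libhadoop.so.1.0.0      # should report a 64-bit ELF shared object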
[hadoop@hadoop01 hadoop]$ hadoop dfs -put ./in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
put: `.': No such file or directory
The environment problem is gone now; I just haven't mastered the commands yet, so I'll keep learning.
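For reference, this remaining "put: `.': No such file or directory" error usually means that the user's HDFS home directory (presumably /user/hadoop here) has not been created yet, since a relative or omitted destination is resolved against it. A likely fix, sketched under that assumption and not verified on this cluster, is to create the home directory first and give an explicit destination:

hdfs dfs -mkdir -p /user/hadoop
hdfs dfs -put ./in /user/hadoop/in
hdfs dfs -ls /user/hadoop/in

Using hdfs dfs instead of hadoop dfs also follows the advice in the DEPRECATED notice printed above.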