HBase Notes: Installation and Configuration (Standalone Mode)

1. Prerequisites
  ① Install the JDK
  ② Install SSH
  ③ Install Hadoop. If HBase will run in standalone mode, this step can be skipped.
2. Download and install HBase
  ① Download HBase.
     Note that the HBase release you use must match the corresponding Hadoop version.
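  For reference, fetching and unpacking the binary release from the command line might look like the sketch below; the mirror URL and install directory are assumptions, so adjust them to your environment:
     # Download the hadoop1 *binary* tarball (not the -src package); URL is an assumed Apache archive mirror
     wget https://archive.apache.org/dist/hbase/hbase-0.98.11/hbase-0.98.11-hadoop1-bin.tar.gz
     # Unpack under the hadoop user's home directory, matching the paths used later in this note
     tar -zxvf hbase-0.98.11-hadoop1-bin.tar.gz -C /home/hadoop/
     # The extracted directory name may differ (e.g. hbase-0.98.11-hadoop1); rename it if needed so it matches HBASE_HOME below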
3. Edit the /etc/profile file and add the HBase-related environment variables.
   #set HBase Environment
  export HBASE_HOME=/home/hadoop/hbase-0.98.11
  export PATH=$PATH:$HBASE_HOME/bin
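  A minimal sketch of making the new variables take effect and verifying them (assuming a bash login shell):
     # Re-read /etc/profile in the current shell
     source /etc/profile
     # Check that HBASE_HOME is set and the hbase script is on the PATH
     echo $HBASE_HOME
     which hbase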
I. Standalone Mode Configuration
1. Edit the conf/hbase-env.sh script and export the JAVA_HOME environment variable.
   export  JAVA_HOME=/usr/lib/jvm/jdk1.8.0_31
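   To confirm that this JAVA_HOME points at a working JDK, a quick check (assuming the path above) is:
     /usr/lib/jvm/jdk1.8.0_31/bin/java -version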
2. Configure the conf/hbase-site.xml file. The properties typically set are as follows:
  ① hbase.rootdir
    The directory where HBase stores its data on the local Linux machine. By default this is under Linux's /tmp directory, which is cleared every time the system restarts.
    <configuration>
     <property>
        <name>hbase.rootdir</name>
        <value>/home/hadoop/tmp/hbase</value>
     </property>
  </configuration>
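  As a point of comparison, the HBase reference guide writes the standalone rootdir with an explicit file:// scheme, which makes it unambiguous that the path is on the local filesystem rather than HDFS. A variant of the configuration above, using the same directory, would be:
    <configuration>
     <property>
        <name>hbase.rootdir</name>
        <value>file:///home/hadoop/tmp/hbase</value>
     </property>
    </configuration>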
3. Run start-hbase.sh
   hadoop@ubuntu:~$ start-hbase.sh
The following errors appeared:
Error: Could not find or load main class org.apache.hadoop.hbase.util.HBaseConfTool
Error: Could not find or load main class org.apache.hadoop.hbase.zookeeper.ZKServerTool
starting master, logging to /home/hadoop/hbase-0.98.9/logs/hbase-hadoop-master-ubuntu.out
Error: Could not find or load main class org.apache.hadoop.hbase.master.HMaster
localhost: starting regionserver, logging to /home/hadoop/hbase-0.98.9/bin/../logs/hbase-hadoop-regionserver-ubuntu.out
localhost: Error: Could not find or load main class org.apache.hadoop.hbase.regionserver.HRegionServer
Cause of the error: the tarball I had downloaded was the HBase source package, i.e. hbase-0.98.11-src.tar.gz. What we actually need is the binary package hbase-0.98.11-hadoop1-bin.tar.gz; since our Hadoop is version 1.2.1, the build based on hadoop1 is the right choice. After downloading, installing, and configuring again, run the start-hbase.sh command:
hadoop@ubuntu:~$ start-hbase.sh 
starting master, logging to /home/hadoop/hbase-0.98.11/logs/hbase-hadoop-master-ubuntu.out
Then use the jps command to check whether our HMaster process has started.
hadoop@ubuntu:~$ jps
3744 Jps
3650 HMaster
However, after a short while HMaster died again on its own. The error log is as follows:
Tue Mar 17 08:37:15 PDT 2015 Starting master on ubuntu
/home/hadoop/hbase-0.98.11/bin/hbase-daemon.sh: line 207: core: command not found
2015-03-17 08:37:16,602 INFO  [main] util.VersionInfo: HBase 0.98.11-hadoop1
2015-03-17 08:37:16,603 INFO  [main] util.VersionInfo: Subversion git://aspire/home/apurtell/src/hbase -r 6e6cf74c1161035545d95921816121eb3a516fe0
2015-03-17 08:37:16,603 INFO  [main] util.VersionInfo: Compiled by apurtell on Mon Mar  2 23:21:46 PST 2015
2015-03-17 08:37:17,561 INFO  [main] server.ZooKeeperServer: Server environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
2015-03-17 08:37:17,561 INFO  [main] server.ZooKeeperServer: Server environment:host.name=ubuntu
2015-03-17 08:37:17,561 INFO  [main] server.ZooKeeperServer: Server environment:java.version=1.8.0_31
2015-03-17 08:37:17,561 INFO  [main] server.ZooKeeperServer: Server environment:java.vendor=Oracle Corporation
2015-03-17 08:37:17,561 INFO  [main] server.ZooKeeperServer: Server environment:java.home=/usr/lib/jvm/jdk1.8.0_31/jre
2015-03-17 08:37:17,561 INFO  [main] server.ZooKeeperServer: Server environment:java.class.path=/home/hadoop/hbase-0.98.11/conf:/usr/lib/jvm/jdk1.8.0_31/lib/tools.jar:/home/hadoop/hbase-0.98.11:/home/hadoop/hbase-0.98.11/lib/activation-1.1.jar:/home/hadoop/hbase-0.98.11/lib/asm-3.1.jar:/home/hadoop/hbase-0.98.11/lib/commons-beanutils-1.7.0.jar:/home/hadoop/hbase-0.98.11/lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/hbase-0.98.11/lib/commons-cli-1.2.jar:/home/hadoop/hbase-0.98.11/lib/commons-codec-1.7.jar:/home/hadoop/hbase-0.98.11/lib/commons-collections-3.2.1.jar:/home/hadoop/hbase-0.98.11/lib/commons-configuration-1.6.jar:/home/hadoop/hbase-0.98.11/lib/commons-digester-1.8.jar:/home/hadoop/hbase-0.98.11/lib/commons-el-1.0.jar:/home/hadoop/hbase-0.98.11/lib/commons-httpclient-3.1.jar:/home/hadoop/hbase-0.98.11/lib/commons-io-2.4.jar:/home/hadoop/hbase-0.98.11/lib/commons-lang-2.6.jar:/home/hadoop/hbase-0.98.11/lib/commons-logging-1.1.1.jar:/home/hadoop/hbase-0.98.11/lib/commons-math-2.1.jar:/home/hadoop/hbase-0.98.11/lib/commons-net-1.4.1.jar:/home/hadoop/hbase-0.98.11/lib/findbugs-annotations-1.3.9-1.jar:/home/hadoop/hbase-0.98.11/lib/guava-12.0.1.jar:/home/hadoop/hbase-0.98.11/lib/hadoop-core-1.2.1.jar:/home/hadoop/hbase-0.98.11/lib/hamcrest-core-1.3.jar:/home/hadoop/hbase-0.98.11/lib/hbase-annotations-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-checkstyle-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-client-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-common-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-common-0.98.11-hadoop1-tests.jar:/home/hadoop/hbase-0.98.11/lib/hbase-examples-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-hadoop1-compat-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-hadoop-compat-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-it-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-it-0.98.11-hadoop1-tests.jar:/home/hadoop/hbase-0.98.11/lib/hbase-prefix-tree-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-protocol-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-rest-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-server-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-server-0.98.11-hadoop1-tests.jar:/home/hadoop/hbase-0.98.11/lib/hbase-shell-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-testing-util-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/hbase-thrift-0.98.11-hadoop1.jar:/home/hadoop/hbase-0.98.11/lib/high-scale-lib-1.1.1.jar:/home/hadoop/hbase-0.98.11/lib/htrace-core-2.04.jar:/home/hadoop/hbase-0.98.11/lib/httpclient-4.1.3.jar:/home/hadoop/hbase-0.98.11/lib/httpcore-4.1.3.jar:/home/hadoop/hbase-0.98.11/lib/jackson-core-asl-1.8.8.jar:/home/hadoop/hbase-0.98.11/lib/jackson-jaxrs-1.8.8.jar:/home/hadoop/hbase-0.98.11/lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/hbase-0.98.11/lib/jackson-xc-1.8.8.jar:/home/hadoop/hbase-0.98.11/lib/jamon-runtime-2.3.1.jar:/home/hadoop/hbase-0.98.11/lib/jasper-compiler-5.5.23.jar:/home/hadoop/hbase-0.98.11/lib/jasper-runtime-5.5.23.jar:/home/hadoop/hbase-0.98.11/lib/jaxb-api-2.2.2.jar:/home/hadoop/hbase-0.98.11/lib/jaxb-impl-2.2.3-1.jar:/home/hadoop/hbase-0.98.11/lib/jcodings-1.0.8.jar:/home/hadoop/hbase-0.98.11/lib/jersey-client-1.8.jar:/home/hadoop/hbase-0.98.11/lib/jersey-core-1.8.jar:/home/hadoop/hbase-0.98.11/lib/jersey-json-1.8.jar:/home/hadoop/hbase-0.98.11/lib/jersey-server-1.8.jar:/home/hadoop/hbase-0.98.11/lib/jettison-1.3.1.jar:/home/hadoop/hbase-0.98.11/lib/jetty-6.1.26.jar:/home/hadoo
p/hbase-0.98.11/lib/jetty-sslengine-6.1.26.jar:/home/hadoop/hbase-0.98.11/lib/jetty-util-6.1.26.jar:/home/hadoop/hbase-0.98.11/lib/joni-2.1.2.jar:/home/hadoop/hbase-0.98.11/lib/jruby-complete-1.6.8.jar:/home/hadoop/hbase-0.98.11/lib/jsp-2.1-6.1.14.jar:/home/hadoop/hbase-0.98.11/lib/jsp-api-2.1-6.1.14.jar:/home/hadoop/hbase-0.98.11/lib/jsr305-1.3.9.jar:/home/hadoop/hbase-0.98.11/lib/junit-4.11.jar:/home/hadoop/hbase-0.98.11/lib/libthrift-0.9.0.jar:/home/hadoop/hbase-0.98.11/lib/log4j-1.2.17.jar:/home/hadoop/hbase-0.98.11/lib/metrics-core-2.2.0.jar:/home/hadoop/hbase-0.98.11/lib/netty-3.6.6.Final.jar:/home/hadoop/hbase-0.98.11/lib/protobuf-java-2.5.0.jar:/home/hadoop/hbase-0.98.11/lib/servlet-api-2.5-6.1.14.jar:/home/hadoop/hbase-0.98.11/lib/slf4j-api-1.6.4.jar:/home/hadoop/hbase-0.98.11/lib/slf4j-log4j12-1.6.4.jar:/home/hadoop/hbase-0.98.11/lib/xmlenc-0.52.jar:/home/hadoop/hbase-0.98.11/lib/zookeeper-3.4.6.jar:/home/hadoop/hadoop-1.2.1/libexec/../conf:/usr/lib/jvm/jdk1.8.0_31/lib/tools.jar:/home/hadoop/hadoop-1.2.1/libexec/..:/home/hadoop/hadoop-1.2.1/libexec/../hadoop-core-1.2.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/asm-3.2.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/aspectjrt-1.6.11.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/aspectjtools-1.6.11.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-beanutils-1.7.0.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-cli-1.2.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-codec-1.4.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-collections-3.2.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-configuration-1.6.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-daemon-1.0.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-digester-1.8.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-el-1.0.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-httpclient-3.0.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-io-2.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-lang-2.4.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-logging-1.1.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-logging-api-1.0.4.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-math-2.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/commons-net-3.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/core-3.1.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/hadoop-capacity-scheduler-1.2.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/hadoop-fairscheduler-1.2.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/hadoop-thriftfs-1.2.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/hsqldb-1.8.0.10.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jackson-core-asl-1.8.8.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jasper-compiler-5.5.12.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jasper-runtime-5.5.12.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jdeb-0.8.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jersey-core-1.8.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jersey-json-1.8.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jersey-server-1.8.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jets3t-0.6.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jetty-6.1.26.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jetty-util-6.1.26.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jsch-0.1.42.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/junit-4.5.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/kfs-0.2.2.jar:/home/hadoop/hadoop-1.2.1/libe
xec/../lib/log4j-1.2.15.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/mockito-all-1.8.5.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/oro-2.0.8.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/servlet-api-2.5-20081211.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/slf4j-api-1.4.3.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/slf4j-log4j12-1.4.3.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/xmlenc-0.52.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/hadoop-1.2.1/libexec/../lib/jsp-2.1/jsp-api-2.1.jar
2015-03-17 08:37:17,562 INFO  [main] server.ZooKeeperServer: Server environment:java.library.path=/home/hadoop/hadoop-1.2.1/libexec/../lib/native/Linux-i386-32
2015-03-17 08:37:17,566 INFO  [main] server.ZooKeeperServer: Server environment:java.io.tmpdir=/tmp
2015-03-17 08:37:17,566 INFO  [main] server.ZooKeeperServer: Server environment:java.compiler=<NA>
2015-03-17 08:37:17,566 INFO  [main] server.ZooKeeperServer: Server environment:os.name=Linux
2015-03-17 08:37:17,566 INFO  [main] server.ZooKeeperServer: Server environment:os.arch=i386
2015-03-17 08:37:17,566 INFO  [main] server.ZooKeeperServer: Server environment:os.version=3.13.0-32-generic
2015-03-17 08:37:17,566 INFO  [main] server.ZooKeeperServer: Server environment:user.name=hadoop
2015-03-17 08:37:17,566 INFO  [main] server.ZooKeeperServer: Server environment:user.home=/home/hadoop
2015-03-17 08:37:17,566 INFO  [main] server.ZooKeeperServer: Server environment:user.dir=/home/hadoop/hbase-0.98.11/bin
2015-03-17 08:37:17,643 INFO  [main] server.ZooKeeperServer: Created server with tickTime 2000 minSessionTimeout 4000 maxSessionTimeout 40000 datadir /tmp/hbase-hadoop/zookeeper/zookeeper_0/version-2 snapdir /tmp/hbase-hadoop/zookeeper/zookeeper_0/version-2
2015-03-17 08:37:17,775 INFO  [main] server.NIOServerCnxnFactory: binding to port 0.0.0.0/0.0.0.0:2181
2015-03-17 08:37:18,132 INFO  [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181] server.NIOServerCnxnFactory: Accepted socket connection from /127.0.0.1:43854
2015-03-17 08:37:18,260 INFO  [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181] server.NIOServerCnxn: Processing stat command from /127.0.0.1:43854
2015-03-17 08:37:18,271 INFO  [Thread-1] server.NIOServerCnxn: Stat command output
2015-03-17 08:37:18,272 INFO  [main] zookeeper.MiniZooKeeperCluster: Started MiniZK Cluster and connect 1 ZK server on client port: 2181
2015-03-17 08:37:18,278 INFO  [Thread-1] server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:43854 (no session established for client)
2015-03-17 08:37:18,348 DEBUG [main] master.HMaster: master/ubuntu/127.0.1.1:0 HConnection server-to-server retries=350
2015-03-17 08:37:18,744 INFO  [main] ipc.RpcServer: master/ubuntu/127.0.1.1:0: started 10 reader(s).
2015-03-17 08:37:18,882 INFO  [main] impl.MetricsConfig: loaded properties from hadoop-metrics2-hbase.properties
2015-03-17 08:37:18,921 INFO  [main] impl.MetricsSourceAdapter: MBean for source MetricsSystem,sub=Stats registered.
2015-03-17 08:37:18,922 INFO  [main] impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2015-03-17 08:37:18,922 INFO  [main] impl.MetricsSystemImpl: HBase metrics system started
2015-03-17 08:37:18,926 INFO  [main] impl.MetricsSourceAdapter: MBean for source jvm registered.
2015-03-17 08:37:18,990 INFO  [main] impl.MetricsSourceAdapter: MBean for source IPC,sub=IPC registered.
2015-03-17 08:37:35,330 INFO  [main] impl.MetricsSourceAdapter: MBean for source ugi registered.
2015-03-17 08:37:35,330 WARN  [main] impl.MetricsSystemImpl: Source name ugi already exists!
2015-03-17 08:37:36,560 INFO  [main] ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2015-03-17 08:37:37,561 INFO  [main] ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2015-03-17 08:37:39,288 INFO  [main] ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2015-03-17 08:37:40,486 INFO  [main] ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2015-03-17 08:37:41,700 INFO  [main] ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2015-03-17 08:37:42,702 INFO  [main] ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2015-03-17 08:37:43,703 INFO  [main] ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2015-03-17 08:37:44,704 INFO  [main] ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2015-03-17 08:37:45,705 INFO  [main] ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2015-03-17 08:37:46,707 INFO  [main] ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
2015-03-17 08:37:46,715 ERROR [main] master.HMasterCommandLine: Master exiting
java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMasterConnection refused
at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:202)
at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:152)
at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:181)
at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:135)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:126)
at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:3031)
Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:9000 failed on connection exception: java.net.ConnectException: Connection refused
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1142)
at org.apache.hadoop.ipc.Client.call(Client.java:1118)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
at com.sun.proxy.$Proxy5.getProtocolVersion(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
at com.sun.proxy.$Proxy5.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:124)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:247)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
at org.apache.hadoop.hbase.util.FSUtils.getRootDir(FSUtils.java:942)
at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:533)
at org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.<init>(HMasterCommandLine.java:260)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:136)
... 7 more
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:511)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:481)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:457)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:583)
at org.apache.hadoop.ipc.Client$Connection.access$2200(Client.java:205)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1249)
at org.apache.hadoop.ipc.Client.call(Client.java:1093)
... 36 more
Sigh. I searched for the cause of this error for a long time without solving it. If any expert knows the answer, please point me in the right direction; I would be very grateful!
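For anyone hitting the same wall: the stack trace shows the master repeatedly failing to reach localhost:9000, the usual HDFS NameNode port. A quick, non-authoritative check of whether anything is listening there (assuming standard Linux tools) would be:
   # Is a NameNode process running?
   jps | grep NameNode
   # Is anything listening on port 9000?
   netstat -tlnp | grep 9000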

