There are many tutorials online that teach how to install Hadoop on Windows through Cygwin. In my view, Hadoop was designed from the start to be installed and run on Linux; installing it through Cygwin merely simulates a Linux environment and then puts Hadoop on top of that simulation. Since the goal is to learn Hadoop, it is better to install it on real Linux. Getting hold of a dedicated Linux machine is not easy for most individuals, but a virtual machine solves that: install VMware Player (a very good virtual machine) on Windows, install a Linux system inside it, and set up communication between the guest and the host Windows machine. How to install the virtual machine, how to install Linux inside it, and how to get the guest and the Windows host talking to each other are not covered here; there is plenty of material online.
If everything is set up correctly, running the WordCount example produces output like the following:

11/05/14 19:08:07 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
11/05/14 19:08:08 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
11/05/14 19:08:08 INFO input.FileInputFormat: Total input paths to process : 4
11/05/14 19:08:09 INFO mapred.JobClient: Running job: job_201105140203_0002
11/05/14 19:08:10 INFO mapred.JobClient: map 0% reduce 0%
11/05/14 19:08:35 INFO mapred.JobClient: map 50% reduce 0%
11/05/14 19:08:41 INFO mapred.JobClient: map 100% reduce 0%
11/05/14 19:08:53 INFO mapred.JobClient: map 100% reduce 100%
11/05/14 19:08:55 INFO mapred.JobClient: Job complete: job_201105140203_0002
11/05/14 19:08:55 INFO mapred.JobClient: Counters: 17
11/05/14 19:08:55 INFO mapred.JobClient: Job Counters
11/05/14 19:08:55 INFO mapred.JobClient: Launched reduce tasks=1
11/05/14 19:08:55 INFO mapred.JobClient: Launched map tasks=4
11/05/14 19:08:55 INFO mapred.JobClient: Data-local map tasks=4
11/05/14 19:08:55 INFO mapred.JobClient: FileSystemCounters
11/05/14 19:08:55 INFO mapred.JobClient: FILE_BYTES_READ=2557
11/05/14 19:08:55 INFO mapred.JobClient: HDFS_BYTES_READ=3361
11/05/14 19:08:55 INFO mapred.JobClient: FILE_BYTES_WRITTEN=5260
11/05/14 19:08:55 INFO mapred.JobClient: HDFS_BYTES_WRITTEN=1688
11/05/14 19:08:55 INFO mapred.JobClient: Map-Reduce Framework
11/05/14 19:08:55 INFO mapred.JobClient: Reduce input groups=192
11/05/14 19:08:55 INFO mapred.JobClient: Combine output records=202
11/05/14 19:08:55 INFO mapred.JobClient: Map input records=43
11/05/14 19:08:55 INFO mapred.JobClient: Reduce shuffle bytes=2575
11/05/14 19:08:55 INFO mapred.JobClient: Reduce output records=192
11/05/14 19:08:55 INFO mapred.JobClient: Spilled Records=404
11/05/14 19:08:55 INFO mapred.JobClient: Map output bytes=5070
11/05/14 19:08:55 INFO mapred.JobClient: Combine input records=488
11/05/14 19:08:55 INFO mapred.JobClient: Map output records=488
11/05/14 19:08:55 INFO mapred.JobClient: Reduce input records=202
1. If the following error appears:

11/05/08 21:41:37 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
before new Job -----------------------------------
11/05/08 21:41:40 INFO ipc.Client: Retrying connect to server: /192.168.133.128:9001. Already tried 0 time(s).
11/05/08 21:41:42 INFO ipc.Client: Retrying connect to server: /192.168.133.128:9001. Already tried 1 time(s).
11/05/08 21:41:44 INFO ipc.Client: Retrying connect to server: /192.168.133.128:9001. Already tried 2 time(s).
11/05/08 21:41:46 INFO ipc.Client: Retrying connect to server: /192.168.133.128:9001. Already tried 3 time(s).
11/05/08 21:41:48 INFO ipc.Client: Retrying connect to server: /192.168.133.128:9001. Already tried 4 time(s).
11/05/08 21:41:50 INFO ipc.Client: Retrying connect to server: /192.168.133.128:9001. Already tried 5 time(s).
11/05/08 21:41:52 INFO ipc.Client: Retrying connect to server: /192.168.133.128:9001. Already tried 6 time(s).
11/05/08 21:41:54 INFO ipc.Client: Retrying connect to server: /192.168.133.128:9001. Already tried 7 time(s).
11/05/08 21:41:56 INFO ipc.Client: Retrying connect to server: /192.168.133.128:9001. Already tried 8 time(s).
11/05/08 21:41:58 INFO ipc.Client: Retrying connect to server: /192.168.133.128:9001. Already tried 9 time(s).
Exception in thread "main" java.net.ConnectException: Call to /192.168.133.128:9001 failed on connection exception: java.net.ConnectException: Connection refused: no further information
at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
at org.apache.hadoop.ipc.Client.call(Client.java:743)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
at org.apache.hadoop.mapred.$Proxy0.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
at org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:429)
at org.apache.hadoop.mapred.JobClient.init(JobClient.java:423)
at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:410)
at org.apache.hadoop.mapreduce.Job.<init>(Job.java:50)
at org.apache.hadoop.mapreduce.Job.<init>(Job.java:54)
at org.apache.hadoop.examples.WordCount.main(WordCount.java:59)
Caused by: java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
at org.apache.hadoop.ipc.Client.call(Client.java:720)
... 9 more
When the above error occurs, the possible causes are: (a) Hadoop is not running at all, because the daemons were never started; (b) the connection address is wrong: check the address configured in core-site.xml and mapred-site.xml. If it is written as localhost, the Windows client cannot connect; it must be the actual IP of the Linux machine.
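To rule out cause (b) from the Windows side before touching the WordCount job, a minimal client sketch along the following lines can help. It is only an illustration: the class name ConnectivityCheck is made up, the IP 192.168.133.128 is the VM address that appears in the logs above, and the NameNode port 9000 is an assumption; fs.default.name and mapred.job.tracker are the usual property names for this Hadoop generation, but adapt all values to your own core-site.xml and mapred-site.xml.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

// Minimal connectivity check run from the Windows client.
// Assumptions: the NameNode listens on port 9000 and the JobTracker on port
// 9001 of the VM at 192.168.133.128 (the address from the logs above).
public class ConnectivityCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Use the real IP of the Linux VM, not localhost: on the Windows
        // client, localhost would point back at the Windows machine itself.
        conf.set("fs.default.name", "hdfs://192.168.133.128:9000");
        conf.set("mapred.job.tracker", "192.168.133.128:9001");

        // If this call throws a ConnectException, the daemons are down or the
        // address is wrong; the WordCount job itself is not the problem.
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Connected to " + fs.getUri());
        fs.close();
    }
}

If even this fails with Connection refused, start the daemons on the VM (for example with bin/start-all.sh) and re-check the configured addresses before rerunning WordCount.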
You can also see that, when the connection succeeds, you get the folder shown in (2) rather than the one shown in (1).
2. If the following error appears:
11/05/14 20:08:26 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
11/05/14 20:08:46 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
Exception in thread "main" java.net.UnknownHostException: unknown host: hadoopName
at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:195)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:850)
at org.apache.hadoop.ipc.Client.call(Client.java:720)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
at $Proxy1.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
at org.apache.hadoop.mapred.JobClient.getFs(JobClient.java:463)
at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:567)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
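The UnknownHostException means the Windows client cannot resolve the hostname hadoopName, presumably the value carried over from the cluster's core-site.xml. In line with point (b) above, either put the VM's IP address directly into the client configuration, or map the hostname on the Windows machine, for example by adding a line like "192.168.133.128  hadoopName" to C:\Windows\System32\drivers\etc\hosts. The small sketch below (HostCheck is just an illustrative name) checks from the client whether the name resolves at all:

import java.net.InetAddress;
import java.net.UnknownHostException;

// Quick resolution check from the Windows client. "hadoopName" is the host
// name from the error above; replace it with whatever your configuration uses.
public class HostCheck {
    public static void main(String[] args) {
        try {
            InetAddress addr = InetAddress.getByName("hadoopName");
            System.out.println("hadoopName resolves to " + addr.getHostAddress());
        } catch (UnknownHostException e) {
            // Same failure as the stack trace above. Either add a mapping such
            // as "192.168.133.128  hadoopName" to
            // C:\Windows\System32\drivers\etc\hosts, or put the IP address
            // directly into the client configuration as shown earlier.
            System.out.println("hadoopName cannot be resolved on this machine");
        }
    }
}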