After running the job, the following error appeared:
[hadoop@master ~]$ yarn jar /opt/eclipse/wc.jar org.apache.hadoop.examples.wordcount /lzg/words /lzg/output
Exception in thread "main" java.lang.ClassNotFoundException: org.apache.hadoop.examples.wordcount
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
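Before getting to the fix, it helps to see why the JVM rejects this name. Here is a minimal standalone sketch (the class and method names are illustrative, not from Hadoop) showing that `Class.forName` lookup is case-sensitive, which is exactly what produces the `ClassNotFoundException` above:

```java
// Illustrative sketch (not Hadoop code): JVM class lookup via
// Class.forName is case-sensitive, so a wrong-case name fails
// with ClassNotFoundException.
public class CaseSensitiveLookup {

    // Returns true if the class loads, false on ClassNotFoundException.
    static boolean canLoad(String name) {
        try {
            Class.forName(name);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("java.lang.String -> " + canLoad("java.lang.String"));
        System.out.println("java.lang.string -> " + canLoad("java.lang.string"));
    }
}
```

The same rule applies to the class name passed to `yarn jar`: it is handed to the classloader verbatim, with no case folding.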
However, running the examples jar under share/ works fine:
[hadoop@master ~]$ yarn jar /opt/hadoop-2.6.0-cdh5.6.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0-cdh5.6.0.jar wordcount /lzg/words /lzg/output
19/03/19 16:48:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/03/19 16:48:09 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.202.10:8032
19/03/19 16:48:10 INFO mapreduce.JobSubmissionFiles: Permissions on staging directory /tmp/hadoop-yarn/staging/hadoop/.staging are incorrect: rwxrwxrwx. Fixing permissions to correct value rwx------
19/03/19 16:48:12 INFO input.FileInputFormat: Total input paths to process : 1
19/03/19 16:48:15 INFO mapreduce.JobSubmitter: number of splits:1
19/03/19 16:48:16 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1552964615486_0001
19/03/19 16:48:21 INFO impl.YarnClientImpl: Submitted application application_1552964615486_0001
19/03/19 16:48:22 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1552964615486_0001/
19/03/19 16:48:22 INFO mapreduce.Job: Running job: job_1552964615486_0001
19/03/19 16:49:03 INFO mapreduce.Job: Job job_1552964615486_0001 running in uber mode : false
19/03/19 16:49:04 INFO mapreduce.Job: map 0% reduce 0%
19/03/19 16:49:45 INFO mapreduce.Job: map 100% reduce 0%
19/03/19 16:50:04 INFO mapreduce.Job: map 100% reduce 100%
19/03/19 16:50:06 INFO mapreduce.Job: Job job_1552964615486_0001 completed successfully
19/03/19 16:50:06 INFO mapreduce.Job: Counters: 49
File System Counters
FILE: Number of bytes read=178
FILE: Number of bytes written=221797
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=237
HDFS: Number of bytes written=120
HDFS: Number of read operations=6
HDFS: Number of large read operations=0
HDFS: Number of write operations=2
Job Counters
Launched map tasks=1
Launched reduce tasks=1
Data-local map tasks=1
Total time spent by all maps in occupied slots (ms)=36622
Total time spent by all reduces in occupied slots (ms)=16274
Total time spent by all map tasks (ms)=36622
Total time spent by all reduce tasks (ms)=16274
Total vcore-seconds taken by all map tasks=36622
Total vcore-seconds taken by all reduce tasks=16274
Total megabyte-seconds taken by all map tasks=37500928
Total megabyte-seconds taken by all reduce tasks=16664576
Map-Reduce Framework
Map input records=9
Map output records=20
Map output bytes=223
Map output materialized bytes=178
Input split bytes=93
Combine input records=20
Combine output records=13
Reduce input groups=13
Reduce shuffle bytes=178
Reduce input records=13
Reduce output records=13
Spilled Records=26
Shuffled Maps =1
Failed Shuffles=0
Merged Map outputs=1
GC time elapsed (ms)=349
CPU time spent (ms)=3220
Physical memory (bytes) snapshot=345022464
Virtual memory (bytes) snapshot=5583060992
Total committed heap usage (bytes)=229638144
Shuffle Errors
BAD_ID=0
CONNECTION=0
IO_ERROR=0
WRONG_LENGTH=0
WRONG_MAP=0
WRONG_REDUCE=0
File Input Format Counters
Bytes Read=144
File Output Format Counters
Bytes Written=120
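Why does the lowercase name `wordcount` work with the bundled examples jar but not with wc.jar? The examples jar's manifest Main-Class is a driver (`org.apache.hadoop.examples.ExampleDriver`, built on Hadoop's `ProgramDriver`) that maps short lowercase program names to the actual classes, so the command-line argument never reaches `Class.forName` directly. A simplified sketch of that dispatch, assuming a plain map in place of the real `ProgramDriver`:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified sketch of how the examples jar dispatches "wordcount":
// the driver maps registered lowercase program names to fully
// qualified class names, so case-sensitive class lookup never sees
// the short name typed on the command line.
public class ExampleDriverSketch {

    private static final Map<String, String> PROGRAMS = new HashMap<>();
    static {
        PROGRAMS.put("wordcount", "org.apache.hadoop.examples.WordCount");
        PROGRAMS.put("grep", "org.apache.hadoop.examples.Grep");
    }

    // Returns the fully qualified class name, or null if unregistered.
    public static String resolve(String programName) {
        return PROGRAMS.get(programName);
    }

    public static void main(String[] args) {
        System.out.println(resolve("wordcount"));
    }
}
```

With wc.jar there is no such driver, so the argument must be the exact fully qualified class name.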
The actual mistake is in the class name passed in the original command:

[hadoop@master ~]$ yarn jar /opt/eclipse/wc.jar org.apache.hadoop.examples.wordcount /lzg/words /lzg/output

It should be org.apache.hadoop.examples.WordCount. Java class names are case-sensitive, so note the uppercase W and C in WordCount.
[hadoop@master ~]$ hadoop jar /opt/eclipse/wc.jar org.apache.hadoop.examples.WordCount /lzg/words /lzg/output
Problem solved!