How to fix java.lang.ClassNotFoundException: WordCount

bruce@bruce-laptop:~/Workspaces/MyEclipse 8.x/Crawl/WebRoot/WEB-INF/classes$ hadoop WordCount
Exception in thread "main" java.lang.NoClassDefFoundError: WordCount
Caused by: java.lang.ClassNotFoundException: WordCount
	at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:303)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
	at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:316)
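
The root cause is that the launcher cannot find a class named WordCount on its classpath: `hadoop WordCount` treats the argument as a class name and asks the class loader for it directly, rather than looking inside any jar. A minimal sketch (plain Java, no Hadoop required; the class name `CnfDemo` is illustrative) that reproduces the same exception:

```java
// Demonstrates the failure mode behind the stack trace above: asking the
// class loader for a class that is not on the classpath throws
// ClassNotFoundException with the class name as its message.
public class CnfDemo {
    public static void main(String[] args) {
        try {
            // "WordCount" is not on this JVM's classpath, so this throws.
            Class.forName("WordCount");
            System.out.println("found WordCount");
        } catch (ClassNotFoundException e) {
            System.out.println("ClassNotFoundException: " + e.getMessage());
        }
    }
}
```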
Solution:

Step 1: Package the program into a jar file (WordCount.jar).

Step 2: Run hadoop jar WordCount.jar demo.hadoop.WordCount, where demo.hadoop.WordCount is the fully qualified name of the main class.

bruce@bruce-laptop:~$ hadoop jar WordCount.jar demo.hadoop.WordCount
10/08/25 23:45:49 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
10/08/25 23:45:49 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
10/08/25 23:45:49 INFO mapred.FileInputFormat: Total input paths to process : 1
10/08/25 23:45:49 INFO mapred.JobClient: Running job: job_local_0001
10/08/25 23:45:49 INFO mapred.FileInputFormat: Total input paths to process : 1
10/08/25 23:45:49 INFO mapred.MapTask: numReduceTasks: 1
10/08/25 23:45:49 INFO mapred.MapTask: io.sort.mb = 100
10/08/25 23:45:49 INFO mapred.MapTask: data buffer = 79691776/99614720
10/08/25 23:45:49 INFO mapred.MapTask: record buffer = 262144/327680
10/08/25 23:45:49 INFO mapred.MapTask: Starting flush of map output
10/08/25 23:45:49 INFO mapred.MapTask: Finished spill 0
10/08/25 23:45:49 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
10/08/25 23:45:49 INFO mapred.LocalJobRunner: file:/home/bruce/input.txt:0+115
10/08/25 23:45:49 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
10/08/25 23:45:50 INFO mapred.LocalJobRunner: 
10/08/25 23:45:50 INFO mapred.Merger: Merging 1 sorted segments
10/08/25 23:45:50 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 193 bytes
10/08/25 23:45:50 INFO mapred.LocalJobRunner: 
10/08/25 23:45:50 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
10/08/25 23:45:50 INFO mapred.LocalJobRunner: 
10/08/25 23:45:50 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
10/08/25 23:45:50 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/home/bruce/output.txt
10/08/25 23:45:50 INFO mapred.LocalJobRunner: reduce > reduce
10/08/25 23:45:50 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
10/08/25 23:45:50 INFO mapred.JobClient:  map 100% reduce 100%
10/08/25 23:45:50 INFO mapred.JobClient: Job complete: job_local_0001
10/08/25 23:45:50 INFO mapred.JobClient: Counters: 13
10/08/25 23:45:50 INFO mapred.JobClient:   FileSystemCounters
10/08/25 23:45:50 INFO mapred.JobClient:     FILE_BYTES_READ=40975
10/08/25 23:45:50 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=68521
10/08/25 23:45:50 INFO mapred.JobClient:   Map-Reduce Framework
10/08/25 23:45:50 INFO mapred.JobClient:     Reduce input groups=17
10/08/25 23:45:50 INFO mapred.JobClient:     Combine output records=17
10/08/25 23:45:50 INFO mapred.JobClient:     Map input records=5
10/08/25 23:45:50 INFO mapred.JobClient:     Reduce shuffle bytes=0
10/08/25 23:45:50 INFO mapred.JobClient:     Reduce output records=17
10/08/25 23:45:50 INFO mapred.JobClient:     Spilled Records=34
10/08/25 23:45:50 INFO mapred.JobClient:     Map output bytes=200
10/08/25 23:45:50 INFO mapred.JobClient:     Map input bytes=115
10/08/25 23:45:50 INFO mapred.JobClient:     Combine input records=22
10/08/25 23:45:50 INFO mapred.JobClient:     Map output records=22
10/08/25 23:45:50 INFO mapred.JobClient:     Reduce input records=17