WordCount Hands-On: Packaging the Jar and Running It on the Cluster

1. In IDEA, build the project jar (see the screenshot below; typically done by running Maven's package phase).

[Screenshot 1]

2. Copy mapreduce-1.0-SNAPSHOT.jar to the desktop.

[Screenshot 2]

3. Rename the jar (here it becomes mapreduce1.jar) and drag it into the virtual machine.

[Screenshot 3]

4. Verify that the jar is present in the virtual machine.

[Screenshot 4]

5. Copy the fully qualified class name of the driver: WcDriver (com.zpark.wordcount.WcDriver).
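For reference, a minimal driver of this shape might look like the sketch below. The mapper and reducer class names (WcMapper, WcReducer) are assumptions, since only the driver's fully qualified name appears in this article; adjust them to match the actual project.

package com.zpark.wordcount;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal driver sketch: args[0] is the input path and args[1] the output path,
// matching the "hadoop jar mapreduce1.jar com.zpark.wordcount.WcDriver /1.txt /ooutput" call in step 6.
public class WcDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WcDriver.class);

        // WcMapper and WcReducer are assumed class names, not shown in the original article.
        job.setMapperClass(WcMapper.class);
        job.setReducerClass(WcReducer.class);

        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}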

6. In CRT (the SSH terminal), run the job. The input file /1.txt must already exist on HDFS:

[root@hdp-1 ~]# cd apps
[root@hdp-1 apps]# cd hadoop-2.8.1/
[root@hdp-1 hadoop-2.8.1]# hadoop jar mapreduce1.jar com.zpark.wordcount.WcDriver /1.txt /ooutput

20/02/01 19:21:48 INFO client.RMProxy: Connecting to ResourceManager at hdp-1/192.168.0.128:8032
20/02/01 19:21:49 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
20/02/01 19:21:49 INFO input.FileInputFormat: Total input files to process : 1
20/02/01 19:21:50 INFO mapreduce.JobSubmitter: number of splits:1
20/02/01 19:21:51 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1580555517299_0001
20/02/01 19:21:51 INFO impl.YarnClientImpl: Submitted application application_1580555517299_0001
20/02/01 19:21:52 INFO mapreduce.Job: The url to track the job: http://hdp-1:8088/proxy/application_1580555517299_0001/
20/02/01 19:21:52 INFO mapreduce.Job: Running job: job_1580555517299_0001
20/02/01 19:22:07 INFO mapreduce.Job: Job job_1580555517299_0001 running in uber mode : false
20/02/01 19:22:07 INFO mapreduce.Job:  map 0% reduce 0%
20/02/01 19:22:19 INFO mapreduce.Job:  map 100% reduce 0%
20/02/01 19:22:31 INFO mapreduce.Job:  map 100% reduce 100%
20/02/01 19:22:32 INFO mapreduce.Job: Job job_1580555517299_0001 completed successfully
20/02/01 19:22:32 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=107
                FILE: Number of bytes written=272411
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=124
                HDFS: Number of bytes written=36
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters 
                Launched map tasks=1
                Launched reduce tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=9492
                Total time spent by all reduces in occupied slots (ms)=8040
                Total time spent by all map tasks (ms)=9492
                Total time spent by all reduce tasks (ms)=8040
                Total vcore-milliseconds taken by all map tasks=9492
                Total vcore-milliseconds taken by all reduce tasks=8040
                Total megabyte-milliseconds taken by all map tasks=9719808
                Total megabyte-milliseconds taken by all reduce tasks=8232960
        Map-Reduce Framework
                Map input records=3
                Map output records=11
                Map output bytes=79
                Map output materialized bytes=107
                Input split bytes=88
                Combine input records=0
                Combine output records=0
                Reduce input groups=6
                Reduce shuffle bytes=107
                Reduce input records=11
                Reduce output records=6
                Spilled Records=22
                Shuffled Maps =1
                Failed Shuffles=0
                Merged Map outputs=1
                GC time elapsed (ms)=301
                CPU time spent (ms)=2200
                Physical memory (bytes) snapshot=303382528
                Virtual memory (bytes) snapshot=4164362240
                Total committed heap usage (bytes)=152588288
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters 
                Bytes Read=36
        File Output Format Counters 
                Bytes Written=36
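
Note the WARN line near the top of the log ("Hadoop command-line option parsing not performed..."). It is harmless for this run, but it can be removed by having the driver implement the Tool interface and launching it through ToolRunner, roughly as in the sketch below (again using the assumed WcMapper/WcReducer names).

package com.zpark.wordcount;

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Variant of the driver using the Tool/ToolRunner pattern: ToolRunner parses generic
// options (-D, -files, ...) before run() is called, which silences the warning above.
public class WcDriver extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        Job job = Job.getInstance(getConf(), "word count");
        job.setJarByClass(WcDriver.class);
        job.setMapperClass(WcMapper.class);     // assumed mapper class name
        job.setReducerClass(WcReducer.class);   // assumed reducer class name
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new WcDriver(), args));
    }
}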

7. View the job output on HDFS:

[root@hdp-1 hadoop-2.8.1]# hadoop fs -cat /ooutput/*
        5
1       1
hadoop  1
hellow  2
me      1
you     1
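
Note the first result line: an empty word counted 5 times. A likely cause (the mapper code is not shown in this article, so this is only a guess) is splitting each line on a single space, which turns consecutive or leading spaces into empty tokens:

import java.util.Arrays;

// Hypothetical illustration of how empty tokens can appear in the word counts.
public class SplitDemo {
    public static void main(String[] args) {
        String line = "hellow  you";                                     // note the double space
        System.out.println(Arrays.toString(line.split(" ")));           // [hellow, , you] -> "" becomes an empty key
        System.out.println(Arrays.toString(line.trim().split("\\s+"))); // [hellow, you]   -> no empty tokens
    }
}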
