Max-temperature MapReduce implementation, and the Hadoop error InvalidAuxServiceException: The auxService:mapreduce_shuffle does not exist

The very first run of the hadoop jar command failed. The root cause, buried in the output, is: org.apache.hadoop.yarn.exceptions.InvalidAuxServiceException: The auxService:mapreduce_shuffle does not exist

The Hadoop output describing the problem:

[hadoop@master bin]$ hadoop jar /soft/source/mr-0.0.1-SNAPSHOT.jar hmr/mr/App /user/hadoop/data /user/hadoop/out
18/11/17 20:25:20 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.2.10:8032
18/11/17 20:25:22 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
18/11/17 20:25:22 WARN mapreduce.JobResourceUploader: No job jar file set.  User classes may not be found. See Job or Job#setJar(String).
18/11/17 20:25:23 INFO input.FileInputFormat: Total input paths to process : 2
18/11/17 20:25:24 INFO mapreduce.JobSubmitter: number of splits:2
18/11/17 20:25:25 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1542507182584_0007
18/11/17 20:25:25 INFO mapred.YARNRunner: Job jar is not present. Not adding any jar to the list of resources.
18/11/17 20:25:26 INFO impl.YarnClientImpl: Submitted application application_1542507182584_0007
18/11/17 20:25:27 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1542507182584_0007/
18/11/17 20:25:27 INFO mapreduce.Job: Running job: job_1542507182584_0007
18/11/17 20:25:42 INFO mapreduce.Job: Job job_1542507182584_0007 running in uber mode : false
18/11/17 20:25:42 INFO mapreduce.Job:  map 0% reduce 0%
18/11/17 20:25:44 INFO mapreduce.Job: Task Id : attempt_1542507182584_0007_m_000000_0, Status : FAILED
Container launch failed for container_1542507182584_0007_01_000002 : org.apache.hadoop.yarn.exceptions.InvalidAuxServiceException: The auxService:mapreduce_shuffle does not exist
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.yarn.api.records.impl.pb.SerializedExceptionPBImpl.instantiateException(SerializedExceptionPBImpl.java:168)
	at org.apache.hadoop.yarn.api.records.impl.pb.SerializedExceptionPBImpl.deSerialize(SerializedExceptionPBImpl.java:106)
	at org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl$Container.launch(ContainerLauncherImpl.java:155)
	at org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl$EventProcessor.run(ContainerLauncherImpl.java:375)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

(Between 20:25:44 and 20:25:51, attempts attempt_1542507182584_0007_m_000001_0 through attempt_1542507182584_0007_m_000000_2 failed on containers 000003 to 000007 with the same InvalidAuxServiceException stack trace as above; the identical traces are omitted here.)

18/11/17 20:26:16 INFO mapreduce.Job:  map 100% reduce 100%
18/11/17 20:26:52 INFO mapreduce.Job: Job job_1542507182584_0007 failed with state FAILED due to: Task failed task_1542507182584_0007_m_000001
Job failed as tasks failed. failedMaps:1 failedReduces:0

18/11/17 20:26:52 INFO mapreduce.Job: Counters: 13
	Job Counters 
		Failed map tasks=8
		Killed reduce tasks=1
		Launched map tasks=8
		Other local map tasks=6
		Data-local map tasks=2
		Total time spent by all maps in occupied slots (ms)=11
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=11
		Total time spent by all reduce tasks (ms)=0
		Total vcore-milliseconds taken by all map tasks=11
		Total vcore-milliseconds taken by all reduce tasks=0
		Total megabyte-milliseconds taken by all map tasks=11264
		Total megabyte-milliseconds taken by all reduce tasks=0

The fix: add the following properties to hadoop/etc/hadoop/yarn-site.xml:

        
        <property>
            <name>yarn.nodemanager.aux-services</name>
            <value>mapreduce_shuffle</value>
        </property>
        <property>
            <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
            <value>org.apache.hadoop.mapred.ShuffleHandler</value>
        </property>

Distribute the updated yarn-site.xml to every node running a NodeManager, restart the YARN services (for example stop-yarn.sh followed by start-yarn.sh), then re-run the hadoop jar ... command; it now succeeds.

[hadoop@master bin]$ hadoop jar /soft/source/mr-0.0.1-SNAPSHOT.jar hmr/mr/App  /user/hadoop/data /user/hadoop/out
18/11/17 20:47:38 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.2.10:8032
18/11/17 20:47:41 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
18/11/17 20:47:44 INFO input.FileInputFormat: Total input paths to process : 2
18/11/17 20:47:44 INFO mapreduce.JobSubmitter: number of splits:2
18/11/17 20:47:44 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1542516391042_0001
18/11/17 20:47:46 INFO impl.YarnClientImpl: Submitted application application_1542516391042_0001
18/11/17 20:47:46 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1542516391042_0001/
18/11/17 20:47:46 INFO mapreduce.Job: Running job: job_1542516391042_0001
18/11/17 20:48:07 INFO mapreduce.Job: Job job_1542516391042_0001 running in uber mode : false
18/11/17 20:48:07 INFO mapreduce.Job:  map 0% reduce 0%
18/11/17 20:48:28 INFO mapreduce.Job:  map 100% reduce 0%
18/11/17 20:48:46 INFO mapreduce.Job:  map 100% reduce 100%
18/11/17 20:48:47 INFO mapreduce.Job: Job job_1542516391042_0001 completed successfully
18/11/17 20:48:47 INFO mapreduce.Job: Counters: 49
	File System Counters
		FILE: Number of bytes read=468
		FILE: Number of bytes written=367466
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=5930
		HDFS: Number of bytes written=17
		HDFS: Number of read operations=9
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
	Job Counters 
		Launched map tasks=2
		Launched reduce tasks=1
		Data-local map tasks=2
		Total time spent by all maps in occupied slots (ms)=38220
		Total time spent by all reduces in occupied slots (ms)=12984
		Total time spent by all map tasks (ms)=38220
		Total time spent by all reduce tasks (ms)=12984
		Total vcore-milliseconds taken by all map tasks=38220
		Total vcore-milliseconds taken by all reduce tasks=12984
		Total megabyte-milliseconds taken by all map tasks=39137280
		Total megabyte-milliseconds taken by all reduce tasks=13295616
	Map-Reduce Framework
		Map input records=42
		Map output records=42
		Map output bytes=378
		Map output materialized bytes=474
		Input split bytes=210
		Combine input records=0
		Combine output records=0
		Reduce input groups=2
		Reduce shuffle bytes=474
		Reduce input records=42
		Reduce output records=2
		Spilled Records=84
		Shuffled Maps =2
		Failed Shuffles=0
		Merged Map outputs=2
		GC time elapsed (ms)=413
		CPU time spent (ms)=8170
		Physical memory (bytes) snapshot=539553792
		Virtual memory (bytes) snapshot=6189756416
		Total committed heap usage (bytes)=350158848
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters 
		Bytes Read=5720
	File Output Format Counters 
		Bytes Written=17

The code is attached below:

Mapper:

package hmr.mr;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MaxTemp extends Mapper<LongWritable, Text, Text, IntWritable> {
	private static final int MISSING = 9999;

	@Override
	protected void map(LongWritable key, Text value, Context context)
			throws IOException, InterruptedException {
		// read one full line of input
		String line = value.toString();
		// extract the year (columns 15-19)
		String year = line.substring(15, 19);
		// extract the air temperature (tenths of a degree)
		int airTemperature;
		if (line.charAt(87) == '+') {
			airTemperature = Integer.parseInt(line.substring(88, 92));
		} else {
			airTemperature = Integer.parseInt(line.substring(87, 92));
		}
		String quality = line.substring(92, 93);
		if (airTemperature != MISSING && quality.matches("[01459]")) {
			context.write(new Text(year), new IntWritable(airTemperature));
		}
	}

}
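The fixed-width parsing above can be exercised outside Hadoop. In the sketch below, ParseDemo is a hypothetical helper name and the record is a synthetic line (an assumption), padded to the layout the mapper expects: year in columns 15-19, sign at 87, temperature in 88-92, quality flag at 92.

```java
// Standalone check of the fixed-width parsing used in MaxTemp.
public class ParseDemo {
    static final int MISSING = 9999;

    /** Year field occupies columns 15-19 of the record. */
    static String parseYear(String line) {
        return line.substring(15, 19);
    }

    /** Temperature in tenths of a degree, or MISSING if unusable. */
    static int parseTemp(String line) {
        int t = (line.charAt(87) == '+')
                ? Integer.parseInt(line.substring(88, 92))
                : Integer.parseInt(line.substring(87, 92));
        String quality = line.substring(92, 93);
        return (t != MISSING && quality.matches("[01459]")) ? t : MISSING;
    }

    public static void main(String[] args) {
        // build a synthetic 95-character record
        char[] c = new char[95];
        java.util.Arrays.fill(c, '0');
        "1949".getChars(0, 4, c, 15);   // year
        c[87] = '+';
        "0111".getChars(0, 4, c, 88);   // +11.1 degrees C, stored as 0111
        c[92] = '1';                    // quality flag 1 = usable reading
        String line = new String(c);
        System.out.println(parseYear(line) + " " + parseTemp(line)); // 1949 111
    }
}
```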

Reducer:

package hmr.mr;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class MaxTempReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

	@Override
	protected void reduce(Text key, Iterable<IntWritable> values,
			Context context) throws IOException, InterruptedException {
		int maxValue = Integer.MIN_VALUE;
		// keep the maximum temperature seen for this year
		for (IntWritable value : values) {
			maxValue = Math.max(maxValue, value.get());
		}
		context.write(key, new IntWritable(maxValue));
	}

}
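Stripped of the Writable wrappers, the reduce step is a plain max-fold and can be rehearsed with ordinary ints (MaxDemo and the sample readings below are assumptions for illustration):

```java
import java.util.Arrays;
import java.util.List;

// Plain-Java rehearsal of MaxTempReducer's fold: reduce one year's
// readings (tenths of a degree) down to the maximum.
public class MaxDemo {
    static int maxOf(Iterable<Integer> values) {
        int maxValue = Integer.MIN_VALUE;
        for (int v : values) {
            maxValue = Math.max(maxValue, v);
        }
        return maxValue;
    }

    public static void main(String[] args) {
        List<Integer> readings = Arrays.asList(111, -11, 78, 22);
        System.out.println(maxOf(readings)); // prints 111
    }
}
```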

Main class (App):

package hmr.mr;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;


/**
 * Driver: configures and submits the max-temperature job.
 */
public class App
{
    public static void main(String[] args) throws Exception
    {
        if (args.length != 2) {
            System.err.println("Usage: MaxTemperature <input path> <output path>");
            System.exit(-1);
        }
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf);
        job.setJobName("Max Temperature");
        job.setJarByClass(App.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        job.setMapperClass(MaxTemp.class);
        job.setReducerClass(MaxTempReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
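The job logs above also warned "Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this." One possible reworking of the driver along those lines is sketched below; it assumes the Hadoop 2.x client API on the classpath, and AppTool is a hypothetical class name (the Mapper/Reducer names mirror the code above).

```java
package hmr.mr;

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class AppTool extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        if (args.length != 2) {
            System.err.println("Usage: MaxTemperature <input path> <output path>");
            return -1;
        }
        // getConf() carries any generic options (-D key=value, etc.)
        // that ToolRunner parsed off the command line
        Job job = Job.getInstance(getConf(), "Max Temperature");
        job.setJarByClass(AppTool.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        job.setMapperClass(MaxTemp.class);
        job.setReducerClass(MaxTempReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips the generic options before calling run()
        System.exit(ToolRunner.run(new AppTool(), args));
    }
}
```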

