Unit Testing Hadoop MapReduce Jobs with MRUnit, Mockito, and PowerMock

0. Preliminary: setting up the development environment

Download the latest MRUnit jar from the Apache repository: https://repository.apache.org/content/repositories/releases/org/apache/mrunit/mrunit/. Pick the classifier that matches your Hadoop major version, for example mrunit-x.x.x-incubating-hadoop1.jar for Hadoop 1.0.3 or mrunit-x.x.x-incubating-hadoop2.jar for Hadoop 2.x.
Include the jar in your IDE classpath. Also download the latest versions of Mockito (http://code.google.com/p/mockito/) and JUnit and add them to the classpath of your development environment.
Import these jars into Eclipse running on CentOS.

Download page: https://mrunit.apache.org/general/downloads.html

Release repository: https://repository.apache.org/content/repositories/releases/org/apache/mrunit/mrunit/1.1.0/

Note:

The Hadoop framework distribution ships its own mock jars. Delete them all from the project classpath, otherwise you will run into jar compatibility conflicts (the compiler cannot tell which jar to resolve against).

Reposted from: http://www.infoq.com/cn/articles/HadoopMRUnit

(The practical part, the test cases, is at the end of this article.)

Introduction

Hadoop MapReduce jobs have a unique code architecture that follows a specific template and structure. Such an architecture raises problems for test-driven development and unit testing. This article is a real-world example of using MRUnit, Mockito, and PowerMock.

I will cover:

Writing JUnit tests for Hadoop MapReduce applications with MRUnit
Mocking static methods with PowerMock and Mockito (a minimal sketch follows below)
Mocking business logic that lives in other classes (translator's note: i.e. writing test driver modules)
Verifying whether the mocked business logic was invoked (translator's note: whether the driver module works correctly)
Counters
Integrating test cases with log4j
Exception handling

This article assumes that the reader is already familiar with JUnit 4.
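This repost only includes the MRUnit test cases, so as a taste of the PowerMock/Mockito items listed above, here is a minimal, hedged sketch of mocking and verifying a static method. DictionaryLookup and StaticMockSketchTest are hypothetical names invented for this example (they are not part of the original article), and the verification style assumes PowerMock 1.x.

import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;
import org.powermock.api.mockito.PowerMockito;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.powermock.modules.junit4.PowerMockRunner;

// Hypothetical helper with a static method, used only to illustrate static mocking.
class DictionaryLookup {
    static String normalize(String word) {
        return word.toLowerCase();
    }
}

@RunWith(PowerMockRunner.class)
@PrepareForTest(DictionaryLookup.class)
public class StaticMockSketchTest {

    @Test
    public void mocksAndVerifiesAStaticMethod() {
        // Replace all static methods of DictionaryLookup with mocks.
        PowerMockito.mockStatic(DictionaryLookup.class);
        Mockito.when(DictionaryLookup.normalize("CDN")).thenReturn("cdn");

        // The production code under test would call the static method here.
        String result = DictionaryLookup.normalize("CDN");
        assertEquals("cdn", result);

        // Check that the mocked static method was actually invoked (PowerMock 1.x style).
        PowerMockito.verifyStatic();
        DictionaryLookup.normalize("CDN");
    }
}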

With MRUnit you can feed stub inputs into a mapper and/or reducer and then assert, inside a JUnit environment, whether the test passes. Just like any other JUnit test, you can debug your code while it runs. MRUnit's MapReduceDriver tests a mapper/reducer pair, optionally together with a combiner. PipelineMapReduceDriver tests a workflow of MapReduce jobs. At the moment, MRUnit has no driver for partitioners. MRUnit lets developers practice TDD and lightweight unit testing even within Hadoop's peculiar architecture.
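Besides runTest(), which checks the emitted output against the withOutput() expectations, the drivers also provide run(), which returns the emitted key/value pairs so you can write your own assertions. A minimal sketch, assuming MRUnit 1.x with the new-API drivers under org.apache.hadoop.mrunit.mapreduce and the WordCount.Map class used by the test cases below (sketched at the end of the article):

import static org.junit.Assert.assertEquals;

import java.io.IOException;
import java.util.List;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.apache.hadoop.mrunit.types.Pair;
import org.junit.Test;

public class MapRunSketchTest {

    @Test
    public void inspectMapperOutputDirectly() throws IOException {
        MapDriver<LongWritable, Text, Text, IntWritable> mapDriver =
                new MapDriver<LongWritable, Text, Text, IntWritable>(new WordCount.Map());

        // run() executes the mapper and hands back the emitted pairs.
        List<Pair<Text, IntWritable>> output = mapDriver
                .withInput(new LongWritable(1), new Text("hello hello"))
                .run();

        assertEquals(2, output.size());
        assertEquals(new Text("hello"), output.get(0).getFirst());
        assertEquals(new IntWritable(1), output.get(0).getSecond());
    }
}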

Test cases (MapTest + ReduceTest + MapReduceTest)

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

public class MapTest {

    private Mapper<LongWritable, Text, Text, IntWritable> mapper;
    private MapDriver<LongWritable, Text, Text, IntWritable> driver;

    @Before
    public void init() {
        mapper = new WordCount.Map();
        driver = new MapDriver<LongWritable, Text, Text, IntWritable>(mapper);
    }

    @Test
    public void test() throws IOException {
        String line = "this is a test case for map";
        // Each word of the input line should be emitted once with a count of 1.
        driver.withInput(new LongWritable(1), new Text(line))
                .withOutput(new Text("this"), new IntWritable(1))
                .withOutput(new Text("is"), new IntWritable(1))
                .withOutput(new Text("a"), new IntWritable(1))
                .withOutput(new Text("test"), new IntWritable(1))
                .withOutput(new Text("case"), new IntWritable(1))
                .withOutput(new Text("for"), new IntWritable(1))
                .withOutput(new Text("map"), new IntWritable(1))
                .runTest();
    }
}

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;
import org.junit.Before;
import org.junit.Test;

public class ReduceTest {

    private Reducer<Text, IntWritable, Text, IntWritable> reducer;
    private ReduceDriver<Text, IntWritable, Text, IntWritable> driver;

    @Before
    public void init() {
        reducer = new WordCount.Reduce();
        driver = new ReduceDriver<Text, IntWritable, Text, IntWritable>(reducer);
    }

    @Test
    public void test() throws IOException {
        String key = "test";
        List<IntWritable> values = new ArrayList<IntWritable>();
        values.add(new IntWritable(2));
        values.add(new IntWritable(3)); // the reducer should sum 2 + 3 = 5
        driver.withInput(new Text(key), values)
                .withOutput(new Text("test"), new IntWritable(5))
                .runTest();
    }
}

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver;
import org.junit.Before;
import org.junit.Test;

public class MapReduceTest {

    private Mapper<LongWritable, Text, Text, IntWritable> mapper;
    private Reducer<Text, IntWritable, Text, IntWritable> reducer;
    private MapReduceDriver<LongWritable, Text, Text, IntWritable, Text, IntWritable> driver;

    @Before
    public void init() {
        mapper = new WordCount.Map();
        reducer = new WordCount.Reduce();
        driver = new MapReduceDriver<LongWritable, Text, Text, IntWritable, Text, IntWritable>(mapper, reducer);
    }

    @Test
    public void test() throws IOException {
        String line = "chinacache is a great CDN is it not";
        // Expected output is sorted by key; "is" appears twice, so its count is 2.
        driver.withInput(new LongWritable(1), new Text(line))
                .withOutput(new Text("CDN"), new IntWritable(1))
                .withOutput(new Text("a"), new IntWritable(1))
                .withOutput(new Text("chinacache"), new IntWritable(1))
                .withOutput(new Text("great"), new IntWritable(1))
                .withOutput(new Text("is"), new IntWritable(2))
                .withOutput(new Text("it"), new IntWritable(1))
                .withOutput(new Text("not"), new IntWritable(1))
                .runTest();
    }
}
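The tests reference WordCount.Map and WordCount.Reduce, which this repost does not include. For completeness, here is a minimal sketch of what they are assumed to look like: the standard new-API word count, where the mapper splits each line on whitespace and emits (word, 1) and the reducer sums the counts per word.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class WordCount {

    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Split the line on whitespace and emit (word, 1) for every token.
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                context.write(word, ONE);
            }
        }
    }

    public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // Sum all partial counts for the word.
            int sum = 0;
            for (IntWritable value : values) {
                sum += value.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }
}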
