mrunit

Conflicting API when trying to run MRUnit example

时光总嘲笑我的痴心妄想 submitted on 2020-01-05 12:34:34
Question: I've been playing around with MRUnit and tried running it against the Hadoop WordCount example, following the tutorials for WordCount and unit testing. Though I'm not a fan, I've been using Eclipse to run the code, and I keep getting an error on the setMapper call:

    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mrunit.mapreduce
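A setMapper signature error in this setup is usually caused by mixing the old org.apache.hadoop.mrunit (mapred) driver classes with a mapper written against the new org.apache.hadoop.mapreduce API. A minimal sketch using the mapreduce-flavored driver throughout; the TokenizerMapper here is an illustrative stand-in for the tutorial's WordCount mapper, not the asker's code:

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
// note: mrunit.mapreduce, not mrunit — the latter drives the old mapred API
import org.apache.hadoop.mrunit.mapreduce.MapDriver;

public class WordCountMapperTest {
    // Illustrative stand-in for the tutorial's WordCount mapper.
    static class TokenizerMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            for (String w : value.toString().split("\\s+")) {
                ctx.write(new Text(w), ONE);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // The new-API mapper must be paired with the new-API MapDriver,
        // otherwise setMapper/newMapDriver will not accept it.
        MapDriver<LongWritable, Text, Text, IntWritable> mapDriver =
                MapDriver.newMapDriver(new TokenizerMapper());
        mapDriver.withInput(new LongWritable(0), new Text("cat cat"))
                 .withOutput(new Text("cat"), new IntWritable(1))
                 .withOutput(new Text("cat"), new IntWritable(1))
                 .runTest();
    }
}
```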

MRUnit with Avro NullPointerException in Serialization

别来无恙 submitted on 2019-12-18 08:47:16
Question: I'm trying to test a Hadoop .mapreduce Avro job using MRUnit. I am receiving a NullPointerException, as seen below. I've attached a portion of the POM and source code. Any assistance would be appreciated. Thanks. The error I'm getting is:

    java.lang.NullPointerException
        at org.apache.hadoop.mrunit.internal.io.Serialization.copy(Serialization.java:73)
        at org.apache.hadoop.mrunit.internal.io.Serialization.copy(Serialization.java:91)
        at org.apache.hadoop.mrunit.internal.io.Serialization
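An NPE inside MRUnit's Serialization.copy typically means the driver's Configuration has no serializer registered for the Avro types being copied. The commonly cited fix is to register AvroSerialization and the writer/reader schemas on the driver's Configuration before running the test. A hedged setup fragment — mapDriver and the STRING schema are placeholder assumptions; substitute your own driver and record schema:

```java
import org.apache.avro.Schema;
import org.apache.avro.hadoop.io.AvroSerialization;
import org.apache.hadoop.conf.Configuration;

// Inside the test's setup, after the MRUnit driver is created:
Configuration conf = mapDriver.getConfiguration();
// Append AvroSerialization to the existing serializer list so that
// Writable serialization keeps working alongside it.
conf.setStrings("io.serializations",
        conf.get("io.serializations"),
        AvroSerialization.class.getName());
// Tell AvroSerialization which schemas to use when copying keys.
AvroSerialization.setKeyWriterSchema(conf, Schema.create(Schema.Type.STRING));
AvroSerialization.setKeyReaderSchema(conf, Schema.create(Schema.Type.STRING));
```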

Testing multiple outputs with MRUnit

老子叫甜甜 submitted on 2019-12-14 03:45:06
Question: Is there a way to test a reduce class with MRUnit that uses MultipleOutputFormat to write to multiple output files?

Answer 1: It looks like support for MultipleOutputs is still work in progress in the MRUnit JIRA. That being said, I found someone who implemented his own drivers by subclassing MRUnit's MapReduceDriver to make it work with MultipleOutputs here; hope that helps.

Answer 2: MRUnit 1.1.0 was released in June 2014 (see http://mrunit.apache.org/). This latest release includes support for MultipleOutputs.

MRUnit passing values in hbase Result object

六眼飞鱼酱① submitted on 2019-12-12 10:26:24
Question: I am testing my mapper with MRUnit. I am passing a key and a list of values as input to the mapper from the test class. The problem is:

    String key = "1234_abc";
    ArrayList<KeyValue> list = new ArrayList<KeyValue>();
    KeyValue k1 = new KeyValue(Bytes.toBytes(key), "cf".getBytes(),
            "Val1".getBytes(), Bytes.toBytes("abc.com"));
    KeyValue k2 = new KeyValue(Bytes.toBytes(key), "cf".getBytes(),
            "Val2".getBytes(), Bytes.toBytes("165"));
    Result result = new Result(list);
    mapDriver.withInput(key, result);
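One visible problem in the snippet is that k1 and k2 are never added to list, so the Result is built from an empty list and the mapper sees no KeyValues. A minimal corrected sketch; wrapping the key in ImmutableBytesWritable for withInput is an assumption about the mapper's declared key type:

```java
import java.util.ArrayList;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.util.Bytes;

public class ResultInputExample {
    public static void main(String[] args) {
        String key = "1234_abc";
        ArrayList<KeyValue> list = new ArrayList<KeyValue>();
        KeyValue k1 = new KeyValue(Bytes.toBytes(key), "cf".getBytes(),
                "Val1".getBytes(), Bytes.toBytes("abc.com"));
        KeyValue k2 = new KeyValue(Bytes.toBytes(key), "cf".getBytes(),
                "Val2".getBytes(), Bytes.toBytes("165"));
        list.add(k1);   // without these two adds, the Result is empty
        list.add(k2);
        Result result = new Result(list);
        // A TableMapper keys on ImmutableBytesWritable, not String (assumption):
        // mapDriver.withInput(new ImmutableBytesWritable(Bytes.toBytes(key)), result);
    }
}
```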

Avro with MRUnit gives InstantiationException

若如初见. submitted on 2019-12-11 17:26:28
Question: I'm using:

    hadoop-client 2.2.0
    mrunit 1.0.0
    avro 1.7.6
    avro-mrunit 1.7.6

...and the entire thing is being built and tested using Maven. I was getting a NullPointerException until I followed the instructions at "MRUnit with Avro NullPointerException in Serialization". Now I am getting an InstantiationException:

    Running mypackage.MyTest
    log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
    log4j:WARN Please initialize the log4j system properly.

What's the successor of mrunit?

时光怂恿深爱的人放手 submitted on 2019-11-28 18:24:56
Question: Today I found out that the ASF retired MRUnit (see https://blogs.apache.org/foundation/entry/the_apache_news_round_up85 and https://issues.apache.org/jira/browse/HADOOP-3733 and the homepage itself). Other than "inactivity", no reason was given, so I guess there has to be an alternative? What is supposed to be used instead of MRUnit to unit test MapReduce jobs in the future?

Answer (SriK): I believe the reason for this retirement is the realization that Mockito can already handle what you need to unit test your mappers/reducers. All you need is to mock your Context, Counter and HBase KeyValue and
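The Mockito approach the answer describes can be sketched roughly as follows. WordCountMapper and the verified counts are illustrative assumptions; the point is that mocking Mapper.Context replaces the MRUnit driver entirely:

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import static org.mockito.Mockito.*;

public class MockitoMapperTest {
    // Illustrative mapper: emits (word, 1) per whitespace-separated token.
    static class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            for (String w : value.toString().split("\\s+")) {
                ctx.write(new Text(w), ONE);
            }
        }
    }

    @SuppressWarnings("unchecked")
    public static void main(String[] args) throws Exception {
        WordCountMapper mapper = new WordCountMapper();
        // Mock the Context instead of standing up an MRUnit driver.
        Mapper<LongWritable, Text, Text, IntWritable>.Context ctx =
                mock(Mapper.Context.class);
        mapper.map(new LongWritable(0), new Text("cat cat dog"), ctx);
        // Verify the mapper emitted each token with a count of 1.
        verify(ctx, times(2)).write(new Text("cat"), new IntWritable(1));
        verify(ctx, times(1)).write(new Text("dog"), new IntWritable(1));
    }
}
```

The same mocking style extends to Counter and, for HBase jobs, KeyValue, which is what the answer argues makes a dedicated MRUnit-style framework unnecessary.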