To create MapReduce jobs you can use either the old org.apache.hadoop.mapred package or the newer org.apache.hadoop.mapreduce package for your Mappers and Reducers.
Functionality-wise there is not much difference between the old (o.a.h.mapred) and the new (o.a.h.mapreduce) API. The most significant difference is that records are pushed to the mapper/reducer in the old API, while the new API supports both the push and the pull mechanism. You can get more information about the pull mechanism here.
Also, the old API has been un-deprecated since 0.21. You can find more information about the new API here.
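To illustrate the pull mechanism: in the new API you can override Mapper.run() and fetch records from the context yourself, instead of having map() invoked for you once per record. A minimal sketch (the class name PullMapper is mine; this mirrors the default run() implementation, but you are free to stop early or consume records in groups):

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// New-API mapper that pulls records itself by overriding run().
public class PullMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    @Override
    public void run(Context context) throws IOException, InterruptedException {
        setup(context);
        try {
            // Pull: this loop decides when to fetch the next record,
            // so it could e.g. break out early or read records in pairs.
            while (context.nextKeyValue()) {
                map(context.getCurrentKey(), context.getCurrentValue(), context);
            }
        } finally {
            cleanup(context);
        }
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        context.write(value, new IntWritable(1));
    }
}
```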
As you mentioned, some classes (like MultipleTextOutputFormat) have not been migrated to the new API. Because of this and the reason mentioned above, it's better to stick to the old API (although a translation is usually quite simple).
Old API (mapred)
Exists in package org.apache.hadoop.mapred
Job configuration is done by a separate class called JobConf, which is an extension of the Configuration class
Reduces values for a given key based on an Iterator

New API (mapreduce)
Exists in package org.apache.hadoop.mapreduce
Job configuration is done through the Configuration class and the Job class
Reduces values for a given key based on an Iterable
Both the old and new APIs are good, but the new API is cleaner. Use the new API wherever you can, and fall back to the old one where you need specific classes that are not present in the new API (like MultipleTextOutputFormat).
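The Iterator/Iterable difference shows up directly in the reduce signatures; note also that the old interface-based Reducer becomes an abstract class in the new API. A sketch of the two equivalent sum reducers (class names are mine):

```java
import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;

// Old API: implement the Reducer interface; values arrive as an Iterator
class OldSumReducer extends org.apache.hadoop.mapred.MapReduceBase
        implements org.apache.hadoop.mapred.Reducer<Text, IntWritable, Text, IntWritable> {
    public void reduce(Text key, Iterator<IntWritable> values,
                       org.apache.hadoop.mapred.OutputCollector<Text, IntWritable> output,
                       org.apache.hadoop.mapred.Reporter reporter) throws IOException {
        int sum = 0;
        while (values.hasNext()) {
            sum += values.next().get();
        }
        output.collect(key, new IntWritable(sum));
    }
}

// New API: extend the Reducer class; values arrive as an Iterable
class NewSumReducer
        extends org.apache.hadoop.mapreduce.Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable v : values) {
            sum += v.get();
        }
        context.write(key, new IntWritable(sum));
    }
}
```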
But do take care not to mix the old and new APIs in the same MapReduce job. That leads to weird problems.