Is it better to use the mapred or the mapreduce package to create a Hadoop Job?

Asked 2020-11-27 03:15

To create MapReduce jobs you can either use the old org.apache.hadoop.mapred package or the newer org.apache.hadoop.mapreduce package for Mappers and Reducers. Which one is the better choice for creating a Hadoop job?

3 Answers
  • 2020-11-27 03:39

    Functionality-wise there is not much difference between the old (o.a.h.mapred) and the new (o.a.h.mapreduce) API. The one significant difference is that records are pushed to the mapper/reducer in the old API, while the new API supports both the push and the pull mechanism (sketched below). You can get more information about the pull mechanism here.

    Also, the old API has been un-deprecated since 0.21. You can find more information about the new API here.

    As you mentioned, some of the classes (like MultipleTextOutputFormat) have not been migrated to the new API; due to this and the above-mentioned reason, it's better to stick to the old API (although a translation is usually quite simple).
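    For reference, here is a minimal sketch of the pull style the new API allows (the class name PullStyleMapper is illustrative): by overriding Mapper.run(), the mapper pulls records from the context itself instead of having the framework push each record into map(). The body below mirrors the default run() implementation, which you could customize to batch records, skip some, or stop early.

    ```java
    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // New-API mappers may override run() to control how records are consumed.
    public class PullStyleMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
        @Override
        public void run(Context context) throws IOException, InterruptedException {
            setup(context);
            // Pull records at our own pace instead of being handed one at a time.
            while (context.nextKeyValue()) {
                map(context.getCurrentKey(), context.getCurrentValue(), context);
            }
            cleanup(context);
        }
    }
    ```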

  • 2020-11-27 03:44

    Old API (mapred)

    1. Lives in the org.apache.hadoop.mapred package.

    2. Job configuration is done through a separate class called JobConf, which is an extension of the Configuration class.

    3. The reducer receives the values for a given key as an Iterator.

    New API (mapreduce)

    1. Lives in the org.apache.hadoop.mapreduce package.

    2. Job configuration (and submission) is done through the Job class, which works directly with a Configuration.

    3. The reducer receives the values for a given key as an Iterable.

    A side-by-side sketch of the two configuration styles follows this list.
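    Below is a minimal sketch of how the same job would be wired up in each API. The driver class names and the referenced Mapper/Reducer implementations (OldWordCountMapper, NewWordCountReducer, and so on) are illustrative placeholders, not classes shipped with Hadoop; also note Job.getInstance is the Hadoop 2 style, while early releases used new Job(conf, name).

    ```java
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class OldApiDriver {
        public static void main(String[] args) throws Exception {
            // Old API: configuration lives in JobConf (a subclass of Configuration),
            // and the job is submitted through JobClient.
            JobConf conf = new JobConf(OldApiDriver.class);
            conf.setJobName("wordcount-old");
            conf.setOutputKeyClass(Text.class);
            conf.setOutputValueClass(IntWritable.class);
            conf.setMapperClass(OldWordCountMapper.class);   // implements org.apache.hadoop.mapred.Mapper
            conf.setReducerClass(OldWordCountReducer.class); // implements org.apache.hadoop.mapred.Reducer
            org.apache.hadoop.mapred.FileInputFormat.setInputPaths(conf, new Path(args[0]));
            org.apache.hadoop.mapred.FileOutputFormat.setOutputPath(conf, new Path(args[1]));
            JobClient.runJob(conf);
        }
    }
    ```

    ```java
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class NewApiDriver {
        public static void main(String[] args) throws Exception {
            // New API: the Job class wraps a Configuration and handles submission itself.
            Job job = Job.getInstance(new Configuration(), "wordcount-new");
            job.setJarByClass(NewApiDriver.class);
            job.setMapperClass(NewWordCountMapper.class);   // extends org.apache.hadoop.mapreduce.Mapper
            job.setReducerClass(NewWordCountReducer.class); // extends org.apache.hadoop.mapreduce.Reducer
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }
    ```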

  • 2020-11-27 03:58

    Both the old and new APIs are good. The new API is cleaner, though. Use the new API wherever you can, and use the old one wherever you need specific classes that are not present in the new API (like MultipleTextOutputFormat).

    But do take care not to use a mix of the old and new APIs in the same MapReduce job; that leads to weird problems. A consistent new-API pair is sketched below.
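    As an illustration of staying consistent, here is a minimal word-count sketch where both classes extend the new org.apache.hadoop.mapreduce types (the class names are illustrative and match the placeholders used in the driver sketch above). Note the reducer receives an Iterable, as listed in the comparison above.

    ```java
    import java.io.IOException;
    import java.util.StringTokenizer;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    public class WordCountNewApi {

        // Both classes come from org.apache.hadoop.mapreduce, so they can
        // safely be wired into the same new-API Job.
        public static class NewWordCountMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer tokens = new StringTokenizer(value.toString());
                while (tokens.hasMoreTokens()) {
                    word.set(tokens.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        public static class NewWordCountReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) { // new API: Iterable, not Iterator
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }
    }
    ```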
