Is there any way to set a parameter in the job configuration from the Mapper so that it is accessible from the Reducer?

I tried the code below in my Mapper:
As far as I know, this is not possible. The job configuration is serialized to XML at run-time by the jobtracker, and is copied out to all task nodes. Any changes to the Configuration object will only affect that object, which is local to the specific task JVM; it will not change the XML at every node.
In general, you should try to avoid any "global" state. It is against the MapReduce paradigm and will generally prevent parallelism. If you absolutely must pass information between the Map and Reduce phase, and you cannot do it via the usual Shuffle/Sort step, then you could try writing to the Distributed Cache, or directly to HDFS.
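To make the "write directly to HDFS" idea concrete, here is a minimal sketch, assuming the new org.apache.hadoop.mapreduce API and a hypothetical side-data directory /tmp/side-data. Each map task writes its partial sum to its own file in cleanup(), and the reducer aggregates those files in setup() (by the time the reduce phase starts, all map tasks have completed). Note that speculative execution or task retries can leave extra attempt files, so a real job would need to clean those up or deduplicate them.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class SideDataExample {

    // Hypothetical HDFS directory used only for passing the map-side sums.
    private static final String SIDE_DATA_DIR = "/tmp/side-data";

    public static class SumMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
        private long sum = 0;

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            sum += value.getLength();                         // accumulate some per-task statistic
            context.write(new Text("line"), new LongWritable(1));
        }

        @Override
        protected void cleanup(Context context) throws IOException, InterruptedException {
            // Write this task's partial sum to its own file so parallel map tasks do not collide.
            Configuration conf = context.getConfiguration();
            Path out = new Path(SIDE_DATA_DIR + "/" + context.getTaskAttemptID().toString());
            FileSystem fs = FileSystem.get(conf);
            try (FSDataOutputStream os = fs.create(out)) {
                os.writeLong(sum);
            }
        }
    }

    public static class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
        private long mapSideSum = 0;

        @Override
        protected void setup(Context context) throws IOException, InterruptedException {
            // All map tasks have finished before the reduce phase runs,
            // so every partial-sum file is already present.
            FileSystem fs = FileSystem.get(context.getConfiguration());
            for (FileStatus status : fs.listStatus(new Path(SIDE_DATA_DIR))) {
                try (FSDataInputStream in = fs.open(status.getPath())) {
                    mapSideSum += in.readLong();
                }
            }
        }

        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                throws IOException, InterruptedException {
            long count = 0;
            for (LongWritable v : values) {
                count += v.get();
            }
            // mapSideSum, computed from the mappers' side files, is available to every reduce call.
            context.write(new Text(key + "/" + mapSideSum), new LongWritable(count));
        }
    }
}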
If you are using the new API, your code should work. Have you created this "Sum" property when setting up the job? For example, like this:
Configuration conf = new Configuration();
conf.set("Sum", "0");
Job job = new Job(conf);
If not, use

context.getConfiguration().setIfUnset("Sum", "100");

in your Mapper class to fix the issue. That is the only problem I can see.
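For reference, here is a small sketch of how the Reducer side might read that property (the Text/IntWritable types and the word-count-style reduce logic are assumptions; adjust them to your job). It only sees the value that was in the configuration when the job was submitted, such as the "0" or "100" set above, not anything a Mapper tries to set at run time.

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private int sum;

    @Override
    protected void setup(Context context) {
        // Reads the value set in the driver before job submission; a conf.set(...)
        // done inside a Mapper at run time is not visible here.
        sum = Integer.parseInt(context.getConfiguration().get("Sum", "0"));
    }

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int total = sum;
        for (IntWritable v : values) {
            total += v.get();
        }
        context.write(key, new IntWritable(total));
    }
}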