Global variable's value doesn't change after for loop

庸人自扰 2021-01-23 18:39

I am developing a Hadoop project. I want to find the customers of a certain day and then write those with the max consumption on that day. In my reducer class, for some reason, the …

2 Answers
  •  孤城傲影
    2021-01-23 18:47

    Probably all the values that reach your reducer are below 0. Try initializing with the minimum value to check whether your variable changes at all.

    max = Integer.MIN_VALUE;
    

    Based on what you say, the output should be either all 0's (in that case the max value in each reducer is 0) or no output at all (all values are less than 0). Also, look at this:

    context.write(key, new IntWritable());
    

    it should be

    context.write(key, new IntWritable(max));
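
    Putting those two fixes together, here is a minimal sketch of the pattern described above (the class name is illustrative and the usual Text key / IntWritable value types are my assumption, not taken from your code):

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;

    public class MaxConsumptionReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // Start from the smallest possible int so that any real value replaces it.
            int max = Integer.MIN_VALUE;
            for (IntWritable val : values) {
                if (val.get() > max) {
                    max = val.get();
                }
            }
            // Write the computed max, not an empty IntWritable.
            context.write(key, new IntWritable(max));
        }
    }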
    

    EDIT: I just saw your Mapper class; it has a lot of problems. The following code skips the first element in every mapper. Why?

        if (counter > 0) {
    

    I guess you are getting something like this, right? "customer, 2013-07-01 01:00:00, 2, ..." If that is the case and you are already filtering values, you should declare your max variable as a local variable, not at mapper scope; a mapper-scope variable keeps its value across records and would therefore affect multiple customers (see the short sketch below).
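
    To make the scoping point concrete, a small sketch (illustrative names, not your actual code) of the two placements:

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class ScopeExampleMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        // Mapper-scope field: it lives for the whole input split, so its value
        // carries over from one customer's record to the next.
        private int sharedMax = Integer.MIN_VALUE;

        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Local variable: re-initialized on every call to map(), so it can
            // never mix values that belong to different customers.
            int localMax = Integer.MIN_VALUE;
            // ... filtering / comparison logic goes here ...
        }
    }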

    There are a lot of open questions around this. Could you explain the input of each mapper and what you want to do with it?

    EDIT 2: Based on your answer, I would try this:

    import java.io.IOException;
    import java.text.ParseException;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    
    public class AlicanteMapperC extends Mapper<LongWritable, Text, Text, IntWritable> {
        // The consumption value we are filtering for.
        private final int max = 5;
        private final SimpleDateFormat ft = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
    
        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            Date t = null;
            Date target = null;
            String[] line = value.toString().split(",");
            String customer = line[0];
            try {
                t = ft.parse(line[1]);
                // parse() throws a checked ParseException, so the target date is parsed here too
                target = ft.parse("2013-07-01 01:00:00");
            } catch (ParseException e) {
                throw new RuntimeException("something wrong with the date! " + line[1]);
            }
            int consumption = Integer.parseInt(line[2]);
    
            // find the customers of a certain date with the wanted consumption
            if (t.compareTo(target) == 0 && consumption == max) {
                context.write(new Text(customer), new IntWritable(consumption));
            }
        }
    }
    

    and a reducer simple enough to emit 1 record per customer:

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;
    
    public class AlicanteReducerC extends
            Reducer<Text, IntWritable, Text, IntWritable> {
    
        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
    
            // We already know that it is 5
            context.write(key, new IntWritable(5));
            // If you want something different, for example to report each customer's
            // individual values, you could iterate over the iterable like this:
            // for (IntWritable val : values) {
            //     context.write(key, val);
            // }
        }
    }
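
    If it helps, here is a minimal driver sketch to wire the two classes into a job (the driver class name and the input/output paths are assumptions on my side):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class AlicanteDriver {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = Job.getInstance(conf, "max consumption per customer");
            job.setJarByClass(AlicanteDriver.class);
            job.setMapperClass(AlicanteMapperC.class);
            job.setReducerClass(AlicanteReducerC.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory
            FileOutputFormat.setOutputPath(job, new Path(args[1])); // must not exist yet
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

    Run it with something like hadoop jar your-job.jar AlicanteDriver <input> <output> (the jar name is just a placeholder); note that the output directory must not exist before the run.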
    
