How to process multi-line input records in Spark

盖世英雄少女心 2020-11-28 12:00

Each record in my input file (a very large file) spans multiple lines.

Ex:

Id:   2
ASIN: 0738700123
  title: Test title for this product

2 Answers
  • 2020-11-28 12:47

    If the multi-line data has a defined record separator, you can use Hadoop's support for custom record delimiters, providing the separator through a Hadoop Configuration object:

    Something like this should do:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.io.{LongWritable, Text}
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

    val conf = new Configuration
    // The delimiter is matched literally and case-sensitively, so it must
    // match the token that starts each record ("Id:" in the sample above).
    conf.set("textinputformat.record.delimiter", "Id:")
    val dataset = sc.newAPIHadoopFile("/path/to/data", classOf[TextInputFormat], classOf[LongWritable], classOf[Text], conf)
    // The delimiter itself is consumed, and the split before the first
    // "Id:" is empty, so drop blank records.
    val data = dataset.map(_._2.toString).filter(_.trim.nonEmpty)
    

    This will give you an RDD[String] in which each element corresponds to one record. You then need to parse each record according to your application's requirements.
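
    For example, here is a minimal sketch that turns each record into a field map. The "Name: value" line layout is assumed from the sample in the question, and parseRecord is a hypothetical helper written for this sketch:

    // Split a record into lines, then each line on the first ':'.
    def parseRecord(record: String): Map[String, String] =
      record.split("\\r?\\n")
        .map(_.trim)
        .filter(_.nonEmpty)
        .flatMap { line =>
          line.split(":", 2) match {
            case Array(k, v) => Some(k.trim -> v.trim)
            case _           => None
          }
        }
        .toMap

    // The reader consumed the "Id:" delimiter itself, so prepend it again
    // to keep the id field when parsing.
    val parsed = data.map(r => parseRecord("Id:" + r))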

  • 2020-11-28 12:53

    I have done this by implementing a custom input format and record reader.

    // Imports required by both classes below:
    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.DataOutputBuffer;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.InputSplit;
    import org.apache.hadoop.mapreduce.RecordReader;
    import org.apache.hadoop.mapreduce.TaskAttemptContext;
    import org.apache.hadoop.mapreduce.lib.input.FileSplit;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

    public class ParagraphInputFormat extends TextInputFormat {

        @Override
        public RecordReader<LongWritable, Text> createRecordReader(InputSplit inputSplit, TaskAttemptContext taskAttemptContext) {
            return new ParagraphRecordReader();
        }
    }
    
    public class ParagraphRecordReader extends RecordReader<LongWritable, Text> {
        private long end;                     // end offset of this split
        private boolean stillInChunk = true;  // false once we read past the split

        private LongWritable key = new LongWritable();
        private Text value = new Text();

        private FSDataInputStream fsin;
        private DataOutputBuffer buffer = new DataOutputBuffer();

        // Byte sequence that terminates each record.
        private byte[] endTag = "\n\r\n".getBytes();
    
        @Override
        public void initialize(InputSplit inputSplit, TaskAttemptContext taskAttemptContext) throws IOException, InterruptedException {
            FileSplit split = (FileSplit) inputSplit;
            Configuration conf = taskAttemptContext.getConfiguration();
            Path path = split.getPath();
            FileSystem fs = path.getFileSystem(conf);

            fsin = fs.open(path);
            long start = split.getStart();
            end = split.getStart() + split.getLength();
            fsin.seek(start);

            // Unless this split starts at the beginning of the file, skip
            // ahead past the partial record owned by the previous split.
            if (start != 0) {
                readUntilMatch(endTag, false);
            }
        }
    
        @Override
        public boolean nextKeyValue() throws IOException {
            if (!stillInChunk) return false;

            // Accumulate bytes into the buffer until the end tag (or the
            // end of the split / file) is reached.
            boolean status = readUntilMatch(endTag, true);

            value = new Text();
            value.set(buffer.getData(), 0, buffer.getLength());
            key = new LongWritable(fsin.getPos());
            buffer.reset();

            // A false status means we ran past the split boundary or hit
            // EOF: emit this last record, then stop.
            if (!status) {
                stillInChunk = false;
            }

            return true;
        }
    
        @Override
        public LongWritable getCurrentKey() throws IOException, InterruptedException {
            return key;
        }

        @Override
        public Text getCurrentValue() throws IOException, InterruptedException {
            return value;
        }

        @Override
        public float getProgress() throws IOException, InterruptedException {
            return 0;
        }

        @Override
        public void close() throws IOException {
            fsin.close();
        }
    
        // Scans forward byte by byte looking for match. When withinBlock is
        // true, every byte read is also appended to the record buffer.
        // Returns true if the tag was found before the end of this split.
        private boolean readUntilMatch(byte[] match, boolean withinBlock) throws IOException {
            int i = 0;
            while (true) {
                int b = fsin.read();
                if (b == -1) return false;
                if (withinBlock) buffer.write(b);
                if (b == match[i]) {
                    i++;
                    if (i >= match.length) {
                        return fsin.getPos() < end;
                    }
                } else i = 0;
            }
        }
    
    }
    

    endTag identifies the end of each record.
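
    To consume this input format from Spark, pass it to newAPIHadoopFile. A minimal sketch, assuming the two classes above are compiled and available on the application's classpath:

    import org.apache.hadoop.io.{LongWritable, Text}

    val records = sc.newAPIHadoopFile(
      "/path/to/data",
      classOf[ParagraphInputFormat],
      classOf[LongWritable],
      classOf[Text],
      sc.hadoopConfiguration)

    // Each element is one multi-line record.
    val data = records.map { case (_, text) => text.toString }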
