Is there a way to write one file for each record with Apache Beam FileIO?

猫巷女王i  2021-01-27 04:17

I am learning Apache Beam and trying to implement something similar to distcp. I use FileIO.match().filepattern() to get the input files, but while writing with FileIO.write the elements get batched into shard files, so I don't end up with one output file per record. Is there a way to write one file for each record?

1 Answer
  •  走了就别回头了
     2021-01-27 05:03

    You can use FileIO.writeDynamic and specify in .by how you want to group the elements into files. For example, if your elements have unique keys, .by(KV::getKey) writes each element to its own file. Otherwise, the destination criterion can be a hash of the row, and so on. You can also tune .withNaming at will. As a demo:

    p.apply("Create Data", Create.of(
            KV.of("one", "this is row 1"),
            KV.of("two", "this is row 2"),
            KV.of("three", "this is row 3"),
            KV.of("four", "this is row 4")))
     .apply(FileIO.<String, KV<String, String>>writeDynamic()
        .by(KV::getKey)
        .withDestinationCoder(StringUtf8Coder.of())
        .via(Contextful.fn(KV::getValue), TextIO.sink())
        .to(output)
        .withNaming(key -> FileIO.Write.defaultNaming("file-" + key, ".txt")));
    

    This will write the four elements into four files:

    $ mvn compile -e exec:java \
        -Dexec.mainClass=com.dataflow.samples.OneRowOneFile \
        -Dexec.args="--project=$PROJECT \
            --output=output/ \
            --runner=DirectRunner"
    
    $ ls output/
    file-four-00001-of-00003.txt  file-one-00002-of-00003.txt  file-three-00002-of-00003.txt  file-two-00002-of-00003.txt
    $ cat output/file-four-00001-of-00003.txt 
    this is row 4
    
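    If you need more control over the file names than defaultNaming gives you, .withNaming also accepts a custom FileIO.Write.FileNaming per destination. A minimal sketch (this particular naming scheme is my own illustration, not something from the question):

    .withNaming(key -> (window, pane, numShards, shardIndex, compression) ->
        String.format("file-%s-%05d.txt", key, shardIndex))

    The lambda implements FileNaming.getFilename, so you can encode the window, pane, or shard index into the name however you like.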

    Full code:

    package com.dataflow.samples;
    
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.StringUtf8Coder;
    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.Description;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.options.Validation;
    import org.apache.beam.sdk.transforms.Contextful;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.KV;
    
    
    public class OneRowOneFile {
    
        public interface Options extends PipelineOptions {
            @Validation.Required
            @Description("Output Path i.e. gs://BUCKET/path/to/output/folder")
            String getOutput();
            void setOutput(String s);
        }
    
        public static void main(String[] args) {
    
            OneRowOneFile.Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(OneRowOneFile.Options.class);
    
            Pipeline p = Pipeline.create(options);
    
            String output = options.getOutput();
    
            // One KV per element; each distinct key becomes its own destination file.
            p.apply("Create Data", Create.of(
                    KV.of("one", "this is row 1"),
                    KV.of("two", "this is row 2"),
                    KV.of("three", "this is row 3"),
                    KV.of("four", "this is row 4")))
             .apply(FileIO.<String, KV<String, String>>writeDynamic()
                .by(KV::getKey)                                  // destination = key
                .withDestinationCoder(StringUtf8Coder.of())
                .via(Contextful.fn(KV::getValue), TextIO.sink()) // write only the value
                .to(output)
                .withNaming(key -> FileIO.Write.defaultNaming("file-" + key, ".txt")));
    
            p.run().waitUntilFinish();
        }
    }
    

    Let me know if that works with your custom sink, too.
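
    For reference, if your custom sink needs more than TextIO.sink(), you can implement the FileIO.Sink interface yourself and pass an instance to .via. A minimal sketch, assuming elements are plain strings written one per line (LineSink is a hypothetical name I made up, not part of the question):

    import java.io.IOException;
    import java.io.OutputStreamWriter;
    import java.io.PrintWriter;
    import java.nio.channels.Channels;
    import java.nio.channels.WritableByteChannel;
    import java.nio.charset.StandardCharsets;

    // Writes each element as a UTF-8 line; use it as
    // .via(Contextful.fn(KV::getValue), new LineSink())
    static class LineSink implements FileIO.Sink<String> {
        private transient PrintWriter writer;

        @Override
        public void open(WritableByteChannel channel) throws IOException {
            // Wrap the channel Beam hands us for the destination file.
            writer = new PrintWriter(new OutputStreamWriter(
                Channels.newOutputStream(channel), StandardCharsets.UTF_8));
        }

        @Override
        public void write(String element) throws IOException {
            writer.println(element);
        }

        @Override
        public void flush() throws IOException {
            writer.flush();
        }
    }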
