Can datastore input in google dataflow pipeline be processed in a batch of N entries at a time?

You can batch these elements up within your DoFn. For example:

final int BATCH_SIZE = 100;

pipeline
  // 1. Read input from Datastore
  .apply(DatastoreIO.readFrom(datasetId, query))

  // 2. Programmatically batch entries
  .apply(ParDo.of(new DoFn<DatastoreV1.Entity, Iterable<EntryPOJO>>() {

    private List<EntryPOJO> accumulator = new ArrayList<>(BATCH_SIZE);

    @Override
    public void processElement(ProcessContext c) throws Exception {
      // Convert the Datastore entity into your POJO (processEntity is
      // your own conversion helper).
      EntryPOJO entry = processEntity(c);
      accumulator.add(entry);
      if (accumulator.size() >= BATCH_SIZE) {
        c.output(accumulator);
        accumulator = new ArrayList<>(BATCH_SIZE);
      }
    }

    @Override
    public void finishBundle(Context c) throws Exception {
      // Flush any partial batch at the end of the bundle, and reset the
      // accumulator because DoFn instances may be reused across bundles.
      if (accumulator.size() > 0) {
        c.output(accumulator);
        accumulator = new ArrayList<>(BATCH_SIZE);
      }
    }
  }))

  // 3. Consume those batches
  .apply(ParDo.of(new DoFn<Iterable<EntryPOJO>, Object>() {
    @Override
    public void processElement(ProcessContext c) throws Exception {
        // c.element() is one batch of up to BATCH_SIZE entries.
        sendToRESTEndpoint(c.element());
    }
  }));

You could also combine steps 2 and 3 into a single DoFn if you didn't want the separate "batching" step, as in the sketch below.
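
As a rough sketch of that variant (untested, and reusing the same hypothetical processEntity and sendToRESTEndpoint helpers from above), the combined DoFn would accumulate entries and call the endpoint directly instead of emitting batches downstream:

final int BATCH_SIZE = 100;

pipeline
  .apply(DatastoreIO.readFrom(datasetId, query))
  .apply(ParDo.of(new DoFn<DatastoreV1.Entity, Void>() {

    private List<EntryPOJO> accumulator = new ArrayList<>(BATCH_SIZE);

    @Override
    public void processElement(ProcessContext c) throws Exception {
      accumulator.add(processEntity(c));
      if (accumulator.size() >= BATCH_SIZE) {
        // Send a full batch straight to the endpoint.
        sendToRESTEndpoint(accumulator);
        accumulator = new ArrayList<>(BATCH_SIZE);
      }
    }

    @Override
    public void finishBundle(Context c) throws Exception {
      // Send whatever is left over when the bundle ends.
      if (accumulator.size() > 0) {
        sendToRESTEndpoint(accumulator);
        accumulator = new ArrayList<>(BATCH_SIZE);
      }
    }
  }));

The trade-off is that the conversion and the REST call now live in the same step, so you no longer have an intermediate PCollection of batches to reuse or inspect.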
