For Google App Engine (Java), how do I set and use chunk size in FetchOptions?

梦毁少年i 2021-02-08 12:59

I'm running a query that currently returns 1400 results, and because of this I am getting the following warning in the log file:

com.google.appengine

3 Answers
  • 2021-02-08 13:35

    I'm running into the same problem, and since the last comment was from a month ago, here is what I found out about querying large datasets.

    I think I'm going to use the "query cursor" technique after reading these lines in the Google docs article (the Python one mentioned above, by the way):

    This article was written for SDK version 1.1.7. As of release 1.3.1, query cursors (Java | Python) have superseded the techniques described below and are now the recommended method for paging through large datasets.

    In the Google docs about "Query Cursors", the first line explains precisely why cursors are needed:

    Query cursors allow an app to perform a query and retrieve a batch of results, then fetch additional results for the same query in a subsequent web request without the overhead of a query offset.

    The documentation also provides a Java example of a servlet using the cursor technique, a tip on how to generate a cursor that is safe to send to the client, and finally the limitations of cursors.

    Hope this gives you a lead on resolving your problem.
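
    For reference, here is a minimal sketch of that cursor technique with the low-level Datastore API (all classes are from com.google.appengine.api.datastore). This is my own condensed version, not the docs' servlet example; the kind name "Person" and the page size of 20 are just placeholders:

    DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
    Query q = new Query("Person"); // placeholder kind

    // First request: fetch one batch and remember where we stopped
    FetchOptions options = FetchOptions.Builder.withLimit(20);
    QueryResultList<Entity> firstPage = datastore.prepare(q).asQueryResultList(options);
    String webSafeCursor = firstPage.getCursor().toWebSafeString(); // safe to hand to the client

    // Later request: resume from the cursor, with no offset to retrieve and discard
    FetchOptions nextOptions = FetchOptions.Builder.withLimit(20)
            .startCursor(Cursor.fromWebSafeString(webSafeCursor));
    QueryResultList<Entity> nextPage = datastore.prepare(q).asQueryResultList(nextOptions);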

    A small reminder about range and offset, which can really hurt performance if forgotten (as I did); see the sketch after the quote below:

    The starting offset has implications for performance: the Datastore must retrieve and then discard all results prior to the starting offset. For example, a query with a range of 5, 10 fetches ten results from the Datastore, then discards the first five and returns the remaining five to the application.
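
    To make that cost concrete, here is a tiny sketch of the "range of 5, 10" case with the low-level API (my own example, reusing the datastore and q objects from the sketch above): the Datastore still retrieves ten entities, discards the first five, and hands back the remaining five.

    // Equivalent of a "range of 5, 10": ten results are fetched, the first five are discarded
    FetchOptions rangeOptions = FetchOptions.Builder.withOffset(5).limit(5);
    List<Entity> page = datastore.prepare(q).asList(rangeOptions);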


    Edit: As I'm working with JDO, I kept looking for a way to let my previous code load more than 1000 results in a single query. So, if you're using JDO too, here is what I found in an old issue:

    Query query = pm.newQuery(...);
    // I would use a value below 1000 (the GAE limit)
    query.getFetchPlan().setFetchSize(numberOfRecordByFetch); 
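
    For completeness, a rough sketch of how that call might sit in a full JDO query. PMF (the usual singleton PersistenceManagerFactory helper), MyEntity and the value 500 are my own placeholders, not from the original issue:

    PersistenceManager pm = PMF.get().getPersistenceManager();
    try {
        javax.jdo.Query query = pm.newQuery(MyEntity.class);
        // keep the fetch size below the old 1000-result limit
        query.getFetchPlan().setFetchSize(500);
        @SuppressWarnings("unchecked")
        List<MyEntity> results = (List<MyEntity>) query.execute();
        // ... process results ...
    } finally {
        pm.close();
    }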
    
  • 2021-02-08 13:36

    This is how I apply FetchOptions; compared to your example code, you might need to tweak it a bit:

    // ..... build the Query object
    FetchOptions fetch_options =
        FetchOptions.Builder.withPrefetchSize(100).chunkSize(100);
    QueryResultList<Entity> returned_entities =
        datastore_service_instance.prepare(query).asQueryResultList(fetch_options);
    

    Of course, the figures (100) may be changed.
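
    One thing worth adding (my own note, reusing the variables above): the QueryResultList you get back also exposes a cursor, so a later request can continue the same query instead of re-fetching from the start:

    Cursor cursor = returned_entities.getCursor();
    FetchOptions next_fetch_options =
        FetchOptions.Builder.withPrefetchSize(100).chunkSize(100)
            .startCursor(cursor);
    QueryResultList<Entity> next_entities =
        datastore_service_instance.prepare(query).asQueryResultList(next_fetch_options);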

    If my answer isn't what you're looking for, you're welcome to rephrase (edit) your question.

    By the way, I'm the one who wrote the first linked question.

  • 2021-02-08 13:41

    If you are using the datastore directly, without JDO, then you would do something like the following to set the chunk size when iterating through the data:

    Query query = new Query("entityname");
    PreparedQuery preparedQuery = dataStore.prepare(query);
    // the chunk size (200 here) should be less than 1000
    FetchOptions options = FetchOptions.Builder.withChunkSize(200);
    for (Entity result : preparedQuery.asIterable(options)) {
        ...
    }
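
    If it helps, here is a self-contained version of the same idea with imports; everything except the kind name "entityname" and the chunk size of 200 is scaffolding I added for illustration:

    import com.google.appengine.api.datastore.DatastoreService;
    import com.google.appengine.api.datastore.DatastoreServiceFactory;
    import com.google.appengine.api.datastore.Entity;
    import com.google.appengine.api.datastore.FetchOptions;
    import com.google.appengine.api.datastore.PreparedQuery;
    import com.google.appengine.api.datastore.Query;

    public class ChunkedQueryExample {
        public void iterateAllEntities() {
            DatastoreService dataStore = DatastoreServiceFactory.getDatastoreService();
            Query query = new Query("entityname");
            PreparedQuery preparedQuery = dataStore.prepare(query);
            // pull results from the Datastore in chunks of 200 entities
            FetchOptions options = FetchOptions.Builder.withChunkSize(200);
            for (Entity result : preparedQuery.asIterable(options)) {
                // process each entity here
            }
        }
    }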
    