How to unpack sqlite3 database written by Google AppEngine bulk downloader


I need to download all instances of a fairly large (multi-GB) entity kind in my app's datastore. I have enough disk space to store the entity's data, but not enough to store both the bulkloader's SQLite output and an unpacked copy of it.

2 Answers
  • 2021-01-06 18:38

    Entities are stored in the downloaded SQLite database as encoded Protocol Buffers (the same as they're stored in the production environment, and everywhere else - an entity is an encoded PB, in short). You can read them out yourself by using the SDK code for decoding entities (db.proto_to_entity() etc), but it'll be a bit of work to set everything up.

    The relevant code is the ResultDatabase class in bulkloader.py, which you can probably reuse, along with other parts of the bulkloader, to make your job easier. A rough sketch of the decoding step follows.
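    A minimal sketch of that route, not the bulkloader's own code: it assumes the old Python 2 App Engine SDK is importable, that the downloaded file contains a result table with id and value columns (value holding the encoded protocol buffer), and that the UserRecord kind and the database file name are placeholders.

    import sqlite3

    from google.appengine.datastore import entity_pb
    from google.appengine.ext import db

    # Hypothetical kind; db.Expando accepts whatever properties the entity has,
    # so no property declarations are needed to decode it.
    class UserRecord(db.Expando):
        pass

    conn = sqlite3.connect('bulkloader-results.db')  # placeholder file name
    for row_id, blob in conn.execute('select id, value from result'):
        proto = entity_pb.EntityProto(contents=blob)  # each value is an encoded PB
        record = db.model_from_protobuf(proto)        # back to a model instance
        print record.key()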

  • 2021-01-06 18:52

    Here's the code that worked for me:

    import sqlite3

    from google.appengine.datastore import entity_pb
    from google.appengine.api import datastore

    # Open the SQLite file written by the bulk downloader.
    conn = sqlite3.connect('UserRecord.db', isolation_level=None)
    cursor = conn.cursor()

    # Each row of the result table stores one entity as an encoded protocol buffer.
    cursor.execute('select id, value from result order by sort_key, id')
    for unused_entity_id, entity in cursor:
        entity_proto = entity_pb.EntityProto(contents=entity)
        # Decode the protocol buffer into a datastore.Entity and print it.
        print datastore.Entity._FromPb(entity_proto)
    
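    If the goal is to get the data out of the SQLite file so it can be deleted, the same loop can stream each decoded entity to a text file instead of printing it. This is only a sketch of that idea: the UserRecord.csv output name is made up, and every property is dumped via repr() rather than converted to a proper export format.

    import csv
    import sqlite3

    from google.appengine.datastore import entity_pb
    from google.appengine.api import datastore

    conn = sqlite3.connect('UserRecord.db', isolation_level=None)
    cursor = conn.execute('select id, value from result order by sort_key, id')

    with open('UserRecord.csv', 'wb') as out:
        writer = csv.writer(out)
        for unused_entity_id, blob in cursor:
            # datastore.Entity behaves like a dict of property name -> value.
            entity = datastore.Entity._FromPb(entity_pb.EntityProto(contents=blob))
            row = [str(entity.key())]
            row += ['%s=%r' % (name, value) for name, value in sorted(entity.items())]
            writer.writerow(row)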