Pickling Python objects to Google Cloud Storage

Asked by 南方客 on 2021-01-20 08:22

I've been pickling objects to the local filesystem and reading them back when I need to work with them. Currently I have this code for that purpose:

def p         


        
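The asker's snippet above is cut off, but a minimal sketch of the local-filesystem pattern being described might look like this (the function names, the file path, and the sample object are my own assumptions, not the asker's code):

```python
import os
import pickle
import tempfile

def save_object(obj, path):
    # Serialize obj to a file on the local filesystem.
    with open(path, 'wb') as handle:
        pickle.dump(obj, handle)

def load_object(path):
    # Read the pickled object back from disk.
    with open(path, 'rb') as handle:
        return pickle.load(handle)

# Round-trip example using a temporary directory.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, 'data.pkl')
    save_object({'a': 1, 'b': [2, 3]}, path)
    restored = load_object(path)
    print(restored)  # {'a': 1, 'b': [2, 3]}
```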
3 Answers
  •  鱼传尺愫 · 2021-01-20 09:10

    For Python 3 users, you can use the gcsfs library, from the creator of Dask, to solve this.

    Example of reading a file:

    import gcsfs
    
    fs = gcsfs.GCSFileSystem(project='my-google-project')
    fs.ls('my-bucket')
    >>> ['my-file.txt']
    with fs.open('my-bucket/my-file.txt', 'rb') as f:
        print(f.read())
    

    Writing a pickle works the same way, since fs.open gives you an ordinary binary file handle:

    with fs.open(directory + '/' + filename, 'wb') as handle:
        pickle.dump(obj, handle)  # obj is the object being pickled
    

    Reading is similar: replace 'wb' with 'rb' and dump with load:

    with fs.open(directory + '/' + filename, 'rb') as handle:
        obj = pickle.load(handle)
    
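The reason this drop-in substitution works is that pickle.dump and pickle.load accept any binary file-like object, which is exactly what fs.open returns. A stdlib-only sketch with io.BytesIO (no GCS credentials needed) demonstrates the same round-trip through an arbitrary binary handle:

```python
import io
import pickle

# Any binary file-like object works with pickle; gcsfs's fs.open
# returns one, and io.BytesIO behaves the same way in memory.
buffer = io.BytesIO()
pickle.dump({'key': 'value'}, buffer)

buffer.seek(0)  # rewind before reading, as a fresh fs.open(..., 'rb') would
restored = pickle.load(buffer)
print(restored)  # {'key': 'value'}
```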
