Google Datalab: how to import pickle

Submitted by 牧云@^-^@ on 2019-12-06 05:23:58

Run the following code in an otherwise empty cell:

%%storage read --object <path-to-gcs-bucket>/my_pickle_file.pkl --variable test_pickle_var

Then run the following code:

import pickle
from io import BytesIO

pickle.load(BytesIO(test_pickle_var))
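
Here test_pickle_var holds the raw pickled contents read from GCS; if you want to sanity-check it before unpickling, a minimal optional check (assuming the read succeeded) looks like this:

# The variable should be a bytes-like object containing the pickle stream
print(type(test_pickle_var), len(test_pickle_var))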

I used the code below to upload a pandas DataFrame to Google Cloud Storage as a pickled file and read it back:

from datalab.context import Context
import datalab.storage as storage
import pandas as pd
from io import BytesIO
import pickle

df = pd.DataFrame(data=[[1, 2, 3], [4, 5, 6]], columns=['a', 'b', 'c'])

# Create a local pickle file
df.to_pickle('my_pickle_file.pkl')

# Create a bucket in GCS
sample_bucket_name = Context.default().project_id + '-datalab-example'
sample_bucket_path = 'gs://' + sample_bucket_name
sample_bucket = storage.Bucket(sample_bucket_name)
if not sample_bucket.exists():
    sample_bucket.create()

# Write pickle to GCS
sample_item = sample_bucket.item('my_pickle_file.pkl')
with open('my_pickle_file.pkl', 'rb') as f:
    sample_item.write_to(bytearray(f.read()), 'application/octet-stream')
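
If you would rather skip the temporary local file, you can serialize the DataFrame in memory with pickle.dumps and upload the bytes directly; this is a minimal sketch reusing sample_bucket and the same write_to call shown above, not part of the original flow:

# Alternative: pickle in memory and upload, skipping the local file
pickle_bytes = pickle.dumps(df)
sample_bucket.item('my_pickle_file.pkl').write_to(bytearray(pickle_bytes), 'application/octet-stream')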

# Read Method 1 - Read pickle from GCS using %storage read (note single % for line magic)
path_to_pickle_in_gcs = sample_bucket_path + '/my_pickle_file.pkl'
%storage read --object $path_to_pickle_in_gcs --variable remote_pickle_1
df_method1 = pickle.load(BytesIO(remote_pickle_1))
print(df_method1)

# Read Method 2 (alternate) - Read pickle from GCS using storage.Bucket.item().read_from()
remote_pickle_2 = sample_bucket.item('my_pickle_file.pkl').read_from()
df_method2 = pickle.load(BytesIO(remote_pickle_2))
print(df_method2)
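
As a quick optional check that both read methods recover the same data:

# Both read paths should yield identical DataFrames
print(df_method1.equals(df_method2))  # expected: True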

Note: There is a known issue where the %storage command does not work when it is the first line of a cell; put a comment or a line of Python code on the first line instead.
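
For example, reusing the path variable from the example above, this cell layout avoids the issue:

# any comment or code on the first line works around the issue
%storage read --object $path_to_pickle_in_gcs --variable remote_pickle_1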
