google-cloud-datalab

Using Google Datalab, trying to import data from Google Drive

Submitted by 老子叫甜甜 on 2019-12-11 04:49:36

Question: Accessing data from Google Drive from an IPython notebook in Google Datalab returns: Exception: accessDenied: Access Denied: BigQuery BigQuery: No OAuth token with Google Drive scope was found. I tried the gcloud config solution from the related question bq cmd query Google Sheet Table occur "Access Denied: BigQuery BigQuery: No OAuth token with Google Drive scope was found" Error, but that solution does not work for Datalab. The only other thing I can think of is that I have more than one Google ID (outside of this project), so

Where can I find more detailed documentation on what chart settings can be used in Google-cloud-datalab

Submitted by 删除回忆录丶 on 2019-12-11 04:39:26

Question: I am using the %chart magic in Datalab to generate charts. I see in the samples that there are options that can be specified in the cell body to fine-tune these charts. Where do I find documentation on these? Answer 1: The charts generated with %chart use Google's Chart Tools (https://developers.google.com/chart/). The documentation there describes the options available for each chart type. For example, the options for bubble charts are documented at https://developers.google.com/chart
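To illustrate where those options go, here is a hedged sketch of a %%chart cell in the style of the Datalab samples; the chart type, field names, and data variable are placeholders, and the option names come from the Google Chart Tools documentation linked above:

    %%chart columns --fields Year,Sales --data sales_df
    title: Sales by year
    height: 400
    width: 600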

Read a CSV file from Google Cloud Storage into Datalab and convert it to a pandas dataframe

Submitted by 痴心易碎 on 2019-12-11 00:29:39

Question: I am trying to read a CSV file saved in GCS into a dataframe for analysis. I have followed these steps without success:

    mybucket = storage.Bucket('bucket-name')
    data_csv = mybucket.object('data.csv')
    df = pd.read_csv(data_csv)

This doesn't work, since data_csv is not a path as expected by pd.read_csv. I also tried:

    %%gcs read --object $data_csv --variable data

which results in: %gcs: error: unrecognized arguments: Cloud Storage Object gs://path/to/file.csv. How can I read my file for analysis?
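One workaround is to stream the object's bytes and let pandas parse them in memory. This is a minimal sketch, assuming read_stream() is the read method exposed by the storage Object in the Datalab API version in use (the method name has differed between the older datalab.storage and newer google.datalab.storage modules):

    from io import BytesIO
    import pandas as pd
    import google.datalab.storage as storage

    mybucket = storage.Bucket('bucket-name')
    data_csv = mybucket.object('data.csv')
    # read_stream() returns the object's contents as bytes; pandas can parse them in memory
    df = pd.read_csv(BytesIO(data_csv.read_stream()))

Alternatively, the %%gcs read magic appears to expect a literal gs:// URI for --object rather than a storage Object instance, so passing the full gs://bucket-name/data.csv path directly may also work; treat that as an assumption to verify.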

Using persistent disks with google Datalab

Submitted by 风流意气都作罢 on 2019-12-08 20:38:20

Question: I use Google Cloud for an HPC project. I have multiple instances writing to and reading from the same persistent disk (mounted using sshfs). I want to analyze some data using Datalab, and I'm not sure how to mount the persistent disk from Datalab (which natively uses Google Cloud Storage). Should I just execute the mount as a bash command from within my notebook, or is there a more elegant way to set things up? Am I wrong to stick with persistent disks over Google Cloud Storage? Answer 1: Yes, just use a %
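Since the answer is cut off at the magic name, the following is only a hedged sketch of what a bash-cell mount might look like; it assumes sshfs is installed (or installable) inside the Datalab container, and the server, path, and mount point are placeholders:

    %%bash
    # create a mount point and mount the shared directory over sshfs
    mkdir -p /mnt/shared
    sshfs my-user@my-file-server:/data /mnt/shared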

TensorFlow upgrade failed on Google Datalab

Submitted by 隐身守侯 on 2019-12-08 07:42:40

Question: Datalab currently seems to be running TensorFlow 0.6.0, and I wanted to update to version 0.8.0. I ran: !pip install --upgrade https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.8.0-cp27-none-linux_x86_64 and got: SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:581) Storing debug log for failure in /root/.pip/pip.log How can I fix this? Answer 1: It is not recommended to update packages which are installed in Datalab by default. This is to ensure that you do not break a
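If the upgrade is attempted anyway, one hedged workaround for the certificate failure is to fetch the wheel outside of pip and install the local file; the .whl filename below is assumed from the URL in the question, and this does not change the answer's caution about upgrading Datalab's default packages:

    %%bash
    # download the wheel with curl (which uses the system CA bundle),
    # then point pip at the local file instead of the remote URL
    curl -sSL -o /tmp/tensorflow-0.8.0-cp27-none-linux_x86_64.whl \
        https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.8.0-cp27-none-linux_x86_64.whl
    pip install --upgrade /tmp/tensorflow-0.8.0-cp27-none-linux_x86_64.whl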

Google Cloud Datalab error writing to Cloud Storage

Submitted by 孤人 on 2019-12-08 07:42:34

Question: I am using Google Cloud Datalab for the first time, to build a classifier for a Kaggle competition, but I am stuck trying to write a CSV file containing the pre-processed training data to Cloud Storage using the google.datalab.storage API. The file contains strings with unicode characters, which causes the write_stream call on a Storage object to trigger the error: Failed to process HTTP response. Here is the simplified code, only trying to write a single string: from google.datalab import Context
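Since the question's code is cut off, the following is only a hedged sketch of a workaround commonly tried for this kind of error: explicitly encoding the unicode string to UTF-8 bytes before writing. The bucket and object names are placeholders, and write_stream(content, content_type) is assumed to be the method referred to in the question:

    # -*- coding: utf-8 -*-
    import google.datalab.storage as storage

    bucket = storage.Bucket('my-bucket')
    obj = bucket.object('train_preprocessed.csv')
    csv_text = u'id,text\n1,caf\u00e9\n'
    # passing UTF-8 encoded bytes rather than a unicode str is one workaround to try
    obj.write_stream(csv_text.encode('utf-8'), 'text/csv')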

How to connect datalab with Google Cloud SQL?

Submitted by 最后都变了- on 2019-12-08 07:12:36

Question: I am trying to connect from a Datalab notebook to a PostgreSQL database hosted on Google Cloud SQL. I tried both the direct-IP approach and the instance-connection approach, but both raise an exception. Direct connection URI: "{engine}://{user}:{password}@{host}:{port}/{database}"; using gcloud sql connect: "{engine}://{user}:{password}@/{database}?host=/cloudsql/{instance_connection_name}". Both give this exception: OperationalError: (psycopg2.OperationalError) could not connect to server: Connection timed out Is the
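A connection timeout on the direct-IP route usually means the client's address is not authorized on the Cloud SQL instance. As a hedged sketch only (the user, password, IP address, and database name below are all placeholders), the direct connection typically looks like this once the Datalab VM's external IP has been added to the instance's authorized networks, or the Cloud SQL Proxy is running on the VM:

    import pandas as pd
    from sqlalchemy import create_engine

    # placeholder values; replace with your own user, password, instance IP and database
    engine = create_engine(
        'postgresql+psycopg2://my_user:my_password@203.0.113.10:5432/my_database')
    df = pd.read_sql('SELECT 1 AS ok', engine)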

PuTTY “unknown option -o” when trying to connect

Submitted by 為{幸葍}努か on 2019-12-08 04:48:35

Question: Following the getting started guide, I attempt to create and connect to a Datalab VM instance with the command: datalab create demo, but I get the following pop-up (screenshot not included); then, on OK-ing the error, the command prompt shows connection broken Attempting to reconnect... Any idea how to have the keys generated a different way to allow me to connect? Answer 1: As a workaround, you can try either running the datalab connect demo command from inside of Cloud Shell, or downgrading to version 153.0.0 of the Cloud SDK.
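For the downgrade option, a short sketch of the command (run in the same terminal used for datalab create; --version is a standard flag of gcloud components update, but confirm against your SDK installation):

    gcloud components update --version 153.0.0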

Google Datalab: how to import pickle

Submitted by 徘徊边缘 on 2019-12-08 03:38:22

Question: Is it possible in Google Datalab to read pickle/joblib models from Google Cloud Storage using the %%storage magic? This question relates to "Is text the only content type for %%storage magic function in datalab". Answer 1: Run the following in an otherwise empty cell:

    %%storage read --object <path-to-gcs-bucket>/my_pickle_file.pkl --variable test_pickle_var

Then run the following code:

    import pickle
    from io import BytesIO

    # test_pickle_var holds the raw bytes that the %%storage magic read from GCS
    pickle.load(BytesIO(test_pickle_var))

I used the code below to upload a pandas DataFrame to Google

How to execute a Python notebook inside another one in Google Cloud Datalab

Submitted by 自古美人都是妖i on 2019-12-08 03:18:58

Question: I'd like to execute a Python notebook I created for data pre-processing inside another notebook that handles the data classification process, so the latter notebook depends on the functions and execution provided by the first one. How could I do that in the Google Cloud Datalab environment? I would like to reuse the functions and variables defined in the pre-processing notebook from the classification notebook. Thanks. Answer 1: The following should work:

    myNotebook = <relative path to notebook>
    %run
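Filling in nothing beyond placeholders, a minimal sketch of the %run approach might look like this; the notebook filename is hypothetical and is assumed to sit in the same directory as the calling notebook (IPython's %run accepts .ipynb files and expands $-variables):

    # run the pre-processing notebook so its functions and variables land in this kernel
    myNotebook = 'preprocessing.ipynb'
    %run $myNotebook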