gcp-ai-platform-notebook

Best way to create randomly assigned partitions in Google BigQuery

Submitted by 心不动则不痛 on 2021-02-17 06:27:08

Question: I have a BigQuery table that is not randomly sorted, and the IDs are not random either. I would like to partition the data into chunks based on a random number, so that I can use those chunks for different parts of the project. The solution I have in mind is to add two columns to my table: a randomly generated number and a partition number. I am following this code snippet on AI Platform Notebooks; the only substantive difference is that I've changed the query_job line to traintestsplit=""" DECLARE randn
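For context, a minimal sketch of the approach described above (compute one random number per row, then derive the partition number from it), run through the BigQuery client library; the project, dataset, and table names below are placeholders, not taken from the question:

from google.cloud import bigquery

client = bigquery.Client()

traintestsplit = """
CREATE OR REPLACE TABLE `my_project.my_dataset.my_table_partitioned` AS
SELECT
  *,  -- keeps the original columns plus randn from the subquery
  CAST(FLOOR(randn * 10) AS INT64) AS partition_number  -- 10 roughly equal chunks
FROM (
  SELECT *, RAND() AS randn  -- one random number per row, reused for the split
  FROM `my_project.my_dataset.my_table`
)
"""

query_job = client.query(traintestsplit)  # submit the query
query_job.result()                        # block until it finishes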

How do you override Google AI Platform's standard libraries (i.e. upgrade scikit-learn) and install other libraries for custom prediction routines?

Submitted by 痴心易碎 on 2021-02-10 07:45:06

Question: I'm currently building a pipeline and trying to see whether I can deploy an ML model to AI Platform's prediction service and then use it in other projects via the HTTP endpoint the service exposes. However, the model was built with a scikit-learn version higher than the one offered by prediction runtime version 1.15 (the current version Google supports for scikit-learn predictions). This runtime version only supports
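As far as I understand, a custom prediction routine is uploaded as a source distribution, and extra dependencies can be listed in that package's setup.py, which is one way to get a newer scikit-learn than the runtime bundles. A minimal sketch; the package name, the predictor script, and the version pin are assumptions for illustration:

# setup.py for the source distribution uploaded alongside the model
from setuptools import setup

setup(
    name="my_custom_prediction_code",  # hypothetical package name
    version="0.1",
    scripts=["predictor.py"],          # script containing the Predictor class
    install_requires=[
        "scikit-learn>=0.22",          # newer than what runtime 1.15 bundles
    ],
)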

How do I automate my Jupyter notebook using Google Cloud?

Submitted by 孤者浪人 on 2021-01-29 05:54:16

Question: I have code in a Jupyter notebook and I would like to schedule it to run daily on Google Cloud. I already created a VM instance and run my code there, but I couldn't find any guide or video on how to set up a daily run. How can I do that? Answer 1: Google offers a product called AI Platform Notebooks. It bundles a lot of useful things, such as many open-source frameworks, CI, etc. There is also a blog post by Google Cloud that explains the product in depth and can be
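One common way to run a notebook non-interactively, so that something like a daily cron entry on the VM can trigger it, is papermill. A minimal sketch, where analysis.ipynb is a hypothetical notebook name and papermill is assumed to be installed:

import papermill as pm

# Execute the notebook top to bottom and save a copy containing the cell outputs.
pm.execute_notebook(
    "analysis.ipynb",
    "analysis_output.ipynb",
)

A small wrapper script around this call is then what the scheduler (cron, or an equivalent managed service) invokes once a day.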

Is it possible to obtain GCP instance metadata on Google Colaboratory?

Submitted by 风格不统一 on 2021-01-29 05:36:13

Question: I'm trying to obtain instance metadata on Google Colaboratory; in particular, I want to find out the compute region. On Colaboratory's hosted runtime I tried running the following commands, but they time out.
!curl "http://metadata.google.internal/computeMetadata/v1/?recursive=true&alt=json" -H "Metadata-Flavor: Google"
curl: (7) Failed to connect to metadata.google.internal port 80: Connection timed out
!curl "http://169.254.169.254/computeMetadata/v1/instance/
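For comparison, the same lookup from Python: on a Compute Engine VM (including AI Platform Notebooks) this returns the instance metadata, while on Colaboratory's hosted runtime it times out just like the curl calls above. A minimal sketch:

import requests

resp = requests.get(
    "http://metadata.google.internal/computeMetadata/v1/instance/zone",
    headers={"Metadata-Flavor": "Google"},  # header required by the metadata server
    timeout=5,
)
print(resp.text)  # e.g. projects/<project-number>/zones/us-central1-a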

What is the difference between AI Notebooks and Cloud Datalab in GCP?

Submitted by 安稳与你 on 2021-01-24 07:22:26

Question: I have searched for an answer to this question and it may be a duplicate, but I need clarification because the two sources I found give somewhat contradictory answers. A Stack Overflow answer says that Google Cloud AI Platform Notebooks is an upgraded version of Google Cloud Datalab. On a Quora page, one of the architects says that Cloud Datalab is built on top of Jupyter Notebook and adds a network of its own. AI Notebooks remains

GCP: run a model prediction every day

Submitted by 荒凉一梦 on 2021-01-06 07:33:28

Question: I have a .py file containing all the instructions to generate predictions for some data. The data are taken from BigQuery, and the predictions should be inserted into another BigQuery table. Right now the code runs on an AI Platform Notebook, but I want to schedule its execution every day. Is there any way to do that? I came across AI Platform Jobs, but I can't understand what my code should do or how it should be structured. Is there a step-by-step guide to follow? Answer 1:
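Purely as an illustration of one possible structure for such a .py file (a hedged sketch; the table names, model file, and column names are assumptions, not taken from the question):

import joblib
from google.cloud import bigquery

client = bigquery.Client()

# 1. Pull the input rows from BigQuery into a DataFrame.
df = client.query(
    "SELECT * FROM `my_project.my_dataset.input_table`"
).to_dataframe()

# 2. Load the trained model and generate predictions.
model = joblib.load("model.joblib")
df["prediction"] = model.predict(df.drop(columns=["id"]))

# 3. Write the predictions back to another BigQuery table.
client.load_table_from_dataframe(
    df[["id", "prediction"]],
    "my_project.my_dataset.predictions_table",
).result()

Scheduling the daily execution is then a separate concern, for example a cron entry on a VM or a scheduled trigger that submits the script as a job.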

How to get write access to the library folder in an AI Platform R 3.6 notebook instance on Google Cloud

Submitted by 纵然是瞬间 on 2020-08-09 09:20:00

Question: I am having trouble installing R packages in JupyterLab on AI Platform in Google Cloud. I am the owner of the project I work in, and I created a new R 3.6 instance using the default Compute Engine service account. The issue is that, for some reason, I do not have write access to the folder where packages are saved, even though as project owner I should have write access to everything in the project. Here is what I have tried and the error message I

Unable to resolve “Error: Git server extension is unavailable.” (Google Notebooks)

Submitted by 僤鯓⒐⒋嵵緔 on 2020-06-27 16:59:46

Question: After creating a new notebook instance in the last few days, an internal error relating to the Git server extension appears when it is opened: Internal Error: Fail to get the server root path. Error: Git server extension is unavailable. Please ensure you have installed the JupyterLab Git server extension by running: pip install --upgrade jupyterlab-git. To confirm that the server extension is installed, run: jupyter serverextension list. This means I can't use the Git clone button, which returns:

What is the difference between Google Cloud Datalab and Google Cloud AI Platform Notebooks?

Submitted by 江枫思渺然 on 2020-01-24 10:51:11

Question: I'm looking into the best way to set up an end-to-end machine learning pipeline and am evaluating the options for the data exploration component. I'm trying to figure out the difference between Google Cloud Datalab and Google Cloud AI Platform Notebooks. They both seem to offer similar functionality, so I'm not sure why both exist, or whether one is a newer iteration of the other. If they are different, what is the benefit of one over the other? Answer 1: Google Cloud AI Platform Notebooks is effectively