Lots of the BigQuery examples begin with:
import gcp.bigquery as bq
But I get ImportError: No module named gcp.bigquery whenever I try to run them.
Use pandas together with google-api-python-client. The function you are looking for is pd.read_gbq: http://pandas.pydata.org/pandas-docs/stable/generated/pandas.io.gbq.read_gbq.html
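For example, a minimal sketch, assuming a placeholder project id and a public sample table (the older read_gbq will prompt for OAuth via google-api-python-client on first run):

import pandas as pd

# Runs the query through the BigQuery API and returns the result as a DataFrame.
# 'my-project-id' is a placeholder for your own GCP project.
df = pd.read_gbq('SELECT COUNT(*) AS n FROM [publicdata:samples.shakespeare]',
                 project_id='my-project-id')
print(df)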
If you're accessing BigQuery in Python, you can do that using the gcloud library.
First, install the gcloud library:
$ pip install --upgrade gcloud
Then, after setting up your auth and project info, you can make API calls in Python like this (adapted from the gcloud-python docs):
from gcloud import bigquery

# The client picks up credentials and the default project from your environment.
client = bigquery.Client()

# list_datasets() returns the project's datasets plus a token for paging.
datasets, next_page_token = client.list_datasets()
print([dataset.name for dataset in datasets])
(As someone mentioned previously, you can also do it using the google-api-python-client.)
You can build the library from the Datalab team's content on GitHub (it is Apache 2 licensed).
Hope this helps. Running the Docker image locally does not work for me, at least.
For anyone with this problem: it looks like the datalab library was updated, and you should now import it differently.
import datalab.bigquery as bq
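For example, a minimal sketch of running a query with the updated module (the query is a placeholder, and I am assuming the Query.to_dataframe() convenience method from the datalab package; double-check against your installed version):

import datalab.bigquery as bq

# Run a legacy-SQL query and load the result into a pandas DataFrame.
# The query is just a placeholder; to_dataframe() is assumed from the datalab API.
df = bq.Query('SELECT COUNT(*) AS n FROM [publicdata:samples.shakespeare]').to_dataframe()
print(df)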
You should try a simple:
$ pip install --upgrade google-api-python-client
as discussed in the documentation.
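If you go the google-api-python-client route, here is a minimal sketch of listing datasets through the BigQuery REST API; the project id is a placeholder, and I am assuming Application Default Credentials are set up:

from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

# Build a BigQuery service object using Application Default Credentials.
credentials = GoogleCredentials.get_application_default()
service = discovery.build('bigquery', 'v2', credentials=credentials)

# datasets().list() maps to the REST API's datasets.list method.
# 'my-project-id' is a placeholder for your own project.
response = service.datasets().list(projectId='my-project-id').execute()
for dataset in response.get('datasets', []):
    print(dataset['datasetReference']['datasetId'])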
Furthermore, gcp.bigquery is part of Google Cloud Datalab, so you should approach it from that angle if you are still interested.
gcp.bigquery is a library specific to Cloud Datalab (as are any samples in which you saw that import).