1 vote

I am trying to run Google BigQuery in a Jupyter notebook on my local PC, but it is not working, whereas it works fine in Google VMs on GCP and in Google Colab notebooks.

I've tried everything, but nothing seems to work.

from google.cloud import bigquery

ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-1-1035661e8528> in <module>
----> 1 from google.cloud import bigquery

ModuleNotFoundError: No module named 'google'
Sorry if this is an obvious question, but have you installed the BigQuery library ("pip install --upgrade google-cloud-bigquery")? cloud.google.com/bigquery/docs/visualize-jupyter might be useful. - Micah Kornfield
Agree with @MicahKornfield; the first step should be following the guide and being able to run a simple query from a Python script. - Yun Zhang
@MicahKornfield thank you so much, but now I'm stuck at the authentication part. Please help. - Shubham kumar

1 Answer

3 votes

You can connect to BigQuery from an environment outside GCP. You need to set up two things:

  1. A BigQuery client library for your language of choice. Judging by the code above, you want to use Python; you can install the BigQuery Python client library by running

    pip install --upgrade google-cloud-bigquery

  2. Authentication to BigQuery:

a. Get your GCP credentials by running the following command:

gcloud auth application-default login

This should create a credentials JSON file named application_default_credentials.json under "~/.config/gcloud/".

b. Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to that JSON credentials file. On the command line:

export GOOGLE_APPLICATION_CREDENTIALS="$HOME/.config/gcloud/application_default_credentials.json"

Or you can set the same environment variable from within your Python program by adding the following lines:

import os

# Expand "~" explicitly so the variable holds an absolute path to the file.
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = os.path.expanduser(
    '~/.config/gcloud/application_default_credentials.json')
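Once the credentials are in place, you can verify the whole setup end to end with a quick test query. This is a minimal sketch: "your-project-id" is a placeholder for a GCP project you have access to with BigQuery enabled, and the query is trivial so it doesn't read any table.

from google.cloud import bigquery

# The client picks up the application-default credentials set above.
# Replace "your-project-id" with your own GCP project ID.
client = bigquery.Client(project="your-project-id")

# Run a trivial query to confirm authentication and connectivity.
query_job = client.query("SELECT 1 AS ok")
for row in query_job.result():
    print(row.ok)  # should print: 1

If this prints 1 without raising an authentication error, the local notebook is talking to BigQuery correctly.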

Hope this helps.