
I am building a Node.js server to run queries against BigQuery. For security reasons, I want this server to be read-only. For example, if I write a query with a DROP, INSERT, ALTER, etc. statement, my query should get rejected. However, something like SELECT * FROM DATASET.TABLE LIMIT 10 should be allowed.
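As a second line of defense at the application layer, one option is to reject anything that isn't a SELECT before sending it to BigQuery. The helper below is my own naive sketch (a keyword check, not real SQL parsing) and is not a substitute for IAM permissions:

```javascript
// Naive application-level guard: allow only queries whose first
// keyword is SELECT or WITH. This is a heuristic, not a parser --
// IAM permissions remain the actual security boundary.
function isReadOnlyQuery( sql ) {
  const firstKeyword = sql
    .trim()
    .split( /\s+/ )[ 0 ]
    .toUpperCase();
  return firstKeyword === 'SELECT' || firstKeyword === 'WITH';
}

// Usage:
// isReadOnlyQuery( 'SELECT * FROM dataset.table0 LIMIT 10' ) // true
// isReadOnlyQuery( 'DROP TABLE dataset.table0' )             // false
```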

To solve this problem, I decided to use a service account with "jobUser" level access. According to BQ documentation, that should allow me to run queries, but I shouldn't be able to "modify/delete tables".

So I created such a service account using the Google Cloud Console UI, downloaded its key file, and passed that file to the BigQuery Client Library (for Node.js) as the keyFilename parameter in the code below.

// Load the service account key path from the .env file
require( 'dotenv' ).config()
const BigQuery = require( '@google-cloud/bigquery' );

// Query goes here
const query = `
  SELECT * 
  FROM \`dataset.table0\` 
  LIMIT 10
`

// Creates a client
const bigquery = new BigQuery({
  projectId: process.env.BQ_PROJECT,
  keyFilename: process.env.BQ_SERVICE_ACCOUNT
});

// Use standard sql
const query_options = {
  query : query,
  useLegacySql : false,
  useQueryCache : false
}

// Run query and log results
bigquery
    .query( query_options )
    .then( console.log )
    .catch( console.log )

I then ran the above code against my test dataset/table in BigQuery. However, running this code results in the following error message (fyi: exemplary-city-194015 is my project ID for my test account):

{ ApiError: Access Denied: Project exemplary-city-194015: The user [email protected] does not have bigquery.jobs.create permission in project exemplary-city-194015.

What is strange is that my service account ([email protected]) has the 'Job User' role, and the Job User role does contain the bigquery.jobs.create permission. So that error message doesn't make sense.

In fact, I tested out all possible access control levels (dataViewer, dataEditor, ... , admin) and I get error messages for every role except the "admin" role. So either my service account isn't correctly configured or @google-cloud/bigquery has a bug. I don't want to use a service account with 'admin' level access, because that would allow me to run DROP TABLE-esque queries.

Solution:

I created a service account and assigned it a custom role with the bigquery.jobs.create and bigquery.tables.getData permissions. That worked: I can run basic SELECT queries, but DROP TABLE and other write operations fail, which is what I want.

2 Answers

As the error message shows, your service account doesn't have permission to create BigQuery jobs.

You need to grant it the roles/bigquery.user or roles/bigquery.jobUser role; see BigQuery Access Control Roles. As that reference shows, dataViewer and dataEditor cannot create jobs/queries, while admin can, but you don't need that.

To grant the required role, you can follow the instructions in Granting Access to a Service Account for a Resource.

From the command line using gcloud, run

gcloud projects add-iam-policy-binding $BQ_PROJECT \
    --member serviceAccount:$SERVICE_ACCOUNT_EMAIL \
    --role roles/bigquery.user

where BQ_PROJECT is your project ID and SERVICE_ACCOUNT_EMAIL is your service account's email/ID.

Or, from the Google Cloud Platform console, search for your service account email/ID and grant it the required roles.

I solved my own problem. To run queries you need both the bigquery.jobs.create and bigquery.tables.getData permissions. The Job User role has the former but not the latter. I created a custom role with both permissions and assigned my service account to it, and now it works. I did this using the Google Cloud Console UI ( IAM -> Roles -> +Add ) then ( IAM -> IAM -> <set service account to custom role> ).
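The same setup can be scripted with gcloud. This is a sketch, not the exact steps from the answer above; the role ID readOnlyBigQuery and the $BQ_PROJECT / $SERVICE_ACCOUNT_EMAIL variables are placeholders you would substitute:

```shell
# Create a custom role with only the two permissions needed for SELECT queries
# (readOnlyBigQuery is a hypothetical role ID; pick your own)
gcloud iam roles create readOnlyBigQuery --project=$BQ_PROJECT \
    --title="BigQuery Read Only" \
    --permissions=bigquery.jobs.create,bigquery.tables.getData

# Bind the service account to the custom role
gcloud projects add-iam-policy-binding $BQ_PROJECT \
    --member=serviceAccount:$SERVICE_ACCOUNT_EMAIL \
    --role=projects/$BQ_PROJECT/roles/readOnlyBigQuery
```

Note that custom project-level roles are referenced as projects/PROJECT_ID/roles/ROLE_ID in the binding, unlike the predefined roles/bigquery.* roles.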