
I am trying to open a file I have in Google Cloud Storage using the cloudstorage library, but I get the error that module cloudstorage has no attribute 'open'.

I want to specify a read buffer size when I load the file from Google Cloud Storage into Google BigQuery. This is the function I wish to use for that; it requires a file-like object as its first parameter.

Client.load_table_from_file(file_obj, destination, rewind=False, size=None, num_retries=6, job_id=None, job_id_prefix=None, location=None, project=None, job_config=None)

Upload the contents of this table from a file-like object.
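
For reference, this is how I can already call it with a plain local file, which the method accepts directly (a minimal sketch; the project, dataset, table, and file names are placeholders):

    from google.cloud import bigquery

    bclient = bigquery.Client()
    job_config = bigquery.LoadJobConfig()
    job_config.source_format = bigquery.SourceFormat.CSV
    table_ref = bigquery.TableReference.from_string('my_project.my_dataset.temp_table')

    # Any object with a binary read() method is accepted as file_obj.
    with open('transaction.csv', 'rb') as source_file:
        load_job = bclient.load_table_from_file(
            source_file, table_ref, job_config=job_config)
    load_job.result()  # wait for the load job to finish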

Is there any other way to pass the Cloud Storage file as an object to this method? Or perhaps another way to load a file from Cloud Storage into Google BigQuery while specifying a read buffer size?

    from google.cloud import bigquery
    from google.cloud import storage
    import cloudstorage as gcs

    # Client and dataset setup (elided from my original snippet; names are placeholders).
    bclient = bigquery.Client()
    dataset_ref = bclient.dataset('my_dataset')
    temptablename = 'temp_table'

    def hello_gcs(event, context):
        # This call raises: module 'cloudstorage' has no attribute 'open'
        gcs_file = gcs.open('no-trigger/transaction.csv')
        job_config = bigquery.LoadJobConfig()
        job_config.autodetect = False
        job_config.max_bad_records = 1
        job_config.create_disposition = 'CREATE_IF_NEEDED'
        job_config.source_format = bigquery.SourceFormat.CSV
        load_job = bclient.load_table_from_file(
            gcs_file,
            dataset_ref.table(temptablename),
            location='asia-northeast1',
            size=2147483648,
            job_config=job_config)  # API request
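
One alternative I have been looking at, sketched below, is the google-cloud-storage client's Blob.open() (available in newer releases of that library), which returns a file-like reader; its chunk_size argument sets how many bytes each read fetches from Cloud Storage, which is effectively the read buffer size. Bucket, dataset, and table names here are placeholders:

    from google.cloud import bigquery
    from google.cloud import storage

    def hello_gcs(event, context):
        storage_client = storage.Client()
        blob = storage_client.bucket('my-bucket').blob('no-trigger/transaction.csv')

        bclient = bigquery.Client()
        dataset_ref = bclient.dataset('my_dataset')

        job_config = bigquery.LoadJobConfig()
        job_config.source_format = bigquery.SourceFormat.CSV

        # blob.open('rb') returns a file-like object; chunk_size controls
        # the number of bytes fetched from Cloud Storage per read.
        with blob.open('rb', chunk_size=1024 * 1024) as gcs_file:
            load_job = bclient.load_table_from_file(
                gcs_file,
                dataset_ref.table('temp_table'),
                location='asia-northeast1',
                job_config=job_config)
        load_job.result()  # wait for the load to complete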
Can you share the code you're trying and the full traceback? – Dan Cornilescu

Hi, added a snippet of the code. – Ahana