
While uploading a CSV file to BigQuery through Cloud Storage, I am getting the error below:

CSV table encountered too many errors, giving up. Rows: 5; errors: 1. Please look into the error stream for more details.

In the schema, I am using STRING for all fields.

In the CSV file, I have the data below:

It's Time. Say "I Do" in my style.

I am not able to upload a CSV file containing the above sentence to BigQuery.

I do not know BigQuery, but in many cases with CSV files you have to escape the " character as \", or change the CSV quote character from " to ' (or replace every " in the input strings with '), in order to get the data imported correctly. - Mika72
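A minimal sketch of the escaping approach the comment suggests, using Python's csv module; the file path is the same placeholder as in the question. It rewrites the file so every field is quoted and every embedded " is doubled to "", which is standard CSV escaping and which BigQuery's CSV parser accepts by default:

import csv

# Read the rows as-is (quotes in the middle of an unquoted field are
# taken literally by csv.reader), then rewrite with proper quoting:
# every field wrapped in "..." and embedded " doubled to "".
with open('/path/to/csv/file', 'r', newline='') as src:
    rows = list(csv.reader(src))

with open('/path/to/csv/file', 'w', newline='') as dst:
    csv.writer(dst, quoting=csv.QUOTE_ALL).writerows(rows)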

2 Answers


Does the CSV file have the exact same structure as the dataset schema? Both must match for the upload to succeed.

If your CSV file has only one sentence in the first row of the first column, then your schema must have exactly one field of type STRING. If there is content in the second column of the CSV, the schema must have a second field for it, and so on. Conversely, if your schema has, say, two fields set as STRING, there must be data in the first two columns of the CSV. A matching one-column load is sketched below.
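Here is a sketch of such a one-column load using the google-cloud-bigquery Python client; the bucket, project, dataset, and table names are placeholders:

from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    # One STRING field = one CSV column; the schema and file must agree.
    schema=[bigquery.SchemaField("sentence", "STRING")],
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/my-file.csv",
    "my-project.my_dataset.my_table",
    job_config=job_config,
)
load_job.result()  # waits for completion; raises if the load job errored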

Data locations must also match: if your BigQuery dataset is in the US, then your Cloud Storage bucket must be in the US too for the upload to work.

Check the BigQuery documentation on loading CSV data for details.


Thanks to all for the responses.

Here is my solution to this problem:

# Read the file, replace every double quote with a single quote,
# then write the result back so BigQuery no longer sees unescaped ".
with open('/path/to/csv/file', 'r') as f:
    text = f.read()

converted_text = text.replace('"', "'")
print(converted_text)

with open('/path/to/csv/file', 'w') as f:
    f.write(converted_text)
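Note that this changes the data itself (every " becomes '). An alternative sketch, untested and assuming no field contains the delimiter, is to leave the file untouched and tell the load job that the file has no quote character at all, via the client library's quote_character option:

from google.cloud import bigquery

# Sketch: treat " as a literal character rather than a quote delimiter,
# so the sentence loads unchanged (assumes no field contains a comma).
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    quote_character="",  # empty string = no quote character
)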