
I've been trying to load a CSV file into BigQuery using the command below and I keep getting an error. My file has 40 rows in total, including the header, and looks like this. I get the same error when I try to load the CSV file through the console GUI.

header 1,header 2,Header 3,Header 4,Header 5,Header 6,header 7,Header 8,header 9,Header 10 
Justine,Mobile,Address,Location,2020-01-26,1,0,11,3,1

With this command line:

bq load --allow_jagged_rows=1 --skip_leading_rows=1 --source_format=CSV  datasetId.mytableid /home/username/file.csv

And this is the error I get:

Upload complete.
Waiting on bqjob_r421ecc680c644b1b_0000016fdb99efe1_1 ... (1s) Current status: DONE   
BigQuery error in load operation: Error processing job 'my-project-id:bqjob_r4212fs8997c644b1b_000db99efe1_1': Error while reading data, error message: CSV table encountered too many errors,
giving up. Rows: 1; errors: 1. Please look into the errors[] collection for more details.
Failure details:
- Error while reading data, error message: Error detected while
parsing row starting at position: 309. Error: Bad character (ASCII
0) encountered.

2 Answers

2 votes

I used csvkit to clean my file, and after that I was able to load it into BigQuery successfully.
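
The answer doesn't say which csvkit command was used, so the following is only a sketch of that kind of cleanup; the file names are placeholders. csvclean is csvkit's tool for flagging and repairing malformed rows, and a plain tr pipeline is another way to strip the NUL (ASCII 0) bytes the error message complains about:

# csvkit's cleanup tool; where the cleaned rows end up depends on the csvkit version
csvclean /home/username/file.csv

# Strip NUL (ASCII 0) bytes directly, then load the cleaned copy
tr -d '\000' < /home/username/file.csv > /home/username/file_clean.csv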

1 vote

bq load --source_format=CSV project_id:dataset.tablename /PATH_TO_FILE/airports.csv IATA:STRING,AIRPORT:STRING,CITY:STRING,STATE:STRING,COUNTRY:STRING,LATITUDE:FLOAT64,LONGITUDE:FLOAT64

I simply removed the first row (the header) from the CSV, since I had already specified the column names and data types in the command. I then ran the above command and it created the table for me.
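
Applied to the file from the question, that approach would look roughly like this; the column names and types are assumptions based on the single sample row, so adjust them to match the real data:

# Drop the header row, since the schema is given inline below
tail -n +2 /home/username/file.csv > /home/username/file_noheader.csv

bq load --source_format=CSV datasetId.mytableid /home/username/file_noheader.csv name:STRING,channel:STRING,address:STRING,location:STRING,event_date:DATE,col6:INT64,col7:INT64,col8:INT64,col9:INT64,col10:INT64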