
This is the code I am using to send 1 row of data to bigquery.

Assuming the following:

  1. Table Schema is good (works when I create a table in the ui with the same schema and the same 1 row of data)
  2. Map containing key:value pairs is good
  3. Credentials are good
  4. ProjectId, datasetId and tableId are correct (checked all by stepping through when it creates the url)

It always returns a 200 response with no errors and the following output: {"kind":"bigquery#tableDataInsertAllResponse"}

There is a possibility that the structure of my row is incorrect, but I have spent a lot of time breaking it apart. As I understand it:

  1. List (of TableDataInsertAllRequest.Rows objects)

  2. TableDataInsertAllRequest.Rows Object contains a key "json" where the value is ->

  3. Map (which contains the JSON values wanted)

    // Build the row payload: each Rows object wraps a map of column-name -> value pairs
    List<TableDataInsertAllRequest.Rows> rowsList = new ArrayList<>();
    TableDataInsertAllRequest.Rows oneRow = new TableDataInsertAllRequest.Rows();
    try {
        Map<String, Object> objectMap = new TreeMap<>();
        // objectMap.put("columnName", value); // one entry per column in the table schema
        oneRow.setJson(objectMap);
    } catch (Exception e) {
        e.printStackTrace();
    }
    rowsList.add(oneRow);

    TableDataInsertAllRequest content = new TableDataInsertAllRequest();
    content.setKind("bigquery#tableDataInsertAllRequest");
    content.setRows(rowsList);

    Bigquery.Tabledata.InsertAll request =
            bigqueryService.tabledata().insertAll(projectId, datasetId, tableId, content);
    TableDataInsertAllResponse response = request.execute();
    

Any ideas?

How much is the delay? According to the documentation, a few seconds is expected (cloud.google.com/bigquery/streaming-data-into-bigquery): "Streamed data is available for real-time analysis within a few seconds of the first streaming insertion into a table." – Hua Zhang

1 Answer


Found the solution; apparently I missed it in the docs.

https://cloud.google.com/bigquery/streaming-data-into-bigquery

"Data can take up to 90 minutes to become available for copy and export operations..."

P.S. For anyone who runs into this issue: to see whether the data was actually uploaded, query the table; it should contain the data almost instantly after streaming.

** Be sure to know the streaming limits; otherwise you will get a seemingly random 400 error. I had to chunk my data for each request.
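The chunking mentioned above can be sketched as a small generic helper that splits the full row list into batches before issuing one `insertAll` call per batch. The batch size of 500 here is an assumption for illustration; check the current BigQuery streaming quota documentation for the real per-request limits:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkRows {
    // Split a list of rows into batches of at most `batchSize` elements,
    // so that each insertAll request stays under the streaming limits.
    static <T> List<List<T>> chunk(List<T> rows, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            batches.add(new ArrayList<>(rows.subList(i, Math.min(i + batchSize, rows.size()))));
        }
        return batches;
    }

    public static void main(String[] args) {
        // Stand-in rows; in the real code these would be TableDataInsertAllRequest.Rows objects
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 1203; i++) rows.add(i);

        List<List<Integer>> batches = chunk(rows, 500);
        System.out.println(batches.size());        // 3 (batches of 500, 500, 203)
        System.out.println(batches.get(2).size()); // 203

        // For each batch you would then build a TableDataInsertAllRequest
        // and call bigqueryService.tabledata().insertAll(...) as in the question.
    }
}
```

Each batch then gets its own `TableDataInsertAllRequest`; a failed batch can be retried on its own without resending the whole data set.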