0
votes

I have created an "export" from my Stackdriver Logging page in my Google Cloud project. I configured the export to go to a BigQuery dataset.

When I go to BigQuery, I see the dataset.

There are no tables in my dataset, since Stackdriver export created the BigQuery dataset for me.

How do I see the data that was exported? Since there are no tables I cannot perform a "select * from X". I could create a table but I don't know what columns to add nor do I know how to tell Stackdriver logging to write to that table.

I must be missing a step.

Google has a short one-minute video on exporting to BigQuery, but it stops exactly at the point where I am in the process.

2
Once you have defined an export with a sink to BQ, then NEW log messages will be written to BQ, which will create the tables. Have new messages been arriving in the logs that match the filters in the export? - Kolban
@Kolban When I create the export - is there no way to export the existing logs? There are quite a few logs already existing but, no, no new data has been written to the logs yet. - Terry Chambers - Onix
@Kolban I see that it did eventually create a 'table' in the dataset but there are no rows in it. I went to my source application and performed some actions and I see the updated log entries in the Stackdriver log UI. Does the action of "create export" automatically cause a regular 'sync' process to update the data with more data? I took it as a one time sync. - Terry Chambers - Onix

2 Answers

0
votes

When a new Stackdriver export is defined, it will then start to export newly written log records to the target sink (BQ in this case). As per the documentation found here:

https://cloud.google.com/logging/docs/export/

it states:

Since exporting happens for new log entries only, you cannot export log entries that Logging received before your sink was created.

If one wants to export existing logs to a file, one can use gcloud (or API) as described here:

https://cloud.google.com/logging/docs/reference/tools/gcloud-logging#reading_log_entries

The output of this "dump" of existing log records can then be used in whatever manner you see fit. For example, it could be imported into a BQ table.
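As a sketch of that workflow (the filter, file names, dataset, and table names below are illustrative placeholders, not values from the question):

```shell
# Dump existing log entries as JSON. gcloud emits a JSON array,
# so convert it to newline-delimited JSON before loading into BigQuery.
gcloud logging read "resource.type=gce_instance AND severity>=ERROR" \
    --format=json --limit=1000 > existing-logs.json
jq -c '.[]' existing-logs.json > existing-logs.ndjson

# Load the dump into a BigQuery table, letting BigQuery infer the schema.
bq load --source_format=NEWLINE_DELIMITED_JSON --autodetect \
    mydataset.imported_logs existing-logs.ndjson
```

Note that this is a one-time import; it does not create an ongoing sync the way a sink does.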

0
votes

To export logs from Stackdriver to BigQuery, you have to create a logging sink, either programmatically or through the GCP Logging UI.

When you create the sink, add a filter. https://cloud.google.com/logging/docs/export/configure_export_v2
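The same sink can be created from the command line; here the sink name, project ID, dataset, and filter are placeholders you would replace with your own:

```shell
# Create a sink that routes matching log entries to a BigQuery dataset.
# PROJECT_ID and DATASET are placeholders.
gcloud logging sinks create my-bq-sink \
    bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET \
    --log-filter='severity>=INFO'
```

After creating the sink, grant its writer identity (shown in the command's output) the BigQuery Data Editor role on the dataset, or no rows will arrive.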


Then write log entries to Stackdriver using code:

public static void writeLog(Severity severity, String logName, Map<String, String> jsonMap) {
  // Split the payload into chunks that fit within entry size limits.
  List<Map<String, String>> maps = limitMap(jsonMap);
  for (Map<String, String> map : maps) {
    // Build a structured (JSON payload) log entry and send it to Stackdriver.
    LogEntry logEntry = LogEntry.newBuilder(Payload.JsonPayload.of(map))
        .setSeverity(severity)
        .setLogName(logName)
        .setResource(monitoredResource)
        .build();
    logging.write(Collections.singleton(logEntry));
  }
}

// The resource the entries are attributed to; "global" works for
// entries not tied to a specific GCP resource.
private static MonitoredResource monitoredResource =
    MonitoredResource.newBuilder("global")
        .addLabel("project_id", logging.getOptions().getProjectId())
        .build();
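The helper limitMap is not shown in the answer; presumably it splits a large payload into several smaller maps. A minimal sketch, assuming it simply chunks by entry count (the method signature and chunk size are hypothetical):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class LogPayloadUtil {
    // Hypothetical reconstruction: split a map into chunks of at most
    // maxEntries entries each, preserving insertion order.
    static List<Map<String, String>> limitMap(Map<String, String> jsonMap, int maxEntries) {
        List<Map<String, String>> chunks = new ArrayList<>();
        Map<String, String> current = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : jsonMap.entrySet()) {
            if (current.size() == maxEntries) {
                chunks.add(current);
                current = new LinkedHashMap<>();
            }
            current.put(e.getKey(), e.getValue());
        }
        if (!current.isEmpty()) {
            chunks.add(current);
        }
        return chunks;
    }
}
```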

https://cloud.google.com/bigquery/docs/writing-results