What is the appropriate method for sending structured logs to Stackdriver from a Cloud Composer deployment? I have looked at the google-cloud-logging module for Python, which I could call from Plugins or DAGs, but it appears to be intended for projects that do not already have logging deployed.
I have followed this guide and modified my loggers to emit single-line JSON to stdout / stderr, so the output looks like: {"message": "test error", "severity": "ERROR"}
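For reference, this is roughly the formatter I am using (a minimal sketch; the class and logger names are my own, not from the guide):

```python
import json
import logging

class StackdriverJsonFormatter(logging.Formatter):
    """Format each record as single-line JSON with a 'severity' field,
    mapped from the Python log level name, as Stackdriver expects."""

    def format(self, record):
        payload = {
            "message": record.getMessage(),
            "severity": record.levelname,
        }
        return json.dumps(payload)

# Attach the formatter to a stderr handler, as described above.
handler = logging.StreamHandler()
handler.setFormatter(StackdriverJsonFormatter())
logger = logging.getLogger("example")
logger.addHandler(handler)
logger.setLevel(logging.WARNING)

logger.error("test error")
# emits: {"message": "test error", "severity": "ERROR"}
```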
When I check the logs in Stackdriver, however, the string is not parsed as JSON; it is placed in the message body as-is.
Below is how it looks in the Stackdriver UI (I have replaced sensitive info with generic values; the newline after the JSON in the message string appears exactly like this in the Stackdriver Console):
{
  insertId: "xxxxxxxxx"
  jsonPayload: {
    message: "{"message": "This is WARN TEST", "severity": "WARNING"}
"
    python_logger: "airflow.processor"
  }
  logName: "projects/project_name/logs/airflow-scheduler"
  receiveTimestamp: "2000-01-01T00:00:0.0000000Z"
  resource: {
    labels: {
      environment_name: "airflow-environment-name"
      location: "us-location2"
      project_id: "project_name"
    }
    type: "cloud_composer_environment"
  }
  severity: "INFO"
  timestamp: "2000-01-01T00:00:0.0000000Z"
}
Has anyone had success sending structured logs to Stackdriver from Composer, either with this approach or another method?