2 votes

Background

I have a Google Cloud project running my N applications. Each application has an exclusive IAM service account (N service accounts in total), each with minimal permissions.

Scenario

Let's imagine that one of the service accounts is leaked. An attacker will try to take advantage of these credentials. Because he doesn't know exactly which permissions this account has, he will make calls and see which ones work.

Question

I want to "listen" to the audit logs. Once I see a log of the "access denied" kind, I will know that something is wrong with this service account.

  1. Is it possible to write all those access denied incidents to Google Cloud Stackdriver?
  2. How do you recommend implementing it?

Thank you


2 Answers

1 vote

Here is one way to go about it:

  • Create a new Cloud Pub/Sub topic
  • Create a new log routing sink with the destination set to the Pub/Sub topic created in the previous step (set a filter to be something like protoPayload.authenticationInfo.principalEmail="<service-account-name>@<project-name>.iam.gserviceaccount.com" AND protoPayload.authorizationInfo.granted="false" to only get messages about unsuccessful auth actions for your service account)
  • Create a Cloud Function that's triggered when a new message is published to the Pub/Sub topic; this function can do whatever you desire, such as send an email, page you, or anything else you can come up with in code.
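The steps above can be sketched as a small Pub/Sub-triggered Cloud Function (Python, 1st-gen event signature). The function name and the print-based "alert" are placeholders, not part of the original answer — swap in whatever notification channel you actually use:

```python
import base64
import json


def handle_audit_log(event, context):
    """Triggered by a Pub/Sub message from the log routing sink.

    The sink wraps each matching audit log entry in a Pub/Sub message;
    the JSON log entry arrives base64-encoded in event["data"].
    """
    entry = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    payload = entry.get("protoPayload", {})

    principal = payload.get("authenticationInfo", {}).get("principalEmail", "unknown")
    # "granted" may be absent on denials, so default to False.
    denied = [
        auth.get("permission")
        for auth in payload.get("authorizationInfo", [])
        if not auth.get("granted", False)
    ]

    if denied:
        # Placeholder alerting action: replace with email, Slack, PagerDuty, etc.
        print(f"ALERT: {principal} was denied {denied} "
              f"on {payload.get('resourceName', '?')}")

    return principal, denied
```

You can exercise it locally by building a fake Pub/Sub event with a base64-encoded audit log entry and calling `handle_audit_log(event, None)`.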
2 votes

Is it possible to write all those access denied incidents to Google Cloud Stackdriver?

Most, but not all, Google Cloud services support this. However, successful access will also be logged.

  • You will need to enable Data Access Audit Logs.
  • This could generate a massive amount of logging information.
  • Access logs for Org and Folder are only available via API and not the console.
  • Review pricing before enabling data access audit logs.

How do you recommend implementing it?

This question is not suitable for Stack Overflow as it seeks recommendations and opinions. In general, you will export your logs to Google Cloud Pub/Sub to be processed by a service such as Cloud Functions, Cloud Run, etc. There are also commercial services, such as DataDog, designed for this type of support. Exporting logs to Google BigQuery is another popular option.

Read this article published on DataDog's website on Data Access audit logging and their services. I am not recommending their service, just providing a link to more information. Their article is very good.

Best practices for monitoring GCP audit logs

To understand the fields that you need to process read this document:

AuthorizationInfo
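To make the shape of those fields concrete: `protoPayload.authorizationInfo` is a list, and each element carries the permission that was checked plus whether it was granted. A minimal sketch of picking out the denials (the sample values below are invented for illustration):

```python
# Fragment of a Cloud Audit Log entry's protoPayload (invented sample values).
proto_payload = {
    "authenticationInfo": {
        "principalEmail": "app-1@my-project.iam.gserviceaccount.com"
    },
    "authorizationInfo": [
        {"resource": "projects/my-project/buckets/b",
         "permission": "storage.objects.get",
         "granted": False},
        {"resource": "projects/my-project/buckets/b",
         "permission": "storage.buckets.get",
         "granted": True},
    ],
}

# "granted" may be omitted when authorization was denied, so default to False.
denied = [a["permission"]
          for a in proto_payload["authorizationInfo"]
          if not a.get("granted", False)]
print(denied)  # ['storage.objects.get']
```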

This link will also help:

Understanding audit logs