BigQuery

GCP has possibly the best logging system of the main cloud vendors. It lets you consolidate the logs from all of your projects into a central project and export them automatically into BigQuery. From there, you can use a graphical tool such as Data Studio to search and analyse them. Compare this to AWS, where you will spend a lot of time configuring IAM, buckets, Glue jobs and Athena queries, and you will notice how easy this is in GCP.

This guide shows you how to get centralised logging up and running quickly.

Set up a logging project

# Dedicated project that will hold the exported logs.
resource "google_project" "my_project" {
  name       = "My Project"
  project_id = "your-project-id"   # placeholder: must be globally unique
  org_id     = "1234567"           # placeholder: your organization ID
}
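
The logging project also needs the BigQuery API enabled before the export dataset can be created. A minimal sketch of doing that in Terraform, assuming the project resource above:

# Enable the BigQuery API on the logging project so the export
# dataset can live there (assumes the google_project defined above).
resource "google_project_service" "bigquery" {
  project = google_project.my_project.project_id
  service = "bigquery.googleapis.com"
}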

Set up an organizational logging sink

module "log_export" {
  source                 = "terraform-google-modules/log-export/google"
  destination_uri        = module.destination.destination_uri
  filter                 = "severity >= ERROR"
  log_sink_name          = "bigquery_example_logsink"
  parent_resource_id     = "1234567"      # your organization ID
  parent_resource_type   = "organization"
  include_children       = true           # pull in logs from every project under the org
  unique_writer_identity = true
}

module "destination" {
  source                   = "terraform-google-modules/log-export/google//modules/bigquery"
  project_id               = "your-project-id"            # the logging project created above
  dataset_name             = "bigquery_example_dataset"
  log_sink_writer_identity = module.log_export.writer_identity
}
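
Setting unique_writer_identity to true gives the sink its own service account, and the destination module should take care of granting that identity write access to the dataset, so you do not need to manage that IAM binding yourself.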

Connect Data Studio and do some reporting
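
In Data Studio, add the export dataset as a BigQuery data source, then build your charts and filters directly on top of the log tables.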

[Screenshot: BQ Demo report in Data Studio]


Optimise your logs

variables.tf

variable "exclusions" {
  default     = []
  description = "(Optional) A list of sink exclusion filters."
  type = list(object({
    name        = string,
    description = string,
    filter      = string,
    disabled    = bool
  }))
}
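
To put these filters to work, pass them through to the sink module (it exposes a matching exclusions input) with exclusions = var.exclusions, and set the values in your tfvars. A sketch with an illustrative filter you would adapt to whatever is flooding your own logs:

# terraform.tfvars — example: keep noisy dev-namespace errors out of the export
# (hypothetical values; adjust the filter to your own resources)
exclusions = [
  {
    name        = "exclude-dev-namespace"
    description = "Drop errors from the dev namespace to keep the dataset small"
    filter      = "resource.type = \"k8s_container\" AND resource.labels.namespace_name = \"dev\""
    disabled    = false
  }
]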