Cumulostrata - GCP to Splunk logging automation Terraform generator

This project simplifies the configuration of pushing data to Splunk using the GCP Dataflow Pub/Sub to Splunk template. The fully local configuration website (no external network requests) lets you provide GCP and Splunk details, then select common data sources and provide custom Stackdriver/GCP Logging query filters to stream to Splunk. A Terraform template is generated from your selections, which can be run in GCP's Cloud Shell or with a local Terraform CLI to deploy the logging automation.

Try out the configuration site here: https://cumulostrata.github.io, or download this repository and open the index.html file.

Wondering why the name Cumulostrata? It's a mashup of Cumulonimbus clouds and Terraform stratum.

Prerequisites and GCP-side considerations

  1. Ensure the user deploying the template has the appropriate roles and permissions to deploy the Terraform template:

    • A full list of roles and permissions is being developed; however, the user running the template must be able to create resources for the following GCP services: Pub/Sub, Logging, IAM, Dataflow, and GCS
  2. Enabling audit logging (Data Access):

    • Most audit logs in GCP are enabled by default; however, Data Access logs must be turned on in each project being monitored. See more documentation about enabling Data Access audit logs here.
    • The query filter automatically used to capture Cloud Audit logs is below (if selected in the configuration site):
      • log_name:"projects/[PROJECT_ID]/logs/cloudaudit.googleapis.com"
  3. Enabling VPC flow logs:

    • VPC flow logs must be turned on in each subnet; each configured subnet has an aggregation interval, a sample rate, and other settings. See more documentation about enabling VPC flow logs here.
    • The query filter automatically used to capture VPC flow logs is below (if selected in the configuration site):
      • resource.type="gce_subnetwork" AND log_name="projects/[PROJECT_ID]/logs/compute.googleapis.com%2Fvpc_flows"
  4. Determine other data sources relevant to your requirements and use cases:

    • The template generation tool provides an option for custom query filters; any events matching these filters will be sent to Splunk once the logging infrastructure is fully deployed.
    • See the Query filter library for ideas of possible query filters
    • Note that each query filter will be ORed together, i.e. [Custom filter 1] OR [Custom filter 2]
  5. Using Organization level filters (Aggregated exports):

    • See the GCP docs on aggregated exports to understand the needed roles and permissions. These roles and permissions will need to be applied to the account running the generated Terraform template.
  6. Enable G Suite to send audit logs to GCP

    • In order to send G Suite audit logs (such as logins and admin activities) to GCP, you must enable log delivery to GCP within G Suite. See [here](https://cloud.google.com/blog/products/identity-security/cloud-audit-logs-integrated-audit-transparency-for-gcp-and-g-suite) for documentation.
    • Note: Accessing G Suite logs in GCP requires an organization-level logging export and the associated permissions (see prerequisite note 5 above)
    • The query filter automatically used to capture G Suite Admin activity logs is below (if selected in the configuration site):
      • log_name="organizations/[ORGANIZATION_ID]/logs/cloudaudit.googleapis.com%2Factivity" protoPayload.serviceName="admin.googleapis.com" resource.type="audited_resource"
    • The query filter automatically used to capture G Suite login logs is below (if selected in the configuration site):
      • log_name="organizations/[ORGANIZATION_ID]/logs/cloudaudit.googleapis.com%2Fdata_access" protoPayload.serviceName="login.googleapis.com" resource.type="audited_resource"
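
As an illustration of the ORing behavior described in prerequisite 4, a deployment configured with both the Cloud Audit and VPC flow log sources would produce a single combined sink filter roughly like the one below (the exact grouping and whitespace the generator emits may differ):

```
(log_name:"projects/[PROJECT_ID]/logs/cloudaudit.googleapis.com")
OR
(resource.type="gce_subnetwork" AND log_name="projects/[PROJECT_ID]/logs/compute.googleapis.com%2Fvpc_flows")
```

Because the filters are ORed, an entry only needs to match one of the selected sources to be exported.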

Deploying the generated Terraform template

  1. The Cloud Shell in the GCP console is the simplest way to deploy the generated Terraform template. Cloud Shell has Terraform preinstalled, so the only step required is to upload the file (open the three-dot "More" menu and select Upload File)
    • The same steps should work outside of Cloud Shell, assuming Terraform is installed and GCP credentials have been set using an environment variable. See more documentation here.
  2. mkdir splunk_terraform
  3. mv customized_splunk_gcp_template splunk_terraform
  4. cd splunk_terraform
  5. terraform init
  6. terraform apply
    • Type yes when prompted to confirm
    • If this fails (which is common), run terraform apply again
  7. To roll back the deployment, run terraform destroy
    • Type yes when prompted to confirm
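
For orientation, the generated template wires resources together along these lines. The HCL below is an illustrative sketch only: the resource names, sink filter, Dataflow template path, and Splunk parameters are assumptions standing in for the generator's actual output, and bracketed placeholders must be replaced with your own values.

```hcl
# Illustrative sketch only; the generated template will differ in detail.

# Topic and subscription that receive matched log entries.
resource "google_pubsub_topic" "logs" {
  name = "splunk-logs-topic"
}

resource "google_pubsub_subscription" "logs" {
  name  = "splunk-logs-sub"
  topic = google_pubsub_topic.logs.name
}

# Log sink routing entries that match the generated query filter to Pub/Sub.
resource "google_logging_project_sink" "to_splunk" {
  name        = "splunk-log-sink"
  destination = "pubsub.googleapis.com/${google_pubsub_topic.logs.id}"
  filter      = "log_name:\"projects/[PROJECT_ID]/logs/cloudaudit.googleapis.com\""
}

# Dataflow job running the Pub/Sub to Splunk template.
resource "google_dataflow_job" "to_splunk" {
  name              = "pubsub-to-splunk"
  template_gcs_path = "gs://dataflow-templates/latest/Cloud_PubSub_to_Splunk"
  temp_gcs_location = "gs://[TEMP_BUCKET]/tmp"
  parameters = {
    inputSubscription = google_pubsub_subscription.logs.id
    url               = "https://[SPLUNK_HEC_HOST]:8088"
    token             = "[SPLUNK_HEC_TOKEN]"
  }
}
```

The sink's writer identity also needs Pub/Sub publish permission on the topic, which is why the IAM service is listed among the prerequisites.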