
AWS VPC Flow Logs


Intro

The idea is to provide VPC Flow Logs (https://docs.aws.amazon.com/vpc/latest/userguide/flow-logs.html) analytics capabilities by:

  • Skydive+Exporter to generate VPC Flow Logs on S3
  • AWS: VPC Flow Logs analytics pipeline (Athena + CloudWatch Insights)

Motivation

The motivation for generating VPC Flow Logs-compliant records is that it enables us to use the analytical tools that have emerged around the AWS format, such as Athena and CloudWatch Insights (the pipeline above).

Spec

The MVP would include the v2 format (currently the most commonly used), stored as space-delimited text on S3, using empty ("-") values for fields we can't easily populate. The desired output of the MVP is integration with the AWS analytics tools: https://aws.amazon.com/blogs/mt/analyzing-vpc-flow-logs-got-easier-with-support-for-s3-as-a-destination/

Record Format (Transform Phase)

There are three versions: v1, v2 and v3. It appears v2 is still in wide use, while v3 is the upcoming version, so the plan is to support both v2 and v3 with v2 as the default.

Template:

<version> <account-id> <interface-id> <srcaddr> <dstaddr> <srcport> <dstport> <protocol> <packets> <bytes> <start> <end> <action> <log-status>

An ACCEPT record:

2 123456789010 eni-1235b8ca123456789 172.31.16.139 172.31.16.21 20641 22 6 20 4249 1418530010 1418530070 ACCEPT OK

No data:

2 123456789010 eni-1235b8ca123456789 - - - - - - - 1431280876 1431280934 - NODATA

Skipped data:

2 123456789010 eni-11111111aaaaaaaaa - - - - - - - 1431280876 1431280934 - SKIPDATA
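
As a rough sketch of the transform phase, the mapping into the 14 v2 fields could look like the Go snippet below. The flowRecord struct and its field names are illustrative assumptions, not the exporter's actual types; fields we cannot populate would be emitted as "-" by the caller, as in the NODATA example.

package main

import "fmt"

// flowRecord is a hypothetical intermediate struct produced by the
// transform phase; the real exporter's types may differ.
type flowRecord struct {
	AccountID, InterfaceID string
	SrcAddr, DstAddr       string
	SrcPort, DstPort       int
	Protocol               int
	Packets, Bytes         int64
	Start, End             int64 // Unix seconds
	Action, LogStatus      string
}

// toV2Fields returns the 14 v2 fields in template order.
func toV2Fields(r flowRecord) []string {
	return []string{
		"2",
		r.AccountID,
		r.InterfaceID,
		r.SrcAddr,
		r.DstAddr,
		fmt.Sprint(r.SrcPort),
		fmt.Sprint(r.DstPort),
		fmt.Sprint(r.Protocol),
		fmt.Sprint(r.Packets),
		fmt.Sprint(r.Bytes),
		fmt.Sprint(r.Start),
		fmt.Sprint(r.End),
		r.Action,
		r.LogStatus,
	}
}

func main() {
	r := flowRecord{
		AccountID: "123456789010", InterfaceID: "eni-1235b8ca123456789",
		SrcAddr: "172.31.16.139", DstAddr: "172.31.16.21",
		SrcPort: 20641, DstPort: 22, Protocol: 6,
		Packets: 20, Bytes: 4249,
		Start: 1418530010, End: 1418530070,
		Action: "ACCEPT", LogStatus: "OK",
	}
	fmt.Println(toV2Fields(r)) // 14 fields, matching the ACCEPT example above
}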

Space Delimited Row (Encode Phase)

An example row:

2 123456789010 eni-1235b8ca123456789 172.31.16.139 172.31.16.21 20641 22 6 20 4249 1418530010 1418530070 ACCEPT OK
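
Given the 14 fields from the transform sketch above, the encode phase reduces to a single-space join, one record per line. A minimal Go illustration:

package main

import (
	"fmt"
	"strings"
)

// encodeRow joins the v2 fields with single spaces; one record per line.
func encodeRow(fields []string) string {
	return strings.Join(fields, " ")
}

func main() {
	fields := []string{
		"2", "123456789010", "eni-1235b8ca123456789",
		"172.31.16.139", "172.31.16.21", "20641", "22", "6",
		"20", "4249", "1418530010", "1418530070", "ACCEPT", "OK",
	}
	fmt.Println(encodeRow(fields)) // prints the example row above
}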

Batch Records by Size or Time (Store Phase)

https://docs.aws.amazon.com/vpc/latest/userguide/flow-logs-s3.html: "The maximum file size for a log file is 75 MB. If the log file reaches the file size limit within the 5-minute period, the flow log stops adding flow log records to it. Then it publishes the flow log to the Amazon S3 bucket, and creates a new log file."
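
A rough Go sketch of that batching rule, flushing when the next row would exceed the 75 MB limit or when the 5-minute period elapses (flushFn is a placeholder for the write phase, not the exporter's actual API):

package main

import (
	"bytes"
	"fmt"
	"time"
)

const (
	maxBatchBytes = 75 * 1024 * 1024 // AWS per-file size limit
	maxBatchAge   = 5 * time.Minute  // AWS publishing period
)

type batcher struct {
	buf     bytes.Buffer
	started time.Time
	flushFn func([]byte)
}

// add appends a row, flushing first if the size or age limit is hit.
func (b *batcher) add(row string) {
	if b.buf.Len() == 0 {
		b.started = time.Now()
	}
	if b.buf.Len()+len(row)+1 > maxBatchBytes ||
		time.Since(b.started) >= maxBatchAge {
		b.flush()
	}
	b.buf.WriteString(row)
	b.buf.WriteByte('\n')
}

// flush hands the batch to the write phase and starts a new log file.
func (b *batcher) flush() {
	if b.buf.Len() == 0 {
		return
	}
	b.flushFn(b.buf.Bytes())
	b.buf.Reset()
	b.started = time.Now()
}

func main() {
	b := &batcher{flushFn: func(p []byte) { fmt.Printf("publish %d bytes\n", len(p)) }}
	b.add("2 123456789010 eni-1235b8ca123456789 172.31.16.139 172.31.16.21 20641 22 6 20 4249 1418530010 1418530070 ACCEPT OK")
	b.flush()
}

Note this sketch only checks the age when a row arrives; a production store phase would also flush from a timer so an idle batch still gets published at the 5-minute mark.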

S3 Target (Write Phase)

https://docs.aws.amazon.com/vpc/latest/userguide/flow-logs-s3.html: "Log files are saved to the specified Amazon S3 bucket using a folder structure that is determined by the flow log's ID, Region, and the date on which they are created. The bucket folder structure uses the following format."

Directory format:

bucket_ARN/optional_folder/AWSLogs/aws_account_id/vpcflowlogs/region/year/month/day/log_file_name.log.gz

Filename format:

aws_account_id_vpcflowlogs_region_flow_log_id_timestamp_hash.log.gz

Timestamp format:

YYYYMMDDTHHmmZ
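
A Go sketch of key construction under these formats. The inputs (flowLogID, the trailing hash) are illustrative and assumed to be produced elsewhere; the optional folder is omitted:

package main

import (
	"fmt"
	"time"
)

// objectKey builds the S3 key from the directory, filename and
// timestamp formats quoted above.
func objectKey(accountID, region, flowLogID, hash string, t time.Time) string {
	u := t.UTC()
	return fmt.Sprintf("AWSLogs/%s/vpcflowlogs/%s/%s/%s_vpcflowlogs_%s_%s_%s_%s.log.gz",
		accountID, region,
		u.Format("2006/01/02"), // year/month/day folders
		accountID, region, flowLogID,
		u.Format("20060102T1504Z"), // YYYYMMDDTHHmmZ
		hash)
}

func main() {
	t := time.Date(2018, 6, 20, 16, 20, 0, 0, time.UTC)
	fmt.Println(objectKey("123456789010", "us-east-1", "fl-1234abcd", "hash", t))
	// AWSLogs/123456789010/vpcflowlogs/us-east-1/2018/06/20/
	//   123456789010_vpcflowlogs_us-east-1_fl-1234abcd_20180620T1620Z_hash.log.gz
}

The .log.gz suffix implies the write phase gzip-compresses the batch (e.g., via compress/gzip) before uploading it.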

Verification

First, set up the bucket/folder as detailed here: https://docs.aws.amazon.com/vpc/latest/userguide/flow-logs-s3.html

Next, run Skydive+Exporter and verify that logs are reported to AWS S3.

Finally, check that you can use Athena to query the flow logs as detailed here: https://docs.aws.amazon.com/athena/latest/ug/vpc-flow-logs.html

Future Work

Some of the VPC Flow Logs record fields require special, deeper integration (also requiring Skydive core modifications):

  • log status
  • action
  • etc.

But for the MVP we may provide only rudimentary support, or leave these fields empty / at their default values.