Public Cloud Storage (Bucket)

Description

Detects when new or existing cloud storage (e.g., an S3 bucket) is set to public.


Use Case

Security Monitoring, Advanced Threat Detection

Category

Data Exfiltration, SaaS

Security Impact

Open cloud storage (such as an S3 bucket or Azure blob container) is an extremely common way for breaches to occur these days. People host files for quick transfer but forget to take them down, or use storage for backups of sensitive data but inadvertently misconfigure the permissions. Newly created public storage should be monitored and any exposed data quickly pulled down. If you have a corporate cloud environment, you should prioritize analyzing any open storage. You may even wish to automate remediation through AWS functions (link) or Splunk Phantom.

Alert Volume

Very Low

SPL Difficulty

Medium

Journey

Stage 3

Data Sources

Audit Trail
GCP
Azure
AWS

How to Implement

Assuming you use the ubiquitous AWS, GCP, or Azure Add-ons for Splunk to pull these logs in, this search should work for you out of the box. While implementing, make sure you follow the best practice of specifying the index where your data lives.
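For instance (these index names are placeholders -- substitute whatever indexes your add-ons actually write to, and verify the sourcetypes in your environment), the base searches for each provider might start along these lines:

    index=aws_cloudtrail sourcetype=aws:cloudtrail
    index=gcp sourcetype=google:gcp:pubsub:message
    index=azure_storage sourcetype=mscs:storage:blob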

Known False Positives

There are two types of undesired alerts that can occur from this search. The first is when someone intentionally creates public storage -- you may wish to whitelist users (marketing employees, for example) who do this on a regular basis, or create a policy for how public storage should be created so that you can exclude it. The other is when someone creates a bucket that is public only momentarily and then switches it back to private (discussed more in How To Respond).
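If you keep a list of people who are approved to create public storage, one way to handle the first case (the lookup name and user field here are hypothetical) is to filter those users out at the end of the detection, along these lines:

    index=aws_cloudtrail sourcetype=aws:cloudtrail eventName=PutBucketAcl "AllUsers"
    | spath output=user path=userIdentity.arn
    | search NOT [| inputlookup approved_public_bucket_creators.csv | fields user]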

How To Respond

When this alert fires, there are three questions to ask -- is the bucket still public, are the files in it public, and what data does it contain? The first is easy to answer -- just search your logs for the bucket name to see any subsequent ACL changes (a sketch follows below). The second and third are trickier, and require that server access logging be turned on for the bucket (not done by default, and pretty inconvenient, so don't bet on it).
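As a minimal sketch (assuming CloudTrail data in a placeholder index and a hypothetical bucket name), a follow-up search for subsequent ACL changes on the bucket might look like this:

    index=aws_cloudtrail sourcetype=aws:cloudtrail eventName=PutBucketAcl "my-exposed-bucket"
    | spath output=bucket_name path=requestParameters.bucketName
    | search bucket_name="my-exposed-bucket"
    | spath output=user path=userIdentity.arn
    | table _time user bucket_name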

Help

Public Cloud Storage (Bucket) Help

This example leverages the Simple Search Assistant. Our example dataset is a collection of anonymized AWS CloudTrail logs in which someone creates a public bucket. Our live search looks for the same behavior using the standard indexes and sourcetypes for AWS CloudTrail and for GCP and Azure audit data, as detailed in How to Implement.

SPL for Public Cloud Storage (Bucket)

Demo Data

There is, lamentably, no way to really show the search here. Use the Live Search to see the full detail of working with AWS logs!

AWS Data

First we bring in AWS CloudTrail logs, filtering for the PutBucketAcl events that occur when bucket permissions are changed, and keeping only those that include AllUsers.
Next, we extract the user who made the change via the spath search command, which traverses the JSON easily.
Similarly, we extract the bucket name.
Here is where things get tricky -- AWS uses JSON arrays to show multiple permissions in one message. What we're doing here is extracting that block of ACLs; spath will return a multi-value field to us. Then we expand that into multiple events (so if before there were two events with three ACLs defined in each, we would end up with six events -- three copies of each original event, each with a different grantee). Finally, we use spath to extract the values from each grantee field.
Next, we search for just those individual permissions that apply to all users (unauthenticated users). While we're missing some context here about who else has permissions, we can follow on to investigate.
Last, we format the data a bit to meet our needs. A sketch of the full search follows this walkthrough.
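Putting those steps together, here is a minimal sketch of the AWS search (the index name is a placeholder, and exact JSON field paths can vary slightly by CloudTrail add-on version, so treat this as a starting point rather than the shipped search):

    index=aws_cloudtrail sourcetype=aws:cloudtrail eventName=PutBucketAcl "AllUsers"
    | spath output=user path=userIdentity.arn
    | spath output=bucket_name path=requestParameters.bucketName
    | spath output=grants path=requestParameters.AccessControlPolicy.AccessControlList.Grant{}
    | mvexpand grants
    | spath input=grants output=grantee_uri path=Grantee.URI
    | search grantee_uri="*AllUsers*"
    | table _time user bucket_name grantee_uri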

GCP Data

First we bring in GCP Audit data showing additions to the storage IAM permissions.
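A minimal sketch of the GCP search, assuming Cloud Audit Logs are exported to Splunk over Pub/Sub with the Google Cloud Platform Add-on (the index name is a placeholder and the field paths below are assumptions -- they may differ depending on how your export is configured):

    index=gcp sourcetype=google:gcp:pubsub:message data.protoPayload.methodName="storage.setIamPermissions"
    | spath output=deltas path=data.protoPayload.serviceData.policyDelta.bindingDeltas{}
    | mvexpand deltas
    | spath input=deltas
    | search action="ADD" member="allUsers"
    | table _time data.protoPayload.authenticationInfo.principalEmail data.resource.labels.bucket_name role member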

Azure Data

First we bring in Azure Blob logs (specifically the $logs container), filtering for the GetBlob AnonymousSuccess events that occur when someone anonymously accesses a file in a blob container. While this doesn't show when the permissions on a blob are changed (there doesn't appear to be a good log source for that), it does show when anonymous users access blobs.
Next, we extract the storage account and storage container that were accessed.
Last, we format the data a bit to meet our needs. A sketch of the full search follows this walkthrough.
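A minimal sketch of the Azure search, assuming the storage analytics logs from the $logs container are ingested with a sourcetype such as mscs:storage:blob (both the sourcetype and the URL-based field extraction are assumptions -- adjust them to your environment):

    index=azure_storage sourcetype=mscs:storage:blob "GetBlob" "AnonymousSuccess"
    | rex "https?://(?<storage_account>[^\./]+)\.blob\.core\.windows\.net/(?<storage_container>[^/\?]+)"
    | stats count by storage_account storage_container
    | sort - count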

Screenshot of Demo Data