Usecase Universe

A collection of use cases for DevOps teams

Browse more than 200 predefined templates to automate your AWS operations

Create Template
Solutions
All Categories

S3

24 Times Used
22 MAY 2019
DNS Compliant S3 Bucket Names
AWS Best Practices
S3

Ensure that your AWS S3 buckets are using DNS-compliant bucket names in order to adhere to AWS best practices
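
As a rough illustration of the check behind this template, a bucket name can be validated against AWS's DNS-compliance rules: 3–63 characters, lowercase letters, digits, dots and hyphens, with every dot-separated label starting and ending with a letter or digit, and the whole name not formatted like an IP address. The helper below is our own sketch, not the template's actual code:

```python
import re

# Each dot-separated label: lowercase alphanumeric plus hyphens,
# starting and ending with a letter or digit.
LABEL_RE = re.compile(r"^[a-z0-9]([a-z0-9-]*[a-z0-9])?$")
# Names formatted like an IPv4 address are not allowed.
IP_RE = re.compile(r"^\d{1,3}(\.\d{1,3}){3}$")

def is_dns_compliant(name: str) -> bool:
    """Sketch of an S3 DNS-compliant bucket-name check (not exhaustive)."""
    if not 3 <= len(name) <= 63:
        return False
    if IP_RE.match(name):
        return False
    return all(LABEL_RE.match(label) for label in name.split("."))
```

For example, `my-logs-bucket` passes, while `My_Bucket` (uppercase, underscore) and `192.168.1.1` (IP-shaped) fail.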

S3 Bucket Policy Change Events
Operational Excellence
S3
CloudTrail

Sends a report of all S3 bucket policy changes in your AWS account: whenever an operation such as a put or delete of a bucket policy is performed, this workflow generates a report and sends it to your email.
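
The filtering step of such a workflow can be sketched as a predicate over CloudTrail event records (as fetched, for instance, via the LookupEvents API); the function name and record shape here are illustrative assumptions:

```python
# CloudTrail event names that indicate a bucket policy change.
POLICY_EVENTS = {"PutBucketPolicy", "DeleteBucketPolicy"}

def bucket_policy_changes(events):
    """Return the subset of CloudTrail event records that changed
    an S3 bucket policy (assumes records carry an 'EventName' key)."""
    return [e for e in events if e.get("EventName") in POLICY_EVENTS]
```

The resulting subset would then be formatted into the emailed report.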

S3 Bucket Public 'WRITE_ACP' Access
S3
Security

Sends a report of all S3 buckets with public 'WRITE_ACP' access. This permission lets unauthorised users edit who has control over your objects, which they can then use to delete, edit and add objects in your S3 buckets.
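
The core of this report is a check over each bucket's ACL. A minimal sketch, assuming the 'Grants' list has already been fetched (e.g. via S3's GetBucketAcl API); the helper name is ours:

```python
# URI of the public "everyone" group in S3 ACLs.
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def has_public_write_acp(grants):
    """True if any ACL grant gives WRITE_ACP to the AllUsers group."""
    return any(
        g.get("Permission") == "WRITE_ACP"
        and g.get("Grantee", {}).get("URI") == ALL_USERS
        for g in grants
    )
```

Buckets for which this predicate is true would be collected into the report.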

S3 Bucket Public Access Via Policy
S3
Security

This workflow reports all publicly accessible S3 buckets in the AWS account. The resulting overview of public buckets helps ensure that no customer data is exposed.
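
A bucket is public via policy when an Allow statement applies to every principal. A minimal sketch of that scan over a policy document (the function name is ours; a complete check would also consider conditions and principal lists):

```python
import json

def policy_allows_public(policy_json: str) -> bool:
    """True if the bucket policy contains an Allow statement whose
    Principal is "*" (or {"AWS": "*"}), i.e. anyone."""
    statements = json.loads(policy_json).get("Statement", [])
    if isinstance(statements, dict):  # a single statement may be unwrapped
        statements = [statements]
    for s in statements:
        if s.get("Effect") != "Allow":
            continue
        principal = s.get("Principal")
        if principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        ):
            return True
    return False
```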

CloudTrail S3 Bucket Logging Enabled
Security
S3
CloudTrail

Sends a report of CloudTrail trails whose S3 bucket does not have "Bucket Logging" enabled. With the Server Access Logging feature enabled for your S3 buckets, you can track any requests made to access the buckets and use the log data to protect them against unauthorised access.

S3 Bucket with Public 'READ' Access
Security
S3

Sends a report of S3 buckets that have public 'READ' access.

S3 Bucket Public 'WRITE' Access
S3
Security

Sends a report of all S3 buckets that provide public 'WRITE' access. Providing this access allows unauthorised users to delete, change and add objects in your S3 buckets.

CloudTrail S3 Bucket Publicly Accessible
Security
S3

Sends a report of CloudTrail trails in your AWS account whose S3 bucket is publicly accessible. Using a public S3 bucket makes your log files less secure and easily accessible to others.

DynamoDB to S3 Exporter
DynamoDB
S3
Backup
AWS Best Practices

Exporting data from DynamoDB to S3 is one of the best AWS backup strategies and safeguards your data. If data is accidentally deleted, you can restore it from a previous export file in Amazon S3. You can even copy data from a DynamoDB table in one AWS region, store it in Amazon S3, and then import it from Amazon S3 into an identical DynamoDB table in a second region.
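
One step such an exporter typically performs is flattening DynamoDB's typed attribute values into plain JSON before writing the export file to S3. A hedged sketch covering only the common S, N and BOOL types (a real exporter would handle the full type set, and this template may use a managed export path instead):

```python
def plain_item(item):
    """Flatten a DynamoDB item like {"id": {"S": "a1"}, "count": {"N": "3"}}
    into a plain dict suitable for a JSON export line."""
    out = {}
    for key, typed in item.items():
        (t, v), = typed.items()  # each attribute is a single-entry type map
        if t == "S":
            out[key] = v
        elif t == "N":
            out[key] = float(v) if "." in v else int(v)
        elif t == "BOOL":
            out[key] = v
    return out
```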

Enable S3 Log File Validation for AWS CloudTrail
S3
CloudTrail
Security

This feature enables you to verify the integrity of your CloudTrail log files and determine whether the files have been changed after being delivered to the selected S3 bucket. Log file integrity validation uses industry-standard algorithms such as SHA-256, which makes it computationally infeasible to modify, delete or forge log files without detection.
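
At the heart of this validation is a digest comparison: recompute the SHA-256 of the delivered file and compare it with the digest recorded at delivery time. An illustrative sketch (the real mechanism additionally verifies a digitally signed digest file, which is omitted here):

```python
import hashlib

def file_unchanged(log_bytes: bytes, recorded_digest_hex: str) -> bool:
    """True if the file's current SHA-256 matches the recorded digest."""
    return hashlib.sha256(log_bytes).hexdigest() == recorded_digest_hex
```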

S3 Bucket Logging Enabled
S3
Security

Sends a report of S3 buckets without bucket logging enabled. The logs enable you to track the requests made to access the buckets and use this data to protect against unauthorised access.

S3 Buckets with Public 'FULL_CONTROL' Access
S3
Security

Sends a report of all S3 Buckets with Public 'FULL_CONTROL' Access. Allowing this access is dangerous as unauthorised users can view, delete, edit and add objects in your S3 Buckets.

Take a Snapshot of EBS Volume Every 3 Days
S3
Backup

This template backs up the data on your Amazon EBS volumes to Amazon S3 by taking snapshots every 3 days for durable recovery. These incremental backups enable you to safeguard your sensitive data.
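
In practice the 3-day cadence is driven by a scheduler, but the underlying decision can be expressed as a small helper; the function below is a hypothetical sketch of that check, not the template's code:

```python
from datetime import datetime, timedelta

def snapshot_due(last_snapshot: datetime, now: datetime,
                 every_days: int = 3) -> bool:
    """True if at least `every_days` days have passed since the last
    EBS snapshot, so a new one should be taken."""
    return now - last_snapshot >= timedelta(days=every_days)
```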

S3 Bucket with Public 'READ_ACP' Access
S3
Security

Sends a report of all S3 Buckets with Public 'READ_ACP' Access. Providing this access allows unauthorised users to view who has control over your objects, and find those with badly configured permissions.

Copy EC2 Logs Data to S3 and Delete the Log Folder
S3
EC2
Remediation
AWS Best Practices

Moves the logs present in the log folder of an EC2 machine into a specified S3 bucket. This practice lets you retain the logs you need without worrying about disk space on the machine.
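
The move-then-delete pattern can be sketched as follows. The `upload` callable is a stand-in for the actual S3 put operation (e.g. an S3 PutObject call), so the logic is shown without any AWS dependency; names here are our own:

```python
from pathlib import Path

def move_logs(log_dir: str, upload) -> list:
    """Upload each *.log file in log_dir via the given callable, then
    delete the local copy. Returns the names of the files moved."""
    moved = []
    for path in sorted(Path(log_dir).glob("*.log")):
        upload(path.name, path.read_bytes())  # push to the bucket first
        path.unlink()                         # only then free local disk
        moved.append(path.name)
    return moved
```

Deleting only after a successful upload ensures no log is lost if the transfer fails partway.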