This workflow exports data from DynamoDB to S3. Copying DynamoDB data to S3 safeguards your data and doubles as an efficient AWS backup strategy. Automating the process with scheduled backups reduces the risk of data loss and keeps storage costs predictable.
The workflow runs every day and uses AWS Data Pipeline to export data from a DynamoDB table to a file in an Amazon S3 bucket. The trigger is a recurring daily schedule; two action nodes create the data pipeline and pass the data across it; and a notification node alerts you to each successful backup.
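As a rough sketch of what the two action nodes do under the hood, the snippet below uses boto3 to create a pipeline, upload a daily-schedule export definition modeled on AWS's "Export DynamoDB table to S3" template, and activate it. The table name, bucket, region, IAM role names, instance types, and the EMR step jar path are placeholders or assumptions; verify them against the current AWS template before relying on this.

```python
# Minimal sketch, assuming boto3 is installed, AWS credentials are configured,
# and the default Data Pipeline IAM roles already exist. All names are placeholders.
import boto3

dp = boto3.client("datapipeline", region_name="us-east-1")

# 1. Create an empty pipeline; uniqueId guards against accidental duplicates.
pipeline_id = dp.create_pipeline(
    name="dynamodb-to-s3-backup",
    uniqueId="dynamodb-to-s3-backup-v1",
)["pipelineId"]

def obj(obj_id, name, fields):
    """Helper to build a pipeline object in the API's key/value shape."""
    return {"id": obj_id, "name": name, "fields": fields}

# 2. Define the pipeline objects: defaults, a daily schedule, the DynamoDB
#    source, the S3 destination, an EMR cluster, and the export activity.
pipeline_objects = [
    obj("Default", "Default", [
        {"key": "scheduleType", "stringValue": "cron"},
        {"key": "schedule", "refValue": "DailySchedule"},
        {"key": "role", "stringValue": "DataPipelineDefaultRole"},
        {"key": "resourceRole", "stringValue": "DataPipelineDefaultResourceRole"},
        {"key": "pipelineLogUri", "stringValue": "s3://my-backup-bucket/logs/"},
        {"key": "failureAndRerunMode", "stringValue": "CASCADE"},
    ]),
    obj("DailySchedule", "DailySchedule", [
        {"key": "type", "stringValue": "Schedule"},
        {"key": "period", "stringValue": "1 day"},
        {"key": "startAt", "stringValue": "FIRST_ACTIVATION_DATE_TIME"},
    ]),
    obj("SourceTable", "SourceTable", [
        {"key": "type", "stringValue": "DynamoDBDataNode"},
        {"key": "tableName", "stringValue": "my-table"},          # placeholder
        {"key": "readThroughputPercent", "stringValue": "0.25"},  # limit read load
    ]),
    obj("BackupLocation", "BackupLocation", [
        {"key": "type", "stringValue": "S3DataNode"},
        # One dated folder per daily run.
        {"key": "directoryPath", "stringValue":
         "s3://my-backup-bucket/#{format(@scheduledStartTime, 'YYYY-MM-dd')}"},
    ]),
    obj("EmrClusterForBackup", "EmrClusterForBackup", [
        {"key": "type", "stringValue": "EmrCluster"},
        {"key": "releaseLabel", "stringValue": "emr-5.23.0"},
        {"key": "masterInstanceType", "stringValue": "m3.xlarge"},
        {"key": "coreInstanceType", "stringValue": "m3.xlarge"},
        {"key": "coreInstanceCount", "stringValue": "1"},
    ]),
    obj("TableBackupActivity", "TableBackupActivity", [
        {"key": "type", "stringValue": "EmrActivity"},
        {"key": "input", "refValue": "SourceTable"},
        {"key": "output", "refValue": "BackupLocation"},
        {"key": "runsOn", "refValue": "EmrClusterForBackup"},
        # Jar path and main class follow AWS's published export template;
        # check the current template for your region and EMR release.
        {"key": "step", "stringValue":
         "s3://dynamodb-emr-us-east-1/emr-ddb-storage-handler/2.1.0/"
         "emr-ddb-2.1.0.jar,org.apache.hadoop.dynamodb.tools.DynamoDbExport,"
         "#{output.directoryPath},#{input.tableName},#{input.readThroughputPercent}"},
    ]),
]

# 3. Upload the definition and activate; the schedule then fires once per day.
dp.put_pipeline_definition(pipelineId=pipeline_id, pipelineObjects=pipeline_objects)
dp.activate_pipeline(pipelineId=pipeline_id)
```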
Trigger: initiates the workflow based on time, events, or a call; here, a recurring daily schedule.
Action: creates the AWS Data Pipeline.
Action: runs the export, passing data from the DynamoDB table to the S3 bucket.
Notification: sends a notification with a custom message when a backup completes, as shown in the sketch after this list.
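Continuing the sketch above, one way to model the notification step is Data Pipeline's built-in SnsAlarm object, attached to the export activity's onSuccess field. The topic ARN is a placeholder (create the topic and a subscription beforehand), and the message expression is an assumption to verify against the Data Pipeline expression reference.

```python
# Hedged sketch: extends pipeline_objects from the previous snippet.
sns_alarm = {
    "id": "BackupSucceededAlarm",
    "name": "BackupSucceededAlarm",
    "fields": [
        {"key": "type", "stringValue": "SnsAlarm"},
        {"key": "topicArn",  # placeholder topic ARN
         "stringValue": "arn:aws:sns:us-east-1:123456789012:backup-alerts"},
        {"key": "subject", "stringValue": "DynamoDB backup succeeded"},
        {"key": "message",
         "stringValue": "Daily export to S3 completed at #{node.@actualEndTime}."},
    ],
}
pipeline_objects.append(sns_alarm)

# Point the export activity's onSuccess at the alarm so every successful
# daily run produces a notification.
for pipeline_object in pipeline_objects:
    if pipeline_object["id"] == "TableBackupActivity":
        pipeline_object["fields"].append(
            {"key": "onSuccess", "refValue": "BackupSucceededAlarm"}
        )
```

Wiring the alert into the pipeline definition itself keeps success reporting tied to the actual export run rather than to the scheduler that launched it.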