Database Backup Shipping
The Database Backup Shipping feature makes it possible to automatically ship Database Backups for an application’s production environment at regular intervals to a configured cloud storage bucket (e.g. Amazon Web Services (AWS) S3 buckets, Google Cloud Storage buckets, Azure Blob Storage).
Shipped Database Backups are useful for satisfying internal security policies, automating regular data ingestion for analysis systems, and importing into local development environments.
- Backup SQL database files are shipped in a gzip-compressed plain text format with the filename format `mysqldump-<date>-<time>.sql.gz`, where `<date>-<time>` is the UTC timestamp at which the database backup was taken.
- Files from more than one of an organization’s applications and environments can be shipped to a single bucket.
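As noted above, shipped backups can be imported into a local development environment. A minimal sketch of the decompress-and-import flow follows; the filenames and database name are hypothetical examples, and a stand-in dump is generated inline so the example is self-contained (a real backup would first be downloaded from the configured bucket with the provider’s CLI).

```shell
# Stand-in for a shipped backup file (a real file would first be downloaded
# from the configured cloud storage bucket; filename is a hypothetical example).
printf 'CREATE TABLE example (id INT);\n' | gzip > mysqldump-example.sql.gz

# Shipped backups are gzip-compressed plain-text SQL, so they decompress
# directly into a file that mysql can import.
gunzip -c mysqldump-example.sql.gz > mysqldump-example.sql

# Import into a local development database (shown for reference; the host,
# user, and database name are hypothetical):
# mysql -h 127.0.0.1 -u root -p local_dev < mysqldump-example.sql
```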
Limitation
Database Backup Shipping can only be enabled for an application’s production environment.
Configure Database Backup Shipping
Prerequisite
- Only customers with an Enhanced, Signature, or Premier Support package, and some customers on legacy contracts can enable Database Backup Shipping.
- To enable Database Backup Shipping, a user must have at minimum an App admin role for that application or an Org admin role.
- Review the cloud storage bucket requirements before enabling this feature.
- Navigate to the VIP Dashboard for an application.
- Select the production environment from the environment dropdown located at the upper left of the VIP Dashboard.
- Select “Database” from the sidebar navigation at the left of the screen.
- Select “Backup Shipping” from the submenu.
Step 1 of 3: Select Provider
- Select a cloud storage provider from the dropdown options (e.g. “AWS S3”, “Google Cloud Storage”, or “Azure Blob Storage”).
- Select the button labeled “Continue“.
Prerequisites
To complete the configuration of an AWS S3 bucket, a user must have sufficient access permissions on AWS to:
- Modify the AWS bucket policy of the AWS S3 bucket.
- Create an AWS CloudFormation stack in the AWS account.
Step 2 of 3: Configure
- Configure the AWS S3 bucket by adding valid entries for the required fields: “AWS Account ID” (How to find your AWS Account ID), “Bucket Name”, and “Bucket Region”. Optionally enter values for any other fields that are relevant to the S3 bucket.
- Select the button labeled “Continue“.
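The “AWS Account ID” value can also be retrieved from the command line, assuming the AWS CLI is installed and configured with credentials for the target account:

```shell
# Print the 12-digit account ID of the currently authenticated AWS account.
aws sts get-caller-identity --query Account --output text
```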
Step 3 of 3: Test Configuration
Based on the values entered in “Step 2 of 3: Configure”, a CloudFormation Template will populate the field labeled “Generated CloudFormation Template“. To continue the process of enabling Database Backup Shipping:
- Select the button labeled “Download Template” to download the CloudFormation Template JSON file.
- Follow the instructions for Creating a stack on the AWS CloudFormation console to use the CloudFormation Template JSON file to create a stack in AWS CloudFormation.
- Select the button labeled “Run Test“ to test the configuration of the S3 bucket.
A test file named vip-go-test-file.txt will be uploaded to the S3 bucket as part of the verification process. This file will always be present in an environment’s configured S3 bucket and path, alongside the dated folders that contain the database backups themselves.
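The stack can also be created from the command line rather than the console. A sketch, assuming the AWS CLI is configured and the template has been downloaded as template.json (the stack name below is a hypothetical example):

```shell
# Create a CloudFormation stack from the downloaded template.
# --capabilities is required if the template creates IAM resources.
aws cloudformation create-stack \
  --stack-name vip-backup-shipping-example \
  --template-body file://template.json \
  --capabilities CAPABILITY_NAMED_IAM

# Wait until stack creation completes before running the test.
aws cloudformation wait stack-create-complete \
  --stack-name vip-backup-shipping-example
```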
Prerequisites
To complete the configuration of a Google Cloud bucket, a user must have a Google Cloud Platform Service account with sufficient permissions to:
- Create and configure a Google Cloud bucket.
- Generate and download an authentication key as a JSON file.
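Both prerequisite steps can be performed with the gcloud CLI; a sketch, in which the bucket name, project, and service account address are all hypothetical examples:

```shell
# Create the Google Cloud bucket (bucket and project names are hypothetical).
gcloud storage buckets create gs://example-backup-bucket --project=example-project

# Generate and download an authentication key for the service account as a
# JSON file (the service account address is hypothetical).
gcloud iam service-accounts keys create gcp-credentials.json \
  --iam-account=backup-shipping@example-project.iam.gserviceaccount.com
```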
Step 2 of 3: Configure
- Configure the Google Cloud bucket by adding valid entries for the required fields: “GCP Bucket Name” and “GCP Credentials JSON“. Optionally enter a value for the “GCP Prefix” field if it is relevant to the Google Cloud bucket.
- Select the button labeled “Continue“.
Step 3 of 3: Test Configuration
Select the button labeled “Run Test“ to test the configuration of the cloud storage bucket.
A test file named vip-go-test-file.txt will be uploaded to the cloud storage bucket as part of the verification process. This file will always be present in an environment’s configured cloud storage bucket and path, alongside the dated folders that contain the database backups themselves.
Prerequisites
To complete the configuration of an Azure Blob Storage container, a user must have at least the Storage Blob Data Contributor role in order to have sufficient permissions to:
- Create and configure a storage account and a container.
- Create a SAS token.
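These prerequisite steps can also be performed with the Azure CLI. A sketch follows; the account, resource group, and container names are hypothetical examples, and the permission set and expiry date shown are assumptions to adjust as your configuration requires:

```shell
# Create a storage account and a container (names are hypothetical).
az storage account create --name examplebackupacct --resource-group example-rg
az storage container create --name backups --account-name examplebackupacct

# Generate a SAS token for the container with read/write/list permissions,
# valid until the given expiry date.
az storage container generate-sas \
  --account-name examplebackupacct \
  --name backups \
  --permissions rwl \
  --expiry 2026-01-01 \
  --output tsv
```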
Step 2 of 3: Configure
- Configure the Azure Blob bucket by adding valid entries for the required fields: “Azure Storage Account Name”, “Azure Container Name“, and “Azure SAS Token“. Optionally enter a value for the “Azure Prefix” field if it is relevant to the Azure Blob bucket.
- Select the button labeled “Continue“.
Step 3 of 3: Test Configuration
Select the button labeled “Run Test“ to test the configuration of the cloud storage bucket.
A test file named vip-go-test-file.txt will be uploaded to the cloud storage bucket as part of the verification process. This file will always be present in an environment’s configured cloud storage bucket and path, alongside the dated folders that contain the database backups themselves.
Schedule
In the last module of the configuration steps titled “Schedule”:
- Select “Daily” or “Hourly” from the dropdown menu as the shipping frequency for Database Backup Shipping.
- If “Daily” is selected, an additional option will appear for selecting the specific hour of the day at which backups are shipped. If no hour is specified, backups will be shipped daily at a default hour set by the system.
- Select “Continue“.
Enable Database Backup Shipping
After completing the required configuration steps, the Database Backup Shipping feature must be enabled in order for file shipping to the configured cloud storage bucket to begin.
In the upper area of the Configure Database Backup Shipping panel:
- Select the button labeled “Enable“.

Disable Database Backup Shipping
- Navigate to the VIP Dashboard for an application.
- Select an environment from the environment dropdown located at the upper left of the VIP Dashboard.
- Select “Database” from the sidebar navigation at the left of the screen.
- Select “Database Backup Shipping” from the submenu.
- Select the button labeled “Disable” in the upper area of the Database Backup Shipping panel.

Update configuration values
The configuration values for the Database Backup Shipping feature can be updated at any time. To edit the values in the configuration fields:
- If the feature is currently enabled, disable it by selecting the button labeled “Disable” in the upper area of the Configure Database Backup Shipping panel.
- Update the field values as needed starting with Step 1 of 3 by selecting the linked text “Edit” positioned at the right side of each step module.
Service Status
The box titled “Service Status” located in the upper area of the Configure Database Backup Shipping panel indicates the current enablement status of the feature.
- “Awaiting configuration“: This feature has never been configured or enabled.
- “Disabled“: This feature has been configured but is not currently enabled.
- “Enabled“: This feature has been configured and is currently enabled.

Restricting access by IP range
To restrict access to an AWS S3 bucket by IP range, the bucket access policy must account for the VIP Platform’s dynamic IP range. Because these IP ranges are subject to change, a system to automatically update the access policy will need to be implemented.
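As a sketch of how such automation might look with the AWS CLI: a policy document restricting s3:PutObject to a list of source IP ranges is generated, then applied with put-bucket-policy. The bucket name and CIDR below are placeholders; the automation must obtain the current VIP Platform IP ranges from VIP and substitute them in each time it runs.

```shell
# Placeholder values -- substitute the real bucket name and the current
# VIP Platform IP ranges (obtained from VIP) before applying.
BUCKET="example-backup-bucket"
VIP_CIDRS='"203.0.113.0/24"'   # placeholder CIDR (TEST-NET-3)

# Generate a bucket policy that only allows uploads from the listed ranges.
cat > policy.json <<EOF
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "AllowUploadsFromVipIpRange",
    "Effect": "Allow",
    "Principal": "*",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::${BUCKET}/*",
    "Condition": {"IpAddress": {"aws:SourceIp": [${VIP_CIDRS}]}}
  }]
}
EOF

# Apply the updated policy (shown for reference; requires AWS credentials):
# aws s3api put-bucket-policy --bucket "${BUCKET}" --policy file://policy.json
```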
Last updated: October 27, 2025