Best scheduled archive to cloud for short retention and low data use

Hello, Hope things are well,

We have a Rapid Recovery Core archiving to S3 storage, and I was looking to have a discussion about it.

I was curious how we could keep the archived data in S3 on a shorter retention while still having a daily backup to the cloud. Is there a way to roll up the data in the cloud?

Also, I noticed that with 10-12+ machines and daily incremental backups, an archive covering more than 90-180 days may take a significant amount of time to attach to or retrieve data from.

I found the scheduled archive options in the documentation here for reference. Here are some of them.

Daily
  At time - Select the hour of the day you want to create a daily archive.
Weekly
  At day of week - Select a day of the week on which to automatically create the archive.
  At time - Select the hour of the day you want to create an archive.
Monthly
  At day of month - Select the day of the month on which to automatically create the archive.
  At time - Select the hour of the day you want to create an archive.

Replace this Core - Overwrites any pre-existing archived data pertaining to this Core, but leaves the data for other Cores intact.
Erase completely - Clears all archived data from the directory before writing the new archive.
Incremental - Lets you add recovery points to an existing archive. It compares recovery points to avoid duplicating data that already exists in the archive.

I have set up individual S3 locations for the larger protected machines. I'm wondering if there is a maximum archive size; if an archive is left running, it can grow to hundreds of thousands of files, a lot of folders, and 10+ TB of data.

  • Using the daily option would work, but our live repositories are around 1.6 TB, so it takes a long time to upload them entirely. Maybe I can scope to a more recent date range and set a daily schedule.

  • Hello EricW,

    When setting up a scheduled archive from the Rapid Recovery UI, the job will archive all the recovery points of the selected machines, based on the options chosen.
    Regarding rollups being performed against an archive, this idea is under consideration for a future release of Rapid Recovery. You can vote for it here.
    You can either select the "Monthly/Replace this Core" option, or you may want to refer to our PowerShell Module Reference Guide and set up the "Start-Archive" cmdlet in Task Scheduler to start a daily archive job with a specific range of recovery points; a rough sketch of that approach is at the end of this reply.
    Your cloud provider may apply additional charges every time you perform an archive, so I recommend testing this PS cmdlet against local storage before running it against the cloud.

    Please let me know if you have any inquiries.
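
    Here is a minimal sketch of the Task Scheduler approach, assuming the Rapid Recovery PowerShell Module is available on the Core and that Start-Archive accepts -core, -user, -password, -protectedserver, -path, -startdate, and -enddate parameters as described in the PowerShell Module Reference Guide. The module name, machine name, credentials, and paths below are placeholders; verify the exact cmdlet parameters against your version of the guide.

        # DailyArchive.ps1 - archive roughly the last 30 days of recovery points for one machine.
        # The module name below is an assumption; check what your Core installation actually registers.
        Import-Module RapidRecoveryPowerShellModule

        # Scope the archive to a recent date range instead of the whole repository.
        $start = (Get-Date).AddDays(-30).ToString('MM/dd/yyyy')
        $end   = (Get-Date).ToString('MM/dd/yyyy')

        # Archive to local staging storage first, as recommended above, before pointing this at the cloud.
        Start-Archive -core localhost -user administrator -password 'PlaceholderP@ss' `
                      -protectedserver 'FileServer01' `
                      -path 'D:\StagingArchive' `
                      -startdate $start -enddate $end

        # Register the script as a daily 11 PM job using the built-in ScheduledTasks cmdlets.
        $action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-File C:\Scripts\DailyArchive.ps1'
        $trigger = New-ScheduledTaskTrigger -Daily -At '23:00'
        Register-ScheduledTask -TaskName 'RR Daily Archive' -Action $action -Trigger $trigger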

  • Hey Eric, 

    "Is there a way to roll up the data in the cloud?"  - No, there is not. That requires replication, and a 'live' core and repository. With a live core and a live repo you could save more or less at either core (different retention policies are allowed). That would also allow you to have that smaller footprint of daily traffic as you'd only be sending incremental RPs. Your Quest lic includes replication. You can create your own in a cloud provide, or go to cloud vendor that leases/rents them turnkey for you. 

    We are a Quest Partner and Reseller, and we do offer services just like that if you are interested.