Best scheduled archive to cloud for short retention and data use

Hello, I hope things are well.

We have a Rapid Recovery Core archiving to S3 storage, and I was looking to have a discussion about it.

I was curious how we could give the archived data in S3 a shorter retention period while still taking a daily backup to the cloud. Is there a way to roll up the data in the cloud?
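To frame what I mean by "roll up": the kind of thinning I'm after is a grandfather-father-son style policy over daily recovery points. This is only a sketch of the idea, not Rapid Recovery's actual rollup behavior, and the policy numbers (14 dailies, 8 weeklies, 6 monthlies) are arbitrary:

```python
from datetime import date, timedelta

def rollup(points, keep_daily=14, keep_weekly=8, keep_monthly=6):
    """Pick which daily recovery points survive a simple GFS-style
    policy: every point from the last `keep_daily` days, then the
    newest point per ISO week for `keep_weekly` weeks, then the
    newest point per month for up to `keep_monthly` months."""
    today = max(points)
    keep = set()
    weeks_seen = {}
    months_seen = {}
    for p in sorted(points, reverse=True):  # newest first
        age = (today - p).days
        if age < keep_daily:
            keep.add(p)
        elif age < keep_daily + keep_weekly * 7:
            wk = tuple(p.isocalendar()[:2])  # (ISO year, ISO week)
            if wk not in weeks_seen:
                weeks_seen[wk] = p
                keep.add(p)
        else:
            mo = (p.year, p.month)
            if mo not in months_seen and len(months_seen) < keep_monthly:
                months_seen[mo] = p
                keep.add(p)
    return sorted(keep)

# 180 consecutive daily points, thinned down by the policy above.
points = [date(2024, 1, 1) + timedelta(days=i) for i in range(180)]
kept = rollup(points)
print(len(points), "points ->", len(kept), "kept")
```

The interesting part for this discussion is that the thinning has to happen wherever the points live; if the archive in S3 is append-only, nothing ever drops out of the older tiers.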

Also, I noticed that with 10-12+ machines and daily incremental backups, an archive covering more than 90-180 days can take a significant time to attach or retrieve data from.

I found the scheduled archive options in the documentation for reference. Here are some of them:

Daily:
- At time: select the hour of the day at which to create a daily archive.
Weekly:
- At day of week: select a day of the week on which to automatically create the archive.
- At time: select the hour of the day at which to create the archive.
Monthly:
- At day of month: select the day of the month on which to automatically create the archive.
- At time: select the hour of the day at which to create the archive.

- Replace this Core: overwrites any pre-existing archived data pertaining to this Core but leaves the data for other Cores intact.
- Erase completely: clears all archived data from the directory before writing the new archive.
- Incremental: lets you add recovery points to an existing archive. It compares recovery points to avoid duplicating data that already exists in the archive.

I have set up individual S3 locations for the larger protected machines. I'm wondering whether there is a maximum archive size; if left running, an archive can grow to hundreds of thousands of files, a lot of folders, and 10+ TB of data.
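In the meantime I've been watching growth myself. I don't know of a documented maximum archive size, but the object count and total bytes under a prefix are easy to track. A minimal sketch, assuming boto3 is available; the bucket and prefix names are placeholders, and the aggregation is factored out so it can be exercised without S3 access:

```python
def summarize(objects):
    """Object count and total size in bytes for a list of records
    shaped like the 'Contents' entries that boto3 list_objects_v2
    returns (each a dict with at least 'Key' and 'Size')."""
    return len(objects), sum(o["Size"] for o in objects)

def summarize_prefix(bucket, prefix):
    # Hypothetical helper: pages through a bucket/prefix with boto3
    # and accumulates summarize() over every page of results.
    import boto3
    s3 = boto3.client("s3")
    count = total = 0
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        c, t = summarize(page.get("Contents", []))
        count += c
        total += t
    return count, total

# Offline example with fake listing data (two objects, 5 MiB + 7 MiB):
sample = [{"Key": "core/a.bin", "Size": 5 * 2**20},
          {"Key": "core/b.bin", "Size": 7 * 2**20}]
print(summarize(sample))  # (2, 12582912)
```

Running something like `summarize_prefix("my-archive-bucket", "core1/")` on a schedule would at least show how fast the file and folder counts climb toward the numbers above.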