
Minimum size needed by the Netvault database?

We thought we had hard drives big enough for our backups, but now I'm hearing that the Netvault database itself is going to be huge.

We want to back up just a single client.  It's about 250 GB in size, with approximately 2 million files/directories.  We only want 5 backups, which should fit on a 2 TB drive.

But if I'm reading the documentation correctly, it says that I would need approximately 1.5 TB just for the database -- before we even back up anything.

The documentation says that if you disable the search index, most of the database size is eliminated.  I doubt we need a search index; we just want to back up the client.

Is all of this true?

  • Hi

    There are two techniques for calculating a NetVault database size. The first is a general “rule of thumb” approach. This is the simplest way to define the amount of storage a NetVault database will require. The “rule of thumb” approach is based on numbers gained from internal and external feedback and experience.
    - A typical large (400 to 1000 clients) NetVault server environment will require 1TB of storage for the NetVault database.
    - A typical small (1 to 400 clients) NetVault server environment will require 500GB of storage for the NetVault database.
     
    When in doubt, it is best to provide 1 TB or more of storage for the NetVault database. For some installations this may be an excessive amount of storage, but it will help to ensure a trouble-free product experience and provide room for unexpected growth.
    If the NetVault server is virtual, then it is possible to create a smaller volume and extend this when required. In this scenario, always use thick-provisioning (VMware) or fixed VHDs (Microsoft Hyper-V).
    The second sizing technique is to calculate the amount of storage a NetVault database will need using known variables. By default, backups have an infinite life. Most systems cannot handle the resulting storage requirement, so the NetVault backup administrator must ensure that a suitable retirement period is set for each backup.
     
    Database size calculation variables:
     
    - Approximate number of files and directories backed up per machine: NFD
    Each file or directory that is included in a backup requires an average number of bytes for an index entry in the NetVault database. This average is based on a formula of 71 bytes plus the average
    number of characters contained in the filenames of the files that make up a target file system.
    - Approximate number of generations retained: NGR
    Each generation is a separate instance of a file or directory backup. For example, if the same file is backed up seven times using the default backup settings, there are
    seven generations of the file stored on the media and indexed in the NetVault database.
    - Number of Machines backed up: NMB
    - Average filename length: AFL
    Database size calculation formula:
    NFD × NGR × NMB × (71 bytes + AFL)
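    The formula above (NFD × NGR × NMB × (71 bytes + AFL)) can be sketched in Python. The sample values below — 2 million files, 5 generations, 1 machine, 20-character average filename length — are illustrative assumptions, not measurements from a real environment:

    ```python
    # Sizing formula from the documentation:
    #   database size = NFD x NGR x NMB x (71 bytes + AFL)

    def netvault_db_size_bytes(nfd: int, ngr: int, nmb: int, afl: int) -> int:
        """Estimate the NetVault database index size in bytes.

        nfd -- approximate number of files/directories backed up per machine
        ngr -- approximate number of generations retained
        nmb -- number of machines backed up
        afl -- average filename length, in characters
        """
        return nfd * ngr * nmb * (71 + afl)

    # Illustrative example: 2 million files, 5 generations, 1 machine,
    # 20-character average filename length.
    size = netvault_db_size_bytes(2_000_000, 5, 1, 20)
    print(f"{size:,} bytes = {size / 1024**3:.2f} GiB")  # 910,000,000 bytes = 0.85 GiB
    ```

    Note that with these inputs the index estimate comes out under 1 GiB, far below the 500 GB "rule of thumb" figure, which is meant as a generic provisioning guideline rather than a per-client calculation.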
     
  • Tomas,

    Yes, I read that information in the documentation.  Unfortunately, that doesn't answer my question.

    The documentation says the following:

    The backup indexes take up the majority of the space. Indexes are lists of the filenames contained in a backup job and are used to search for files during a restore. This functionality is commonly used in the File System and VMware plug-ins. If you do not require this search functionality, consider disabling indexing; it will cut down on disk space requirements and the size of NetVault database backups.

    So, my question is this: if we disable indexes, is the database still going to be 1.5 TB in size?  Or, is the database going to be substantially smaller?

    What doesn't make sense to me is that we are trying to backup 250 GB of data, but the database is going to be 1.5 TB.  In other words, the database is going to take up more space than our backups do!

  • Can you work through the formula here? I don't see how it could reach 1.5 TB with only 1 client, 5 backups, and 2 million files.

  • Sorry, my original post should say 3 million files.  (It's actually less than that, but it's close to 3 million.)

    Wait, maybe I did the formula incorrectly.  Does this look correct to you?

    3 million files TIMES (71 bytes + 20 characters) TIMES 5 generations TIMES 1 machine

    EQUALS

    1,365,000,000 bytes = 1,333,007.81 KB = 1,301.77 MB ≈ 1.27 GiB (or 1.365 GB in decimal units)
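    As a quick sanity check, the same arithmetic in Python (assuming the 20-character average filename length used above):

    ```python
    # 3 million files, 5 generations, 1 machine, 20-character average filename.
    size_bytes = 3_000_000 * (71 + 20) * 5 * 1

    print(size_bytes)            # 1365000000 bytes
    print(size_bytes / 1024)     # 1333007.8125 (KB)
    print(size_bytes / 1024**2)  # ~1301.77 (MB)
    print(size_bytes / 1024**3)  # ~1.27 (GiB)
    print(size_bytes / 10**9)    # 1.365 (decimal GB)
    ```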

    If it's just 1.3 GB, we can tolerate that.  Can you confirm?

  • I think you've got it right now.