Microsoft recently announced new guidelines for storage capacity in SharePoint (https://products.office.com/en-IN/sharepoint/collaboration?ms.officeurl=sharepoint) with the release of Service Pack 1 for SharePoint 2010.
Here is a synopsis of the changes and how it impacts your Quest Products.
1- You can now create a Content Database of up to 4TB if you meet certain criteria. These include meeting performance metrics (for example, disk sub-system performance of 0.25 IOPS per GB) and having a viable disaster recovery plan in place above and beyond the recycle bin.
2- Up to 60 million objects can now be stored in any given Content Database.
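To put the 0.25 IOPS-per-GB metric in perspective, here is a back-of-the-envelope sketch (the function name and the 4TB example are illustrative, not an official Microsoft sizing tool) showing the minimum disk sub-system throughput implied by a database of a given size:

```python
# Rough sizing sketch: the guideline's 0.25 IOPS per GB metric implies a
# minimum disk sub-system throughput that scales with content database size.

def required_iops(db_size_gb: float, iops_per_gb: float = 0.25) -> float:
    """Minimum IOPS the disk sub-system should sustain for a database of this size."""
    return db_size_gb * iops_per_gb

# A content database at the new 4TB ceiling (4096 GB):
print(required_iops(4096))  # 1024.0
```

In other words, a content database at the new ceiling implies a disk sub-system capable of roughly 1,000 IOPS, which is well beyond casual commodity storage.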
Overall this is good news. In the past, administrators watched their content databases like a hawk. 200GB (the old recommended limit) isn't much these days, and it often didn't take long, sometimes only days, for that limit to be hit. When creating new sites and site collections, selecting a content database, or even creating a new one, was the order of the day. With a much higher limit, some of those concerns go away. However, as with most things these days, you will have a few new complications to consider.
Recovery may become more complicated. These content databases will take more time to back up and recover; backup files may be much larger, and recovery times may exceed service level agreements. So a bit of extra planning is required. Microsoft is correct that you need a more viable plan for data integrity in general, and the guideline outlines this as a requirement. It's not clear what that means in practice (will Microsoft push back on a support case if you don't have a plan?), but Microsoft seems serious about it, so plan for some tools above and beyond what you get from the native tool set. In other words, it's not just about the recycle bin.
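A quick sketch makes the SLA concern concrete (the throughput figure is an illustrative assumption, not a vendor benchmark; actual numbers depend on your backup target and network):

```python
# Back-of-the-envelope estimate of whether a full backup of a large
# content database fits inside an SLA window.

def backup_hours(db_size_gb: float, throughput_mb_per_s: float) -> float:
    """Hours needed to stream db_size_gb at a sustained throughput_mb_per_s."""
    seconds = (db_size_gb * 1024) / throughput_mb_per_s
    return seconds / 3600

# A 4TB (4096 GB) database at a sustained 100 MB/s:
print(round(backup_hours(4096, 100), 1))  # 11.7
```

At that assumed throughput, a single full backup of a 4TB database takes roughly half a day, and the restore side is often slower still, which is exactly why recovery timelines need review before adopting the new limits.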
Larger content databases do not remove the value proposition for externalization either, especially when you consider the cost of storing all of this content in SQL.
Specifically, how can your Quest tools help with these new extra large content databases?
Externalization was only partially about keeping content databases under the 200GB limit. The three prime drivers for SMAX continue to be:
1- SQL storage is more expensive than most other storage. This will remain the case and customers who use SMAX will save money by moving content to cheaper non-SQL storage in these larger content database scenarios.
2- Large files do not perform well in SQL. This continues to be the case. Moving files over 3MB from a large content database to external storage will improve upload and download times and search crawl times for those files.
3- Compression is only available outside of SQL with a tool such as SMAX, which is another cost savings in general. Larger content databases also mean that backups and recovery will take longer; splitting these extra large content databases up may become an issue when recovery timelines are reviewed, and compression will help here.
Microsoft requires, as part of these new recommendations, that you have a more formal backup, recovery and disaster recovery plan in place: “Requires the customer to have plans for high availability, disaster recovery, future capacity, and performance testing.”
You need to consider this when looking to use these extra large content databases, and third-party tools will fill the gap here. RMSP provides extended and improved recovery and full farm recovery capabilities that are now required under this Microsoft guideline.
So enjoy your new limits, and plan for them!