We would like to export backup data for all monitored SQL Servers, ideally directly to another database, so that we can join it with another system to create reports: for example, backups by DBA, by application, etc. Management is very interested in this type of tracking, for obvious reasons. Is there a way to do this other than exporting to a flat file? We would like an automated method.
You posted this request in the Foglight forum, but Foglight is a monitoring tool, not a backup tool. Maybe I misunderstand your request. If so, please provide a bit more detail about which part of the backup workflow you want to monitor.
Yes, you do misunderstand what I am asking. We monitor our SQL Server environment with Foglight, and there are several ways to look at backup data within Foglight (when the last backup ran, how it was performed, whether it was successful, etc.). However, from management's perspective, seeing a list of databases and their last successful backup dates is only mildly interesting. How does a manager, especially upper management, filter that view to figure out who is responsible for failed backups, or which application a database belongs to? They can't, not from the Foglight view alone. We do, however, have other database inventory systems that map databases to DBAs, to first-line managers, and to applications.

Obviously, we could manually export reports from Foglight as XLS, export reports from our inventory systems, and manually build the reports we are after. Ideally, though, we would automate a report export from Foglight to a remote database, so that we can join that data to our inventory system and build a new report/dashboard (outside Foglight) that shows what we are looking for. Failing a direct export to an external database, an export to a remote fileshare would at least be workable. So far, I have been able to use a Groovy script to save a report to a local directory on the FMS, but I cannot figure out how to send it to a remote location.
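Since you already have the Groovy script saving reports locally on the FMS, one option is a small scheduled job that pushes any new report files to a fileshare mounted on the FMS host. This is only a sketch under assumptions: the directory paths and the `.xls` pattern are hypothetical, and it presumes the remote share is reachable as an ordinary mounted path (SMB/NFS):

```python
import shutil
from pathlib import Path

def push_new_reports(local_dir, remote_dir):
    """Copy any report files not yet present on the remote share.

    `remote_dir` is assumed to be a mounted fileshare (e.g. an SMB or
    NFS mount point). Both paths and the *.xls pattern are hypothetical;
    adjust them to match what your Groovy export actually writes.
    """
    local = Path(local_dir)
    remote = Path(remote_dir)
    remote.mkdir(parents=True, exist_ok=True)
    pushed = []
    for report in sorted(local.glob("*.xls")):
        target = remote / report.name
        if not target.exists():          # skip files already transferred
            shutil.copy2(report, target)  # copy2 preserves timestamps
            pushed.append(report.name)
    return pushed
```

Scheduled with cron (or Windows Task Scheduler) shortly after the Groovy export runs, this gets the reports off the FMS without any manual step; the database-side import can then run against the share.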
If that inventory system can be queried with simple SQL from the monitored DB server (and, as a consequence, from the FGLAM that monitors that server), you could create a User-Defined Collection and then build a report based on that collection.
Would you be able to provide some more details about your process for adding the data to the remote database?
Using the flat-file approach, a scheduled job could detect new files in a folder and then process them. On Linux, inotify-tools or incron might help with this aspect.
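As a portable stand-in for inotify-tools/incron (which react to filesystem events rather than polling), the detect-and-process step could be sketched like this. The folder path and the `process` callable are placeholders; in practice `process` would be whatever loads the flat file into the target database:

```python
import time
from pathlib import Path

def watch_folder(folder, process, seen=None, poll_seconds=60, max_polls=None):
    """Poll `folder` and hand each not-yet-seen file to `process`.

    A simple polling loop standing in for inotify-tools/incron.
    `process` is any callable taking a Path, e.g. one that bulk-loads
    the exported flat file into the reporting database. `max_polls`
    limits the loop for testing; leave it as None to run forever.
    """
    seen = set() if seen is None else seen
    polls = 0
    while max_polls is None or polls < max_polls:
        for f in sorted(Path(folder).glob("*")):
            if f.name not in seen:
                process(f)            # e.g. import into the database
                seen.add(f.name)      # remember so we process it once
        polls += 1
        if max_polls is None or polls < max_polls:
            time.sleep(poll_seconds)
    return seen
```

On Linux, replacing the polling loop with `inotifywait -m -e close_write <folder>` avoids the delay between export and import, but the polling version also works on Windows FMS hosts.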
We have a large customer who did something similar to what you want. They wanted to export host-monitoring data hourly to their Hadoop system for later analysis, and their solution was to use the Foglight REST API.
They created a few Python scripts to pull data from Foglight via the REST API and push it into Hadoop. This can be done in two steps (pull the data to a flat file/JSON file, then import it) or in a single step if your target database system supports a REST API.
Here are some good references.
Hope this gives you some ideas.
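The pull-then-import flow described above might be sketched as follows. To be clear about assumptions: the URL path and the `Auth-Token` header below are illustrative only, not the documented Foglight REST API (check the API guide for your FMS version), and the payload shape in `to_rows` is hypothetical:

```python
import json
import urllib.request

def pull_backup_data(base_url, token, query, fetch=None):
    """Pull JSON from a Foglight-style REST endpoint.

    The /api/v1/ path and Auth-Token header are assumptions for
    illustration; consult your FMS version's REST API guide for the
    real endpoint and authentication scheme. `fetch` can be injected
    to stub out the HTTP call in tests.
    """
    url = f"{base_url}/api/v1/{query}"
    if fetch is None:
        def fetch(u):
            req = urllib.request.Request(u, headers={"Auth-Token": token})
            with urllib.request.urlopen(req) as resp:
                return resp.read().decode("utf-8")
    return json.loads(fetch(url))

def to_rows(payload):
    """Flatten a hypothetical {"data": [...]} payload into tuples ready
    for a bulk INSERT into the external reporting database."""
    return [(r["database"], r["last_backup"], r["status"])
            for r in payload.get("data", [])]
```

From there, the rows can be handed to any DB driver's `executemany` for the insert, and the join against the inventory system happens inside the reporting database rather than in Foglight.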