Throughout my career I have conversed with many IT leaders who have adopted the practice of measuring the performance of their technology assets by monitoring key performance indicators (KPIs). Most of them will tell you that there needs to be some standard against which KPIs are measured and compared. The process of comparing a set of key performance metrics against an objective standard is a form of benchmarking. Benchmarking is one of the most effective ways IT organizations can improve their operational efficiency while lowering the total cost of managing the supporting infrastructure. It enables IT organizations to determine which of their processes could benefit from improvement, and in which areas those improvements might yield results. By benchmarking their performance monitoring on a consistent and ongoing basis, IT organizations can gauge how effective their improvement efforts are over time.
Performance metrics themselves have significantly less value when dealt with in isolation. Consider an example from an online retail business. Suppose a retailer determines that its mean time for producing certain online orders is 3 minutes. Is this good or bad? There is no way of really knowing unless that 3 minutes is compared to an objective standard, such as the trend in order-processing times during steady and peak periods of transactional activity.
A fundamental part of any benchmarking exercise is determining which performance standards will be measured. These will differ from one technology asset to the next. Once the performance standards have been established, the next step is to establish a baseline for good or acceptable performance. What is the normal range for any given performance metric? Only then can one measure deviation from the normal range.
Performance monitoring solutions such as Foglight can provide an automated method of calculating baselines for important metrics over time. During operating cycles, Foglight compares incoming data for those metrics that have threshold values configured, measures the degree of deviation from the baseline, and can indicate potential performance bottlenecks.
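The general idea behind baselining and deviation detection can be illustrated in a few lines of code. This is only a minimal sketch of the concept, not Foglight's actual algorithm: it summarizes historical samples as a mean and standard deviation, then flags new observations that fall too many standard deviations from that baseline. The sample data and the 2-sigma threshold are illustrative assumptions.

```python
from statistics import mean, stdev

def build_baseline(samples):
    """Summarize historical metric samples as (mean, standard deviation)."""
    return mean(samples), stdev(samples)

def deviation_score(value, baseline):
    """How many standard deviations a new observation sits from the baseline."""
    mu, sigma = baseline
    return (value - mu) / sigma

# Historical order-processing times in minutes (illustrative data only).
history = [2.8, 3.1, 2.9, 3.0, 3.2, 2.7, 3.1, 3.0]
baseline = build_baseline(history)

# Flag an incoming observation that deviates by more than 2 sigma.
new_value = 4.5
score = deviation_score(new_value, baseline)
if abs(score) > 2:
    print(f"potential bottleneck: {new_value} min is {score:.1f} sigma from baseline")
```

A production tool would of course recompute the baseline continuously and account for cyclical patterns (time of day, day of week), but the underlying question is the same: how far outside the normal range is this metric right now?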
Once the performance standards have been established and baselines have been collected, comparisons can be made to identify which processes would benefit most from improvement. Say, for instance, that processing at the database level is of key concern to the IT organization. Questions may arise about how certain transactions perform during peak periods of system utilization, or how certain workloads will perform on a newly configured version of the database itself. Foglight’s Performance Investigator allows performance for a given database instance to be compared against the baseline for that period of time, against a specific time period of activity, or against another database instance altogether. This consolidated reporting lets IT organizations examine the workload and get a clear before-and-after picture, which can identify areas for improvement or validate that certain changes will yield favorable performance results.
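The before-and-after comparison described above can be sketched in its simplest form as a percentage change between two measurement periods. Again, this is an illustrative sketch rather than what Performance Investigator does internally, and the response-time figures are made up for the example.

```python
from statistics import mean

def compare_periods(baseline_samples, candidate_samples):
    """Percentage change in the mean between a baseline period and a
    candidate period. Negative values mean the candidate period is faster."""
    base = mean(baseline_samples)
    cand = mean(candidate_samples)
    return (cand - base) / base * 100

# Query response times in milliseconds before and after a configuration
# change (illustrative figures, not real monitoring output).
steady_period = [120, 135, 128, 140, 131]
after_upgrade = [104, 110, 99, 115, 108]

change = compare_periods(steady_period, after_upgrade)
print(f"mean response time changed by {change:+.1f}%")
```

Running the same comparison against a peak-activity window, or against a second database instance, is just a matter of swapping in a different set of baseline samples.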
While measuring the performance of technology assets through KPIs has become standard practice for many IT organizations, without benchmarking the exercise is of limited value. Monitoring solutions such as Foglight can assist not only with establishing the objective standard for key performance metrics, but also with providing a clear before-and-after picture that helps IT organizations prioritize, measure and validate their improvement efforts.
Learn more about Benchmark Factory.