Redgate Monitor 14

Statistics update and index fragmentation recommendations

When many instances are monitored, appropriate maintenance of the Redgate Monitor database (also called the data repository) is essential for performance, especially when CPU and memory usage are already a concern. This is partly due to the large amount of frequent sampling performed on the databases.

Summary of guidance

  • Perform a full scan statistics update daily
  • Don’t defragment indexes. Accept that indexes will naturally become fragmented.
    • Unless there is a clear and specific benefit to be gained from targeted defragmentation of specific indexes.

Statistics updates

Statistics help the SQL Server query optimizer make informed decisions about the most efficient way to execute queries. They provide information about the distribution of data values in a table, which helps in estimating the cost of different query plans. Regular statistics updates ensure that the optimizer has accurate data, which is vital for maintaining optimal performance.
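
For illustration only, the distribution information that a statistics object stores can be viewed with DBCC SHOW_STATISTICS; the table and statistics names below are placeholders rather than objects from the data repository schema:

    -- Sketch: view the histogram (distribution of data values) behind one
    -- statistics object. Replace the placeholder names with real ones.
    DBCC SHOW_STATISTICS ('dbo.SomeRepositoryTable', 'IX_SomeRepositoryTable_SomeColumn')
        WITH HISTOGRAM;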

SQL Server automatically updates statistics when a certain threshold of data modification is reached. However, this threshold may not be reached during normal operation, which can cause performance to degrade. Another aspect to consider is that automatic updates use a sampled approach, which reads only a subset of rows to estimate the statistics; this is less resource intensive but may not be as accurate.
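
One way to check how stale the current statistics are, and whether they were built from a sample or a full scan, is to query sys.dm_db_stats_properties in the data repository database. A minimal sketch:

    -- Sketch: when were statistics last updated, and how many rows were sampled?
    SELECT OBJECT_SCHEMA_NAME(s.object_id) + N'.' + OBJECT_NAME(s.object_id) AS table_name,
           s.name                  AS stats_name,
           sp.last_updated,
           sp.rows,
           sp.rows_sampled,         -- far below sp.rows means a sampled update
           sp.modification_counter  -- rows modified since the last update
    FROM sys.stats AS s
    JOIN sys.objects AS o
      ON o.object_id = s.object_id
     AND o.type = 'U'               -- user tables only
    CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
    ORDER BY sp.last_updated;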

It is highly recommended that a full scan statistics update is performed daily. If performance issues are observed after implementing the daily full scan, perform a sampled statistics update instead. Learn more.
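
A minimal sketch of such a daily job step, assuming the data repository database is named RedGateMonitor (substitute the name of your own repository database), is to generate and run an UPDATE STATISTICS ... WITH FULLSCAN statement for every user table:

    -- Sketch: full scan statistics update for every user table in the
    -- data repository. The database name is an assumption; adjust it.
    USE RedGateMonitor;

    DECLARE @sql nvarchar(max) = N'';

    SELECT @sql += N'UPDATE STATISTICS '
                 + QUOTENAME(SCHEMA_NAME(t.schema_id)) + N'.' + QUOTENAME(t.name)
                 + N' WITH FULLSCAN;' + NCHAR(13) + NCHAR(10)
    FROM sys.tables AS t;

    EXEC sys.sp_executesql @sql;

Running this as a SQL Server Agent job scheduled once a day (or as a step in an existing maintenance solution) keeps the repository's statistics current without manual intervention.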

Index defragmentation

Index defragmentation involves reorganizing or rebuilding indexes to reduce fragmentation. Fragmentation occurs when the logical order of pages does not match the physical order, which can lead to inefficient data retrieval. 

SQL Server's handling of fragmented indexes is adequate for most use cases, and indexes should not be manually defragmented. If issues such as memory limitations are identified, follow these guidelines:

  • Only target specific indexes. Indexes used in table scans and joins are the most likely to cause performance issues, and high levels of fragmentation are not needed for query times to suffer; the queries in which an index is used in those scenarios should therefore be the primary concern (a sketch of targeted defragmentation follows this list). Learn more.
  • Do not perform a blanket defragmentation. Indexes naturally fragment, and frequent defragmentation incurs a higher CPU cost than simply allowing an index to remain fragmented, because a freshly defragmented index quickly fragments again through page splits, which are very CPU intensive.
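
As a rough illustration of what targeted defragmentation might look like, the sketch below measures the fragmentation of a single index and then reorganizes only that index; the table and index names are hypothetical placeholders:

    -- Sketch: check one specific index, then defragment only that index.
    SELECT ips.avg_fragmentation_in_percent,
           ips.page_count
    FROM sys.dm_db_index_physical_stats(
             DB_ID(), OBJECT_ID(N'dbo.SomeLargeTable'), NULL, NULL, 'LIMITED') AS ips
    JOIN sys.indexes AS i
      ON i.object_id = ips.object_id
     AND i.index_id  = ips.index_id
    WHERE i.name = N'IX_SomeLargeTable_SomeColumn';

    -- Reorganize for moderate fragmentation; rebuild only when clearly justified.
    ALTER INDEX IX_SomeLargeTable_SomeColumn ON dbo.SomeLargeTable REORGANIZE;
    -- ALTER INDEX IX_SomeLargeTable_SomeColumn ON dbo.SomeLargeTable REBUILD;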
