This article describes the functionality and limitations of the Executive Summary report that can cause inconsistent or confusing output.
The typical sources of misinterpretation of the Executive Summary are the following:
The patch data reported by the Executive Summary does not match the patch data provided by the UI
The UI only provides the current view of patching results, whereas the Executive Summary attempts to provide a historical view based on the reporting period selected for the report. Note that the Executive Summary cannot do this accurately, because complete historical patching data is not kept in the database; the report attempts to reconstruct historical facts from only partially historical records.
The Executive Summary for a given time range does not produce consistent results
Running the Executive Summary with the same reporting start and end dates, but on different days, can produce different results for certain sections of the report. This is because some portions of the report, such as the patch health scores and patch summary charts, attempt to present data as historical even though the underlying historical data is not stored in the database. Other sections report only current data, while still other sections accurately report valid historical data. It is this mix of historical, pseudo-historical, and current data that can lead to inconsistency between repeated reports for the same time period.
The following sections describe in detail the availability of current and historical data used for reporting:
Patch Health scores and Patch Summary charts
The UI provides a current view of patch statuses, while the report attempts to give a historical view.
By design, these two sections are meant to show the state of patching during the selected reporting period. The patch chart is intended to detail the number of failed, installed, needed, and not needed patches that existed during the requested reporting period. The patch score is intended to indicate how healthy the patching system was during that timespan by calculating the percentage of patches that were due to be installed during the time range and were successfully installed.
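As a rough illustration only (not the product's actual implementation), the score described above amounts to a percentage calculation along these lines, where the patch collections and function name are hypothetical:

```python
def patch_health_score(due_patches, installed_patches):
    """Hypothetical sketch: percentage of patches that were due to be
    installed during the reporting period and were successfully installed."""
    if not due_patches:
        return 100.0  # nothing was due, so patching is treated as fully healthy
    installed = sum(1 for patch in due_patches if patch in installed_patches)
    return 100.0 * installed / len(due_patches)

# Example: 8 of 10 due patches installed -> a score of 80.0
print(patch_health_score(set(range(10)), set(range(8))))
```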
Only the following patch data is kept for reporting purposes:
- The latest status of a patch.
- The timestamp of when the last status change happened.
It is this timestamp in the patch status record that the Executive Summary filters on in an attempt to give a historical account of patching results. While this timestamp can be used to accurately determine the number of final-state patches (such as installed or failed) at a given point in time in the past, it is not effective for counting transient-state patches, such as downloaded and needed.
The result is that if the Executive Summary is executed with a reporting period whose end date is set to the current time, then the patch data and scores are accurate. However, if the end date is set to a point in time in the past, the patch data and scores may be incorrect if any patch statuses have changed between the report end date and the current date.
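A minimal sketch of why filtering on this timestamp works for final states but not for transient ones, assuming a simplified patch status record with hypothetical field names:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PatchStatus:
    # Hypothetical fields; only the latest status and the time it was set
    # are kept, per the records described above.
    status: str              # e.g. "installed", "failed", "needed", "downloaded"
    last_changed: datetime   # timestamp of the last status change

def count_status_as_of(records, status, as_of):
    """Count patches the report would treat as having `status` at `as_of`.

    Reliable for final states: a patch whose latest status is "installed"
    with last_changed <= as_of really was installed at that time.
    Unreliable for transient states: a patch that was "needed" at as_of may
    since have changed to "installed", so its record no longer reads
    "needed" and it is silently missed by this count.
    """
    return sum(1 for r in records
               if r.status == status and r.last_changed <= as_of)
```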
The patch scores are rolled up into the Server Health score, the Workstation Health score, the Site Health score, and the Previous Period Health score, so the accuracy of these calculations is affected as well.
For the patch summary and the patch health score (and the other health scores it affects) to be accurate, proper historical records of patch statuses would need to be maintained in the database.
Patch Assessments Completed
This section of the report is calculated using the following formula:
reporting range (hours) / smallest detection frequency among all patch agent policies * number of devices with a patch agent policy
The result is the number of patch assessments that should have been completed based on the current patch configuration (a worked example follows the limitations below). This approach has the following limitations:
- The detection frequency and the number of patch-configured devices are based on the current configuration, so historical changes (number of devices, detection frequency, and so on) are not reflected in the report. Historical figures are therefore inaccurate.
- If different devices have different detection frequencies, this further affects the calculation: because the smallest detection frequency among all policies is applied to every device, the result shows the theoretical maximum number of patch assessments rather than the actual number.
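As a worked example of the formula above, using invented numbers purely for illustration:

```python
# Hypothetical example of the "Patch Assessments Completed" calculation.
# All values are made up for illustration only.
reporting_range_hours = 7 * 24           # a one-week reporting period
smallest_detection_frequency_hours = 24  # most frequent patch agent policy
devices_with_patch_policy = 50

expected_assessments = (
    reporting_range_hours / smallest_detection_frequency_hours
) * devices_with_patch_policy

print(expected_assessments)  # 350.0 expected assessments for the period
```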
Top 5 Server Disk Utilization and Top 5 Workstation Disk Utilization
These graphs report the current disk utilization using the FixedDiskAsset table, which only provides current data and no historical data. Therefore, if the Executive Summary is executed on different days but with the same reporting time range, these charts can give different results.
Asset Overview
As with the disk utilization information, historical data is not kept for assets, so this is another section of the report that provides only a current view.
Alert Summary, Alert Turnaround, Top 5 Devices by Number Of Alerts
Similar to the patch status records, the alert records only track the latest state of the alert and the timestamp of when that state was set; historical data regarding past alert states is not kept. The alert record also keeps a timestamp of when the alert was initially raised.
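A minimal sketch of the alert data available for reporting, using hypothetical field names that mirror the description above:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AlertRecord:
    # Hypothetical fields; only these facts are retained, so intermediate
    # state transitions between raised and the latest state cannot be
    # reconstructed for a historical report.
    latest_state: str          # e.g. "raised", "acknowledged", "cleared"
    state_changed_at: datetime # when the latest state was set
    raised_at: datetime        # when the alert was initially raised
```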