USAID’s ADS 220.127.116.11 states that the purpose of data quality assessments is to ensure that the USAID mission/office and assistance objective teams are aware of (1) the strengths and weaknesses of the data, as determined by applying appropriate quality standards, and (2) the extent to which data integrity can be trusted to influence management decisions.
The ADS also states that data reported to USAID/Washington in compliance with the Government Performance and Results Act of 1993 (Public Law 103-62), or reported externally on USAID performance, must have undergone a data quality assessment within the three years before submission.
USAID’s Performance Management Toolkit elaborates on this statement, adding that missions should determine whether there are procedures to (1) ensure that data are free of significant error or bias; (2) periodically review data collection, maintenance, and processing; and (3) provide for periodic sampling and quality assessment of data.
To assess the quality of partner data, the Toolkit recommends periodically sampling and reviewing partner data to ensure completeness, accuracy, and consistency and determining whether the partner appropriately addressed known data quality problems.
To monitor whether implementation is on track toward expected results, missions can use field visits, data from other sources, and independent surveys or evaluations to ensure acceptable data quality. According to the ADS, missions should assess whether reports accurately reflect performance in the field. All assessments should be documented and made available.
Complete data quality assessments for program indicators in accordance with Automated Directives System requirements.
Source: Audit Report No. 4-674-10-005-P, May 12, 2010
These recommendations are derived from audit reports issued by USAID's Office of Inspector General. The source cited above refers to the audit report, which is available on this site as part of the Audit Database Project, an educational tool for compliance with USAID regulations. Please see this site's disclaimer before using these recommendations.
- Indicators to Measure Higher Level Results - ADS 203.3
- Program’s Indicators, Targets and Goals Should Be Revisited - ADS 203
- Reported Results Did Not Meet Data Quality Standards - ADS 203.3
- Baseline Data, Indicators, and Targets Needed to Measure Progress and Achievement - ADS Chapter 203
- Indicators Do Not Effectively Measure Program Impact - ADS 18.104.22.168 - ADS 200.2.b
- Performance Data Lacked Support - ADS 203.3
- Performance Indicators and Targets Did Not Facilitate Program Management - ADS 203.3
- Setting Performance Targets for Partners - ADS 203
- Thorough Site Visits Not Conducted - ADS 202.3 - Performance Management Toolkit
- Performance Management Plan Not Completed - ADS 203.3 - Performance Management Toolkit
- Performance Indicator Definitions Not Consistently Applied - ADS 203.3 - GAO Standards for Internal Control in the Federal Government
- Reported Results Were Not Verified - ADS 22.214.171.124 - ADS 202.3.6
- Partner Implementation Plans Lacked Vital Information - ADS 200.6
- Lack of a Current Performance Management Plan (PMP) - ADS 203.3.3
- Thorough Site Visits Were Not Conducted - ADS 126.96.36.199 - ADS 202.3.6
- Data Quality Assessments Were Not Completed - ADS 188.8.131.52 - Government Performance and Results Act of 1993
- Performance Management Plan Was Not Completed - ADS 203.3.3
- Improve Data Quality and Program Monitoring - ADS 184.108.40.206 - ADS 220.127.116.11.b
- Performance Management Plans Were Not Approved - ADS 203.3.3