2017 Federal Index

Performance Management / Continuous Improvement

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY17? (Example: Performance stat systems)

Administration for Children and Families (HHS)
  • ACF’s performance management framework focuses on outcomes and aims for coordinated and results-oriented management and operations across all ACF programs.
  • ACF aims to develop performance measures that are meaningful and can be used by program managers, leadership, outside stakeholders, and Congress to assess and communicate progress. Results for these metrics are reported annually in the ACF Congressional Budget Justification. ACF reports on 150 performance measures (90 outcome measures and 60 output measures) in the FY18 Congressional Budget Justification.
  • ACF is an active participant in the HHS Strategic Review process, an annual assessment of progress on key performance measures. In addition, ACF participated in data-driven reviews as part of the Agency Priority Goal process, including face-to-face meetings between agency and Department leadership to examine performance. Finally, individual ACF programs regularly analyze and use performance data, administrative data, and evaluation data to improve performance. During 2017 ACF is participating in the development of HHS’s FY 2018–2022 strategic plan, which will include ACF-specific objectives.
Corporation for National and Community Service
  • CNCS’s performance management framework is described in the Congressional Budget Justification for Fiscal Year 2016 (p.3) and Fiscal Year 2017 (p.6).
  • CNCS has a focused set of Agency-Wide Priority Measures derived from the 2011-2015 Strategic Plan. Every CNCS Program contributes to the Agency-Wide Priority Measures. There are also specific grantee/sponsor measures that roll up into the Agency-Wide Priority Measures, which can be found in the Agency-Wide Priority Measures chart. Grantees are required to select at least one national performance measure, and they are required to report performance measures data annually. CNCS encourages grantees to use these measures for continuous program improvement. CNCS uses the agency-wide priority measures to assess its own progress toward attaining the goals and objectives of its strategic plan.
  • New metrics (“Fast Facts”) were developed in FY17 to provide stakeholders, including government representatives and the public, with clear measures of how and to what degree CNCS programs get things done for America. Taken together, the Fast Facts provide a meaningful picture of the agency’s core activities and investments, aligned with the CNCS mission. They are expected to be released for the first time in FY17.
  • Additionally, CNCS produces state profile reports, which provide a picture of agency resources in each state at a given point. These reports contain a number of priority indicators, including the number of participants engaged in national service activities as well as the amount of non-CNCS resources generated by the agency’s programs. Along with its stakeholders, CNCS uses this information to understand the capacity of service available in different geographic regions and discuss related implications with key service partners.
  • CNCS’s Chief Operating Officer (COO) has been piloting a proof of concept performance management framework that aligns with GPRA since the fourth quarter of FY16, which is notable since CNCS is not subject to GPRA requirements. The COO pilot will help inform an update to the agency’s strategic plan and performance framework as part of its response to Executive Order 13781 and related Administration guidance to federal agency heads, which focus on making agencies more efficient, effective, and accountable.
Millennium Challenge Corporation
  • MCC monitors progress towards compact results on a quarterly basis using performance indicators that are specified in the Compact M&E Plans. The M&E Plans specify indicators at all levels (process, output, and outcome) so that progress towards final results can be tracked. Every quarter each country partner submits an Indicator Tracking Table (ITT) that shows actual performance of each indicator relative to the baseline level that was established before the activity began and the performance targets that were established in the M&E Plan. Some of the key performance indicators and their accompanying data by country are publicly available. MCC reviews this data every quarter to assess whether results are being achieved and integrates this information into project management decisions.
  • MCC also supports the creation of multidisciplinary ‘compact development teams’ to manage the development and implementation of each Compact program. Teams usually include a coordinator; an economist; specialists in private sector development, social inclusion and gender integration, project-specific technical areas, M&E, and environmental and social performance; and legal, financial management, and procurement specialists. From the earliest stages, these teams develop project logics and M&E frameworks supported by data and evidence, and use them to inform the development of the projects within each Compact. Teams meet frequently to gather evidence, discuss progress, make project design decisions, and solve problems, and they are encouraged to use lessons from completed evaluations to inform their work going forward.
  • MCC is in the process of implementing a new reporting system that will enhance MCC’s credibility around results, transparency, and accountability. The “Star Report” captures key information that will provide a framework for results and improve the reporting process for compact lifecycles. For each compact, systematic evidence will be collected on performance indicators, evaluation results, partnerships, sustainability efforts, and learning, among other elements; and critically, this information will all be available in one report. Through this new reporting system, MCC will be able to provide better reporting of compact performance to public audiences, such as Congress, other development agencies, and the academic community. Each compact will have a Star Report published roughly three months after completion. The next MCC compact to close under this new system will be Cabo Verde in November 2017. 

  • MCC hosts regular “colleges” in which MCC counterparts from partnering countries are invited to a weeklong set of meetings and workshops to discuss best practices, strengthen collaboration, and improve strategies for effectively implementing projects.
Substance Abuse and Mental Health Services Administration
  • In 2016, SAMHSA’s Office of Financial Resources (OFR) established a Program Integrity Review Team (PIRT) staffed by representatives from each of its four Centers and managed by OFR. On a quarterly basis, three SAMHSA discretionary grant portfolios (one from each of the three program Centers) conduct a self-analysis to examine grantee performance based on objective performance data, financial performance, and other factors. Program staff present their program self-assessments to the PIRT and receive feedback on, for example, targets of concern. In one instance, grantees were surpassing their targets by 200-300%, prompting the PIRT to suggest that the targets be re-examined as appropriate for these high-performing grantees. In addition, the Centers have historically managed internal performance review boards to periodically review grantee performance and provide corrective actions as needed.
  • A new unified data collection system, SAMHSA’s Performance Accountability & Reporting Systems (SPARS), was put into place in early 2017. Historically, the three program Centers had independent data collection systems that did not allow for global reviews of agency activities. The new system allows for greater transparency about grantee performance across Centers. SAMHSA aligns program objectives and measures through its use of SPARS, SAMHSA’s online data entry, reporting, technical assistance request, and training system for grantees to report timely and accurate data. SPARS is a mechanism by which SAMHSA meets requirements of the Government Performance and Results Act (GPRA) of 1993 and the GPRA Modernization Act of 2010.
  • SAMHSA’s strategic plan Leading Change 2.0: Advancing the Behavioral Health of the Nation 2015–2018 outlines six strategic initiatives for fiscal years 2015-2018. These initiatives are linked to SAMHSA’s policy, programmatic, and financial planning and support innovative, evidence-based practices that show promising results, drawing on the best expertise and knowledge around treatment and prevention of mental health and behavioral challenges for a variety of stakeholders (see p. 5).
  • Pursuant to the 21st Century Cures Act, SAMHSA is required to establish standards for grant programs that, among other factors, address the extent to which grantees must collect and report on required performance measures, and SAMHSA must advance the use of performance metrics recommended both by the Assistant Secretary for Planning and Evaluation (ASPE) (Sec. 6002, pp. 464-465) and the Director of CBHSQ (Sec. 6004, p. 470). In addition, SAMHSA’s Chief Medical Officer is required to coordinate with ASPE to assess the use of performance metrics in evaluation activities, and coordinate with the Assistant Secretary to ensure programs consistently utilize appropriate performance metrics and evaluation designs (Sec. 6003, p. 468). The Assistant Secretary must also submit a biennial report to Congress that assesses the extent to which its programs and activities meet goals and appropriate performance measures (Sec. 6006, p. 477).
U.S. Agency for International Development
  • USAID partners with the U.S. Department of State to jointly develop and implement clear strategic goals and objectives. Indicators measuring progress on strategic goals and objectives from across the Agency are collected, in part, through the Performance Plan Report (PPR) and reported through the Annual Performance Report (APR). In FY2016, USAID and the U.S. Department of State redesigned their annual reporting system, FACTS Info, and related processes to improve data collection and ease of data use for continuous improvement.
  • USAID’s Performance Improvement Officer (PIO) leads Agency efforts to use data for decision-making and improve performance and operational efficiency and effectiveness. Angelique M. Crumbly serves as the USAID Performance Improvement Officer. The PIO coordinates tracking of Cross Agency Priority (CAP) and Agency Priority Goal (APG) progress; leverages performance management reviews to conduct deep-dives into evidence; and oversees business process reviews and other assessments to ensure that the Agency more efficiently and effectively achieves its mission and goals.
  • USAID’s strategic plan, annual performance plan and report, and other performance reports are publicly available. USAID and the U.S. Department of State are in the process of developing the Joint Strategic Plan for FY18 – FY22.
  • USAID is actively engaged in the President’s Management Council which is developing the next generation of CAP goals. USAID reports on APG and CAP goals on www.performance.gov. These goals help the Agency improve performance, efficiency, and effectiveness, while holding the Agency accountable to the public. USAID assesses progress and challenges toward meeting the goals quarterly during data-driven reviews with Agency leadership. USAID will develop new APGs as part of the FY 2018-2022 Joint Strategic Plan.
  • USAID field missions develop Country Development Cooperation Strategies (CDCS) with clear goals and objectives and a performance management plan that identifies expected results, performance indicators to measure those results, plans for data collection and analysis, and periodic review of performance measures to use data and evidence to adapt programs for improved outcomes.
  • In addition to measuring program performance, USAID measures operations performance management to ensure that the Agency achieves its development objectives and aligns resources with priorities.
U.S. Department of Education
  • ED develops a four-year strategic plan and holds quarterly data-driven progress reviews of the goals and objectives established in the plan, as required by the Government Performance and Results Act Modernization Act (GPRAMA) of 2010. ED’s FY14-18 Strategic Plan includes a goal on the continuous improvement of the United States education system with objectives focused on enhancing the use of data, research, evaluation, and technology (see pp. 37-43 of the Strategic Plan). GPRAMA also requires agencies to develop agency priority goals (APGs) and submit information on those goals to OMB on a quarterly basis. APGs reflect the top near-term performance priorities that agency leadership aims to accomplish within a two-year period. ED established an APG on enabling evidence-based decision-making. Strategic objective 5.3 in the Department’s current four-year strategic plan, which is part of the continuous improvement goal referenced above, includes the metrics for the evidence APG. The Department’s FY 2016 Annual Performance Report and FY 2018 Annual Performance Plan include the FY 2016 performance results for the APG and strategic objective 5.3. Although many of the metrics in the strategic plan are annual, the Department uses the quarterly reviews to discuss available data and milestones achieved.
  • In addition, ED has emphasized continuous improvement in evidence-based decision-making among States and districts. In 2016, ED released non-regulatory guidance, Using Evidence to Strengthen Education Investments, which recommends a five-step decision-making process to promote continuous improvement and support better outcomes for students. This guidance has served as a framework for ED’s technical assistance related to implementation of ESSA’s evidence provisions, such as the State Support Network’s community of practice on evidence-based practices that supports 11 states with selection of interventions. ED has conducted outreach to build awareness of the guidance with stakeholder groups. In addition, ED included tailored guidance for these five steps in its guidance on Title II, Part A, and Title IV of the ESEA. These resources supplement ED’s substantial evidence-focused technical assistance efforts, such as:
    • Regional Educational Laboratories work in partnership with policymakers and practitioners in their regions to evaluate programs and to use evaluation findings and other research to improve academic outcomes for their students.
    • Comprehensive Centers provide support to States in planning and implementing interventions through coaching, peer-to-peer learning opportunities, and ongoing direct support.
    • The State Implementation and Scaling Up of Evidence-Based Practices Center provides tools, training modules, and resources on implementation planning and monitoring.
U.S. Dept. of Housing & Urban Development
  • HUD conducts regular data-driven performance reviews—“HUDStat” meetings—that focus on quarterly progress toward achieving each of HUD’s priority goals. The HUD Secretary and senior leadership from throughout the agency, and sometimes from partner agencies, attend these meetings to address challenges, review metrics, improve internal and external collaboration, and increase performance. A new strategic framework is being developed in FY17 as provided by OMB Circular A-11 (see section 200.23). HUD documents alignment between strategic goals and supporting objectives and metrics in the Annual Performance Plan and Annual Performance Report, and identifies the staff assigned lead responsibility for each objective.
U.S. Department of Labor
  • DOL’s Performance Management Center (PMC) is responsible for the Department’s extensive performance management system, which includes over 400 measures whose results are reviewed quarterly by Department leadership. PMC leads the Department’s Continuous Process Improvement (CPI) Program which supports agencies in efforts to gain operational efficiencies and improve performance. The program directs customized process improvement projects throughout the department and grows the cadre of CPI practitioners through Lean Six Sigma training.
  • PMC’s activities are intended to improve DOL’s program performance through data-driven analysis, sharing best practices, and implementing activities associated with the Government Performance and Results Modernization Act of 2010 (GPRAMA). Using a PerformanceStat-type reporting and dashboard system, PMC coordinates quarterly meetings between the Deputy Secretary and each agency head, to review performance results contributing to DOL’s strategic goals, to make commitments related to performance improvement, and to follow up on the progress of previous performance improvement commitments. PMC also oversees the Strategic Planning process and analyzes performance data in collaboration with agencies to achieve continuous performance improvement. DOL’s Chief Evaluation Office (CEO) actively participates in the quarterly performance reviews to incorporate findings from evaluations as appropriate.
  • One of the most important roles that DOL’s CEO plays is to facilitate the interaction between program and evaluation analysts, and performance management and evaluation. Learning agendas updated annually by DOL agencies in collaboration with DOL’s CEO include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The quarterly reviews with leadership routinely include specific discussions about improving performance and findings from recent evaluations that suggest opportunities for improvement.
  • To promote the use of evidence-based strategies, DOL’s Employment and Training Administration (ETA) also continues to manage the Workforce Systems Strategies website, which identifies a range of potential strategies informed by research evidence and peer exchanges to support grantees in providing effective services to customers.

Visit Results4America.org