2017 Federal Index


Administration for Children and Families (HHS)

Score
8
Leadership

Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY17? (Example: Chief Evaluation Officer)

  • ACF’s Deputy Assistant Secretary for Planning, Research, and Evaluation oversees its Office of Planning, Research, and Evaluation (OPRE) and supports evaluation and other learning activities across the agency. ACF’s budget for research and evaluation in FY17 is approximately $165 million. ACF’s evaluation policy gives the OPRE Deputy Assistant Secretary “authority to approve the design of evaluation projects and analysis plans; and…authority to approve, release and disseminate evaluation reports.” OPRE’s staff of 44 includes experts in research and evaluation methods as well as in ACF programs, policies, and the populations they serve. OPRE engages in ongoing collaboration with program office staff and leadership to interpret research and evaluation findings and to identify their implications for programmatic and policy decisions. OPRE also provides written summaries of emerging findings and discusses their implications with agency and program leadership.
  • While OPRE oversees most of ACF’s evaluation activity and provides overall coordination, some ACF program offices also sponsor evaluations. ACF’s evaluation policy states, “In order to promote quality, coordination and usefulness in ACF’s evaluation activities, ACF program offices will consult with OPRE in developing evaluation activities. Program offices will discuss evaluation projects with OPRE in early stages to clarify evaluation questions and methodological options for addressing them, and as activities progress OPRE will review designs, plans, and reports. Program offices may also ask OPRE to design and oversee evaluation projects on their behalf or in collaboration with program office staff.”
Score
9
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and research/learning agenda(s) and did it publicly release the findings of all completed evaluations in FY17?

  • ACF’s evaluation policy addresses the principles of rigor, relevance, transparency, independence, and ethics and requires ACF program, evaluation, and research staff to collaborate. For example, the policy states, “ACF program offices will consult with OPRE in developing evaluation activities,” and, “There must be strong partnerships among evaluation staff, program staff, policy-makers and service providers.” ACF established its Evaluation Policy in November 2012 and published it in the Federal Register in August 2014.
  • ACF’s Office of Planning, Research, and Evaluation (OPRE) proposes an evaluation plan to the Assistant Secretary each year in areas in which Congress has provided authority and funding to conduct research and evaluation.
  • ACF’s annual portfolio reviews describe recent work and ongoing learning agendas in the areas of family self-sufficiency, child and family development, and family strengthening, including work related to child welfare, child care, Head Start, Early Head Start, strengthening families, teen pregnancy prevention and youth development, home visiting, self-sufficiency, welfare and employment. Examples include findings from Head Start CARES; the BIAS project; multiple reports from the first nationally representative study of early care and education in over 20 years; early findings on the Maternal, Infant and Early Childhood Home Visiting program; and a report on challenges and opportunities in using administrative data for evaluation.
  • ACF’s evaluation policy requires that “ACF will release evaluation results regardless of findings…Evaluation reports will present comprehensive findings, including favorable, unfavorable, and null findings. ACF will release evaluation results timely – usually within two months of a report’s completion.” ACF has publicly released the findings of all completed evaluations to date. In 2016, OPRE released nearly 100 publications.
Score
7
Resources

Did the agency invest at least 1% of program funds in evaluations in FY17? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance, and capacity-building)

  • In FY17, ACF plans to spend $162 million on evaluations, representing about 0.3% of its $56.5 billion budget (in addition to investments in evaluations by ACF grantees). This spending includes a $30 million increase in welfare research funds appropriated by Congress. The amount of ACF’s spending on evaluation is largely determined by Congress. For example, in FY17, Congress designated 0.33% of the Temporary Assistance for Needy Families Block Grant for research, evaluation, and technical assistance.
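  For readers who want to check the math, the short sketch below reproduces the roughly 0.3% figure from the amounts cited above; the figures and variable names are illustrative only, not an official calculation.

    # Minimal sketch: evaluation spending as a share of ACF's total FY17 budget.
    # Figures are the ones cited above; variable names are illustrative.
    evaluation_spending = 162_000_000        # $162 million planned for evaluations
    total_budget = 56_500_000_000            # $56.5 billion total ACF budget
    share = evaluation_spending / total_budget
    print(f"Evaluation share of FY17 budget: {share:.1%}")  # -> 0.3%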
Score
8
Performance Management / Continuous Improvement

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY17? (Example: Performance stat systems)

  • ACF’s performance management framework focuses on outcomes and aims for coordinated and results-oriented management and operations across all ACF programs.
  • ACF aims to develop performance measures that are meaningful and can be used by program managers, leadership, outside stakeholders, and Congress to assess and communicate progress. Results for these metrics are reported annually in the ACF Congressional Budget Justification. ACF reports on 150 performance measures (90 outcome measures and 60 output measures) in the FY18 Congressional Budget Justification.
  • ACF is an active participant in the HHS Strategic Review process, an annual assessment of progress on key performance measures. In addition, ACF participated in data-driven reviews as part of the Agency Priority Goal process, including face-to-face meetings between agency and Department leadership to examine performance. Finally, individual ACF programs regularly analyze and use performance data, administrative data, and evaluation data to improve performance. During 2017, ACF is participating in the development of HHS’s FY 2018–2022 strategic plan, which will include ACF-specific objectives.
Score
9
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) federal, state, and local programs in FY17? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies)

  • ACF has made numerous administrative and survey datasets publicly available for secondary use, such as data from the National Survey of Early Care and Education, Child Care and Development Fund, National Survey of Child and Adolescent Well-Being, and Adoption and Foster Care Analysis and Reporting System, among many other examples.
  • ACF’s Interoperability Initiative supports data sharing by developing standards and tools that are reusable across the country and by addressing common privacy and security requirements to mitigate risk. In 2016, ACF established a new Division of Data and Improvement that provides federal leadership and resources to improve the quality, use, and sharing of data. ACF has developed resources such as the National Human Services Interoperability Architecture, which proposes a framework to facilitate information sharing, improve service delivery, prevent fraud, and provide better outcomes for children and families; an Interoperability Toolkit to help state human services agencies connect with their health counterparts; and a Confidentiality Toolkit that supports state and local efforts by explaining the rules governing confidentiality in ACF and certain related programs, by providing examples of how confidentiality requirements can be addressed, and by including sample Memoranda of Understanding and data sharing agreements.
  • Several ACF divisions have also been instrumental in supporting cross-governmental efforts, such as the National Information Exchange Model (NIEM), which enables human services agencies to collaborate with health, education, justice, and many other constituencies that play a role in the well-being of children and families.
  • ACF’s National Directory of New Hires (NDNH) has entered into data sharing agreements with numerous agencies. For example, the Department of Labor’s Chief Evaluation Office and Employment and Training Administration have interagency agreements with HHS-ACF to share and match earnings data for nine formal net impact evaluations. The NDNH Guide for Data Submission describes an agreement with the Social Security Administration to use its network for data transmission. ACF also administers the Public Assistance Reporting Information System, a platform for exchanging data on benefit receipt across ACF, Department of Defense, and Veterans Affairs programs. This platform entails data sharing agreements among the three federal agencies and between ACF and state agencies.
Score
9
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding decisions and did it disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY17? (Example: What Works Clearinghouses)

  • ACF has established a common evidence framework adapted for the human services context from the framework for education research developed by the U.S. Department of Education and the National Science Foundation. The ACF framework, which includes the six types of studies delineated in the ED/NSF framework, aims to (1) inform ACF’s investments in research and evaluation, and (2) clarify for potential grantees and others the expectations for different types of studies.
  • ACF maintains an online clearinghouse of evidence reviews of human services interventions. These reviews rate the quality of evaluation studies using objective standards that are vetted by technical experts, applied by trained, independent reviewers, and similar to those used by other agencies, such as the U.S. Department of Education’s What Works Clearinghouse and the U.S. Department of Labor’s CLEAR. The clearinghouse presents the results of the reviews in a searchable format, along with comprehensive details about the review standards and process. Reviews to date have covered teen pregnancy prevention; home visiting; marriage education and responsible fatherhood; and employment and training; they include both ACF-sponsored and other studies.
Score
8
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY17? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with strong evaluation requirements)

  • ACF’s Behavioral Innovations to Advance Self-Sufficiency (BIAS) project is the first major effort to apply a behavioral economics lens to programs that serve poor families in the U.S. Since its inception in 2010, the project has conducted 15 rapid-cycle randomized tests of behavioral innovations in seven states with nearly 100,000 sample members.
  • ACF’s Behavioral Interventions for Child Support Services (BICS) demonstration project is applying behavioral insights to child support contexts, developing promising behavioral interventions, and building a culture of regular, rapid-cycle evaluation and critical inquiry within the child support community.
  • ACF has actively participated in the HHS IDEA Lab, an entity launched within HHS in 2013 to invest in internal innovation, leverage external innovation, and build collaborative communities to tackle cross-cutting issues of strategic importance. Recent projects include the ACF Administration for Native Americans’ Application Toolkit and DataQuest: Making ACF Native Data Visible and Useful, the ACF Office of Family Assistance’s Understanding Temporary Assistance for Needy Families Through Data Visualization, and the ACF Office of Head Start’s Partnership Alignment Information Response System.
Score
7
Use of Evidence in 5 Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest competitive grant programs in FY17? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring; Pay for Success provisions)

  • In FY17, ACF’s 5 largest competitive grant programs are: 1) Head Start ($8.6 billion); 2) Unaccompanied Children Services ($1.4 billion); 3) Early Head Start-Child Care Partnerships ($0.6 billion); 4) Transitional and Medical Services ($0.5 billion); and 5) Preschool Development Grants ($0.25 billion).
  • ACF’s template (see p. 14 in Attachment C) for grant announcements includes two options, requiring grantees to either 1) collect performance management data that contributes to continuous quality improvement and is tied to the project’s logic model, or 2) conduct a rigorous evaluation for which applicants must propose an appropriate design specifying research questions, measurement and analysis.
  • In FY12, ACF established the Head Start Designation Renewal System, which requires Head Start ($8.6 billion in FY17) grantees to compete for continued funding if they fail to meet criteria related to service quality, licensing and operations, and fiscal and internal controls.
  • ACF’s Personal Responsibility Education Program ($69.8 million in FY17) includes three individual discretionary grant programs that competitively fund evidence-based programs teaching youth about abstinence and contraception to prevent pregnancy and sexually transmitted infections.
  • To receive funds through ACF’s Community-Based Child Abuse Prevention (CBCAP) program ($39.6 million in FY17), states must “demonstrate an emphasis on promoting the increased use and high quality implementation of evidence-based and evidence-informed programs and practices.” CBCAP defines evidence-based and evidence-informed programs and practices along a continuum with four categories: Emerging and Evidence-Informed; Promising; Supported; and Well Supported. Programs determined to fall within specific program parameters are considered “evidence-informed” or “evidence-based” practices (EBPs), as opposed to programs that have not been evaluated using any set criteria. ACF monitors progress on the percentage of program funds (most recently 61.1% in FY15) directed toward evidence-based and evidence-informed practices.
Score
7
Use of Evidence in 5 Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest non-competitive grant programs in FY17? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

  • In FY17, ACF’s 5 largest non-competitive grant programs are: 1) Temporary Assistance for Needy Families ($17.3 billion); 2) Child Care and Development Fund (Block Grant and Entitlement to States combined) ($5.8 billion); 3) Foster Care ($5.3 billion); 4) Child Support Enforcement Payments to States ($4.2 billion); and 5) Low Income Home Energy Assistance ($3.4 billion).
  • ACF’s Foster Care program ($5.3 billion in FY16) has approved over 30 jurisdictions to develop and implement child welfare waiver demonstration projects to improve outcomes for children in foster care or at risk for entry or re-entry into foster care. Through these demonstrations, ACF waives provisions of law to allow flexible use of funding normally limited to foster care for other child welfare services. Many participating jurisdictions are implementing evidence-based or evidence-informed interventions and all demonstration projects are required to have a rigorous evaluation conducted by a third-party evaluator. Although ACF does not currently have statutory authority to grant new waivers, current projects are expected to continue through September 30, 2019. General information on this program, including a fact sheet and summary of relevant legislation/policy, is available at the Children’s Bureau portal.
Score
8
Repurpose for Results

In FY17, did the agency shift funds away from or within any practice, program, or policy that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; proposing the elimination of ineffective programs through annual budget requests)

  • The Head Start Designation Renewal System requires Head Start ($8.6 billion in FY17) grantees to compete for continued funding if they fail to meet criteria related to service quality, licensing and operations, and fiscal and internal controls. The 2007 Head Start Reauthorization Act made all Head Start grants renewable five-year grants. At the end of each five-year term, grantees running high-quality programs have their grants renewed, while grantees that fall short of standards must compete to renew their grants. Grantees must also compete if their ratings on any of the three domains of the Classroom Assessment Scoring System, an assessment of adult-child interactions linked to improved outcomes, fall below a certain threshold or in the lowest 10 percent of all grantees.
  • ACF, in collaboration with the HHS Health Resources and Services Administration, has established criteria for evidence of effectiveness of home visiting models and oversees the Home Visiting Evidence of Effectiveness Review (HomVEE), which determines whether models have evidence of effectiveness. To date, HomVEE has reviewed evidence on 45 home visiting models and determined 20 of them to have evidence of effectiveness. Grantees must use at least 75% of their federal home visiting funds to implement one or more of these models.
  • ACF’s FY18 budget request proposes to eliminate the Community Services Block Grant and the Social Services Block Grant (see pp. 144 and 348).