2017 Federal Index


U.S. Department of Education

Score
8
Leadership

Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY17? (Example: Chief Evaluation Officer)

  • ED’s Institute of Education Sciences (IES), with a budget of $605.3 million in FY17, has primary responsibility for education research, evaluation, and statistics. The IES Director is appointed by the President and confirmed by the U.S. Senate, and advises the U.S. Education Secretary on research, evaluation, and statistics activities. Four Commissioners support the IES Director, including the Commissioner for the National Center for Education Evaluation and Regional Assistance (NCEE), who is responsible for planning and overseeing ED’s major evaluations. IES employed approximately 180 full-time staff in FY17, including about 25 in NCEE.
  • The Office of Planning, Evaluation, and Policy Development’s (OPEPD) Policy and Program Studies Service (PPSS) has a staff of 20 and serves as the Department’s internal analytics office. PPSS conducts short-term evaluations to support continuous improvement of program implementation and works closely with program offices and senior leadership to inform policy decisions with evidence. While some evaluation funding – such as that for Special Education Studies and Evaluations – is appropriated to IES ($10.8 million in FY17), most evaluations are supported by funds appropriated to ED programs. NCEE and PPSS staff work closely with program offices to design program evaluations that reflect program priorities and questions. IES and PPSS provide regular briefings on results to help ensure information can be used by program offices for program improvement.
  • IES and PPSS staff collaborate closely through ED’s Evidence Planning Group (EPG) with other senior staff from OPEPD, including Budget Service, and from the Office of Innovation and Improvement (OII). EPG supports programs and advises Department leadership on how evidence can be used to improve Department programs. EPG has coordinated, for example, the development of revised evidence definitions and related selection criteria for competitive grant programs that align with the Elementary and Secondary Education Act (ESEA), as amended by the Every Student Succeeds Act (P.L. 114-95) (ESSA). EPG has also facilitated cross-office alignment of evidence investments in technical assistance and the pooling of program funds for evaluations.
  • Senior officials from IES, OPEPD, and OII are part of ED’s leadership structure. Officials from OPEPD and OII weigh in on major policy decisions. OPEPD leadership plays a leading role in forming the Department’s annual budget requests, recommending grant competition priorities (including evidence priorities), and providing technical assistance to Congress to ensure that evidence informs policy design.
Score
8
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and research/learning agenda(s) and did it publicly release the findings of all completed evaluations in FY17?

  • ED has a scientific integrity policy to ensure that all scientific activities (including research, development, testing, and evaluation) conducted and supported by ED are of the highest quality and integrity, can be trusted by the public, and contribute to sound decision-making. The policy is publicly available on ED’s website.
  • IES and PPSS, in concert with the EPG, work with program offices and ED leadership to develop ED’s annual evaluation plan. This plan is implemented through ED’s annual spending plan process.
  • In addition, IES prepares and submits to Congress a biennial, forward-looking evaluation plan covering all mandated and discretionary evaluations of education programs funded under the ESEA, as amended by ESSA. IES and PPSS work with programs to understand their priorities, design appropriate studies to answer the questions being posed, and share results from relevant evaluations to help with program improvement. This plan serves as a research and learning agenda for ED.
  • ED’s FY 2016 Annual Performance Report and FY 2018 Annual Performance Plan includes a list of ED’s current evaluations, organized by subject matter area. IES publicly releases findings from all of its completed, peer-reviewed evaluations on the IES website and also in the Education Resources Information Center (ERIC). IES announces all new evaluation findings to the public via a Newsflash and through social media. Finally, IES regularly conducts briefings on its evaluations for ED, the Office of Management and Budget, Congressional staff, and the public.
  • Finally, IES manages the Regional Educational Laboratory (REL) program, which supports districts, states, and boards of education throughout the United States in using research and evaluation in decision-making. Research priorities are determined locally, but IES approves the studies and reviews the final products. All REL studies are made publicly available on the IES website.
Score
7
Resources

Did the agency invest at least 1% of program funds in evaluations in FY17? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance, and capacity-building)

  • Overall spending on evaluation ($144 million in FY16) and evaluation technical assistance and capacity-building ($58 million in FY16) represents 0.4% of ED’s $45.6 billion discretionary budget (without Pell Grants) in FY16; the arithmetic is sketched after this list. (FY17 estimates are forthcoming.) This total includes impact studies, implementation studies, rapid cycle evaluations, evaluation technical assistance, and capacity-building funded through IES and OII.
  • ED spent about $144 million on program evaluations in FY16, including funding for evaluation projects under IES’ research grant programs. Year-to-year fluctuations in this amount reflect the timing of evaluation projects and the types of research projects proposed by the field, not a change in ED’s commitment to evaluation.
  • Much of ED’s investment in evaluation is directed toward supporting states and school districts so that they can conduct studies of their own education policies and programs. For example, IES runs annual grant competitions to support researcher-practitioner partnerships between state and local education officials and research institutions, including a new program for low-cost, short-duration evaluations. IES also provides large grants to states to develop longitudinal data systems for tracking students through the educational system and conducting state-sponsored evaluations. (To date, 47 states have received IES funding for this purpose.) Finally, the Regional Educational Laboratories provide extensive technical assistance on evaluation and support research alliances that conduct implementation and impact studies on education policies and programs in ten geographic regions of the U.S., covering all states, territories, and the District of Columbia.
  • OII emphasizes evaluation and evidence-building in a number of its grant programs by requiring grantees to conduct or participate in evaluations. This emphasis continues even for programs supported by an existing evidence base, as the evaluation design for these programs examines impact in new settings, with different populations, and under project-specific implementation. In many grant competitions, this evaluation criterion is reviewed and scored by evaluation experts familiar with the What Works Clearinghouse standards, ensuring proposed evaluation plans are of appropriate design and rigor. Additionally, a number of OII programs dedicate resources to evaluation technical assistance and program-level evaluations. The Investing in Innovation (i3) and Supporting Effective Educator Development programs use technical assistance contractors that support grantees in the design and implementation of their project-level evaluations, as well as the development of evaluation resources that are now being shared publicly on the What Works Clearinghouse and program websites.
  • IES and OII grantees are expected to make the results of their evaluations public through the Education Resources Information Center (ERIC) and other grant-funded dissemination activities. In addition, all impact evaluations funded by IES and OII are reviewed by the What Works Clearinghouse (WWC), which plays a major role in summarizing and disseminating findings from the most rigorous studies to ED and the broader field.
  • In addition, some programs encourage their grantees to conduct project-level evaluations. One of the key lessons from the Investing in Innovation program (i3) has been that high-quality technical assistance for grantees on project-level evaluations is critical to producing credible information on project outcomes. In FY16 i3 invested more than $4 million of its appropriation in evaluation technical assistance, and the Regional Educational Laboratories continued to provide States and districts with technical assistance on evaluations and data use – virtually no other competitive grant programs at ED have the authority or means to fund such a robust vehicle for technical assistance. ED is working to identify and align its technical assistance efforts on the use of evidence in education at the state and local level.
  • ED has an opportunity to significantly increase its annual investment in program evaluation through the reauthorized ESEA pooled evaluation authority, which makes available up to $40 million annually from ESEA programs funded by Congress that can be used to evaluate any ESEA program included in the biennial evaluation plan prepared by IES.
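The 0.4% figure cited above follows directly from the amounts stated in this section. A minimal sketch of the arithmetic, assuming the FY16 figures given in the list:

```python
# Rough arithmetic behind the ~0.4% figure cited above (FY16 amounts, in millions of dollars).
evaluation_spending = 144            # program evaluations
ta_and_capacity_building = 58        # evaluation technical assistance and capacity-building
discretionary_budget = 45_600        # ED discretionary budget, excluding Pell Grants

share = (evaluation_spending + ta_and_capacity_building) / discretionary_budget
print(f"Evaluation-related spending share: {share:.1%}")  # prints ~0.4%
```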
Score
8
Performance Management / Continuous Improvement

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY17? (Example: Performance stat systems)

  • ED develops a four-year strategic plan and holds quarterly data-driven progress reviews of the goals and objectives established in the plan, as required by the GPRA Modernization Act of 2010 (GPRAMA). ED’s FY14-18 Strategic Plan includes a goal on the continuous improvement of the United States education system, with objectives focused on enhancing the use of data, research, evaluation, and technology (see pp. 37-43 of the Strategic Plan). GPRAMA also requires agencies to develop agency priority goals (APGs) and submit information on those goals to OMB on a quarterly basis. APGs reflect the top near-term performance priorities that agency leadership aims to accomplish within a two-year period. ED established an APG on enabling evidence-based decision-making. Strategic objective 5.3 in the Department’s current four-year strategic plan, which is part of the continuous improvement goal referenced above, includes the metrics for the evidence APG. The Department’s FY 2016 Annual Performance Report and FY 2018 Annual Performance Plan includes the FY 2016 performance results for the APG and strategic objective 5.3. Although many of the metrics in the strategic plan are annual, the Department uses the quarterly reviews to discuss available data and milestones achieved.
  • In addition, ED has emphasized continuous improvement in evidence-based decision-making among States and districts. In 2016, ED released non-regulatory guidance, Using Evidence to Strengthen Education Investments, which recommends a five-step decision-making process to promote continuous improvement and support better outcomes for students. This guidance has served as a framework for ED’s technical assistance related to implementation of ESSA’s evidence provisions, such as the State Support Network’s community of practice on evidence-based practices that supports 11 states with selection of interventions. ED has conducted outreach to build awareness of the guidance with stakeholder groups. In addition, ED included tailored guidance for these five steps in its guidance on Title II, Part A, and Title IV of the ESEA. These resources supplement ED’s substantial evidence-focused technical assistance efforts, such as:
    • Regional Educational Laboratories work in partnership with policymakers and practitioners in their regions to evaluate programs and to use evaluation findings and other research to improve academic outcomes for their students.
    • Comprehensive Centers provide support to States in planning and implementing interventions through coaching, peer-to-peer learning opportunities, and ongoing direct support.
    • The State Implementation and Scaling Up of Evidence-Based Practices Center provides tools, training modules, and resources on implementation planning and monitoring.
Score
9
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) federal, state, and local programs in FY17? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies)

  • ED has several resources to support the collection, analysis, and use of high-quality data in ways that protect privacy. IES’ National Center for Education Statistics (NCES) serves as the primary federal entity for collecting and analyzing data related to education. Almost all of ED’s K-12 statistical and programmatic data collections are now administered by NCES via EDFacts. NCES also collects data through national and international surveys and assessments. Administrative institutional data and statistical sample survey data for postsecondary education are collected through NCES in collaboration with the Federal Student Aid Office (FSA). Some data are available through public access, while others are available only through restricted-use data licenses. ED’s Office for Civil Rights conducts the Civil Rights Data Collection (CRDC) on key education and civil rights issues in the nation’s public schools. Additionally, the Data Strategy Team helps coordinate data activities across the Department, while the Disclosure Review Board, the Family Policy Compliance Office (FPCO), the EDFacts Governing Board, and the Privacy Technical Assistance Center all help to ensure the quality and privacy of education data. NCES data are made publicly available online and can be located in the ED Data Inventory. In FY17, ED continued to maintain and grow the Data Inventory, ensuring that ED contact information is up to date and expanding the library to include additional years of existing data sets as well as new data sets. Additionally, ED is exploring ways to leverage revisions to a technical system so that data generated through the information collection approval process can populate new entries in the Data Inventory.
  • ED made concerted efforts to improve the availability and use of its data in FY17. With the release of the new 2016 College Scorecard, the Department now provides newly combined data in a tool that helps students choose a school that is well-suited to meet their needs, priced affordably, and consistent with their educational and career goals. The College Scorecard also promotes the use of open data by providing the underlying data in formats that researchers and developers can use. In fall 2017, the Department will update the Scorecard as part of its annual data refresh.
  • The College Scorecard effort is a model for future releases of education data and led to ED’s new initiative, InformED, which aims to improve the Department’s capacity to release data in innovative and effective ways and to improve public use of data. Through InformED, the Department has:
    • Secured hosting to enable the Office for Civil Rights, for the first time, to post the Civil Rights Data Collection (CRDC) dataset online for direct download.
    • Established critical infrastructure for a more robust Department API program and, through ED’s newly launched GitHub platform, provided developers with needed information and resources. This includes the creation and release of a new set of APIs providing developers with access to the CRDC and My Brother’s Keeper data (including student outcomes in high school, college, and beyond by race/ethnicity and gender); an illustrative sketch of this kind of API access appears at the end of this section.
    • Built an interactive data story template and used it to deliver rich and accessible data narratives around the CRDC. The Department’s June 2016 data story, Chronic Absenteeism in the Nation’s Schools, generated substantial attention among the field and press.
    • Revamped how users find the Department’s data resources by updating the agency’s data landing page to be more comprehensive and interactive.
  • ED also participated in the Opportunity Project initiative, now coordinated by the U.S. Department of Commerce. In 2016, ED joined the initiative’s federal agency cohort of projects and worked with external developers to support the development of three online tools focused on equity gaps:
    • Kitamba & Data Society built the Philadelphia School Community Resource Mapper to help school leaders identify and connect with opportunities for community partnerships.
    • GreatSchools & Education Cities used college readiness data from ED’s Civil Rights Data Collection (CRDC) to measure and display gaps in access to educational opportunities across student groups (to be released shortly).
    • LiveStories built a district- and school-level comparison tool, “LiveStories IQ,” and customizable data briefs using CRDC data to help local school districts and education foundations discover and share compelling data narratives.
  • In 2017, ED is supporting the development of a tool that helps homeless youth access resources, including education.
  • ED partnered with the U.S. Department of Housing and Urban Development to develop a resource supporting data-sharing between public housing agencies and school districts: Data Sharing Road Map: Improving Student Outcomes through Partnerships between Public Housing Agencies and School Districts.
  • Additionally, ED administers the Statewide Longitudinal Data System (SLDS) program ($32.3 million in FY17), which provides grants to states to develop their education-related data infrastructure and use these data for education improvement.
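To illustrate the kind of programmatic access the API program above enables, the following is a minimal, hypothetical sketch of calling a Department data API from Python. The endpoint URL, query parameters, and field names are placeholders, not ED’s documented API; consult the Developer Hub and GitHub resources referenced above for the actual specifications.

```python
# Hypothetical example only: the URL, parameters, and field names below are placeholders,
# not ED's actual API. See ED's Developer Hub / GitHub documentation for real endpoints.
import requests

BASE_URL = "https://api.example-ed-data.gov/crdc/v1/schools"  # placeholder endpoint
params = {"state": "PA", "year": 2013, "per_page": 100}       # placeholder query parameters

response = requests.get(BASE_URL, params=params, timeout=30)
response.raise_for_status()

for school in response.json().get("results", []):
    # Field names are assumptions for illustration; real schemas will differ.
    print(school.get("school_name"), school.get("enrollment"))
```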
Score
10
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding decisions and did it disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY17? (Example: What Works Clearinghouses)

  • ED’s evidence standards for its grant programs, as outlined in the Education Department General Administrative Regulations (EDGAR), build on ED’s What Works Clearinghouse™ (WWC) evidence standards. ED uses these same evidence standards in all of its discretionary grant competitions that use evidence to direct funds to applicants proposing to implement projects that have evidence of effectiveness and/or to build new evidence through evaluation (see Question #8 below for more detail). As noted above, EPG has coordinated the development of revised evidence definitions and related selection criteria for competitive programs that align with the Every Student Succeeds Act (ESSA) and streamline and clarify provisions for grantees. These revised definitions align with ED’s suggested criteria for States’ implementation of ESSA’s four evidence levels, included in ED’s non-regulatory guidance, Using Evidence to Strengthen Education Investments.
  • Additionally, in 2013 IES and the National Science Foundation issued a joint report describing six types of research studies that can generate evidence about how to increase student learning. These principles are based, in part, on the research goal structure and expectations of IES’s National Center for Education Research (NCER) and National Center for Special Education Research (NCSER). NCER and NCSER communicate these expectations through their Requests for Applications and webinars that are archived on the IES website and available to all applicants.
  • The What Works Clearinghouse™ (WWC) identifies studies that provide valid and statistically significant evidence of effectiveness of a given practice, product, program, or policy (referred to as “interventions”), and disseminates summary information and reports on the WWC website. The WWC has reviewed more than 10,000 studies, which are available in a searchable database, and has committed to reviewing all publicly available evaluation reports generated under i3 grants. In FY 2016, 37 i3 grant evaluation reports, containing 48 studies, were reviewed and included in the searchable database. In fall 2016, ED revised and enhanced the WWC website to make evidence easier to access, including through the “Find What Works” tool, which makes it easier to find relevant educational programs and interventions, and through improved navigation.
Score
8
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY17? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with strong evaluation requirements)

  • The Investing in Innovation (i3) program is ED’s signature innovation program for K–12 public education. i3 grants are focused on validating and scaling evidence-based practices and on encouraging innovative approaches to persistent challenges. “Development” grants, which fund these innovative approaches, are the most prevalent type of i3 grant, comprising 115 of the 172 i3 grants and 10 of the 15 new i3 grants made in FY16. i3’s successor program is the Education Innovation and Research (EIR) program, which published notices for the three tiers of grants under that program in December 2016 and will make awards in September 2017. One unique aspect of the EIR notices for the Mid-phase and Expansion tiers was the inclusion of a field-initiated innovation priority that allowed applicants to propose a project of their choosing, as long as it met the evidence requirement for that competition.
  • ED is participating in the Performance Partnership Pilots for Disconnected Youth initiative. These pilots give state, local, and tribal governments an opportunity to test innovative strategies to improve outcomes for low-income disconnected youth ages 14 to 24, including youth who are in foster care, homeless, young parents, involved in the justice system, unemployed, or who have dropped out or are at risk of dropping out of school.
  • ED is continuing to promote the use of data in innovative ways by engaging with developers. This includes launching a new Developer Hub and GitHub platform, which provide developers with needed information and resources, and the creation of new APIs. Additionally, ED continues to be an active participant in the Opportunity Project, which encourages the use of federal data for social good by providing a process for developers, data enthusiasts, policy leaders, and communities to co-create innovative tech solutions that expand opportunity.
  • The White House Social and Behavioral Sciences Team conducted several behavioral science studies related to ED’s work, including looking at the impact of text message reminders for students about key tasks related to college matriculation, such as completing financial aid paperwork, and about notices to student borrowers about income-driven repayment plans.
  • ED is currently implementing the Experimental Sites Initiative to assess the effects of statutory and regulatory flexibility for participating institutions disbursing Title IV student aid.
  • The IES Research Grants Program supports the development and iterative testing of new, innovative approaches to improving education outcomes. IES organizes its research grants around a goal structure; “Goal 2: Development and Innovation” supports the development of new education curricula; instructional approaches; professional development; technology; and practices, programs, and policies that are implemented at the student, classroom, school, district, state, or federal level to improve student education outcomes.
  • On behalf of ED, IES also administers the Small Business Innovation Research (SBIR) program, which competitively awards funding to small businesses that propose developing commercially viable education technology projects designed to support classroom teaching and student learning. Projects must go through an iterative development process and conduct research to determine promise of effectiveness.
  • ED has funded a number of tools in FY16 and FY17 to support innovation and rigorous evaluation in the field, including:
    • RCT-YES is a free software tool that uses cutting-edge statistical methods to help users easily analyze data and report results from experimental and quasi-experimental impact studies of education programs.
    • Downloadable programs help users build logic models and create ongoing plans for monitoring, measuring, and tracking outcomes over time to measure program effectiveness.
    • A guide for researchers on how to conduct descriptive analysis in education to help identify and describe trends in populations, create new measures, or describe samples in studies aimed at identifying causal effects.
    • The Ed Tech Rapid Cycle Evaluation Coach, a free online tool that helps users plan, conduct, and report findings from experimental and quasi-experimental impact studies of education technology products. The tool is optimized for non-technical users and employs statistical methods that allow findings to be presented in an accessible way.
  • ED is implementing a number of Pay for Success projects in FY17:
    • Career and Technical Education (CTE): $2 million to support the development of PFS projects to implement new or scale up existing high-quality CTE opportunities.
    • English Language Acquisition: a $293,000 contract to conduct a feasibility study that will identify at least two promising school sites using evidence-based interventions for early learning dual-language models, where a PFS project could take shape to help scale the interventions to reach more of the students who need them.
    • Early Learning: $3 million for Preschool Pay for Success feasibility pilots to support innovative funding strategies to expand preschool and improve educational outcomes for 3- and 4-year-olds. These grants will allow states, school districts, and other local government agencies to explore whether Pay for Success is a viable financing mechanism for expanding and improving preschool in their communities.
    • Technical Assistance: The Office of Special Education Programs is collaborating with early childhood technical assistance centers to educate and build capacity among state coordinators in IDEA Part C and Part B to explore using PFS to expand or improve special education services for young children. In addition, ED has conducted a Pay for Success webinar series for the Comprehensive Centers.
Score
9
Use of Evidence in 5 Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest competitive grant programs in FY17? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring; Pay for Success provisions)

  • ED’s five largest competitive grant programs in FY17 are: 1) TRIO ($950 million in FY17); 2) GEAR UP ($340 million in FY17); 3) Charter Schools Grants ($342 million in FY17); 4) Teacher and School Leader Incentive Program (TSL) ($200 million in FY17); and 5) Comprehensive Literacy Development Grants ($190 million in FY17).
  • ED uses evidence of effectiveness when making awards in all 5 of these largest competitive grant programs. 1) ED awards competitive priorities to grant applicants in Upward Bound (over $300 million in FY17) and Upward Bound Math & Science (approximately $48 million in FY17) who demonstrate they will use strategies with moderate evidence of effectiveness. 2) ED awards competitive priorities to grant applicants in GEAR UP state and partnership grants (approximately $70 million in new awards in FY17) who demonstrate they will use strategies with moderate evidence of effectiveness. 3) Under the Charter Schools Grants, ED makes awards through the Replication and Expansion of High-Quality Charter Schools program (approximately $57 million in new awards in FY17) to applicants based on their demonstrated success in improving student outcomes. 4) The TSL statute requires applicants to provide a description of the rationale for their project and describe how the proposed activities are evidence-based, and grantees are held to these standards in the implementation of the program. 5) Under the Comprehensive Literacy Development program, ED included an absolute priority directing States to ensure that their subgrants to districts support projects that are based on moderate evidence when such evidence is available.
  • The Evidence Planning Group (EPG) advises program offices on ways to incorporate evidence in grant programs, including use of evidence as an entry requirement or priority to encourage the use of practices where there is evidence of effectiveness, and/or an exit requirement or priority to build new evidence. As noted in response to Question #4, for the past several years, ED has reported publicly on Performance.gov on its Agency Priority Goal (APG) focused on directing an increasing percentage of funds available for new competitive awards towards projects that are supported by evidence. Funding decisions are not yet final for FY17 grant competitions, but ED is on track to meet its target for that APG.
  • In FY16, ED conducted 12 competitions that included the use of evidence beyond a logic model. In FY17 not all decisions have been finalized yet, but so far ED has announced the following 12 competitions that include the use of evidence beyond a logic model: 1) Strengthening Institutions Program; 2) TRIO-Upward Bound; 3) TRIO-Upward Bound Math & Science; 4) GEAR-UP; 5) National Professional Development; 6) Education Innovation and Research; 7) Supporting Effective Educator Development (2 cohorts); 8) Professional Development for Arts Educators; 9) Jacob K. Javits Gifted and Talented Students Education; 10) Stepping Up Technology Implementation; 11) Model Demonstration Projects to Improve Algebraic Reasoning for Students with Disabilities in Middle and High School; and 12) Magnet Schools Assistance Program.
  • The Education Innovation and Research program ($100 million in FY17) provides competitive grants to create, develop, implement, replicate, or take to scale entrepreneurial, evidence-based, field-initiated innovations to improve student achievement and attainment for high-need students, and to rigorously evaluate such innovations. ED’s FY18 budget prioritizes funding evidence-based activities. For example, the budget includes $370 million for the EIR program, an increase of $270 million over the FY17 enacted level.
  • Additionally, ESEA requires that ED give priority to applicants demonstrating strong, moderate, or promising levels of evidence within the following seven competitive grant programs: Literacy Education for All, Results for the Nation; Supporting Effective Educator Development; School Leader Recruitment and Support; Statewide Family Engagement Centers; Promise Neighborhoods; Full-Service Community Schools; and Supporting High-Ability Learners and Learning.
Score
8
Use of Evidence in 5 Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest non-competitive grant programs in FY17? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

  • ED’s 5 largest non-competitive grant programs in FY17 included: 1) Title I Grants to LEAs ($15.4 billion); 2) IDEA Grants to States ($12.0 billion); 3) Supporting Effective Instruction State Grants ($2.2 billion); 4) Impact Aid Payments to Federally Connected Children ($1.3 billion); and 5) 21st Century Community Learning Centers ($1.2 billion).
  • ED worked with Congress in FY16 to ensure that evidence played a major role in ED’s large non-competitive grant programs in the reauthorized education law, the Every Student Succeeds Act (ESSA). As a result, section 1003 of ESEA requires states to set aside at least 7% of their Title I, Part A funds ($14.9 billion in FY16) for a range of activities to help school districts improve low-performing schools. School districts and individual schools are required to create action plans that include “evidence-based” interventions that demonstrate strong, moderate, or promising levels of evidence.
  • Section 4108 of ESEA authorizes school districts to invest “safe and healthy students” funds in Pay for Success initiatives. Section 1424 of ESEA authorizes school districts to invest their Title I, Part D funds (Prevention and Intervention Programs for Children and Youth Who are Neglected, Delinquent, or At-Risk) in Pay for Success initiatives.
  • ED is working to align its diverse technical assistance to best serve states, school districts, and schools as they use evidence to drive improvements in education outcomes.
Score
7
Repurpose for Results

In FY17, did the agency shift funds away from or within any practice, program, or policy that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; proposing the elimination of ineffective programs through annual budget requests)

  • ED seeks to shift program funds to support more effective practices by prioritizing the use of entry evidence. For ED’s grant competitions where there is evaluative data about current or past grantees, or where new evidence has emerged independent of grantee activities, ED typically reviews such data to shape the design of future grant competitions.
  • Additionally, ED uses evidence in competitive programs to encourage the field to shift away from less effective practices and toward more effective practices. For example, ESEA’s Education Innovation and Research (EIR) program – the successor to i3 – supports the creation, development, implementation, replication, and scaling up of evidence-based, field-initiated innovations designed to improve student achievement and attainment for high-need students.
  • The President’s 2018 Budget request eliminates or reduces funding for more than 30 discretionary programs that do not address national needs, duplicate other programs, are ineffective, or are more appropriately supported with State, local, or private funds. Major eliminations and reductions in the 2018 Budget include:
    • Supporting Effective Instruction State grants (Title II-A), a savings of $2.3 billion. The program is proposed for elimination because evidence shows that it is poorly structured to support activities that have a measurable impact on improving student outcomes. It also duplicates other ESEA program funds that may be used for professional development (pp. C-16 to C-20).
    • 21st Century Community Learning Centers program, a savings of $1.2 billion. The program lacks strong evidence of meeting its objectives, such as improving student achievement. Based on program performance data from the 2014-2015 school year, more than half of program participants had no improvement in their math and English grades, and nearly 60 percent of participants attended centers for fewer than 30 days (pp. C-23 to C-24).
  • It is also worth noting that one of the themes of the FY18 budget for ED was “building evidence around educational innovation.” Consistent with this, the Department sustains funding for all IES-authorized activities and for continued support of state- and local-based research, evaluation, and statistics that help educators, policymakers, and other stakeholders improve outcomes for all students. As another example, the budget requested $42 million for Supporting Effective Educator Development (SEED) to provide evidence-based professional development activities and prepare teachers and principals from nontraditional preparation and certification routes to serve in high-need LEAs.
  • In the previous administration, ED worked with Congress to eliminate 50 programs, saving more than $1.2 billion, including programs like Even Start (see pp. A-72 to A-73) (-$66.5 million in FY11) and Mentoring Grants (see p. G-31) (-$47.3 million in FY10), which the Department recommended eliminating based on evidence concerns.