2016 Federal Index

U.S. Department of Labor


Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY17? (Example: Chief Evaluation Officer)

  • DOL’s Chief Evaluation Officer is a senior official with responsibility for all activities of the Chief Evaluation Office (CEO), and coordination of evaluations Department-wide. In 2016, DOL’s Chief Evaluation Officer was converted to a career position, a change which more fully cements the principle of independence and reflects the Department’s commitment to institutionalizing an evidence-based culture at DOL. Evaluation results and products are approved and released by the Chief Evaluation Officer (as per the CEO Evaluation Policy), and disseminated in various formats appropriate to practitioners, policymakers, and evaluators.
  • The CEO includes 15 full-time staff plus a small number of contractors and 1-2 detailees at any given time. This staff level is augmented by staff from research and evaluation units in other DOL agencies; for example, the Employment and Training Administration has nine FTEs dedicated to research and evaluation activities with whom CEO coordinates extensively. CEO staff have expertise in research and evaluation methods as well as in DOL programs and policies and the populations they serve. CEO also convenes technical working groups, whose members have deep technical and subject-matter expertise, on the majority of evaluation projects. Further, CEO staff engage and collaborate with program office staff and leadership to interpret research and evaluation findings and to identify their implications for programmatic and policy decisions.
  • In FY17, the CEO will directly oversee an estimated $40 million in evaluation funding (this includes the direct appropriation, the set-aside amount, and other funds that come from programmatic accounts where evaluations are co-funded). The $40 million includes the appropriated budget for the Departmental Program Evaluation (over $8 million in FY17) and the Department’s evaluation set-aside funds (of up to 0.75% of select department accounts), which will be approximately $24 million in FY17. CEO also collaborates with DOL and other Federal agencies on additional evaluations being carried out by other offices and/or supported by funds appropriated to DOL programs, such as Employment and Training Administration (ETA) pilots, demonstrations, and research and evaluations of large grant programs, including the Performance Partnership Pilots (P3), the American Apprenticeship Initiative (AAI), the Trade Adjustment Assistance Community College and Career Training (TAACCCT) Grant Program, and Reentry Programs for Ex-Offenders.
  • The CEO also participates actively in the performance review process, during which each operating agency meets with Department leadership to review progress on the performance goals established for the year, as required under the Government Performance and Results Act (GPRA).
  • The CEO’s role is to incorporate evidence and evaluation findings as appropriate, to identify knowledge gaps that might be filled by evaluations, and to convey evidence that can inform policy, program, and performance decisions. DOL’s Chief Evaluation Officer and senior staff are part of DOL’s leadership structure: they weigh in on major program and policy decisions, play a role in the formation of DOL agencies’ annual budget requests, make recommendations on including evidence in grant competitions, and provide technical assistance to Department leadership to ensure that evidence informs policy design. A number of mechanisms facilitate this: CEO participates in quarterly performance meetings with DOL leadership and the Performance Management Center (PMC); CEO reviews agency operating plans and works with agencies and the PMC to coordinate performance targets, measures, and evaluation findings; quarterly meetings are held with agency leadership and staff as part of the Learning Agenda process; and meetings are held as needed to strategize around new priorities or legislative requirements.
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and research/learning agenda(s) and did it publicly release the findings of all completed evaluations in FY17?

  • DOL has an Evaluation Policy Statement that formalizes the principles that govern all program evaluations in the Department, including methodological rigor, independence, transparency, ethics, and relevance. In addition, the Chief Evaluation Office publicly communicates the standards and methods expected in DOL evaluations in formal procurement statements of work.
  • DOL also develops, implements, and publicly releases an annual Evaluation Plan (i.e., a Department-level learning agenda) that includes planned projects with each of DOL’s operating agencies. Agency learning agendas, developed by CEO in partnership with each operating agency, form the basis for DOL’s Evaluation Plan. The 2016 Evaluation Plan was posted in the Federal Register; the 2017 plan will be posted on the CEO website once finalized, before the end of the fiscal year.
  • Once contracts are awarded for new evaluation studies, they are posted on the Current Studies page of CEO’s website, so the public can see everything currently underway as well as timelines for study completion and publication of results.
  • All DOL reports and findings are publicly released and posted on the complete reports section of CEO website. The Chief Evaluation Officer has the “authority to approve, release, and disseminate evaluation reports” (as per the DOL Evaluation Policy). DOL agencies also post and release their own research and evaluation reports.

Did the agency invest at least 1% of program funds in evaluations in FY17? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance, and capacity-building)

  • In FY17, DOL’s CEO will directly oversee an estimated $40 million in evaluation funding. This represents approximately 0.3% of DOL’s FY17 discretionary budget of $12.8 billion. However, in addition to this, many DOL evaluations are supported by funds appropriated to DOL programs and/or are carried out by other offices within DOL. For example, most agencies and program offices conduct and support evaluation activities with their own funds, and several DOL agencies have separate evaluation appropriations. DOL studies funded through individual agencies and program offices also coordinate with DOL’s CEO. DOL’s Chief Evaluation Office directly funds evaluations and also combines CEO funds with agency funds to jointly sponsor some evaluations.
  • In many areas where DOL is undertaking evaluation activities, the evaluation budget far exceeds 1% of the budget for the program (e.g., the budgets for the evaluations of a number of recent grant programs are between 3% and 5% of the programmatic budget).
  • The Administration’s FY14-FY17 budget requests recommended allowing the U.S. Secretary of Labor to set aside up to 1% of all operating agencies’ budgets for evaluations, coordinated by CEO. In FYs 2012-2015, Congress authorized the Secretary to set aside up to 0.5% of these funds for evaluations, in addition to the separate evaluation funds that exist in many DOL agencies. In FY16 and FY17, Congress authorized DOL to set aside up to 0.75% of operating agency budgets for use by CEO for evaluations.
  • The development of States’ capacity to conduct evaluations is a long-range and iterative process, and significant resources are dedicated to aiding these efforts through various forms of technical assistance and guidance. An initial primary goal is to enhance capacity by building knowledge among State staff regarding various methodologies, approaches for enlisting expertise, and the potential role of evaluations and research in meeting State goals and priorities. During FY17, ETA hosted three National Convenings, in Washington, DC; Dallas, TX; and San Diego, CA, and included sessions on research and evaluation under WIOA in coordination with the performance accountability sessions. More technical assistance is planned.
  • In order to support the workforce system in being prepared to meet evaluation and reporting requirements under their WIOA grants, DOL has invested in evaluation and evaluation capacity-building activities, including:
    • DOL has launched a Workforce Innovation and Opportunity Act implementation study to document and describe how critical state activities under WIOA are being implemented, particularly the core programs in Titles I and III, and to identify areas in which further technical assistance, guidance, or policies might be needed to help states implement the law. The WIOA state implementation study will build on both past and ongoing studies, including: 1) the ongoing random assignment evaluation of the adult and dislocated worker programs under the Workforce Investment Act, 2) studies involving data analytics on workforce program services, 3) an ongoing institutional analysis of AJCs, and 4) an ongoing study on AJC customer experiences.
    • In response to WIOA’s requirement to measure effectiveness in serving employers as a primary indicator of performance, DOL is piloting three approaches designed to gauge three critical workforce needs of the business community: (1) Retention with the same employer; (2) Repeat business customers; and (3) Employer Penetration Rate. States must select two of the three approaches to pilot, and may develop an additional State-specific approach. DOL will evaluate State experiences with the various approaches and plans to identify a standardized indicator, which the Departments anticipate will be implemented no later than the beginning of Program Year 2019.
Performance Management / Continuous Improvement

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY17? (Example: Performance stat systems)

  • DOL’s Performance Management Center (PMC) is responsible for the Department’s extensive performance management system, which includes over 400 measures whose results are reviewed quarterly by Department leadership. PMC leads the Department’s Continuous Process Improvement (CPI) Program which supports agencies in efforts to gain operational efficiencies and improve performance. The program directs customized process improvement projects throughout the department and grows the cadre of CPI practitioners through Lean Six Sigma training.
  • PMC’s activities are intended to improve DOL’s program performance through data-driven analysis, sharing best practices, and implementing activities associated with the Government Performance and Results Modernization Act of 2010 (GPRA). Using a PerformanceStat-type reporting and dashboard system, PMC coordinates quarterly meetings between the Deputy Secretary and each agency head, to review performance results contributing to DOL’s strategic goals, to make commitments related to performance improvement, and to follow up on the progress of previous performance improvement commitments. PMC also oversees the Strategic Planning process and analyzes performance data in collaboration with agencies to achieve continuous performance improvement. CEO actively participates in the quarterly performance reviews to incorporate findings from evaluations as appropriate.
  • One of the most important roles that DOL’s CEO plays is to facilitate the interaction between program and evaluation analysts, and performance management and evaluation. Learning agendas updated annually by DOL agencies in collaboration with DOL’s CEO include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The quarterly reviews with leadership routinely include specific discussions about improving performance and findings from recent evaluations that suggest opportunities for improvement.
  • To promote the use of evidence-based strategies, DOL’s Employment and Training Administration (ETA) continues to manage the Workforce Systems Strategies website, which identifies a range of potential strategies, informed by research evidence and peer exchanges, to support grantees in providing effective services to customers.

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) federal, state, and local programs in FY17? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies)

  • DOL makes the majority of its administrative and survey datasets publicly available for secondary use. For more information, see CEO’s Public Use Datasets and ETA’s repository of public use datasets.
  • DOL’s Bureau of Labor Statistics (BLS) (approximately $600 million in FY17) serves as the principal Federal agency responsible for measuring labor market activity, working conditions, and price changes in the economy. BLS has 110 Cooperative Agreements with 50 States and 4 Territories for labor market and economic data sharing. For calendar year 2016, there were 513 “letters of agreement” on data usage with academics to conduct statistical research, and 8 data sharing agreements with the Bureau of Economic Analysis and the Census Bureau, for a total of 521 agreements.
  • DOL’s Employment and Training Administration (ETA) has agreements with 52 States and Territories for data sharing and exchange of wage data for performance accountability purposes. In FY15, ETA began working with the Department of Education’s Office of Career, Technical, and Adult Education; Rehabilitation Services Administration; and Office of the General Counsel to revise and renegotiate the agreements ETA holds with the 52 States and Territories, to facilitate better State access to quarterly wage data for the performance accountability and research and evaluation requirements of the Workforce Innovation and Opportunity Act (WIOA). This work aims to expand access to wage data to Education’s Adult Education and Family Literacy Act (AEFLA) programs and Vocational Rehabilitation programs, among others. The work has continued through FY17 and is being conducted in collaboration with the State agencies that are subject to WIOA’s performance accountability and research and evaluation requirements, as well as the State Unemployment Insurance Agencies.
  • DOL’s CEO, the Employment and Training Administration (ETA), and the Veterans Employment and Training Service (VETS) have worked with the U.S. Department of Health and Human Services (HHS) to develop a secure mechanism for obtaining and analyzing earnings data from the Directory of New Hires. In the past year, DOL entered into interagency data-sharing agreements with HHS and obtained data to support 10 job training and employment program evaluations.
  • DOL’s worker protection agencies make enforcement data openly available online through the Enforcement Database, which covers DOL’s five labor enforcement agencies, including the Mine Safety and Health Administration, the Wage and Hour Division, the Occupational Safety and Health Administration, and the Employee Benefits Security Administration.
  • The privacy provisions for BLS and DOL’s Employment and Training Administration (ETA) are publicly available online.
  • In FY17, DOL continued efforts to improve the quality of and access to data for evaluation and performance analysis through the Data Analytics Unit in DOL’s CEO office, and through new pilots beginning in BLS to access and exchange state labor market and earnings data for statistical and evaluation purposes. The Data Analytics Unit has also updated its Data Exchange and Analysis Platform (DEAP), which provides high processing capacity and privacy protections, to share, link, and analyze program and survey data across DOL programs and agencies and with other agencies. DEAP is available for internal use now; public access will be available in the future.
  • The Workforce Innovation and Opportunity Act (WIOA) calls for aligned indicators of performance for WIOA-authorized programs. DOL’s Employment and Training Administration has worked within DOL and with the U.S. Department of Education to pursue the deepest WIOA alignment possible, including indicator definitions, data elements, and specifications, to improve the quality and analytic value of the data. DOL chose to include several additional DOL programs in this process, which will result in unprecedented alignment of data and definitions for 13 federal programs (11 DOL and 2 Education). DOL and ED have issued five WIOA Final Rules, all of which became effective October 18, 2016. The regulations cover WIOA programs under Titles I, II, III, and IV, in addition to other miscellaneous changes. The aligned indicators of performance are included in the DOL-ED Joint Rule for WIOA, part 677.
  • ETA continues funding and technical assistance to states under the Workforce Data Quality Initiative to longitudinally link earnings, workforce, and education data. ETA and DOL’s Veterans Employment and Training Service have also modified state workforce program reporting system requirements to include data items for a larger set of grant programs, which will improve access to administrative data for evaluation and performance management purposes. An example of the expanded data reporting requirements is the Homeless Veterans Reintegration Program FY16 grants.
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding decisions and did it disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY17? (Example: What Works Clearinghouses)

  • DOL uses the Cross-agency Federal Evidence Framework for evaluation planning and dissemination.
  • DOL’s Clearinghouse for Labor Evaluation and Research (CLEAR) is an internet-based evidence clearinghouse of evaluation reports that reviews designs, methodologies, and findings according to specific standards developed by technical work groups. Each study is scored and given a “causal evidence rating” according to the scoring rubric in the standards. CLEAR is a user-friendly, searchable website that includes academic quality reviews for each study in the system, appropriate for peer academic researchers, potential evaluation contractors submitting technical proposals, program practitioners seeking information on “what works,” policymakers, and the general public.
  • DOL uses the CLEAR evidence guidelines and standards when discretionary program grants are awarded using evidence-informed or evidence-based criteria. The published guidelines and standards are thus used in grants for evidence-based program demonstrations and in the structured evidence reviews conducted in CLEAR. Requests for proposals also indicate that the CLEAR standards apply to all CEO evaluations. In addition, DOL has a “Department Evaluation Policy Statement” that formalizes the principles governing all program evaluations in the Department, including methodological rigor, independence, transparency, ethics, and relevance. CEO publicly communicates the standards and methods expected in all DOL evaluations, and the standards are incorporated into formal procurement statements of work, with scoring for awards based on the standards.
  • Additionally, DOL collaborates with other agencies (HHS, ED-IES, NSF, CNCS) on refining cross-agency evidence guidelines and developing technological procedures to link and share reviews across clearinghouses. The Interagency Evidence Framework conveys the categories of evaluations, the quality review of evaluation methodologies and results, and the use of evaluation findings. The framework is accepted Department-wide.

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY17? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with strong evaluation requirements)

  • DOL is participating in the Performance Partnership Pilots (P3) for innovative service delivery for disconnected youth which includes not only waivers and blending and braiding of federal funds, but gives bonus points in application reviews for proposing “high tier” evaluations. DOL is the lead agency for the evaluation of P3.
  • DOL recently completed a number of behavioral science evaluations: three in unemployment insurance, two in OSHA, one in OFCCP, and one in EBSA on pension contributions (see the CEO website). It is in the planning phase for a number of others. For more on DOL’s work in this area, see DOL’s Behavioral Insights page.
  • DOL is using Job Corps’ demonstration authority to test and evaluate innovative and promising models that could improve outcomes for youth. In 2016, DOL awarded a contract for a Job Corps pilot program, the Cascades Job Corps College and Career Academy. The pilot will test alternative strategies for the operation of Job Corps for younger participants (ages 16 to 21). Past research on the program showed that while Job Corps increased the education and earnings of participants, it was more beneficial for youth over age 20 than for its younger participants. This pilot uses DOL’s demonstration authority to test innovative and promising strategies, including a career pathway approach that integrates academic and technical training, workforce preparation skills, and support services, to better meet the needs of this population. CEO is sponsoring a rigorous impact evaluation to examine the effectiveness of the pilot; a study overview is available on the CEO website.
  • DOL has two pilot projects testing the use of a Pay for Success (PFS) financing model. In 2013, DOL awarded two grants of approximately $12 million to Massachusetts and New York to test a PFS model in which private for-profit and philanthropic investors pay the up-front costs of delivering an intervention designed to achieve specific outcomes within a given timeframe. In return for accepting the risk of funding the project, the investors may expect a return if the project is successful; however, repayment by the government is contingent on the validated achievement of results. Both pilots are employing a random assignment methodology to measure results, which are due in 2017. DOL is sponsoring a process study to document project implementation and provide information on the PFS approach for policymakers and program administrators. The first report from this study, released in 2016, documents the development of the pilots and their first year of implementation. A second report will document the pilots’ longer-term operational experiences, including the extent to which they achieved their performance milestones.
  • DOL has invested more than $90 million through the ApprenticeshipUSA initiative, a national campaign bringing together a broad range of stakeholders, including employers, labor, States, and education and workforce partners, to expand and diversify Registered Apprenticeship in the United States. This includes more than $60 million for State-led strategies to grow and diversify apprenticeship, and State Accelerator Grants to help integrate apprenticeship into education and workforce systems; engage industry and other partners to expand apprenticeship to new sectors and new populations at scale; conduct outreach and work with employers to start new programs; promote greater inclusion and diversity in apprenticeship; and develop statewide and regional strategies aimed at building state capacity to support new apprenticeship programs. All of these grants include funding for data collection; additionally, ETA and CEO are conducting an evaluation of the American Apprenticeship Initiative.
  • CEO received grant-making authority in 2016. In January 2017, CEO awarded nine research grants aimed at supporting university-based research on workforce policies and programs. The goal is to build capacity and drive innovation among academic researchers to answer questions that will provide insight into labor policies and programs.
Use of Evidence in 5 Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest competitive grant programs in FY17? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring; Pay for Success provisions)

  • In FY17, the five largest competitive grant programs awarded were: 1) Senior Community Service Employment Program (SCSEP) National Grants ($140 million), 2) America’s Promise Job Driven Grant Program ($111 million), 3) YouthBuild ($80 million), 4) ApprenticeshipUSA State Expansion Grants ($50.5 million), and 5) Reentry Projects ($66 million).
  • All have been or will be involved in evaluations designed by CEO and the relevant agencies, and require or encourage (through language in the funding announcement and proposal review criteria) grantees to use evidence-based models or strategies in grant interventions and/or test, by participating in an evaluation, new interventions that theory or research suggest are promising.
  • DOL includes rigorous evaluation requirements in all competitive grant programs, involving one of the following: 1) full participation in a national evaluation as a condition of grant receipt; 2) an independent third-party local or grantee evaluation with priority incentives for rigorous designs (e.g., tiered funding, scoring priorities, bonus scoring for evidence-based interventions or multi-site rigorous tests); or 3) full participation in an evaluation as well as rigorous grantee (or local) evaluations. For example, SCSEP includes language that states, “By accepting grant funds, grantees agree to participate in the evaluation, should they be selected, including making records on participants, employers, and funding available and providing access to program operating personnel and participants, as specified by the evaluator(s) under the direction of ETA, including after the period of operation.”
  • The America’s Promise grant program aims to build partnerships focused on key priorities, including using evidence-based design. The grant funding announcement outlined a number of evidence-based strategies and encouraged their use, as well as evaluation of promising strategies to build evidence. In reviewing grant proposals, 54 points (out of 100) were given for “project design” (an additional 10 points were given for the expected outcomes and outputs section). The funding announcement, and specifically the project design section, outlined evidence-based strategies and provided links to resources to make it easier for bidders to identify effective strategies and support improved outcomes. For instance, the funding announcement stated that, “sector strategies can increase the employability, employment, earnings, and outcomes of job seekers, and at the same time benefit employers through improved worker productivity, job retention, and enhanced profitability. For example, studies randomly assigning people to job training programs with sector partnerships found that participants were employed at a higher rate with higher earnings (an additional $4,500 over 24 months) than those who went through other employment and training programs. For applicants that already have sector strategies in place, we are interested in seeing data and demonstration of strong outcomes for job seekers and employers alike that are outlined later in this document…” Additionally, the funding announcement required participation in a national evaluation as a condition of award.
  • The YouthBuild funding announcement required applicants to demonstrate how their project design is informed by the existing evidence base on disadvantaged youth serving social programs, and in particular disadvantaged youth workforce development programs.
  • The ApprenticeshipUSA grant program strives to meet DOL’s goal to double and further diversify Registered Apprenticeships across the country. The grant funding announcement encourages applicants to “use program models with demonstrated evidence of success in serving the targeted population(s), especially models shown by rigorous program evaluations to have positive impacts on participants’ employment and earnings outcomes.” Further, allowable activities include program administration to improve program efficiency, program quality and outcome measurement such as project management, data collection and grant reporting, and grant monitoring and evaluation.
  • The Reentry Projects grant program requires applicants to propose evidence-based or evidence-informed interventions, new interventions that theory or research suggests are promising, or a combination of both, that lead to increased employment outcomes for their target populations, and to frame their goals and objectives to address this issue; applicants are able to select and implement different program services and/or features of program models. The grant funding announcement includes examples of previous studies and evaluations that DOL has conducted on reentry programs, as well as other evidence-based and promising practices, and applicants were encouraged to review these resources prior to designing their interventions.
  • The TechHire Partnership grants rapidly train workers for, and connect them to, well-paying, middle- and high-skilled, high-growth jobs across a diversity of H-1B industries. The program used a tiered-evidence framework in which evidence-based design was a requirement of grant award; for grantees that requested between $4 million and $5 million, applications were assessed by a panel of experts along a continuum of innovation and evidence, ranging from strong to moderate to preliminary. Grantees receiving more than $4 million must also plan to replicate, at multiple sites and/or with the targeted and other populations, strategies that prior research has shown to have evidence of positive impacts on education and/or employment outcomes.
Use of Evidence in 5 Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest non-competitive grant programs in FY17? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

  • In FY17/PY17, the 5 largest non-competitive grant programs at DOL are all in the Employment and Training Administration: the Unemployment Insurance State grants ($2.5 billion in FY 2017); the Employment Security program state grants ($666 million in PY 2017); and three programs authorized under the Workforce Innovation and Opportunity Act (WIOA). The 3 WIOA-authorized grant programs are: 1) the Youth Workforce Investment program ($867 million in PY 2017), 2) the Adult Employment and Training program ($809 million in PY 2017), and 3) the Dislocated Workers Employment and Training program ($1.2 billion in PY 2017).
  • All Employment and Training Administration grant programs allocate funding by statute, and all include performance metrics (e.g., unemployment insurance payment integrity, WIOA common measures) tracked quarterly.
  • A signature feature of the Workforce Innovation and Opportunity Act (WIOA) (Pub. L. 113-128) is its focus on the use of data and evidence to improve services and outcomes, particularly in provisions related to States’ role in conducting evaluations and research, as well as in requirements regarding data collection, performance standards, and state planning. Conducting evaluations is a required statewide activity, and there are additional requirements regarding coordination (with other State agencies and with Federal evaluations under WIOA), dissemination, and provision of data and other information for Federal evaluations.
  • WIOA includes evidence and performance provisions which: (1) increased the amount of WIOA funds states can set aside and distribute directly from 5-10% to 15% and authorized them to invest these funds in Pay for Performance initiatives; (2) authorized states to invest their own workforce development funds, as well as non-federal resources, in Pay for Performance initiatives; (3) authorized local workforce investment boards to invest up to 10% of their WIOA funds in Pay for Performance initiatives; and (4) authorized States and local workforce investment boards to award Pay for Performance contracts to intermediaries, community-based organizations, and community colleges.
Repurpose for Results

In FY17, did the agency shift funds away from or within any practice, program, or policy that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; proposing the elimination of ineffective programs through annual budget requests)

  • DOL’s evidence-based strategy is focused on improving program performance and expanding strategies and programs for which rigorous evaluations show evidence of positive impact. The Department takes every possible action to improve performance before considering funding reductions or program termination. However, DOL does use program performance measures and results from evaluations to make decisions about future funding. For example:
    • In 2016, DOL established a methodology for assessing the performance of Job Corps centers and selecting centers for closure. That year, one Job Corps center was closed because of its chronic low performance. Closing this center allows DOL to shift limited program dollars to centers that will better serve students by providing the training and credentials they need to achieve positive employment and educational outcomes. In a Federal Register notice published in July 2016, DOL announced the closure and the methodology used to select centers for closure.
  • All discretionary grant performance is closely monitored, and performance data have been used to take corrective action and make decisions about continued funding. For example, YouthBuild grant funding is based heavily on past performance: organizations that have previously received and completed a YouthBuild grant award are scored on demonstrated past performance, worth 28 points (almost 30% of their total score). This effectively weeds out low-performing grantees from winning future awards. (For more information, see the Grant Funding Announcement.) Additionally, DOL uses evidence in competitive programs to encourage the field to shift away from less effective practices and toward more effective ones. For example, recent grant programs such as TechHire and America’s Promise support the creation, development, implementation, replication, and scaling up of evidence-based practices designed to improve outcomes.
  • DOL’s FY18 budget request prioritizes programs with demonstrated evidence (e.g., by allocating $90 million to expand apprenticeships, an evidence-based approach that combines on-the-job training with classroom instruction) and proposes reductions to unproven or duplicative activities (e.g., it proposes a reduction of $238 million by closing additional Job Corps centers that do not meet performance standards, and proposes eliminating the Senior Community Service Employment Program).