2017 Federal Index
Corporation for National and Community Service
8
Leadership
Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY17? (Example: Chief Evaluation Officer)
- CNCS’s Office of Research and Evaluation (R&E) Director oversees the development of social science research designed to measure the impact of CNCS programs and shape policy decisions; encourage a culture of performance and accountability in national and community service programs; provide information on volunteering, civic engagement, and volunteer management in nonprofit organizations; and assist in the development and assessment of new initiatives and demonstration projects. The R&E Director, who oversees R&E’s $4 million budget and a staff of 9 in FY17, reports directly to the CNCS Chief of Staff and is a member of CNCS’s Leadership Team and Policy Council. The R&E Director also meets regularly with CNCS Program Directors to identify areas where evidence can be generated and used for various decisions.
- The R&E Director meets annually with all CNCS program offices to identify priorities and negotiate which pools of funds are needed to support the year’s priorities. The FY17 plan was developed through a series of formal and informal conversations.
8
Evaluation & Research
Did the agency have an evaluation policy, evaluation plan, and research/learning agenda(s) and did it publicly release the findings of all completed evaluations in FY17?
- CNCS has an evaluation policy that presents 5 key principles that govern the agency’s planning, conduct, and use of program evaluations: rigor, relevance, transparency, independence, and ethics.
- CNCS has an evaluation plan/learning agenda that is updated annually based on input from agency leadership as well as emerging evidence from completed studies. This agenda is reflected in the CNCS Congressional Budget Justifications each year (see Fiscal Year 2016 (pp. 55-56) and Fiscal Year 2017 (pp. 5-6, 55-56)).
- The CNCS Office of Research and Evaluation has built a portfolio of evidence around the agency’s mission and its programs through research studies conducted by university-based scholars, program evaluations conducted by independent third parties, agency performance metrics, and analyses of nationally representative statistics. A report synthesizing findings from FY16 and early FY17 may be found here. In terms of the agency’s research and learning agenda for Fiscal Year 2017 and beyond, a few examples are worth noting. Two projects – Building Evidence for Service Solutions and Scaling Evidence-Based Models – each have project periods of 5 years (a base year with up to 4 option years) and reflect the goal of learning across agency programs: systematically building evidence where there is little or none, and bringing to scale effective models demonstrated through scientific evidence. Similarly, the agency’s second Research Grant competition builds on the first cohort of grantees (3-year study periods) and encourages knowledge building around the agency’s mission and its programs.
- CNCS creates four types of reports for public release: research reports produced directly by research and evaluation staff, research conducted by third-party research firms and overseen by research and evaluation staff, reports produced by CNCS-funded research grantees, and evaluation reports submitted by CNCS-funded program grantees. All reports completed and cleared internally are posted to the Evidence Exchange, an electronic repository for reports that launched in September 2015. Since its launch, a total of 79 research reports have been made available to the public (8 in FY15, 43 in FY16, and 28 in FY17 thus far).
- In FY16, CNCS developed Evaluation Core Curriculum courses, which are presented to its grantees through a webinar series and are available on the CNCS website along with other evaluation resources. The courses are designed to help grantees and other stakeholders easily access materials to aid in conducting or managing program evaluations. In addition to these courses, R&E staff hosted workshops at all 4 regional (staff) trainings in FY17 that focused on how to apply findings from research and evaluation studies to the daily operations of AmeriCorps and Senior Corps programs.
7
Resources
Did the agency invest at least 1% of program funds in evaluations in FY17? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance, and capacity-building)
- CNCS plans to spend a total of $6.3 million, representing 0.6% of CNCS’s $1 billion budget in FY17, on evaluation and evaluation capacity-building activities (R&E evaluation and program funds combined), including the following (a brief arithmetic check appears after this list):
- $540,375 of FY17 evaluation funds will be awarded in grants to universities that focus on the economic and social outcomes of the AmeriCorps program (continuation dollars);
- $1,617,525 of FY17 AmeriCorps funds have been awarded in Commission Investment Fund grants for building commission capacity to support grantees (formula grantees) in the areas of evaluation and performance measurement;
- $3.14 million of FY17 Senior Corps funds will be spent on evaluation and evidence-building activities (representing 1.6 percent of Senior Corps’ $202.11 million FY17 budget); and
- $1,000,797.70 of FY17 evaluation funds have been awarded for AmeriCorps grantee evaluation capacity building.
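As a quick arithmetic check, a sketch using only the figures cited in this section: the itemized amounts sum to the stated $6.3 million total, and the stated budget shares follow from the totals given above.

```latex
% Itemized FY17 evaluation spending (from the list above) sums to ~$6.3 million:
\[
\$540{,}375 + \$1{,}617{,}525 + \$3{,}140{,}000 + \$1{,}000{,}797.70
  = \$6{,}298{,}697.70 \approx \$6.3\text{ million}
\]
% Budget shares implied by the totals cited above:
\[
\frac{\$6.3\text{M}}{\$1{,}000\text{M}} \approx 0.6\%\ \text{(CNCS-wide)},
\qquad
\frac{\$3.14\text{M}}{\$202.11\text{M}} \approx 1.6\%\ \text{(Senior Corps)}
\]
```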
8
Performance Management / Continuous Improvement
Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY17? (Example: Performance stat systems)
- CNCS’s performance management framework is described in the Congressional Budget Justification for Fiscal Year 2016 (p.3) and Fiscal Year 2017 (p.6).
- CNCS has a focused set of Agency-Wide Priority Measures derived from the 2011-2015 Strategic Plan. Every CNCS program contributes to the Agency-Wide Priority Measures. There are also specific grantee/sponsor measures that roll up into the Agency-Wide Priority Measures, which can be found in the Agency-Wide Priority Measures chart. Grantees are required to select at least one national performance measure and to report performance measure data annually. CNCS encourages grantees to use these measures for continuous program improvement. CNCS uses the Agency-Wide Priority Measures to assess its own progress toward attaining the goals and objectives of its strategic plan.
- New metrics (“Fast Facts”) were developed in FY17 to provide stakeholders, including government representatives and the public, with clear metrics on how and to what degree CNCS programs get things done for America. Together, the Fast Facts provide a meaningful picture of the agency’s core activities and investments, aligned with the CNCS mission. They are expected to be released for the first time in FY17.
- Additionally, CNCS produces state profile reports, which provide a picture of agency resources in each state at a given point. These reports contain a number of priority indicators, including the number of participants engaged in national service activities as well as the amount of non-CNCS resources generated by the agency’s programs. Along with its stakeholders, CNCS uses this information to understand the capacity of service available in different geographic regions and discuss related implications with key service partners.
- CNCS’s Chief Operating Officer (COO) has been piloting a proof-of-concept performance management framework that aligns with GPRA since the fourth quarter of FY16, which is notable because CNCS is not subject to GPRA requirements. The COO pilot will help inform an update to the agency’s strategic plan and performance framework as part of its response to Executive Order 13781 and related Administration guidance to federal agency heads, which focus on making agencies more efficient, effective, and accountable.
9
Data
Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) federal, state, and local programs in FY17? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies)
- As the nation’s largest grant maker for service and volunteering, CNCS collects data about service program members, volunteers, and the organizations in which members and volunteers are placed. Member/volunteer demographic, service experience, and outcome data are collected in a variety of ways – both through administrative processes and through surveys:
- In FY17, data collected from a revised member exit survey allowed CNCS to generate more accurate reports on key experiences and anticipated college, career, and civic engagement outcomes. These results are being shared internally with program and agency leadership in FY17 for program improvement purposes. Also in FY17, R&E finalized a data request form and an MOU template so that program-level and state-level data sets and reports can be shared with partners, and the agency is working on protocols to share these data on its open data platform.
- The agency launched 2 Open Data projects in FY17, making volunteering statistics and service location data available through an interactive platform for the first time. The goal was to make these data more accessible to all interested end users. Additional projects will be identified in FY17.
- A report summarizing cross-sectional survey findings on Senior Corps Foster Grandparents and Senior Companion Program volunteers was posted in FY17. A longitudinal survey of volunteers in these 2 Senior Corps programs was implemented in FY15, and preliminary findings (year-one follow-up data) have been shared internally and are expected to be released as an interim report in FY17.
- The dataset of alumni identified for the alumni outcome survey pilot will be shared with the U.S. Census Bureau and matched with Longitudinal Employer-Household Dynamics (LEHD) data, with findings expected in late FY17. This administrative data match between alumni records and the Census Bureau’s LEHD dataset, which will provide employment and employment-sector outcomes for AmeriCorps alumni, will help the agency reduce its reliance on traditional survey methods so that key economic outcomes can be obtained from more objective sources at lower cost.
- CNCS worked closely with the U.S. Census Bureau in FY17 to revise the Current Population Survey supplements to improve the data quality of these instruments. One supplement was created based on a thorough literature review, psychometric testing, cognitive interviews, and public comment. The instrument is being reviewed by OIRA and data collection is planned for September 2017.
- VISTA has initiated a qualitative study involving a document review of a sample of Volunteer Assignment Descriptions (VADs) and VISTA Annual Project Progress Reports (PPRs). The purpose of the project is twofold: (a) to support VISTA’s efforts to assess the concurrence between volunteers’ work assignments and their actual duties; and (b) to assess the quality, consistency, utility, depth, and completeness of these qualitative records and data. Specifically, VISTA seeks to evaluate required project records to determine the extent to which they can inform program reports, evaluations, impact assessments, and policy discussions.
- The purpose of the Disaster Services Unit (DSU) member deployment survey is to capture AmeriCorps Disaster Response Team (ADR-T) members’ perceptions of their deployment experience so that supervisors and the DSU can monitor members’ experiences and make the changes needed to improve deployments. The survey includes questions about logistics and safety; experiences working with team members, supervisors, and other individuals and organizations (such as families), and how meaningful those experiences were for members; and an assessment of the training provided and skills learned. The survey will be implemented as a pilot, and the final version will be part of DSU’s clearance package to be submitted to OMB in December 2017. The expectation is that each ADR-T member will take the survey at the end of their deployment and that their supervisor may also ask some open-ended questions 30 days after the deployment.
- The Social Innovation Fund’s Pay for Success Administrative Data Pilot launched in FY17 will increase access to administrative data sets for evaluation purposes, and findings will be available by the end of the fiscal year.
8
Common Evidence Standards / What Works Designations
Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding decisions and did it disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY17? (Example: What Works Clearinghouses)
- CNCS adapted the evidence framework used by its Social Innovation Fund and by the Investing in Innovation Fund at the U.S. Department of Education and included it as part of the AmeriCorps State and National program’s FY16 and FY17 grant competitions. The evidence framework used in the FY16 and FY17 AmeriCorps competitions was revised from FY15 to make it more consistent with frameworks used in other federal agencies.
- In March 2015, CNCS released Phase 1 of the CNCS Evidence Exchange, a virtual repository of reports intended to help CNCS grantees and other interested stakeholders find information about evidence- and research-based national service and social innovation programs. Phase 1 includes a database of single-study reports with some additional descriptive information about each study, as well as a systematic review of the national service evidence base. Phase 2, in FY16, added studies as grantees completed their independent evaluations and submitted reports to CNCS. In FY17, CNCS focused on disseminating final reports as studies were completed and on ensuring that the functionality of the site made the information as accessible as possible.
7
Innovation
Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY17? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with strong evaluation requirements)
- CNCS awarded 7 AmeriCorps Evidence-Based Planning grants (an investment of $500,000 in evaluation funds) in FY17. These one-year planning grants were awarded to encourage the identification of a new evidence-based program or practice and the development of a strategy for integrating national service into the effective model. In addition, awardees will develop an evaluation plan to assess the innovation should it be funded in future grant competitions. Research & Evaluation staff will conduct a process evaluation of these grantees to systematically assess the successes and challenges of implementing these grants.
- CNCS awarded another round of Commission Investment Fund grants in FY17. Overall, 30% of these grant funds (which total $5,391,750 in AmeriCorps funding) will focus on building the capacity of State Commissions and their grantees to collect and use performance and evaluation data. Research & Evaluation staff are completing a process evaluation of these grantees that will identify the successes and challenges of implementing these grants. These findings will be made public in FY17 and used to inform the second cohort of funded grantees.
- R&E developed and pilot-tested an organizational capacity assessment tool with the goal of providing grantees one instrument to track data across CNCS programs. Encouraged by findings from the SIF National Assessment (which demonstrated the initiative’s effectiveness in improving organizational capacity in the areas of evidence and evaluation), CNCS seeks to assess its programs’ impact on organizational capacity more systematically. The instrument will be ready for OIRA clearance in FY17.
9
Use of Evidence in 5 Largest Competitive Grant Programs
Did the agency use evidence of effectiveness when allocating funds from its 5 largest competitive grant programs in FY17? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring; Pay for Success provisions)
- CNCS is operating three competitive grant programs in FY17: 1) the AmeriCorps State and National program (excluding State formula grant funds) ($232,372,470 in FY17); 2) the Senior Corps RSVP program ($49 million in FY17); and 3) the Social Innovation Fund (SIF) ($50 million in FY16). (SIF funding was zeroed out by Congress in FY17, but CNCS still spent FY16 SIF funds in FY17.)
- CNCS’s AmeriCorps State and National Grants Program application (excluding State formula grant funds) (see pp. 10-14) allocated up to 31 points out of 100 in FY17 to organizations that submit applications supported by performance and evaluation data. Specifically, up to 19 points can be assigned to applications with theories of change supported by relevant research literature, program performance data, or program evaluation data, and up to 12 points can be assigned for an applicant’s incoming level of evidence, with the highest number of points awarded to strong levels of evidence. These categories of evidence are modeled closely on the levels of evidence defined in the Social Innovation Fund.
- From FY10-FY16, SIF provided competitive grants to non-profit grant-making organizations to help them grow promising, evidence-based solutions that address pressing economic opportunity, healthy futures, and youth development issues in low-income communities. The FY14-16 Omnibus Appropriations Acts allowed CNCS to invest up to 20% of SIF funds each year in Pay for Success initiatives.
7
Use of Evidence in 5 Largest Non-Competitive Grant Programs
Did the agency use evidence of effectiveness when allocating funds from its 5 largest non-competitive grant programs in FY17? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)
- CNCS operates one formula grant program, the AmeriCorps State formula grants program ($154 million in FY17). CNCS also operates four direct grant programs in FY17: 1) National Civilian Community Corps (NCCC) ($30 million in FY17), 2) VISTA ($92 million in FY17), 3) Senior Corps Foster Grandparents ($108 million in FY17), and 4) Senior Corps Senior Companion Program ($46 million in FY17).
- In FY17, the Senior Corps Foster Grandparents and Senior Companion programs embedded evidence into their grant renewal processes by offering supplemental funding (“augmentation grants”) to grantees interested in deploying volunteers to serve in evidence-based programs (see pp. 2-4) and providing evaluation data on implementation fidelity, including outcomes. A total of $500,000 (about 1% of total program funds) is allocated for RSVP grantees to augment their baseline awards to implement an evidence-based program; $400,000 (about 0.4% of total program funds) has been allocated for the Foster Grandparents program in FY17; and $300,000 (about 0.7% of total program funds) is allocated for the Senior Companion program (see the arithmetic check after this list).
- In FY17, VISTA finalized its theory of change and a learning agenda/evaluation plan that make explicit the link between the work that volunteers perform, the design of a sponsor’s project to address community needs, and the evidence supporting this activity. This effort will affect several aspects of program management, including project approval, volunteer assignment descriptions, member activity, data collection, and the role of evidence in the design and implementation of projects.
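A brief arithmetic check of the augmentation grant shares above, a sketch that assumes the FY17 program budgets cited elsewhere in this index (RSVP $49 million; Foster Grandparents $108 million; Senior Companion $46 million):

```latex
% Augmentation grant allocations as shares of FY17 program budgets:
\[
\frac{\$500{,}000}{\$49\text{M}} \approx 1.0\%\ \text{(RSVP)},
\qquad
\frac{\$400{,}000}{\$108\text{M}} \approx 0.4\%\ \text{(Foster Grandparents)},
\qquad
\frac{\$300{,}000}{\$46\text{M}} \approx 0.7\%\ \text{(Senior Companion)}
\]
```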
4
Repurpose for Results
In FY17, did the agency shift funds away from or within any practice, program, or policy that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; proposing the elimination of ineffective programs through annual budget requests)
- Over the past 8 years, AmeriCorps has reduced grant awards when member enrollment/retention benchmarks were not met. For example, in FY17 a grantee that had not met its performance targets requested 380 part-time members (the equivalent of $1 million) and was awarded none.
- Over the past 3 years, CNCS has redirected money away from grantees with poor past-performance metrics and awarded it to grantees with positive past-performance metrics so that they have “forward funding.” (CNCS programs typically award one-year grants with an option to receive continuation awards.) For example, in FY17 AmeriCorps “forward funded” 12 grants, with 9% of its competitive funding based on the reallocation of dollars from poor-performing grantees to high-performing grantees.
- According to CNCS policy, AmeriCorps State Commission staff will assess their recompeting subgrantees’ past performance and submit those assessments to CNCS, and CNCS will assess its recompeting direct grantees’ past performance. This assessment is in addition to the evaluation of an applicant’s eligibility for funding and the quality of its application against the Selection Criteria. Results from this assessment will inform funding decisions. In evaluating programmatic performance, CNCS will consider the following for applicants that are current formula and competitive grantees and are submitting applications for the same program model:
- Grant progress reports – attainment of Performance Measures
- Enrollment and retention
- Compliance with 30-day enrollment and exit requirements in the AmeriCorps portal
- Site visit or other monitoring findings (if applicable)
- Significant opportunities and/or risks of the grantee related to national service
- Commission Rank