2017 Federal Index


Substance Abuse and Mental Health Services Administration

Score
7
Leadership

Did the agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them in FY17? (Example: Chief Evaluation Officer)

  • The director of SAMHSA’s Center for Behavioral Health Statistics and Quality (CBHSQ) Division of Evaluation, Analysis and Quality (DEAQ) serves as the agency’s evaluation lead, with key evaluation staff housed in this division. In addition, the agency’s chief medical officer (CMO), as described in the 21st Century Cures Act, plays a key role in addressing evaluation approaches and the utilization of evidence-based programs and practices among grantees; at this time, a collaborative approach between CBHSQ and the Office of the CMO is being established to ensure broad agency evaluation oversight by senior staff. The Office of the CMO is housed within the agency’s emerging Mental Health Policy Lab (currently the Office of Policy, Planning and Innovation) and will influence evaluation policy decisions across the agency in a more systematic manner as the new Policy Lab is stood up in January 2018.
  • SAMHSA’s Office of Policy, Planning and Innovation (OPPI) provides policy perspectives and guidance to raise awareness around SAMHSA’s research and behavioral health agenda. OPPI also facilitates the adoption of data-driven practices among other federal agencies and partners such as the National Institutes of Health, the Centers for Disease Control and Prevention, and the Centers for Medicare & Medicaid Services.
  • At this time, evaluation authority, staff, and resources are decentralized and found throughout the agency. SAMHSA is composed of four Centers: the Center for Mental Health Services (CMHS), the Center for Substance Abuse Treatment (CSAT), the Center for Substance Abuse Prevention (CSAP), and the Center for Behavioral Health Statistics and Quality (CBHSQ). CMHS, CSAT, and CSAP oversee grantee portfolios and evaluations of those portfolios. Evaluation decisions within SAMHSA are made within each Center, specific to its program priorities and resources. Each of the three program Centers uses its program funds to conduct evaluations of varying types. CBHSQ, SAMHSA’s research arm, provides varying levels of oversight and guidance to the Centers for evaluation activities. CBHSQ also provides technical assistance related to data collection and analysis to assist in the development of evaluation tools and clearance packages. Within CBHSQ’s DEAQ, the Quality, Evaluation, Performance Branch (QEPB) builds internal capacity for “developing more rigorous evaluations conducted” internally and externally to assess the “impact of its behavioral health programs… and treatment measures,” and the Analysis and Services Research Branch (ASRB) focuses on effective delivery and financing of health care and services.
  • SAMHSA evaluations are funded from program funds that are used for service grants, technical assistance, and evaluation activities. Evaluations have also been funded from recycled funds from grants or other contract activities. Given the broad landscape of evaluation authority and funding, a variety of evaluation models have been implemented. These include recent evaluations funded and managed by the program Centers (e.g., First Episode Psychosis, FEP); evaluations funded by the Centers but directed outside of SAMHSA (e.g., Assisted Outpatient Treatment, AOT); and those that CBHSQ directly funds and executes (e.g., Primary and Behavioral Health Care Integration, PBHCI, and the Cures-funded Opioid State Targeted Response program). Evaluations require different degrees of independence to ensure objectivity, and the models above afford SAMHSA the latitude to enhance evaluation rigor and independence on a customized basis.
  • In 2016, CBHSQ conducted a summer review of evaluation activities with the program Centers and presented its findings to the SAMHSA Executive Leadership Team (ELT). As a result, SAMHSA revised and finalized a new Evaluation Policy and Procedure (P&P), grounded in an earlier evaluation P&P, and is currently developing a Learning Agenda to prioritize activities that address gaps in data collection, data analysis, and the identification of evidence-based practices in high-profile areas (e.g., serious mental illness, serious emotional disturbance, opioids, marijuana, suicide, and health financing, among others). The new Evaluation P&P requires Centers to identify research questions and appropriately match the type of evaluation to the maturity of the program. A new workgroup, the Cross-Center Evaluation Review Board (CCERB), composed of evaluation experts from the four Centers (CSAP, CMHS, CSAT, and CBHSQ), will now review significant evaluations at critical milestones in the planning and implementation process, providing specific recommendations to the Center Director with the lead for the evaluation. The CCERB works with the four Centers to advise, conduct, collaborate, and coordinate on all evaluation and data collection activities that occur within SAMHSA, and CCERB staff provide support for program-specific and administration-wide evaluations. SAMHSA’s CMO will also play a key role in reviewing evaluation proposals and clearing final reports.
Score
7
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and research/learning agenda(s) and did it publicly release the findings of all completed evaluations in FY17?

  • SAMHSA’s Evaluation Policy and Procedure (P&P), revised and approved in May 2017, provides guidance across the agency regarding all program evaluations. Specifically, the Evaluation P&P describes the demand for rigor, compliance with ethical standards, and compliance with privacy requirements for all program evaluations conducted and funded by the agency. The Evaluation P&P serves as the agency’s formal evaluation plan and includes a new process for the public release of final evaluation reports, including findings from evaluations deemed significant. The Evaluation P&P sets the framework for planning, monitoring, and disseminating findings from significant evaluations.
  • Results from significant evaluations will be available on SAMHSA’s website, a new step SAMHSA is taking with its newly approved Evaluation P&P, starting in the fall of 2017. Significant evaluations include those that have been identified by the Center Director as providing compelling information and results that can be used to make data-driven, evidence-based, and informed decisions about behavioral health programs and policy. The following criteria are used to determine whether an evaluation is significant: 1) whether the evaluation was mandated by Congress; 2) whether there are high-priority needs in states and communities; 3) whether the evaluation is for a new or congressionally mandated program; 4) the extent to which the program is linked to key agency initiatives; 5) the level of funding; 6) the level of interest from internal and external stakeholders; and 7) the potential to inform practice, policy, and/or budgetary decision-making.
  • CBHSQ is currently leading agency-wide efforts to build SAMHSA’s learning agenda. Via this process, we have developed agency-wide Learning Agenda templates in the critical topic areas of opioids, serious mental illness, serious emotional disturbance, suicide, health economics and financing, and marijuana; learning agendas focused on other key topic areas, such as alcohol, are underway as well. Other topics, such as cross-cutting issues related to vulnerable populations, are interwoven through these research plans. Through this multi-phased process, CBHSQ is systematically collecting information from across the agency regarding research and analytic activities, then analyzing and organizing this information into a guiding framework to be used for decision-making related to priorities and resource allocation. SAMHSA began this process in early 2017 and plans to complete it in the winter of 2018. The template for the issue of opioid abuse, the first topic tackled in this effort and thus the most complete at this point in time, has been used to determine research questions along with the current activities underway across the agency that are relevant to these areas. The template follows the construct outlined by OMB in the publication entitled Analytical Perspectives, Budget of the U.S. Government, Fiscal Year 2018.
  • SAMHSA’s Data Integrity Statement outlines how CBHSQ adheres to federal guidelines designed to ensure the quality, integrity, and credibility of statistical activities.
  • SAMHSA’s National Behavioral Health Quality Framework, aligned with the U.S. Department of Health and Human Services’ National Quality Strategy, is a framework to assist providers, facilities, payers, and communities in better tracking and reporting the quality of behavioral health care. Through this framework, SAMHSA “proposes a set of core measures to be used in a variety of settings and programs, as well as in the evaluation and quality assurance efforts.” These metrics focus primarily on high-rate behavioral health events such as depression, alcohol misuse, and tobacco cessation, all of which impact health and health care management and thus affect a large swath of the U.S. population.
Score
9
Resources

Did the agency invest at least 1% of program funds in evaluations in FY17? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance; and capacity-building)

  • SAMHSA will have spent $31.5 million on sixteen evaluations by the end of FY2017. This represents 0.8 percent of SAMHSA’s $3.78 billion budget appropriated for FY2016.
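  For reference, that 0.8 percent figure follows from simple division (a back-of-the-envelope check added for clarity, not a figure from the source):

  \[ \frac{\$31.5\ \text{million}}{\$3{,}780\ \text{million}} \approx 0.0083 \approx 0.8\% \]

  This falls just short of the 1% benchmark named in the question above.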
Score
8
Performance Management / Continuous Improvement

Did the agency implement a performance management system with clear and prioritized outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY17? (Example: Performance stat systems)

  • In 2016, SAMHSA’s Office of Financial Resources (OFR) established a Program Integrity Review Team (PIRT) staffed by representatives from each of its four Centers and managed by OFR. On a quarterly basis, staff from three SAMHSA discretionary grant portfolios (one from each of the three program Centers) conduct a self-analysis to examine grantee performance based on objective performance data, financial performance, and other factors. Program staff present their program self-assessments to the PIRT and receive feedback on, for example, targets of concern. In one instance, grantees were surpassing their targets by 200-300%, prompting the board to suggest that the targets be re-examined as appropriate for these high-performing grantees. In addition, the Centers have historically managed internal performance review boards to periodically review grantee performance and provide corrective actions as needed.
  • A new unified data collection system, SAMHSA’s Performance Accountability and Reporting System (SPARS), was put in place in early 2017. Historically, the three program Centers had independent data collection systems that did not allow for global reviews of agency activities. The new system allows for greater transparency about grantee performance across Centers. SAMHSA aligns program objectives and measures through its use of SPARS, SAMHSA’s online data entry, reporting, technical assistance request, and training system for grantees to report timely and accurate data. SPARS is a mechanism by which SAMHSA meets requirements of the Government Performance and Results Act (GPRA) of 1993 and the GPRA Modernization Act of 2010.
  • SAMHSA’s strategic plan, Leading Change 2.0: Advancing the Behavioral Health of the Nation 2015–2018, outlines six strategic initiatives for fiscal years 2015-2018. These initiatives are linked to SAMHSA’s policy, programmatic, and financial planning and support innovative, evidence-based practices that show promising results, as well as the best available expertise and knowledge around the treatment and prevention of mental health and behavioral challenges, for a variety of stakeholders (see p. 5).
  • Pursuant to the 21st Century Cures Act, SAMHSA is required to establish standards for grant programs that, among other factors, address the extent to which grantees must collect and report on required performance measures, and SAMHSA must advance the use of performance metrics recommended both by the Assistant Secretary for Planning and Evaluation (ASPE) (Sec. 6002, pp. 464-465) and the Director of CBHSQ (Sec. 6004, p. 470). In addition, SAMHSA’s Chief Medical Officer is required to coordinate with ASPE to assess the use of performance metrics in evaluation activities, and to coordinate with the Assistant Secretary to ensure programs consistently utilize appropriate performance metrics and evaluation designs (Sec. 6003, p. 468). The Assistant Secretary must also submit a biennial report to Congress that assesses the extent to which SAMHSA’s programs and activities meet goals and appropriate performance measures (Sec. 6006, p. 477).
Score
8
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) federal, state, and local programs in FY17? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies)

  • SAMHSA has five data collection initiatives: the National Survey on Drug Use and Health (NSDUH): population data; the Treatment Episode Data Set – Admissions: client-level data; the National Survey of Substance Abuse Treatment Services (N-SSATS): substance abuse facilities data; the Drug Abuse Warning Network: emergency department data; and the National Mental Health Services Survey (N-MHSS). SAMHSA has made numerous administrative and survey datasets publicly available for secondary use. Each data collection can be sorted by metadata parameters such as geography, methodology, spotlights, data reviews, and data tables. CBHSQ oversees these data collection initiatives and provides publicly available datasets so that some data can be shared with researchers and other stakeholders while preserving client confidentiality and privacy. Some restricted data cannot be shared beyond federal staff.
  • CBHSQ prepares specialized reports on a range of mental health and substance use issues relevant to government officials and policymakers at the state, federal, and community levels. These are publicly available at the National Library of Medicine and the SAMHSA website.
  • SAMHSA’s Data Integrity Statement articulates the adherence of CBHSQ, a Federal Statistical Unit, to the federal common set of professional and operational standards that ensure the “quality, integrity, and credibility” of statistical activities.
  • SAMHSA’s Performance Accountability and Reporting System (SPARS) hosts the data entry, technical assistance request, and training system for grantees to report performance data to SAMHSA. SPARS serves as the repository for the administration’s three program Centers: the Center for Substance Abuse Prevention (CSAP), the Center for Mental Health Services (CMHS), and the Center for Substance Abuse Treatment (CSAT). Due to concerns about confidentiality and privacy, the current data transfer agreement limits the use of grantee data to internal reports: data collected by SAMHSA grantees will not be available to share with researchers or stakeholders beyond SAMHSA, and publications based on grantee data will not be permitted. We expect to revisit the issue once the Commission on Evidence-Based Policymaking releases its findings in September 2017. Enhancements to the existing data collection system to improve data transparency and sharing of administrative and performance data are being planned. The foundational system went live in February 2017. Going forward, changes will allow analytic reports to be shared with grantees so that performance successes and gaps can be better tracked, both by the project officers overseeing the grantees and by the grantees themselves. It is anticipated that this will improve communication and oversight, as well as offer more real-time opportunities to improve program performance.
  • SAMHSA’s Substance Abuse and Mental Health Data Archive (SAMHDA) contains substance use disorder and mental illness research data available for restricted and public use. SAMHDA promotes access to and use of SAMHSA’s substance abuse and mental health data by providing public-use data files and documentation for download, along with online analysis tools, to support a better understanding of this critical area of public health.
  • Per SAMHSA’s Evaluation Policy & Procedure (P&P), CBHSQ will work with CMHS, CSAT, and CSAP Center Directors and other program staff to develop a SAMHSA Completed Evaluation Inventory of evaluations completed between FY11 and FY17. This inventory and the evaluation final reports will then be made available on SAMHSA’s intranet and internet sites. In addition, data files from completed evaluations will be made available on the intranet and via a restricted-access mechanism such as SAMHDA.
Score
7
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding decisions and did it disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY17? (Example: What Works Clearinghouses)

  • There is great diversity across SAMHSA programming, ranging from community-level prevention activities to residential programs for pregnant and post-partum women with substance misuse issues. While this diversity allows SAMHSA to be responsive to a wide set of vulnerable populations, it limits the utility of a common evidence framework for the entire agency. Within Centers (the Center for Substance Abuse Prevention, the Center for Substance Abuse Treatment, and the Center for Mental Health Services), consistent evidence frameworks are in use and help to shape the process of grant-making (e.g., Center staff are familiar with the pertinent evidence base for their particular portfolios). At the programmatic level, staff review the state of the art for a particular topic area to facilitate grantee adoption and implementation of evidence-based practices (EBPs). While staff awareness of EBPs varies, a systematic approach to evidence classification remains to be developed. Most Center staff rely on the National Registry of Evidence-based Programs and Practices to identify evidence-based programs for grantee implementation.
  • SAMHSA includes universal language about using evidence-based practices (EBPs) in its Funding Opportunity Announcements (FOAs), in a section entitled Using Evidence-Based Practices (EBPs). This language acknowledges that “EBPs have not been developed for all populations and/or service settings,” thus encouraging applicants to “provide other forms of evidence” that a proposed practice is appropriate for the intended population. Specifically, the language states that applicants should: 1) document that the EBPs chosen are appropriate for intended outcomes; 2) explain how the practice meets SAMHSA’s goals for the grant program; 3) describe any modifications or adaptations needed for the practice to meet the goals of the project; 4) explain why the EBP was selected; 5) justify the use of multiple EBPs, if applicable; and 6) discuss training needs or plans to ensure successful implementation. Lastly, the language includes resources the applicant can use to understand EBPs. Federal grants officers work in collaboration with the SAMHSA Office of Financial Resources to ensure that grantee funding announcements clearly describe the evidence standard necessary to meet funding requirements.
  • In 2011, based on the model of the National Quality Strategy, SAMHSA developed the National Behavioral Health Quality Framework (NBHQF). With the NBHQF, SAMHSA proposes a set of core measures to be used in a variety of settings and programs, as well as in evaluation and quality assurance efforts. The proposed measures are not intended to be a complete or total set of measures a payer, system, practitioner, or program may want to use to monitor quality of its overall system or the care or activities it provides. SAMHSA encourages such entities to utilize these basic measures as appropriate as a consistent set of indicators of quality in behavioral health prevention, promotion, treatment, and recovery support efforts across the nation.
  • SAMHSA’s National Registry of Evidence-Based Programs and Practices (NREPP) is an online registry of over 450 substance use and mental health interventions that meet minimum review requirements. Its purpose is to “help people learn more about available evidence-based programs and practices and determine which of these may best meet their needs.” By providing a clearinghouse for evidence-based practices, SAMHSA is working to provide timely and relevant scientific knowledge for practical application. In 2015-2016, NREPP underwent a significant set of revisions to ensure a more rigorous review process prior to posting to the registry. While NREPP previously accepted only voluntarily submitted programs, resulting in key gaps due to non-submission for a variety of reasons, the new approach allows SAMHSA to fill registry gaps.
  • Before being posted to NREPP, interventions undergo a review process that ensures reliability by “tak[ing] into account the methodological rigor of evaluation studies, the size of the program’s impact on an outcome, the degree to which a program was implemented as designed, and the strength of the program’s conceptual framework.” The review process results in an “outcome evidence rating” of Effective, Promising, Ineffective, or Inconclusive. Additionally, NREPP provides tools to help decision makers use the information in the best way possible.
  • In addition, the NREPP Learning Center was revamped and launched in June 2017 as a companion site to the more traditional registry, focused on emerging practices and the uptake and adoption of evidence-based programs and practices. Stakeholder input from diverse individuals was sought to guide the development of the new Learning Center. With its June launch, the Learning Center now provides a home for emerging practices and programs of critical interest to vulnerable populations that may not have the resources to engage in rigorous evaluation or for whom traditional evaluation techniques are not culturally appropriate. In addition, the Learning Center engages stakeholders interested in selecting, adopting, and implementing a new program, offering a variety of learning tools, including videos that describe the activities from implementer and developer perspectives.
Score
8
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY17? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with strong evaluation requirements)

  • The SAMHSA Knowledge Network, a collection of technical assistance and training resources provided by the agency, offers behavioral health professionals education and collaboration opportunities, along with ample tools and technical assistance resources that promote innovation in practice and program improvement. Located within the Knowledge Network are groups such as the Center for Financing Reform and Innovation, which works with states and territories, local policy makers, providers, consumers, and other stakeholders to promote innovative financing and delivery system reforms.
  • In addition, via the HHS IDEA Lab, SAMHSA collaborates with other HHS agencies to promote innovative uses of data and technology across HHS, creating a more effective government and improving the health of the nation. SAMHSA has co-developed and submitted several innovative data utilization project proposals to the Ignite Accelerator of the HHS IDEA Lab, such as a project to monitor and prevent opioid overdoses by linking heroin users to resources and information.
  • The agency is currently exploring the use of tiered-evidence frameworks in its award decision-making to actively encourage innovation at the grantee/program level. In addition, pursuant to the 21st Century Cures Act, SAMHSA is establishing the National Mental Health and Substance Use Policy Laboratory (Policy Lab) (Sec. 7001, p. 501) by restructuring the current Office of Policy, Planning, and Innovation (OPPI). The new Policy Lab will review programs and activities operated by the agency to identify those that are duplicative, identify those that are not evidence-based or effective, and formulate recommendations for coordinating, eliminating, or improving such programs (Sec. 7001, pp. 502-503).
  • To further promote innovation, per the Cures Act, SAMHSA’s Assistant Secretary may coordinate with the Policy Lab to award grants to states, local governments, tribes and tribal organizations, and other eligible organizations to develop evidence-based interventions. These grants can help support the evaluation of models and interventions that show promise, or the expansion, replication, or scaling up of interventions that have been established as evidence-based (Sec. 7001, pp. 503-504).
Score
7
Use of Evidence in 5 Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest competitive grant programs in FY17? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring; Pay for Success provisions)

  • The following are SAMHSA’s 5 largest competitive grant programs for which funds were appropriated in FY17: 1) Opioid State Targeted Response ($500 million in FY17); 2) Children’s Mental Health Initiative ($119 million in FY17); 3) Strategic Prevention Framework ($119.5 million in FY17); 4) Substance Abuse Treatment Criminal Justice ($78 million in FY17); and 5) Targeted Capacity Expansion – General ($67.2 million in FY17).
  • The President’s Budget request for SAMHSA for FY18 stipulates “that up to 10% of amounts made available to carry out the Children’s Mental Health Initiative may be used to carry out demonstration grants or contracts for early interventions with persons not more than 25 years of age at clinical high risk of developing first episode of psychosis.” Specifically, funds from this set-aside should address whether community-based interventions during the prodrome phase can prevent further development of serious emotional disturbances and eventual serious mental illness, and the extent to which evidence-based early interventions can be used to delay the progression of mental illness, reduce disability, and/or maximize recovery.
  • SAMHSA includes universal language about using evidence-based practices (EBPs) in its Funding Opportunity Announcements (FOAs), in a section entitled Using Evidence-Based Practices (EBPs). This language acknowledges that “EBPs have not been developed for all populations and/or service settings,” thus encouraging applicants to “provide other forms of evidence” that a proposed practice is appropriate for the intended population. Specifically, the language states that applicants should: 1) document that the EBPs chosen are appropriate for intended outcomes; 2) explain how the practice meets SAMHSA’s goals for the grant program; 3) describe any modifications or adaptations needed for the practice to meet the goals of the project; 4) explain why the EBP was selected; 5) justify the use of multiple EBPs, if applicable; and 6) discuss training needs or plans to ensure successful implementation. Lastly, the language includes resources the applicant can use to understand EBPs. SAMHSA shares evidence-based program and practice language with grantees as they compete for SAMHSA grants and describe the types of program/practice implementation they hope to engage in to address the needs of their particular target populations and communities. The review criteria contained in the FOA make clear that applicants proposing to use programs and practices with a more robust evidence base will receive higher scores and thus greater support for their funding application.
  • The President’s Budget for SAMHSA for FY18 plans to implement a tiered-evidence approach in the Screening, Brief Intervention, and Referral to Treatment (SBIRT) program, which will allow for funding allocations and awards based on the implementation of either innovative practices and programs or more standard programming. Grant funding will be tied to the particular approach taken by the grantee. At the present time, SAMHSA does not use preference points to link funds to evidence of effectiveness; however, the 10% set-aside includes language suggesting that the Coordinated Specialty Care model is a first-episode approach of importance to this work.
  • Among SAMHSA’s standard terms and conditions of all grant funding is the requirement that grantees collect and report evaluation data to ensure the effectiveness and efficiency of its programs under the Government Performance and Results (GPRA) Modernization Act of 2010 (P.L. 111-352). In addition, grantees must comply with performance goals and expected outcomes described in Funding Opportunity Announcements (FOAs), which may include participation in an evaluation and/or local performance assessment. While exemplar FOAs are not available to be shared publicly at this juncture, SAMHSA is developing the first tiered-evidence FOA, to be funded in FY2018, a key step to incentivize innovative practice/program models among grantees.
Score
8
Use of Evidence in 5 Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest non-competitive grant programs in FY17? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

  • The following are SAMHSA’s largest non-competitive grant programs for which funds were appropriated in FY17: 1) Substance Abuse Prevention and Treatment Block Grant Program ($1.8 billion); 2) Mental Health Block Grant Program ($562 million); 3) Projects for Assistance in Transition from Homelessness (PATH) Program ($64.6 million); and 4) Protection and Advocacy for Individuals with Mental Illness (PAIMI) Program ($36.1 million).
  • In FY17, SAMHSA’s Mental Health Block Grant maintained a 10% set-aside for evidence-based programs (see p. 4) to address early serious mental illness (ESMI), including psychosis. In FY18-19 grant applications, states must describe how they will utilize the 10% set-aside to align with coordinated specialty care models, such as the model grounded in the National Institute of Mental Health’s RAISE (Recovery After an Initial Schizophrenia Episode) initiative, or other approved evidence-based approaches. A key assumption of the block grant applications that grantees must meet is that “state authorities use evidence of improved performance and outcomes to support their funding and purchasing decisions” (p. 8). In addition, a quality improvement plan is requested from all grantees, based on the principles of Continuous Quality Improvement/Total Quality Management (CQI/TQM). Grantees are also required to comply with performance requirements, which include assessing how funds are used via data and performance management systems and other tracking approaches.
Score
1
Repurpose for Results

In FY17, did the agency shift funds away from or within any practice, program, or policy that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; proposing the elimination of ineffective programs through annual budget requests)

  • The SAMHSA budget provides performance information along with budget information, which Congress can use to determine funding levels. Each year, when continuation funding decisions are made, the program Centers review grantees within each program, project, or activity in terms of performance and financial management. It is up to each Center to determine the factors that go into decisions related to continued funding, based on guidance from the Office of Financial Management, Division of Grants Management. To the extent that costs are reduced for continuation funding, those funds can be repurposed to fund new grantees or to provide additional contract support for those grantees. In FY 2017, SAMHSA underwent a stringent review process for all funding requests, utilizing both program and fiscal performance. During this process, SAMHSA utilized $51 million in unspent funding from existing grantees to fund new programs and activities.
  • CBHSQ staff conducted an evaluation inventory in the summer of 2016, requesting that program staff from the Centers provide information on how their evaluation findings inform the next iteration of their programs and/or new evaluation activities. For the most part, program staff indicated that evaluation findings were used to improve the next round of funding opportunity announcements and thus grantee implementation of programs.