Relationship between program planning and grant proposals

Writing an Evaluation Plan | Research at Brown | Brown University

Do you know the difference between goals and objectives? A goal describes the broad change you hope to bring about; an objective spells out the specific results or outcomes you plan to achieve — for example, serving an area of Baltimore with a 2-hour training program that will provide health and nutrition information. Not all grant proposals require an evaluation plan. If one is required, it will generally be listed in the program announcement, most often for larger, more involved projects.

While developing a proposal for a planning grant is much the same as developing a request for a program grant, there are a few important differences. In a program proposal, the discussion of the problem is almost identical to that of a planning proposal.

The plan should be well-reasoned, broadly accepted, and designed to address the factors causing the problem. To the greatest extent possible, the plan should be based on solid evidence that the approaches identified are likely to succeed. For example: within eight months, an Action Plan will be adopted by the community coalition; the Action Plan will be based on research or other evidence that the proposed approaches are likely to be effective in addressing the problem; and process documents will verify that the eight targeted segments of the community were vigorously engaged in data gathering, research, and planning. In a program proposal, the long-term goal of an implementation grant is likewise a reduction in the problem.

These changes will take place during the period of grant funding. Outcomes for a program to address poverty, for example, should answer practical questions: Who will lead the charge? What do we do with the data? The evaluation plan used in your initial proposal must be incorporated into a final report.

Besides being used to answer important scientific questions, the data also serve to show our sponsors and policy makers the impacts we are making. What types of proposals are there? Evaluation plans are not required for IPM Issues proposals; those proposals will instead be evaluated by other measures that assess the feasibility of achieving your stated goals. What does the evaluation process entail?

A National Institute of Mental Health Council workgroup report [58] calls for the engagement of multiple stakeholder perspectives, from concept development to implementation, in order to improve the sustainability of evidence-based services in real-world practice. The engagement of key stakeholders in implementation research affects the impact of proposed implementation efforts, the sustainability of the proposed change, and the feasibility and ultimate success of the proposed research project.

Thus, implementation research grant proposals should convey the extent and manner in which key stakeholders are engaged in the project. Stakeholders and researchers can forge different types of collaborative relationships.

Some authors advocate for the CBPR model as a strategy to decrease the gap between research and practice because it addresses some of the barriers to implementation and dissemination [60-62] by enhancing the external validity of the research and promoting the sustainability of the intervention. The information gathered from stakeholder analysis can then be used to develop strategies for collaborating with stakeholders, to facilitate the implementation of decisions or organizational objectives, or to understand the future of policy directions [63, 64].

Implementation research grant applications are stronger when preliminary data, qualitative or quantitative, reflect stakeholder preferences around the proposed change. Engagement is also reflected in publications that the principal investigator (PI) and key stakeholders have shared in authorship, or in methodological details that reflect stakeholder priorities. Letters of support are a minimal reflection of stakeholder investment in the proposed implementation project.

This is not to say that all implementation research should be conducted in settings with high appetite for change.

  • Writing implementation research grant proposals: ten key ingredients
  • Introduction to Evaluation Plans for Grant Proposals

Implementation research is often criticized for disproportionate focus on settings that are eager and ready for change. The field of implementation science needs information about the process of change where readiness varies, including settings where change is resisted.

Preliminary data on the organizational and policy context and its readiness for change can strengthen an application. Because organization, policy, and funding context may be among the strongest influences on implementation outcomes, context needs to be examined front and center in implementation research [ 69 ].

Due in part to issues with reliability and validity of the measures used in the field, work in this area is ongoing [71, 72]. An evaluation of barriers and facilitators can be conducted through qualitative [78-80] or survey [81, 82] methodology. In fact, a number of scales for measuring implementation barriers have been developed [74, 83, 84]. Letters are stronger when they address the alignment of the implementation effort to setting or organizational priorities or to current or emergent policies.

A number of implementation strategies have been identified and discussed in the literature [36, 85-87]. However, as the Improved Clinical Effectiveness through Behavioural Research Group notes [38], the most consistent finding from systematic reviews of implementation strategies is that most are effective some, but not all, of the time, and produce effect sizes ranging from no effect to a large effect.

Our inability to determine how, why, when, and for whom these strategies are effective is hampered in large part by the absence of detailed descriptions of implementation strategies [ 40 ], the use of inconsistent language [ 44 ], and the lack of clear theoretical justification for the selection of specific strategies [ 39 ].

Thus, investigators should take great care in providing detailed descriptions of implementation strategies to be observed or empirically tested. Implementation Science has endorsed [ 40 ] the use of the WIDER Recommendations to Improve Reporting of the Content of Behaviour Change Interventions [ 88 ] as a means of improving the conduct and reporting of implementation research, and these recommendations will undoubtedly be useful to investigators whose proposals employ implementation strategies.

Additional design-specific reporting guidelines can be found on the Equator Network website [90]. The selection of strategies must be justified conceptually by drawing upon models and frameworks that outline critical implementation elements [10].

Theory should be used to explain the mechanisms through which implementation strategies are proposed to exert their effects [39], and it may be helpful to clarify the proposed mechanisms of change by developing a logic model and illustrating it in a figure [91]. According to Brian Mittman, implementation strategies should be theory-based and should satisfy several further criteria, including practical considerations of cost. We therefore emphasize taking stock of the budget impact of implementation strategies [94] as well as any cost and cost-effectiveness data related to the implementation strategies [95].

Although budget impact is a key concern to administrators, and some funding agencies require budget impact analysis, implementation science to date suffers from a dearth of economic evaluations to draw upon [96, 97]. The empirical evidence for the effectiveness of multifaceted strategies has been mixed: early research touted the benefits of multifaceted strategies [98, 99], while a later systematic review of implementation trials by Grimshaw et al. questioned whether multifaceted strategies are reliably more effective than single-component strategies.

However, Wensing et al. observe that a strategy that is multifaceted in name may still address only a single implementation barrier. For example, providing training and consultation is a multifaceted implementation strategy; however, it primarily serves to increase provider knowledge and does not address other implementation barriers.

Thus, Wensing et al. recommend selecting implementation strategies that match the specific barriers identified. Proposals that employ multifaceted and multilevel strategies that address prospectively identified implementation barriers [ ] may be more compelling to review committees, but mounting complex experiments may be beyond the reach of many early-stage investigators and many grant mechanisms.

However, it is within the scope of R03, R21, and R34 supported research to develop implementation strategies and to conduct pilot tests of their feasibility and acceptability—work that can strengthen the case for sustainability and scalability. Proposal writers should provide preliminary work for implementation strategies in much the same way that intervention developers do, such as by providing manuals or protocols to guide their use, and methods to gauge their fidelity. Such work is illustrated in the pilot study conducted by Kauth et al.

Investigators should also make plans to document any modifications to the intervention and, if possible, incorporate adaptation models into the implementation process, because interventions are rarely implemented without being modified [67].

While providing detailed specification of theory-based implementation strategies is critical, it is also imperative that investigators acknowledge the complexity of implementation processes. Aarons and Palinkas [ ] comment: It is becoming increasingly clear that being prepared to implement EBP means being prepared to evaluate, adjust, and adapt in a continuing process that includes give and take between intervention developers, service system researchers, organizations, providers, and consumers.

The reader should observe that NIH gives different scores for the team's experience with the setting and for the research environment. Investigators can convey capacity in a variety of ways.

Chief among them is building a strong research team whose members bring depth and experience in areas the PI does not yet have. Implementation research exemplifies multidisciplinary team science, informed by a diverse range of substantive and methodological fields [96]. Early career investigators, therefore, should surround themselves with more established colleagues who bring knowledge and experience in areas key to the study aims and methods.

Additionally, the new formats for NIH biosketches and budget justifications enable a clear portrayal of what each team member brings to the proposed study. For NIH applications, the research environment is detailed in the resources and environment section of the application.

Preliminary studies and biosketches provide additional ways to convey the strengths of the environment and context within which an investigator will launch a proposed study. In summary, researchers need to detail the strengths of the research environment, emphasizing in particular the resources, senior investigators, and research infrastructure that can contribute to the success of the proposed study.

A strong research environment is especially important for implementation research, which is typically team-based, requires the expertise of multiple disciplines, and requires strong relationships between researchers and community-based health settings.

Feasibility of proposed research design and methods

One of the most important functions of preliminary work is to demonstrate the feasibility of the proposed research design and methods. Landsverk [ ] urges PIs to consider every possible question reviewers might raise, and to explicitly address those issues in the application.

Data from small feasibility studies or pilot work around referral flow; participant entry into the study; participant retention; and the extent to which key measures are understood by participants, acceptable for use, and capture variability can demonstrate that the proposed methods are likely to work.