


Program Evaluation

Katherine Frew

Capella University

Suhad Sadik

November 13, 2021

Second Chance Riverbend Human Services Program Evaluation

Second Chance Riverbend (SCR) is a non-profit human services agency that serves both adults and juveniles. Adult programming aims to reduce recidivism, while juvenile programming seeks to prevent and divert youth from the criminal justice system. Adults receive counseling, employment coaching, and supervised housing; juveniles receive counseling, schooling, and anger management services. SCR has submitted data for program review and is seeking accreditation, which requires it to demonstrate that it can deliver services and has processes in place to oversee program compliance (Sylvia & Sylvia, 2012).

Types of Program Evaluations Applied to the Case

SCR’s program assessment will consist of three evaluations to measure the effectiveness of its services: a standards-based, an outcomes-based, and a process-based evaluation. An organization’s policy and planning processes require program evaluation (Guyadeen & Seasons, 2016). Evaluating a program involves recognizing a problem, developing a solution, implementing it, and assessing its success (Guyadeen & Seasons, 2016). Program assessment measures the effectiveness of an intervention and identifies opportunities for improvement (Guyadeen & Seasons, 2016). Evaluation data also allow program stakeholders to confirm that the program’s investments are reasonable (Sylvia & Sylvia, 2012). Stakeholders are significant in organizations because they have a “stake” in a program and can influence, or be affected by, its process and outcomes (Chyung, 2015).

Standards-Based Evaluation

This type of evaluation analyzes a program’s or organization’s effectiveness against established criteria. Sylvia and Sylvia (2012) define standards as a set of criteria used to evaluate organizational operations; standards can also be engineered from historical data. In the Riverbend City case, SCR underwent comprehensive evaluations covering regulatory compliance, finances, and fundraising capacity. Based on the application overview, the findings demonstrate that SCR complies with the stipulated criteria for ongoing accreditation. Standards generally address the adequacy of an organization’s efforts to serve its clients and the impact of those efforts on the issue of interest (Sylvia & Sylvia, 2012). SCR met the necessary standards.

Outcomes-Based Evaluations

An outcomes-based evaluation measures a program’s or organization’s outcomes; calculations provide insight into what an organization or program actually achieves (Sylvia & Sylvia, 2012). SCR used outcomes-based assessment to analyze services incorporated into the First Chance Program, such as anger management, Courageous Characters, and tutoring. The scenario’s outcomes are evaluated using general indicators, which Sylvia and Sylvia (2012) describe as data that the program regularly collects. The benchmark and result data points prove useful in measuring each program’s goals. An organization’s or program’s success depends on outcome evaluation, which seeks to answer one central question: did the work produce the desired outcome for clients and the community (Corporation for National and Community Service, n.d.)? The First Chance Program appears to have met or exceeded its intended outcomes.

Process-Based Evaluation

An organization or program might assess its management, operations, and support functions, including marketing and communications. Fund development is another support function that facilitates achieving the desired objectives (TSNE MissionWorks, 2018). The Riverbend City scenario used a process evaluation to assess program changes based on quantitative and qualitative data. An optimal strategy is the four-phase process intervention/evaluation model (Sylvia & Sylvia, 2012), which moves through problem identification, solution generation, implementation, and feedback evaluation. SCR gathered data through surveys, focus groups, and ethnographic interviews. In summary, process evaluations examine the types and amounts of services delivered, their recipients, and the recipients’ needs; they also examine problems encountered in delivering services and the solutions devised to address them (TSNE MissionWorks, 2018).

Program Assessment

The program assessment uses a ten-point checklist to evaluate the Riverbend City First Chance and Second Chance programs. Is the program continuing or experimental? According to the documentation and reports, both programs are current and fully operational. Decision-makers and program leaders often favor ongoing programs (Sylvia & Sylvia, 2012). Program leaders are conducting surveys to assess the organization’s overall health.

Audience: This evaluation’s intended audience is the programs’ stakeholders and leaders. By involving staff in the design and implementation of the evaluation, the evaluator can boost staff participation in reviewing and accepting outcomes. Engaged workers will implement change based on assessment outcomes, even unfavorable ones (Sylvia & Sylvia, 2012).

Appropriateness of Needs, Designs, and Measures to the Audience: The design and measures appear to meet the programs’ needs; the tutoring program, for example, addresses an identified client need. The next step is developing a design that aids decision-making. Decision-makers may also want to know whether the program is efficient and whether officials follow procurement, contracting, and other requirements (Sylvia & Sylvia, 2012).

Any Existing Desire for Impact or Outcome Evaluations: An outcome evaluation was conducted to establish standards and report outcomes for the First Chance and Second Chance programs. Generic indicators summarize the evaluations’ impact.

Objective of the Evaluation: This review ensures that the programs work and that they follow the Board of Directors’ and stakeholders’ standards. Some assessments are conducted to satisfy an oversight agency or legislature; others seek to know how the program compares to alternatives (Sylvia & Sylvia, 2012).

Source of Data on Client Populations: The First and Second Chance programs draw on both decision-oriented data and client population data. The outcome evaluation provides demographic and program service data.

Value of Insight from the Evaluation on Funding: Administrators often lose funding due to the legislature’s requirement to balance the budget (Sylvia & Sylvia, 2012). The information supplied suggests that the programs are working to retain their funding and to continue serving those involved in the criminal justice system.

Does Funding Inhibit Design Considerations? The First and Second Chance programs have proved their ability to maintain accreditation, regulate themselves effectively, and deliver the outcomes required to remain funded. The organization’s track record shows that its funding practices are sound and meet quality evaluation requirements.

Input from Program Staff on the Centrality of the Measures Against Agency Goals: SCR’s mission is to provide alternatives to the criminal justice system in Riverbend City. Based on stakeholder interviews and program documentation, the programs meet their goals of benefiting the criminal justice community.

Actions to Undertake, Their Impact, and Stakeholders Affected: Assessing the program ensures it meets policy and criteria. The metrics in place also show what the Second Chance program achieves, and this transparency allows interested parties to see the program’s high-level offerings.

Ethical Issues that Arise in Program Evaluation

The assessment plan must also factor in ethical considerations, and it is essential not to transgress ethical lines: integrity and honesty should be on an evaluator’s mind throughout the evaluation process. The evaluation should provide evidence of compliance with protections for persons and informed consent requirements. In this case, Standard 2 and Standard 18 of the ethical standards for human services professionals apply. Standard 2 requires obtaining informed consent before providing services at the start of the helping relationship. Except in cases of a judicial order, clients should be entitled to withdraw consent and to ask questions before accepting services, and clients who cannot give consent should have those legally authorized to do so review and provide proper consent (Barrett, 2019). Participants in such studies must therefore offer informed consent to an evaluator; proceeding without it can create ethical difficulties. Standard 18 reads, “Human service workers appropriately describe the success of treatment programs, interventions, therapies, and techniques” (Barrett, 2019). This implies that human service professionals must base their assessments of programs on data-driven information rather than on personal beliefs. Standard 7 supports this when it states that “human service providers guarantee that their values or biases are not forced on their clients” (Barrett, 2019).

Program Evaluation Critique

Recognizing the distinctiveness of an organization’s context and aims, according to Ackermann and Eden (2011), allows managers to identify specific stakeholders and be explicit about their importance for the organization’s future. Stakeholder involvement throughout the review process can be valuable if handled effectively. However, little stakeholder participation appears to have been harnessed during the standards-based evaluation, which provided an overview of the evidence used to maintain accreditation; the annual accreditation review did not acknowledge stakeholder participation. Some stakeholder data did appear to contribute to the outcomes-based evaluation. The program facilitated the development of a reporting system, which proved helpful in collecting the information used in this report. After consent was obtained, qualitative and quantitative data were drawn from student records, providing a clear picture of who was being served and how. The participants’ demographic and academic profiles were provided as part of the evaluation.

Stakeholder input appeared more critical in the process-based evaluation than in any other assessment. According to Ackermann and Eden (2011), stakeholders who occupy various grid positions (for example, as both partners and competitors) may view some of the organization’s strategies favorably and others negatively. It is therefore clear that strategic planning was in place to ensure appropriate management. SCR solicited feedback from guidance counselors, principals, and school social workers during the process review; the information was gathered through a survey, summarized, and used to reach specific conclusions. It is critical to get feedback from all stakeholders to identify problem areas if any exist, since everyone’s contribution aids in resolving problems. When stakeholders react to organizational activity, they do so with respect to other stakeholders and the focal organization, and one stakeholder’s efforts can elicit a dynamic response from various stakeholders (Ackermann & Eden, 2011).


Program evaluation is key to determining whether a program is fully operational, and these assessments attest to SCR’s functionality. Program review also verifies that organizations follow program standards and policies. Standards-based, outcomes-based, and process-based assessments are the three types of program evaluation that determine whether a program has a sound governing foundation. They are the glue that ties a non-profit organization or program together and ensures that it can remain accredited, funded, and operational to help those in need.


References
Ackermann, F., & Eden, C. (2011). Strategic Management of Stakeholders: Theory and Practice. Long Range Planning, 44(3), 179–196. https://doi.org/10.1016/j.lrp.2010.08.001

Barrett, S. (2019). Ethical Standards for HS Professionals. National Organization for Human Services. https://www.nationalhumanservices.org/ethical-standards-for-hs-professionals

Chyung, S. Y. Y. (2015). Foundational Concepts for Conducting Program Evaluations. Performance Improvement Quarterly, 27(4), 77–96. https://doi.org/10.1002/piq.21181

Guyadeen, D., & Seasons, M. (2016). Evaluation theory and practice: Comparing program evaluation and evaluation in planning. Journal of Planning Education and Research, 38(1), 98–110. https://doi.org/10.1177/0739456X16675930

Sylvia, R. D., & Sylvia, K. M. (2012). Program planning and evaluation for the public manager. Waveland Press.

TSNE MissionWorks. (2018). Process Evaluation vs. Outcome Evaluation. TSNE MissionWorks. https://www.tsne.org/blog/process-evaluation-vs-outcome-evaluation

Corporation for National and Community Service. (n.d.). HandsOn Network: Evaluating your volunteer program: Three types of volunteer program evaluation [PDF file]. https://www.nationalservice.gov/sites/default/files/resource/TypesofEvaluation.pdf