PROGRAM EVALUATION
By Jacqueline Bess, 12/05/2021
INTRODUCTION
In the United States, formal support systems were developed to aid members of the community whose lives are negatively affected by sociological issues, especially when those issues in turn affect the well-being of society as a whole. The U.S. Department of Health & Human Services (HHS), first organized in the 1950s as the Department of Health, Education and Welfare (HEW), provides such support through federally funded essential human services intended to protect the general well-being of the U.S. population. Likewise at the state level, since 1972 the Georgia Department of Human Services (DHS) has pursued its mission of promoting self-sufficiency, safety and well-being for Georgians. Within DHS, the Division of Family & Children Services (DFCS) investigates reports of child abuse, finds foster and adoptive homes for abused and neglected children, and provides numerous support services and programs for families in need.
At both the federal and state levels, ensuring effective delivery of human services requires regular evaluation of these government programs. HHS’ Administration for Children & Families (ACF) conducts social services program evaluations and uses the evidence gathered to inform policy and practice in support of improving the lives of American children and families.
A learning organization with a culture of continuous improvement requires many types of evidence, including not only evaluation, but also descriptive research studies, performance measures, financial and cost data, survey statistics, program administrative data, and feedback from service providers, participants and other stakeholders. (ACF, 2021)
Yet such rigorous, research-based evaluation policies have not always been the norm. Prominent sociologist and distinguished educator Peter H. Rossi (1921-2006) became known for evaluating federally funded health and human services initiatives before regular program evaluation was standard practice. His application of sociological expertise in these areas shaped policymaking and funding and established him as a renowned leader in program evaluation.
Concern for large-scale evaluations originated during President Johnson’s War on Poverty in the 1960s, but even as poverty slipped from public consciousness and Reaganomics and high stock prices came to dominate the American political scene, Peter demonstrated how to continue the fight. … Systematic evaluation could highlight the effects of government programs on American lives and thereby enable policymakers, politicians, and voters to make better decisions. (Huber, 2006)
Rossi’s professional contributions included developing, with several colleagues, the Five-Domain Evaluation Model. The model holds that structured evaluations are necessary for social program improvement, accountability, and knowledge generation, and it provides a framework for instructional program evaluation by guiding the evaluator to answer questions that fall into one or more of five domains: the need for the program, its theory and design, its implementation and service delivery, its outcome and impact, and its costs.
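To make the model’s structure concrete, the following minimal Python sketch (a hypothetical illustration, not drawn from Rossi’s text) organizes the five domains and their guiding questions so an evaluator could track which domains have gathered evidence and which remain open:

# Hypothetical sketch: Rossi's five evaluation domains as a simple checklist.
# Domain names and guiding questions follow the description above; the
# tracking logic itself is illustrative, not part of the model.
FIVE_DOMAINS = {
    "need": "Is there a need for the program?",
    "theory_and_design": "Is the program well designed?",
    "implementation": "Is the program implemented effectively?",
    "outcome_and_impact": "Does the program have the intended outcomes?",
    "efficiency": "Is the program cost effective?",
}

def report_coverage(findings_by_domain):
    """Print each guiding question and whether evidence has been gathered."""
    for domain, question in FIVE_DOMAINS.items():
        status = "covered" if findings_by_domain.get(domain) else "open"
        print(f"{question} [{status}]")

# Example: an evaluation with evidence for need and design, but not costs.
report_coverage({"need": ["legal mandate"], "theory_and_design": ["blended learning design"]})

Note that the five guiding questions above are the same questions that organize the case study below, one per domain.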
A thorough explanation of the Five-Domain Evaluation Model is presented in Rossi’s book, Evaluation: A Systematic Approach, which also provides this definition:
Program evaluation is the use of social research methods to systematically investigate the effectiveness of social intervention programs in ways that are adapted to their political and organizational environments and are designed to inform social action to improve social conditions. (Rossi, Lipsey, & Freeman, 2004, p. 28)
Simply asking frontline social services staff to report on program effectiveness by charting successful client outcomes is ineffective because confirmation bias (the tendency to interpret information in ways that favor preexisting beliefs) is almost always present. Reaching valid conclusions through program evaluation requires independent evaluators and systematic methods. Likewise, instructional programs designed for an organization or corporate entity require evaluation that goes deeper than end-of-training surveys: participants may be highly satisfied with their training, yet that alone does not demonstrate that knowledge transferred or that trainees developed skills and competencies to the point of producing organizational impact (Blume, Ford, Baldwin, & Huang, 2010, p. 1066).
OVERVIEW OF THE CASE
This case study focuses on the instructional design of DFCS’ Case Manager Certification Program and a 2019-2020 evaluation conducted by Margery Gildner, an organizational development consultant who specializes in reviewing needs assessments and has additional expertise in facilitating child welfare training for government agencies. The case study is organized around the five domains of Rossi’s program evaluation model.
CASE STUDY
Georgia Division of Family & Children Services (DFCS), Social Services Case Manager (SSCM) Training and Certification
Domain 1 – Needs Assessment
Is there a need for the program?
First, consider DFCS’ needs statement: SSCM training is critical to the development of a skilled child welfare workforce capable of achieving outcomes of safety, permanency and well-being for children entrusted to the care of the public child welfare system.
Second, legally mandated requirements for DFCS were enacted following a June 2002 class action lawsuit against the state of Georgia and DFCS. The lawsuit, Kenny A. v. Perdue, was filed on behalf of a class of children in Fulton and DeKalb County foster care, alleging violations of their federal and state rights to adequate protection and services while in the state’s foster care system. The resulting Kenny A. v. Perdue Consent Decree requires monitoring of DFCS child welfare services, with stipulations pertaining to employee training as well as the production of public reports every six months.
Georgia State University provides data support in producing these biannual monitoring reports through the Andrew Young School of Policy Studies Center for State and Local Finance, cataloged as Kenny A. v. Deal Monitoring. When an evaluation process is an ongoing function that occurs regularly, it is called program monitoring and can include outcome monitoring (Rossi et al., 2004). Thus the needs assessment is confirmed not only by DFCS’ desire to effectively train its workforce, but also by the legal mandate requiring proficient, effective professional training and development of staff.
Domain 2 – Theory Assessment
Is the program well designed?
The SSCM training certification program includes classroom and online training, practice activities, and assessments through which the SSCM demonstrates basic competencies in child welfare practice. Subareas (specific learning tracks) include Child Protective Services, CPS Intake Communication Center (CICC) – Georgia Gateway, Foster Care Services, Adoption Services, and Resource Development. Instructional strategies include Field Practice Activities (FPA), online training modules, Transfer of Learning (TOL) activities such as shadowing veteran case managers, classroom training, and course assessments (tests completed at the end of each class, with a passing score of 85%). The SSCM must also complete training in Georgia SHINES, Georgia’s statewide automated child welfare information system for case management and data collection, which serves as the legal case record for the state’s involvement with families.
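As an illustration only, the certification requirements just described can be read as a simple rule set. The following Python sketch is hypothetical: the component names and the 85% passing score come from the program description above, while the checking function and its inputs are assumptions made for illustration.

# Hypothetical sketch of the SSCM certification rule set described above.
# The 85% passing score is stated in the program description; the required
# component names and this checking function are illustrative assumptions.
PASSING_SCORE = 85
REQUIRED_COMPONENTS = {"FPA", "online_modules", "TOL", "classroom", "SHINES"}

def ready_for_certification(assessment_scores, completed_components):
    """Return True if every course assessment met the passing score and
    every required training component has been completed."""
    all_passed = all(score >= PASSING_SCORE for score in assessment_scores)
    return all_passed and REQUIRED_COMPONENTS.issubset(completed_components)

# Example usage:
print(ready_for_certification([92, 88, 85], REQUIRED_COMPONENTS))  # True
print(ready_for_certification([92, 80], REQUIRED_COMPONENTS))     # False: one score below 85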
EVALUATION RESULTS: The evaluator found the overall training strategy thoroughly outlined in the SSCM training manual, which accurately describes how the certification program builds toward successful SSCM performance. The evaluator noted a solid blended learning structure conducive to adult learners: 22 field activities, 17 transfer of learning activities, eight weeks of classroom training, course assessments, and simulation training to practice essential case documentation skills. The evaluation cited solid instructional design and development indicators, with defined competencies and key learning objectives stated, and referenced the specialized competencies in adoption and foster care, services to adolescents, sexual abuse, and assessments of critical thinking and corresponding skills. The evaluator also mapped objectives to the fundamentals of child welfare practice to determine whether outcomes would be achieved, and observed that a variety of learning tools were present, including video, PPT slides, Kahoot! learning games, and participant handbooks. Further comments addressed Georgia’s Practice Model and Solution-Based Casework (SBC) in the Week 1 and Week 2 Field Activities. The evaluator did flag an inconsistency: four of the five SBC competencies and corresponding skills used higher-level learning language, while the fifth matched the language of the more basic skills competencies. Because these notations concerned the manual materials only, the evaluator requested to revisit the full classroom curriculum to determine whether the activities are aligned overall with the intended skills and competencies.
Domain 3 – Implementation Assessment
Is the program implemented effectively?
EVALUATION RESULTS: While the bulk of the evaluation centered on competencies and outcomes, discussion of training implementation was limited in scope. The evaluator did focus on the sequencing of the overall training program, in particular questioning the additional training requirements after initial certification (at six months, six to nine months, and 12 months), which were not detailed in the SSCM manual. These post-certification activities require supervisor review of the case manager’s case records, supervisor-led ongoing coaching and mentoring, an interim performance management plan (PMP), and ongoing professional development (20 hours annually).
Domain 4 – Impact Assessment
Does the program have the intended outcomes?
EVALUATION RESULTS: The evaluator asked for more data on what is and is not working within the training, requesting access to information on how both the in-house training and the provider agency training were received. The evaluator also asked to observe classroom and simulation activities and to review the observation tools used to assess employees. Further, the evaluator sought clarity on the training responsibilities of the Supervisor, Training Coordinator, Field Instructor, Field Practice Coach, and Field Program Specialist, on how their roles shape the support and feedback provided during and after the certification program, and on whether that support is working. Finally, the evaluator noted that these education and training staff are not responsible for the final SSCM certification decision, which rests with the County Director; questions therefore remained unanswered about the procedures for confirming outcomes and granting certification.
Domain 5 – Efficiency Assessment
Is the program cost effective?
EVALUATION RESULTS: The evaluator did not complete an efficiency assessment.
In the absence of reported findings on efficiency, Rossi’s Evaluation text is a useful reminder of the general assumption made about social programs intended to ameliorate social problems and improve social conditions: interested stakeholders hold them accountable for their contributions to societal good. “The investment of social resources such as taxpayer dollars by human service programs is justified by the presumption that the programs will make beneficial contributions to society” (Rossi et al., 2004, p. 49).
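Had an efficiency assessment been performed, it would typically take the form of a cost-effectiveness or cost-benefit analysis of the kind Rossi’s text describes. In general form (a standard formulation, not a finding from this case):

cost-effectiveness ratio = total program cost / units of outcome achieved

For a training program, the outcome units might be, for example, case managers certified or retained; the lower the cost per unit of outcome, the more efficient the program.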
GA DHS serves more than 2 million Georgians, employs almost 9,000 people, and has an annual budget of $1.89 billion. While the cost effectiveness of the DFCS SSCM training program was not reviewed, DFCS, as a government entity, is required to remain transparent and accountable to its stakeholders; it currently publishes its Federal Regulations and Data, Federal Reviews and Plans, Strategic Plan, and Annual Report documentation online.
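For a sense of scale, a back-of-the-envelope calculation using the agency-level figures above (context only; this is not a cost-effectiveness analysis of the training program, whose own costs were not reported):

# Rough scale figures from the agency-level numbers cited above.
# These contextualize the budget; they say nothing about training costs.
annual_budget = 1.89e9      # dollars
georgians_served = 2.0e6    # "more than 2 million"
employees = 9_000           # "almost 9,000"

print(f"Budget per Georgian served: ${annual_budget / georgians_served:,.0f}")  # ~$945
print(f"Budget per employee: ${annual_budget / employees:,.0f}")                # ~$210,000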
SUMMARY OF OUTCOMES & IMPLICATIONS
Instituting training programs for a workforce that serves clients experiencing sociological issues is inherently more difficult than, say, creating computer or sales training for a global corporation’s workforce. In the corporate case, the causes of the overarching problem can be assessed and resolved by implementing solutions (for example, a corporate-wide productivity plan supported by training). In social services, however, the underlying issue can only be addressed indirectly (for example, gauging how child abuse and neglect interrelate with parental substance abuse, then delivering training that helps staff assist clients in these areas), and society has never succeeded in alleviating these social issues entirely. It is therefore especially important to continuously and thoroughly evaluate social programs and the associated training programs for the workforce that provides these services.
Still, evaluating social services workforce training programs requires different thinking and actions than, say, reviewing teacher training initiatives for public school districts. Although both sets of trainees work with the public, the never-ending sociological issues present in society add a complex dynamic to the responsibilities of the social services workforce. The implications are visible in the legal ramifications of the “Kenny A.” lawsuit, which required additional training solutions whose outcomes are continuously monitored through the court system; any training solution implemented is now subject to ongoing legal review. Significant and costly lessons about overwhelming caseloads and inadequate caseworker training have been learned. Yet children entering foster care remains an ongoing societal issue, and reducing their numbers is frequently discussed in the “Kenny A.” decree monitoring reports.
The additional complexities inherent in providing government-funded social services perhaps explain the federal restructuring from HEW, where health, education and welfare services were combined in one agency, into the present-day HHS alongside a separate Department of Education. On a positive note, the analysis, data collection and research conducted during program evaluation can open pathways toward change and better solutions for sociological issues in society.
RECOMMENDED RESOURCES
For further study and research on program evaluation methodologies as they relate to human services, the following resources are recommended:
Centers for Disease Control and Prevention (CDC), Program Performance and Evaluation Office (PPEO): “Program Evaluation for Public Health Programs: A Self-Study Guide”
https://www.cdc.gov/eval/guide/introduction/index.htm
Centers for Disease Control and Prevention (CDC): Evaluation Fellowship Program
https://www.cdc.gov/eval/fellowship/index.htm
University of Wisconsin-Madison, Division of Extension, Program Development and Evaluation: “Evaluating Programs”
https://fyi.extension.wisc.edu/programdevelopment/evaluating-programs/
U.S. Department of Health & Human Services (HHS), Office of Planning, Research & Evaluation (OPRE), Administration for Children & Families (ACF): “Evaluation”
https://www.acf.hhs.gov/opre/report/acf-evaluation-policy
U.S. Department of Health & Human Services (HHS), Administration for Children & Families (ACF), Title IV-E Prevention Services Clearinghouse: “Handbook for Standards and Procedures”
https://preventionservices.abtsites.com/resources
REFERENCES
Administration for Children & Families (ACF). (2021). ACF evaluation policy. Retrieved from the ACF website: https://www.acf.hhs.gov/opre/report/acf-evaluation-policy
Blume, B., Ford, J., Baldwin, T., & Huang, J. (2010). Transfer of training: A meta-analytic review. Journal of Management, 36(4), 1065-1105.
Chyung, S. (2015). Foundational concepts for conducting program evaluations. Performance Improvement Quarterly, 27(4), 77-96.
Fox, M. (2006, October 13). Peter H. Rossi, 84, sociologist who studied homelessness, dies. The New York Times.
Georgia Department of Human Services (DHS). (2021). Strategic plan. Retrieved from the website: https://dhs.georgia.gov/organization/about/dhs-strategic-plan
Georgia Division of Family & Children Services (DFCS). (2021). 2020-2024 Strategic plan. Retrieved from the website: https://dfcs.georgia.gov/about-us/2020-2024-strategic-plan
Georgia Division of Family & Children Services (DFCS). (2021). Annual report. Retrieved from the website: https://dfcs.georgia.gov/about-us/dfcs-annual-report
Huber, J. (2006). Footnotes: Peter H. Rossi (1921-2006). American Sociological Association. Retrieved from the website: https://www.asanet.org/sites/default/files/savvy/footnotes/dec06/indextwo.html
Kenny A. v. Perdue, 365 F. Supp. 2d 1353 (N.D. Ga. 2005). Retrieved from the Civil Rights Litigation Clearinghouse website: https://www.clearinghouse.net/detail.php?id=11053
Rossi, P., Lipsey, M., & Freeman, H. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: Sage Publications.
Weiss, C. (1993). Where politics and evaluation research meet. Evaluation Practice, 14(1), 93-106.