Title: Program evaluation in higher education
Authors: Abby & Sabrina
1. Introduction
One of the key issues colleges and universities face is connecting students to campus services: there is a gap between the services campuses offer and students’ awareness of them. Campus engagement programs are one way to bridge that gap by allowing students, faculty, and staff to connect. At these programs, students can ask quick questions and become familiar with the many services and resources available, making those resources more accessible. At Georgia State University’s Clarkston campus, Student Life hosted a four-day success program called “A Successful You.” The goal of the program was to provide additional information about the support and resources available to students, who would learn about these services by participating in different events and activities.
Campus support resources have been found to have direct and indirect effects on student graduation rates and student success. Effective use of student services has been shown to increase student engagement, increase student retention, and reduce degree completion time (Blum & Jarrat, 2014). Colleges and universities want to increase student utilization of these campus services because of the impact it can have on student experience and graduation rates. However, between 40% and 50% of first-year students report never having used campus support resources such as academic advising, career planning, or academic tutoring (Kuh et al., 2006). It is apparent that colleges and universities need to make a concerted effort to market these services and resources to students. This chapter follows Georgia State University’s attempt at increasing student engagement and awareness of campus support services through a student services fair, and it shows how program evaluation can be used to improve these types of programs.
Program evaluation is an important part of determining whether a particular program is effective in reaching its goals; it can also provide suggestions for future improvements. Choosing an appropriate evaluation model is crucial to producing an accurate and useful evaluation. While there are many evaluation models to choose from, Rossi’s Five Domain Model and the Kirkpatrick Training Evaluation Model are used to evaluate this case study.
Rossi’s Five Domain Evaluation Model is popular because it is widely applicable to many different types of programs: the evaluation is tailored to the individual needs and resources of the stakeholders. The model contains five evaluation domains: a needs assessment, a theory assessment, an implementation assessment, an impact assessment, and an efficiency assessment. The needs assessment asks whether there is a need for the program. The theory assessment examines whether the program, as designed, is a sound response to that need. The implementation assessment determines whether the program was implemented as planned. The impact assessment checks whether the program reached its goals with the targeted audience. Finally, the efficiency assessment determines whether the program was cost-effective (Reiser & Dempsey, 2018).
The Kirkpatrick Training Evaluation Model follows four levels. Level 1 is reaction: the learners’ feedback on the experience. This should encompass not only the learners’ overall satisfaction but also inquiries about specific components of the training program. Level 2 is learning, which Kirkpatrick defined as “the extent to which participants change attitudes, improve knowledge, and/or increase skill as a result of attending the program” (Reiser & Dempsey, 2018). To measure learning, an achievement test, a performance test, or a questionnaire should be used. Level 3 involves behavior, the transfer of learning to real life; in our case study, this means determining whether students utilize student services after participating in the program. Finally, Level 4 addresses results, which should include all of the organizational outcomes the program affects, such as costs, return on investment, and engagement levels. A program should be evaluated at all four levels, if possible, to be considered most effective. For an audiovisual overview of the Kirkpatrick Evaluation Model, see Orey (2014).
2. Overview of the Case
The “A Successful You” program was designed to let students learn about the student services offered to them at Georgia State University by interacting with different activities and events. Each event had a special focus to give students more time to explore the services, and each offered incentives, such as food and drinks, to motivate student engagement.
The first event focused on student health and wellness. The second covered student safety and success resources. The third covered financial support. The last offered success workshops, such as vision boarding and note-taking, through both traditional and non-traditional approaches. Overall, the program focused on connecting students to campus resources and on interactive workshops for student success in college. It was designed to tackle the core issue of making students more aware of what is available to them, with an emphasis on interactive activities that keep students engaged and motivated. The ultimate goal was to educate students about student services and to increase their well-being.
It is important to determine whether programs like these have the desired impact on students and whether the university should further investigate how to increase student utilization of campus services and resources. If they do, these programs should be held every semester. If they do not increase student utilization of campus resources or otherwise benefit students, the resources should be redirected to other programs on campus.
3. Solutions Implemented
Rossi’s Five Domain Model was applied to this program. Beginning with the needs assessment, the team brainstormed which services are underutilized by students due to a lack of awareness; with such a large array of services available, students may not be aware of everything offered to them. Next, a theory assessment was performed. While other resource fairs take place for students, we thought it would be helpful to have a program with a narrower focus, achieved by breaking the program into four distinct events. The program was also evaluated through an implementation assessment, in which a detailed proposal laid out event planning and logistics; the program followed the outlined plan, passing this evaluation domain. The impact assessment looked at student participation and found that participation goals were exceeded. The efficiency assessment found the program cost-effective because its cost did not exceed the allotted budget.
We also evaluated this case study using the four levels of the Kirkpatrick model. Due to restrictions and other factors, we could not gather data for every level; however, suggestions are listed for future program evaluations.
Level 1 (Reaction): At the end of the program, a posterboard was placed on the table for students to share their thoughts about the program. Students listed what they learned and what they liked or disliked. The Student Life director did not permit a traditional survey at the time, so data at this level came solely from the feedback students left on the posters.
In the future, a survey could instead be distributed through a QR code that links students directly to it, making the survey easy to access and the data easy to collect; a minimal sketch of generating such a QR code appears below. Ideas for questions to ask in a Level 1 survey can be found in Kirkpatrick’s New World Level 1 reaction sheets (Kirkpatrick, 2016).
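As a rough illustration (not part of the original program), the following Python sketch generates a printable QR code for the survey using the open-source qrcode package; the survey URL and output filename are hypothetical placeholders.

```python
# Sketch: generate a QR code linking to a Level 1 reaction survey.
# Requires the "qrcode" package with image support:
#   pip install "qrcode[pil]"
import qrcode

# Hypothetical survey URL; replace with the real survey link.
SURVEY_URL = "https://example.edu/surveys/a-successful-you-level1"

img = qrcode.make(SURVEY_URL)      # encode the URL as a QR image
img.save("level1_survey_qr.png")   # save a PNG to print on flyers or table signs
print("Saved level1_survey_qr.png")
```

Printed on flyers or table signage, a code like this lets students reach the survey in seconds from their phones, which should raise response rates compared to paper forms.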
Level 2 (Learning): For this level, we monitored student participation throughout each event. Since the events centered on teaching students about services, participation was treated as a proxy for learning.
Level 3 (Behavior): Data could not be gathered at this level. We propose that in the future, a follow-up survey be emailed to students who participated in the event, asking which student services (if any) they have engaged with since attending. These data would show how much the event taught students about student services and how much it influenced them to engage with what was available to them.
Level 4 (Results): Data could not be gathered at this level. We propose a longitudinal study comparing students who participated in this event with those who did not. Retention and graduation rates should be gathered for both groups to see whether there is a significant difference between them. If there is, it would suggest that students who engage with student services have better outcomes than those who do not, and it would give the university a concrete return-on-investment (ROI) calculation to help secure funding for future events; a sketch of such a comparison appears below.
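To make the proposed comparison concrete, here is a minimal Python sketch with entirely made-up numbers (the study was not actually run): it tests whether retention differs significantly between groups using a chi-square test from SciPy, then computes a toy ROI figure. The group sizes, retention counts, budget, and tuition figure are all illustrative assumptions.

```python
# Sketch of the proposed Level 4 analysis; all numbers are hypothetical.
from scipy.stats import chi2_contingency

# Contingency table rows: [retained, not retained] for each group.
participants = [172, 28]        # assumed: 200 attendees, 86.0% retained
non_participants = [1530, 470]  # assumed: 2,000 non-attendees, 76.5% retained

chi2, p, dof, expected = chi2_contingency([participants, non_participants])
print(f"chi-square = {chi2:.2f}, p = {p:.4f}")  # p < 0.05 suggests a real gap

# Toy ROI: tuition revenue from attendees retained above the baseline rate,
# net of the event's cost, divided by that cost.
program_cost = 3000            # assumed event budget, in dollars
tuition_per_student = 9000     # assumed annual tuition revenue per student
baseline_rate = 1530 / 2000
extra_retained = 200 * (172 / 200 - baseline_rate)
roi = (extra_retained * tuition_per_student - program_cost) / program_cost
print(f"extra students retained: {extra_retained:.0f}, ROI: {roi:.1f}x cost")
```

A significant chi-square result alone would not prove causation (students who attend events may differ from non-attendees in other ways), which is one reason the longitudinal design proposed above matters.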
4. Outcomes
Rossi’s Five Domain Model Outcomes:
Needs Assessment: A need for the program was established due to students’ lack of awareness of the services offered.
Theory Assessment: The program was needed to provide a narrower focus than existing resource fairs.
Implementation Assessment: The program followed the plan outlined during coordination; implementation was successful.
Impact Assessment: Students were engaged and provided positive feedback about the program. The goal of students learning about the services available to them was met.
Efficiency Assessment: The program stayed within its allotted budget, so it was cost-effective.
Kirkpatrick Training Model Outcomes:
Level 1: Overall, the event was successful: participation goals were exceeded and student feedback was overwhelmingly positive.
Level 2: Student attendance at events was high, with a peak in participation during the first hour and a half. From this, we conclude that a two-hour interval is a better fit for these types of events in the future. The events were originally designed with a longer interval to maximize attendance, since a longer window gives students more opportunities to stop by, and we recognize that classes and other commitments can hinder attendance. The pattern could be specific to this campus, but given the results it seems appropriate to shorten the timeframe; a sketch of how check-in times could be binned to locate such a peak appears below.
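If sign-in sheets were digitized with timestamps (they were not in this case, so the data below are hypothetical), a few lines of Python could tally check-ins per half-hour window and make the participation peak visible at a glance.

```python
# Sketch: bin hypothetical event check-in times into 30-minute windows
# to locate the participation peak. Uses only the standard library.
from collections import Counter
from datetime import datetime

checkins = [  # hypothetical check-in times from one event (HH:MM)
    "11:05", "11:10", "11:18", "11:25", "11:32", "11:40", "11:52",
    "12:01", "12:15", "12:20", "12:44", "13:10", "13:35", "13:55",
]

def half_hour_window(hhmm: str) -> str:
    """Return the label of the 30-minute window a time falls in."""
    t = datetime.strptime(hhmm, "%H:%M")
    minute_bin = 0 if t.minute < 30 else 30
    return f"{t.hour:02d}:{minute_bin:02d}"

counts = Counter(half_hour_window(t) for t in checkins)
for window in sorted(counts):
    print(window, "#" * counts[window])  # crude text histogram of arrivals
```

Run against real timestamps, a histogram like this would turn the “peak in the first hour and a half” observation into evidence supporting the shorter event window.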
5. Implications
Based on student feedback, this program was successful, and events like it should be held at regular intervals. However, to gain a more in-depth understanding of the impact, a program evaluation should be completed after each event. These evaluations should include satisfaction surveys and data about the flow of student traffic in student services offices. A longitudinal study could also follow students who participated in events like this one and compare their retention and graduation rates to those of students who did not participate; this type of data would allow the return on investment to be calculated.
This program can also be designed as fully online or as a blended option with modifications. Due to the ongoing pandemic, it is important to provide virtual options for safety and comfort: students, staff, and faculty may not feel comfortable in an in-person setting due to potential exposure, and fully virtual options are the safest choice. At the same time, keeping students engaged in virtual programs during the COVID-19 pandemic should still be a priority.
Blended options offer a choice to students who are more open to in-person events, with COVID-19 safety guidelines in place for the in-person components. For example, Georgia State University is currently offering a virtual “Destress Fest” and other activities that give students opportunities to engage with their campus community, one of many examples of how colleges and universities are adapting to a mostly virtual world.
Studies are showing an increase in participation for certain virtual services provided by universities; students and faculty do not have to worry about commuting or traffic with a virtual option. For example, Hizer et al. (2017) compared online and face-to-face attendance for supplemental instruction in Supplemental instruction online: As effective as the traditional face-to-face model? Virtual options make these services more accessible to participants, which can increase attendance through sheer convenience. Technical difficulties remain a concern, however; hosting virtual events may require audio and video upgrades. For this reason, various campus services have approached these situations by offering virtual and in-person services separately.
More virtual options have also brought different methods of learning for students. We recognize that adjusting to online learning differs from student to student because learning preferences vary. Previously, digital learning served a smaller population on campuses; now, due to the shift, most classes have become digital. Technology features like breakout sessions and hand-raising help make online interactions a more organized and customized experience.
A formal evaluation should also be conducted for online events to compare the efficacy of online, in-person, and blended options. These evaluations would show what could be improved in the future.
References
Blum, A., & Jarrat, D. (2014, October). Using student services to enhance outcomes and reduce costs. InsideTrack. https://www.insidetrack.org/wp-content/uploads/2014/10/using-student-services-to-enhance-outcomes.pdf
Hizer, S. E., Schultz, P. W., & Bray, R. (2017). Supplemental instruction online: As effective as the traditional face-to-face model? Journal of Science Education and Technology, 26(1), 100–115. https://doi.org/10.1007/s10956-016-9655-z
Kirkpatrick, J. (2016). The new world Level 1 reaction sheets. Kirkpatrick Partners. https://www.kirkpatrickpartners.com/Portals/0/Storage/The%20new%20world%20level%201%20reaction%20sheets.pdf
Kuh, G., Kinzie, J., Buckley, J., Bridges, B., & Hayek, J. (2006, July). What matters to student success: A review of the literature. National Center for Education Statistics. https://nces.ed.gov/npec/pdf/kuh_team_report.pdf
Orey, M. (2014, September 8). Kirkpatrick's 4 levels of evaluation [Video]. YouTube. https://www.youtube.com/watch?v=E-NhbKAzT2Q
Reiser, R. A., & Dempsey, J. V. (2018). Trends and issues in instructional design and technology (4th ed.). Pearson.