This is the first review I have attempted on a program evaluation. I hope I am somewhat on track :) Please feel free to provide feedback, suggestions or comments as I am sure it will be useful in the future.
In June 2005, the Medical Education Department at Dokuz Eylul University School of Medicine (DEUSM) revised the school's existing program evaluation approaches.
After careful consideration, they chose a mixed evaluation model that was systematic and multidimensional and was developed to meet their institutional needs. After reading through the program evaluation, I also found some of Scriven's evaluation methods present.
The evaluation consisted of three main questions as the areas of inquiry based on the general education goals:
1. What are the effects of the educational program on students and graduates?
2. What are the effects of the educational program on trainers?
3. Is the educational program being implemented as planned?
I think it is impressive when an institution conducts a program evaluation in order to make necessary changes that in turn will make their program more successful. After only one year of the evaluation, the institution made important revisions to the educational program. This was a case where the program evaluation was completed for a purpose: improving the current program.
Some of the strengths and weaknesses I found for DEUSM’s evaluation are noted below:
Strengths:
· Started the evaluation with a purpose.
· Had 3 guiding questions for the evaluation.
· Made necessary changes after only one year.
· Had a variety of methods for data collection.
· Had a place for data analysis and interpretation.
· Results were used for program improvement.
· Obtained qualitative and quantitative data.
· Looking at An Overview of Evaluation Theories by Michael Scriven, the idea of allowing those who are being evaluated to participate in the evaluation has become increasingly popular. This makes the evaluation more collaborative and participatory. It makes sense to have the people being evaluated take part in the evaluation, and this was the case for DEUSM.
Weaknesses:
· Data was, for the most part, collected only once a year.
· Due to time limitations, some activities were moved to the next year.
Overall I feel the program evaluation was a success and met the needs of the institution. They made effective changes based on the two-year evaluation program, including revising their curriculum, reducing the frequency of exams, and diversifying socio-cultural activities in order to meet the needs of the students. Other content-based changes were made, and the school plans on continuing the program evaluation.
As stated earlier, I found that the DEUSM program evaluation adheres to Scriven's and mixed-methods approaches to evaluation.
Some of the key points of program evaluation according to Scriven that DEUSM used are:
• Simple approach to evaluation (started with 3 questions)
• Purpose of evaluation can be goals or roles (goal was to improve the current program)
• Goals - outcomes of the program, the reason for a program; you really need to study the goals of a program (DEUSM question 3: is the educational program being implemented as planned?)
Scriven also says that the formative part of evaluations gives feedback during the delivery of a program for immediate or future modification. DEUSM started making modifications after the first year of evaluation.
Scriven (1991) also says that formative evaluation is typically conducted during the development or improvement of a program; in this case it was their medical program that they wanted to improve. He also states that the summative part of the evaluation can evaluate the learning materials and learning process to see what changes need to be made for improvement, which DEUSM did after one year.
Mixed Methods
• Rarely is a pure theory applied (DEUSM had questions that related to their program needs)
• Usually a mixture: parts and pieces, with modifications to fit a particular situation. (DEUSM had a variety of methods for data collection, analysis and interpretation, and used questions that would guide their evaluation needs.)
I think the program evaluation completed by DEUSM was very thorough and useful. They completed a program evaluation but also took important steps in making their findings functional and purposeful. They actually did something with the findings in order to improve their current program. They used a variety of ways to obtain valid information and data, which in turn allowed for the needed changes.
Hi Lisa,
Good analysis of the program evaluation. I found it interesting when I went to the link that the results were not stated, only mention of the methods used to determine each area and that changes were implemented. Maybe I missed something.
Glenys
Thanks! Yeah, the article just said they made changes and some of the changes that they implemented but not the actual results. They did say they would continue with the evaluation, so I assumed it was positive.
Excellent analysis, Lisa. You have done a great job of connecting what you have chosen to the literature. The repeated connections you make to the theorists are valid. Your way of delivering your findings is clear and well organized. The positive outcome you share about the actual application of the recommendations is great. Making recommendations without changes is a waste of a PE.
Jay