The role and position of graduates in the process of evaluating humanities curricula at top universities in Iran and the world, with the aim of designing a desirable scale

Authors

1 PhD Student in Curriculum Studies, University of Isfahan, Isfahan, Iran

2 Professor, Department of Educational Sciences, University of Isfahan, Isfahan, Iran

3 Associate Professor, Department of Educational Sciences, University of Isfahan, Isfahan, Iran

Abstract

Improving the curricula of academic disciplines requires taking into account the perspectives and needs of the curriculum's stakeholder groups. The purpose of the present study was to examine the role of graduates in evaluating the curricula of top universities in Iran and worldwide, in order to design a curriculum evaluation scale specifically for graduates. The research was a mixed-methods study of the sequential exploratory type in the form of instrument design, and the statistical population consisted of six groups. The first group comprised faculty members and curriculum planning experts at top universities, the second group comprised officials of the educational planning centers of top universities, and the third group comprised the relevant officials in the Ministry of Science, Research and Technology; from these three groups, 30 people were interviewed. The fourth group consisted of selected universities of the world, the fifth group comprised instructors of master's degree programs in some humanities disciplines at top universities, and the sixth group comprised 30 employed graduates. The data collection instruments were semi-structured interviews, document analysis, and a researcher-made scale designed specifically for graduates. The validity of the interview questions was content validity, established using the opinions of ten academic experts; the reliability of the interview data was established by re-transcribing the interviews and using a peer group; and the validity of the graduate scale was assessed in terms of content and the scale was validated using the Lawshe coefficient. The internal consistency of the items was estimated using Cronbach's alpha. The findings showed that the world's leading universities draw on the views of all stakeholders, such as faculty members, students, graduates, and employers, in order to improve and revise their curricula. By contrast, in Iranian universities graduates have, apart from exceptional cases, little involvement in curriculum evaluation, and even experts rate the importance and necessity of graduate participation lower than that of other stakeholders. Using the data obtained, a scale for evaluating the curriculum from the graduates' perspective was developed, comprising 62 items and 7 open-ended questions.

Keywords

Article title [English]

The role and position of graduates in the process of evaluating humanities curricula in top universities in Iran and the world in order to design a desired scale

Authors [English]

  • Rasool Golkar 1
  • Ahmad Reza Nasr 2
  • Mohammad Reza Nili 3

1 Doctoral Student, Faculty of Psychology and Education, University of Isfahan, Isfahan, Iran

2 Professor, Faculty of Psychology and Education, University of Isfahan, Isfahan, Iran

3 Associate Professor, Faculty of Psychology and Education, University of Isfahan, Isfahan, Iran

Abstract [English]

Improving the curricula of academic disciplines requires considering the perspectives and needs of the curriculum's stakeholder groups. The purpose of this study was to investigate the role of graduates in evaluating the curricula of top universities in Iran and the world, in order to design a curriculum evaluation scale specifically for graduates. A mixed-methods, sequential exploratory design in the form of instrument development was chosen for this study. The statistical population consisted of six groups. The first group included faculty members and curriculum experts at top universities, the second group comprised officials of educational planning centers at top universities, and the third group comprised the relevant officials in the Ministry of Science, Research and Technology; a total of 30 people from these three groups were interviewed. The fourth group consisted of selected universities of the world, the fifth group included instructors of master's degree programs in some humanities disciplines at top universities, and the sixth group consisted of 30 employed graduates. The data collection tools were semi-structured interviews, document analysis, and a researcher-made scale designed specifically for graduates. The validity of the interview questions was content-based and drew on the opinions of ten academic experts. The reliability of the interview data was established by re-transcribing the interviews and using a peer group; the validity of the graduate scale was assessed in terms of content, and the scale was validated using the Lawshe coefficient. The internal consistency of the items was estimated using Cronbach's alpha. The findings showed that the world's leading universities use the views of all stakeholders, such as faculty members, students, graduates, and employers, to improve and revise their curricula. By comparison, in Iranian universities graduates, except in exceptional cases, have little involvement in the evaluation of curricula, and even experts consider the importance and necessity of graduate participation to be lower than that of other stakeholders. Using the data obtained, a scale with 62 items and 7 open-ended questions was developed to evaluate the curriculum from the perspective of graduates.
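For reference, the two statistics named above have standard definitions; the abstract itself does not state them, so the conventional formulas are sketched here only as background. Lawshe's content validity ratio for a single item is

\mathrm{CVR} = \frac{n_e - N/2}{N/2},

where n_e is the number of expert panelists who rate the item as "essential" and N is the total number of panelists, and Cronbach's alpha for a k-item scale is

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^2}{\sigma_X^2}\right),

where \sigma_i^2 is the variance of item i and \sigma_X^2 is the variance of the total scale score.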

Keywords [English]

  • curriculum evaluation
  • graduates
  • top public universities
  • credible world universities
  • scale