With reference to the presage-process-product model of student learning (Prosser & Trigwell, 1999), the Student Learning Experience Questionnaire (SLEQ) was developed primarily to assess a broad range of students’ perceptions of learning and teaching, together with their perceptions of achieving the University’s educational aims. Questionnaire development and revision have gone through piloting and focus group discussions to ensure the clarity and robustness of the developed or revised questions. The questionnaire is reviewed and refined regularly to capture the latest teaching and learning practices.
The survey data are analysed at the institutional, Faculty, Department, curriculum, and programme levels. Within each cohort, longitudinal tracking follows the individual students who completed the survey in both their freshman and senior years. Across cohorts, cross-sectional comparisons are performed over the years. Comparisons are also made between students from different cultural and educational backgrounds and in various study modes. Students’ written comments are processed and analysed by theme.
Additionally, both quantitative and qualitative findings are reported by theme, such as academic advising, the Common Core curriculum, e-learning, English language enhancement, and residential education, in support of the enhancement of students’ learning experiences.
Advanced psychometric methods were employed to analyse data collected from institutional surveys of students’ learning experiences. The psychometric properties of the instruments are examined regularly to provide evidence of the reliability and validity of the reported scores. Good reliability and validity have been demonstrated through psychometric analyses based on both classical test theory and modern measurement theory. Furthermore, studies have been performed to continuously collect validity evidence using advanced psychometrics and contemporary methodologies, such as examining the measurement invariance/equivalence of the SLEQ (Zhao, Huen, & Prosser, 2017), quantifying learning gains with a more accurate measure (Zhao, Huen, & Chan, 2017), and examining the overall student learning experience during the COVID-19 pandemic using various indicators of teaching and learning practice (Chan et al., 2021).
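One widely used classical-test-theory index of scale reliability is Cronbach’s alpha. As a minimal illustrative sketch (the response data and function below are hypothetical and not the SLEQ’s actual analysis pipeline), alpha can be computed from a respondents-by-items matrix of Likert-type responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Estimate internal-consistency reliability (Cronbach's alpha).

    items: 2-D array, rows = respondents, columns = scale items.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses (5 students x 4 Likert items); values are made up.
responses = np.array([
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 3, 4, 4],
])
alpha = cronbach_alpha(responses)
```

Values of alpha closer to 1 indicate that the items covary strongly and so plausibly measure a single underlying construct.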
Automatic text-mining approaches were pioneered for analysing the content of students’ written comments on teaching and learning (Ko et al., 2020). With the aid of automatic text-analytic tools (e.g., Leximancer™), the frequency, co-occurrence, and inter-relationships of key themes and concepts in student comments are visualised graphically. These tools also allow for examining how key concepts vary between students in different study years, and whether key concepts differ between comments on the best aspects of learning and comments on aspects to be improved.
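The co-occurrence counting that underlies such concept maps can be sketched in a few lines. This is a simplified, hypothetical illustration: the theme keywords are invented for the example, whereas tools such as Leximancer derive concepts from the text itself rather than from a fixed keyword list.

```python
from collections import Counter
from itertools import combinations

# Hypothetical theme keywords for illustration only.
THEMES = {
    "teaching": {"teacher", "lecture", "teaching"},
    "assessment": {"exam", "assessment", "grading"},
    "e-learning": {"online", "moodle", "e-learning"},
}

def themes_in(comment: str) -> set:
    """Return the set of themes whose keywords appear in a comment."""
    words = set(comment.lower().split())
    return {theme for theme, kws in THEMES.items() if words & kws}

def cooccurrence(comments: list) -> Counter:
    """Count how often each pair of themes appears in the same comment."""
    pairs = Counter()
    for comment in comments:
        for pair in combinations(sorted(themes_in(comment)), 2):
            pairs[pair] += 1
    return pairs

# Invented example comments, not real survey data.
comments = [
    "The online lecture recordings helped my revision",
    "Exam grading felt fair and the teaching was clear",
    "More e-learning resources before the exam please",
]
counts = cooccurrence(comments)
```

Pairs with high counts would appear as strongly linked concepts in a visualisation; splitting the comment set by study year, or by best-aspects versus to-be-improved responses, supports the comparisons described above.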
Chan, Y., Huen, J., Li, T. T., & Zhao, M. Y. (2021, December 8–10). Undergraduate students’ experience of online learning during COVID-19: Findings from quantitative and qualitative analyses [Conference presentation abstract]. International Conference on Learning and Teaching 2021, Hong Kong, China. https://www.eduhk.hk/iclt2021/
Ko, W. T., Chan, Y. W., & Zhao, M. Y. (2020, December 2–4). An automatic approach to analyzing students’ qualitative feedback in relation to teaching and learning [Conference presentation abstract]. International Conference on Learning and Teaching 2020, Hong Kong, China. https://www.eduhk.hk/iclt2020/
Prosser, M., & Trigwell, K. (1999). Understanding learning and teaching: The experience in higher education. Buckingham, UK: Society for Research into Higher Education & Open University Press.
Zhao, Y., Huen, J. M. Y., & Chan, Y. W. (2017). Measuring longitudinal gains in student learning: A comparison of Rasch scoring and summative scoring approaches. Research in Higher Education, 58(6), 605–616. https://doi.org/10.1007/s11162-016-9441-z
Zhao, Y., Huen, J. M. Y., & Prosser, M. (2017). Comparing perceived learning experiences of two concurrent cohorts under curriculum reforms in Hong Kong: A multiple-group confirmatory factor analysis approach. Quality Assurance in Education, 25(3), 270–286. https://doi.org/10.1108/QAE-11-2016-0070