Re-thinking Student Written Comments in Course Evaluations: Text Mining Unstructured Data for Program and Institutional Assessment

The student course evaluation is a nearly ubiquitous instrument for assessing instructors and courses at universities and community colleges. One common feature of these evaluations is the open-ended question, used to gather feedback on course content and instruction. Because large-scale analysis of written text is difficult, these written comments are often not analyzed with any systematic or consistent methodology. Technological advances, however, have made it possible to study the unstructured data in these responses quantitatively through algorithmic text and data mining. This study, drawing on 835 surveys collected over a five-year period in a continuing education program, employed an embedded correlational model that applied text mining methods such as Principal Component Analysis (PCA) and Singular Value Decomposition (SVD) within a qualitative framework to determine the viability of such analysis at the institutional level. The study's major findings show that although there is only a weak correlation between the Likert-scale responses and the open-ended written portion, the unstructured data contain significant words and patterns that provide additional information at the institutional level. These results suggest a need to rethink the design, implementation, and approach of the student course survey so that institutions can take advantage of text mining as an analytical tool.
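The abstract names PCA and SVD as the text mining methods but does not specify the study's exact pipeline. As a minimal sketch of the general technique, the snippet below builds a term-document count matrix from a few invented example comments (the comment strings and variable names are illustrative, not taken from the study's data), centers it, and applies SVD so that the leading singular vectors act as principal components of the comment collection:

```python
import numpy as np
from collections import Counter

# Hypothetical open-ended course comments (illustrative only, not study data)
comments = [
    "great instructor clear lectures",
    "clear assignments great pace",
    "confusing lectures poor pacing",
    "poor organization confusing material",
]

# Build a simple term-document count matrix: one row per comment,
# one column per vocabulary word
vocab = sorted({w for c in comments for w in c.split()})
X = np.array(
    [[Counter(c.split())[w] for w in vocab] for c in comments],
    dtype=float,
)

# Center each column so the SVD of the centered matrix corresponds
# to PCA on the term counts
Xc = X - X.mean(axis=0)

# Singular Value Decomposition: Xc = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project each comment onto the first two principal components;
# nearby points indicate comments with similar word usage
scores = U[:, :2] * s[:2]
print(scores.shape)  # one 2-D coordinate per comment
```

In practice a study like this would use a much larger vocabulary, stop-word removal, and term weighting (e.g. TF-IDF) before the decomposition, but the centering-then-SVD step is the core shared by both PCA and SVD-based text analysis.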