As of Fall 2025, the Center for Integrated Professional Development (CIPD) will limit its Opscan services to direct instructional activities only, such as exam processing. As a result, Opscan will no longer be available for course evaluation processing.
To support departments and schools in continuing to collect meaningful course evaluation data, CIPD has developed a comprehensive Qualtrics-based solution. This approach is designed to ensure continuity, reduce administrative burden, and maintain data integrity.
We recommend that each department or school use a single Qualtrics survey per semester, with course-specific information embedded in the survey link (via metadata). This strategy is designed to minimize the complexity of managing multiple surveys and helps to ensure consistency in data collection.
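For illustration, here is a minimal sketch of how per-course links for a single survey might be generated from a roster-style spreadsheet of courses. The base survey URL, the file names, and the column and field names (dept, course, section, instructor) are placeholders for this example; in Qualtrics, query-string parameters like these are typically captured as embedded data fields defined in the survey flow.

```python
# Sketch: build one link per course section for a single Qualtrics survey.
# The base URL, file names, and field names below are illustrative
# assumptions; replace them with your department's actual values.
import csv
from urllib.parse import urlencode

BASE_URL = "https://illinoisstate.qualtrics.com/jfe/form/SV_EXAMPLEID"  # assumed survey link

def build_link(row: dict) -> str:
    # Each query-string parameter should match an embedded data field
    # defined in the survey flow so Qualtrics stores it with every response.
    params = {
        "dept": row["Department"],
        "course": row["Course"],
        "section": row["Section"],
        "instructor": row["Instructor"],
    }
    return f"{BASE_URL}?{urlencode(params)}"

with open("course_list.csv", newline="", encoding="utf-8") as src, \
     open("course_links.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=[*reader.fieldnames, "EvaluationLink"])
    writer.writeheader()
    for row in reader:
        row["EvaluationLink"] = build_link(row)
        writer.writerow(row)
```

The same join and link-building can also be done in Excel with a concatenation formula if a script is not practical for your unit.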
This process has been tested and vetted by multiple teams across campus. By following it, you can ensure data from course evaluations are:
We are committed to making this transition as smooth and effective as possible. This effort is a collaborative partnership between CIPD and Technology Solutions.
CIPD is leading the support for departments/schools with the conceptual and pedagogical aspects of administering course evaluations. For questions related to evaluation design, implementation strategies, or best practices, please contact ProDev@ilstu.edu.
Technology Solutions is providing support for the technical implementation and troubleshooting of the system. For assistance with system access, functionality, or technical issues, reach out to the Technology Support Center at supportcenter@ilstu.edu.
When planning your course evaluations, consider what you most want to learn from your students. What kinds of questions will give you meaningful insight into their experiences and learning? How will you approach gathering that feedback in a way that is clear, consistent, and respectful of student voices?
Using this method in Qualtrics, evaluations are always secure and anonymous, giving students a safe space to share honest feedback. Keep your questions focused on learning outcomes and the aspects of your programs that you can act on, so the feedback you collect is both useful and actionable.
As you develop your course evaluation instrument:
There are several established tools for student ratings of instruction, such as SEEQ (Students’ Evaluation of Educational Quality) and IDEA. Below are sample questions drawn from these published sources.
Sample Questions:
The following instruments take two very different approaches to gathering feedback from students: SEEQ focuses on students’ perceptions of the quality of their learning environment, while IDEA focuses on students’ perceptions of their progress toward meeting learning objectives. Both instruments have substantial evidence supporting their reliability and validity, and either could be a good choice for your unit. However, changing from one instrument to another must be considered carefully, particularly if your unit’s assessment plan relies on longitudinal use of these data.
What it measures: Nine common dimensions (e.g., Learning/Value, Organization & Clarity, Enthusiasm, Group Interaction, Individual Rapport, Breadth of Coverage, Assessment/Exams, Assignments/Readings, Overall).[1]
Why it’s recommended: Extensively validated across countries and disciplines; stable factor structure and acceptable reliability that improves with more respondents.[2][3]
Example Instrument:
Reference example of SEEQ items (full form): [4]
What it measures: Student progress on relevant objectives identified by the instructor (e.g., understanding concepts, applying knowledge, communication/professional skills), along with teaching methods and contextual adjustments.[5][6]
Why it’s recommended: Strong internal consistency and validity evidence; reporting distinguishes progress on essential/important objectives and provides adjusted scores.[5][6]
Example Questions (tailor objectives to your outcomes):
Instructors should select 3–5 essential/important objectives to anchor reporting. [7]
[1] Marsh, H. W. (1982). SEEQ: A Reliable, Valid, and Useful Instrument for Collecting Students’ Evaluations of University Teaching. British Journal of Educational Psychology. ERIC record: https://eric.ed.gov/?id=EJ264777
[2] Marsh, H. W. (1984). Students’ Evaluations of University Teaching: Dimensionality, Reliability, Validity, Potential Biases, and Utility. Journal of Educational Psychology. PDF: https://www.wittenberg.edu/sites/default/files/media/faculty/Marsh1984.pdf
[3] Grammatikopoulos, V., et al. (2015). Assessing the SEEQ in Greek Higher Education. Higher Education. Abstract: https://go.gale.com/ps/i.do?id=GALE%7CA425954547
[4] Example SEEQ item set (McKendree University CTL). PDF: https://www.mckendree.edu/academics/vcte/resources/StudentEvaluationofEducationalQuality.pdf
[5] IDEA Research & Technical Reports (validity, fairness, objective‑based reporting). https://www.ideaedu.org/research-resources/research-technical-reports/index.html
[6] IDEA Interpretive Guide (Diagnostic Form reporting, adjusted scores). PDF: https://www.tntech.edu/iare/pdf/assessment/Interpretive_Guide_Diagnostic_Report.pdf
[7] IDEA faculty guidance on selecting objectives (USU/URI). https://www.usu.edu/oda/idea/idea-faculty-faqs and https://web.uri.edu/wp-content/uploads/sites/1970/Thoughts-on-selecting-IDEA-objectives.pdf
[11] American Sociological Association (2019/2020). Statement on Student Evaluations of Teaching. PDF: https://www.asanet.org/wp-content/uploads/asa_statement_on_student_evaluations_of_teaching_feb132020.pdf
[12] Stark, P. B., & Freishtat, R. (2014). An Evaluation of Course Evaluations. PDF: https://www.stat.berkeley.edu/~stark/Preprints/eval14.pdf
[13] University of California Academic Council (2020). Teaching Evaluation Task Force Report. PDF: https://senate.universityofcalifornia.edu/_files/reports/kkb-divs-teaching-evaluation-task-force-report.pdf
[14] UC Berkeley Course Evaluations Question Bank (ordering specific items before global). https://teaching.berkeley.edu/teaching-guides-resources/course-evaluations-question-bank
[15] Peterson, D. A. M., et al. (2019). Mitigating gender bias in student evaluations of teaching. PLOS ONE. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0216241
[16] Kim, F. (2024). Bias‑intervention messaging in student evaluations (overview of recent studies). https://www.sciencedirect.com/science/article/pii/S2405844024131714
[17] Gupta, V., et al. (2020). Impact of moving to online evaluations on response rates (drop from ~56% to ~33%). Open‑access: https://pmc.ncbi.nlm.nih.gov/articles/PMC7055417/
[18] Guder, F., & Malliaris, M. (2010). Online vs. paper course evaluations (mode effects). PDF: https://files.eric.ed.gov/fulltext/EJ1060301.pdf
[19] Evidence and guidance that offering in‑class completion time boosts response rates (e.g., review & strategies). https://apps.weber.edu/wsuimages/ie/Evaluations/Top%2020%20strategies%20to%20increase%20online%20response%20rates.pdf
[20] Pitt (OMET) summary of IDEA reliability by class size (split‑half reliability rises with N). PDF: https://teaching.pitt.edu/wp-content/uploads/2018/12/OMET-idea-paper_50.pdf
Download an accessible PDF of Two Recommended Models for Course Evaluation.
After you have a plan for what questions you'd like to ask in the course evaluations, you can create your survey in Qualtrics using our University-approved process.
Visit the Qualtrics for Course Evaluations guide to access step-by-step instructions, embedded videos, and tutorials to support you through each stage of the process.
To be successful with this process, you'll need to:
Human Resources and Technology Solutions offer support for using both Excel and Qualtrics. You can find more information about available training at Human Resources Trainings & Workshops and Technology Help. If no training is available, you can request additional support through the Technology Support Center.
Once you have your survey and unique links for each course, you'll need a method for sharing those links with students. You can either share the links directly with students through a mail merge, or you can share the links with instructors to distribute in their courses.
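As an illustration of the mail-merge option, the sketch below joins a student roster to the per-course links generated earlier and writes a merge-ready file. The roster file, link file, and column names are assumptions for this example; adapt them to whatever your unit's roster export and link spreadsheet actually contain.

```python
# Sketch: prepare a merge-ready file pairing each student with the
# evaluation link for their course section. File and column names are
# illustrative assumptions.
import csv

# Load the per-course links keyed by (course, section).
with open("course_links.csv", newline="", encoding="utf-8") as f:
    links = {
        (row["Course"], row["Section"]): row["EvaluationLink"]
        for row in csv.DictReader(f)
    }

# Join the student roster to the links, one output row per student.
fields = ["Name", "Email", "Course", "Section", "EvaluationLink"]
with open("student_roster.csv", newline="", encoding="utf-8") as src, \
     open("evaluation_mail_merge.csv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.DictWriter(dst, fieldnames=fields)
    writer.writeheader()
    for row in csv.DictReader(src):
        writer.writerow({
            "Name": row["Name"],
            "Email": row["Email"],
            "Course": row["Course"],
            "Section": row["Section"],
            "EvaluationLink": links.get((row["Course"], row["Section"]), ""),
        })
```

The resulting file can feed a standard Word or Outlook mail merge, or the same join can be done directly in Excel with a lookup formula.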
To increase response rates:
Be sure to set and communicate when the survey will close and when instructors will be notified of the results. Course evaluations should be sent to students in the final two weeks of class, and surveys should stop collecting feedback before the start of final exams week.
Results from the evaluations should only be shared with instructors after final grades are recorded. Since many faculty include the results of their course evaluations in their end-of-year productivity reports, sharing the data shortly after grades are submitted will give instructors ample time to prepare those reports.
After the survey has closed, you'll likely have hundreds of student responses to organize and share with instructors. You can use Qualtrics’ built-in reporting tools or export data to Excel, depending on your preference and comfort level.
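If you export the data, a short script can split the combined export into one file per course for distribution to instructors. The sketch below assumes the embedded data fields carried in from the survey link (course, section, instructor) appear as columns in the export and that any extra descriptive header rows Qualtrics adds have already been removed; treat the file and column names as placeholders.

```python
# Sketch: split a combined Qualtrics export into one Excel workbook per
# course so results can be shared with each instructor. Column and file
# names are illustrative assumptions; writing .xlsx requires openpyxl.
import pandas as pd

responses = pd.read_csv("evaluation_export.csv")  # extra header rows assumed already removed

# Group on the embedded data fields carried in from the survey link.
for (course, section, instructor), group in responses.groupby(["course", "section", "instructor"]):
    out_name = f"{course}_{section}_evaluations.xlsx"
    # Review and drop any columns you do not intend to pass along before writing.
    group.to_excel(out_name, index=False)
    print(f"Wrote {len(group)} responses for {course} {section} ({instructor}) to {out_name}")
```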
When sharing results:
Course evaluations and derivative work products may be stored on University-managed endpoint devices (e.g., laptops, desktops, tablets) while they are being processed, analyzed, and disseminated, but should be moved to a secure location when they are no longer in active use. Student evaluations are official University records and therefore subject to ISU’s Records Management procedures: individual student comments must be retained for at least three (3) years, and tabulated results of student evaluations for at least eight (8) years. After the minimum retention period, records may only be disposed of with formal approval from the Office of the Provost. If no such approval is obtained, records must be retained indefinitely.
After generating and sharing reports, remember that course evaluation data must be securely retained indefinitely unless an alternative retention plan is approved by the Provost’s Office.
Instructors may have questions or concerns about their feedback. Encourage them to:
While course evaluation data is helpful for individual instructors adjusting their teaching practices, it can also help at the department level. Aggregate data can show you which courses are successful, overall student attitudes toward your programs, and areas for growth.
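As one example of department-level use, the short sketch below computes mean ratings and response counts per course from the combined export. The rating item names are assumptions; substitute the numeric items your instrument actually uses.

```python
# Sketch: department-level summary of mean ratings by course.
# File and column names are illustrative assumptions.
import pandas as pd

responses = pd.read_csv("evaluation_export.csv")
rating_items = ["overall_rating", "learning_value", "organization"]  # assumed numeric items

summary = (
    responses.groupby("course")[rating_items]
    .agg(["mean", "count"])
    .round(2)
)
summary.to_excel("department_summary.xlsx")
print(summary)
```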
If you'd like to learn more about how to use course evaluation data to improve your teaching, attend a workshop hosted by CIPD or schedule an individual consultation to discuss tangible ways you can incorporate course evaluation data in your teaching.