A Christian College of the Liberal Arts & Sciences

Overview of Houghton’s Course Evaluation Program:

The IDEA Center’s Student Ratings of Instruction


An Introduction for Faculty

     In fall 2009, Houghton moved from a locally created paper-and-pencil form to a fully online course evaluation system. The nationally normed Student Ratings of Instruction system, called IDEA (Individual Development and Educational Assessment), was created in 1975 at Kansas State University and is now administered by The IDEA Center. Sister schools such as Bethel University (MN) and Anderson University (IN) use this instrument, along with hundreds of other institutions. The 18 items below make up the prescribed Short Form, but a 47-item Diagnostic Form is also available to assist with formative evaluation and instructional development. Twenty local questions may be added to tailor the evaluation to your course. Since Houghton adds three faith-related items, you may add up to 17 departmental or personal questions.

    By college policy, all tenure-track (pre-tenure) faculty use the Diagnostic Form for each course each semester. Tenured faculty members must evaluate one upper-level and one lower-level course each year, and they may use either form. An email arrives mid-semester inviting faculty to complete the Faculty Information Form (FIF). Instructors select between 3 and 5 of the 12 general objectives that best describe their course, and only student ratings on those objectives are considered in the results. The key (but not exclusive) assessment of a course, then, is how students rate their own progress on the relevant course objectives selected by the instructor.

Instructors’ results are benchmarked against a) all Houghton colleagues, b) others in one’s discipline nationally, and c) a broad national sample of all IDEA users. A hardcopy Feedback Report for each course is provided to instructors within two to three weeks of the close of classes. (See a sample Diagnostic Report and a sample Short Form Report. Note that the greatest difference is the detailed information generated for page 3 of the Diagnostic Report.) Copies go to the instructor, the department chair, the area associate dean, and the academic dean’s office. Special group reports may be ordered, for example, for academic departments or for all music ensembles.

     Students read the following directions for the Short Form (note: on the Diagnostic Form, the objectives are items numbered 21-32):

“Twelve possible learning objectives are listed below, not all of which will be relevant in this class. Describe the amount of progress you made on each (even those not pursued in this class) by using the following scale:

  1. No apparent progress
  2. Slight progress; I made small gains on this objective.
  3. Moderate progress; I made some gains on this objective.
  4. Substantial progress; I made large gains on this objective.
  5. Exceptional progress; I made outstanding gains on this objective.

[To what extent do you think this course helped you make] Progress on:

  1. Gaining factual knowledge (terminology, classifications, methods, trends)
  2. Learning fundamental principles, generalizations, or theories
  3. Learning to apply course material (to improve thinking, problem solving, and decisions)
  4. Developing specific skills, competencies, and points of view needed by professionals in the field most closely related to this course
  5. Acquiring skills in working with others as a member of a team
  6. Developing creative capacities (writing, inventing, designing, performing in art, music, drama, etc.)
  7. Gaining a broader understanding and appreciation of intellectual/cultural activity (music, science, literature, etc.)
  8. Developing skill in expressing myself orally or in writing
  9. Learning how to find and use resources for answering questions or solving problems
  10. Developing a clearer understanding of, and commitment to, personal values
  11. Learning to analyze and critically evaluate ideas, arguments, and points of view
  12. Acquiring an interest in learning more by asking my own questions and seeking answers

  For the remaining questions, use the following code:

    1 = Definitely False
    2 = More False Than True
    3 = In Between
    4 = More True Than False
    5 = Definitely True
  13. As a rule, I put forth more effort than other students on academic work.
  14. My background prepared me well for this course's requirements.
  15. I really wanted to take this course regardless of who taught it.
  16. As a result of taking this course, I have more positive feelings toward this field of study.
  17. Overall, I rate this instructor an excellent teacher.
  18. Overall, I rate this course as excellent.”

     For more complete descriptions of the Diagnostic Form, the Feedback Reports, or the general process of instructor involvement, review the other documents on this site. Contact Daryl Stevenson for confidential help in interpreting the Feedback Reports, improving ratings in a course, or selecting appropriate key objectives.

     Following is a generic timeline of IDEA-related events each semester (a semester-specific timeline with exact dates is published on this website before each semester):

Prior to each semester--
     1) Determine which 3 – 5 of the 12 IDEA objectives best describe the course and include them in the syllabus list of course goals/objectives.

Early in semester--
     2) Emphasize the goals/objectives to students early—perhaps on day one—so students are clear about course goals. Revisit these from time to time to reinforce them, perhaps tying course assignments and activities to specific goals/objectives. Help them make the connections.

Middle of semester--
     3) Instructors receive an initial email asking them to inform their departmental coordinator/assistant which courses will be evaluated at semester’s end. Other information is collected at that time: Will you use the Short or Diagnostic Form? Is the course team-taught? Is it cross-listed?
     4) About two weeks later, instructors receive the FIF to complete, with a separate email for each course being evaluated. Complete and return the FIF within about a week. Important: keep the email and link until the FIF period closes to allow for any changes. On the FIF, select 3 – 5 course objectives from the list of 12, select a discipline code (for disciplinary benchmarking), add any personal feedback questions, and briefly check off a few other items such as when the course is taught, how many students are enrolled, and what type of course it is (lecture, seminar, lab, studio, etc.).

Fourteen days prior to the last day of class--
     5) Students will receive an email with an embedded link for the course. If students are taking four courses to be evaluated, they receive four emails. Students have the last two weeks of the semester to complete the ratings form, with reminders arriving every three days until it is completed. The response period closes on the last day of classes. We do not ask students to do evaluations during finals week.

     Instructors’ attitudes about having students complete the online evaluations outside of class strongly influence student response. Taking the evaluation seriously and showing interest in students’ opinions usually leads to more students participating. Response rates are typically much higher (perhaps even 100%) if evaluations are completed together during class time, if some incentive is offered, or if completing the evaluation is framed as part of being a good course citizen who helps improve the quality of instruction for future students. If instructors ignore the evaluation experience and remain silent, or worse, show contempt for it, students will too. While we strive for 100% student participation, a reasonably useful response rate is at least 75%, and 65% is considered the minimum for good statistical generalizability. Even lower rates, however, still provide important and useful feedback.

     Instructors have options for directing the process. They can take 20 minutes to complete the evaluation together in class (ask students to bring laptops that day), or they can let students complete it whenever is most convenient during the two-week response period. Communicate with students before the response period so they know whether to complete the evaluation when it arrives or to wait for a designated class period. If instructors offer bonus points or extra credit for completing it, the on-campus administrator can provide a list of students who completed the evaluation, though not, of course, their responses. During the response period, instructors receive occasional updates showing how response rates are progressing.

Following the semester--
     6) About two to three weeks after the semester ends, instructors will receive a four-page summary Feedback Report for each course, which provides empirical scores and benchmarks. Department chairs and area associate deans can help with its interpretation, or instructors may prefer to set up a confidential meeting with Daryl Stevenson (the on-campus administrator).

     Periodically, the IR&A office holds lunch groups and workshops for more detailed discussion of topics such as proven means of increasing the accuracy of the IDEA evaluations, completing the FIF accurately, managing other logistics, interpreting the Feedback Report, and motivating higher response rates. Consult also the FAQ page on this site, and explore the IDEA Center’s website at http://www.theideacenter.org.

Daryl Stevenson, PhD
Associate Dean and Director,
Office of Institutional Research and Assessment
daryl.stevenson@houghton.edu