
Custom Questions for Instructors and Departments

Up to five custom SET questions can be added by instructors, and up to five additional custom SET questions can be added by departments/units. If you would like to add custom questions for your students, you can submit them through the Evaluations site (for SET). The deadline to enter properly formatted custom questions is one week before SET starts (SET starts on the Monday of Week 9 in Fall, Winter, and Spring, and on the Friday of Week 4 in Summer Sessions 1 and 2; Special Session dates vary). These questions may use a Likert scale (Strongly Disagree to Strongly Agree) or ask for an open-ended response.
  • For Instructors, custom questions can be used to gather formative feedback from students about course design, instructional activities, and overall teaching effectiveness.
    • The Teaching and Learning Commons can assist Instructors in developing custom SET questions.
  • For Departments/Units, in addition to their use as part of the holistic evaluation of teaching effectiveness in the merit/promotion process, custom questions could be used to collect information relevant to the specific discipline or modes of instruction, or to gain student perspectives on programmatic aspects across multiple courses.
    • Departments planning to use student responses to custom questions in the merit review/faculty promotion process should establish a policy explaining this use to candidates and outlining how review committees will use the responses.
    • The information obtained from custom SET questions may also be useful for accreditation or other institutional studies.
     
The custom questions that are added, and the corresponding student responses, become part of the full PDF SET report for that class, which is available only to the instructor, the department, CAP, and other academic file reviewers. Aggregated student responses to custom SET questions are not available to the UC San Diego community via the SET data website. (Note: Departments can also add custom questions for Instructional Assistant SETs.)
Here are some guidelines to keep in mind as you develop custom SET questions:
    • Avoid duplication or overlap with the standard SET questions included on all forms.
    • Consider whether the questions you are adding are:
      • Clear, Focused, and Easily Understood by all students (avoid questions addressing multiple issues or using unnecessary jargon or complicated phrasing),
      • Neutral (avoid questions that “steer” students in a particular direction),
      • Specific and about Observable Teaching Practices (e.g., “The use of [pedagogical technique] helped me master [course concept or skill]”), and
      • Student-Centered (i.e., directly related to the student’s course experience).
    • Open-ended questions can allow students to provide more nuanced feedback than Likert-scale questions.
    • Develop questions whose answers could inform a specific response or action.
    • Avoid questions that do not ask students to reflect on learning, that could trigger known biases, or that could undermine the holistic evaluation of teaching.
A set of sample questions is included below, along with a description of the strengths of those recommended and the weaknesses of those that would likely be less effective. Additional ideas for custom questions can be found in the “Question Bank” from UC Berkeley and in similar question banks from the University of Illinois and the University of Wisconsin.

Examples of Effective Custom SET Questions

Each sample question below is listed with its purpose(s); (I) indicates a question intended for Instructors and (D) a question intended for Departments.

Open Response (I): “How, if at all, did the use of [pedagogical technique]/inclusion of [specific resource or assessment] help you master [course learning objective]?”

  • Gathers feedback on the effectiveness of a specific technique/resource/assessment that may otherwise be missed in SET data.
  • Particularly helpful for newly implemented techniques/resources/assessments.

Likert Scale (I/D): “This course gave me confidence to do more advanced work in the subject.”

  • Measures the impact of the course on students’ confidence and self-efficacy.
  • Could be most valuable for instructors who use techniques and practices specifically intended to increase students’ confidence in the discipline.

Open Response (I/D): “What from this course do you imagine might be useful in your future career or life in general? Why?”

Open Response (I/D): “What topics or skills that you learned in this course do you think you will remember in five years? Why?”

  • Measures the impact the course had on students.
  • Asks students to reflect on their learning in the context of future goals.

Open Response (I): “How frequently did you use the course website [insert link], and what did you use it for?”

  • Gathers information about whether and how students are using instructor-created resources.
  • Could inform whether the instructor continues to invest time in creating these resources for future iterations of the course.

Open Response (I/D): “What artificial intelligence sites/resources did you use in this course, and how did you use them?”

  • Helps the instructor understand how students engage with artificial intelligence as part of their course experience.
  • Could inform future course policies or resources.

Open Response (I/D): “What was a study/learning strategy that you found to be effective for you in this course? How did it help you learn course material?”

  • Encourages students to reflect on their own learning activities and how those activities impacted their course experience.
  • Gathers information about which learning strategies students found most helpful.
  • Could inform the design of future resources and/or be shared with future students as tips for success.

Open Response (I/D): “If you wanted to explain to a friend outside [discipline] what was the most important thing you learned in this class, what would you say?”

  • Asks students to reflect on what they have learned and why it was important, and to phrase their answers in terms a non-specialist could understand.

Likert Scale (D): “This course built my understanding of [subject]/[skill], extending the knowledge I developed in [pre-requisite course].”

Open Response (D): “In what ways have [pre-requisite course] and [current course] enabled me to master [big learning objective]/[major skill]?”

  • Asks students to reflect on how what they learned in one course connects with other courses.
  • Provides the Department with information on how well course sequences are coordinated and build on one another.
Examples of Less Effective Custom SET Questions

Each sample question below (not recommended) is listed with the reasons it would likely be less effective.

Likert Scale: “Overall I would recommend my instructor.”

  • Unclear what students’ responses would be measuring (general likability? personality? course content? teaching methods?).
  • More susceptible to bias based on instructor characteristics (gender, age, race, etc.).

Open Response: “What suggestions would you make to improve the course materials?”

  • Duplicates the SET question “Please describe any specific aspects of the course and/or teaching practices your instructor used that were less helpful for your learning. Optionally you may offer constructive suggestions that might improve their effectiveness.”
  • Increases the length of the SET without adding new insight, raising the chances of survey burnout and a lower response rate.

Likert Scale: “The way in which this instructor taught the course gave me the opportunity to learn and master the subject matter.”

  • Duplicates, with less specificity, the SET Likert scale questions on whether Class Sections and Assignments “helped me understand the course material” and whether the “course developed my understanding and skills for the subject.”

Open Response: “What changes would you recommend to the course resources (Canvas page, textbook, office hours, etc.) to make this class better for future students?”

Open Response: “What in-class activity or practice do you think is helpful to your learning?”

  • Duplicates the SET questions “Please describe any specific aspects of the course and/or teaching practices that your instructor used that… were less helpful for your learning. Optionally you may offer constructive suggestions that might improve their effectiveness.”

Open Response: “What was your favorite thing about this course? Why did you like it?”

Open Response: “What was your least favorite thing about this course? Why didn’t you like it?”

  • Does not ask students to reflect on learning, only on what they “liked” or “did not like.”

Likert Scale: “The pace and scope/number of topics covered in this course were appropriate and reasonable.”

  • The breadth of the question makes interpretation difficult.
  • Does not ask students directly about learning.