Course Experience Survey FAQ

What is the CES?

Two standard Course Experience Surveys, one for Higher Education and one for TAFE, were developed by the RMIT Planning Group after an extensive trialling and consultation process in 2005. The Higher Education version incorporates items from the Good Teaching Scale (GTS) of the CEQ (a national graduate survey governed by Graduate Careers Australia), and the TAFE version incorporates items from the Student Outcomes Survey.

Both versions have provision for five “additional” items that staff can create to gather information of interest to specific courses (see below for more information).

Questions have been developed to take measures on the following themes:

  • Feedback
  • Quality of the Teaching and Learning environment
  • Learning Objectives
  • Clear Goals
  • Assessment – workload
  • Commitment of staff – pastoral care
  • Learning Resources
  • The balance of theory/instruction and practice
  • Course interest
  • Online – computer based materials
  • Overall Student Satisfaction

Each question is measured on a five-point scale ranging from “Strongly Disagree” to “Strongly Agree”.

Two qualitative questions allow for written comments by respondents:

“What are the best aspects of this course?”

“What aspects of this course are in most need of improvement?”

What is the University Policy governing the CES?

The administration and reporting of the Course Experience Survey is governed by the RMIT Student Feedback Policy.

How was the CES developed?

The University has been progressively moving to standardised and systematic collection and reporting of course-level student feedback. Questions in the CES were sourced from the Course Experience Questionnaire (managed by Graduate Careers Australia), the Student Outcomes Survey (managed by the National Centre for Vocational Education Research), a range of University feedback instruments and extensive consultation.

The CES has a Higher Education version and a TAFE version, each with 13 standard items. Each version also has provision for five “additional items” through which course coordinators can ask students questions of local interest.

How often are courses surveyed?

The RMIT Student Feedback Policy requires that “staff will seek student feedback in a form which can be captured, analysed and reported every time a course is delivered”.

Administration of the CES

The CES is administered online from Week 9 each semester and the survey is open for four weeks.

When the online survey opens, the students in your course will be sent an email from the SSC containing the URL of the survey and asking them to complete it. To ensure a better response, remind your students in class to check their email and complete the survey, or set aside time in a laboratory class for them to complete it. The survey closes on the nominated date (the last day for completion). After the closing date the SSC generates individual teacher, School, College and University reports.

What if I want to ask more questions in the survey?

The CES provides for five additional Likert items beyond the standard questions asked of all students, allowing course coordinators to ask about issues specific to their course or their students’ learning experience.

Course Coordinators are emailed a web link used to populate the additional questions.

Can I conduct my own survey in addition to the CES?

You may conduct your own survey in addition to the CES provided it does not threaten the integrity of the CES – its application, data and use.

Students are sensitive to the number of surveys conducted each semester. We suggest you consider using the five additional items on the CES first. Any additional survey should be sharply focused on specific aspects of the course of relevance to your students and the teaching team.

Who has access to the data?

Access to CES data is governed by the University’s Student Feedback Policy and the current Enterprise Agreement.

Policy : “Feedback reports will be distributed to the relevant staff with designated responsibility for improving the student experience and outcomes...”

The Enterprise Agreement (2010, 47.5):
“Student evaluations of individual courses will not be made available externally where they may be identified with the performance of individual staff. To avoid doubt, it is agreed that such evaluations will be available to University staff dealing with Teaching and Learning matters including Pro Vice-Chancellors, Heads of Schools and other relevant staff.”

How is data from the CES used?

The primary purpose of the data is to contribute to systematic improvement cycles across RMIT at course and program level – led by course teams and program leaders respectively.

The context for the use of the CES data is provided by the RMIT Student Feedback Policy and the Enterprise Agreement 2010.

Student Feedback Policy:

All staff at RMIT will engage with and respond to student feedback as part of the continual improvement of the student learning experience and undertake improvement planning on an annual basis.

Student feedback will be used by academic staff and their supervisors as part of the process of evaluating and enhancing teaching effectiveness and to inform promotion and probation decisions.

Student feedback will be used:

  • To improve the quality of programs and courses through the development of annual improvement plans.
  • To support the scholarship of teaching.
  • To inform professional development programs.
  • To enhance program design and the connection of inter-related programs.
  • To improve the provision of learning resources, facilities, equipment and services through the development of annual improvement plans.

The Enterprise Agreement (2010, 47):

47.1 Discussion of teaching performance and teaching related issues will be included in the periodic workplanning and performance management meetings between the academic employee and her or his supervisor. Joint evaluation of the employee’s teaching performance will be based on evidence provided by the University and the employee, including feedback from students about their learning experiences; student performance and student outcomes; the employee’s self-assessment; and any other evidence provided by the employee.

47.2 This discussion will provide an opportunity for the supervisor to recognise and commend good practice in teaching, and where necessary discuss and agree staff development and/or other actions to enhance and enrich teaching quality.

47.3 Performance management in relation to teaching performance will take into account the whole teaching and learning environment, including staff workload, characteristics of the student cohort, the physical environment in which teaching takes place, the structure of the course, the availability of and access by students to learning resources, the method of delivery and the provision of professional development support to the employee.

47.4 Student feedback surveys on their own will not be used as a measure of teaching performance or to take any action under the Disciplinary Procedures in clause 46 of this Agreement.

Complete documents can be found at:

How do staff get the reports?

At the close of the survey, results from the students’ online/hard-copy responses are aggregated into a report using a standard template. Staff are then emailed this quantitative report, including the Good Teaching Scale (GTS) and Overall Satisfaction Index (OSI) percentages, plus a file containing the students’ written comments.

What data is published from the CES?

No individual course data is published. Aggregated data by School, University level, Program level and Field of Education is published on the Survey Services Centre website and is publicly available.

Aggregated data is also provided to the Head of School.

Why are we asking people other than those who teach the program to provide time for students to complete the survey in class?

The Student Feedback policy procedures stipulate that in order to ensure student confidentiality:

Survey administration is undertaken by administrative staff or by a person other than the teaching staff for that program/course (usually an administrative staff member or a nominated student representative).

It is clearly inappropriate for a course teacher to administer the survey.

What is the Good Teaching Scale (GTS)?

The Good Teaching Scale (GTS) measures students’ perceptions of teaching standards. It focuses on teachers’ feedback, motivation, attention, understanding of problems and skill in explaining concepts. High scores on this scale are associated with the perception that good practices are in place; conversely, lower scores reflect a perception that these practices occur less frequently.

Items making up the GTS for the Higher Education version of the CES are:

  • The teaching staff are extremely good at explaining things.
  • The teaching staff normally give me helpful feedback on how I am going in this course.
  • The teaching staff in this course motivate me to do my best work.
  • The teaching staff work hard to make this course interesting.
  • The staff made a real effort to understand difficulties I might be having with my work.
  • The staff put a lot of time into commenting on my work.

Items making up the GTS for the TAFE version of the CES are:

  • My instructors have a thorough knowledge of the course assessment.
  • My instructors provide opportunities to ask questions.
  • My instructors treat me with respect.
  • My instructors understand my learning needs.
  • My instructors communicate the course content effectively.
  • My instructors make the course content as interesting as possible.

How is the GTS score calculated?

Course reports include a Good Teaching Scale (GTS) score. The GTS percentage is calculated as the number of student responses that “agree” or “strongly agree” with the good teaching items, expressed as a percentage of all student responses, so the GTS ranges from 0 to 100 per cent.
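The calculation above can be sketched in a few lines of Python. This is an illustrative interpretation of the FAQ's description, not the SSC's official algorithm; the 1–5 coding of the response scale is an assumption.

```python
# Illustrative sketch of the GTS percentage calculation described above.
# Assumes responses are coded 1-5 ("Strongly Disagree" = 1 ... "Strongly Agree" = 5).
# Counting 4s and 5s across all GTS items as "agreement" is an interpretation
# of the FAQ text, not an official SSC algorithm.

def gts_percentage(item_responses):
    """item_responses: list of lists, one inner list of 1-5 ratings per GTS item."""
    all_ratings = [r for item in item_responses for r in item]
    if not all_ratings:
        return 0.0
    # "Agree" (4) or "Strongly Agree" (5) counts towards the GTS
    agreeing = sum(1 for r in all_ratings if r >= 4)
    return 100.0 * agreeing / len(all_ratings)

# Example: two GTS items, five student responses each
print(gts_percentage([[5, 4, 3, 4, 2],
                      [4, 4, 5, 2, 3]]))  # 60.0
```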

What is a “good score” on the CES?

Student feedback is dependent on a range of influences, including the discipline, the level of student progress and the resources available for teaching. It may also be influenced by students’ experiences of different aspects of the course, and a new teaching innovation may affect the feedback. As a result, feedback needs to be interpreted in the light of these factors within course and program areas, and no single score can be considered a benchmark. It is also useful to compare individual course outcomes with those for the overall program and School.

The CES commenced in 2005. It is expected that staff will be able to build a view of performance in their individual courses based on student feedback over time. Indeed this is one of the most important uses course teams can make of the CES data.

CES Reliability Bands

From Semester 1, 2013 all CES reports issued by the SSC will contain a reliability assessment. The reliability calculation is based on the number of enrolments and the number of responses to the CES at the course/class level. One of three reliability bands will be shown on each report. The bands are labelled "Good", "Sufficient" and "Insufficient". The reliability band applied at the course/class level will apply to all individual teacher reports. For example, if the number of responses at course/class level is "Sufficient", all individual teacher reports will contain the same reliability assessment.

The reliability assessment will state: “With X responses from a survey population of Y, the data presented in this report are considered to be ‘Good’, ‘Sufficient’ or ‘Insufficient’ for use, including academic expectations.”

On each report, the number of responses needed to be considered "Sufficient" and "Good" will be shown.

A reliability assessment calculator is available here. Simply enter the number of responses and the survey population and the reliability calculator will show the number of responses needed for each band.
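The band logic can be sketched as a simple classification by response rate. The cut-offs below are invented placeholders purely to show the shape of the calculation; the SSC's actual thresholds are not published in this FAQ and will differ.

```python
# Purely illustrative sketch of a reliability-band rule for CES reports.
# The actual SSC thresholds are not given in this FAQ; the rates below
# (50% for "Good", 25% for "Sufficient") are invented placeholders.

def reliability_band(responses, population,
                     good_rate=0.5, sufficient_rate=0.25):
    """Classify a course/class survey by response rate (hypothetical rule)."""
    if population <= 0 or responses <= 0:
        return "Insufficient"
    rate = responses / population
    if rate >= good_rate:
        return "Good"
    if rate >= sufficient_rate:
        return "Sufficient"
    return "Insufficient"

print(reliability_band(30, 50))  # Good (placeholder thresholds)
print(reliability_band(15, 50))  # Sufficient
print(reliability_band(5, 50))   # Insufficient
```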

Where can I get help in understanding my report?

If you have any questions about the data in this report or how it is calculated, please contact the Survey Services Centre:

How can I improve my student feedback results on the CES?

“Improving Course Experience Questionnaire Results for RMIT”

For further assistance interpreting and acting on the Course Experience Survey:

If you are in the Design and Social Context College, contact Meredith Seaman.

If you are in the Business College, contact your School's Director or Coordinator of Teaching and Learning.

If you are in the Science, Engineering and Technology College, contact Peter Muir.

What are the other main University surveys?

Student Experience Survey (SES)

Collects information from students (through an online survey in Semester 2, 2013) about their university-wide experience and their program experience. Versions are available for VET, Higher Education and Offshore programs.

Graduate Destination Survey

This survey comprises:

(1) Australian Graduate Survey (AGS)

A national survey designed to gather information from recent graduates about their program, any work in their final year, their status regarding work and study, their salary four months after graduation, and the methods they have used to search for jobs.

(2) Course Experience Questionnaire (CEQ)

A national survey (administered with the AGS) designed to gather perceptions of recent graduates about the quality and usefulness of their programs. Graduates express their degree of agreement or disagreement on a five-point scale with statements about different facets of their course:

  • Good Teaching
  • Goals and standards
  • Appropriate Assessment
  • Appropriate Workload
  • Generic skills
  • Work relevance (three ATN universities only)
  • Overall satisfaction

Student Outcomes Survey (SOS)

A Victorian Government survey designed to gather information from TAFE graduates and module completers about their general characteristics, fields of study, employment outcomes, occupations and industries of employment and satisfaction with and opinions about their course of study.

Postgraduate Research Experience Questionnaire (PREQ)

A national survey designed to gather information on the educational experiences of students in higher research degree courses in Australian institutions.

Graduates express their degree of agreement or disagreement on a five-point scale with 28 statements about seven facets of their program:

  • the quality of supervision
  • intellectual climate of the department in which the respondent was based
  • the development of skills
  • the quality of the infrastructure provided by the university
  • the thesis examination process
  • the clarity of goals and expectations
  • overall satisfaction