Student evaluations of instructor, course, and course materials: summer 2023 overview

Student evaluations of instructor, course, and course materials provide the institution with guidance on areas of relative strength and areas where there may be room for improvement. Because stale data is not actionable data, the intent of this report is to convey broad themes in the evaluations to decision makers as rapidly as possible.

This report presumes familiarity with the student evaluation form in use at the institution. The report is based on 275 student evaluations. The responses were converted to numeric values:

Strongly disagree: 1
Disagree: 2
Neutral: 3
Agree: 4
Strongly agree: 5

The overall average response on a five point scale was 4.31 with a standard deviation of 1.00. In spring 2023 the mean was 4.35 with a standard deviation of 0.98. With 275 evaluations already in hand, changes are likely to be small as further evaluations are submitted.

As noted in the past, students tend to respond with agree or strongly agree. Teasing information out of similar means requires recognizing that an excess of agrees relative to the other ratings in a section represents a downgrade for that item. This report seeks to highlight small but potentially real differences.

Instructor evaluations


The averages are all close to the overall mean of 4.31. Students tend to either agree (4) or strongly agree (5). The small differences may not be statistically significant; bearing this in mind, they may still suggest areas of focus for improvement.
As was seen both last fall and last spring, the first question, on overall instructor effectiveness, was rated lower than most of the other metrics. This pattern has been consistent term after term, which raises questions as to how students are interpreting the prompt.

Timely feedback has been a challenge for the institution in past evaluations. The high-intensity summer term reversed this trend, with strong improvement in this metric.


Strong caution is warranted when comparing results from a sixteen week term with a sample size of 1008 submissions to a six and a half week term with a sample size of 275 submissions. With that caveat, the change in regular contact and timely feedback was strongly positive, and for those averages that fell, the drops were not as precipitous.

Course evaluations





While students felt that the assignments were appropriate, they felt that they did not have enough time to complete them.

The highest rating in this section went to the metric measuring whether the course assignments and exams allowed students to demonstrate their knowledge and skills. This suggests the instructors are developing and deploying course-appropriate materials.


When compared to the spring 2023 results, where the enough-time metric had a positive rating, the fall in the time-provided metric is even more precipitous. Faculty have noted that grades appear to improve in the summer term. This has led to proposals to shift to six and a half week intensive courses during the regular sixteen week term, with two sessions run in sequence. The complication with this approach is providing the time students need to work on longer, multiple-draft projects and papers.

If a mix of compressed and uncompressed courses is attempted, scheduling conflicts arise. The six and a half week intensive courses run in ninety minute blocks that conflict with the Monday, Wednesday, and Friday sixty minute blocks of sixteen week three credit courses, and with the three hour science laboratories on Tuesdays and Thursdays. Threading the six and a half week intensives around the sixteen week courses is complex at best.

Instructional materials evaluation


Ease of access and appropriateness of the textbook both received negative evaluations in spring 2023, and this was seen again in the summer term. In general, students do not find their textbooks appropriate to their courses.

The highest rating in this section went to the Canvas learning management system.

When looked at from the point of view of term-on-term change, the appropriateness of the textbook dropped, but textbook access was less negative than in spring 2023.

Physical campus, modes of delivery



Responses are from students located on five campuses. 


Responses came in from students in residential, online, and hybrid courses (for example where a lecture was online but the laboratory was in person).


Respondents' preferred course mode remains strongly face-to-face residential courses; the least preferred mode is synchronous online courses. The two charts above suggest that the college is offering a larger share of courses online than there is demand for: while 60% of the respondents prefer residential instruction, only 33% are in residential courses. This roughly 2:1 ratio suggests an imbalance in the mix of course offerings.


When asked "For learning materials in the course, which one of the following do you prefer?" a majority of students prefer in-class lecture. Online presentations are the second most favored learning support material, followed by textbooks. Videos rank fourth. The lowest ranking option is abbreviated above; in full it read, "Online videoconferencing (for example Zoom)." Very few students favor live online videoconferencing.

Current course versus course preference



Updated at n = 343 responses

When the current course a student is in is cross-tabulated against their preferred course type, one can see that while very few of the students in a residential course want to be in an online course, 34% of the students in an online course want to be in a residential course. Students in a hybrid course also favor residential instruction, although the question did not offer hybrid instruction as a possible preference. Synchronous online courses are generally disfavored, including by students who are in online courses.

Current course versus preferred materials



Updated at n = 338 responses

Across all three course types students prefer in-class lecture. For online students, where in-class lecture is not an option, the next most favored option is online presentations. Given students' distaste for synchronous online courses and videoconferencing, this author presumes that the students are referring to asynchronous, self-paced presentations: slide decks available online. Synchronous online videoconferencing is the least favored option for support materials.

Deep dive into spreadsheet function technical weeds

The following is for spreadsheet geeks only.

Because the above data is based on a dynamically updating spreadsheet, dynamically updating functions are optimal. The following is being done in a Google Sheets spreadsheet that is updated with new responses in real time from a Google Forms survey.


The bulk of the form data comes in as nominal level data on the Form Responses 1 tab of the spreadsheet. A second tab ("d") has functions that convert these responses to numeric values.


The function in use is:

=IFS('Form Responses 1'!J2="Strongly disagree",1,'Form Responses 1'!J2="Disagree",2,'Form Responses 1'!J2="Neutral",3,'Form Responses 1'!J2="Agree",4,'Form Responses 1'!J2="Strongly agree",5,'Form Responses 1'!J2="Not applicable","")

The IFS function works like a CASE statement in other programming languages. This function is filled down and across all cells. Unfortunately, this tab has to be manually updated by filling down as new responses arrive.
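If filling down by hand becomes tedious, an ARRAYFORMULA wrapper may remove the need. The following is an untested sketch based on the formula above, assuming the responses remain in column J; the IFERROR blanks any row where no condition matches, such as empty rows or "Not applicable" responses.

=ARRAYFORMULA(IFERROR(IFS('Form Responses 1'!J2:J="Strongly disagree",1,'Form Responses 1'!J2:J="Disagree",2,'Form Responses 1'!J2:J="Neutral",3,'Form Responses 1'!J2:J="Agree",4,'Form Responses 1'!J2:J="Strongly agree",5),""))

A single formula in the top cell of the corresponding "d" column then covers the whole column, including responses that arrive later.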


On a third tab the function =average(d!A:A) automatically updates based on values in the second tab. No editing is needed as the function specifies a columnar range.
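The standard deviations reported earlier can be pulled from the same columnar ranges with a companion function, sketched here under the same assumption that the numeric values live on the "d" tab:

=stdev(d!A:A)

Like AVERAGE, STDEV over a whole-column range updates automatically as new rows are converted.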



The counts for some responses are automatically updated using a COUNTIF function. The use of a columnar range means that the counts update in real time as responses arrive.

=countif('Form Responses 1'!B:B,A2)

The key to making this work is that the items in column A have to exactly match the form responses.
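Where a share rather than a raw count is wanted, the same count can be divided by the total number of responses. A sketch, assuming the responses occupy column B with a header in row one:

=countif('Form Responses 1'!B:B,A2)/counta('Form Responses 1'!B2:B)

Formatting the cell as a percent then yields a live-updating percentage.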

The greater challenge was replacing the pivot tables that cross-tabulate responses from two columns of the forms spreadsheet. Pivot tables do not dynamically update, and remember, the data range is continuously expanding.


The solution is a COUNTIFS function with mixed addressing that can be filled down and across, automatically picking up the criteria for each cross-tabulation, coupled with columnar ranges.

=countifs('Form Responses 1'!$E:$E,$M2,'Form Responses 1'!$AG:$AG,N$1)

The result is a pivot table without using the pivot table functionality. Note that the numbers above add to 281. This is because six more evaluations arrived during the composition of this article. 
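The same mixed addressing can produce row percentages rather than raw counts by dividing each cell by its row total. A sketch reusing the columns from the formula above, which normalizes each row of the cross-tabulation:

=countifs('Form Responses 1'!$E:$E,$M2,'Form Responses 1'!$AG:$AG,N$1)/countif('Form Responses 1'!$E:$E,$M2)

Filled down and across, this shows what fraction of each row's responses falls in each column.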



Table columns and headers can then be cleaned up using a second table that shadows the original (the original table is constrained to exactly match the responses on the form).
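One way to build the shadow table is with simple cell references into the original, so the counts stay live while the labels are retyped by hand. A hypothetical sketch, assuming the original table's first data cell is N2:

=N2

Filled down and across under hand-typed, cleaned-up headers, the shadow table tracks the original as new responses arrive.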



