Assessing Learning in Online Introductory Statistics

On 16 March 2020 the spring term was brought to an abrupt end by the coronavirus pandemic. After a couple of weeks of uncertainty I opted to begin working preemptively on delivering statistics online in the summer. By late April the college had decided to run only online classes in the summer.

I knew I had to deliver the core outcomes for MS 150 Statistics in an online format, and that focus guided the course redesign. I also knew I had to design for students who might have nothing more than a smartphone. The course was already centered on Google Sheets as its statistical software, and the Google Sheets app would make statistics possible on both Android and iOS.

I was also guided by my knowledge of the bandwidth limitations that students on home ADSL would face, especially on Kosrae, which is connected to the Internet only by a fairly narrow-band satellite link. I knew that the only video likely to be deliverable to Kosrae was YouTube: of the streaming video services I could upload to, YouTube has the best dynamic adjustment of quality in response to changing bandwidth.

I focused on producing what would become a collection of 34 learning support statistics videos for the students.

Three course level student learning outcomes currently guide MS 150 Introduction to Statistics; a sketch of the corresponding Google Sheets calculations follows the list:
  • Perform basic statistical calculations for a single variable up to and including graphical analysis, confidence intervals, hypothesis testing against an expected value, and testing two samples for a difference of means.
  • Perform basic statistical calculations for paired correlated variables.
  • Engage in data exploration and analysis using appropriate statistical techniques including numeric calculations, graphical approaches, and tests.
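As a minimal sketch of what these outcomes look like in practice, the Google Sheets formulas below cover the calculations named above. The cell ranges and layout are illustrative assumptions, not the actual course materials: a single variable in A2:A42, paired (x, y) data in B2:B42 and C2:C42, and a hypothetical expected value in E1. Function availability assumes a current version of Google Sheets.

  =AVERAGE(A2:A42)                                      sample mean
  =STDEV(A2:A42)                                        sample standard deviation
  =CONFIDENCE.T(0.05, STDEV(A2:A42), COUNT(A2:A42))     margin of error for a 95% confidence interval
  =T.DIST.2T(ABS(AVERAGE(A2:A42)-E1)/(STDEV(A2:A42)/SQRT(COUNT(A2:A42))), COUNT(A2:A42)-1)   p-value for a two-tailed test against the expected value in E1
  =T.TEST(A2:A42, B2:B42, 2, 3)                         p-value for a two-sample test for a difference of means
  =CORREL(B2:B42, C2:C42)                               correlation for the paired variables
  =SLOPE(C2:C42, B2:B42)                                slope of the best fit line for y on x
  =INTERCEPT(C2:C42, B2:B42)                            intercept of the best fit line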

Final examination details and performance

Assessment of learning this term was based primarily on analysis of the final examination. Thirty-one students took the final examination in the two sections of MS 150 Statistics, 23 in section O1 and 9 in section O2. The course had 32 students enrolled at term end; the one student who did not take the final had not completed any work after the midterm, and family issues appeared to be involved for that student.
The table depicts the percent success rate by item on the final examinations across ten terms.
[Figure: databars of the percent success rate by item across ten terms]

The pattern of student success on the final examination in this first online summer run of statistics is similar to the pattern of student success seen in the past. Performance levels were slightly, but not significantly, lower.
[Figure: performance by section on the final examination and in the overall course average]

In the online world there is really no such thing as a section, nor sectional differences. Here the small sample size of section O2 contributed to greater variation in the inter-sectional averages.
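One way to check whether an inter-sectional difference of this kind exceeds sampling noise would be a two-sample t-test in Google Sheets. The ranges below are illustrative assumptions, with the 23 section O1 final examination percentages in A2:A24 and the 9 section O2 percentages in B2:B10, not the actual gradebook layout:

  =T.TEST(A2:A24, B2:B10, 2, 3)     two-tailed p-value, unequal variances

A large p-value here would be consistent with the variation being due to the small O2 sample rather than a real sectional difference.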

Student learning outcome performance over multiple terms

The introduction of Schoology Institutional in January 2018 made it possible to track performance against student learning outcomes. Prior to January 2018, Schoology Basic permitted entering student learning outcomes, but the Basic version does not provide access to the Mastery screen. Once the college adopted the institutional version, however, Mastery data became available as far back as the instructor had measured against student learning outcomes. Data across seven terms are reported for the following three learning outcomes:

1.0 Perform basic statistical calculations for a single variable up to and including graphical analysis, confidence intervals, hypothesis testing against an expected value, and testing two samples for a difference of means.
2.0 Perform basic statistical calculations for paired correlated variables.
3.0 Engage in data exploration and analysis using appropriate statistical techniques including numeric calculations, graphical approaches, and tests.
[Figure: performance on student learning outcomes over multiple terms]

Performance on the first learning outcome was on par with historic performance levels. The second learning outcome saw a strong uptick in performance. As there is no particular reason for so strong a change, this is likely random variation, due in part to the overall smaller size of the course this summer.

Long term course average and standard deviation

The shortened spring term had been noted to have had a positive impact on the course average. The summer average was lower and represented a return toward the multi-term overall average, a regression to the mean.


Spring 2020 had no final examination. Summer 2020 saw a drop from the prior two terms, and the drop also fell below the long term overall average. A return toward the mean had been predicted ever since the unusual strength of the spring 2019 term. Charts such as this fourteen-year longitudinal data collection remind me that the idea of "continuous improvement" is laudable but loses particular meaning over a sufficiently long time frame. There is arguably no time in human history when humans intentionally sought not to do better. There are simply real-world constraints on continuous improvement, and this data helps illustrate that. Data is messy, and long term trends are often obscured by short term, higher frequency events. While the running average improved from 72% in 2012 to 75% in 2019, the improvement was modest at best and insignificant at worst.

In general the first run of the statistics course as a purely online, asynchronous course went well, better than expected. At the end of the term 31 of the 32 students achieved passing grades in the course. Every student who took the final examination achieved a passing mark. Learning outcome performance was on par with prior residential instruction.

I realize that students consider online instruction to be inferior to residential instruction, and perhaps it is. That said, I feel that my students had a quality learning experience in part because of the approaches I took and the level of engagement I maintained throughout the summer term. I remained available seven days a week from the time I woke up until the time I went to bed. This added to my personal workload, but I did not want students waiting 12 or 24 hours for an answer to a question that they had. I cannot judge whether I was successful, but one student wrote to say, "I give you one big fat "A" for your effort, time, and everything regarding this course. You are very helpful to me throughout the summer. I give you one big fat "Thank you" for that, also. Thank you so much."
