Canvas analytics and assessment data: week ten, fall 2021

Courses and instructors remained the same from week nine to week ten. The number of students dropped slightly, while other areas of the Instructure Canvas platform showed modest growth.


Breaking out the number of assignments, students, discussions, and media recordings shows the growth in assignments more clearly. Files are not included in the above chart due to the complexity of interpreting the file counts. Some files are Word documents or Portable Document Format (PDF) files uploaded by an instructor. Other files include images inserted into pages, quizzes, assignments, and other areas of Canvas. Instructors have a 500 MB file storage limit.


Academic content is tracked as assignments (which in this case include tests and quizzes), discussions, and media recordings. Breaking out just those three categories indicates that assignments strongly dominate platform use.


With the week not yet over at the time this report was produced, page views in week ten appear to be on par with week nine. 


While week ten appears to be underperforming week nine, data is not yet complete for Friday, Saturday, and Sunday of week ten. 


A total of 3,905 assessments of learning have occurred for 222 students in 14 courses, with an overall average performance of 3.44 on a five-point scale.


Performance by course can be seen in the above dashboard. 


Performance on the general education learning outcomes remains at 3.98 out of five, the same as week nine. Last week there were 3,480 assessments under general education; this week there are 3,806.


Work done in the 1990s indicated that English, mathematics, and science courses were functioning as "gateways" to graduation. Lower rates of success in these general education core courses were throttling student throughput to graduation at the institution. There are suggestions in the learning data that this academic landscape remains essentially the same a quarter century later. The confounding factor would be that these same courses may be acting as a sieve, such that only academically stronger students reach upper level education, social science, and business classes, resulting in the above performance differentials. Another caveat is that this data covers only 14 courses and 222 students. For some of the two-letter prefixes the sample sizes are not large.

Grey Goo


For a week in which a course learning outcome is assessed that was not previously mapped, the dashboard will display "No data." This leads me back to the course outline to try to determine how to map the outcome to a program learning outcome. All too often I see the following:


Grey goo. Everything maps to everything. As if to not decide is to decide, for saying something maps to everything is to say nothing at all. I look on this situation with no small amount of despair. Yes, one could theoretically build mapping matrices that allowed the many-to-many relationships implicit above, but the result would be that the averages would converge on the average of the averages at each level. When everything has the same value, one cannot identify areas of relative strength, weakness, and opportunities to improve.
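A small numeric sketch makes the convergence concrete. The outcome names and scores below are invented for illustration, not drawn from this report: three course learning outcomes with distinct averages collapse to a single grand mean at the program level once everything maps to everything.

```python
# Hypothetical course learning outcome (CLO) averages on the five-point scale.
clo_means = {"clo1": 2.5, "clo2": 3.5, "clo3": 4.5}

# One-to-one: each program learning outcome (PLO) inherits one distinct CLO.
one_to_one = {"plo1": ["clo1"], "plo2": ["clo2"], "plo3": ["clo3"]}

# Grey goo: every CLO maps to every PLO.
grey_goo = {plo: list(clo_means) for plo in ("plo1", "plo2", "plo3")}

def plo_averages(mapping):
    return {plo: sum(clo_means[c] for c in clos) / len(clos)
            for plo, clos in mapping.items()}

print(plo_averages(one_to_one))  # {'plo1': 2.5, 'plo2': 3.5, 'plo3': 4.5}
print(plo_averages(grey_goo))    # every PLO reads 3.5, the grand mean
```

With the one-to-one mapping, the program level still shows relative strengths and weaknesses; with grey goo, every program learning outcome reports the same number.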

Note too that the course learning outcome does not itself map to anything. More fundamentally, there is no sense in which the specific learning outcomes map to the course learning outcome: they are mapped around the course level, directly up to the program and institutional level. Each and every outline thus carries an implicit mapping decision from program learning outcomes to institutional learning outcomes.

With the outline author silent on a primary mapping, or any mapping, for the course learning outcomes, those in the trenches of assessment work have to make reasonable decisions on mapping each course learning outcome to the program learning outcome that the course outcome best serves. Bear in mind that the assessment systems in use track and map course learning outcomes, not specific learning outcomes.

Note that this does not mean a single assignment cannot assess multiple course, program, and institutional learning outcomes. 


The constraint is only that each course learning outcome must map to a single program learning outcome. The above assignment, a science laboratory marked by a rubric built from course learning outcomes, maps to three course learning outcomes and, through each of those, to three different program learning outcomes.
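A minimal sketch of that structure, with identifiers invented for illustration: one assignment is scored against three course learning outcomes, each of which maps, many-to-one, to exactly one program learning outcome, so the single assignment still reaches three program learning outcomes.

```python
# Hypothetical rubric: one laboratory assignment scored on three course
# learning outcomes (CLOs). Names and scores are invented for illustration.
lab_rubric_scores = {"sc101_1": 4, "sc101_2": 3, "sc101_3": 5}

# Each CLO maps to exactly one program learning outcome (PLO) ...
clo_to_plo = {"sc101_1": "plo1", "sc101_2": "plo2", "sc101_3": "plo3"}

# ... yet the single assignment assesses three different PLOs via the rubric.
plos_assessed = {clo_to_plo[clo] for clo in lab_rubric_scores}
print(sorted(plos_assessed))  # ['plo1', 'plo2', 'plo3']
```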

In this case the course learning outcome "Develop math lessons with supplementary materials" has to serve one of the following program learning outcomes.

The above screenshot is from the college catalog. Note that program learning outcomes two and three appear on the line containing program learning outcome one. There are six listed program learning outcomes. The outline, however, lists only five learning outcomes, which are similar but not the same.



I realize that there is a natural reaction to fix this particular instance of an error, but this points to the larger meta-issue that the outlines are too complex. Errors such as the above are inevitable with the outline format currently in use: the outline is overloaded, making errors unavoidable and, once made, hard to discover. The outline above apparently dates back to 2018. A proposal for a format that simplifies the outlines was developed and first presented in May 2021. A simpler outline would be easier to review and deploy.


Underneath the hood of the assessment mappings is a dimensions table. This provides a single point at which mappings can be entered and vetted. If the one-to-one mapping proposal were accepted, the work to be done would be these mapping decisions.
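A pandas sketch of how such a dimensions table could behave; the schema and identifiers are assumptions for illustration (only edtc4 appears in this report), not the actual table in use. Because each course learning outcome row carries exactly one program learning outcome, a single join rolls scores up each level.

```python
import pandas as pd

# Hypothetical dimensions table: one row per course learning outcome (CLO),
# each carrying exactly one program learning outcome (PLO) and one
# institutional learning outcome (ILO). Identifiers other than edtc4 are
# invented for illustration.
dimensions = pd.DataFrame({
    "clo": ["ed301_1", "ed301_2", "ed301_3"],
    "plo": ["edtc4", "edtc1", "edtc4"],
    "ilo": ["ilo2", "ilo2", "ilo1"],
})

# Hypothetical fact table of individual assessments on the five-point scale.
assessments = pd.DataFrame({
    "clo": ["ed301_1", "ed301_1", "ed301_2", "ed301_3"],
    "score": [4, 3, 5, 2],
})

# One join against the dimensions table rolls course-level scores up to the
# program and institutional levels.
rollup = assessments.merge(dimensions, on="clo")
print(rollup.groupby("plo")["score"].mean())
print(rollup.groupby("ilo")["score"].mean())
```

Because each course learning outcome appears exactly once in the dimensions table, changing a mapping is a one-cell edit that propagates through every dashboard built on the table.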


For the purposes of demonstrating the resulting dashboard, this course learning outcome is being mapped to edtc4 above because "developing ... lessons with supplementary materials" seems logically to be part of organizing and managing a classroom environment for learning. Any of these mappings can be changed later, but a mapping of course learning outcomes to program learning outcomes, which does not exist, should have been done on the outline.

Reports

Week one: assessment available at sunrise on a new term
Week two: SIS disaggregation and a deep dive into mapping decisions
Week three: SIS disaggregation and another dive into mapping complexities
Week four: a focus on subaccounts and campuses
Week five: pivot of state of origin versus gender
Week six: dashboards can be focused
Attendance versus performance in statistics
Week seven: outcome performance at the course level and other dashboards
Week eight: update at midterm
Week nine: a look at score and grade distributions
Week ten: grey goo revisited
