Canvas analytics and assessment data week two fall 2021

 


Week two saw upticks in the number of courses being served by Instructure Canvas as well as in the number of instructors and students active on the platform. Systemwide, Canvas delivered 1,293 assignments and 150 discussions in the second week of classes. 

Page views provide a snapshot of platform activity for all users. Canvas was most active on Mondays, Tuesdays, and Thursdays. As seen in the past, activity levels on the platform tend to drop on Fridays, falling to their lowest levels on Saturdays. Traditionally, Saturday is a family day for housework, gardening, farming, fishing, and recreation. The single in-term Sunday so far showed activity levels on par with the prior Friday. 


At the end of the first week of classes there were 451 outcome evaluation events recorded on the platform. In week two the outcomes dashboard documents 1,065 outcome evaluation events. These events include rubrics with outcomes from the institutional bank of course learning outcomes being marked, and quizzes/tests with question banks linked to outcomes. The values under pentascale rescale outcome points from quizzes and tests to match the scale used for outcomes included in rubrics. The course learning outcomes are rated on the following five-point scale:

5    Optimal performance of an outcome
4    Sufficient performance of an outcome
3    Suboptimal performance of an outcome
0    No evidence of performance of an outcome
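
The pentascale rescaling described above can be sketched as a small function. Note that this is a hypothetical linear rescaling for illustration only; the actual formula used by the dashboard is not documented in this article.

```python
def pentascale(points_earned, points_possible):
    """Rescale quiz outcome points onto the five-point rubric scale.

    Hypothetical linear rescaling for illustration; the dashboard's
    actual formula may differ.
    """
    if points_possible == 0:
        return None  # no evidence of performance recorded
    return round(5 * points_earned / points_possible, 2)

print(pentascale(8, 10))   # 4.0
print(pentascale(10, 10))  # 5.0
```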

If an instructor chooses to remove the numeric values from a rubric, as seen in the second option below, then the outcomes rating is reported only as a nominal rating.


The result is that course learning outcomes are reported without a value. These are filtered out of the tables of averages seen above, but the data remains available.
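
The filtering step can be illustrated with a minimal sketch; the record structure and values here are invented for illustration, not the dashboard's actual schema.

```python
# Hypothetical outcome evaluation records; the second rubric row
# carries only a nominal rating and no numeric value.
records = [
    {"outcome": "EN120b.1", "rating": "Sufficient", "score": 4},
    {"outcome": "EN120b.1", "rating": "Optimal", "score": None},
    {"outcome": "EN120b.1", "rating": "Optimal", "score": 5},
]

# Drop unvalued ratings from the average; the raw rows remain available.
scored = [r["score"] for r in records if r["score"] is not None]
average = sum(scored) / len(scored)
print(average)  # 4.5
```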


This is one of the strengths of using Canvas along with the college's own business intelligence tools: the college chooses how to analyze the data. 

Disaggregation by student information system variables

Assessment data disaggregated by student information system variables is available to this author only for students in the author's own courses. 


This still provides an opportunity to demonstrate disaggregation by variables including gender, major, state of origin, and age. 


Disaggregation by gender suggests a small but potentially significant gender gap given the sample sizes. 


The above disaggregation shows overall performance across the author's courses by major in ascending rank order. Some majors are not displayed due to small sample sizes. This is also a demonstration of the capabilities inherent in controlling the tools: the dashboard was modified to exclude majors with fewer than five students. The HCOP major is dominated this term by data generated by DDFT students in MS 150 Statistics. Bearing in mind that the above averages are across all of the course learning outcomes seen earlier, the ANR students appear to be most in need of overall assistance and learning support. 
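
The small-sample filter can be sketched in a few lines. The five-student threshold follows the description above; the majors and score values are invented for illustration.

```python
from collections import defaultdict

# Hypothetical (major, score) pairs drawn from outcome evaluations.
rows = [("HCOP", 4.2), ("HCOP", 3.9), ("HCOP", 4.5), ("HCOP", 4.0),
        ("HCOP", 4.1), ("ANR", 3.1), ("ANR", 3.4)]

by_major = defaultdict(list)
for major, score in rows:
    by_major[major].append(score)

MIN_N = 5  # dashboard filter: hide majors with fewer than five students
averages = {m: sum(s) / len(s) for m, s in by_major.items() if len(s) >= MIN_N}
print(averages)  # ANR is filtered out; only HCOP is reported
```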


A descending rank order by state of origin as listed in the SIS suggests that students from Yap and Kosrae may be underperforming relative to students from Pohnpei and Chuuk. This table is more complex to interpret because Pohnpei state has a disproportionately larger number of residential students.

The College of the Marshall Islands received a recommendation to be able to disaggregate learning data by more than just gender; CMI is also using TracDat. The above data is near real time, not an after-action post-term report. The data has the potential to allow an institution to move from reactive to proactive solutions while disaggregating data by other variables available in the SIS.

As an instructor, the author is doing no additional work beyond marking assignments in Canvas using rubrics containing course learning outcomes from the institutional bank of course learning outcomes, and using quiz question banks linked to the same set of course learning outcomes. The only additional work the author is doing is the generation of the dashboards, a task that would typically be handled by an office of assessment staffed with technical expertise in data mining and analysis. 

Appendix: An optional deep and long dive into the pSLOs and iSLOs dashboard

Summary: In order to display values on the dashboards above, provisional decisions were made about outcome mappings. The current structure of the course outlines makes these decisions challenging. These decisions are temporary and will be reviewed and replaced if a decision is made to use Canvas plus business intelligence dashboards to handle academic outcomes aggregation and assessment.

New to the outcomes reports above was course-level outcome number one for EN 120b Expository Writing II. Outcome EN120b.1 states, "1. Investigate research topics in a variety of disciplines." To produce the dashboards seen above, the course-level outcome must be mapped to a program learning outcome. The current outlines do not specify to which program learning outcome a given course learning outcome maps; they map lower-level specific learning outcomes to multiple program and institutional learning outcomes. Formatting issues further complicate this process. 


The outcome has a number of formatting deficiencies in the online version, including the spurious "2.", "3.", and "4." seen above. Without altering the content, the first course learning outcome includes the following specific learning outcomes.


As noted in earlier articles on the taxonomy of assessment and grey goo, these grids imply many-to-many relationships for course learning outcomes. While matrices could be constructed to handle these aggregations, mapping many-to-many produces an average of averages at higher levels of the outcomes hierarchy: outcomes at the program and institutional level tend to converge on a single population mean. A single course can still map to multiple program learning outcomes.
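
The convergence problem can be demonstrated with a toy example: if every course outcome maps to every program outcome, each program-level "average of averages" collapses to the same overall mean, and the program-level numbers stop discriminating. The outcome names and values here are invented for illustration.

```python
# Hypothetical course-level outcome means.
course_outcome_means = {"CLO1": 3.2, "CLO2": 4.6, "CLO3": 4.0}
program_outcomes = ["PSLO1", "PSLO2", "PSLO3"]

# A full many-to-many mapping: every CLO feeds every PSLO.
mapping = {p: list(course_outcome_means) for p in program_outcomes}

program_means = {
    p: sum(course_outcome_means[c] for c in clos) / len(clos)
    for p, clos in mapping.items()
}
print(program_means)  # every PSLO reports the identical mean
```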


Here the apparent multiway mapping to institutional learning outcomes relates to a proposed set of five institutional learning outcomes (the i5 series) aligned with the general education program learning outcomes. A proposed modification to course outlines would specify these mappings.

To provide a demonstration of the outcomes dashboard included earlier, however, choices had to be made about mapping EN120b.1 to a single program learning outcome. EN120b.1.1 carries the bulk of the content and, along with EN120b.1.2 and EN120b.1.3, maps to PSLO "2". The PSLO is not explicitly labelled in the outline with a prefix to indicate which program the outcome belongs to. 

Higher up on the outline, however, is the following list. The "2" refers to the second outcome listed below: not the canonical name of the program outcome, but its relative position in this list.

PROGRAM STUDENT LEARNING OUTCOMES (PSLOs): The student will be able to:
◦ Write a clear, well-organized paper using documentation and quantitative tools when appropriate.
◦ Demonstrate the ability for independent thought and expression.
◦ Demonstrate understanding of the modes of inquiry by identifying an appropriate method of accessing credible information and data resources; applying the selected method; and organizing results.

Note that the above does not tell the reader what program these outcomes have come from. Even further above on the outline is a section:

PSLOS OF OTHER PROGRAMS THIS COURSE MEETS:
PSLO#: Program
# 1 Effective Communication: General Education
# 2 Critical Thinking and Problem Solving

This would imply to the reader that the outcomes in the PSLO section further below are not general education program PSLOs. The complication is that PSLO number one in general education is not "Effective communication." Among other things, one cannot say, "Students will be able to effective communication"; that is not a properly worded student learning outcome. And the outcomes in the PSLO section further below ARE general education program learning outcomes. 

General education outcome number one is actually, "Gen Ed 1.1 Write a clear, well-organized paper using documentation and quantitative tools when appropriate." Number two listed above appears to be a reference to the first of the following pair of general education outcomes:

Gen Ed 2.1 Demonstrate the ability for independent thought and expression.
Gen Ed 2.2 Demonstrate understanding of the modes of inquiry by identifying an appropriate method of accessing credible information and data resources; applying the selected method; and organizing results.

Returning to:

◦ Write a clear, well-organized paper using documentation and quantitative tools when appropriate.
◦ Demonstrate the ability for independent thought and expression.
◦ Demonstrate understanding of the modes of inquiry by identifying an appropriate method of accessing credible information and data resources; applying the selected method; and organizing results.

Note that general education outcome 2.1 matches the second outcome in the list of three program learning outcomes, while the first program learning outcome matches general education outcome 1.1. What is not obvious to the reader is that there is an intervening general education program learning outcome between 1.1 and 2.1, general education outcome 1.2: "1.2 Make a clear, well-organized verbal presentation." Again, the proposed outline modification removes the spurious "other programs this course meets" section, adds program prefixes for clarity, uses the numbering for the program outcomes used elsewhere, and simplifies the mapping to a single table.

Program section with prefixes and original numbers for clarity (MS 150 Statistics example)

Course learning outcomes listed (specific outcomes move to an administratively approved guide)

A grid with prefixes to make the mapping clear to the reader


Note that both Nuventive TracDat and Canvas + BI operate from the course learning outcome level. Neither attempts to aggregate from the specific learning outcome level. Specific learning outcomes and recommended assessments move to an administratively approved course guide under the proposal.


The actual mappings occur in a mapping table, as seen above. Note that the same many-to-many mapping issues arise between the program and institutional levels: the outline implicitly maps specific learning outcomes to both program and institutional learning outcomes. These program-to-institutional mappings are proposed to move up to the program "outlines," known as Appendix T and Appendix J.

Lookup tables provide the ability to modify mappings from a single place.
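
A lookup-table approach can be sketched as simple dictionaries. The EN120b.1 to GenEd 2.1 mapping follows the discussion above, while the institutional outcome code "i5.2" is a hypothetical placeholder, not an actual code from the i5 series.

```python
# Single-place lookup tables: edit a mapping here and every dashboard
# that joins through the table picks up the change.
clo_to_pslo = {
    "EN120b.1": "GenEd 2.1",  # provisional mapping discussed above
}
pslo_to_islo = {
    "GenEd 2.1": "i5.2",  # hypothetical institutional outcome code
}

def institutional_outcome(clo):
    """Chain course -> program -> institutional outcome lookups."""
    return pslo_to_islo[clo_to_pslo[clo]]

print(institutional_outcome("EN120b.1"))  # i5.2
```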

Program to institutional outcome mappings

Institutional learning outcomes mapping between current and proposed

To resolve the formatting issues, the outlines and program documents would be held in the format seen in the example outline in the college's new Workspace for Education. A Shared Workspace Drive has been set up to hold outlines, provide faculty with access to them, and give management control to appropriately authorized college personnel.

This deep dive under the hood of the dashboards is intended to provide transparency on the decisions being made, decisions that will need to be reviewed and are subject to future modification. For now the dashboards are intended as a demonstration of capabilities. They are based on the most reasonable set of decisions the author can make at this time; without these decisions, the dashboards would be unable to demonstrate their capabilities. 
