There is no option to migrate from Instructure Canvas to Moodle by exporting courses from Canvas and importing those courses into Moodle. Canvas exports courses as Instructional Management Systems Common Cartridge version 1.1 files. Moodle only imports IMS Common Cartridge 1.0 files. The versions are incompatible and as far as this author is aware there are no converters between the versions.
IMSCC 1.0 is a 2008 standard and IMSCC 1.1 dates back to 2011 or earlier. In other words, both versions are out of date. The current version is IMSCC 1.3 and IMSCC 1.4 is under development. Neither Instructure nor Moodle appears to be working on code to handle any newer IMS CC version. The inability to easily move courses between learning management systems benefits vendor lock-in.
As a result, migration is a heavy-lift manual process of rebuilding element by element from a blank sheet. Alternatively, one can pay a third party significant sums of money to handle the migration. This tends to keep institutions from leaving a learning management system platform.
Playlists
The following are links to playlists that this author found helpful.
Moodle produces a series of videos that introduce the features of Moodle 4.5. Perhaps binge watch the entire playlist on a rainy afternoon and then go back to view the specific features of interest.
DELTA (Digital Education and Learning Technology Applications) LearnTech at North Carolina State University has Moodle videos in direct support of the collegiate classroom. DELTA LearnTech's Moodle 4 videos provide step-by-step coverage of specific Moodle 4 topic areas. Moodle 4.3 is substantively similar to the current version 4.5.
Moodle 4 came out in April 2022 and featured an extensive change of the user interface from Moodle 3. Moodle 4 adopted a course home page design similar to that of Canvas. Thus videos that are more than two years old are not going to be helpful.
Moodle also has community forums, much as Canvas did. The forums themselves actually run in a Moodle instance.
For those who want to dive into open issues and ongoing developer work under the hood of Moodle, Moodle has an open issue tracking system. There one can learn, for example, that MathML is not yet supported in the Tiny code editor.
Moodle has a mobile app in phone app stores with the icon seen above.
Copying and pasting text with images between the platforms does not move images
I am doing a lot of copying and pasting from Canvas to Moodle. Copying and pasting text and image content from Canvas to Moodle appears straightforward. Drag-select the material to be copied in Canvas and then use the usual keyboard shortcut for copying.
Here the introduction in a Canvas page is being copied. The whole page was selected by drag-selection and keyboard copy. Then the material was pasted into a Moodle Page* using the keyboard shortcut for pasting:
Everything appears to have come over from Instructure. A look at the source code, however, suggests a future issue.
Note in line two that the source of the image is back in the Instructure cloud for the course from which the copy was made. The image is being rendered because the Single Sign-On for Canvas is the same as for Moodle. The image is not in Moodle. Users without authorized access to that Canvas course in that term will not see the image. Eventually the course will be deleted from Canvas along with all of its images.
To move the image right-click on the image in Canvas, save the image to the desktop, delete the copied and pasted image in Moodle, and then upload the image from the desktop into Moodle. The resulting code will show that the image is now hosted in Moodle.
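As a hypothetical illustration of the difference (the domain names, IDs, and file paths below are invented placeholders), the pasted source points back to the Instructure cloud, while the re-uploaded image is served from the Moodle site itself:

<!-- Pasted from Canvas: the image is still hosted in the Instructure cloud -->
<img src="https://example.instructure.com/courses/12345/files/67890/preview" alt="introduction image">

<!-- After re-uploading into Moodle: the image is served by Moodle's own file store -->
<img src="https://moodle.example.edu/pluginfile.php/.../introduction-image.png" alt="introduction image">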
*Moodle has both Pages and Books. The actual work was moving the material into a Moodle Book; here a Moodle Page is being used for demonstration purposes.
Categories
The term "categories" in Moodle can refer to two different elements.
Gradebook categories group graded activities into categories such as assignments, tests, presentations, laboratory reports, or other gradebook categories of your choosing.
Question Bank categories group question bank questions into categories of your own choosing. In MS 150 Statistics I have Question Bank categories such as Basic Statistics, Paired Variables, Confidence Intervals, Hypothesis Testing, and Two Sample t-Tests.
I do not recall seeing documentation that was clear on this distinction.
Preventing the embedding of videos
The Moodle multimedia plugin automatically embeds YouTube videos whenever a link is detected in a text field such as in a page or description text field. There are situations in which this behavior is not desirable, such as a paragraph of text with links from words in the paragraph to YouTube videos. Use View Source to add embed=no%26 between the ? and the v= in the YouTube link.
The %26 is URL encoding for the ampersand character and is necessary for the link to operate properly.
In one instance the %26 code failed. No amount of troubleshooting was able to resolve the issue. The solution was to use the HTML character encoding of & rather than %26. This is something I have encountered before, but the source of the issue is a puzzle.
From this point forward I only added embed=no& to the link, as this approach appears to always work. The URL-encoded form did not work consistently.
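As an example (the video ID below is a placeholder, not a real video), a standard YouTube link and its non-embedding form look like this:

Standard link (will auto-embed):  https://www.youtube.com/watch?v=VIDEO_ID
Modified link (embedding off):    https://www.youtube.com/watch?embed=no&v=VIDEO_ID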
Auto-linking
"Auto-linking is a feature by which words or phrases used within a Moodle site are automatically linked to glossary or database entries, or activity and resources within the course with the same name."
Auto-linking means that activities and resources should have unique names and that, as one composes, one should remain aware that auto-linking will occur only after saving a page, book, or other text block. Note that auto-linking does not occur for resources that start with an emoji. Thus a page named "🎞 Links to Slides" will not be discovered by the auto-link system. To auto-link to such a page, the first character cannot be an emoji. The page "Links to Slides 🎞" does appear to generate functioning auto-links.
Odd Ends
There is a lot to learn and many subtleties. An early gotcha for me was that Scales (which attach to Outcomes) must be entered from lowest ranking to highest ranking: No evidence, Suboptimal, Sufficient, Optimal. Once a Scale is attached to an Outcome there appears to be no way to detach the Scale other than to delete the Outcome and start over.
Another gotcha was that once an outcome is attached to an assignment or quiz from the settings screen, that outcome cannot be deleted from that same screen after saving. To delete the outcome one has to go into the Grades -> Grade Setup to delete the outcome.
As soon as questions are created, they are loaded into the Question Banks. Questions can only be duplicated in the Question Bank. Knowing this can be a real time saver. Once duplicated and edited, however, the question cannot be added to the quiz or test from the Question Bank screen. You have to return to the Quiz or Test to add questions from the Question Bank.
Rubrics
I am building Rubrics on a one-off basis as I have yet to learn how to manage and reuse Rubrics. As far as I can tell, outcomes cannot live in Rubrics. How Rubrics, Outcomes, and Scales will interact and present themselves to me is something I do not expect to know until I have students submitting assignments.
Rubrics, updated. Rubrics can be reused, but the terminology and processes differ vastly from Canvas. To start the process of using or reusing a rubric, set the Grading method to Rubric when setting up an assignment.
As soon as the assignment is saved with the Rubric option checked, one is automatically taken to a screen to either build a new rubric or generate one from a template.
This area is also accessible from Advanced Grading. Building a new rubric is relatively straightforward. Reusing an existing rubric is rather non-obvious to a Canvas user. The option "Create new grading form from a template" is the path to reusing a pre-existing rubric. In the following always remember to scroll down every time the screen changes. Scroll down, scroll down, when in doubt, scroll down. This is because Moodle inevitably returns one to the top of a page when the dialog box one needs is near the bottom.
After choosing to create a new grading form from a template, scroll down to see the above. Strangely enough, the "include my own forms" checkbox is unchecked by default. To see your own rubrics, check that box. Why isn't that checked by default? Then click on Search. Although it might appear that nothing has happened except to bounce you to the top of a page, scroll down.
If all has gone well, your pre-existing rubrics should be near the bottom of the page. Below the rubric you want to reuse, check Use this form as a template. And, again, you may now be bounced to the top of the page and puzzled at why nothing appears to have happened. Scroll down. Note if you attempt to leave this page you will get a Leave page? warning. This is because there is an open dialog box below. Do not leave the page. Scroll down.
Down the page is where you confirm the reuse of the rubric, referred to as a template or grading form. Notice that the word rubric is completely absent from the dialog box as well as other places. Moodle apparently considers these to be grading forms. One caveat: the various rubric settings will be grayed out, unchangeable, in the reused rubric. If you need new rubric settings then you will need to make a new rubric. I have yet to grok those settings, so I have been leaving them set at default. I may live to regret this.
Submissions and Moodle Grader 10 December
For students submitting gdocs and gsheets from the college Google Workspace for Education there is a Google LTI connection available in the Moodle file picker. This opens a Google Drive file picker that, at present, does not display all files in a student's Google Drive. The workaround is for the student to enter the name of the file in the search box and then click on search to display the desired file. Partial matches work in the search box.
Mathematical equations in Moodle 10 December
There is a math equation editor in Moodle that produces and inserts TeX into Moodle resources and activities. The graphical user interface does not include access to all of the capabilities of TeX. Moodle has a support document that covers the use of TeX in Moodle. The only element that I did not see in the support document is \text{}, which I use in physical science formulas. Note too that equations in TeX can be entered directly into the editor without using the math equation widget. The equation does not display until you save the page. If you are writing your own TeX in the editor and want to check the syntax, you can select the TeX statement, open the math equation editor, and the equation editor will display the formatted TeX.
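As an illustration of where \text{} earns its keep (the formula and the numbers are my own example, not from the Moodle documentation), units and words stay upright instead of being italicized as variables:

\text{density} = \frac{\text{mass}}{\text{volume}} = \frac{240\ \text{g}}{100\ \text{cm}^3} = 2.4\ \text{g/cm}^3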
Assignment round trip marking with rubric and outcomes 11 December
After setting up a scale, outcomes, and then a rubric, I was able to round trip mark an assignment with both a rubric marking the assignment and course level outcomes being evaluated.
In the above screenshot I am in student mode and I click on the Add icon. The assignment to be submitted is a Google Docs assignment in the college Workspace for Education.
The Google File Picker does not show all folders or files in the drive by default. A search for the title of the document is necessary.
The search has located the file. Looking forward I can see that I need some sort of unique identifier system for filenames to help students find files.
There is an interstitial file picker screen where the students have to choose a license. This may be confusing to a few students.
There is a final step where the student must click on Save changes to submit the document. Back in the instructor role a blue Grade button has appeared. This opens the Moodle Grader.
The rubric can be seen on the lower right.
The right side panel did not appear to have a movable "left wall" but at the top right of the rubric is a "full screen" icon that expands the rubric.
Below the rubric are the course level outcomes that were linked to the assignment.
The rubric is marked by clicking on the rubric, as in Canvas.
A drop down list allows the outcome to be marked.
There is one potential gotcha here: the Save changes button must be clicked on for the grade to be saved. This button is likely to be off the bottom of the screen on a laptop. In Moodle the mantra is "When in doubt, scroll down. Scroll down."
This leads to a confirmation screen.
Below the grade points is the marked rubric. Note that the rubric description appears here.
The comment I made in the grader ("Feedback") appears below the rubric. Outcomes are not reported on this screen.
In Grades there is now an Outcomes report.
This report appears to be an aggregate report. I did not see anything similar to the Learning Mastery Gradebook, but then I also have no actual students in this course.
Note that Moodle appears to have assigned points to the outcomes. That is not something I saw as an option when I set up the outcomes. Also, Moodle has chosen a number scale that appears to be 4, 3, 2, and 0.
Edit: After a deeper dive I think Moodle is using No evidence (1), Suboptimal (2), Sufficient (3), and Optimal (4). Scales are apparently numbered monotonically starting at one. There is a distinction in the documentation between normalized and sum aggregation, but perhaps that option is not enabled in the college Moodle as I did not see an aggregation option. And the Standard scale checkbox is disabled.
The zero above threw me off, but that zero is an artifact of SC130.2 not having been evaluated yet. Those are the course averages on the 4-point monotonic scale, which makes some sense. Although I was using a 5, 4, 3, 0 scale so my outcomes could be used to mark assignments in Canvas, a zero (No evidence) had a disproportionate impact on a student's average. Put another way, a student with one No Evidence and one Optimal has an average of 2.5, which is below Suboptimal. On a four-point monotonic scale they would have (1+4)/2 = 2.5, but now that 2.5 is between Suboptimal and Sufficient. I had thought about this issue even in Canvas but had not decided on a way to address it.
This is also the logic underneath school systems that now set a floor of 50% on grades. A 50% is the new zero. Same logic. A student with scores of 0 and 100 has a 50% average, an F. By making 50% the new zero the same student has (50+100)/2 = 75, or a C. Put another way, in the first case the average of an F and an A is an F, but under the 50% floor system the average of an A and an F is a C. So I get the monotonic scale. This also means that outcomes cannot then be used to generate scores for grading, because Suboptimal is not the same as failing, but 2/4 is failing.
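Laying the arithmetic from the two preceding paragraphs side by side, for one No evidence mark and one Optimal mark:

My Canvas-era 5, 4, 3, 0 scale:  (0 + 5) / 2 = 2.5, below Suboptimal (3)
Moodle's monotonic 1 to 4 scale: (1 + 4) / 2 = 2.5, between Suboptimal (2) and Sufficient (3)
The 50% floor analogy:           (0 + 100) / 2 = 50 (an F)  versus  (50 + 100) / 2 = 75 (a C)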
The Outcomes are set to a scale (once used in an assignment, the scale cannot be edited nor changed, so design well before you attach outcomes to assignments).
The scale provided no indication that any numbers were being attached to the scale. So this is an area into which I need to take a deeper dive before getting too much further into setting up my course. Edit: this was written before the earlier edit. This is a stream of spaghetti consciousness post.
As far as I could tell, there also is no outcomes results export report that would provide the student-by-student outcomes achievement information that the Canvas Learning Mastery Gradebook provides. But, again, with students in the course next term perhaps something else will become available.
Ungraded activities 12 December
Set a maximum of zero points and you get an ungraded activity such as a survey. An option that I do not recall being directly available in Canvas is that one can set an activity to zero weight in the gradebook but still have the activity show points. Think of a practice test where you want the students to see a score but do not want the practice test to count. Lane Community College covers this.
Question bank categories 12 December
I am struggling to understand how question bank organization decisions will impact me in a year or two. I have stumbled through some information that may come back to haunt me.
One is a vague reference, hidden under Create Questions from Inside the Question Bank, to ID numbers being necessary for embedding question bank questions in quizzes.
The other thing I have seen is that tags may be important for filtering later. I have 90 questions in a single "Basic statistics" question bank in MS 150 Statistics. Filtering may turn out to be important if these banks grow from term to term. I haven't been setting tags, but I can see a Filter by Moodle Tags option in screen 5 of a slide show embedded under "Add questions from a Question Bank" on the University of New South Wales help guide. The UNSW site has some nice embedded slide shows that demonstrate features more succinctly than most videos.
Numeric questions 12 December
For multiple choice questions the status of a student answer is included in the yellow box below.
For the numeric question type the yellow box can lead the student to believe that their answer is wrong. What the student cannot see is that the accepted error is set to 0.5, so this answer is correct. This is shown at the upper left in the gray box.
But if the question is sufficiently long, the gray box on the left will have scrolled up off screen and is not visible to the student. Some students will see a situation such as the above, where they answered 733 and then see that the correct answer was 732.776. Students should quickly adapt to this, but there could be some confusion and consternation on the first quiz or two.
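A worked check of the example above, assuming the accepted error really is set to 0.5:

|733 − 732.776| = 0.224 ≤ 0.5, so the answer of 733 is marked correct even though it does not match the displayed answer of 732.776.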
Rebalancing
As far as I can determine, course learning outcomes cannot be included in marking rubrics and cannot contribute points to an assignment. This dropped 20 points from my physical science laboratory report marking rubrics and reduced the most frequently used rubric from 45 points to 25 points. As a result the quizzes were pushing up towards 50% of the grade, but my emphasis is on doing science, so I wanted more weight in the laboratory reports. I could have adjusted the laboratory report points, but the scale and rubrics I use make sense to me. I could differentially weight laboratory reports in the gradebook, but past experience with weighting convinced me that students have a harder time comprehending how any one item affects their grade when weights are being applied. I prefer the points to carry the weight naturally. So that meant throttling back the quiz points.
Rather than two points per quiz question, I shifted to one point for multiple choice and numeric answers that do not require a calculation, two points for a numeric answer that requires a calculation, and one point per matching item (they had been two points). This seems to help rebalance my gradebook.
No submission assignments cannot be marked with a rubric
Non-submission assignments (in-class presentations, in-class work done on paper) do not appear to be able to have a description or be marked using a rubric. This is impacting assignments such as my laboratory nine where I cover clouds and have the students attempt to mimic Luke Howard and draw a scientifically accurate depiction of a cloud type. This exercise is usually used to introduce an art rubric to the elementary pre-teacher prep students who often populate my class. I use the laboratory as a way of demonstrating a science lab that could be done from the youngest grades. Teachers teach as they were taught, not as they were told to teach. Which is why I do not believe in instructional methods courses. Back when I was in school, studies showed that teachers fell back on how they were taught despite whatever they learned in a methods course. So I don't tell the pre-teacher prep students to engage in drawing as a part of science class, I have them draw. If I want future teachers to do something in their classroom, then they have to have done it in my classroom. In other words, physical science is a science methods course.
I am still wrestling with how to handle laboratory nine: because the drawing is not submitted, I cannot easily figure out how to get a rubric in. And I am not sure the grader will let one mark when no submission has come in. Some of my "round trip" testing of submissions has been ambiguous as to whether one can open the grader in the absence of a submission. For now the rubric is in the textbook and marking is straight points.
One other side effect of no submission items in Moodle not being an assignment: they only appear in the gradebook and cannot be surfaced on the course homepage so far as I can tell.
Edit 16 December
The work around is apparently to use the Assignment activity type and uncheck all submission options. This resolves the surfacing issue cited further below on 15 December. This blog is now officially non-time linear.
More odd ends 13 December
Over the years in Schoology and Canvas my physical science course modules/homepage built up a number of items, many unpublished, that guided the course content. A number of items were simply there to remind myself what I was covering next. Those prompts, so to speak, are now primarily in a spreadsheet calendar of the term. Many of these were links to supplementary material and videos, those have moved to my college workspace site. This trims significantly the material in the learning management system and makes me less dependent on the particular LMS.
Many of the quizzes later in the term include multiple choice questions based on videos that the class watches. Those questions are not being migrated, those have been deemed of lower priority against the December 31 sunset of Canvas at the institution. All of the quizzes in the latter half of the term will need additional work in 2025.
I have moved most of my links to playlists used late in the physical science term to the Workspace. These links were in the Modules. For now I am leaving them in Workspace. Some may be returned to the Moodle course homepage at a future date, but I wasn't launching most of these from Canvas anyway. Having only a single set of links in the Workspace removes redundancy and saves me from updating information in two places.
Surfacing manually marked items on the course homepage 15 December
You cannot. All one can do is follow the maxim that everything can either be hacked or kludged. The kludge is to use a Text and Media Area on the course homepage to let the students know that a manually marked assignment (a Grade Item in Moodle speak) is occurring but this does not resolve a number of related issues. The framing questions that the Moodle approach raises include:
How to assign due dates to manually graded items?
How to "surface" manually graded items on the course homepage and in the calendar?
How to mark manually graded items with a rubric and outcomes ratings?
Will the Moodle grader present a rubric to mark and outcomes for a student if the student hasn't submitted anything?
Background
In Canvas everything that is graded, including manually graded items (in-class presentations, work submitted on paper), is an assignment.
These manually graded items are given the submission type "No submission" and grades are manually entered for these assignments in the gradebook. Further down the above dialog box one can set due dates for these manually graded assignments. Thus they appear as an assignment in the Modules (course homepage) and they populate the student's calendar on the appropriate day.
In Gradebook Setup > New Grade Item there is no place to enter a due date. Nor, as far as I can tell, can these be added to the course homepage.
The only date setting relates to being able to hide the item until manual grading is complete.
My ethnobotany course consists primarily of in-class presentations and in-class activities, along with submissions made to other sites, such as student submissions on iNaturalist.
The course homepage will be rather sparsely populated as a result and the student calendar cannot display the manually graded assignments because they have no dates associated with them. What am I missing?
I have wandered through support pages and videos. For now my workaround is to use a Text and Media resource to add the manually graded item to the course homepage. The complication is that I mark many of these assignments using rubrics, and rubrics are not supported by manually graded items. My only other workaround would be to post an assignment with a "blank sheet" submission of some sort and thus gain access to rubrics and outcomes in my marking, but that is likely to confuse a student who does a presentation, forgets to submit the null assignment, and then inadvertently receives no grade for the presentation.
These manually graded items were, in Canvas, also linked to course learning outcomes for reporting at the end of the term. Thus I will also have difficulty with end of term outcomes evaluation for data entry into Nuventive, but that appears to be a broader issue due to the apparent absence of a Learning Mastery Gradebook equivalent. I know that issue will not fully arise until next May.
For courses such as ethnobotany the inability to link manually graded items to outcomes will make term end outcomes assessment more challenging.
Image reuse in quizzes 16 December
I found that using the Moodle File Picker I can locate and reuse images that have been uploaded to the Book or Page resource in Moodle but I haven't found a way for the Moodle File Picker to display images uploaded to activities such as Assignments and Quizzes.
An image is being used in a couple of quiz questions. Rather than upload two copies, I would prefer to pull the image up using the File picker. The obvious workaround is just to reuse the image's HTML code.
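A hypothetical sketch of what reusing the code looks like (the domain and the pluginfile path are invented placeholders and will differ on a real site): copy the img tag from the HTML source of the first question and paste it into the second question, so both questions point at the one uploaded file.

<img src="https://moodle.example.edu/pluginfile.php/.../question-diagram.png" alt="question diagram" width="480" height="360">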
But at some future date I might want to be able to search for and retrieve the file using the File picker. Reusing a file can allow more efficient caching of the file by browsers and edge storage devices on the network.
I have poked around in the File picker but have not found where these uploaded Assignment and Quiz image files go. Searching for them with the File Picker search box also brings up nothing.
The workaround for quizzes is to go to the question bank containing the desired image and duplicate the question with the image. Then edit the duplicate question.
Using the question bank for duplication of multiple choice questions that have the same answer options is a fast way to generate questions and keep them organized in a question bank category.
Assignments cannot have zero points 17 December
Nor more than 100 points. Thus an assignment designed only to measure a student learning outcome but carry no points in the course appears to be impossible because an assignment with a submission cannot carry zero points. An edge case, but somewhere out there someone is doing this. Three years as a Canvas admin taught me that all combinations occur in an LMS of sufficient size.
Student submissions with student comments
Students can add comments to a submission at the end of the submission process. This arises in ethnobotany where I will have students submit an image and then tell me what the image is of in the submission comment. This capability is also available in Canvas.
After submission, the option to add a comment appears for the student.
Do not forget to save the comment.
Copying quiz questions between courses 19 December
There is a guide from the University of Louisiana at Lafayette that provides step-by-step instructions on how to move Moodle quiz questions from question banks in one Moodle course to question banks in another Moodle course (Moodle to Moodle, not Canvas to Moodle). This makes the questions available in the second course. The guide did not mention that, prior to importing, you should be in the desired question bank into which you want the import to occur.
Question banks in Moodle can be in a hierarchy 20 December
For the second time in two days I am reorganizing my question banks as I discover that Moodle question banks are not flat as they were in Canvas. In Canvas, question banks lived in a flat space at the user level. The upside was that I could pull questions from any of my courses into a quiz or test in another course, including pulling the evaluation of student learning outcomes across my curriculum. In Moodle I can only access questions in the course question bank. I can export banks and import them, as noted above, but outside of that process I cannot pull questions from a question bank in another course.
Moodle, however, allows question bank categories to live as subcategories of other banks. This resolves an issue I had where tests pull quiz questions from the unit level by bank. I believe I will be able to subsume my subsection level banks in an overarching unit level bank and thereby potentially pull random questions from that overarching unit level bank. I only just cottoned onto this and I am once again rethinking how to better organize my banks in botany. I can foresee already that a year from now I will be tearing everything apart and rebuilding while thinking, "Why did I ever think that what I did back in December 2024 would work?"
The key operational capability is that one can display all subcategories, and right now my four units in botany have no top-level category at the unit level. I have all 30 categories under the course top level. From what I am seeing, I can drag and drop the 30 under the four unit categories.
Note that to create the first child of a category the vertical ellipsis menu is used to move one category into a child position of the intended parent category. Once the first child is in place, other categories can be dragged and dropped under that first child. I was unable to drag and drop that first category into a child position without use of the menu.
Eventually the subcategories should be renamed to be more descriptive, but everything is being done rather backwards in this four-week push to migrate six courses and three years' worth of material.
The cross-reference for the subcategories is the set of section numbers in the calendar above. Renaming will eventually occur to better describe the subcategories. Again, I was thinking in terms of Canvas and a flat question bank space where banks are listed alphabetically, hence the common unit prefix and two-digit numbers. Moodle question banks can be arranged manually, which means an alphabetic naming design is not necessary.
Randomly pulled questions from question banks and points
If questions are pulled randomly from a question bank, then the questions are assigned a common point value from the questions screen. Hindsight will be 20/20 for the next few months, but in a four-week crash and dash migration to a new platform there just isn't any way to know how things should be organized.
In retrospect perhaps the question banks should have been organized such that every question in a category has the same point value. But I wanted topically organized categories because test one was always meant to pull questions from quizzes one to four plus subsections 09 and 10. With multiple choice worth a single point and matching worth one point per match, what happens when everything pulls into test one as a single point but some of the questions are matching questions worth multiple points?
The answer is that partial, fractional credit is awarded: 0.83 points out of 1.00 for 5 out of 6 matches correct.
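The rescaling arithmetic behind that number, using the matching question above:

Raw matching score:    5 correct out of 6 matches = 5/6 ≈ 0.833
Rescaled to the test:  0.833 × 1.00 point assigned in the test ≈ 0.83 points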
I suppose I could have split up the question bank categories by question type, but the course already has 41 categories. Of course, I am migrating from Canvas, where classic quiz question bank management is different: 41 banks in a single course seems like it would be difficult to manage. So I think of 41 question bank categories as a lot. I am wrong. According to one study of Moodle sites, the average number of question bank categories is 50. What I perceive to be a lot by Canvas standards is below average for a Moodle.
Migration workflow change to build all question banks first 22 December
My workflow had been to work down my Canvas Modules line-by-line manually rebuilding item-by-item in Moodle. At first I added questions via the Question tab in the Quiz resource. Then I shifted to adding questions directly to the question banks and then loading the questions bank-by-bank into quizzes and tests. I was still moving line by line in the module.
Instructure Canvas on the left, Moodle on the right
I have shifted to working in the question banks at the course level, rebuilding each and every question bank. There is a process available for exporting Classic Quizzes from Canvas and importing the file into Moodle question banks. A colleague who is taking this approach has wound up "touching" every question anyway to fix issues that arose in the migration process. By rebuilding question-by-question I am finding errors, revising unclear questions, and adding new material in places where appropriate. This is essentially a tear-down and overhaul of every facet of the course. I am confident this approach will yield benefits downstream in the term to come, even though this has taken 162 hours over the past 25 days and is still an ongoing effort.
Default quiz question points 23 December
When one is composing a quiz question one can assign a default mark for the question, the default number of points for the question.
When the question is saved for the first time, the question carries that default number of points. In Moodle that default value can be altered in each place the question is used. The above matching was originally saved with the default unintentionally set to three points.
Here the points that the question will receive in this quiz can be altered while leaving the default alone. At first this is puzzling for a Canvas user, where the point value is a single value that travels with the question. In Moodle, the points assigned can be changed with each quiz or test, and this provides a rather clever way to "flatten" the points awarded for randomly drawn questions. I had to shy away from randomly drawn questions in Canvas because my questions carried variable numbers of points, which would change the number of points in each and every student's quiz when the quiz pulled randomly drawn questions. Since the random draw points are assigned separately, the random draw points can be any fixed value. Moodle then rescales the question to the new points possible, as seen earlier in this blog.
This does mean, however, that when one accidentally saves a question with the wrong default point value, there are two places that have to be corrected. One is the default value in the question, the other is the errant value displayed in the Questions interface.
Ordering question type Canvas versus Moodle 24 December
Canvas includes the ability to have a top label and a bottom label, useful in situations in which the terms might be displayed as a scale.
The words are then dragged and dropped into order using the stippled areas in Canvas.
As far as I could tell, there are no top and bottom labels in Moodle. One would have to specify the order in the prompt. The questions can be dragged or the arrows can be clicked on to move an item.
Above is the result with the marking set to "absolute position" and only the third one correct. Points were set to four on the logic that there are four correct answers; this also yields whole-number point values for partially correct solutions.
Multiple answer questions are not a separate question type in Moodle
Multiple answer questions are not a separate question type in Moodle as they are in Canvas. Multiple answer questions are multiple choice questions with multiple answers allowed. The correct answers should have a percentage that adds to 100%. The incorrect answers should have a percentage that adds to -100%.
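A hedged worked example of the percentage bookkeeping, assuming a question with two correct options and three distractors (and assuming, as I understand it, that Moodle does not let the question score drop below zero):

Two correct options:              +50% each, summing to +100%
Three incorrect options:          −33.33% each, summing to approximately −100%
Both correct selected:            50% + 50% = 100% of the question's points
Both correct plus one incorrect:  50% + 50% − 33.33% ≈ 66.67%
Everything selected:              100% − 100% = 0%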
Canvas New Quiz Categorization questions in Moodle 25 December
Canvas New Quizzes brought the categorization question type. While this specific structure is not directly available in native Moodle (H5P may offer an equivalent, but this author has not yet tackled H5P), a native capability can be co-opted into providing this question type. The following is not necessarily the canonical or optimal way to do these sorts of questions, only a first crack at this nut.
This was used as the background image for the question.
Markers were then set up.
Rectangle is an available shape in Moodle 4.5 which works optimally in this case. Note that the coordinates can be edited, copied, and pasted, which will assist in setting up markers.
In the process of setting up the rectangles. The rectangle drags from the red circle and resizes from the corner.
Detail view of the rectangle controls. Get one right by dragging, copy the coordinates.
What the student sees before starting the question. Questions drag and drop onto the image. If the marker is set to multiple, not just one, then the marker remains visible below until the student reaches the marker count. For this use the marker count was set to one.
After marking and submitting the quiz, with review settings still at default, the student sees the above. Note that as far as I can tell the student only knows that they obtained four out of six correct. The student does not appear to know which two were incorrect. There is a general feedback option in the question setup that could be used to report the correct answers to all students after submission.
Above is a four-category question with rectangular drop zones. The Google Drawing was an easy modification of the earlier form. Once one has built one, use Make a Copy and edit the copy to make another. A few more steps than in Canvas, but not too bad.
The coordinates of the drop zones can be seen above. Dragging the first drop zone in a set is more functional than manually attempting to set the drop zone. Manually set drop zones may exceed the image boundaries and that throws an error. Once a boundary is working, the coordinates can be copied and pasted. Drop zone shape options include polygons and circles.
Scroll distances 26 December
One of the design issues to be considered is what in the Moodle world is referred to as the scroll of death. This is primarily a design issue and also somewhat unavoidable.
In Moodle 4.5 the Google-esque material design of the theme in use at the college, perhaps the default theme for 4.5, occupies more vertical space than the equivalent module in Canvas. Both platforms use an accordion design to provide a way to tame the scroll of death for students late in the term. Subsections, a newer feature in 4.5, may be another way to cope with the long journey to the bottom of the course homepage, but this feature has not yet been enabled at the institution.
Requiring students to click the submit button 28 December
The first item under Submission settings is "Require students to click the submit button" which is set to "No" by default.
With the submit button set to No, students move through the usual Moodle submission process.
In the above screen they can drag and drop a file or click on the ⊕ icon upper left to access the Moodle File picker.
Once their file is chosen and uploaded, the student clicks on Save changes to submit the assignment.
The student then sees the above screen where they can choose to edit or remove their submission prior to the submission being marked. This is helpful for a student who realizes that they uploaded the wrong document and wants to correct the mis-submission.
With Require students to click the submit button set to Yes, the process begins the same way.
When they reach the above dialog box and click on Save changes...
...they reach a screen where they again have the option to Edit submission or Remove submission. Note, however, that the submission is not yet submitted for grading at this point; the document is in a draft status. Why would a student do this? This allows the student to "save" their work in Moodle on one computer, move to another computer, and use Edit to continue working on their submission prior to final submission. This is going to be useful for documents that are produced by software on a local machine and stored on a hard drive, rather than in the cloud. It allows the student to use Moodle for temporary file storage.
Once the student clicks on Submit assignment, they see...
As far as I can tell the document cannot now be edited or withdrawn.
I have left this setting at the default "No" with the logic that I encourage my students to use Google Workspace to produce their documents. These are cloud based documents and the students can get back to their documents via Google Drive. I want my students to be able to resubmit post-submission - many times I have seen students resubmit a few minutes after submitting when they realized they made an error or omitted an answer. I can see the "Yes" option as leading to premature submissions, but then I have not used Moodle in production with students.
Extra credit assignments
Extra credit assignments are designated from the Grades: Grade Setup screen.
One sets up the assignment first and then sets the assignment as extra credit in the gradebook. The horizontal ellipsis menu accesses this capability. This sets a whole assignment or quiz to be extra credit.
I have a few extra credit, manually marked items in ethnobotany - such as when marching off into the muddy swamp to see Spathoglottis micronesiaca. Marching into the swamp is optional. Those marks are entered post-hoc into the gradebook.
The process to set up a "grade book column" in Moodle is more akin to Schoology than to Canvas. Manually marked (no submission or on paper in Canvas) items are added from the Gradebook setup screen. Look to the right for Add ∨ to Add a grade item.
Set the Item name, maximum grade, and the Grade category. Then click on Show more....
Then scroll down. Down. Further. Always remember the Moodle song, "Scroll, scroll, scroll your mouse, merrily down the page..." Under Parent category should appear Extra credit. Caveat: whether extra credit can be checked may depend on the aggregation method. I am using Natural with no weights - just straight points. I also set the Hidden until date for the day on which this occurs (not shown above).
That plus to the right in the Gradebook setup means the item is extra credit.
The longer answer is that older documents and videos suggest there was a way to award bonus points on quizzes, but not in the current version. The even longer answer is that there are documents and videos on workarounds for this. The few I have read and watched did not make sense to me. Be careful of documentation or videos that are more than two years old: those are earlier versions of Moodle.
Manual regrading might be one such workaround, but it appears that it will be labor intensive. See also Moodle Quiz Settings although, frankly, the explanation lost me along the way.
Student submissions from Google Workspace to Moodle
If your students are using Google Workspace to submit documents they will be puzzled by the disappearance of the Google LTI 1.3 submission method. Under 1.3 a student could click on "Recent" to see their Google Drive files for submission. That option is not available at this time in the college's configuration. The workaround is that the student will need to use search in the Moodle File Picker to search for their file by name in their college Google Workspace account. The following is a video I did on this process, pardon the background sounds! Note that I am using a copy-on-link approach for my statistics assignments. The video content is rather specific to statistics, my apologies.
A video on submitting a physical science laboratory report (Google Docs) from Google Drive is also available.