Monday, July 25, 2016

Insights Assessment into Learning in Statistics

A Faculty Focus article titled A New Twist on End-of-Semester Evaluations originally provided the stimulus for the following assessment. The article suggests shifting from summative scalar evaluations such as those currently used at the college (Was the instructor on time most of the time, some of the time, rarely, never...) to exploring the course experience of the students. The evaluation produces a qualitative result rather than a quantitative result. Qualitative data does not reduce to an average or median score; the answers to instructional quality questions are embedded in the details of the individual answers. This takes more reading time than "Instructor Lee Ling has an average of 4.5 on the year end evaluation." Yet in the details is a richer and more nuanced view of the course, the material, the instructor, and the students.

The seven prompts are listed below; the students' answers were transcribed as written. The course being evaluated was MS 150 Statistics. Many of the prompts included a "... because..." structure, which is reflected in the format below.

The directions to the students were: "Your insights into your learning in this course can help me see our course from your side of the desk. Please respond to any three of the statements below (more if you’d like). Do not put your name on this sheet. I will use these insights as I plan for my courses next semester."

1. My learning of the content was most helped when… … because… 
…listening  to lecture. Because that is the moment that the instructor explains what to do in the homework.
…we did group work, because we get to share our knowledge together.
…I watch illustrative or graphs of data. It helps everyone a lot more to do better because they support the data to look for.
…reading the textbook.
…I focus in the classroom, because my mind is fresh and no disturbance.

2. The activity that contributed the most to my learning was… … because… 
…the homework given, because the daily homework helped me remember the different functions there were taught.
…the open data exploration. This allowed us to deal with "real life" situations that can be statistically recorded by using basic statistics.
…all the activities especially the beads on the floor activity, and the paper airplane activity. This class is helpful because there is problem solving, everything has numbers in them, and I know a bit more about numbers from now on.
…presentations because it teaches me how to work on my own and understanding the subject.
…the homework and especially the PowerPoint because both of them were challenging but in the end, I learned it. Like you said, there's no harm in trying.
…presentation because it helps me have better understanding on the statistical front
…the practices/assignments given online because doing the assignments assigned to us on my own made me go through the statistic reading book trying to learn all the formula model.

3. The biggest obstacle for me in my learning the material was… … because…
…the project because it help me understand what to do.
…even when I was stuck on a problem in homework, I could not find the function for the problem on the main website so I have to search elsewhere on the Internet. My other obstacle was trying to keep up in class even when I am feeling behind.
…I'm alone because it allow me to focus more into the work
…trying to figure out which calculation is right for the data, what was I trying to look for, to analyze.
…having a hard problem on understanding all the problems and how to apply them in the real world.
…the presenting PowerPoint presentation part. That's because it is scary to present something you don't really know if it's correct or not since the teacher only corrects your mistakes after your presentation. Plus I don't like talking in front of an audience.
…being reliable.

4. I was most willing to take risks with learning new material when… … because… 
…working on project analysis.
…it is challenging me because I want to challenge myself where my limit is.
…it is a mandatory to learn because it will help me one day in my studies or career.
…learning new things everyday because it helps me have a better understanding of the subject especially its useful in the future.
…there's no family problem relations.

5. During the first day, I remember thinking… … because… 
…of percents and ratios because that was all I know about statistics at the time. As we gone through the term, I realized that statistics are just like scientist but dealing with numbers.
…that the course might be very difficult and boring class because we usually deals with mathematical and equations. But, later on its not what I expected. I found that this class is best course that I've ever had.
…how hard this class is because I know nothing or couldn't remember anything at all. But after the first week things are starting to go in a smooth flow.
…that I will never pass this class because the name/subject of the class (statistics) sounds tough. But it wasn't that tough, because the teacher/classmates were helpful.
…man, I have no idea what are all those because everything was so new and statistically it looks hard

6. What I think I will remember five years from now is… … because… 
…how to make good PowerPoints and the basic statistic reports, because it was taught well and because of the daily assignments.
…is a relationship between two variables. I say I will remember this because just by using basic statistics and a scattered graph, it will clearly show you what you are looking for.
…is basic statistics because I surely believe that I will really need someday if I could able to find a job.
…how to calculate the mean, min, max, sample size, because I believe that I have remember them by heart all thanks to endless homework, for it helped me a lot to learn by heart.
…the solving of all the statistical data and putting them in good use in the real world!
…is identifying relationships of graphs/data because it is the most interesting part of taking statistics, the reading the graphs/charts to find relationship.

7. What is something covered in this course material that you can do now that you could not do or did not fully understand at the beginning of the term? … because… 
I did not know how to use Excel, like the functions and the graphs, but now I can use Excel because I have enough practice with the material.
Now I can do analysis on data and work on statistical report.
One thing that really took me a hard time to understand is the histogram. It is because its hard for me to memorize the steps.
All the formulas and how to get each answer including the graphing. 

Wednesday, July 20, 2016

Assessing learning in physical science

SC 130 Physical Science proposes to serve two institutional learning outcomes (ILO) through four general education program learning outcomes (GE PLO) addressed by four course level student learning outcomes (CLO). Not listed are the proposed specific student learning outcomes that in turn serve the course level learning outcomes. This report assesses learning under the proposed course level learning outcomes, which in turn support the program and institutional learning outcomes.

ILO 8. Quantitative Reasoning: ability to reason and solve quantitative problems from a wide array of authentic contexts and everyday life situations; comprehends and can create sophisticated arguments supported by quantitative evidence and can clearly communicate those arguments in a variety of formats.

3.5 Perform experiments that use scientific methods as part of the inquiry process. 1. Explore physical science systems through experimentally based laboratories using scientific methodologies
3.4 Define and explain scientific concepts, principles, and theories of a field of science. 2. Define and explain concepts, theories, and laws in physical science.
3.2 Present and interpret numeric information in graphic forms. 3. Generate mathematical models for physical science systems and use appropriate mathematical techniques and concepts to obtain quantitative solutions to problems in physical science.

ILO 2. Effective written communication: development and expression of ideas in writing through work in many genres and styles, utilizing different writing technologies, and mixing texts, data, and images through iterative experiences across the curriculum.

1.1 Write a clear, well-organized paper using documentation and quantitative tools when appropriate. 4. Demonstrate basic communication skills by working in groups on laboratory experiments and by writing up the result of experiments, including thoughtful discussion and interpretation of data, in a formal format using spreadsheet and word processing software.


Explore physical science systems through experimentally based laboratories using scientific methodologies

Laboratory fourteen in the penultimate week of the term provided a vehicle for assessing this course level outcome. The students were given a system that was unfamiliar to them and asked to determine the underlying mathematical models for the objects in the system. The specific system was the launch velocity versus the flight distance for a variety of flying objects.

Data was gathered and then reported to the board. Students were to produce tables and xy scatter graphs for each object type.

The laboratory reports were assessed to determine whether students properly recorded data in labelled tables, generated xy scatter graphs, made a decision on linearity and if deemed to be a linear relationship added linear trend lines to the graph, reported the slopes in their analysis, and discussed the results. Twelve of the thirteen students submitted this laboratory.

Analysis of laboratory fourteen

For the twelve laboratory reports submitted, students recorded their data in properly labelled tables, generated labelled xy scatter graphs, opted for linear models, and added linear trend lines and trend line equations. Nine students went on to discuss their results in a reasonably meaningful manner.
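The slope a spreadsheet reports for a linear trend line of this kind comes from an ordinary least squares fit. A minimal sketch of that calculation, using hypothetical launch velocity and flight distance values rather than the students' actual measurements:

```python
# Least squares slope and intercept: the calculation behind a
# spreadsheet linear trend line. The data values are hypothetical.
def linear_fit(x, y):
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # slope = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
    slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
        / sum((xi - x_bar) ** 2 for xi in x)
    intercept = y_bar - slope * x_bar
    return slope, intercept

# Hypothetical launch velocity (m/s) versus flight distance (m)
velocity = [2, 4, 6, 8]
distance = [5, 9, 13, 17]
slope, intercept = linear_fit(velocity, distance)
```

For these hypothetical values the fit recovers slope 2 and intercept 1 exactly, since the points lie on a line.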


2. Define and explain concepts, theories, and laws in physical science.

The 58 question final examination asked students to define and explain concepts, theories, and laws in physical science. Some questions consisted of two to seven subparts; counting subparts, there were 108 questions to be answered. The item analysis was done at the question level, and a question was considered correct if a strong majority of its subparts were substantively correct. The final had students make calculations, use formulas, and interpret graphical data. Average success rates based on the item analysis of the 58 item final examination were aggregated by topic and by skill, and were generally low.
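An item analysis of this kind reduces to computing, for each question, the fraction of students who answered correctly, then averaging those per-item rates. A sketch using a small hypothetical response matrix (1 = correct, 0 = incorrect), not the actual examination data:

```python
# Hypothetical response matrix: rows are students, columns are exam items.
responses = [
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [1, 1, 1, 0],
]

n_students = len(responses)
n_items = len(responses[0])

# Per-item success rate: fraction of students answering each item correctly.
item_rates = [sum(row[i] for row in responses) / n_students
              for i in range(n_items)]

# Overall average success rate across all items.
overall = sum(item_rates) / n_items
```

Aggregating by topic or skill is then just averaging the per-item rates for the items in each group.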

Aggregate average success rates by topic and skill

The overall average for 13 students on these 58 items based on the item analysis was 64%, a considerable improvement from the spring 2016 success rate of 51% and a return to the 66% success rate seen on the fall 2015 final examination.

Multi-term final examination averages based on item analysis aggregate average


3. Generate mathematical models for physical science systems and use appropriate mathematical techniques and concepts to obtain quantitative solutions to problems in physical science.

While the laboratory fourteen assessment above provides some data on this course learning outcome, this outcome supports the general education program learning outcome "3.2 Present and interpret numeric information in graphic forms." With this focus in mind, a pre-assessment and a post-assessment were included in the course. The post-assessment was embedded in the final examination.

SC 130 Physical Science includes a focus on the mathematical models that underlie physical science systems. Laboratories one, two, three, five, seven, nine, eleven, and twelve have linear relationships. A number of assignments in the course also have linear relationships. The students also encounter a quadratic relationship in laboratory three. A plot of height versus velocity generates a power relationship, specifically a square root relationship. By the end of the course students have repeatedly worked with linear relations. One relationship at a time, not "problems one to thirty, even problems only." Every equation is built from data that the students have gathered. From the concrete to the abstract, repeated throughout the term, providing cognitive hooks on which to "hang" their mathematical learning.

The first thirteen questions on the final examination were identical to the questions on the pre-assessment. The following bar chart depicts the percentage of students answering correctly on the pre-assessment on the right end of the turquoise bar, the percentage of students answering correctly on the post-assessment on the left end of the turquoise bar. Thus the chart shows the improvement from the pre-assessment to the post-assessment.

Of note is that on the very first question, a calculation of slope for a line that has a non-zero y-intercept, performance on the final examination was worse than on the pre-assessment. On the pre-assessment eight students answered correctly, on the final examination only five students answered correctly. This represented a drop of 28% which is not displayed on the chart above. On all other questions performance improved.
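The likely misconception behind that drop: when a line has a non-zero y-intercept, the slope must come from differences between two points, not from a single point's y divided by its x. A small illustration with a hypothetical line, not the actual exam question:

```python
# A hypothetical line with slope 2 and non-zero y-intercept 3: y = 2x + 3.
# Two points on the line:
x1, y1 = 2, 7
x2, y2 = 4, 11

# Correct: slope is rise over run between two points.
slope = (y2 - y1) / (x2 - x1)

# Common error: dividing one point's y by its x only gives the slope
# when the line passes through the origin.
naive = y2 / x2
```

Here the correct slope is 2.0 while the naive ratio is 2.75; the two only coincide when the intercept is zero.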

The average score for the students increased from 4.85 out of 13 to 9.77 out of 13. This increase was significant with a large effect size (1.40).
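The effect size reported here is Cohen's d: the gain in the mean divided by a pooled standard deviation. A sketch of that calculation with hypothetical pre- and post-assessment scores, not the actual class data:

```python
import math
from statistics import mean, stdev

def cohens_d(pre, post):
    """Cohen's d for two samples using a pooled standard deviation."""
    n1, n2 = len(pre), len(post)
    pooled_sd = math.sqrt(((n1 - 1) * stdev(pre) ** 2 +
                           (n2 - 1) * stdev(post) ** 2) / (n1 + n2 - 2))
    return (mean(post) - mean(pre)) / pooled_sd

# Hypothetical pre- and post-assessment scores out of 13
pre = [4, 5, 6, 4, 5]
post = [9, 10, 11, 9, 10]
d = cohens_d(pre, post)
```

By the usual rule of thumb, d above 0.8 is considered a large effect, which is the sense in which the 1.40 reported above is "large."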

The post-assessment does not answer whether there will be long term student retention of the ability to present and interpret numeric information in graphic forms. The course has had, at least in the short run, a positive impact on the students' ability to work with numeric information in graphical forms.


4. Demonstrate basic communication skills by working in groups on laboratory experiments and by writing up the result of experiments, including thoughtful discussion and interpretation of data, in a formal format using spreadsheet and word processing software.

Course level learning outcome four focuses on communication, specifically writing. In the late 1990s assessment data suggested some students were graduating with limited writing communication skills. As noted by the languages and literature division at that time, two college level writing courses in the general education core cannot by themselves produce collegiate level writers. Writing must occur across the curriculum, across disciplines. In 2007 SC 130 Physical Science at the national campus was redesigned to put an emphasis on writing. A fill-in-the-blank, cookbook-style laboratory manual was replaced by laboratories that led to laboratory reports constructed using spreadsheet and word processing software.

By the end of the term students could produce a laboratory report with tables and charts integrated from a spreadsheet package. The students could produce reports that included the use of quantitative tools.

As reported above, student ability to include thoughtful discussion and interpretation of data supported by their quantitative evidence was demonstrated by nine students as measured by laboratory fourteen.

Inherent in supporting institutional learning outcome two, which course learning outcome four serves, is proper mechanics. Physical Science laboratory report marking rubrics at the national campus include evaluation of four broad metrics: syntax (grammar), vocabulary and spelling, organization, and cohesion and coherence. Each of these four metrics is measured on a five point scale, yielding a total possible of twenty points. In general, students enter the course with writing skills. Errors of tense and agreement tend to mirror areas in students' first language that do not have similar tense or agreement structures. All students in the class are working in English as a second language.

The brief six week duration of the summer session meant that up to three laboratory reports were due per week. The students in the course this summer generally started with fairly strong writing abilities and those abilities remained with them throughout the summer session. Thus there was little room for improvement against the existing rubrics. Coupled with the small sample size, significant improvement in writing mechanics could not be shown this summer.

Learning has occurred for all course level outcomes, the program learning outcomes served by those course level outcomes, and the institutional learning outcomes served in turn by the program learning outcomes. 

Monday, July 18, 2016

Assessing learning in introductory statistics

MS 150 Introduction to Statistics has utilized an outline based in part on the 2007 Guidelines for Assessment and Instruction in Statistics Education (GAISE), the spring 2016 draft GAISE update, and the ongoing effort at the college to incorporate authentic assessment in courses. The three course level student learning outcomes currently guiding MS 150 Introduction to Statistics are:
  1. Perform basic statistical calculations for a single variable up to and including graphical analysis, confidence intervals, hypothesis testing against an expected value, and testing two samples for a difference of means.
  2. Perform basic statistical calculations for paired correlated variables.
  3. Engage in data exploration and analysis using appropriate statistical techniques including numeric calculations, graphical approaches, and tests.
The first two outcomes involve basic calculation capabilities of the students and are assessed via an item analysis of the final examination.

Average success rate based on an analysis of the three sections of the final examination

In the above chart the centers of the yellow topmost circles are located at the average success rate for the students on final examination questions under the first course learning outcome - basic single variable statistics. The chart reports results from 2012 to present. The radii are the standard deviations. The middle blue circles track performance under the second course level learning outcome, paired dependent data. The orange bottom-most circles track performance on the open data exploration and analysis.

Nineteen of twenty students enrolled in MS 150 Statistics summer 2016 sat the final examination.

The first course learning outcome focuses on basic statistics. Twenty-one questions on the final examination required the students to perform basic single variable statistical calculations on a small sample. Based on the item analysis the average success rate on this material was 81.7%, not significantly different from the 78.0% success rate of the 39 students who completed the final examination spring 2016. Of note is that the final examination for summer 2016 was identical to the final examination for spring 2016. Over the past four years average success rate on this material has been 80.6% and can be expected to vary by as much as 5%.

Success rates on individual final exam items for 19 students

Performance on the second course learning outcome, linear regression statistics, was measured by six questions on the final examination. Student performance on this section was 65.8%; the four year average is 69.2%. This success rate varies by as much as 6%, so the difference of -3.4% is not significant. Student success on linear regressions has remained lower than success rates on basic statistical calculations. The stability of these values suggests that increased success rates would be challenging to achieve.

Performance on the third course learning outcome, open data exploration and analysis, is not comparable term-on-term because the scoring system for this section of the final examination varies from term to term. Performance is always weaker on the open data exploration and analysis section than on the first two learning outcomes: students perform strongly when asked to calculate a specific statistic, but struggle when given raw data and open ended questions about that data. This term the section was set up as a single essay question in Schoology, which was then marked by the instructor.

The 48.7% student success rate seen on the third learning outcome this term represents the average score. The open data exploration this term involved two samples, where the optimal solution would have been to calculate the means and then test for a significant difference in the means. Of the 19 students who took the final examination, only one made a fully correct analysis of the data, calculating the means and running a test for a significant difference in those means.

In the data provided this term, the sample means were different but not significantly different at a five percent risk of a type I error. Anecdotally, the students have more difficulty with failing to reject a difference in the means than with rejecting a difference in the means. The students see any difference as being real; the idea that variation can effectively eliminate the reality of the difference is difficult for them to grasp. Bear in mind that in this portion of the exam the students are presented with raw data and an open question; they are not told how to analyze the data to answer the question.
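The analysis the exam called for, an independent samples t-test that fails to reject the null hypothesis at a five percent risk of a type I error, can be sketched as follows. The data here are hypothetical; the critical value 2.101 is the standard two-tailed Student's t value for 18 degrees of freedom at alpha = 0.05.

```python
import math
from statistics import mean, stdev

def t_statistic(a, b):
    """Student's t for two independent samples with pooled variance."""
    n1, n2 = len(a), len(b)
    pooled_var = ((n1 - 1) * stdev(a) ** 2 +
                  (n2 - 1) * stdev(b) ** 2) / (n1 + n2 - 2)
    return (mean(b) - mean(a)) / math.sqrt(pooled_var * (1 / n1 + 1 / n2))

# Hypothetical samples: the means differ by 1.0, but not significantly.
a = [10, 12, 11, 13, 9, 11, 12, 10, 11, 12]
b = [11, 13, 12, 14, 10, 12, 13, 11, 12, 13]

t = t_statistic(a, b)
t_crit = 2.101  # two-tailed critical value, df = 18, alpha = 0.05
significant = abs(t) > t_crit
```

For these samples t comes out near 1.87, below the critical value, so the correct conclusion is to fail to reject the null hypothesis of no difference in the means despite the visibly different sample means.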

Breakdown of solution quality for open data exploration on the final examination

Only one student realized that the solution was an independent samples t-test for a difference of the means, ran the test, and then correctly failed to reject the null hypothesis of no significant difference in the means. Another five students reported the means, noted that the means differed, ran a t-test for a difference in the means, and then incorrectly rejected the null hypothesis. Seven students calculated the means and noted that the means were different without running any statistical test for a difference of the means.

Five students did not attempt to utilize a measure of the middle to compare the samples. Some tried to cite differing maximum values as evidence of a difference.

Overall success rate on the final examinations has been exceptionally stable over the past three years, and generally stable for the past decade. The long term average success rate is 73.8%, the current term saw a 77.0% success rate on basic and linear regression statistics. 

Final examination average since 2005

In an educational world where a common goal is "continuously improving" best practices, the inert stability of the success rate above might be seen as a failure to continuously improve. The effort to continuously improve mathematics education overall goes back not just to the new math of the 1960s but much, much further. Ultimately there are long term average success rates, and statistics assures us that numbers tend to return to long term averages. A look at the running cumulative mean success rate on the final examination since 2005 suggests that the longer term mean to which terms return might be improving, but even this statistic is subject to a tendency to return to an even longer term mean.
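The running cumulative mean is simply the mean of all term success rates observed up to and including each term. A sketch with hypothetical per-term rates:

```python
from itertools import accumulate

# Hypothetical per-term final examination success rates (%)
rates = [70, 72, 74, 76]

# Running cumulative mean: mean of all rates seen so far at each term.
running_mean = [total / (i + 1)
                for i, total in enumerate(accumulate(rates))]
# [70.0, 71.0, 72.0, 73.0]
```

Because each new term's rate is averaged into an ever larger pool, the running mean moves more slowly than the term-by-term rates, which is why a slow drift in it suggests a change in the underlying long term mean.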

Note that the y-axis does not start at zero: exaggerated vertical scale

In general students who complete the course are able to successfully make basic statistical calculations on 74% of the questions posed.

The course average over time includes performance on homework, quizzes, and tests. Course level performance underlies course completion rates. Data on course level performance is available from 2007 forward.

Course average over the past eight years

The course wide average has a long term value of 77.7%; the current term average is 79.2%. The radius of each circle is proportional to the standard deviation of the student averages in all three sections of the course. The standard deviation is fairly constant over time at about 15%.

This term the open data exploration exercises were each capped off with a presentation rather than a quiz. Performance on the open data explorations was marked using rubrics. Each rubric consisted of five to seven criteria and generated twenty to twenty-eight points. Three to five criteria were content oriented, one focused on the presentation software, and the final criterion rated the presenter.

Each criterion is scored 4 Excellent, 3 Good, 2 Satisfactory, or 1 Needs improvement.

Basic statistics: Appropriate basic statistics calculated correctly and reported meaningfully.
  4 Excellent: All appropriate statistics reported in a meaningful manner.
  3 Good: Appropriate basic statistics reported and cited in the report.
  2 Satisfactory: Some basic statistics reported.
  1 Needs improvement: A few basic statistics cited.

Nitrogen storage: Do nitrogen fixing trees store significantly more carbon in the soil than non-nitrogen fixing trees?
  4 Excellent: Answer is correct and supported by a fully appropriate statistical analysis.
  3 Good: Answer is correct but supported by statistics which do not provide evidence that the answer is correct.
  2 Satisfactory: Answer is correct but unsupported by numeric values.
  1 Needs improvement: Result is incorrect.

Strength of the difference: How strong is the effect size for this study?
  4 Excellent: Answer is correct and supported by a fully appropriate statistical analysis.
  3 Good: Answer is correct but supported by statistics which do not provide evidence that the answer is correct.
  2 Satisfactory: Answer is correct but unsupported by numeric values.
  1 Needs improvement: Result is incorrect.

Presentation software: Original work submitted as presentation software; the presentation is appropriate to the material and subject matter and generally follows guidelines for a good presentation.
  4 Excellent: Presentation heeds general presentation guidelines, avoids distracting visual extras, and is appropriate to the subject matter.
  3 Good: Presentation with only a few areas in which the presentation as a visual aid could be improved.
  2 Satisfactory: Presentation with more than a few issues: transitions distract from the content, timing is inappropriate, or other issues such that the visual aid becomes a distraction.
  1 Needs improvement: Submission of a spreadsheet or other fundamental fault in the submission.

Presentation mechanics: Presenter delivered clearly and concisely and demonstrated familiarity with the contents.
  4 Excellent: Well delivered, exhibiting preparation and knowledge of the presentation. Spoke clearly and always towards the audience.
  3 Good: Presenter showed evidence of preparation and some familiarity with the content of the presentation. Usually faced the audience.
  2 Satisfactory: Presenter was able to read the slides, sometimes with their back to the audience.
  1 Needs improvement: Little evidence of preparation, unfamiliar with the slide contents, spoke facing the display panel.
A typical presentation rubric

The open data exploration assignments were structured as assignments in the Schoology learning management system. Students had to submit by midnight on the day prior to the presentation; the assignment locking system in Schoology provided this functionality.

Schoology assignment editing screen, locking set at the bottom

The average score on the four in-class presentations was 80.7%. During the previous term the average had been 80.6%. 

Students presenting in MS 150 Statistics

The presentations were downloaded as a batch using the download all functionality of Schoology.

The students then presented using native Microsoft PowerPoint or Impress software. 

An earlier article examined authentic assessment in the statistics course

The item analysis of the twenty-seven final examination questions also provides insight on the success rate against the two general education program learning outcomes served by the course.

Program learning outcome success rates (sum of per-item success rates, number of items, percent):
3.1 Demonstrate understanding and apply mathematical concepts in problem solving and in day to day activities: sum 14.526, n 17, 85.4%
3.2 Present and interpret numeric information in graphic forms: sum 6.579, n 10, 65.8%

Overall performance remains stable in this mature but evolving course. 

Wednesday, July 6, 2016

Flying objects and mathematical models

The weather permitted a laboratory fourteen in which I have the students explore for the mathematical models underlying airborne objects. This term I included a princess ball in the mix, along with flying disks, flying rings, and an aerobie. I had the students report their data back to the board and tasked the class with a four table, four graph report using the multigraph rubric available to me in Schoology.

Laboratory fourteen data post-lab. Each pair only worked with a single object. The pooling of results is reminiscent of laboratory six and is being considered as a reworking for laboratory five on friction.

Equipment included GPS units to measure distance and a radar gun to obtain launch speeds. All throws were to be horizontal.

Preliminary results do not argue for any strongly nonlinear model. Thus one is left using a linear model for the four types of objects. The slopes appear to reflect well the expected distance performance of the objects. The Aerobie has a unique airfoil to deliver maximum distance, rings outperform disks aerodynamically, and balls do not fly in the airfoil sense of the word. And the slopes stacked up in just this way.

Mayleen Samuel

Laboratory fifteen again explored site swap notation. This term I opted to use what I think of as the alternate introduction to the laboratory. Being summer term, I had just shown Morgan Freeman's "What are we really made of?" I used this as a jumping off point to put up a diagram using alpha, beta, and three electrons.


This term only one student nodded when asked if they understood. Still, that one student reacted positively even though what I had put on the board was effectively nonsense.

Neikaman would demonstrate a 342 site swap 

I then noted, for the benefit of the pre-teacher-prep majors, that students will say they understand when they do not and could not. I went on to demonstrate the actual system, generating a new site swap 3 diagram and explaining the meaning of each part physically using three balls.

I then demonstrated a 51, and finally ran a 342 site swap both on the board and with the balls. Now more students claimed to understand the notation. I cautioned them that I was still no more certain of comprehension than before. I suggested that they try to perform a 3 and, if they could master that, then to try a 51 and a 342.

Sucy-ann Liwy showing a 3 site swap

I also lobbied for not teaching mathematics as an abstract system devoid of physicality. Just as the alpha-beta diagram was meaningless and confusing, so is the abstract approach of the typical math class. Yes, of course there are "real world word problems" but those are after the fact.

Gino Retogluwe working on a 3.

I advocate the approach I take in physical science: start with a system. Measure, analyze, and then repeat with a new system. The mathematics should always have a meaning. Sure, physics may be equally inapplicable to the future for many students, but at least they will know where that math arises and what it is used for.

Joemar Wasan flashes a 3

Mayleen would master 3, show a 51, and perform a 342 swap

Marsha Solomon working on a 3

Neika working on 51

Mayleen showing good control

Shirely-Ann Rudolph



Friday, July 1, 2016

Ohm's law and floral litmus solutions

Laboratory twelve is electricity lite, a laboratory that has been placed on my list of laboratories to be replaced. I have yet to decide which direction to take this laboratory, but I am certain that the time has arrived to replace the forty year old equipment being used. Although watching the students use the same equipment their parents used does carry a sense of a cycle coming full circle.

Joemar and Preston try to coax reliable readings out of equipment that was old on the day they were born.

Mayleen, Hansha, Neika, Marmelyn, Marlinda, Joemar, and Marsha try to pull readings from a decade resistance box for which only certain resistances are still functional.

Hansha noted that one of her parents had attended CCM, perhaps they had seen this equipment too.

I experimented with the atomic lecture, having the class be protons, electrons, and neutrons. The class is gender unbalanced. Men were assigned proton duty, women took on electron and neutron duty. This permitted setting up nuclei where no protons had to be in contact with other protons - protons were linked only through neutrons. The class made it to lithium.

Floral litmus solution is a stable and well performing laboratory, although I forgot to bring limes this term; Land Grant provided limes gratis. Sucy-Ann and Neikaman draw floral litmus solution.

Marmelyn, Sucy-Ann, and Neika test unknowns

Marmelyn shows off her test tube collection

Marlinda Tom shows her color change for an acid

Marlinda, Hansha, and Mayleen

John Cheida and Gino Retogluwe

Tuesday, June 28, 2016

Light benders

This summer laboratory ten was flipped with the section ten lecture to accomplish the laboratory in the computer laboratory. The video on color and light was then shown in A101 in the afternoon. This was followed up by showing a light and optics video the next regular class day in A101 in the morning, and running laboratory eleven on reflection and refraction in the afternoon. The following images are from the reflection and refraction laboratory.

Neikaman, Marmelyn, Sucy-ann

John and Gino

Marsha and Marlinda

Mayleen Samuel and Hansha

Marmelyn makes a measurement

Gino determines apparent depth


Preston works out the apparent depth of the penny