Wednesday, May 25, 2016

Linear regressions and confidence intervals

My one remaining use of Gnumeric had been the regression statistical function, which produced confidence intervals for slopes and intercepts along with p-values against a null hypothesis that either one was zero. With Gnumeric having opted to be "Linux only" in 2014, I wanted to have some of those same capabilities in a cross-platform application. Although LibreOffice 5 includes a statistics add-in by default with a regression option, that option did not produce the values I sought.

The core of the solution was on board LibreOffice all along. The LINEST function, entered as an array formula, produces the values necessary to generate confidence intervals.

When entered as an array formula into a five-row by two-column range of cells, LINEST generates standard error values for the slope and intercept along with other regression statistics. The values generated by LINEST can then be used to feed other functions.
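For readers without LibreOffice at hand, the core numbers LINEST produces can be sketched in Python. This is a minimal illustration on made-up data; linest_stats is my own hypothetical helper, not a LibreOffice or library function.

```python
import math

def linest_stats(xs, ys):
    """Replicate the key outputs of LINEST with the stats flag set:
    slope, intercept, and the standard errors of each.
    Simple (one x variable) linear regression only."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    # residual standard error with n - 2 degrees of freedom
    sse = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    se_y = math.sqrt(sse / (n - 2))
    se_slope = se_y / math.sqrt(sxx)
    se_intercept = se_y * math.sqrt(1 / n + mx ** 2 / sxx)
    return slope, intercept, se_slope, se_intercept
```

The two standard errors returned here are the same values that appear in the second row of the LINEST array output.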

Values such as t-critical, the margin of error, bounds on the confidence intervals, and p-values from a test against the slope being zero or the y-intercept being zero are all possible. In this particular case the data has a number of problems, some of which could be ameliorated and others which could not. The data set is from a physical science class laboratory where students were using force on an x-axis to determine whether sandpaper grit, surface area, or sled weight most affected the force of sliding friction. To gain a common x-axis, the measured force was put on x and the other variables were treated as y variables.
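As a sketch of those follow-on calculations, here is how the margin of error and confidence interval for the slope might be derived from LINEST's output. The t-critical value below is hard-coded from Calc's TINV(0.05, 4), the 95% two-tailed critical value at four degrees of freedom; the slope and standard error values are invented for illustration.

```python
# t-critical for 95% two-tailed at df = n - 2; hard-coded here from
# Calc's TINV(0.05, 4) for a five-point data set
t_crit = 2.776

# illustrative slope and slope standard error, as a LINEST array would supply
slope, se_slope = 1.94, 0.0906

margin_of_error = t_crit * se_slope
lower = slope - margin_of_error
upper = slope + margin_of_error

# t statistic for the null hypothesis that the slope is zero; in Calc
# the matching two-tailed p-value comes from TDIST(ABS(t), df, 2)
t_stat = slope / se_slope
```

The same arithmetic with the intercept and its standard error yields the intercept's confidence interval and zero-intercept test.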

The formulas driving the calculations can be seen above. While working on those formulas I also decided I wanted to graphically show the spread in the predicted values for y, so I calculated the 95% confidence interval for the y-values.

The predicted value is the value based on the slope and intercept. The formula for the lower bound can be seen below; the upper bound changes only the sign.
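The bound calculation can be sketched as follows. This assumes the usual confidence interval for the mean response; the exact spreadsheet formula in the screenshot may differ (a prediction interval, for example, adds a 1 inside the square root).

```python
import math

def confidence_band(x, xs, slope, intercept, se_y, t_crit):
    """Lower and upper confidence bounds for the predicted mean y at x.
    se_y is the residual standard error; t_crit the two-tailed critical
    value at n - 2 degrees of freedom."""
    n = len(xs)
    mx = sum(xs) / n
    sxx = sum((xi - mx) ** 2 for xi in xs)
    y_hat = slope * x + intercept
    half_width = t_crit * se_y * math.sqrt(1 / n + (x - mx) ** 2 / sxx)
    return y_hat - half_width, y_hat + half_width
```

Evaluating this at each x in the data set yields the two extra columns needed to chart the band around the trend line.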

The above table permits producing a chart that shows the confidence intervals graphically. 

I am indebted to Charles Zaiontz's site Real Statistics Using Excel, M.G. Bulmer's Principles of Statistics, and notes on the LINEST function at Colby. All errors and misunderstandings are mine.

Saturday, May 14, 2016

Assessing learning in physical science

SC 130 Physical Science proposes to serve two institutional learning outcomes (ILO) through four general education program learning outcomes (GE PLO) addressed by four course level student learning outcomes (CLO). Not listed are proposed specific student learning outcomes that in turn serve the course level learning outcomes. This report assesses learning under the proposed course level learning outcomes, which in turn support program and institutional learning outcomes.

ILO 8. Quantitative Reasoning: ability to reason and solve quantitative problems from a wide array of authentic contexts and everyday life situations; comprehends and can create sophisticated arguments supported by quantitative evidence and can clearly communicate those arguments in a variety of formats.

3.5 Perform experiments that use scientific methods as part of the inquiry process.
  1. Explore physical science systems through experimentally based laboratories using scientific methodologies.
3.4 Define and explain scientific concepts, principles, and theories of a field of science.
  2. Define and explain concepts, theories, and laws in physical science.
3.2 Present and interpret numeric information in graphic forms.
  3. Generate mathematical models for physical science systems and use appropriate mathematical techniques and concepts to obtain quantitative solutions to problems in physical science.

ILO 2. Effective written communication: development and expression of ideas in writing through work in many genres and styles, utilizing different writing technologies, and mixing texts, data, and images through iterative experiences across the curriculum.

1.1 Write a clear, well-organized paper using documentation and quantitative tools when appropriate.
  4. Demonstrate basic communication skills by working in groups on laboratory experiments and by writing up the result of experiments, including thoughtful discussion and interpretation of data, in a formal format using spreadsheet and word processing software.


Explore physical science systems through experimentally based laboratories using scientific methodologies

Laboratory fourteen in the penultimate week of the term usually provides a vehicle for assessing this course level outcome. In the past the students have been given a system to explore and questions to answer. This term the structure of the holidays and their interaction with the Thursday laboratory schedule removed one laboratory from the syllabus. Laboratory fourteen was dropped and laboratory nine was used to assess the first course learning outcome.

The clapper is back at that distant rise. Appears impossible from the photo, but the class could see the boards clapping together. 

In laboratory nine the students timed the arrival delay between seeing two wooden boards clap and hearing the sound of the clap. The experiment was done at different distances so as to generate a linear relationship between the time and the distance. The slope of that relationship is the speed of sound.

Telephoto view from 500 meters out, the class would also obtain data at 550 meters.

The laboratory reports were assessed to determine whether students properly recorded data in a labelled table, generated an xy scatter graph, added a linear trend line to the graph, reported the slope in their analysis as the speed of sound, and then discussed their analysis and results in a reasonably meaningful manner.

By the ninth week thirty students of the original thirty-two were still actively attending class. Twenty-one students turned in a laboratory report for laboratory nine.

Analysis of laboratory nine

For all twenty-one laboratory reports submitted, students recorded their data in a properly labelled table and went on to generate a labelled xy scatter graph. Twenty students correctly added a linear trend line to the graph. Only twelve went on to write up an analysis that explicitly noted that the slope of the line on the chart represented the speed of sound. The students are able to handle the software mechanics of producing tables and graphs, but understanding what that graph then means physically is far more difficult for the students.

Only five students then continued on with a meaningful discussion of the results including an error analysis against the published speed of sound at the air temperature measured on the day of the laboratory.

The submission of only twenty-one reports for thirty active students might seem low, but this rate is on par with rates seen a year ago during spring 2015.

Laboratory submission rates for each odd (major) lab report

The 70% submission rate for laboratory nine exceeded the submission rate, year-on-year, for that laboratory.


2. Define and explain concepts, theories, and laws in physical science.

The 34 item final examination asked students to define and explain concepts, theories, and laws in physical science. The final also had students make calculations, use formulas, and interpret graphical data. Success rates from an item analysis of the 34 items were aggregated by topic and by skill; these aggregate average success rates were generally low.

Aggregate average success rates by topic and skill

The overall average for thirty students on these 34 items was 51%, a drop from the 66% seen on the fall 2015 final examination. Aggregate average performance on the final examination was lower than in prior terms.

Multi-term final examination averages based on item analysis aggregate average


3. Generate mathematical models for physical science systems and use appropriate mathematical techniques and concepts to obtain quantitative solutions to problems in physical science.

While the laboratory nine assessment above provides some data on this course learning outcome, this outcome supports the general education program learning outcome "3.2 Present and interpret numeric information in graphic forms." With this focus in mind, a pre-assessment and a post-assessment were included in the course. The post-assessment was embedded in the final examination.

SC 130 Physical Science includes a focus on the mathematical models that underlie physical science systems. Laboratories one, two, three, five, seven, nine, eleven, and twelve have linear relationships. A number of assignments in the course also have linear relationships. The students also encounter a quadratic relationship in laboratory three. A plot of height versus velocity generates a power relationship, specifically a square root relationship. By the end of the course students have repeatedly worked with linear relations. One relationship at a time, not "problems one to thirty even problems only." Every equation is built from data that the students have gathered. From the concrete to the abstract, repeated throughout the term, providing cognitive hooks on which to "hang" their mathematical learning.

The first thirteen questions on the final examination were identical to the questions on the pre-assessment. The following bar chart depicts the percentage of students answering correctly on the pre-assessment on the right end of the bar, the percentage of students answering correctly on the post-assessment on the left end of the bar. Thus the chart shows the improvement from the pre-assessment to the post-assessment. Of note is the overall weak performance on all items on the pre-assessment except for the physical plotting of (x,y) scatter plot points.

The average score for the students increased from 5.06 out of 13 to 8.63 out of 13. This increase was significant with a large effect size (0.94). Although improvement was seen for all questions, overall post-assessment performance (66%) is below a target value of 70%.
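For readers curious how an effect size such as the one cited above is computed, here is one common definition, Cohen's d with a pooled standard deviation, run on made-up scores; the post does not state which effect size formula was actually used.

```python
import math

def cohens_d(pre, post):
    """Cohen's d: difference of means divided by the pooled standard
    deviation of the two samples (one common definition)."""
    n1, n2 = len(pre), len(post)
    m1 = sum(pre) / n1
    m2 = sum(post) / n2
    v1 = sum((x - m1) ** 2 for x in pre) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in post) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m2 - m1) / pooled_sd
```

A value near 0.8 or above is conventionally read as a large effect, which is why a d of 0.94 counts as large here.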

The post-assessment does not answer whether there will be long term student retention of the ability to present and interpret numeric information in graphic forms. The course has had, at least in the short run, a positive impact on the students' ability to work with numeric information in graphical forms.


4. Demonstrate basic communication skills by working in groups on laboratory experiments and by writing up the result of experiments, including thoughtful discussion and interpretation of data, in a formal format using spreadsheet and word processing software.

Course level learning outcome four focuses on communication, specifically writing. In the late 1990s assessment data suggested some students were graduating with limited writing communication skills. As noted by the languages and literature division at that time, two college level writing courses in the general education core cannot by themselves produce collegiate level writers. Writing must occur across the curriculum, across disciplines. In 2007 SC 130 Physical Science at the national campus was redesigned to put an emphasis on writing. A "fill-in the blank" cook book style laboratory manual was replaced by laboratories which led to laboratory reports constructed using spreadsheet and word processing software.

By the end of the term students could produce a laboratory report with tables and charts integrated from a spreadsheet package. The students could produce reports that included the use of quantitative tools.

As reported above, student ability to include thoughtful discussion and interpretation of data supported by their quantitative evidence was demonstrated by only five students as measured by laboratory nine.

Inherent in supporting institutional learning outcome two, which course learning outcome four serves, is proper mechanics. Physical Science laboratory report marking rubrics at the national campus include evaluation of four broad metrics: syntax (grammar); vocabulary and spelling; organization; cohesion and coherence. Each of these four metrics is measured on a five point scale yielding a total possible of twenty points. In general, students enter the course with writing skills. Errors of tense and agreement tend to mirror areas in students' first language that do not have similar tense or agreement structures. All students in the class are working in English as a second language.

Two metrics were examined this term, syntax and cohesion. Syntax showed a significant improvement from laboratory one to laboratory thirteen with a medium effect size (0.64). Cohesion did not show a significant improvement from laboratory one to laboratory thirteen.

The course has an impact on syntax, possibly improving control of grammar. The students' ability to write cohesive science text that flows and connects logically from one idea to the next may not be positively impacted.

Learning has occurred for all course level outcomes, the program learning outcomes served by those course level outcomes, and the institutional learning outcomes served in turn by the program learning outcomes. 

Wednesday, May 11, 2016

Assessing Learning in Ethnobotany

SC/SS 115 Ethnobotany proposes to serve four program learning outcomes through three course level outcomes. The course serves learning outcomes in general education, the Micronesian studies program, and the Agriculture and Natural Resources program.

GE 3.4 Define and explain scientific concepts, principles, and theories of a field of science.
  1. Identify local plants, their reproductive strategies, and morphology.
GE 4.2 Demonstrate knowledge of the cultural issues of a person's own culture and other cultures.
MSP 2 Demonstrate proficiency in the geographical, historical, and cultural literacy of the Micronesian region.
  2. Communicate and describe the cultural use of local plants for healing, as food, as raw materials, and in traditional social contexts.
ANR 2 Demonstrate basic competencies in the management of land resources and food production.
  3. Demonstrate basic field work competencies related to management of culturally useful plant resources and foods.


Identify local plants, their reproductive strategies, and morphology.

The twenty-four students in the course engaged in a number of activities in support of this learning outcome. Vegetative morphology was supported by field identification walks and presentations. Reproductive strategies were also communicated via student presentations. Identification of local plants permeated every outing, field trip, and hike.

Premna obtusifolia variant locally known as oahr

Twenty-four of the twenty-five students attended the field final examination exercise. The single absentee was a student who had stopped attending the class on 07 April, a month prior to the end of the term.

The final examination involved a walk on campus and required the twenty-four students present to identify twenty local plants. This was an increase of four plants from previous terms. The students had to identify the plants by Latin binomial and local name, and provide a specific use for the plant.

The students had a list of 73 Latin binomials for plants found on and around the Paies, Palikir, campus to assist with the Latin name identification. Collectively, the twenty-four students made 401 correct Latin binomial identifications out of 480 possible identifications for an 84% success rate. This success rate was identical to the year-on-year 84% success rate of spring 2015 for sixteen plants. In fall 2015 the Latin name identification success rate rose to 97% for sixteen plants. Student performance on this metric appears to be variable term-on-term.

The students made 416 correct local name identifications out of the 480 possible identifications for a success rate of 87%. Identifying plants in their own language is a more difficult task than one might expect. Despite the students being fluent in their first language, they often do not know their local plant names. Fall 2015 the success rate was 97%, spring 2015 the success rate was 96%. Student performance on this metric appears to have fallen.

Overall there is the suggestion of increased weakness on the final examination; however, the final was longer and thus more difficult than in prior terms.


Communicate and describe the cultural use of local plants for healing, as food, as raw materials, and in traditional social contexts.

Sweena with the roots of a Morinda citrifolia tree. The scrapings of the yellow root are squeezed to produce a juice that treats stomach ulcers. 

Students engaged in presentations on healing plants, plants as food, and plants used for material culture, and wrote two essays during the course of the term on the cultural use of plants. Essays were marked using rubrics provided on the day one calendar and syllabus.

Hellen, Rebseen, and Jeanie presented dapiohka, also known as kehp tuhke, menioak, or moanioak. 

For the twenty plants on the final examination, twenty-four students were collectively able to cite 403 uses for the 480 instances, a success rate of 84%. This represented a drop from 97% fall 2015 and 93% spring 2015.

Outer island Yapese moarupw wrap skirt of banana fiber

Eighteen of the 25 students (72%) turned in the first essay on healing plant usage, a marked improvement from a 54% turn-in rate the previous term. The second essay on the loss of material culture saw a decrease in the submission rate to 58%. The average for the first essay was 72%, the second essay was 67%. Performance was generally weak.

Cherlylinda demonstrates proficiency in the cultural skill of thatching


Demonstrate basic field work competencies related to management of culturally useful plant resources and foods.

Students tended to a banana tree collection and engaged in maintaining ethnobotanical plant collections on campus.

Helen and Rebseen demonstrate basic competencies in food production by tending to young bananas

The students worked with bananas from production on the land to the kitchen to the table. The collection also provided a living banana herbarium and assisted in teaching students the diversity of bananas. 

Ground boiled banana dish

Students also tended to ethnobotanically useful plant collections and learned to identify threats to food production such as invasive species.

Clidemia hirta

Sweena, Helen, Sunet, Twain, Stewart, and Cherlylinda learning invasive plants in the field

Performance across the past three terms has been highly variable.  This term marked the first term for the use of twenty plants on the final examination. The previous term sixteen plants were on the final examination. The Latin flora list has also grown with each term. Each term the final is more demanding and challenging.

Final examination performance: percent success rate on the three sections of the final

The current form of the final traces back to circa spring 2012 when only twelve plants were on the final examination and the Latin binomial list was also twelve plants. That first term the exercise was essentially a one-on-one matching exercise. Contrast that to this term with twenty plants listed and 73 plants on the Latin binomial list, with students expected to know the local names and uses for 62 of the plants. 

While overall performance was down term-on-term on the final examination, the drop is not as precipitous as the bubble chart above suggests. 

Long term final examination success rates

A longer time frame indicates that the final examination performance is fairly stable over the past few terms. From 2002 to 2005 the course had a course content oriented final examination. From 2005 to 2012 (not shown on the chart above; the essay format did not yield percentages) the course ended with a final essay examination. In 2012 the course shifted to using the present format of naming plants and explaining their uses in a field final practical examination. In 2012 there were twelve plants on the final. This was later increased to 16 plants and this term increased again to 20 plants. The downturn might reflect the increased number of plants and the increased number of plants the students are now expected to know.

Part of this increase in the number of plants is due to the intentional evolution of the campus and environs as a living herbarium. The conversion of the Pohnpei Campus Traditional Plants Garden to an agriculture/food crops focus has led to the development of the Palikir campus as an ethnobotanical garden and living herbarium.

Monday, May 9, 2016

Assessing learning in introductory statistics

MS 150 Introduction to Statistics has utilized an outline based in part on the 2007 Guidelines for Assessment and Instruction in Statistics Education (GAISE), the spring 2016 draft GAISE update, and the ongoing effort at the college to incorporate authentic assessment in courses. The three course level student learning outcomes currently guiding MS 150 Introduction to Statistics are:
  1. Perform basic statistical calculations for a single variable up to and including graphical analysis, confidence intervals, hypothesis testing against an expected value, and testing two samples for a difference of means.
  2. Perform basic statistical calculations for paired correlated variables.
  3. Engage in data exploration and analysis using appropriate statistical techniques including numeric calculations, graphical approaches, and tests.
The first two outcomes involve basic calculation capabilities of the students and are assessed via an item analysis of the final examination. Thirty-nine students in two sections took the final examination.

Average success rate based on an analysis of the three sections of the final examination

In the above chart the centers of the yellow topmost circles are located at the average success rate for the students on final examination questions under the first course learning outcome - basic single variable statistics. The chart reports results from 2012 to present. The radii are the standard deviations. The middle blue circles track performance under the second course level learning outcome, paired dependent data. The orange bottom-most circles track performance on the open data exploration and analysis.

The first course learning outcome focuses on basic statistics. Twenty-one questions on the final examination required the students to perform basic single variable statistical calculations on a small sample. Based on the item analysis the average success rate was 78%. Although slightly lower term-on-term, the difference was not significant: performance has remained stable. Over the past four years the average success rate on this material has been 80%.
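An item analysis of this sort reduces to counting, per item, the fraction of students who answered correctly. A minimal sketch on invented 0/1 response data:

```python
def item_success_rates(responses):
    """Per-item success rates from an item analysis.
    responses: one list per student of 0 (wrong) / 1 (correct) marks,
    all the same length; returns the fraction correct for each item."""
    n = len(responses)
    items = len(responses[0])
    return [sum(r[i] for r in responses) / n for i in range(items)]
```

Averaging the per-item rates over the items belonging to a learning outcome gives the aggregate success rate reported above.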

Success rates on individual final exam items for 39 students

Performance on the second course learning outcome, linear regression statistics, was measured by six questions on the final examination. Student performance on this section was 68%. The four year average is 69%. Student success on linear regressions has remained lower than success rates on basic statistical calculations. This performance is also stable.

Performance on the third course learning outcome, open data exploration and analysis, is not comparable term-on-term. The scoring system for the open data exploration section of the final examination varies term-on-term. Performance is always weaker on this open data exploration and analysis section than on the first two learning outcomes. Students perform strongly when asked to calculate a specific statistic, students struggle when raw data and open ended questions are posed about the data. The students responded to this section with a single essay question set up using Schoology. This one question was then marked by the instructor.

The 51% student success rate seen on the third learning outcome this term represents the number of students who arrived at a fully or partially correct analysis of the data presented. The open data exploration this term presented two samples where the optimal solution would have been to calculate the means and then test for a significant difference in the means. Of the 39 students who took the final examination, only five made a fully correct analysis of the data, measuring the means and running a test for a significant difference in those means.
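The optimal solution, a test for a significant difference of means, can be sketched as follows. This uses Welch's t statistic on invented data; the course itself may teach the pooled-variance form instead.

```python
import math

def two_sample_t(a, b):
    """Welch's t statistic and approximate degrees of freedom for a
    two-sample difference of means (unequal variances assumed)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se = math.sqrt(va / na + vb / nb)
    t = (ma - mb) / se
    # Welch-Satterthwaite degrees of freedom
    df = (va / na + vb / nb) ** 2 / (
        (va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df
```

Comparing |t| to the critical value at df degrees of freedom (TINV in Calc) answers whether the means differ significantly.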

Breakdown of solution quality for open data exploration on the final examination

Another five students reported the means and noted that the means differed but did not then run a t-test for a difference of means. Ten cited the correct data set as having a higher mean, but either cited inappropriate statistics or no statistical support.

Fifteen students obtained an incorrect solution, often without providing statistical support for that solution. A few of these students noted as evidence the greater number of highest scores in the data sample that actually had the lower overall average.

Overall success rate on the final examinations has been exceptionally stable over the past three years, and generally stable for the past decade. The long term average success rate is 73.6%, the current term saw a 75.7% success rate on basic and linear regression statistics. 

Final examination average since 2005

In an educational world where a common goal is "continuously improving" best practices, the inert stability of the success rate above might be seen as a failure to continuously improve. The effort to continuously improve mathematics education overall goes back not to the new math of the 1960's but much, much further. Ultimately there are long term average success rates, and statistics assures us that numbers tend to return to long term averages. A look at the running cumulative mean success rate on the final examination since 2005 suggests that the longer term mean to which terms return might be improving, but even this statistic is subject to a tendency to return to an even longer term mean.
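The running cumulative mean mentioned above is simple to compute; a sketch with invented per-term success rates:

```python
def running_cumulative_mean(rates):
    """Running cumulative mean: the i-th value is the mean of the
    first i entries, as in the term-by-term chart described above."""
    means, total = [], 0.0
    for i, r in enumerate(rates, start=1):
        total += r
        means.append(total / i)
    return means
```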

Note that the y-axis does not start at zero: exaggerated vertical scale

In general students who complete the course are able to successfully make basic statistical calculations on 73% of the questions posed.

The course average over time includes performance on homework, quizzes, and tests. Course level performance underlies course completion rates. Data on course level performance is available from 2007 forward.

Course average over the past eight years

The course wide average has a long term average of 77.6%; the current term average is 73%. The radii of the circles are proportional to the standard deviation of the student averages in all three sections of the course. The standard deviation is fairly constant over time at about 15%.

This term the open data exploration exercises were each capped off with a presentation rather than a quiz. Performance on the open data explorations was marked using rubrics. Each rubric consisted of five criteria and generated up to twenty points. Three criteria were content oriented, one focused on the presentation software, and the final criterion on the presenter.

Each criterion was scored 4 Excellent, 3 Good, 2 Satisfactory, or 1 Needs improvement.

Basic statistics: Appropriate basic statistics calculated correctly and reported meaningfully.
  4: All appropriate statistics reported in a meaningful manner.
  3: Appropriate basic statistics reported and cited in the report.
  2: Some basic statistics reported.
  1: A few basic statistics cited.

Nitrogen storage: Do nitrogen fixing trees store significantly more carbon in the soil than non-nitrogen fixing trees?
  4: Answer is correct and supported by a fully appropriate statistical analysis.
  3: Answer is correct but supported by statistics which do not provide evidence that the answer is correct.
  2: Answer is correct but unsupported by numeric values.
  1: Result is incorrect.

Strength of the difference: How strong is the effect size for this study?
  4: Answer is correct and supported by a fully appropriate statistical analysis.
  3: Answer is correct but supported by statistics which do not provide evidence that the answer is correct.
  2: Answer is correct but unsupported by numeric values.
  1: Result is incorrect.

Presentation software: Original work submitted as presentation software; the presentation is appropriate to the material and subject matter and generally follows guidelines for a good presentation.
  4: Presentation heeds general presentation guidelines, avoids distracting visual extras, and is appropriate to the subject matter.
  3: Presentation with only a few areas in which the presentation as a visual aid could be improved.
  2: Presentation with more than a few issues: transitions distract from the content, timing is inappropriate, or other issues such that the visual aid becomes a distraction.
  1: Submission of a spreadsheet or other fundamental fault in the submission.

Presentation mechanics: Presenter delivered clearly and concisely and demonstrated familiarity with the contents.
  4: Well delivered, exhibiting preparation and knowledge of the presentation; spoke clearly and always towards the audience.
  3: Presenter showed evidence of preparation and some familiarity with the content of the presentation; usually faced the audience.
  2: Presenter was able to read the slides, sometimes with their back to the audience.
  1: Little evidence of preparation; unfamiliar with the slide contents; spoke facing the display panel.
A typical presentation rubric

The open data exploration assignments were structured as assignments in the Schoology learning management system. Students had to submit by midnight on the day prior to the presentation, the assignment locking system in Schoology permitted this functionality.  

Schoology assignment editing screen, locking set at the bottom

The average score for the presentations was 16.12 points, which is 80.6% of 20 possible points. 

Students presenting in MS 150 Statistics

The presentations were downloaded as a batch using the download all functionality of Schoology.

The students then presented using native Microsoft PowerPoint or Impress software. 

The item analysis of the twenty-seven final examination questions also provides insight on the success rate against the two general education program learning outcomes served by the course.

Program learning outcome: sum, n, success rate
3.1 Demonstrate understanding and apply mathematical concepts in problem solving and in day to day activities: 14.000, 17, 82.4%
3.2 Present and interpret numeric information in graphic forms: 6.436, 10, 64.4%

Overall performance remains stable in this mature but evolving course. 

Sunday, May 8, 2016

Hints that partial attendance is better than none

Students arriving late for class, especially early morning classes, is not uncommon at the college where I teach. There are a variety of factors that contribute to this, all of which are discussed perpetually by faculty. Each new faculty member discovers anew the issue of late arrivals. Some faculty in the past have gone as far as locking the class room doors. Others, intentionally or unintentionally, may embarrass the late arriving student.

Late arrivals can be disruptive to a class, especially if group work is the plan for the day. How one handles late arrivals also determines the impact those late arrivals may or may not have.

While rummaging through attendance and late arrival data looking for signals, a practice strongly discouraged by the statistical test community, I noticed that absences appeared to be correlated to weaker performance in class while late arrival - partial attendances if you will - were not correlated to performance.

The course examined was MS 150 Statistics. There were 45 students in the course with 46 attendance days in the term. The sample size for each correlation varies as some students missed presentations and others did not take the final. The table is a table of correlation coefficients r.

Overall Presentations Final Abs Late
Overall 1.00 0.72 0.74 -0.72 0.13
Presentations 0.72 1.00 0.36 -0.63 0.05
Final 0.74 0.36 1.00 -0.39 -0.05
Abs -0.72 -0.63 -0.39 1.00 0.03
Late 0.13 0.05 -0.05 0.03 1.00

The variables in the table are:
Overall: The student's overall course percentage for all work in the course including homework, quizzes, tests, presentations, and the comprehensive final examination.
Presentations: The student's average performance on three open data exploration presentations that capped the course at the end of the term.
Final: The student's performance on the final examination.
Abs: The number of absences.
Late: The number of late arrivals to class.
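A table like the one above can be built pair by pair with the CORREL spreadsheet function, or in a single call with NumPy's corrcoef. A sketch with made-up numbers for three of the variables, not the actual course data:

```python
import numpy as np

# Hypothetical values for five students - illustrative only
overall = [92.0, 85.0, 78.0, 64.0, 55.0]   # overall course percentage
absences = [1, 3, 5, 8, 12]                # days absent
late = [2, 0, 4, 1, 3]                     # late arrivals

# Rows are variables, columns are observations; the result is the
# symmetric matrix of pairwise Pearson r values with 1.00 on the diagonal.
r = np.corrcoef([overall, absences, late])
print(np.round(r, 2))
```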

Of note is that there is moderate to strong negative correlation for absences to the overall mark in the course (-0.72) and to the open data exploration presentations (-0.63). The negative correlation indicates that performance decreased for the overall course percentage and performance on presentations as absences increased.

Performance on the comprehensive final was also negatively correlated to absences but only weakly at best (-0.39) given the sample size.

Of more interest is that the number of late arrivals was not correlated to a student's overall mark (0.13), their performance on the open data exploration presentations (0.05), nor their performance on the final examination (-0.05). These correlation values are too small to indicate any relationship: being late had a random effect on overall mark, presentation scores, and final examination performance.
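One way to check the "too small" judgment is the usual t statistic for testing whether a correlation coefficient differs from zero, t = r·sqrt(n−2)/sqrt(1−r²). A sketch, assuming n = 45 for the pair shown (the actual sample sizes vary slightly):

```python
from math import sqrt

def t_for_r(r, n):
    """t statistic for H0: the population correlation is zero."""
    return r * sqrt(n - 2) / sqrt(1 - r ** 2)

# Overall mark vs. late arrivals: r = 0.13 with roughly n = 45 students
t = t_for_r(0.13, 45)
print(round(t, 2))  # ≈ 0.86, far below a two-tailed critical t of about 2.02
```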

Thus being absent appears to negatively impact a student's ability to succeed in the course, while being late does not: the randomness suggests being late has no functional impact on student performance. The takeaway is "best to be present, better to be late than absent."

I choose to permit late arrivals. Whether my students tend to take advantage of this, I have not sought to determine. The correlations above suggest that even partial attendance is better than being absent.

As an internal statistical note for the course, there is the somewhat curious statistic that the presentations and the final are well correlated to the overall mark, but not to each other. Of course the correlation to the overall mark is in large part because both contribute to that overall score. The lack of correlation between them, or a weak positive correlation at best (0.36), suggests the two are measuring quite different skill sets. This is useful information for the instructor: the presentations and the final are not generating redundant information. One could not be dropped in favor of the other.

Wednesday, May 4, 2016

Ethnobotany review for authentic assessment final

Years ago a researcher arrived on Pohnpei who I had been led to believe had expertise in tropical plants. Upon arrival, the researcher asked for a tour of the rather small, if not completely pathetic, ethnobotanical garden my class had been tinkering around in for a few years. I was excited by the chance to get some help in identifying plants I could not identify.

As we walked down to the garden we passed Terminalia catappa, our local dipwopw tree also known as a seaside almond. The researcher looked at the tree and gave some Latin binomial I did not recognize. I asked if maybe the tree was not Terminalia catappa, and the researcher agreed that the tree could be T. catappa. For the first few plants, plants I already knew well, the researcher was citing Latin-sounding names that were unconnected to the plants. When I would suggest the name I knew, the researcher would agree with words to the effect, "Oh, yes, of course." Eventually the researcher fell silent. I already knew I could not trust the identification of the plants I did not know, and I was fairly certain by this point that the names being used were simply fabricated. Why not simply say, "I do not know Pacific island tropicals"? I will never know. Their tenure here was brief, but I knew from that walk that it would not be successful.

I also knew then what the final for ethnobotany would be: a walk on campus naming the plants in Latin and in the student's own local language, along with citing the use of each plant. A simple yet authentic examination of a student's ethnobotanical knowledge.

This term I took off from the classroom at 3:30 sharp and headed down past the Premna obtusifolia variant called oahr, past the Spathoglottis plicata and Falcataria moluccana. Down under the Pterocarpus indicus and on to the Ipomoea carnea and Clerodendrum inerme near the Mangifera indica trees. Only two students were on time and were tracking me, but by the time I headed down into the soccer field over half the class was trying to catch up to me. As I walked I named plants. I was not moving fast; the students were simply not on time today. Last day of classes, the last time slot of the spring 2016 term - 3:30 to 5:00 P.M.

There are some plants, such as the above, that I am still learning to identify. I have no problem admitting to what I do not know. And I let the class know this as well. There is no shame in not knowing, only in pretending to know.

Eventually I will figure this one out!

This plant reminds me of a plant that I knew as a Japanese lantern when I was young.

The seed is within the translucent pod.

I told the class, here is another one I do not know.

I have forgotten the name of this small tree and need to go back through my own resources for this one.

Angela amidst the Ischaemum polystachyum, reh padil. The tangle on the tree to the left is a Cananga odorata, seir en wai, tree with a Coffea arabica bush, a Piper nigrum vine growing on both.

Piper ponapense, konok, on a tree trunk.

Clerodendrum inerme, ilau.

Dukay, Sweena, Darren, Kerat

Ixora casei. A branch broke at some point, but the plant is very hardy.

Ixora casei

That's a Psidium guajava in the middle background, with Acacia auriculiformis on the left side.

Ipomoea littoralis, omp.

Ipomoea littoralis.

Sage leads the way back across campus. Sage has been an active and enthusiastic participant in the class.

Dukay being, well, Dukay. Cinnamomum carolinense on the right.

Psidium guajava.

That's the flavor of the final: can a student walk through their environment and name the plants and their uses, knowing also their local names?

The final the next day...

The final moved up from 16 to 20 identifications and included an expanded Latin-name flora list that the students had access to during the final.

I skipped the Terminalia catappa, Premna obtusifolia, and Psidium near the classroom and began with the ferns on the Mangifera indica down at the corner. I looped back behind the south faculty building, skipped the Ponapea ledermanniana, opting instead to pick up Piper methysticum and Ananas comosus down behind the A+ building. I was splitting the campus roughly 7-7-7 in my mind, knowing the cap was twenty. Above, Angela and Rogan are under the Coffea arabica; there is also Piper nigrum in there, which proved confusing for the students who misidentified it as Piper betle or Piper ponapense.

I ended with Saccharum officinarum and Musa spp. Kerat, Ravelyn, and Sweena in front of the Saccharum officinarum.

Sage and Casan-Jenae

With rain threatening, the class moved to the porch of the FSM China Friendship Sports Center to work on rewrites.

One oversight: three students were stranded on campus post-final because they did not have taxi money and the school shuttle was not operating. Students might be reminded to bring taxi money in the event that they miss the last bus.