Sunday, June 29, 2008

Lab Writing

This is effectively the second part of an article on the objectives of lab classes, where I had limited the discussion to everything except lab reports. I also discussed some of this in the past, stimulated by two articles decrying empty thesis statements and overly effusive language (using verbiage to replace thought) in papers for upper-division or grad-level classes in education (history of education) and English (Victorian literature). Their complaints are familiar to us in the sciences; I regularly learn new things from folks on that other side of the blog campus, and sometimes about my own side of it. [One interesting side result of what I have picked up in academic blogs is the realization that I should talk to people who teach composition, or in the social sciences or humanities, about these sorts of teaching issues.]

As I see it, the written parts of lab reports address two learning objectives: technical writing, where they have to learn to use numerical results (with uncertainties and units) in complete sentences and to define quantities clearly in English rather than with symbols; and critical thinking, where they have to learn to draw conclusions from quantitative results that carry some fuzziness due to experimental uncertainties, and to separate the important from the unimportant.

Let's look at these two issues, being careful to include ease of effective grading (assessment is the new magic word) in the design of the task we set our students ... and our TAs (who likely don't have English as a first language).

But first, one word on philosophy: The majority of my students are going on to become engineers, so I am looking more toward that kind of corporate environment (with its memos and reports) than academic research. I hope I can get The Thomas to provide his critique of these thoughts from where he sits in Corporate America, either in a long comment here or in his own blog.

Technical Writing.

I think of this as the requirement that they produce work that looks professional (using correct symbols and notation, such as SI prefixes or superscripts for powers of ten, and grouping the value and its uncertainty inside parentheses) and that communicates their answer without ambiguity. This means using technical terms instead of a catch phrase, and not using a symbol (is it V or v, is it velocity or voltage or volume?) that has not been defined by the writer or in the question.
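
To make that expectation concrete, here is a minimal sketch (invented numbers, not something I hand out) of the formatting I am after, written as a little Python helper so the pattern is unambiguous:

```python
# A hypothetical helper (purely for illustration) that groups a value and its
# uncertainty inside parentheses, with the power of ten and unit outside.
def format_measurement(value, uncertainty, unit, exponent=0):
    """Return a string like '(3.52 ± 0.04) x 10^2 m/s'."""
    scale = 10 ** exponent
    return f"({value / scale:.2f} ± {uncertainty / scale:.2f}) x 10^{exponent} {unit}"

# Invented numbers: a speed of 352 m/s with a 4 m/s uncertainty.
print(format_measurement(352, 4, "m/s", exponent=2))
# -> (3.52 ± 0.04) x 10^2 m/s
```

Matching the number of decimal places to the size of the uncertainty is still the student's job; the point here is only the notation.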

This requirement is unambiguous and applies everywhere, but I don't deduct points for every instance I see. [I do know profs who take off 0.1 point for every instance of even a minor error, but I don't have that much time to devote to grading.] Instead, I focus attention on specific sections of the report for specific things, such as making sure values are reported correctly (with units) in the calculation and conclusions sections. Those instances are highlighted on my version of the report. [Because of the particular, rather efficient, system I use for grading, this might be one of only two things I am looking for on a particular page of the report.] Nonetheless, I always keep my eye open for an oversight somewhere else in an otherwise good paper. I want everyone to be paranoid about errors of omission such as missing units or sig figs.

There is also an expectation that answers to questions and some lab exam problems be stated as complete sentences.


Critical Thinking.

This is the real challenge. It takes time and energy to do this right, so I often look at this part of the lab report first or devote a separate day to it, so I can do it while I am fresh. It also requires that the task for the student be well defined, so I try to focus it into just a few areas of the report.

One of these areas is actually easy to grade: the post-lab questions. These are a good place to address "conclusion" questions about the implications of the lab results, particularly the ones that require the use of uncertainties to address the significance of an "error" between what is expected and what is found. The key requirement is that they get the right answer and give an appropriate explanation (such as using the size of the standard deviation to judge the precision of their result). A specific question ensures that the student has to address the topic and also means I don't have to go hunting for it somewhere in two pages of conclusions.
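
As a sketch of the reasoning I want behind those answers (the numbers are invented, and this is only an illustration, not anything from the lab manual), the comparison comes down to asking how many standard errors separate the measured value from the expected one:

```python
import statistics

# Invented numbers: five repeated measurements of g (m/s^2) from a pendulum lab.
trials = [9.6, 9.9, 9.7, 9.8, 9.5]

mean = statistics.mean(trials)
std_dev = statistics.stdev(trials)          # spread of the individual trials
std_err = std_dev / len(trials) ** 0.5      # uncertainty of the mean

expected = 9.81
n_sigma = abs(mean - expected) / std_err

print(f"g = {mean:.2f} ± {std_err:.2f} m/s^2")
print(f"discrepancy = {abs(mean - expected):.2f} m/s^2, about {n_sigma:.1f} standard errors")
# A discrepancy of one or two standard errors is consistent with the accepted
# value; five or ten would point to a real problem with the measurement.
```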

I also require that they address specific issues in specific places in the conclusions part of the report. One section has to identify and summarize the most important results of the experiment. It has to be a single paragraph of modest length, like the abstract for a research paper or a memo in the real world. They are marked off for not putting the right things in there or for technical writing errors noted above. I sometimes require that this be a cover memo, while other times I have them call it an abstract. (That helps catch plagiarists.)

The other section must address a particular aspect of a typical "conclusion", such as what followup experiment could be done (and describing it in detail), identifying a real-world situation where this effect is important, explaining what might have caused inaccuracies in their results and how to avoid them in the future, or identifying specific procedures they used that might have contributed to the precision of their results. I vary this question from semester to semester to make it harder on lazy kids who try to use a file from a student in last year's lab.

The real trick is keeping track of students who say the same thing report after report (e.g. that they didn't read the manual before class to be sure they knew what they were doing) rather than learning from their mistakes and improving their skills in the lab! If they are thinking, these answers should be more than a throwaway line.


Grading Rubrics.

These are essential, and it is essential that they be designed so students who do the experiment and complete all of the calculations with reasonable accuracy and attempt to answer all of the post-lab questions will get a minimum grade of C. Not only is that passing for our college, but it is all that the nearby engineering colleges need to see. That limits somewhat the deductions for egregious errors, but still leaves lots of room to encourage improvement.

It also has to work for me and the adjuncts who work under my supervision. Thirty or so lab reports a week can be a big load for me, particularly when they come due on a week when I also have to grade 250 pages (or more) of exam solutions. It might be an even bigger load for my adjuncts, who have other jobs as well. This requires focus on specific spot checks and clearly defined tasks, as noted above.

I long ago quit cutting them much slack on the initial reports. A warning without a deduction of points has no effect on future behavior. However, I do cut the penalty points for "critical thinking" types of errors to about half of the norm. We go over tech writing skills from day 1, so they are expected to do that part correctly. We also drop the lowest report, so that encourages improvement (but won't help if they skip one of the labs).

Apropos a point Matt made in his blog recently (see below for the link), the maximum deduction for omission of the separate "conclusion" writing assignments is 20%, although the deductions for flawed contributions are usually around 10% of the total grade. However, other items that require critical thinking make up at least another 20%, if not 30%, of the total. I also include questions of this type (interpret a certain result or write a summary of certain results) on the lab exams.


Other voices.

Matt's recent article about improving lab report conclusions, from the perspective of a graduate student at an R1 university, offers an interesting suggestion: collect some good conclusion sections from physics research papers. (I suppose I could use some of my own!) I'd also like to have a similar collection from industry, since those examples are generally unknown within the physics research or teaching community. As noted below, I would put more emphasis on showing them a good abstract rather than some good conclusions.

Chad Orzel provided his thoughts on the writing style used in lab reports last year, from the perspective of a selective liberal arts college with a large physics program. I agree 100% on the evils of the passive voice, but I know where it comes from: our chemistry department. [Comment #15 makes that same observation.] They insist on it. In contrast, I insist on simple declarative sentences that state (in the correct past tense) that a specific quantity was measured, giving a specific result. [Comment #10 gives a nice example of good and bad ways of saying the same thing.] Of equal interest is how a number of comments came from composition teachers who regularly fight the same strange view of what makes good academic writing in their classes. Maybe the problems start in high school!

I also like Chad's emphasis on framing. In effect, that is what I do by making them start out by stating the most important result, whether it is a measured value or the verification that energy was conserved to within 10%. A good abstract tells you the important result(s) in a "Just the facts, ma'am" style, much like the headline / sub-headline sequence in the NYTimes.
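
For the energy example, that headline statement rests on nothing more than a percent-difference calculation; a toy version, with invented numbers, looks like this:

```python
# Invented numbers for a cart-on-a-track run: initial and final mechanical energy in joules.
initial_energy = 2.45
final_energy = 2.28

percent_change = abs(final_energy - initial_energy) / initial_energy * 100
print(f"Mechanical energy was conserved to within {percent_change:.0f}%.")
# -> Mechanical energy was conserved to within 7%.
```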

2 comments:

Anonymous said...

I don't yet have any experience in marking lab reports (although I think I will get to experience that eventually). Just one comment about passive voice. My thesis advisor brainwashed me into (almost) never using passive voice.

Doctor Pion said...

Not quite on topic, but it is about writing and applies to some annoying things in lab reports: Sherman Dorn writes about teaching peeves related to teaching the history of education. I'm guilty of the parenthetical tangent myself.