Wednesday, December 15, 2010


My blog friend Dean Dad has a couple of bees in his bonnet that rival those of much older, semi-senile faculty. One of them is his obsession with the Credit Hour, even though (at other times) he worries about articulation and the transfer of course "credit" from one institution to another. He reminds me of one college that got rid of grades to foster creative risk taking, only to discover that no one wanted to hire its students because it took too much effort to evaluate their individual portfolios. That college no longer exists.

His latest version of the argument, in a blog post last month, is to blame the credit hour for a lack of productivity gains in academia:

Third, we've defined what we do in a way that defeats productivity improvements. We measure learning in units of time. Until we stop doing that, no amount of efficiency-tinkering will make enough of a difference. A three-credit class required forty-five hours of seat time thirty years ago; it still does.

Actually, it doesn't. We only require 42 or 43 hours of seat time (plus final exams) at our college. Students now have 28 weeks of class time to learn the same physics content they once had 30 weeks to learn. You see, we measure learning in units of chapters in physics books, and engineering schools expect the same prerequisite knowledge they always did. AFAICT, everyone deals with this by cutting back somewhat on topics that students never learned anyway, but that does not help the overall learning cycle for the core material in the course.

Side comment: Somehow the same has not happened in math. Calculus takes longer now than it did when I first started teaching it, although that change might be making up for the lost weeks I mentioned above.

Besides, this is all a red herring. Productivity is not about what one student learns; it is about the cost of producing that learning. Productivity is the difference between one professor running a tutorial for a single student and one teaching an 80-student lecture/discussion class. And there, I know my productivity has increased significantly in just the past decade because my annual enrollment has grown by more than 50%. That means a lot more money is paying for my time, which is all that matters for the college's bottom line.

Side comment: An assistant chairman in my distant past also argued that failing students was a way to increase productivity in the department. It enabled them to squeeze twice as much money out of the same amount of learning. This doesn't always work, of course, because repeating students can displace other students who might be more likely to learn the material and pass the class.

And my productivity has increased because, in the days of smaller enrollment, I was also teaching labs. That is three hours of my time that generates only one credit's worth of income from a small number of students, rather than three credits' worth for twice as many students -- a factor of SIX in income for my time! Using an adjunct instead of a tenured professor has lowered our cost there while freeing me to generate more income for the college.
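The factor-of-six arithmetic can be checked on the back of an envelope. In the sketch below, the per-credit tuition rate and the class sizes are illustrative assumptions, not my college's actual figures:

```python
# Back-of-the-envelope check of the "factor of six" claim.
# All numbers are illustrative assumptions, not actual figures.

TUITION_PER_CREDIT = 100  # hypothetical dollars per credit hour, per student

def income_for_three_hours(credits, students):
    """Income generated by three contact hours of instructor time."""
    return credits * students * TUITION_PER_CREDIT

lab = income_for_three_hours(credits=1, students=12)      # 3-hour lab earning 1 credit
lecture = income_for_three_hours(credits=3, students=24)  # lecture, twice the students

print(lab, lecture, lecture / lab)  # the lecture brings in 6x the income
```

Three times the credits, times twice the students, is where the factor of six comes from; the tuition rate cancels out of the ratio.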

One complaint I don't understand is
Fourth, unlike almost every other sector except health care, we have to invest in technology even when it doesn’t improve our own productivity.

No, you don't. That is a cop-out. Managers like you did not have to replace blackboards in every classroom with SmartBoards and projectors without doing any study to see if they improved learning. (By the way, that is not a one-time capital expense. Projector bulbs are expensive and projectors wear out. There is also more security required because of a significant theft problem.) Indeed, they didn't even do a study to see if increased use of PowerPoint might reduce learning!

He also blames tenure, although he is actually blaming a seniority-based pay system rather than tenure. You can have tenure without automatic pay increases, and you might need step pay increases without tenure to keep your best people. Besides, as I alluded to above, one way every college has increased productivity is the use of contingent faculty, particularly at universities, where the savings are the greatest. I say this because the fraction of classes taught by adjuncts at my college has been stable for a long time at about 50%. (I am counting classes rather than people for a good reason: we have a significant number of adjuncts who only want to teach one or two classes.) I think this is possible because the salary disparity is not as great as at universities.

And productivity gets harder to define when you shift from a Community College environment (where he and I work on the teaching side) to a Research University environment (where I used to work on the research side). Is a professor's time better spent in the classroom generating credit hours or in the lab generating grants with overhead and jobs for students that help support enrollment? I think we all know that the answer to the last question is "yes" at an R1 institution, where it even includes the creation of non-teaching faculty positions that exist solely to bring in additional contract dollars.

And that last detail is why I think, in another article from last month, Dean Dad completely misses the point made by Historiann. Historiann is at a Wannabe Major University, just the sort of place where managers do profit (pay increases and jumps up the job ladder) by shifting resources to areas that are more likely to bring in research grants that generate more "overhead" (indirect cost recovery) and more administrative positions. There isn't much (make that ANY) value to the university if Historiann publishes another book. There is a lot of value in the 40% that gets siphoned off of a grant, and even more if the professor's salary and benefits and all other expenses (office, light, heat, staff support) can be charged to the grant while an adjunct with no benefits and few of those expenses teaches hir class that semester.
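To make the siphoning concrete, here is a rough sketch of that ledger. The 40% indirect-cost rate is the figure from above; the grant size, salary, and adjunct pay are hypothetical round numbers, not any real university's books:

```python
# Rough sketch of why a funded grant beats a published book on the
# university's ledger. The 40% indirect-cost rate is from the post;
# every dollar figure is a hypothetical round number.

INDIRECT_RATE = 0.40

direct_costs = 250_000                   # hypothetical grant direct costs
overhead = direct_costs * INDIRECT_RATE  # indirect cost recovery the university keeps

professor_cost = 120_000  # hypothetical salary + benefits, charged to the grant
adjunct_cost = 15_000     # hypothetical adjunct pay to cover the teaching

# Overhead comes in, the professor's costs move off the university's books,
# and only the adjunct's pay goes out.
net_to_university = overhead + professor_cost - adjunct_cost
print(overhead, net_to_university)
```

A book advance, by contrast, contributes zero to either line, which is the whole point about where managers will shift resources.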

I have little doubt that what I just wrote is "far removed from any reality I [Dean Dad] can recognize", but it is a reality I am very familiar with.

But I would not put all of the blame on the managers who made it happen. Many faculty are complicit in the expansion of the university research enterprise because their lives are devoted to research and graduate and post-doctoral education. It is an unfortunate reality that history cannot compete with biochemistry at this game, and tight budgets will push money to where it creates the most return for the people managing the university.


Anonymous said...

When reading Dean Dad's comment about having to invest in technology, I got the impression he was talking about computer labs and such for students, not fancy blackboards.

FrauTech said...

The technology thing really hits home for me. In high school they made a big deal about adding a bunch of computers and having us all do "research" on them. Then in college, I was at a strictly blackboard institution. Yes, there were projectors with PowerPoint, but most of my classes were taught with chalk and board, and the value I gained was often better than with PowerPoint. They've started using WebCT here and at the local community college, but I don't see much of a benefit to it. I agree it's nice to have computers in school for kids who don't otherwise have access to them, but often the teaching of technology is so poorly done, and a lot of fancy things are brought in that are entirely unnecessary. My community college has these projectors that are like the old transparency ones except with a camera, so you can put just a regular sheet of paper there. But I fail to see why that's any better than bringing up a PDF on the class computer!