Dean Dad picked up on my article from a few days ago and wrote a great followup to my followup, linking it to an e-mail request for help and an anecdote of his own. If you didn't see it, go read it now (as well as the IHE version) along with the comments. There are good ones on both sites, several of which deserve additional remarks.
Since I already posted some comments as CCPhysicist on the original DD blog, I figured I should shift over here before getting carried away in his comments section.
First, I'll include the same back link to my early writings on the "concept of prerequisites". Those came in my first wave of academic postings three years ago when I started the blog. I find that early article interesting to read because some of my views have evolved since then as I have studied the issue further (sadly, I think some of that is unbloggable). I should also link to the article where my readers and I came up with the idea of saying basics rather than prerequisites when talking to students. However, the one area that gets more and more of my attention is the role of K-12 testing, as mentioned in the comments on DD's blog. Those concerns, first mentioned here, have strengthened every time I talk to students about their pre-college experiences and compare the current generation of students to ones who didn't grow up in that testing culture.
But enough of that. Let's get to the new stuff.
Dean Dad told this story:
I recall a student I tried to advise at Proprietary U. He was several semesters into his program, and he was choosing classes for the following semester. I mentioned that course x was next in the sequence, and required for his program; he objected that it covered a software package he didn’t know. I responded that the software package was covered in the class he was currently finishing. His response, which haunts me to this day: “but that was over a month ago!” His tone suggested that I was being completely outlandish; he was just mannerly enough not to end with “duh!”
I found this fascinating because the student was still in the class that taught the prerequisite material! Presumably he still had a final exam to take, but maybe that is presuming too much about how they do things at Proprietary U. More likely it was a module on one programming tool that was tested with projects and the like before moving on to the next tool.
But I take some exception to DD's conclusion:
Some of that is just a cost of doing business. Memory can play weird tricks. .... But it’s also true that thoughtful course sequencing -- which presupposes both thoughtful curricular design and steady academic advisement -- can provide reinforcement of key skills.
precisely because the student was still in the class teaching that new skill. You see, not only didn't the student know the new programming language or tool, the student didn't know it was going to be used in the next class in what I assume (from the story) was a clearly defined sequence for a "workforce" type program like ones that my CC has. I see this as an oversight by the instructor, although it could very well be the fault of the university if the instructor was a part-time adjunct who was not even aware of the curriculum. (Why else would Prof DD, given what he says about his academic background, be advising a computer science student rather than the instructor?)
What would I recommend in this case to a colleague? First, that the subject of this programming language should be introduced by identifying when (meaning both the future classes and semesters, but also the career types) it would be used. I recommend something similar to my calculus colleagues when they introduce limits to students who "just" want to learn derivatives, and do something similar at certain key points in my physics course. Second, maybe the exam on that language should include questions about where it will be used. Hey, that is an idea for my physics class! Third, don't just say it the first day. Say it at least every week, much as I use the "this week in lab" or "next week in lab" observation to link what we are doing (or did several weeks ago) to our lab class.
Several comments made explicit reference to the known fact that it is always easier to relearn something than to learn it the first time. I know this quite well, but that is not the problem I am talking about here. (Hey, I too forgot lots of things along the way, so I frequently use the prompting/review example technique Cherish wrote about in the comments. Ditto for what Ivory and Lisa wrote, as well as Dean Dad's HS lab partner. I'll come back to a few of those later, since I think they are worth emphasizing just for my own future reference.) The problem I am talking about is when students have allegedly learned something several times and still don't have a grasp of it. My favorite example (listed in one of my previous articles linked up above) is the logarithm. Widely used as an essential computation tool in pre-calculator days, it remains an essential tool because exponential behavior (and, hence, exponential functions) is so common in nature. But students don't seem to really get it until the fourth time around.
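For readers who came of age after calculators, the "computation tool" remark refers to the standard logarithm identities, which turn multiplication and powers into addition and multiplication; these are also exactly the properties students struggle to retain:

```latex
\log(xy) = \log x + \log y, \qquad
\log\frac{x}{y} = \log x - \log y, \qquad
\log x^{n} = n\,\log x
```

Slide rules and log tables were nothing more than mechanical implementations of the first identity.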
We first teach it in college algebra, and I have seen the test questions used as well as typical final exam questions, so I know the skill level in that class. We teach it again in a pre-calculus class, where (based on the principle described above) they should just pick it back up and move on to new applications. Yet I have seen students struggling well past the end of an exam period on a pre-calculus exam that mostly contained questions just like the ones in the college algebra class. That part of the class was effectively starting from scratch. However, the ones who survive that class and the log integrals in calculus seem to have learned it when I give a pop quiz on them before starting RC circuits. The fraction that survive that sequence, however, is not large. I don't think it is an exaggeration to say that the lack of even partial retention plays a key role in our retention problems in math.
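The RC-circuit pop quiz is a fair test of that retention precisely because the discharge law comes straight out of a log integral. A sketch of the standard derivation, where a student shaky on logarithms gets stuck at the third step:

```latex
% Discharging capacitor: Kirchhoff's loop rule gives IR + Q/C = 0, with I = dQ/dt.
\frac{dQ}{dt} = -\frac{Q}{RC}
\;\Longrightarrow\;
\int_{Q_0}^{Q}\frac{dQ'}{Q'} = -\frac{1}{RC}\int_{0}^{t} dt'
\;\Longrightarrow\;
\ln\frac{Q}{Q_0} = -\frac{t}{RC}
\;\Longrightarrow\;
Q(t) = Q_0\, e^{-t/RC}
```

Inverting the log to get the exponential in the last step is exactly the skill first taught back in college algebra.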
Gordon McAlister mentions Problem Based Learning in a comment on the IHE version of DD's blog. Although I have an aversion to Three Letter Acronym solutions to all that ails us, I tend to note that all of physics and math is problem based. The trick is what problems you choose, and what problems you put on tests. My speculation that the student in DD's anecdote was in a class built around modules comes from my experience teaching physics. IME, the worst retention results from a class where the material is tightly compartmentalized. You know, where a student taking Test 4 asks "is this like what we did on Test 2?" Every test should be part "Final Exam" in the sense of sampling key older ideas. Some of the best math profs (in the sense that I love having their students in my physics class) do this on a regular basis, and I do it also.
Ivory posted a link to this critique of the mini-PhD approach to the construction of a syllabus. Yes, this is part of the problem, and it is fascinating to see a familiar problem from physics addressed in the context of a history course. It is long, but all of it (along with the comments) is worth reading. Now we don't have the political baggage they do when deciding whether the Doppler Effect is worth our time (or an exam question) compared to some other worthy subject, but it is the same problem. Clutter obscures the essential.
For me, this is a work in progress, but I will state my criteria: will someone else expect them to know this topic, or is it one where they will be expected to look up the equation that applies to a particular problem and plug in the values? Is it a skill or is it a factoid? Will their BASIC skills get better if I go a bit deeper and challenge them in a familiar area or if I take up this new topic at a very shallow level? I think the answer is that we have to deal with the reduction from 15 weeks of classes (plus exams) to 14 weeks by dropping some things that used to be thought essential. However, I am always quite up front in telling my students that I am not skipping it because no one needs to know it.
Ivory also pointed to an abstract that describes one of those Increasingly Common Five Letter Acronyms (that also need a few lower case letters) for a teaching technique. It looks to me like this was used in a course that was originally modular (if this is Tuesday, it must be Botulism). This is something that is a lot easier to do in a course like physics, and is almost identical to what a math colleague does on his calculus exams. What I find interesting is the idea of making it explicit to the students that you are doing this: that is, that you value retention of a specific subset of the earlier material. Not by talking about it, but by testing on it.
Definitely something to think about in a survey course where this is rarely done.
But you know something? The humanities courses where I really retained the material (and that make visiting museums a joy) were ones where there was a unifying theme in the interpretation of disparate items. That made you look for patterns as new things showed up, and LOOKING is the first step to real learning. You don't learn if it just washes over you like a rogue wave.