Showing posts with label assess. Show all posts

Friday, July 19, 2013

Learning Outcomes and Assessment

Interesting story in IHE today about mandatory assessment of learning outcomes in Iowa. What I found most interesting, however, was the vast amount of naivete among those posting comments. Folks, this is coming to a college near you whether they pass a law or not. It is already a requirement for every college and university that wants its accreditation reaffirmed in the southeastern US, and in many other regions from what I can gather. It was, for example, prominently mentioned as an issue at San Francisco City College. I'm not sure if we are in the second or third wave of colleges required to do it, but we are early enough that I can understand why this is still news to the majority of faculty.

My opinion, having done this for a while, is that it is a useful exercise. I've learned a lot about what my students learn, even if it is relatively short-term learning as reflected at the time of the final exam. My main objection is to how we report our data, but I keep my own version as well since that can inform future development of the course. They want the results for every student in the class, but IMHO it is a mistake to lump failures in with those who passed. To me, it is a good thing when students who failed the class did so because they failed to achieve most of the expected outcomes defined for the course. And I don't think it is a failure of the course if they didn't learn because they chose not to attend class, do the homework, or participate in active learning exercises in class. (They think I don't notice when they are playing on their phones, but I know exactly why person X could not do the kinds of problems that dropped him from an A to a B.) I'm most interested in what was missed by students who fail (to help them learn those topics) and what was missed by those who pass (ditto). In both cases it can be quite surprising to see what they learn from specific subsets of the course.
PS - The significant extra work required to develop these and collect the data is one reason that I have not blogged very much lately.



Saturday, January 22, 2011

Forcing the horse to drink

You can lead a horse to water ... but you can't make hir drink.

Did you know I was talking about "Academically Adrift"? (IHE articles here and here point to the main story lines.) That's right, I'm talking about the folks who don't realize that you can send a kid to school ... but you can't make hir learn.

I've commented about this book twice, and was ready to comment again when I decided it was time to put those comments here in my blog.

Just for background, it first came to my attention in a January 18 comment on Dean Dad's blog that pointed to a column by someone from “The Hechinger Report” in the Sacramento Bee. That led to the following observation:


That article about Critical Thinking was interesting until we got to the inevitable “decline of civilization” comments from various experts. How can you compare the results of an Unprecedented One-of-the-First Study with guesses about college life 30 or 50 years ago? You can’t. I’m pretty sure we spent 50% of our time socializing in the dorms back in the good old days.

I’m impressed that almost two thirds !! of the study group made significant gains in critical thinking, even if the journalist chose to emphasize the negative side of the same results.

And I was stunned to learn that students in “business, education, social work and communications” showed the least gains, not to mention that “students learned more when asked to do more”. Who would ever have imagined such a result?

When Sherman Dorn questioned the reliability of the assessment tool used to generate data for the book (and later decried the overgeneralized conclusions drawn from it), I ventured the opinion (on the former article, slightly rewritten here) that:

Commentary based on perceiving a single data point as if it were a century long time series does not show any critical thinking on the part of the commentator.

I would also add that I think I would not have shown any significant improvement in my critical thinking skills after 4 years of college because I entered college with spectacularly good ones due to my high school experiences. I’m sure I improved, but not beyond the uncertainties of an instrument like the one they used.

The best article, however, is the one Chad wrote in Friday's Uncertain Principles. What makes it the best is, of course, that he agrees with me that it is no surprise that 1/3 of all students in college are coasting. Well, that and he links to reports on the actual study the book is based on.

I'll add to his comments that the headline findings are based on an evaluation instrument that Sherman Dorn, a professor of the history of education and an expert on assessment in K-12, thinks is not suitable as an accountability measure.

Further, I'd guess that his students at Union and his fellow students at Williams are and were, like me, likely to have scored pretty high on the "critical thinking" essay they were given and thus less likely to improve. You have to push such students REALLY hard if you want them to improve their already high-level skills.

And I'll also add that I use a very good textbook (Wolfson's "Essentials" for calc-based physics) that runs about 22 pages of reading per week for the first semester (Mechanics and Thermodynamics) and 18 pages for the second semester (Electricity and Magnetism and Optics) -- a total that includes the textbook problems. Yet few students in any random sample at any university would trade that (not to mention homework assignments that are thick with critical thinking challenges) for 40 pages of a history book.

The worst article, however, is the NYTimes interview with one of the authors. There we discover that he actually thinks surveys and a single essay test -- likely taken without any academic (grade) or monetary (continued employment) motivation -- measure learning, for example, marketable engineering skills. What a load of narrow-minded crap. As if the only job in the world is writing a sociology book.

And I take any claim that grades have been inflated to a Gentleman's B in my classes as a personal insult. My students know better.

But I will agree with the conclusion that challenging students makes them better. What bothers me is that someone who is a professor emeritus at NYU thinks this is news. This has been known for centuries and was obvious to me since elementary school.



Tuesday, October 26, 2010

Assessment

Not sure if I have much to say about this particular story, but it definitely deserves mention here because of my "promise" to engage in questions about the A word in response to an excellent article by Dr. Crazy last month.

An article today in IHE asks the musical question Why are we assessing? (I know why we are -- our accreditor insists on it -- but that only begs the question.)

Sorry, nothing to see here for the moment. Well, not nothing. This particular observation

We now have a number of intriguing published instruments although, for many, evidence of their quality and value remains a work in progress.
from the article definitely deserves flagging. Are they really saying there is no "there" there? That no one knows if there is any value in the institutional effort we have started? Sure sounds like it.

Fortunately, we have a functioning system at our college, so faculty have been given the lead to design assessments that make sense in each general area (composition, math, science, history, etc.) and for different courses within that area. Agreeing on what is Really Important has been an interesting exercise, as has the process of comparing how each of us might assess a particular item in our own courses. We don't often talk about tests, or about different ways of testing or grading, so that has led to an interesting conversation that will continue for years.

I think what we are doing will have value to each of us, even if it proves worthless on a cross-institutional level to the ed bureaucrats.

