Wednesday, July 13, 2011

Students as Customers

Nothing is simple these days. So it is with teaching and learning in higher education. Given the current furor in Texas over Governor Rick Perry's proposals to "make college more efficient," I thought I'd give my take on the subject and in this way deliver a post I promised a while back, writing about what I learned from my teaching evaluations. Illinois and Texas are sufficiently similar universities - very large, public, flagship campuses of their state systems, highly regarded research-oriented institutions - that I believe what I have to say has relevance there. There are differences, of course - Austin is also the state capital while it's not quite a hundred miles between Urbana and Springfield, Texas is a red state while Illinois is a blue one, Texas has been better in the revenue-producing sports - but the similarities outweigh the differences.

First, let me tick through a bunch of obvious points. It is virtually impossible to have an honest discussion of these issues in a highly visible public forum because the subject is politically charged and there is fear that if you give an inch they'll take a mile. Given our national politics, the fear is rational. My blog is barely visible these days. So I can afford to be more forthcoming.

College Purpose

On the core objective of college from the student point of view, one can give at least two defensible yet different answers: a) getting the degree and b) learning something of importance while a college student. There is yet a third possibility that may be less defensible but more realistic: c) blowing off steam and learning to be responsible for oneself. Obviously these don't preclude one another, but it does matter where the emphasis lies, especially for defining what it means to be doing the job well.

Competing for Students

The bulk of the undergraduate students are from within the state, and for them there are no close substitutes of comparable quality at approximately the same tuition. So to a large extent these students (and their families, if the parents are paying tuition) are "captured." The situation is quite different for out-of-state students, and the competition for those students is much fiercer.

Measuring Performance

The NCAA maintains a searchable database on graduation rates built from the Department of Education's IPEDS-GSR data. If the data are used primarily to benchmark a sports team's graduation rate against its own institution (better than the institution, same as the institution, etc.), that *may* constitute a reasonable use of the data. Using the data for cross-institution comparisons, however, runs into the standard problem of confounding inputs with outputs. It is well understood that there is a high correlation between the standardized test scores used for admission and graduation rates. Since more selective institutions admit students with higher test scores, graduation rates are apt to be high at highly selective institutions. Both Illinois and Texas fall into this category. That is an input effect. Comparing graduation rates across schools with comparable standardized test scores would come closer to comparing outputs. However, schools with higher performance standards will have lower graduation rates, other things equal. For example, at Illinois the washout rate from Engineering is pretty high (many of those students transfer to other colleges at Illinois). So performance standards must also be controlled for if graduation rates are to be used this way to measure the effectiveness of the education. The main takeaway from this section is that the measurement problem is hard, and people tend to trivialize it.
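To make the input-versus-output point concrete, here is a minimal sketch in Python, using made-up numbers rather than the actual IPEDS figures, of one way to adjust the comparison: fit graduation rates against an admissions-selectivity proxy such as the median test score, then compare institutions by their residuals rather than by their raw rates.

    import numpy as np

    # Hypothetical data: median admission test score (an input proxy) and
    # six-year graduation rate (the output) for a few made-up institutions.
    schools = ["A", "B", "C", "D", "E"]
    test_score = np.array([1100, 1200, 1250, 1320, 1400])
    grad_rate = np.array([0.55, 0.68, 0.70, 0.82, 0.90])

    # Fit a simple linear relationship between the input and the output.
    slope, intercept = np.polyfit(test_score, grad_rate, 1)
    predicted = intercept + slope * test_score

    # The residual is the part of the graduation rate not explained by
    # selectivity; comparing residuals comes closer to comparing outputs.
    residual = grad_rate - predicted
    for name, raw, res in zip(schools, grad_rate, residual):
        print(f"School {name}: raw rate {raw:.2f}, adjusted {res:+.3f}")

Even this adjustment only handles the selectivity input. A school with tougher performance standards will still look worse on the residual, so it is at best a partial fix.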

University Revenues

The share of the cost of education borne out of general tax revenues has been declining for quite a while, while the share covered by tuition has been rising. In this sense, what historically was a publicly provided good has been becoming more and more private. This is a source of mismatch in expectations between the university, on the one hand, and the students and their families, on the other. (Where the tuition increase simply offsets declining revenues from the state, the university is no richer, so internally it may see little reason to change its practice.) In my opinion, this is the biggest argument for the university to become more "customer centric," though as mentioned in the Competing for Students section, the bulk of the students are captured. Thus the reason to become more customer centric has more to do with politics than with economics.

Difference between Faculty and Campus Perspective

Cash is not the only input a student must supply. Diligence and perceptiveness are also necessary in the pursuit of their studies. In the abstract, who could disagree with this? The issue is what to do when those are not immediately forthcoming. One can envision two distinct responses - one is to be very demanding of the students, the other is to accommodate the reality of actual student performance. Over the years I've personally struggled with this issue (see the last paragraph on page 2 for a recent recounting), favoring the very demanding approach though it was out of sync with what the department wanted. In my capacity as an administrator, I've observed the reverse for adjunct faculty, who have dumbed down their exams and raised their grade distributions because their jobs literally depend on getting adequate student satisfaction. Let me conclude this section simply by noting that there is a clear tension between giving students what they want and certifying their performance.

* * * * *

With this general background, let me turn to my teaching evaluations, and here I will focus on the intermediate microeconomics course only. These evaluations were administered in the last class session before the final.

In the design of the class I took what might be termed "a discovery approach." One can see this in the spreadsheets I designed, each with Excelets intended to illustrate the economics, where students could perform little experiments by manipulating the parameter values and seeing how the graphs changed, and where the more industrious and mathematically inclined students could go several steps further and reverse engineer what was being constructed by looking at the cell entries from which the graphs were plotted as well as the formulas that generate those cell entries. There were also screen movies, to help the students get started with the spreadsheets and to explain the underlying economics. With these inputs, the idea was for the students to explore on their own and learn from that. There were homework problems in Moodle to help the students assess their understanding. If they didn't get those right they could redo them. There were other problems on the exams, which, because that's how we administer tests, were one and done.
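For readers who haven't seen an Excelet, the interaction is roughly as follows. This is a minimal Python/matplotlib analogue, not the actual course materials (those were Excel workbooks), and the supply and demand curves here are made up: a slider plays the role of the parameter cell, and the graph redraws as the student moves it.

    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.widgets import Slider

    # Quantity grid and simple linear inverse supply and demand curves.
    q = np.linspace(0, 10, 200)
    supply = 2 + 0.8 * q

    fig, ax = plt.subplots()
    plt.subplots_adjust(bottom=0.25)          # leave room for the slider
    demand_line, = ax.plot(q, 12 - q, label="demand")
    ax.plot(q, supply, label="supply")
    ax.set_xlabel("quantity")
    ax.set_ylabel("price")
    ax.legend()

    # The slider stands in for the parameter cell in the spreadsheet;
    # moving it shifts the demand curve and the student sees the effect.
    slider_ax = plt.axes([0.2, 0.1, 0.6, 0.03])
    intercept = Slider(slider_ax, "demand intercept", 8.0, 16.0, valinit=12.0)

    def update(val):
        demand_line.set_ydata(intercept.val - q)
        fig.canvas.draw_idle()

    intercept.on_changed(update)
    plt.show()

The reverse engineering that the more industrious students did corresponds here to reading the update function and the formulas behind the plotted curves.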

The description in the previous paragraph represented a compromise necessitated by what I call do-ability. I was making a good bit of this content only a few weeks before the students would use it. So I needed to get it done, and with that I needed a method I could rely on to produce completed product in a timely manner. I had come up with an alternative method, which I called the Dialogic Approach, with spreadsheets designed to simulate Socratic dialog: "small" questions interspersed in the presentation that the students must answer in order to proceed. The questions would be posed in some short-answer format so the responses could readily be evaluated for correctness. The dialogic approach gives more direction to the students on how to proceed and is less open ended than the discovery approach. I have a couple of workbooks done in this approach that I made quite a while ago. My recollection is that each took about a month to make, much of that time spent conceptualizing how to present the material. That was just too slow, so I didn't make dialogic content this time around. But my sense is that in moving forward with this stuff I should make more dialogic content, because that way the students get more direction, which is desirable, and my time now is ample.
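The gating is the essential part of the dialogic idea. Here is a minimal sketch of the mechanism in Python rather than in a spreadsheet - the actual workbooks were Excel, and the two questions below are invented for illustration: the presentation only advances once the short answer to the current question is correct.

    # A minimal sketch of the question-gated progression behind the dialogic
    # approach. The real materials were Excel workbooks; this stand-in only
    # shows the mechanism, and the questions and answers are made up.

    steps = [
        ("If the price of a good rises and the demand curve is unchanged, does "
         "the quantity demanded rise or fall? (rise/fall)", "fall"),
        ("Marginal cost is 4 and the market price is 6. Should a price-taking "
         "firm produce one more unit? (yes/no)", "yes"),
    ]

    def run_dialog(steps):
        """Pose each small question and require a correct answer to proceed."""
        for prompt, answer in steps:
            while True:
                response = input(prompt + " ").strip().lower()
                if response == answer:
                    print("Right - moving on.\n")
                    break
                print("Not quite - think it through and try again.\n")

    if __name__ == "__main__":
        run_dialog(steps)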

I designed this content with the aim that the course would eventually be taught in a blended format. It was very much a work in progress last spring and, further, the course hadn't been advertised as blended. So we met for the regular time last semester - 80 minutes twice a week. The above describes about half the course content, what I refer to as the analytic side of the course. There was also a narrative side, based on readings the students would do and blogging about the readings. Some of the readings were essays I had produced, there were a couple of journal articles by famous economists, and we also read Heilbroner's The Worldly Philosophers. The essays and journal articles lined up, more or less, with the spreadsheets, so they could be seen as a narrative counterpoint to the analytic theory. The Worldly Philosophers did not line up much, beyond the early chapters on Smith and Ricardo. I intended it to work from a different vantage. The neoclassical theory that buttresses the analytic approach is essentially static. Even when time is explicitly accounted for, that is done with perfect foresight. Likewise, when uncertainty is considered, the model assumes all possible contingencies can be described in advance. The classical economists, however, had an essentially dynamic conception. Statics is fine where it simplifies the presentation without changing the message. However, when it does change the message one should be suspicious of it. So I wanted to provide the students with some skepticism about what they were being taught. I also wanted them to become familiar with some of the great economists.
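To make the perfect-foresight point concrete, consider the stock two-period consumption problem - a textbook illustration, not something drawn from the course materials - in which the household chooses current and future consumption knowing the interest rate and future income with certainty:

\[
\max_{c_1,\,c_2}\; u(c_1) + \beta\, u(c_2)
\qquad \text{subject to} \qquad
c_1 + \frac{c_2}{1+r} = y_1 + \frac{y_2}{1+r}.
\]

Time enters explicitly, but because r and the future income y_2 are known in advance, the whole plan is chosen once at the outset and never revised. That is the sense in which the model remains static.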

On the analytic side, I knew things were not working well before getting the course evaluations, because the midterm scores had been quite low. One part of the solution to that poor performance, something I wrote about near the end of the semester, was for the students to attend extended office hours and talk through their questions and misunderstandings. The vast majority of students didn't do this during the semester, and those who did it early on were among the better students in the class. This clearly needs to be a part of any revised version of the course. The question is how to get it done broadly.

But there is an additional question I didn't ask earlier, which is how much it is reasonable to expect students to progress on their own with the discovery approach. If they don't get very far with the approach, is it because they don't know how to proceed or because they aren't trying? It's not hard to imagine a vicious cycle where each begets the other. The teaching evaluations confirmed there were big problems. In those the students made my ability to explain things the culprit. Many of the evaluations that had comments said something to this effect - he's a really smart guy but he's a really bad teacher. Since the comments were so consistent I have no doubt about the sincerity of the students who made them. Their expectation, apparently, is that if I explained well then they'd be well prepared to do the exam problems. I'm perfectly willing to admit that I'm hard to understand on occasion, either because I go too quickly through the hard parts or because I don't illustrate them well. I do wonder, though, is that the whole story? Even if I explain things well, don't the students still need to figure it out for themselves?

Though there were a few Business students who did well in the class, the majority of the better performers were from disciplines that require a lot of math. What about the non-math students? The course is meant for them too. Yet they don't find the material welcoming. This had been my experience 10 years ago when I last taught the course. Indeed much of my experience this spring parallels earlier teaching experiences, including when I first started 30 years ago. The issues around my intelligence, my inability to explain, and making the course too hard were with me then. Since that time I've learned a lot about using the technology and motivating learning, much of which I tried to incorporate into the course. But these core issues have not been resolved. If anything, by trying this "new approach to teaching" (new relative to how intermediate microeconomics is taught and to how most of their other courses are taught as well) I brought out the old issues in their raw form, unmasked by the variety of accommodations that invariably creep into teaching so the course isn't perceived to be too bad or too hard.

With one or two exceptions, I did not test on the readings and I didn't have essay questions on the exams. The blogging counted for a good chunk of the course credit (40%). In my mind it was an end in itself. The students, however, didn't see it this way. Their comments indicated they view all work outside the classroom as instrumental - preparation for the exams. My approach didn't comport with their views, and they were clearly angry that it didn't, particularly that they had to read Heilbroner. Perhaps if they had been adequately prepared for the exams they would have reacted differently to the readings.

Yet the students weren't critical of the analytic content. They took it as a given that they should "learn" that stuff, even if they couldn't relate it well to other things they were being taught in their business courses, and even if a few months later most of this content will have been forgotten. The anger expressed in their evaluations was about not feeling prepared and not getting a high grade.

I've cooled off since I received these evaluations a few weeks ago. Because I read and commented on all the blog posts, I definitely put in more hours on this course than on any other I've ever taught. My ego was bruised by the evaluations and my immediate reaction was, why bother? But with a little more distance I can see the results are indicative of some larger issues - with required courses like intermediate microeconomics that act as a barrier for students to move on, with undergraduate education overall and which of the purposes above students see as theirs, and with experimenting with teaching approach in an individual course in a culture that otherwise does little such experimenting. And I can see that my own aim should be less grandiose and more concrete.

I've not gotten back to making economic content this summer. Having gotten this post out of my system, I'm almost ready to resume.
