Stop Paying Professors to Teach

Two people graduate from high school. Both are deemed adults by their society: they can drive, vote, rent an apartment, join the military, go to jail, get married. (Paradoxically in the US, they can’t drink alcohol legally.) They can reproduce. One of these two people goes to college; the other takes a job. The one who goes to college pays; the one who takes a job gets paid. They’re both learning something new, by means of written learning materials, guidance from masters in the field, discussion, self-study, and practice. After a year the college student has 3 more years to go until graduation; the one who took a job can do work at a professional level of competence.

Two people graduate from college. One goes to graduate school; the other takes a job. The grad student neither pays nor is paid; the one who took the job gets paid. They’re both learning something new, by means of written learning materials, guidance from masters in the field, discussion, self-study, and practice. After a year both the grad student and the one who took a job can do work at a professional level of competence.

Adults are capable of learning on their own, without having to pay professionals to teach them. They can read, think, discuss, ask questions of those who know more than they do. Teachers can be beneficial to adult learning, having greater expertise than the students. Teachers can also infantilize their adult students by perpetuating dependency relationships firmly established in childhood. Quantitatively or qualitatively, it’s not clear whether adult students learn better among themselves or under the supervision and tutelage of an expert.

Adult students can be useful to experts. Beginning by performing menial tasks, students can rapidly acquire the skills and knowledge required to attain some core competencies in their professors’ areas of expertise, even becoming active contributors to the professors’ research. It would seem a fair trade for the professors to guide the students’ learning in exchange for the students’ labor as research assistants. What about students who want only an overview of several academic disciplines, without gaining competence as a practitioner in those disciplines? Give them a set of standard reading materials and a forum for discussion and let them have at it. Or an advanced student who is attaining competence in that field can teach — just as grad students often do now.

I don’t see why college shouldn’t be organized like graduate school. No money changes hands. Students learn, gradually attaining competence in doing work in the fields they study. Professors do research, benefiting in their labs from the work their students gradually learn to perform. It’s still not quite as good a deal for the students as taking a paying job, but it’s better than having to pay their own money — or their parents’ money or the taxpayers’ money — to do something they could do on their own or in a cooperative exchange with professors that’s mutually beneficial.

Why isn’t college organized this way? In part it’s because a college degree is widely regarded as an entry requirement for higher-status, more enjoyable, better-paying jobs, and so spending the time as full-time learners eventually pays off financially. Also, there’s the intrinsic value of learning for its own sake, a value for which there is no financial reward. But why should students have to pay for these opportunities to learn and to advance their career prospects, rather than just putting in the time and effort? In no small part it’s because college professors don’t get paid enough to do the work in the fields of research in which they’ve attained expertise. They have to support themselves financially, in full or in part, by teaching. The government, which at one time covered the teaching expenses for college students just as it still does for primary and secondary school, no longer pays the full cost. That could have meant fewer professors handling larger class sizes in order to keep education free for the students. Instead it means that the students have to pay more and more out of their own pockets to cover their professors’ salaries.

In order to preserve this source of income, the professors and their administrators sell the value not just of advanced education but of the college degree. Adults can learn on their own, individually or collectively, but they cannot bestow a degree on themselves or on one another. Only universities are authorized to dispense this particular credential. If you want it, you’re going to have to pay for it.

Increasingly, areas of work expertise are transformed into areas of teaching specialization, while universities are transformed from research centers into adult education schools. In American liberal arts colleges and community colleges, the professors might not even be expected to do research: they’re paid to teach and only to teach. Maybe it’s always been this way. But does it have to stay this way? Let the taxpaying public finance the research that gets done in the universities (and recoup the return on investment, if there is any, rather than handing it over to industrial investors). Let the learning take care of itself.


  1. Sam Carr says:

    A lot of thoughts go through my mind. The system as it stands really is lousy.
    One practical problem is that those who seem able to be successful as researchers, i.e. the ones who get grants, are usually adjudged to be the worst teachers. In many colleges teaching and research almost function as mutually exclusive compartments. Earn money for the institution by getting grants OR earn it by teaching.


  2. ktismatics says:

    I see your point, Sam, and agree that your proposal would almost surely constitute an improvement over the status quo. One doesn’t really need a Ph.D. or an active research program in order to teach the accumulated content and methods of most academic disciplines. High school teachers might even be better teachers of university courses: not only have they studied the scholarly topic they teach, but they’ve spent time learning how to teach.

    On the other hand, I’m concerned about perpetuating a system in which education is increasingly deemed a consumer good that teachers produce and students consume. I’ll have to get back to exploring this thought a bit later, however.


  3. ktismatics says:

    For decades the cost of a university education has increased at twice the rate of inflation, with the burden to pay falling increasingly on the students. Industry increasingly shapes university education by moving course offerings toward more applied fields, thereby shifting what had been its own training costs to the taxpayers and the students. Employers insist that applicants hold university degrees for higher-paying jobs, even if the degree bears little relationship to the job requirements, in part because the degree serves to prequalify job applicants as having at least above-average intelligence and perseverance. Only universities can bestow a degree, and to get one the student has to pay four years’ worth of tuition into the university’s coffers.

    Students can teach themselves, individually and collectively, without a teacher. There are no widely recognized standards, either qualitative or quantitative, by which to evaluate either teaching excellence or student proficiency at the university level. Students could achieve the equivalent of a college degree on their own, without having to pay the exorbitant price tag. What stands in their way? The tacit complicity between industry and the university: industry will continue insisting on the degree, which pays the professors’ salaries, while the university continues providing job training and prequalification services for industry.

    Maybe one of the reasons the universities don’t want to standardize curricula or learning accomplishments for the courses they offer is that they’re afraid students will figure out how to meet those standards on their own, without paying the university’s ransom. In the US it’s possible to get a high school equivalency without graduating from school by studying for and passing a standardized learning test. This might be a good thing at the higher education level as well.

    In speculating along these lines, I’m not trying to bring about the demise of the universities. Instead I’m suggesting that learning and teaching be re-incorporated into the doing of intellectual work. Students would be regarded as novice producers of this work, rather than as consumers of the knowledge generated by the intellectual workers and supplied to them by professional teachers. Students would be paid as novices, learning while also contributing to the work, which is how industry works. Failing that, and failing a change of heart on the part of taxpayers, students would self-study, with no money changing hands until they’ve acquired enough expertise to get paid for their work.

    I understand that the current situation presents certain opportunities to the entrepreneurially inclined educator. Suppose, for example, that medical transcription companies stop offering on-the-job training to their new employees, instead requiring all new workers to be precertified transcriptionists. Universities don’t teach this skill. This gap between supply and demand creates an economic opportunity for for-profit schools to offer training that qualifies the student for the required certificate. The student pays for training that formerly the employer paid for. The employer and the training school agree on the certification requirements. Perhaps the students must even borrow money in order to pay the school’s fees. Hopefully for the student the investment pays off. But the training cost has shifted from the employer to the employee. This shift contributes to the long-term trend whereby education costs go up but real inflation-adjusted wages remain stagnant.


  4. ktismatics says:

    For the preceding example I think that the medical transcription company should pay the cost of training if the trainee agrees to work for that company after being certified. I suppose if the transcription company pays transcriptionists not as employees but as piecework contractors, then the company has yet another way of cutting costs at the expense of employees. In such instances, of course, employment opportunities may open up for people who might otherwise be excluded from relatively higher-paying work. This is how the multinational companies exploit differences in labor conditions, improving economic conditions incrementally for some at the expense of others, while always pocketing the profit on the difference themselves.


  5. Carl says:

    I’ll have to think about this. The first thing that’s clear to me, though, is that most academics do not do research that would be worth a living wage alone. Whether there’s enough of such research to go around and it would be stimulated by the change you’re proposing is another question.


  6. ktismatics says:

    Based solely on the amount of public money currently set aside for research, I’m sure you’re right, Carl: there would be a drastic reduction in post-secondary academic jobs if my proposal were suddenly to be enacted. Similarly, if only as many students were admitted to college as could be financed by public funds, without any supplemental tuition expenses charged to students, there would be a lot fewer people attending college than is currently the case. Many private liberal arts schools — maybe the one where you teach, and maybe the ones our high-school daughter rates highest on her list (she doesn’t agree with my proposal) — would go out of business.

    If all the public money that currently goes for paying professors’ salaries were earmarked for research and none for teaching, then more research would get done in our society than is currently the case. And if the learning-teaching component of the universities were handled co-operatively by the students themselves, then everyone with the time, intelligence, and effort to put into it would be able to obtain a free college education. Both would be good results in my book.

    If the undergrads’ education included participating, apprentice-like, in the professors’ research programs and doing their own research, then there would be a further multiplier effect on the amount of new work getting done across the academic disciplines. How much did your undergraduate study of history give you a head start in progressing through grad school? Probably not very much, if your experience was like mine in psychology. You pretty much start from scratch as a new grad student, yet within fairly short order you’re equipped with the rudimentary knowledge and tools to start actually doing the work of a historian, or psychologist, or whatever. I think undergrads could do it too.

    Here’s a question I posed at Perverse Egalitarianism but to which I’ve not yet received a response: How does one get tenure at a liberal arts college? Is it based primarily on teaching, or on research like at the universities?


  7. says:

    I believe there’s a name for someone who’s studied philosophy without the exacting formal training reading alone can never provide: Alain de Botton.


    1. ktismatics says:

      Hey info@, I wonder if I should pick up de Botton’s Proust for Dummies book, inasmuch as I’m now partway through In a Budding Grove. In this post I’ve focused on undergraduate education, but I see on Wikipedia that de Botton has been awarded a master’s degree in philosophy from King’s College London. Based on your email address I presume you have something to do with that very department. I see from the profs’ open letter to Leiter that the administration has relented in closing down philosophy at King’s, at least for now. Maybe de Botton pulled a few strings on behalf of old alma mater? I doubt that the Middlesex affair will end as amicably, as the administrators there seem bent on being the biggest assholes possible. Again, I’m all for continued government funding of university research, not just in the more industrially useful fields like engineering but in the arts and humanities as well. And I’d define “research” liberally, to include works like novels, paintings, films, flashmobs…


      1. says:

        Let’s put it like this: there’s precisely nothing in his published work to indicate the kind of analytical and lateral-mindedness that regular seminar attendance would demand/inculcate. The same Wikipedia page also claims that “he began a Ph.D in French philosophy at Harvard University” – a truly bizarre notion, given the nature of the faculty.

        The Middlesex decision and its subsequent handling were so brazen and reckless (why slaughter one of your few cash-cows?), and so heedless to the threat of boycott by humanities departments both in the UK and abroad, that the future for Middlesex humanities as a whole looks extremely bleak.


      2. ktismatics says:

        I wonder whether de Botton might have met the academy’s demands and received its inculcation but set them aside for the sake of marketability. This is of course the risk/reward calculus of presenting oneself as a “public philosopher”: crap sells. That’s one reason why I’d like to see works of art and literature and film supported by public funds and expert endorsement: to make it more likely that excellence can at least survive, even if it doesn’t get rich.


  8. Shahar Ozeri says:

    Sorry, I actually thought I responded to this at PE: How does one get tenure at a liberal arts college? Is it based primarily on teaching, or on research like at the universities?

    As far as I know, it varies a bit. A good many, but not all, liberal arts colleges weigh teaching more heavily than research universities do. I think this is good. Yet, when assessing research and service for promotion and tenure, I think there is a tendency to follow what the research uni’s do. I think that most of those highly selective colleges will look for some kind of “substantial publication” or significant research “cooked up”, and want to see some indication, however nebulous, of continued scholarly activity in the future. At the top research uni’s, I think the actual amount of research required at the time of “tenuring” is a bit larger, but at times the tenure clock ticks for a longer period.

    As for community colleges, well, that varies too. Many cc’s have tenure and promotion timelines of around 4 years, with teaching weighed the heaviest (upwards of 50-60%) and service and professional development rounding out the review (the latter may include research, but it’s not in one’s contract). At other cc’s one is an at-will employee.


  9. ktismatics says:

    Thanks Shahar. As I mentioned in one of these recent education-related posts, I’m neither a teacher nor a student, and we have a daughter who’s about to finish high school and is looking at colleges. So my orientation to these issues is surely different from yours.

    Given that most liberal arts colleges function primarily as teaching institutions, it makes sense that tenure would be predicated on teaching excellence. High schools don’t award tenure, but salaries are pegged to years of teaching experience, even though there’s no empirical evidence supporting teaching improvement after about the first 5 years. And the union contracts typically protect teachers with seniority when it comes time for administrators to let teachers go for budgetary reasons. So there are parallels in employment arrangements between secondary and higher ed.

    I think loyalty to the job should be recognized with loyalty in return; I’m just not sure that the pay differentials should be so great. Lifetime job security ought to be worth something in its own right, without necessarily getting paid 2 or 3 times as much as the equally competent but less well-entrenched colleague for doing the same job. As things currently stand, high schools and colleges carry a lot of high-income talent on the payrolls, putting increasing financial burdens on taxpayers, parents, and students.


  10. Carl says:

    Sorry I can’t join in the de Botton bashing, it looks like fun.

    But I do have relevant gossip and anecdata to share, not to mention unsupported, self-serving opinion. Having just sat on the tenure and promotion committee and applied for promotion at my little regional U, I can report that teaching is pretty decisive for tenure (except when it’s not, that is, when teaching is marginally adequate but research and service are spiffy), research is approved-of in a field-dependent way (some of our more ‘professional’ fields are not very researchy) and that we’re working on raising the bar for full professorship. Anecdatally, I did not get that promotion despite being the current Professor of the Year and doing loads of service; presumably because I’m in a field where research and publication are common and I’m not doing much.

    No biggie. Our pay scale is low and really flat – there’s like a $500 raise for promotion, there’s maybe a few thousand dollars a year separating our oldest and newest departmental members. And tenure is a courtesy title, our faculty manual/contract specifies that we still serve at the will of the Trustees, although there is certainly a strong presumption of continuance. I don’t think we’re unique among smaller privates in any of this. Because it’s really not a lot more expensive than adjuncts, we generally hire tenure-track, at least in the humanities, but we always hire way behind the need curve. I’m currently working on a proposal to create a teaching fellowship for first-year post-docs that would get them the teaching lines they need on their vita to be in play for permanent positions while getting us a rotation of teachers at the lower end of entry-level compensation.

    On reflection, I think there are two problematic elements to your proposal, John. The first is systematic: shifting faculty compensation to research would severely damage the humanities, because most of what we call ‘research’ is actually puttering around with pet projects in a large system of mutual collusion that wouldn’t withstand a moment’s scrutiny if anything was actually at stake. I would venture to say that the cash value of the humanities is simply not in its research; there is no historical discovery, for example, that makes enough practical difference in the world to justify a living salary. Of course research is of some intrinsic value, and may have long-tail effects in terms of a general cultural commitment to getting things right, but assigning a market price to this would be vanishingly difficult. Our academic value is rather in transmitting and translating the noble and status-bearing acquisitions of the human spirit to new generations, which brings me to the second problem.

    I’m a fairly good teacher, and it’s not because I sit back and let the students find stuff out for themselves. They just don’t. By the time I get them, they’re pretty thoroughly habituated to passive learning. So much of the sweaty work of my daily grind is patiently unwinding those habits, which come in a bewildering variety and density. Some of my colleagues help, some hinder, for all the reasons Adorno talks about and more. But the basic point I want to make is that in the current system, teachers are needed to invite students to learn, show them how, encourage them to keep trying when it’s not as easy as sitting back and getting spoon-fed, and so on. I love the romantic notion that you can just show students good stuff and they’ll get to it. I can get some of them there if they’re willing and I really work at it. But at the moment, at least, it’s not a functional model.

    So obviously we need to start from the bottom and encourage kids’ curiosity and responsible self-discovery right from the start! But there are some things in the way of this, starting with the fact that the latent function of our schools is not to enlighten independent learners but to discipline dependent workers. Including the numbskulls who cycle back into the system as primary-school teachers.


  11. says:

    I wonder whether de Botton might have met the academy’s demands and received its inculcation but set them aside for the sake of marketability. This is of course the risk/reward calculus of presenting oneself as a “public philosopher”: crap sells.

    Well I think the difference between de Botton on the one hand and, say, Baggini and Warburton on the other, is that the latter have some sense of the cost of populism in terms of analytical rigour, whereas de Botton’s a writer who just happens to adopt certain philosophical themes. When asked “Do you still use skills that you picked up at university?”, rather than referring, say, to the power of discrimination that comes with the painstaking analysis of arguments, he replies: “Doing university essays taught me the valuable skill of condensing a huge amount of material in a minimal amount of time.”


  12. ktismatics says:

    Sorry to hear about not getting that promotion, Carl: I hope you have a dog to kick or something. I’m confused about something though: first you say “I’m in a field where research and publication are common,” then you say “most of what we call ‘research’ is actually puttering around with pet projects in a large system of mutual collusion that wouldn’t withstand a moment’s scrutiny if anything was actually at stake.” So would your chances for tenure go up if you puttered around with collusive pet projects like the rest of the gang? Your description of research makes me skeptical about the academic rigor of history as a discipline. Maybe you could write up some Gramscian and Taylorite stuff about your mobile sculptures, call it research, get it reviewed by some interdisciplinary panel, and submit it to the tenure committee next year as your Project.

    To be clear, I’m not calling for assigning market value to humanities research. I’m looking for excellence, largely as defined by the experts in the field. And I’d like the public to honor that commitment to excellence with public funding. Let industry pay for market-value research, since they’ll reap the rewards. As a citizen I’m more interested in tax dollars encouraging “the noble and status-bearing acquisitions of the human spirit.” I presume this does not exclude empirical work out of hand, since there’s a lot of science that advances knowledge in ways that don’t generate high-tech gizmos or other ROI vehicles.

    “teachers are needed to invite students to learn, show them how, encourage them to keep trying when it’s not as easy as sitting back and getting spoon-fed”

    I believe this is what primary and secondary pedagogical improvement is working on. Whether it’s possible to turn the faintly odious schoolmasters or the Miss Trunchbulls into effective educators I’m not sure. I might write a post about teacher effectiveness research before I move on from teaching as a topic. I see that in these tight economic times the job market for educators at all levels is very competitive, which means the primary and secondary schools can be more selective in new hires. However, there just aren’t many openings to fill.

    I will say that the best teacher I had as an undergrad was a history professor. He would walk into the lecture hall, fill the chalkboard with an outline without looking at his notes, then deliver a great lecture that had virtually nothing to do with the chalked-up outline. No discussion; just mastery of content. I learned a lot in that class, though I’m sure I’ve forgotten most of it. He did provide an excellent public demo of what mastery looks like.


  13. Carl says:

    I was interested to discover that Mr. Active Learning himself, John Dewey, was a straight lecturer. If I remember correctly, he thought that at the top levels of instruction, demonstration of mastery was what the students needed. Of course it’s easy to see that tip over into doctrinaire inculcation and power trips. But in my experience you don’t reach many students that way, only the ones who are set up right for it; and that tends to get into class reproduction, which is another can of worms.

    The promotion thing was mildly disappointing, but no big surprise. After all, I was a member of the committee that was tightening those standards, so I had a good idea what was coming. And yes, there are many opportunities to publish shiny crap that will do the trick for promo, along with rigorous studies of marginal interest and interesting optional takes on familiar material, such as my dissertation. About these things I think in terms of the Erasmian conversation: much of the business of high intellection is done by stimulating interaction, so what and how do we have to contribute to that? In the old days there were few participants, widely separated, and writing standalone books made some logistical and technological sense. I don’t think those conditions obtain any more, but old habits die hard. On this theory I submitted offprints of that ANT/Gramsci conversation at Dead Voles for Rethinking Marxism to the committee, but apparently it was not taken seriously as a work of scholarship and may well have had quite the opposite effect.

