7/23/2018

Students Strike Back!

I was pretty much a “career student” for the first 40+ years of my life. After a dismal Midwestern K-12 and community college experience, I had given up on education by the time I was 19. I moved to Dallas, Texas for a computer programming school in 1967 and, after that experience and investment turned to crap, I ended up attending El Centro Community College in downtown Dallas, mostly to clean up some disastrous grades I’d received when I dropped out of my awful hometown college to go on the road with a rock and roll band. For the first time, I experienced overwhelmingly competent and well-versed college instructors, and I was hooked. From 1968 until 1991, I attended evening classes in community and 4-year colleges everywhere I lived: Texas (2 schools), Nebraska (3 schools), and California (4 schools), plus two correspondence schools.

As a “non-traditional student” (someone who does not attend school full time and during the day), I was subjected to a wide range of educator talents. The absolute worst was a Calculus I & II instructor at the University of Nebraska, Omaha, a native German who had learned to speak English as an adjunct instructor in Pakistan. His whole classroom plan was to copy the formulas from our textbook to the blackboard. His tests were so old that the mimeograph (remember those?) contrast was reduced to slightly darker blue on light blue paper. That didn’t matter to the Offutt cadets or to the frat brats, because the instructor hadn’t changed his tests in at least 10 years and they knew all the answers before they entered the classroom. Most of the Air Farce guys just wrote the answers on their papers and didn’t even pretend to know how to display their “work.” But the best instructors were people who made a mark on my life forever, both from the information I received in their classes and from the role models they provided as leaders and classroom managers.

By the time I got to California and discovered that transfer credits are as arbitrary as the weather, I needed to become more efficient in my course and instructor selection. Because I lived and worked fairly close to Orange Coast Community College in Costa Mesa, I decided to reboot my attempt at obtaining a bachelor’s degree there. I quickly discovered that the range of instructor quality was all over the place, from amazing to depressingly awful. Since I lucked into a friendship with one of the great instructors and a business relationship with one of the decent instructors, I started milking those sources for information about who was who at OCCC. That was helpful, but even more helpful were the opinions of the better students I met in my classes. By the time I transferred to Cal State Long Beach (CSULB) in 1988, I had a process for selecting instructors and courses:

  1. Late in the semester, I began asking classmates and other students about instructors and courses that were “possibles” on my next semester’s class schedule. I documented those opinions so I wouldn’t have to rely on memory when it came time to register for classes.
  2. As part of that student opinion gathering process, I created my own course/instructor evaluation questions, since the questions the schools ask are designed by instructors to obtain minimal criticism and to keep the answers meaningless and neutral.
  3. I shopped for academic advisors, looking for someone who might actually be honest about classes and instructors. This was only marginally useful, but sometimes not totally worthless.
  4. Whenever possible, I volunteered to be the student proctor for course evaluations. That gave me the time and access to see what other students said about a class and instructor I’d just experienced, which let me weight the opinions I would receive from Step 1.
  5. Finally, and most importantly, I let my gut drive my participation in a class at a level I have never before or since allowed. Initially, because of the cost (money and time) of school, I made my whole decision to stay or leave on the first day of class. Since CSULB’s add/drop date and the associated financial penalties were pretty lenient up to the end of the 2nd week of class, I sometimes held off making that decision until the last moment. However, if I disliked the instructor or the material on the first day, I dumped the class like a wet handful of poison ivy. In my last year, CSULB instituted an Undergraduate Withdrawal Limit that was punitive (now it is a total of 18 units over the course of a CSULB student’s undergrad career), and I got stuck with a couple of instructors and courses that were among the worst I’d ever suffered.

Today, there are options other than all of the work I put into my course and instructor selection. The best—and most reviled by academics—is RateMyProfessors.com, a website born in 1999 and still barely known to or used by college students. College instructors are all over the map on whether they think those reviews are “fair” or not. Of course, their biggest bitch is the loss of control. Faculty unions do everything possible to protect incompetence and corruption among their ranks, proving, again, that self-regulation is a libertarian wet dream. It never works. In an Inside Higher Ed essay, “How To Fight RateMyProfessors.com,” James Miller wrote, "The cure for bad information is better information." He followed that with, "There’s a lot of unhappiness among college faculty members about RateMyProfessors.com, a Web site containing student ratings of professors. Many college students use it to help pick their classes. Unfortunately, the site’s evaluations are usually drawn from a small and biased sample of students. But since students usually don’t have access to higher-quality data, the students are rational to use RateMyProfessors.com. Colleges, however, should eliminate students’ reliance on RateMyProfessors.com by publishing college-administered student evaluations."

Instructors who read this typically replied with snarky nonsense like this brave and anonymous prof's whine, "The only person who rated my outstanding colleague was a whining, lazy, slow-witted student who thought the professor was too demanding in requiring her to show up to class and to be prepared for class. The professor's many bright, good students have too much respect to post comments, good or bad, on a shoddy, inadequately managed, and poorly designed whine post for weak students. The professor now has a very low score, thanks to maintaining some modicum of academic standards. Any reputable study or survey would require a minimum number of subjects or sources before publishing data. Letting one sour apple tarnish a fine professor's reputation online is irresponsible, unethical and dishonest!" There are obviously so many “irresponsible, unethical and dishonest” delusions in this response that it probably ought to be republished with the instructor’s name.
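To be fair, that anonymous prof stumbles onto one legitimate point: sample size matters. It just cuts the other way; small samples argue for more reviews, not for killing the site. Here is a minimal back-of-the-envelope sketch (Python), assuming RateMyProfessors' 1-to-5 scale and a guessed standard deviation of 1.2 for individual ratings (my assumption, not a measured figure), showing how the uncertainty around an average rating shrinks as reviews pile up:

```python
# Illustrative only: the 95% margin of error around an average 1-5 rating.
# The standard deviation (1.2) is an assumed figure, not measured data.
import math

ASSUMED_SD = 1.2  # guessed spread of individual ratings on a 1-5 scale
Z_95 = 1.96       # normal-approximation multiplier for a 95% interval

def margin_of_error(n_reviews: int) -> float:
    """Half-width of a 95% confidence interval for the mean rating."""
    return Z_95 * ASSUMED_SD / math.sqrt(n_reviews)

for n in (1, 5, 10, 30, 100):
    print(f"{n:>3} review(s): average rating is +/- {margin_of_error(n):.2f}")
```

With a single review, the uncertainty is wider than the entire rating scale; by 30 reviews it has tightened to roughly plus or minus 0.4. That is exactly why one sour apple tells you nothing, and exactly why the fix is more reviews, not fewer.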

In my last decade at McNally Smith College, our Faculty Committee had so watered down the student evaluation forms that they had become useless to any instructor who wanted actual student input for future class work. For one semester, I re-introduced my own course evaluation from almost 30 years earlier. It didn’t seem appropriate for me to directly review the results; I am fairly good at identifying handwriting and style, and that defeats the whole concept of anonymous student course reviews. Ideally, instructors would receive a summary of the evaluations and wouldn’t be allowed near the actual forms. The fact that most students already do not trust the course evaluation process (and shouldn’t) means that most students just check the boxes and leave the comments sections blank. A not-insignificant percentage of students don’t even check the boxes for fear of being identified. Characters like “another southern prof” would be fine with that: “TMP is whining about how I was unfair or boring and nasty personal comments (which are taken off, while the numerical ratings of those who did this are left up). &#*@ RMP.”

A New York Times article, titled “The Prof Stuff,” postulates, “But like many online experiments, Rate My Professors has turned out to be a companion to nothing. It is its own world. Sure, hot, easy teachers get the laurels traditionally denied them by tenure committees who have that fetish for credentials and scholarship.” Actually, an adult (non-academic) reading of the reviews would demonstrate that many students are more concerned with getting value for their education dollar than with easy grades.

However, it is absolutely true that, "The top professors on Rate My Professors, after all, are not the top professors in the nation. Rather, they’re the top professors on RateMyProfessors.com." Unfortunately, there is no other way for students to rate prospective courses and instructors and, by design, no useful way for school administrations to evaluate instructors. The usual rubric is “student retention,” a measure of how much money the instructor puts in the school’s bank account by being fast and loose with grades, attendance, and participation. Otherwise, most school administrators would just as soon not be bothered with the “classroom crap.” They are busy inflating their salaries and padding the departments with layoff fodder.

One article on this subject, “Should We Stop Asking College Students to Evaluate Their Instructors?”, was so irrationally biased and uncritical of academic corruption that it was hard to decide whether the article or the whining profs were less sympathetic. “If this sort of customer satisfaction survey works for your car insurance salesman, why wouldn’t it work for a teacher? For a long time, many academic researchers thought that these evaluations were a good thing; by the 1970s, evaluations were widespread in academia. Surely, the argument went, students could distinguish between a punctual and prepared professor, and the chaotic and disorganized instructor. . . research showed that teachers could increase student evaluation scores by simply smiling more or being more enthusiastic.” In other words, if an instructor is interested in the subject matter and creates an environment friendly to active learning, that instructor receives "unfair" preference from students. Amazing. Students can be so shallow. Worse, "More recent research showed no consistent pattern and many studies showed that student evaluations were riddled with biases." Those damn students are like every other human being on the planet? Unacceptable.

Someone called the NUWildcat wrote, “The easiest way to get high marks from the students is to give them good grades, regardless of their actual performance. The effect of that is that the students, getting good grades, think they are really proficient in that course. By inflating their grades, the students don't have to face the reality that there are some areas where they are weak and perhaps should change their majors.” It’s easy to make a claim like that, but difficult to prove. Lucky for profs, they can get away with claiming silly shit and calling it “proof.” Another equally prof-biased Chronicle of Higher Education article, “Why We Must Stop Relying on Student Ratings of Teaching,” claimed that a "study also showed that ‘a male instructor administering an identical online course as a female instructor receives higher ordinal scores in teaching evaluations, even when questions are not instructor-specific.’ Kristina Mitchell, one of the study’s authors, summarized its findings in Slate last month and concluded: 'Our research shows they’re biased against women. That means using them is illegal.'" Typically, no real evidence was provided, other than a minimal study description, to justify that claim. Of course, the study does not prove that student evaluations are “biased against women.” It might prove that students (male and, possibly, female) are biased against women, though. Careful analysis might even find that women are less likely to approve of a woman instructor than men are. What do you do with that information? The next thing a reasonable person might ask would be, “Is there a reason students are inclined to be biased against taking classes from women?” Mitchell’s response falls solidly in the “shoot the messenger” category. Most of this propaganda is academia trying to protect itself from quality standards. I cannot generate much sympathy for that.

There was one area of the Chronicle of Higher Education article that I totally agreed with: “Student evaluations have also become less reliable over the years because most institutions have switched to online systems. In 2016 the American Association of University Professors released a comprehensive survey of faculty members about teaching evaluations, which found that . . . the rate at which students were filling out evaluations has gone down precipitously in the electronic age.” Not just students, but everyone.

Tools like SurveyMonkey have allowed data collectors of all sorts to delude themselves into believing they are collecting useful information. From the first day I started teaching at MSCM, I begged my students not just to review my classes on RateMyProfessors.com but to add as much good and bad information about my class materials and presentations as they felt might be useful. Over 13 years, I had a total of 3,000 students in my classes. All of that regular promotion and nagging got me 33 RateMyProfessors reviews. As a member of the faculty senate, when I attempted to survey the faculty about issues as important as salary, required hours of instruction or office hours, or curriculum changes, I was lucky to get a 10% response from the faculty. When the Minnesota Motorcycle Safety Center went from paper reviews handed out while the license paperwork was being handled to emailing a SurveyMonkey link, responses dropped from 100% to less than 20%. The problem isn’t that student reviews are only marginally complete and useful. The problem is that electronic surveys need to be tied to something the reviewers want and/or need.
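For the record, the arithmetic behind those numbers is worth lining up side by side. A trivial sketch (Python); every figure comes from the paragraph above, nothing here is new data:

```python
# Response rates quoted above; no new data, just the arithmetic.
students = 3000   # students over 13 years at MSCM
rmp_reviews = 33  # RateMyProfessors reviews collected in that time

print(f"RateMyProfessors, 13 years of nagging: {rmp_reviews / students:.1%}")

# For comparison, the rates quoted for the other channels:
print("Faculty senate surveys: ~10%")
print("Minnesota Motorcycle Safety Center, paper forms in person: ~100%")
print("Minnesota Motorcycle Safety Center, emailed SurveyMonkey link: <20%")
```

A 1.1% voluntary response rate after 13 years of active promotion is the whole problem in a single number.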

During that short period in the early ’80s when American companies still had the vitality to manufacture products and the management capability to do so, a basic rule of quality management was that “any paperwork generated has to benefit the people who do the paperwork.” If colleges and instructors are ever going to be responsive to students’ needs and interests, it will be because there is a feedback system from students to instructors and academic mismanagement. Obviously, the worst instructors want that system to be totally under their control. Administration bureaucrats are also mostly driven by their laziness, so their motivation to improve the educational quality of the facilities they mismanage is tempered by the fact that doing so would require work from . . . them. That leaves students with one remaining outlet with which to provide unwanted, unread feedback to the schools stealing their time and money and a warning or recommendation to future students: RateMyProfessors.com. Until that changes, teachers will continue to whine (and even sue!) and students will probably continue to be too lazy to use the only resource they have for avoiding lousy instructors.
