Measuring by number

Question: Rate the extent to which the course evaluation program has been effective in measuring a teacher's performance.

You probably won't find this question on a course evaluation card, but it's running through the minds of the program's committee members, past and present. The ASUU Assembly has frozen the CE Committee's proposed $10,000 budget for the coming year. They want committee chairman Glen Wentworth to come up with some answers on how to improve.

He has his hands full. For one thing, processing more than 20,000 cards each quarter is expensive. Last year, use of the computer cost $5,000 and the programmer's salary was more than $3,000.

In the past, the committee has tried to make up some of the expense through the sales of course evaluation books. They haven't sold like hotcakes. Former committee chairman Ed Catmull believes the books are being purchased by one student, then passed on to another. Wentworth would like to do away with the book altogether and substitute department handouts. He would also like to post all CE results in the library.

Course evaluation enjoys a good track record. Barbara Croft, another CE committee veteran, says the program has had a 70 percent participation rate, the highest of any activity on campus and perhaps the highest of any course evaluation program in the country.

More importantly, committee members seem convinced that teachers are impressed by student ratings. Long-time committee member and computer programmer Joluut Vanderhooft says he receives calls from teachers who both swear by and at the ratings. He still thinks "a lot of the faculty are genuinely interested in improving their teaching and the evaluations are a source of input for them." However, he's quick to add that "printing a name and number is not enough. You can't break down a man's profession and life's work to a single digit number."

Giving a teacher's performance a numerical value is not the committee's purpose. According to the most recent CE book, "The course evaluation committee does not claim to be establishing an absolute criteria (sic) for teaching competence." Another former chairman, Bob Ingebretsen, says the results merely point out how students evaluated a teacher. He says interpreting the data should always be left up to the departments.

Is course evaluation a popularity contest? Ingebretsen doesn't think so and believes students know whether they got anything out of a class. He would like to see a study done over a longer time period to see if student opinion would change drastically.

Do the questions themselves have a tendency to affect the results? Not drastically. General Education Dean Oakley J. Gordon did an analysis of 36 course evaluation questions and concluded that only four basic factors were being rated: 1) satisfaction with the course and instructor, 2) student ability and effort, 3) adequacy of examinations and 4) the instructor's tolerance for independent ideas.

Almost all committee members interviewed agreed that the program should be in a state of constant change. "Course evaluation can't be static," says Ingebretsen, because "it's impossible to please everybody." He believes the program should move away from a University-wide evaluation and focus on more specialized studies. Croft agrees, but wants survey groups (General Education, Honors and some departments) to conduct their own surveys on the same computer form. Catmull thinks there are too many surveys.

In the middle of the heap is Glen Wentworth. As chairman for the coming year, he will be required by the Assembly to itemize his budget request and find a way to link course evaluations with the workings of student advisory committees and college councils.

This is ironic, since many committeemen believe course evaluations were a catalyst for greater student participation in departments and colleges.