We have reviewed the "Report of the Senate Faculty Affairs and Personnel Committee on Charge S-0109, Best Practices in Assessment of Teaching." In response to your request, we will address the resolution's contents point by point.
Recommendation 1. Wording for student comments on reverse of form:
We will have these words removed from the Student
Instructional Rating Survey forms in future orders from National Computer
Systems, which prints the forms for us. At present we have a stock of
approximately 150,000 forms, which will be exhausted by the Spring 2003
term. We can fully implement this recommendation by Fall 2003.
Recommendation 2. Peer Review and Mentoring:
This recommendation applies directly to academic
departments. We will offer workshops to interested departments in
the design of mentoring and peer evaluation systems. As the resolution
suggests, we will prepare guidelines and procedures for this peer review
and mentoring.
Recommendation 3. Teaching Portfolios:
As we have been doing for the past eight years,
Dr. Devanas will continue to offer workshops to schools, departments and
faculty on the development and use of Teaching Portfolios. Cook College
has already implemented the Teaching Portfolio for all of its faculty members.
We will contact the other Deans in New Brunswick and discuss with them
ways for their faculty to adhere to this recommendation from the Senate.
We believe University-wide use of the Teaching Portfolio to be the most
effective and cost-effective method of improving teaching we can follow. We
enthusiastically support this recommendation.
Recommendation 4. Departmental storage of information:
This resolution states that each department should
keep on file all of the information on the Student Instructional
Rating Survey forms for each instructor, for each section for each course
for at least ten years. There are a number of ways to comply with this
recommendation. One is for the departments to store all the evaluation
forms per instructor per course per term for ten years. For some
departments this would be a massive task given the volume of forms used
per term by some departments, e.g., psychology and math. Though the
data from the forms is saved in the scanned data files and processed in
the result sheets mounted on the web, it is the comments on the back of
the form that create the problem. The simplest way to save the information
contained in the comments is to save the forms themselves. Other
alternatives include re-typing the comments from the forms into a computer
file, or scanning the comments.
The recommendation of the Senate clearly states
that this is the department's responsibility. It is our belief, though,
that for many departments it will be very difficult or impossible to comply.
In the long run, a cost-effective solution would be to find another way
to record the student comments, rather than on paper, so that they can
be more easily stored.
Recommendation 5. Candidates for promotion should list SIRS ratings:
The TEC has sent to each department head a complete
list of SIRS ratings results per instructor per course per term since the
inception of the system. We also send back with the scanned and processed
forms a results sheet for each instructor for each course for each term.
However, chairs and instructors routinely lose this information.
This fact was a major motivation for putting the SIRS data on the web.
Therefore we will put on the SIRS web site data from Fall 1995 to the present
so that all of this information is available to those who need it.
Since Summer 1996, we have surveyed the courses offered in summer in New
Brunswick and Camden, through Summer 2001. We have not been asked to post
SIRS Summer results on the University's web site and have not done so.
We have no plans to do so either.
Recommendation 6. Creation of a SIRS Database:
This recommendation says that any faculty member,
chair or dean could send a query to a database, managed and maintained
by the TEC and receive an immediate response. Here is an example:
I make a request on-line to the SIRS Database for my score on Question
9 in all courses that I have taught over the past ten years. Immediately,
a list of all of my scores for Question 9 is produced for every course
I taught over the past ten years.
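As a rough illustration of the kind of query the Senate describes, the sketch below builds a small in-memory table and pulls one instructor's Question 9 scores across courses and terms. The table name, columns, course names, and scores are all hypothetical; this memo does not specify how SIRS data is actually stored.

```python
import sqlite3

# Hypothetical schema; the real SIRS data layout is not specified in this memo.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sirs_ratings (
    instructor TEXT, course TEXT, term TEXT, question INTEGER, score REAL)""")

# Illustrative sample rows only.
sample_rows = [
    ("Gigliotti", "Econ 101", "Fall 1995", 9, 4.2),
    ("Gigliotti", "Econ 101", "Fall 1996", 9, 4.5),
    ("Gigliotti", "Econ 320", "Spring 1997", 9, 4.1),
    ("Gigliotti", "Econ 101", "Fall 1996", 1, 3.9),  # different question, filtered out
]
conn.executemany("INSERT INTO sirs_ratings VALUES (?, ?, ?, ?, ?)", sample_rows)

# The query the recommendation describes: all of one instructor's
# scores on Question 9, for every course and term on file.
rows = conn.execute(
    "SELECT course, term, score FROM sirs_ratings "
    "WHERE instructor = ? AND question = ? ORDER BY term",
    ("Gigliotti", 9),
).fetchall()

for course, term, score in rows:
    print(course, term, score)
```

With such a table in place, the "immediate response" the recommendation asks for reduces to a single indexed lookup rather than a search through paper results sheets.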
This recommendation is very significant for the
structure of the Teaching Excellence Center. The creation of a database
for each faculty member containing student course rating scores and summary
statistics, available to each faculty member, department chair and dean,
on request, is a monumental undertaking. To create a data structure
to house the vast quantity of information in the SIRS system for all instructors
is a considerable project all on its own. The maintenance and management
of this database, especially the need to keep it up to date, is an enormous
task too.
We suggest the following:
1. By making available on the web, the results
from all past SIRS starting Fall 1995 and constructing a search engine
that can be used by faculty, chairs and deans, we can approximate the functions
of a true database. For example, I could search the SIRS website
for all summary statistics sheets for "Gigliotti" and quickly find all
of my results sheets from 1995 to the present. Then I could take
from those sheets my scores from any of the ten questions that I was interested
in.
2. The TEC will begin to experiment with a
new processing system that will, in one step, create from the scanned
form data a database that can be queried as the Senate intends.
When this is complete and thoroughly tested and piloted with selected departments,
we can scale up to a University-wide database. Such an online database
would replace the current system of creating summary statistics sheets
per instructor per section of a course per term. Instead, students,
faculty, chairs and deans could simply query the database for any information
they need. Such a plan will take approximately two years to implement.
Recommendation 7. Distribution of raw scores and summary statistics to each department:
The TEC has always distributed all summary statistics,
term by term, to each department chair as mentioned above. We have
also distributed the raw data as scanned from the forms to any department
or dean who has requested it. Physics and Math have often requested
this information and performed their own analyses of it. Cook College
has done the same for many years.
To make the distribution of raw data and results
summary sheets more efficient, given the large volume we have now been
asked to distribute, we will create a password-protected web site for department
heads to use to download their summary statistics and raw data each term.
This site will be open to their use via a TEC-provided password for two
weeks near the beginning of each term for accessing data from the previous
semester. After that time the site will be closed, but data can be
made available by request to the TEC Director.
The TEC has created an Excel program that will allow
departments to process their own raw data to make any of the comparisons
suggested by the Senate or that the department deems useful, e.g.,
normalized means. We will distribute this program in Fall 2002 to
all departments and provide training and assistance in its use.
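To illustrate one comparison of the kind the Excel program might support, the sketch below computes a plausible "normalized mean": a section's mean score expressed relative to the department-wide mean for the same question. The memo does not define the formula, so the calculation and the sample scores are assumptions for illustration only.

```python
# One plausible definition of a "normalized mean": the section's mean score
# divided by the department-wide mean for the same question. The memo does
# not define the formula; this is an illustrative assumption.

def normalized_mean(section_scores, department_scores):
    """Ratio of a section's mean score to the department's mean score."""
    section_mean = sum(section_scores) / len(section_scores)
    dept_mean = sum(department_scores) / len(department_scores)
    return section_mean / dept_mean

section = [4.5, 4.0, 5.0, 4.5]          # one section's responses to a question
department = [3.5, 4.0, 4.5, 3.0, 4.0]  # all responses in the department

# A value above 1.0 means the section rated above the department average.
print(round(normalized_mean(section, department), 2))
```

A department could apply the same calculation per instructor, per course, or per term, which is the kind of comparison the distributed raw data makes possible.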
General Comments:
1. The thrust of the Senate's recommendations
is that departments should be much more responsible for the storage and
analysis of the data from the SIRS, and that they should be much more active
in peer discussions, mentoring and evaluation of teaching within their
own units.
2. The recommendations also imply that the
TEC should become much more sophisticated in its data processing and
data management so that faculty, chairs and deans can get any information
that they want at the click of a mouse. We will do our best to implement
these recommendations.
3. As long as comments are collected on the
SIRS forms and sent to us, we remain significantly involved in the movement
of paper back and forth between us and the departments across the University.
It would be much more efficient to create a paper form or an online form
that would keep the comments where they belong, in the hands of the instructor
and the department. We are working on ideas to make this possible.
4. We are concerned about the recommendation
that each department store all of the information on its forms for ten
years. Most departments will simply not be able to find space for
the forms. Some will re-type the comments, a major misuse of University
resources in our opinion. Others may try to scan the comments to
store them electronically, either as images or through character recognition
software. The image files would require major storage capacity.
The text files created through character recognition software would require
less space, but the recognition would be very inaccurate due to the
idiosyncrasies of each individual's handwriting. If student comments are to be stored
for ten years, it would be much wiser to find a way to collect them online
rather than through the current paper-based system. The TEC will
experiment with scanning comments this summer to get an idea of the costs
and difficulties involved. We will also work on other methods of
collecting comments. Any results from these experiments will not
appear for at least a year, but it makes sense to investigate the possibilities
now.