TO: Interested members of the Earlham College community
FROM: Doug Bennett
RE: First results from the National Survey of Student Engagement

DATE: January 15, 2001

Attached are some summary data from Earlham's participation in the National Survey of Student Engagement (NSSE). I believe these data provide an interesting basis for discussion among us.

NSSE is a national project for the assessment of quality in higher education. Its development has been largely funded by the Pew Charitable Trusts. It involves a survey that asks college students to report on the forms of engagement that characterize their college experience. Each of the items used in NSSE has been drawn from a survey instrument regularly used in higher education. More important, each item has been shown to correlate strongly with student learning. While indirect, NSSE gives us a way of assessing the quality of the education we offer by asking students whether their experience is characterized by the activities and engagements we believe are conducive to learning. A great deal of additional information about NSSE, including the research that links items on the survey instrument to demonstrated student learning, is available on the project website.

Earlham participated in the Fall 1999 pilot administration of NSSE. At that time, samples were drawn from sophomores and seniors. We also participated in the Spring 2000 first regular administration of NSSE, when those sampled were first-year students and seniors. Because Earlham has just a few hundred students in each class, we could not include seniors in the Spring 2000 administration: all our seniors had already been invited to participate in the fall pilot. We will be participating again in Spring 2001.

The tables that follow report the mean scores on each question for Earlham and for comparison groups in the two administrations. Even though the scores come from two different administrations of the survey, I think it is permissible to compare results across them: the survey instrument and the method of administration were identical, and the two administrations were only a few months apart. Thus, in addition to comparing the mean scores of our students with those of students at other institutions, we can look at how the mean scores of our students change as they progress from the 1st year to the 2nd to the 4th.

The columns are as follows:

EC 1Y: Spring 2000, Earlham 1st-year students (N=117)
BacI 1Y: Spring 2000, 1st-year students from Baccalaureate I colleges (N=4,513)
Natl 1Y: Spring 2000, 1st-year students from all participating institutions (N=30,726)

EC 2Y: Fall 1999, Earlham 2nd-year students (N=70)
*Ann 2Y: Fall 1999, 2nd-year students from Annapolis Group colleges (N=??)
Natl 2Y: Fall 1999, 2nd-year students from all participating institutions (N=7,124)

EC 4Y: Fall 1999, Earlham 4th-year students (N=97)
Ann 4Y: Fall 1999, 4th-year students from Annapolis Group colleges (N=994)
Natl 4Y: Fall 1999, 4th-year students from all participating institutions (N=7,401)

*The Annapolis Group 2nd-year Fall 1999 column is blank because of an error in reporting these scores to us. We do not know whether we can obtain corrected scores.

Note: The Annapolis Group comprises about 100 residential liberal arts colleges in the U.S. Colleges are eligible to join if their 'Carnegie Classification' is Baccalaureate I. Baccalaureate I colleges are those which (a) are undergraduate residential colleges, (b) offer programs solely or predominantly in the liberal arts and sciences, and (c) have selective admissions.

What I believe we should look for here is not some summary judgment about whether we are doing 'good' or 'fair' or 'excellent,' but rather a profile of the college that shows some things we are doing well (at least as seen through this imperfect lens) and some things we are not doing so well in terms of what we are attempting to do. There are some items in the survey on which I do not think we want to achieve a high score. (Item 6, 'Worked with students on projects during class,' may be one example. Plenty of courses here involve group projects, but often that work is done in groups outside of class, which is picked up in item 7.)

I'd like to know what you see here, and what you make of what you see. Does NSSE appear to be giving us an accurate, useful assessment of how students are engaged in the educational process? If not, why not? If so, what should we be noticing? What is going well that we should try to keep going well? What could use some fresh attention?

I'll welcome any and all feedback.