In their follow-up reports to the Provost, units described a variety of assessment activities and the changes they made based on their results. Below is a summary of those changes, followed by specific examples of assessment activities as described by the units.
Summary of Changes Made From Assessment Activities
- Revised curriculum
- Revised requirements
- Added new courses
- Revised courses
- Added/deleted content
- Changed textbooks
- Changed computer software
- Changed teaching assignments
- Improved student progress review process
- Improved student course placement
- Improved student advising
  - Added advisors
  - Shifted advising responsibilities
- Improved faculty communication about assessment
  - Created new assessment and review committees
  - Conducted retreats/forums
  - Discussed assessment at department meetings
- Acquired more student input
  - Included students on committees
  - Included students on evaluation panels of student work
  - Conducted focus groups and exit interviews
- Improved instruction
  - Met with faculty receiving low ICES scores
  - Offered pedagogy/teaching workshops
  - Offered peer review of teaching
- Improved assessment of student learning
  - Embedded assessment items in classroom exams
  - Tested at program stages (standardized tests, external reviewers)
- Increased monitoring of student achievement
  - Reviewed enrollment and transfer data
  - Made assessment records electronic
Unit Examples of Assessment Activities
The following are detailed examples of what units reported in their follow-up reports.
From Chemical and Biomolecular Engineering:
Course Outcomes: We have quantitatively assessed achievement of all course outcomes for all undergraduate courses by calculating students’ scores on specific questions from quizzes, homework, exams, and projects. We keep these results in a spreadsheet and, based on areas in need of improvement, make one or two changes the next time the course is taught. In other words, if students score poorly on a quiz question that assesses a specific outcome, that concept is revisited and reinforced the next time the course is taught.
From Natural Resources and Environmental Sciences (NRES):
In response to the review of freshman grades in required courses, the Assessment Coordinator undertook a more thorough analysis of NRES majors’ grades in Chemistry 101 and 102. As a result of that assessment, the department designed an experimental course to be taken in conjunction with CHEM 101 entitled “NRES Problems for Introductory Chemistry,” modeled on the Merit Chemistry program. This course is being offered for the first time this semester and will be assessed via comparison of student grades in CHEM 101.
From Speech and Hearing Science:
Our student assessments led us to realize that we had a gap between academics and clinical training in our instructional programs. We have thus created a flexible lab portion of our disorders classes…. Both academic and clinical instructors have some involvement in creating the content of the labs, so that labs reflect the theoretical orientation of the classes and the skills expected in the clinical practica. Students experience continuity across the program.
From Electrical and Computer Engineering:
Feedback and suggestions from undergraduate students prompted the Department to develop a new program designed to encourage underclassmen to explore research at an early stage in their academic careers. The PURE (Promoting Undergraduate Research in ECE) program pairs undergraduate students with graduate student mentors. Together, the students and their mentors decide on a project, which the undergraduates work on throughout the semester. All undergraduate mentees are expected to summarize their research experience by writing a one- to two-page summary and giving an oral presentation at the end of each semester.
From Industrial and Enterprise Systems Engineering:
Integration of the assessment tasks into normal department operations has had its challenges, but it is being adopted as part of the routine teaching duties for undergraduate courses. Some necessary course changes have been identified and made as a result of using the assessment tools. The key change is that instructors now have an opportunity to review data for their own courses and see the results of their teaching in the demonstrated student abilities. Several instructors have voluntarily made changes, now that this data is available to them.
There have been unexpected by-products of our attempt to shift the departmental culture away from traditional authoritarian models. For instance, faculty have been prompted by students to take a hard look at “Awards” procedures, which in the past have been decided exclusively by faculty votes. It has been suggested that instead of faculty making these determinations independently, students be asked to serve on selection committees alongside faculty, or that students be given the opportunity to apply for awards. These suggestions would mean that students are no longer passive players in the procedures and that they would have to accept some responsibility for the outcome. We believe that these student suggestions are evidence that the departmental culture is shifting more rapidly and more profoundly than we had anticipated.
From Labor and Employment Relations:
For the doctoral program, we have chosen to address deficient student performance on the core HR/IR exam by providing extensive developmental feedback and tailoring a re-take opportunity; students must exhibit an appropriate level of improvement to warrant a passing grade or be counseled out of the program.
In General Chemistry, this will most likely take the form of common final exams given across all sections of a given course. Since the exam forms are not released to students, the same or similar exams can be administered each year, and performance can be evaluated as a function of changes made in the curriculum.
Outcomes assessment has increased awareness of the importance of outcomes. These recent assessments have not led us to make any changes in the program, though the PhD assessment reinforced the need for early mentorship of new PhD students.
So far, our unit has collected “comparative” papers from most of our majors, guided them in their choice of senior thesis topic (which will significantly contribute to their future academic careers if they choose to go to graduate school), and kept a more systematic account of their GPAs. We have also started to gather data from our General Education courses in order to more directly assess undergraduate students’ performance in four different skills: analytical content, use of text references, organization of the argument, and writing mechanics (see appendix below). We collected data last spring and are continuing to do so this fall. A preliminary examination of the quantitative results shows that students generally score high on most skills, with room for improvement in organization and, for some students, content. Students who rank low on the first paper usually improve significantly over the course of the semester. Additional data will enable us to refine the assessment of student performance and advise section TAs on how to improve quality in specific areas.
The Provost’s Office encourages you to continue implementing your Unit Outcomes Assessment Plan and will periodically request progress reports to learn of your activities.