Major-Program Assessment: College Initiatives

The College of Arts and Sciences includes a remarkably diverse collection of programs, ranging from Art to Zoology, with everything from Physics to Political Science in between. These programs offer an equally wide variety of degrees, from the Bachelor of Music (BM) to the Doctor of Audiology (AuD). This diversity makes the college an ideal environment for innovation in assessment.

ASC Curriculum and Assessment Services is interested in working with programs, departments, and groups of programs to explore new ways to improve major-program assessment practices in the arts and sciences, especially by helping program leaders balance disciplinary demands with departmental priorities, university goals, and broader community interests.

While program leaders are always welcome to schedule a consultation with the ASC Assessment Coordinator to discuss new ways to assess their particular programs, they may also find it valuable to participate in one or more of the following initiatives alongside other program leaders.

  • Dissertation Data Capture: A pilot approach that gathers assessment data for doctoral programs by integrating the process directly into the dissertation defense, allowing richer, more authentic data to be collected and processed more efficiently. Learn more about this initiative below.
  • Stakeholder Conversations: More information coming soon. 

If you are interested in participating in either initiative for major-program assessment, please reach out to Dan Seward, Assessment Coordinator for the College of Arts and Sciences.


Dissertation Data Capture

In the College of Arts and Sciences, we’re piloting a new approach to gathering assessment data for doctoral programs by integrating the process directly into the dissertation defense. This approach should make it more efficient to gather and process richer, more authentic data about the learning of advanced graduate students.

Here’s How It Works:

  1. Preparation: Developing an outcomes-aligned rubric. Interested graduate programs will work with the ASC Assessment Coordinator to refine or create a rubric for evaluating dissertations in accordance with the doctoral program’s learning outcomes. (Note: Other program capstone projects may also be suitable; contact the ASC Assessment Coordinator for guidance on implementing similar methods.)
  2. Integration: Including the rubric in the dissertation defense. Once the rubric has been developed, ASC Curriculum and Assessment Services (ASC-CAS) will generate a Qualtrics form and related messaging explaining the processes described in the steps that follow. Each time a defense is scheduled, links to the form and supporting information will be sent directly to the doctoral candidate and all members of the dissertation committee.
  3. Collection: Routine gathering of multiple data points at each defense. After reading the dissertation, but before the defense, all committee members, as well as the candidate, will be asked to enter scores on the rubric (i.e., the Qualtrics form). For the student, this will serve as a self-assessment to help them prepare for their upcoming defense. Then, at the defense, immediately before the closing conversation, all committee members will be asked to complete the rubric again—perhaps entering the same scores, perhaps adjusting them after discussion.
  4. Analysis: Routine reporting of rich assessment results. Each year, ASC-CAS will provide program directors with a dataset of the submitted rubric scores, including subsets disaggregated by role (candidate, doctoral advisor, second reader, outside reader, etc.) and by pre-/post-defense scoring; a brief illustration of this disaggregation appears after this list. Where needed to ensure anonymity in program-level results, ASC-CAS will pool data over multiple years or present only the aggregate scoring of all committee members (with students’ self-assessments removed, of course).
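
For program directors curious about what the annual dataset in step 4 might look like, here is a minimal sketch of the disaggregation and anonymity pooling it describes. The sketch is illustrative only: the column names, the scoring scale, the anonymity threshold, and the choice of Python with pandas are assumptions made for the example, not a description of the actual ASC-CAS pipeline.

    import pandas as pd

    MIN_N = 5  # hypothetical threshold: suppress subgroups smaller than this

    # Each row is one rubric submission (one rater, one defense, one phase).
    scores = pd.DataFrame({
        "year":  [2023, 2023, 2023, 2024, 2024],
        "role":  ["candidate", "advisor", "outside_reader", "advisor", "second_reader"],
        "phase": ["pre", "pre", "post", "pre", "post"],
        "outcome_1": [3, 4, 4, 3, 4],  # score on program learning outcome 1
        "outcome_2": [2, 3, 3, 4, 4],  # score on program learning outcome 2
    })

    # Program-level reporting removes the students' self-assessments.
    faculty = scores[scores["role"] != "candidate"]

    # Disaggregate mean scores by committee role and pre-/post-defense phase.
    by_role_phase = faculty.groupby(["role", "phase"])[["outcome_1", "outcome_2"]].mean()

    # Anonymity check: subgroups below the threshold would be pooled over
    # multiple years (or reported only in aggregate) rather than shown directly.
    counts = faculty.groupby(["role", "phase"]).size()
    print(by_role_phase[counts >= MIN_N])  # empty here: every toy subgroup is too small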

Chief Benefits of This Approach

  • Graduate programs can “outsource” the primary data collection and management processes to ASC-CAS without relinquishing the essential evaluative role of faculty in assessing learning.
  • More faculty members will contribute data for program assessment (assuming current practices rely only on the dissertation director’s or a program director’s scoring).
  • Disaggregated results reflecting multiple perspectives will provide richer datasets for analysis, allowing comparisons between student self-assessments and faculty evaluations, as well as comparisons among faculty responses across the different committee roles for each dissertation.
  • The very process of pre-defense and post-defense scoring can frame the defense within the context of the program learning outcomes.
  • Centralized management of the assessment instruments (the rubric and the form) ensures uninterrupted data collection even when program directors or administrative personnel change.
  • Centralized data management allows for greater security and more efficient processing. 

How to Participate

If you are interested in participating in this initiative for doctoral program assessment (or a similar capstone-based assessment activity), please reach out to Dan Seward, Assessment Coordinator for the College of Arts and Sciences.


Stakeholder Conversations

More information coming soon.