Major Program Assessment: Annual Routine

Note that the Nuventive.Improve reporting requirements are also treated in the TracDat User’s Manual:

  • For entering and updating assessment Plans, including Methods to be used, see Section 4.
     
  • For entering Results, see Section 5, especially page 51 of the PDF.
     
  • For entering Use and Action accounts and Follow-up details, see Section 5, especially page 59 of the PDF.

Confirming and Updating Annual Assessment Plans

Programs are asked to submit assessment plans during the program proposal and approval processes. 

However, at the start of each year, program directors will need to confirm that the details of the plan are ready to implement. 

The steps listed below provide a simple guide to planning your program’s annual assessment efforts—an activity that ideally occurs before the academic year starts, perhaps even at the end of the previous academic year. 

This guidance identifies the most basic considerations in a concise format. For more in-depth coverage of assessment planning, please consult Major Program Assessment Development.

 

Step 1: Identify the outcome(s) to be assessed for the year

Each program is required to assess at least one outcome each year, making sure to assess all program outcomes over a span of three years. 

Consequently, annual assessment plans should also account for outcomes assessed in previous or subsequent years, unless the program covers all of its outcomes annually.


Additional considerations

For the three-year assessment cycle, direct methods are expected—at least one direct assessment for each outcome within a three-year period. 

Moreover, to satisfy the annual requirement, a program reporting only one outcome for the year should use a direct method of assessment (see more below).

Though not required, programs are encouraged to report other forms of assessment that further document student and program achievements.

 

Step 2: Confirm (and refine as needed) the method(s) of assessment to be used

Each program provides an assessment plan as part of its approval process.

In planning the upcoming year’s assessment activities, program directors must confirm or adjust the methods originally proposed, taking into consideration their validity, as well as the resources available for conducting the assessment. 

Methods vary in how much training will be required for those producing, gathering, or processing the assessment data; methods also vary in the timing of data collection and the degree to which the data addresses program-related questions beyond those associated with the achievement of outcomes.


Additional considerations

Keep in mind that methods may also differ in how well they reflect the program’s student population. 

For direct assessments intended to meet the minimal reporting requirements described above, programs should aim for data reflecting a representative sample of students who are well advanced in the program, for instance, graduating seniors or students enrolled in required upper-division courses. 

When gathering data from classes, moreover, it may be necessary to filter out results from students who are not enrolled in the program, as in the sketch below.
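Here is a minimal sketch of both steps in Python, assuming hypothetical file and column names (student_id, class_standing, and so on); your program’s actual records will be laid out differently.

```python
import pandas as pd

# Hypothetical files and column names, for illustration only.
ratings = pd.read_csv("course_artifact_ratings.csv")  # one row per rated student artifact
roster = pd.read_csv("program_roster.csv")            # students enrolled in the program

# Keep only artifacts from students actually enrolled in the program.
program_ratings = ratings[ratings["student_id"].isin(roster["student_id"])]

# Draw a reproducible random sample of advanced students (e.g., graduating seniors)
# to serve as a representative basis for the direct assessment.
seniors = program_ratings[program_ratings["class_standing"] == "senior"]
sample = seniors.sample(n=min(25, len(seniors)), random_state=42)

print(f"{len(sample)} of {len(ratings)} artifacts retained for assessment")
```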

 

Step 3: Develop data-collection instruments and training materials

Some of the most common forms of data collection are surveys, assignment harvesting (or “embedded artifacts”), reader ratings, and interviews. 

The associated instruments (questions, evaluation rubrics, and so on) should be designed to capture information directly related to one or more program outcomes, even if they are also used to gather other data, such as details about student sentiments or backgrounds. 

Besides creating the instruments themselves, program administrators must consider how they will be disseminated: Through direct emails to particular students? In LMS templates for particular courses? By advisors meeting with students? 

Each approach has pros and cons, depending upon resources, methodological validity, and larger programmatic goals.


Additional considerations

When training those who will use data-collection instruments (e.g., instructors and raters), your program may achieve more representative results and better response rates if you explain the purposes of the assessment to these stakeholders, noting as well what they stand to gain from the process. 

Also, recognize that data gathered for the purposes of assessment should be protected like other student information, which for some methods may necessitate additional training in privacy protocols, such as data de-identification and security.

 

Step 4: Update Assessment Plans and Methods in Nuventive.Improve

Once you have an assessment plan for the upcoming year, please enter the details into Nuventive.Improve, or update them as necessary. 

By entering these plans into Nuventive at the beginning of the year, you’ll get a head start on some of the year-end reporting tasks, and potentially even on reporting tasks for subsequent years (assuming the program doesn’t change its assessment methods). 

At this stage, it’s most important that all the program’s outcomes and methods are entered into the system to confirm that all outcomes will be assessed with a direct method over a three-year assessment cycle.

Additional considerations

Besides filling out the “Method” forms in Nuventive, we recommend uploading your supporting documents as well (timelines, courses serving as sites for data collection, rubrics, questions, etc.). 

Altogether, these entries and uploads serve as a knowledge repository for the program, facilitating continuity through staffing turnover and role reassignments, even when those changes occur mid-year.


Conducting Assessments and Reporting Results

Although many of the planning activities listed above may not need to be repeated each year, all programs need to conduct and report on at least one direct assessment a year. 

The steps listed below can serve as a checklist of sorts, though each step entails a cluster of subordinate tasks.

 

Step 1: Deploy assessment instruments and train those administering them

The timing of these activities will, no doubt, depend upon the methods being used. 

Whereas surveys may be sent at a few strategic points during the academic year, embedded assignments usually need to be added to LMS courses before the semester starts. 

The choice of methods will likewise affect the timing of training for those who contribute to the assessment. 

Instructors will need to be made aware of course-embedded assessments prior to the term to ensure the relevant assignments are included on syllabi. 

For reader ratings, on the other hand, it may make sense to conduct training during semester breaks.


Additional considerations

Assessor training sessions—for instance, to go over the application of a rubric for evaluating student artifacts—provide excellent opportunities for program stakeholders to discuss similarities and differences in perspectives, even within the limited context of a shared rubric. 

These conversations may prove useful for adding nuance to the interpretation of results later in the process or for refining the assessment instruments themselves. 

It may even be possible to integrate assessor training into teacher training—as long as grading is distinguished from scoring for the purposes of assessing learning outcomes.

 

Step 2: Collect and process data according to the planned method

Although it can be tempting to put off data gathering and processing until just before you’re ready to analyze the data and write the report (assuming your methods allow for it), it’s better to make data collection a routine, ongoing activity. 

Here’s why: 

  1. First, the program won’t be “putting all its data into one basket” (i.e., a single term), limiting the damage should something go wrong.
     
  2. Second, the routine process of administering the instrument will improve the likelihood that data collection goes right. 
     
  3. Third, your program will have a larger sample, or at least one that better captures variations across academic terms (see the sketch below). 
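To make collection routine, some programs maintain a single cumulative data set and append each term’s records to it. The following is a minimal sketch of that habit, assuming hypothetical file names and a simple CSV layout:

```python
import pandas as pd
from pathlib import Path

# Hypothetical paths; adjust to your program's storage conventions.
master_path = Path("assessment_master.csv")
new_term = pd.read_csv("ratings_autumn.csv")
new_term["term"] = "Autumn"  # tag records so cross-term variation can be examined later

# Append this term's records to the cumulative data set.
if master_path.exists():
    master = pd.concat([pd.read_csv(master_path), new_term], ignore_index=True)
else:
    master = new_term
master.to_csv(master_path, index=False)
```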


Additional considerations

However and whenever you collect assessment data, remember that part of the processing will entail de-identifying the records. 

Note that this restriction doesn’t necessarily preclude methods that reconcile demographic or other personal information with data reflecting individual students’ performance on particular outcomes. 

For more guidance on why your program might choose to draw on such data and how to maintain privacy and protection for students, consider setting up a consultation with the ASC Assessment Coordinator.
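By way of illustration, one common pseudonymization approach replaces direct identifiers with a salted hash, removing names and ID numbers from the working data set while still allowing records keyed to the same IDs (demographic information, for instance) to be reconciled later. The sketch below assumes hypothetical file and column names; do not treat a scheme like this as sufficient on its own without consulting on your privacy obligations.

```python
import hashlib
import pandas as pd

SALT = "replace-with-a-secret-stored-separately"  # keep the salt out of the data set itself

def pseudonymize(student_id: str) -> str:
    """Map a student ID to a stable pseudonym via a salted hash."""
    return hashlib.sha256((SALT + str(student_id)).encode("utf-8")).hexdigest()[:12]

ratings = pd.read_csv("outcome_ratings.csv")    # hypothetical performance data
demographics = pd.read_csv("demographics.csv")  # hypothetical demographic data

for df in (ratings, demographics):
    df["pseudonym"] = df["student_id"].map(pseudonymize)
    df.drop(columns=["student_id"], inplace=True)  # remove the direct identifier

# The two de-identified data sets can still be reconciled on the pseudonym.
merged = ratings.merge(demographics, on="pseudonym", how="left")
```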

 

Step 3: Analyze data and report results

From an institutional perspective, programs must ultimately break down the results of an assessment in terms of student achievement: specifically, the proportions of students in the program meeting an outcome at various levels. Program administrators, however, will learn the most about their program if those results are also reconciled with other identifiable influences. 

In interpreting the results, then, consider recent changes to the program’s curriculum, variations in enrollments, or other factors discovered through indirect methods of assessment, such as opinion surveys, curricular reviews, or focus groups.
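The institutional breakdown itself is a simple computation once the data are in hand. Here is a minimal sketch in Python, assuming a hypothetical data file with one rubric score per student on a 4-point scale:

```python
import pandas as pd

ratings = pd.read_csv("outcome_ratings.csv")  # hypothetical: one rubric score per student

# Proportion of students at each rubric level (e.g., 1 = beginning ... 4 = exemplary).
levels = ratings["rubric_score"].value_counts(normalize=True).sort_index()
print(levels.round(2))

# Share of students meeting or exceeding the program's target level.
target = 3
met = (ratings["rubric_score"] >= target).mean()
print(f"{met:.0%} of students met the outcome at level {target} or above")
```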
 

Additional considerations

An assessment report can be a useful conversation starter, especially when shared with different program stakeholders, including students, faculty reflecting a range of disciplinary perspectives, staff holding various roles, and other administrators. 

In composing the report, consider treating those stakeholders as your target audiences.

 

Step 4: Enter assessment activities into Nuventive.Improve

During the summer, all programs must upload their full reports to the Nuventive.Improve system and distill the results, analyses, and interpretations into a few summary paragraphs. 

Program directors are also asked to submit an Executive Summary for all assessment activities for the program. 

Finally, at the end of the calendar year, after directors have had time to reflect on the assessment results and discuss them with stakeholders, programs are asked to submit action plans for following up on the results submitted during the preceding summer. 

What program changes will be made based on the results?


Additional considerations

The Nuventive.Improve entries have a functional purpose for higher administration, ensuring that each program’s assessment activities and core results may be accounted for in external reporting.

However, for program administrators, the Nuventive entries also allow for easy review of past assessment activities, assuming the information was completely entered into the required forms and supporting documents were uploaded.

This OSU-specific guide offers further instruction on using Nuventive; page 51 of the PDF, in particular, treats the entry of assessment results.

Program assessment is best treated as an ongoing process, part of the routine of program administration. 

This page outlines such a routine, starting with assessment planning for the year and ending with the required reporting of assessment activities.


Calendar of activities and reporting requirements

The two sections below provide a timeline for completing and reporting assessment activities. 

  • The “Assessment Activities” section lists the types of details each program will need to account for, providing a rough timeline for when those details should be addressed and when specific activities need to be completed. 
     
  • The “Nuventive Reporting” section lists more specific requirements for entering assessment plans and activities into the university’s Nuventive.Improve system.

Assessment Activities

Summer

  • For previous year’s assessment

    • Complete analysis and interpretation of data collected.
    • Disseminate assessment reports to program stakeholders.
       
  • For upcoming year’s assessment

    • Update assessment plans and data-collection instruments for upcoming year.
    • Collect data for next year’s reporting cycle if using Summer-term data.

Autumn

  • For previous year’s assessment

    • Develop action plan after discussing results with program stakeholders.
       
  • For current year’s assessment

    • Collect data for current year’s assessment if using Autumn data.

Spring

  • For current year’s assessment

    • Collect data for current year’s assessment if using Spring data, finalizing annual data set.
    • Start analyzing and interpreting data gathered during the year.

Nuventive Reporting

Summer

  • For previous year’s assessment

    • June 30: Enter Results and activity Summaries into Nuventive for previous year (Summer to Spring).
    • July 15: Colleges submit Executive Summaries of assessment activities to Nuventive.
       
  • For upcoming year’s assessment

    • August 15: Update assessment plans in Nuventive, in particular Method summaries and assessment schedule.

Autumn

  • For previous year’s assessment

    • December 15: Enter Use and Action statements into Nuventive.
    • Optional: Add Follow-up information if applicable.

Spring

  • For previous year’s assessment

    • Optional: Add Follow-up information if applicable, for instance, whether proposed actions were completed.