Fall 2011 Program-level Assessment

Program Assessment is simply the process whereby we take a reflective look at our program, our graduates, and ourselves. Program Assessment is already an important and ongoing part of many programs. The goal here is to quickly collect a 'snapshot' of program assessment. The survey analysis will give a view of assessment activities; this survey does not constitute an assessment plan. The survey was developed from the HLC Fundamental Questions.

We recommend that the survey be completed as a small-group exercise, for example, conducted by members of the school assessment committee. Estimated time of completion: 20 minutes for first use, 10 minutes for subsequent uses.
1. Please identify the program being reviewed:
(The drop-down box contains the code, degree, and program name, sorted by name.)
2. What was the approximate date of:
Approval of program outcomes:
This review:
Fall 2011
Spring/Summer 2011
Fall 2010
Spring/Summer 2010
Fall 2009
Spring/Summer 2009
Fall 2008
Spring/Summer 2008
Fall 2007
Spring/Summer 2007
Fall 2006
Spring/Summer 2006
Fall 2005
Prior to Fall 2005
3. Program Outcomes should be appropriate to your mission, programs, degrees, students, and other stakeholders. How would you characterize the outcomes listed for this program?
4. To what extent does each course in the program contribute to program outcomes? How are course outcomes aligned with program outcomes?
5. What kinds of evidence do you collect and analyze to demonstrate that students in this program have achieved your stated Program Outcomes? For example, a scoring rubric with which faculty evaluate senior research presentations at a seminar against predetermined and published standards.
6. Give a specific example of how you have analyzed and used the evidence of student learning collected for this program. For example, "presentation scoring rubrics are tabulated, compared to the previous year's results, and the information is used to modify/refine poster requirements for next year's seminar", "poster quality improved on the same rubric after clarifying assignment parameters", "poster data has been collected but not analyzed", or "no examples exist".
7. Give a specific example of how you have ensured shared responsibility for assessment of student learning related to this program. For example, "tabulated student presentation scores are distributed and discussed at the annual faculty meeting on assessment", "poster data has been collected but not shared", or "no examples exist".
8. Give a specific example of how you have evaluated and improved the methods you use to assess and improve student learning for this program. For example, "a longitudinal study of presentation scores suggested a new format for the rubric with fewer total items but which rewarded higher-level skills", "the department approved a new survey for seniors", or "no examples exist".
9. Give an example of how you have informed the public about what students learn in this program, and how well they have learned it. For example, "program assessment data was presented at the spring advisory board meeting", or "no examples exist".
10. OVERALL, considering the factors identified above, give your PROGRAM an Assessment Factor Score (from 0-4).
11. In a short paragraph, provide an example or illustration of 'closing the loop' in assessment related specifically to this program. This can expand on a previously stated example. These examples, taken as a snapshot of each program, will give a picture of the state of assessment at LSSU.