September 2012


Welcome back to the Fall semester! May a restful and productive Summer sustain you in your efforts to help our students learn this Fall.

In this issue of Assessment Showcase, I want to bring two items to your attention.

The first concerns the workshop and meetings planned with two colleagues from the University of Missouri, Kansas City (UMKC): Nathan Lindsay, Assistant Provost, and Drew Bergerson, a historian and Assessment Fellow, who will be on campus September 6 and 7 to consult with us about measuring student learning outcomes. There is more about their visit in the Highlight article and the Save the Date box below. Please take a look.

The second item is the assessment report that the Provost's office recently asked you to prepare this Fall. It is due December 15 on a special template that Dr. Sharon Walters, Assistant Director of Assessment and Program Review, has developed. After the Campus-Wide Assessment Committee (CWAC) critiqued the reports submitted last Fall and Spring (Sharon analyzed the results in the July and August issues of the Assessment Showcase), we consulted with our colleagues at UMKC and searched the web to create a user-friendly but more focused format for the report.

To help you file your report, Sharon's article walks you through both the conceptual and the practical steps you will need to take. Follow her guide below and you will not only save time, but you will also provide essential information about your programs for your colleagues to use in revising them. What could be more useful than that?

I wish you the best of luck this term with your courses and your students. How I envy you your work with them!

Jim Allen
Associate Provost for Academic Programs



Highlight

Annual Assessment Review

To assist programs in completing their annual assessment report, we have invited two colleagues from the University of Missouri, Kansas City: Nathan Lindsay, Assistant Provost for Assessment, and Drew Bergerson, Professor of History and Assessment Fellow, to provide an interactive workshop on Best Practices in Assessment.

This session is hands-on. Faculty and staff will develop an assessment plan for a hypothetical degree program. For training purposes, colleagues will choose from one of the following outrageous (and not-so-outrageous) possibilities: a BA in flower arranging, a BS in flower arranging, a BA in extraterrestrial studies, an MA in extraterrestrial studies, a PhD in social media, an MD in veterinary medicine with a track in dinosaurs, and an LD to practice law in the World of Warcraft. Each table will pick one program to develop a model assessment plan.

We hope everyone can join us on Thursday, September 6, 2012 (1:30-3 PM general session, 3:15-5 PM break-out sessions) in the Student Center, Ballroom A. There will be coffee, tea, cookies, and fruit for refreshment. If you cannot stay for the whole afternoon, feel free to attend for an hour or two.

For those faculty unable to attend any part of the workshop on Thursday, we have arranged a second session on Friday, September 7, 2012, 8:00 a.m. - 9:00 a.m. in the College of Liberal Arts Dean’s Conference Room (Faner 2408).  Be sure to save the date. Since both sessions will be hands-on workshops, space will be limited. Please go to the Survey Monkey link to register.

During your participation in the workshop, please remember: for this review cycle, programs need to provide us with the following information in their reports due December 15.

Mission of the Academic Degree Program

A mission statement is a general, concise statement outlining the purpose guiding the practices of the degree program.

The mission statement for an academic degree program, while similar to the mission statement for the department or college, needs to define the primary purpose, primary functions, and stakeholders, but this information needs to be for the program, not for the department or college.

The first step in creating an effective assessment plan is to look at the mission of the academic degree program. This might be a good opportunity to review your program's mission statement, since the program may have changed since it first started. Does your mission statement still define the program's primary purpose, primary functions, and stakeholders? Does the program's mission support the mission of the department, college, and university?

Program Goals

Writing program goals and student learning outcomes should involve all the stakeholders - faculty, students, and staff.

As we prepare to use the analytic features of Desire2Learn (referred to as SIU Online), we need programs to verify their program goals (referred to as competencies in D2L) posted on the Assessment and Program Review website.

Program goals are written to describe general outcomes for graduates as well as discipline-specific outcomes relevant to the degree program.  For example:

“The Widget Building Program’s goal is to provide students with the essential skills necessary to broadly communicate with a global audience regarding important conservation issues.”

or

“Students who complete the Bachelor of Science degree in widget building will develop the essential skills required to function as a professional in their field of study.”

Do the goals you have listed still define the knowledge, abilities, values, and attitudes for the ideal graduate of your program? Previous editions of the Assessment Showcase newsletter provided information on how to write program goals. This might be a good time to review them to determine whether you need to revise your program's stated goals.

Program Student Learning Objectives/Outcomes (not individual course learning objectives)

Programs need to verify that the program student learning outcomes (referred to as learning objectives in D2L) posted on the Assessment and Program Review website are current.

Program student learning objectives/outcomes should reflect what you want graduates of your program to know (cognitive), what you want your graduates to think or care about (affective), and what you want your graduates to be able to do upon completion of the program (active). It is important to understand that each program goal should be tied to at least one program student learning outcome. 

Each goal (or competency) may have several learning objectives or outcomes. For example, using the program goal listed above:

Program goal: “Students who complete the Bachelor of Science degree in widget building will develop the essential skills required to function as a professional in their field of study.”

The student learning objectives should include action words (verbs) that identify how students will demonstrate the essential skills (performance), a learning statement that specifies what learning will be demonstrated, and a broad statement of the criterion or standard that will be used for acceptable performance. For example:

Program student learning objective/outcome #1: “At the end of their capstone course, students should be able to build widgets with a 90% accuracy rate.”

Program student learning objective/outcome #2: “Graduates will demonstrate awareness of cultures and backgrounds other than their own.”

Obviously, you need additional student learning objectives to demonstrate that students have met the stated goal. Previous editions of the Assessment Showcase newsletter provided information on how to write program student learning objectives or outcomes. This might be a good time to review them to determine whether you need to revise any of your program's stated student learning outcomes.

Curriculum Map or Curriculum Alignment Matrix (description of the method)

Mapping/Alignment is used to link the program goals to the program student learning objectives and then to the course objectives. As you will recall, November 2011’s Assessment Showcase discussed curriculum mapping/alignment (i.e., strategies to tie individual course assessments and course objectives to program assessments).

The newsletter also showed how to use different types of matrices for assessment planning, monitoring, and reporting (i.e., strategies to tie relevant experiences to program assessments and linking objectives to data gathering tools). You will want to develop a curriculum map or a description of the method you used to link the program student learning outcomes (SLOs) to the program goals, individual course objectives, and student learning outcomes. 
            
The individual course goals and student learning objectives/outcomes in the program are linked to the program goals and student learning objectives/outcomes. Once you know in which course(s) a particular program goal is being measured, you can then track students' achievement. This will also allow you to determine whether all the stated program goals have been addressed in the existing curriculum.

Methods/Measures/Achievement Targets/Performance Expectations for each program student learning objective/outcome

November 2011’s Assessment Showcase also presented strategies that programs can use to identify the measures/methods to ensure that the program goals and student learning outcomes are assessed appropriately. [Assessment Process – Part 2] Remember, you need to use both indirect and direct methods, like the examples provided in the newsletter.

The December 2011 Assessment Showcase, however, discussed the importance of providing detailed information regarding the benchmark/standard that will be used to measure acceptable student achievement. [Assessment Process – Part 3]

For instance, if the stated student learning objective was, “graduates of the nursing program will be competent in giving injections to individuals,” and the expectation for satisfactory performance was 95%, would you want to be injected by any of the 5% of students who did not meet the expectation?

But for the stated learning objective, “graduates will be competent in assessing, planning, implementing and evaluating community-based health programs,” 95% might exceed expectations. That is why departments must clarify what standards their faculty use to measure students’ achievement for each of the stated learning objectives. Your colleagues must be clear as to what these standards are.
 
You should also read the article, “Next Steps in Assessment: Better Learning, Faster Learning, Less Teaching, Happier Environment” by Dr. Douglas Eder, emeritus professor, Southern Illinois University Edwardsville. As Dr. Eder cautions, a student’s final grade by itself does not show the true picture of the student’s achievement.

For example, with the program goal listed above:

Program student learning objective/outcome #1: “At the end of their capstone course, students should be able to build widgets with a 90% accuracy rate.”

A student enrolled in the capstone course might receive 100% on all the written exams and research papers, but only 80% on the widget-building exercises and still receive an ‘A’ in the course. Tracking the student’s grades on the widget building exercises would be a better measure of the student’s achievement on program student learning outcome #1 than his or her final course grade.

Action Plan/Assessment Infrastructure: how you used the 2011-2012 assessment data to improve or make changes in your program's curriculum, and how you plan to use the 2012-2013 assessment data

The Campus-Wide Assessment Committee (CWAC) noted that most departments are simply reporting data annually on each individual student learning outcome without providing a narrative describing how the data were used or whether anything changed from the previous year's assessment report. While our hope is that there will be improvement from year to year, meeting with all program faculty members to compare results from previous years is valuable regardless of whether the results show improvement or decline.

The July 2012 Assessment Showcase provided information on completing the action plan/assessment infrastructure section of the assessment plan. 

The CWAC is primarily interested in how well programs are “closing the loop,” as discussed in the January 2012 Assessment Showcase newsletter. Once the data are collected, are faculty meeting as a group to discuss the findings and ways to implement necessary changes in the curriculum? Are they comparing the results of this year’s findings with previous years to determine if any trends exist? The members of CWAC are interested in how programs will use the assessment results once the data have been collected and analyzed.

If, after the “Assessment Best Practices” interactive workshop with Drew Bergerson and Nathan Lindsay, you still feel overwhelmed, please remember we are here to assist you.

Sharon Walters, PhD
Assessment and Program Review

References:

University of Central Florida (2008, February). Program Assessment Handbook: Guidelines for Planning and Implementing Quality Enhancing Efforts of Program and Student Learning Outcomes. Retrieved from http://oeas.ucf.edu/doc/acad_assess_handbook.pdf

University of Connecticut (2012). Assessment Primer: Goals, Objectives and Outcomes. Retrieved from http://assessment.uconn.edu/primer/components.html


You Should Know...


This month’s newsletter from the New Leadership Alliance for Student Learning and Accountability (a leading national advocate of assessment) contains an article that discusses “ROI,” the return on investment, of an undergraduate education.

NILOA 

This month’s newsletter from the National Institute for Learning Outcomes Assessment (NILOA) contains an interesting guest viewpoint by Anne Goodsell Love, “Discussing the Data, Making Meaning of the Results.” The newsletter and website contain a wealth of information regarding assessment and transparency.



Bragging Rights

Have you heard?


Ryan Ceresola, a master's degree student in sociology at SIU, is the recipient of the Carl J. Couch Internet Research Award for 2012. This award recognizes student-authored papers. Ceresola's paper, "I Don't Play Unless You Do: A Qualitative Analysis of Stigma Management and Techniques of Neutralization in the World of Warcraft," examines the world of online games and those who play them. This is definitely worth bragging about!

Reference:

Hahn, A. (2012). Student wins Couch Internet Research Award. The Saluki Times. Retrieved from http://news.siu.edu/2012/06/062712amh12103.html


Does your department or college have something related to assessment or program review it would like to brag about? If yes, please e-mail us at assess@siu.edu and let us know.


Upcoming Spotlights

WORKSHOP: ASSESSMENT AND BEST PRACTICES

On September 6, we hope you will join us for the hands-on workshop presented by Dr. Nathan Lindsay, Assistant Provost for Assessment, and Dr. Drew Bergerson, Professor of History and Assessment Fellow, University of Missouri, Kansas City. This session is hands-on, as faculty and staff develop an initial assessment plan for a hypothetical degree program. They can choose from one of the following outrageous (and not-so-outrageous) possibilities: a BA in flower arranging, a BS in flower arranging, a BA in extraterrestrial studies, an MA in extraterrestrial studies, a PhD in social media, an MD in veterinary medicine with a track in dinosaurs, and an LD to practice law in the World of Warcraft. Each table will pick one program, and space will be limited.


SAVE THE DATE!

ASSESSMENT AND BEST PRACTICES WORKSHOP!
Thursday, September 6, 2012, 1:30-5 PM
Student Center, Ballroom A

Please go to the Survey Monkey link to register.


Questions?


Each month we present what we believe to be practical information regarding assessment practices. However, because we believe assessment works best when feedback is sought, we encourage you to submit questions regarding any aspect of the assessment process (i.e., at the course, department, or college level) to us at assess@siu.edu. We will attempt to answer these questions in the following month’s newsletter and post them on our website under FAQs.