Assessment date: Learning must be assessed every year, although not all specific outcomes need to be assessed in the same cycle.
Keep the following question in mind when specifying assessment dates:
- Is the date specific, as opposed to a range or an inexact date (e.g. March 12, 2013, or 6th week of spring semester, 2013)?
Assessment method(s): In this column, identify the discrete assessment tools used to measure each specific outcome. The assessment tool should not be achievement of a grade or GPA; course grades are typically too broad, since they measure many aspects of a course at once. Each assessment should be associated with a metric or a rubric that clearly defines the expected level of performance. Finally, assessment methods need not be exclusively quantitative: depending on the outcome and the field, programs may elect to use qualitative methods.
When writing this section of your plan, keep the following questions in mind:
- Is the place in the curriculum where the assessment is conducted identified (i.e., is it attached to a course or given at the end of the program)?
- Does the assessment tool measure a specific learning outcome?
- Does the overall assessment of the learning outcomes include a mix of internal and external measures (e.g. internal test vs. nationally normed test)?
- Who is doing the scoring / assessment?
- Does the assessment correspond with the dates given in the previous column?
Assessment Methods – Design and Examples
Banta, T. W., & Blaich, C. (2011). Closing the Assessment Loop. Change, 43(1), 22-27.
Provides specific guidelines for designing learning outcomes assessment that leads to enhanced student learning and program quality.
Banta, T. W., Lund, J. P., Black, K. E., & Oblander, F. W. (1996). Assessment in practice. Putting principles to work on college campuses. San Francisco: Jossey-Bass.
Presents 82 case studies of assessment practices – at program, course, and institutional levels – in a variety of institutions and programs.
Gallery of Teaching and Learning (2012), from http://gallery.carnegiefoundation.org
The CASTL HE link (right inside) provides examples of instructional practices in a variety of fields. Searchable by subject.
Murray, M., Pérez, J., & Guimaraes, M. (2008). A Model for Using a Capstone Experience as One Method of Assessment of an Information Systems Degree Program. In the special issue on IS education assessment, 19(2), 197-208.
An example of assessment in the field of information systems.
Borden, V. M. H., & Kernel, B. (2012). Measuring quality in higher education: An inventory of instruments, tools and resources. Retrieved 18 February 2013, from http://apps.airweb.org/surveys/
Provides links to sets of tools useful for conducting assessment, including benchmarking instruments.
Banta, T. W., Griffin, M., Flateby, T. L., & Kahn, S. (2009, December). Three promising alternatives for assessing college students’ knowledge and skills. NILOA Occasional Paper No. 2, from http://www.learningoutcomeassessment.org/occasionalpapertwo.
Provides a rationale, methods, and examples for using e-portfolios, rubrics, and online assessment communities to conduct authentic assessment of student learning outcomes at course, unit, and institutional levels.
Clark, E. J., & Eynon, B. (2009, Winter). E-portfolios at 2.0—Surveying the Field. Peer Review, 11(1), 18-23.
Includes a list of resources providing examples of e-portfolios.
Portfolio. National Institute for Learning Outcomes Assessment. Available at: http://www.learningoutcomeassessment.org/Portfolio.htm
Provides guidelines for developing and using portfolios, as well as examples of best practices.
Hatfield, Susan R. (2012). Sample rubrics. Association for the Assessment of Learning in Higher Education. Retrieved 19 April 2013 from http://course1.winona.edu/shatfield/air/rubrics.htm
Provides links to many examples of rubrics in a variety of fields.
Mueller, J. (2012). Authentic assessment toolbox. Retrieved 24 January 2013, from http://jfmueller.faculty.noctrl.edu/toolbox/index.htm
Proposes ways to plan for and conduct authentic assessment, and includes a section on designing rubrics.
Olsen, T. (2011). Constructing rubrics. Retrieved 24 January 2013, from http://tenntlc-utk-edu.wpengine.netdna-cdn.com/files/2011/10/rubric-construction.pdf
A resource worksheet for creating rubrics.
Schreyer Institute for Teaching Excellence. (2013). “Planning for a test” and “Test Blueprints.” http://www.schreyerinstitute.psu.edu/Tools/TestPlanning
This page explains how to connect tests to student learning outcomes, target higher-order thinking levels, and create a blueprint that yields subscores for the sets of concepts tested.
Other assessment tools
Bond, L. (2009). Toward Informative Assessment and a Culture of Evidence. A report from Strengthening Pre-collegiate Education in Community Colleges (SPECC). Stanford, CA: The Carnegie Foundation for the Advancement of Teaching.
Describes the use of think-aloud protocols, pre/post testing, and common examinations to assess student learning outcomes. Also proposes a strategy to enable faculty to interpret and use assessment data to improve student learning.
Maki, P. L. (2010). Identifying or designing tasks to assess the dimensions of learning. In Assessing for learning: Building a sustainable commitment across the institution (pp. 155-215). Sterling, VA: Stylus.
Discusses both direct and indirect assessment methods, reviewing the use of internal and external (standardized) instruments, including electronic assessment tools. Includes worksheets for identifying and designing assessment instruments that measure learning outcomes.
National Council of Teachers of English. (2012). NCTE-WPA White Paper on Writing Assessment in Colleges and Universities. Retrieved 29 November 2012, from http://www.ncte.org/library/NCTEFiles/Resources/Positions/WPAwritingassessment.pdf
Articles are available through the UT library; books by Banta and Maki available in the Tenn TLC library (see current books).