Assessment and Outcomes
by Richard Frye, Western Washington University
Both the Washington State Legislature and the Northwest Association of Schools and Colleges (Western's accrediting agency) require all degree-granting programs to develop a plan for the assessment of student learning outcomes, and to document its use for continual program improvement. That both agencies now focus on the importance of student learning outcomes represents a significant convergence of two major trends in higher education: the assessment movement and the accountability movement.
Assessment has evolved from efforts to improve the quality of educational outcomes by continually improving teaching and learning, while accountability has evolved primarily from the efforts of state legislatures to make higher education more cost-effective. Simply put, assessment has historically been oriented toward internal review for improving student learning, while accountability has historically been associated with external review for proving institutional effectiveness.
Assessment is an iterative process for gathering, interpreting, and applying outcomes data from courses, programs, or entire curricula to improve program effectiveness, particularly as measured by student learning outcomes. Assessment is closely associated with a "student-centered," or "learner-centered," model of institutional effectiveness, and represents a fundamental shift in how educational institutions define their missions and measure their effectiveness.
Accountability is essentially the same iterative process as assessment except that it is aimed at external reporting and review of educational outcomes. Accountability measures required by agencies like the Washington State Higher Education Coordinating Board (HEC Board) were originally oriented toward finding measurements of overall institutional efficiency in higher education, such as time-to-degree, graduation efficiency, and student retention. These have gradually given way to the view that the primary "output" of higher education is learning; as a result, accountability measures increasingly focus on student learning as the central measure of program effectiveness.
In the NWASC accreditation site visit report in 1999, Western was specifically directed "to identify and publish expected learning outcomes for each of its degree and certificate programs, and to actively engage faculty in defining learning objectives and developing specific plans to assess and evaluate outcomes at the course and educational program level."
Similarly, the HEC Board now requires each academic degree program, in its periodic review process, to include a formal plan for assessing how well program objectives in general, and student learning objectives in particular, are being met. Each unit must also demonstrate in periodic reviews: 1) how assessment data has been regularly gathered, and 2) how it has been used to improve program effectiveness and student learning. These requirements have been formalized in Western's Assessment Plan, which outlines program assessment responsibilities.
In addition, higher education has increasingly taken on the attributes of a market commodity, and schools increasingly compete for the best students and their tuition dollars. Prospective students and their parents, as well as legislators, are increasingly demanding convincing evidence of institutional effectiveness as part of their decision framework. If they are going to "buy" Western, they want to know what they are getting. Assessment data about student achievements, graduate placement, and alumni and employer satisfaction are an important part of the evidence prospective "customers" want to see about the quality of a Western degree.
Assessment is an iterative feedback process for continual program improvement, based on the model shown below.
Step one is to define intended program learning objectives: specifically, what do we want our graduates to know and be able to do?
Step two is to define measurable outcomes that will serve as evidence of how well each objective has been met, and then to measure them. Because this step requires explicit articulation of program success criteria, it often has the added benefit of clarifying faulty assumptions.
Step three is to compare actual observed outcomes to intended program objectives: how well did we meet our objectives in general, and our student learning objectives in particular?
Finally, in step four, based on how well or how poorly achieved outcomes compare to intended outcomes, elements of the program (including assessment elements) are redesigned as appropriate, and a new assessment cycle begins.
The assessment cycle is an integral part of student-centered education. It provides an ongoing mechanism for challenging tacit assumptions about program effectiveness, identifying conflicting program elements, and assuring that student learning objectives are met. It also allows for evolution of program goals over time. Although defining learning objectives and measurable outcomes for an educational program is by no means an easy task, faculty engaged in the process are consistently rewarded with a clearer sense of what they are trying to accomplish and how they can better go about it.
Good assessment practice is based on a number of assumptions:
- The first precept of good assessment practice is to assess what is most important;
- Anything that can be taught or learned can be assessed;
- Assessment should be applied at course, program, and institutional levels;
- Every program and every course should be organized around clearly articulated learning goals and objectives, explicit assessment methods, and measurable outcomes;
- An assessment process should be logistically feasible and practically manageable to ensure that it is regular and ongoing.
What is important varies by program and program level, and program competencies are often sequentially developed; general education competencies tend to be first steps in the development of a range of integrative abilities across the curriculum. In Washington State, four such integrative abilities are being explored in considerable depth to find consistent methods of assessing how well graduates of Washington schools have mastered them: writing, information technology literacy, quantitative reasoning, and critical thinking.
In addition, for many years Washington State has required state colleges and universities to survey alumni regularly regarding their satisfaction with fourteen particular areas of learning. These questions ask how satisfied graduates have been with Western's contribution to their growth in:
- writing effectively
- speaking effectively
- critical reading
- quantitative reasoning
- arts appreciation
- scientific principles
- civic rights and responsibilities
- problem solving
- working cooperatively
- learning independently
- cultural and philosophical diversity
- interaction of society and environment
- readiness for career
- readiness for graduate study
- developing satisfying meaning for life
All of these educational goals involve general skills or abilities that the legislature has implied should be developed in a Washington State public college education. This development begins with the general education curriculum, but continues at higher and more integrative levels in the major. Clearly major programs must focus on developing the specific competencies of their fields; but they also have responsibility to develop in their students this broad range of general abilities.
These responsibilities include integrating the particular skills and abilities of the major with the general developmental skills and abilities in the following areas:
- Knowledge: recall of facts, literature, patterns, processes
- Values: professional, ethical, social, personal
- Technical skills: demonstrating what students can actually do
- Integrative skills: demonstrating student capacity for analysis, synthesis, and evaluation
As the assessment and accountability movements in higher education have converged on student learning as the center of the educational universe, ideas about what constitutes a high-quality education have shifted from the traditional view of what teachers provide to a practical concern for what learners actually learn, achieve, and become.
In the traditional "teacher-centered" model, the focus has been on inputs: the credentials of faculty, the topics to be presented, the sequencing of presentations, and so forth.
Oddly, even though college teachers are expected to be good teachers, they are not required to have had any formal training in teaching and learning; expertise in their disciplines is somehow generally considered adequate preparation for a career in college teaching. In addition, even though faculty are almost universally very much interested in promoting student learning, traditional program organization takes for granted the teacher-centered view of teaching and learning. Faculty "teach," generally in the ways that worked best for them as students, and students are at liberty (or at their peril) to learn what they can. Although this system has worked fairly well for a long time, research over the last thirty years suggests that we can do much better.
In the "student-centered," or "learner-centered" model, the focus is on outputs: what knowledge have students actually acquired, and what abilities have they actually developed? Implicit in the student-centered model is the idea that instructors are facilitators of learning. It is not enough to construct a syllabus and present information; the job of instructors now involves creating and sustaining an effective learning environment based on a wide range of "best practices" in teaching and learning. The fundamental role of assessment is to provide a complementary methodology for monitoring, confirming, and improving student learning.
The "paradigm shift" from a teacher-centered program design to a learner-centered program design is well underway nationwide, and has already been widely adopted by accrediting agencies, with many important implications.
First, student-centered programs are output-oriented. The primary measure of program success is what graduates actually know and are able to do.
Second, student-centered programs are competency-based. Learning objectives and learning outcomes are tied to the most important skills and knowledge in a program.
Third, learner-centered education is dedicated to continual improvement through ongoing assessment of student learning. By monitoring the effects of program changes on learning outcomes, program faculty are enabled to identify problem areas and to design improvements.
The increasing focus on student learning as the central indicator of institutional excellence challenges many tacit assumptions about the respective roles of college students and faculty. In student-centered education, faculty take on less responsibility for being sources of knowledge, and take on greater responsibility as facilitators of a broad range of learning experiences. For their part, students are called on to take on more responsibility for their own learning.
As shown in the following table, the responsibilities of students and faculty and the relationships between them are quite different in the two models:
| | Teacher-Centered Model | Learner-Centered Model |
|---|---|---|
| Knowledge | Transmitted from instructor | Constructed by students |
| Role of professor | Leader/authority | Facilitator/partner in learning |
| Role of assessment | Few tests, mainly for grading | Many tests, for ongoing feedback |
| Emphasis | Learning correct answers | Developing deeper understanding |
| Assessment method | Unidimensional testing | Multidimensional products |
| Academic culture | Competitive, individualistic | Collaborative, supportive |
Beginning with Bloom's taxonomy of educational objectives, and continuing with considerable research on teaching and learning over the last thirty years, many detailed lists of "best practices in teaching" have been compiled. Most such lists include the following:
- Engage students in active learning experiences
- Set high, meaningful expectations
- Provide, receive, and use regular, timely, and specific feedback
- Become aware of values, beliefs, preconceptions; unlearn if necessary
- Recognize and stretch student styles and developmental levels
- Seek and present real-world applications
- Understand and value criteria and methods for student assessment
- Create opportunities for student-faculty interactions
- Create opportunities for student-student interactions
- Promote student involvement through engaged time and quality effort
As shown in the figure below, the best student learning outcomes follow from a combination of activities: encouraging faculty development as teachers using the best practices in teaching and learning; engaging students with high levels of involvement in their studies, with other students, and with faculty; and implementing regular, thoughtful assessment procedures that provide ongoing feedback: to students about the progress of their learning, to instructors about the efficacy of their teaching, and to program faculty about how well their program is meeting its objectives.
In order to satisfy the reporting requirements imposed by both the Washington State Higher Education Coordinating Board (HECB) and the Northwest Association of Schools and Colleges, Western's Assessment Plan requires all academic programs to implement formal plans for the assessment of program outcomes in general and of student learning outcomes in particular. Western must be able to demonstrate to these outside agencies that all academic units have designed and implemented assessment plans, and must report annually on how assessment data is being used to improve academic programs. It is important to emphasize that the purpose of assessing student learning outcomes is to make inferences about programs, not about students.
Program assessment must document two kinds of learning outcomes: basic mastery of fundamental knowledge and abilities, and sequential development through a hierarchy of professional and personal abilities, including elements which foster social interaction and personal maturation, such as volunteerism, internships, capstone experiences, field-related employment experiences, collaborative learning experiences, interaction with faculty, and other experiential mechanisms.
Program assessment plans should be designed around developmental goals and objectives in ways that demonstrate how well a program curriculum works as a whole. Academic departments are encouraged to develop methods for assessing the relevant integrative abilities of their disciplines in addition to assessing the more conventional kinds of cognitive gains which have generally been more narrowly defined and easier to measure.
Virtually every program already is doing many kinds of informal assessments. Formalizing these activities into a viable and valuable assessment plan can be made relatively easy using the following guidelines.
1. Read all the pages in this section to get an overview of assessment, accountability, and student learning.
2. Participate with your program faculty in brainstorming discussions on these questions:
- What particular skills, knowledge, or abilities should graduates of your program be able to demonstrate upon graduation?
- At what levels of expertise should they be able to demonstrate such knowledge, skills, and abilities?
- As specifically as possible, identify how you can assess whether students have acquired these abilities.
3. Based on the discussions from #2, write a list of specific program learning objectives. Include both discipline-specific learning objectives and across-the-curriculum developmental or integrative objectives.
Wherever possible, use verbs to frame learning objectives as specific actions.
Example: "Graduates should be able to explain the impacts of various taxes on the economic decisions of producers and consumers."
4. For each learning objective, identify at least one (more are better) actual learning outcome which will be measured or observed to provide evidence of how well the objective has been met by each student.
5. Organize the set of learning objectives around common themes; use these themes to define tentative program goals. In addition to defining discipline-specific goals and objectives, program goals should also reflect the continuing development of Western's general education learning objectives throughout each major.
6. Integrate program goals into a tentative mission statement.
7. Repeat steps 3, 4, and 5 to integrate the mission, goals, and objectives and make them congruent.
At this point you have a mission statement, goals statement, learning objectives, and learning outcomes; what remains is to "close the loop" by establishing procedures and assigning responsibilities for:
a) Measuring actual outcomes and comparing them with intended objectives;
b) Implementing program changes based on assessment results; and
c) Assessing, documenting, and reporting the effectiveness of changes introduced during the previous assessment cycle.
Academic departments at Western show considerable variation in levels of development of their assessment programs. Many, especially those forced to establish assessment procedures to meet the professional accreditation requirements of their disciplines, have quite highly developed plans for assessing program outcomes, including especially student learning outcomes. Many others have not had such incentives, and have developed only vestigial assessment plans at best. Even those programs with considerable experience with assessment do not necessarily share a common view of the importance of various learning outcomes or a common format for documenting their assessment activities or reporting their findings.
It is useful to acknowledge this range of experience with program assessment by identifying three stages of development of program assessment plans: beginning, intermediate, and integrated.
The Planning stage is the beginning level of implementation. It is characterized by tentativeness and uncertainty: mission and goals are not clearly defined; program learning objectives are not clearly defined and may not be congruent with goals; outcomes measures are not good estimators of program objectives; assessment data are being collected or analyzed only sporadically; classroom assessment procedures are not congruent with stated program goals; or collected data have either not been analyzed or results have not been applied to program improvement.
The Emerging stage is the intermediate level of implementation. It is characterized by familiarity, growing confidence, and growing commitment to assessment; faculty members are increasingly engaged in collecting and applying assessment data; assessment results are increasingly used in decisions about course sequencing, faculty allocations, teaching methods, program curricula, choice of instructional resources, planning and budgeting, and program improvement; and faculty are increasingly engaged in an ongoing conversation about program improvement based on assessment findings.
The Maturing stage is the integrated level of implementation. It is characterized by: continued development of the processes of the "emerging" level, the increasingly important role of student learning and teaching excellence in defining program effectiveness and guiding program changes, and the full engagement of faculty in an active "culture of evidence" dedicated to improving student learning, performance, involvement, and achievement.
Western's goal is for all academic program assessment plans to evolve to the "maturing" stage. This website is intended to assist program faculty in the development, implementation, and improvement of unit assessment plans, and to establish a unified annual reporting format summarizing departmental assessment activities. In addition, staff at the Center for Instructional Innovation and Assessment and the Office of Institutional Assessment are available for assistance.
The Mission Statement is the initial point of reference for any program or course. It is a concise statement of the general values and principles which guide the curriculum. In broad strokes it sets a tone and a philosophical position from which follow a program's goals and objectives; therefore the mission statement is also a statement of program vision. The mission statement can and should be brief. However, it is not an isolated document. Rather, it is the cornerstone of the curricular structure, defining the very broadest curricular principles and the larger context in which more specific curricular goals will fit. The program mission statement should define the broad purposes the program is aiming to achieve, describe the community the program is designed to serve, and state the values and guiding principles which define its standards.
Program mission statements must also be consistent with the principles of purpose set forth in the University's mission and goals statements; therefore, a good starting point for any program mission statement is to consider how the program mission supports or complements the University mission and strategic goals.
Paraphrasing from several versions of Western's Mission Statement:
The mission of Western Washington University is to provide to Washington State students a high quality undergraduate education which nurtures the intellectual, ethical, social, physical, and emotional development of each student, through:
- A common, broad-based mastery of the fundamental concepts, history, perspectives, and significance of the arts, sciences, social sciences, and humanities; and
- Baccalaureate and master's degree major programs of a practical and applied nature directed to the educational, economic, and cultural needs of Washington State residents.
These mission elements are further elaborated in Western's Strategic Plan, which emphasizes three broad goals of educational quality, multicultural enrichment, and community service.
The program mission statement must serve as a link between departmental goals and objectives on the one hand, and University mission and goals on the other; it must also demonstrate logical internal consistency among program mission, goals, objectives, and outcomes.
As a result, writing the mission statement is an iterative process of successive approximations:
- first approximation of mission
- first approximation of goals
- first approximation of objectives
- second approximation of mission, etc.
Therefore, in the initial stages of mission development, a rough listing of the main purposes of a program, and how it fits into the larger mission and goals of the University, might be adequate before moving on to first approximations of program goals and objectives.
The main function of the goals statement is to form a bridge between the lofty language of the Mission Statement and the concrete nuts and bolts of program objectives. In the goals statement, the broad principles of the Mission are narrowed and focused into the specific categories of skills, knowledge, and abilities which will characterize graduates of your program, including those specific to your discipline as well as those representing the broader general competencies implied by Western's mission and strategic goals.
The goals statement essentially becomes a blueprint for implementing the mission by answering the following questions:
- How do program goals relate to the program mission?
- How does this program fit into a student's overall development?
- What general categories of knowledge and abilities will distinguish your graduates?
- For each principle of the mission, what are the key competency categories graduates of the program should know or be able to do?
As discussed above in the "overview" section, general competency goals might include the four integrative abilities being considered as possible statewide required accountability goals (writing, information technology literacy, quantitative reasoning, and critical thinking), as well as the fourteen areas of alumni satisfaction, listed in the overview above, that Washington State currently wants assessed in alumni surveys.
Each major department must take responsibility for promoting and assessing student development across the range and level of abilities appropriate to its programs, including both majors and general education students. Therefore the program goals statement should include all of the key competency areas which the program or its courses address, for both majors and non-majors.
Program objectives are brief, clear, focused statements of specific intended learning outcomes. Each objective should link directly to one or more program goals, and each should be defined with outcomes assessment criteria in mind for "measuring" how well it has been accomplished.
Stating each objective in the form of an "action verb" combined with a description of a very specific ability helps translate objectives into learning outcomes students can actually demonstrate and faculty can actually measure. The verb form emphasizes that objectives can be assessed by examining specific products or behaviors. By implication, each objective must have associated criteria for evaluating the success of the program in terms of the actual accomplishments of its graduates. For example, here are some sample learning objectives from the Human Services program:
- Examine the history and philosophies of human services
- Identify what constitutes genuine and empathic relationship
- Analyze the role of conflict in individual and societal systems
- Demonstrate a broad range of relevant communication skills & strategies
- Design integrated services using innovative practices in diverse settings
Two kinds of learning objectives: mastery and development
There are two general categories of learning objectives. Mastery objectives establish minimum criteria for the acquisition and demonstration of foundational skills or knowledge; what matters is the attainment of a minimum or threshold level of competence. Mastery objectives are therefore measured on a binary scale: pass/fail, satisfactory/unsatisfactory, etc.
In contrast, developmental objectives imply a sequential continuum of integrative abilities. In general these include two distinct categories of abilities to be assessed as student learning objectives: general, across-the-curriculum abilities, and abilities specific to the major. Developmental objectives form a hierarchy of sequential skill levels which become the basis for particular course sequences within a program.
Because developmental objectives are best represented as a sequence of checkpoints for student learning, it is important and useful for departments to establish criteria for defining and assessing several different levels of developmental abilities, and to associate the attainment of sequential levels of such abilities with specific courses or groups of courses in their programs. In this way program objectives can be integrated meaningfully into individual courses, and learning objectives for one course become prerequisite knowledge for more advanced courses.
For example, a sequence of developmental objectives might include:
- Demonstrate observational skills
- Draw reasonable inferences from observations
- Demonstrate perception of important relationships in observations
- Analyze structure and organization
- Select and apply appropriate theoretical constructs to observations
Both mastery and developmental objectives can be associated with a wide variety of competencies:
- Cognitive development--area and level
- Technical skill development--skill and level
- Process skill development--skill and level
- Comprehension--type and level
- Integrative thinking/creativity
- Attitudes, behaviors, and values
- Development of desirable personal/professional qualities
Learning outcomes are observable indicators or evidence of actual student learning. Each program must select an array of assessment tools, which can include both direct measures of student knowledge and performance, and indirect measures of changes in student behavior, attitudes, or values.
Direct measures include national standardized tests; licensing or certification exams; local content or competency exams, papers, or projects; skills tests, projects, reports, demonstrations, or performances; portfolio analysis; capstone projects, experiences, or performances; email or online discussion board content; and so forth.
Indirect measures include surveys of students, alumni, or employers; student or graduate profiles, interviews, or focus groups; transcript analysis; periodic review of syllabi, textbooks, exams, or other curricular materials; and so forth.
Each program will have its own unique needs and its own set of outcomes. What is important is that each outcome provides evidence about the accomplishment of a particular program objective. Ideally, each objective will be assessed by multiple outcomes measures so that:
- Each outcome is a measurable estimator of a program objective
- Outcomes selected are feasible measures given the resources available
- Outcomes link actual student learning to intended post-graduate abilities
- Outcomes accurately reflect ability and knowledge
- Outcomes can be direct or indirect measures
The whole point of assessment is to establish an ongoing, systematic mechanism for assessing, reviewing, and improving programs. Therefore each program assessment plan must include explicit procedures for determining which outcomes will be measured; when they will be measured; who will measure them; who will analyze them; what results will be reported, and to whom; and how results have been implemented.
This is the step in the assessment cycle that makes assessment relevant, and it is the step which is likely to be most scrutinized by outside agencies. The "accountability" aspect of assessment is the requirement to document how assessment findings have been used to guide program improvement.
Currently Western is using an annual survey of academic departments to gather information on program assessment plans. This procedure is likely to be modified in the future into a uniform reporting format that includes program mission, goals, objectives, outcomes, and procedures, along with a cumulative listing of program improvements that have been made as a result of assessment findings.
Therefore, this section of each plan must show not only how results have been applied to program improvement in each annual cycle, but must also analyze what results say about program effectiveness and about the impact of assessment-induced changes on program effectiveness over time.
At present Western has no common assessment activity reporting requirement or format for academic units. In the past, assessment information has been gathered from units via an online survey, and data from the survey has been collated and used to construct assessment reports for accreditation and for the State.
In the future it is quite likely that Western will adopt some common reporting format for academic units, which will generally follow the structure described in this section and shown in the figure below:
- Program mission statement
- Program goals consistent with mission statement
- Multiple learning objectives (intended learning outcomes) for each goal
- Measurement of multiple outcomes for each learning objective
- Assessment criteria for each learning objective
- A framework for data analysis and program improvement
- Documentation of how assessment results have improved both programs and assessment criteria and procedures
A number of departments and units at Western have already made a serious commitment to developing program and course assessment plans, and are using assessment data in many different ways to improve student learning. The information presented here is meant to show some of the diversity of work in progress in different programs across campus. It is by no means exhaustive, nor are any of the results shown here "final" in any sense. Rather, they are snapshots of the ongoing evolution of the assessment of student learning at Western.
- Engineering Technology
This is an example of a departmental model designed in response to the requirements of Industrial Advisory Boards, the appropriate accrediting agency, and several professional organizations. It features learning outcomes in the particular skill areas of analysis, communication, teamwork, technology, creative problem-solving, ethics, and professionalism.
- Physical Education Outcomes Assessment Plan
- Recreation Program Outcomes Assessment Plan
Based on a learning outcomes development process created at California State University at Chico, these outcomes assessment models for Physical Education and Recreation identify Learning Objectives, Learning Processes, Assessment Techniques, Status/Outcomes/Results, and Decisions/Plans for Future Recommendations.
- Environmental Studies Introductory Course Assessment
Huxley College of the Environment moved to revise its core curriculum and entry-level GUR course to place greater emphasis on problem-solving and interdisciplinary integration of subject matter. The course revisions were designed as part of a larger curricular change, which includes this entry-quarter core experience, to be followed by a choice of other substantive courses to complement the student's major, and finally, another integrative, problem-based capstone course. Analysis of student papers using the Perry scheme was conducted along with student self-evaluation to assess student learning.
- College of Business and Economics MBA Outcomes Assessment
This study in the MBA program uses a "Post-Then" methodology, asking students to rate their learning in the program by comparing their current levels of expertise in various areas with their levels upon entry into the program. Both the rationale and the results of the assessment are presented in this preliminary report.
- Geology Program Trial of Critical Thinking Rubric
In Spring of 2002 the Geology Department faculty tested a critical thinking rubric adapted from one developed at WSU, using a number of raters. Although ratings varied unpredictably between some raters on some questions, most faculty found the rubric promising and useful, and plan to develop it further.
- Woodring College of Education State Program Approval for Math / Woodring College of Education State Program Approval for Health & Fitness
Washington State places special requirements on Education programs for assessment of student learning among future teachers, with different assessment plans required for different specialties. Here are the competencies for two of those programs, together with outlines of where in the curriculum these competencies are to be learned, and how they are assessed.
WWU Primer on improving the Teaching, Learning, Assessment Cycle
Janice Lapsansky, Project Leader
Beginning in June 2003, four faculty members engaged in a pilot project designed to lead Western Washington University’s teaching community in the regular and systematic assessment of student learning outcomes (SLOs) in undergraduate courses. Supported by the Vice Provost for Undergraduate Education and the Office of Institutional Assessment, Research, and Testing, this effort was launched as one mechanism to facilitate the transition from program assessment to individual course assessment.
Sustainable implementation of regular and formal classroom assessment was the central objective of this project. The program focused primarily on assessment of learning outcomes in general education courses within four disciplines: History, Environmental Science, Psychology, and Geology. The primary goal was to produce exemplars of classroom assessment, incorporated into an instructional design with the potential to help students become more reflective and effective learners.
Each participating faculty member selected a student research partner to access the student perspective in order to gain a deeper understanding of student learning and of effective communication of course objectives in classroom practice. The structure of the program and the methods employed were characteristic of scholarship in any field: incorporating literature review, creativity, collaboration, enactment, and dissemination activities. The project leader solicited mid-point and final evaluations from participants to document and make improvements to the process.
Below is a graphic depiction of the assessment learning cycle in which the four faculty participated. More detailed synopses of each faculty member's contributions to the project are available on the links above to environmental science, geology, history, and psychology.
Environmental Science - Scott Brennan
Faculty Member Statement by Scott Brennan,
Huxley College of the Environment
I selected Environmental Science 101, Western’s largest lecture course, and Environmental Studies 481, an upper-division environmental journalism course, as the two courses for the summer 2003 Teaching, Learning and Assessment work supported by the Provost's office.
Teaching Environmental Science 101 presents special instructional and assessment challenges because of the course's size (~475 students) and the interdisciplinary and applied nature of its subject matter. I have taught this introductory GUR course 12 times since Fall 1999 and was pleased to have this opportunity to revisit the course's features, goals, intended learning outcomes, and assessment while working with a team of faculty colleagues and student research partners.
Environmental Journalism, Environmental Studies 481, is a small, senior-level course intended to provide students with an immersion experience in investigative environmental feature writing and the tools required to work as a staff reporter or freelance writer. This course presents special assessment challenges because of the sometimes subjective nature of evaluating student writing and the highly diverse backgrounds and differing levels of writing experience students bring to the course. Students in this class also gain valuable experience and insight into their own writing by editing and being edited by their peers in the class.
These two courses serve very different programmatic needs and present very different instructional and assessment-related challenges but I am confident that students in both courses will benefit from the application of Teaching, Learning and Assessment work to their syllabi. I am also hopeful that the lessons I have learned while revising both courses will be valuable to a diverse group of faculty who are interested in revising their courses as well.
Student Research Partner Statement by Jillian Martin,
WWU Junior, major undeclared:
As a student new to learning assessment, I had vague ideas as to what we would accomplish this summer, but I did not initially realize the integral role I could play as a student research partner. As an undecided junior at Western I felt participating in learning assessment would be a great opportunity to become involved in Western's community and explore possible major interests. I was interested in getting involved in this project after taking Environmental Science 101 with Scott Brennan spring quarter of 2003. The subject matter was of great interest to me and I felt compelled to offer my critique of the course and participate in its improvement. One of my goals this summer was to help create new and more focused outcomes and to help students gain a higher level of personal involvement and interest in understanding and solving environmental problems.
I found it easy to be excited about revision to the 101 course because it is such an eye opener to the problems and possible solutions of our world. I feel the role of the course is to show students how we use our planet, what is happening as a result, and help them to understand the fragile balance of life. As a recent ESCI 101 student I had specific ideas of what needed to be improved, what worked well, and what could be done to get more student involvement and motivation. I hope the revised course will provide every student with more specific information that applies to their own personal choices and gives them the same opportunity and experience I had in ESCI 101 during the spring of 2003.
Environmental Science 101 is a 3 credit, large lecture course that meets WWU's B-Science GUR requirement. The current University Bulletin describes the course as follows:
An introduction to environmental studies stressing a scientific approach toward understanding the nature and scope of contemporary problems in the human environment. The course reflects applications of physical, chemical, biological and geological principles to define ecological change both natural and anthropogenic.
The course goal or purpose, according to the most recent syllabus, is:
To investigate the relationship between human life and the environment from a scientific perspective, illustrating current and emerging problems and potential solutions, while increasing students' awareness of their individual impacts on environmental systems.
To develop measurable Intended Learning Outcomes consistent with the course description and purpose, we (Scott Brennan and Jillian Martin) first reviewed recent course evaluations to identify the course's strengths and weaknesses. Several themes emerged from this analysis.
Strengths:
- The course makes scientific subject matter relevant to students on a personal level
- The course's subject matter illustrates the connection between individual choices and contemporary problems in the human environment
- The course's subject matter is diverse and varied
- The course uses a variety of media and diverse means of presenting material that are suitable for students of all learning styles
- The course provides students with opportunities to be involved in class discussions and projects as well as community-based solutions to environmental problems
Weaknesses:
- The course did not provide enough of an introduction to basic science and scientific reasoning
- The course did not adequately emphasize the systemic nature of environmental problems
- The course did not provide enough information regarding solutions to environmental problems
- The course did not provide enough opportunity to critically evaluate divergent views of environmental issues
- The course did not provide enough opportunity for students to apply scientific principles and knowledge of environmental interrelationships to their own lives and their community
We then critiqued the existing Course Objectives, listed below:
- To introduce you to environmental science, its central ideas, concepts, models and applications
- To help you apply the fundamentals of environmental science to important local, regional, national and global environmental problems and potential solutions
- To give you an opportunity to analyze and discuss the relevance of environmental science to your personal, professional, and academic life
We concluded that, while useful, these Course Objectives were not easily quantified and were teacher-centered rather than learner-centered. To address this problem we decided to restate these Course Objectives as measurable, learner-centered Intended Learning Outcomes prefaced with “Upon successful completion of this course, you should be able to…” We worked to ensure that these Intended Learning Outcomes emphasized the course purpose, embodied existing course strengths, and addressed current course weaknesses. The resulting Intended Learning Outcomes, and the means of enabling students to achieve them, are given below.
Upon successful completion of this course you should be able to:
- Describe the structure and function of significant environmental systems. (readings, lecture)
- Use scientific reasoning to identify and understand environmental problems and evaluate potential solutions. (L.E.A.D. community volunteer projects, in-class talk shows, readings, lectures)
- Critically evaluate arguments regarding environmental issues. (online readings, talk shows)
- See the impact your way of life has on the environment. (ecological footprint calculator)
- Apply your understanding of environmental issues to your own choices. (change in ecological footprint over the quarter)
| Intended Learning Outcomes (measurable and student-centered) | How do students learn to do this? | What evidence is there that students are learning this? | What additional information is needed to understand how well students are learning this? | What possible new or improved assessment techniques might be used? |
|---|---|---|---|---|
| Describe the structure and function of major environmental systems | Readings | | | |
| Use scientific reasoning to identify and understand environmental problems and evaluate potential solutions | Readings; extra credit essays | | | |
| Critically evaluate arguments regarding environmental issues | Online readings; talk shows; guest lectures; readings | Exams; talk shows | Written work; discussion board | |
| See the impact their own lives have on their environment | Ecological footprint calculator | Exams | Change in ecological footprint over quarter | Pre/post footprint calculation and survey |
| Apply their understanding of environmental issues to their own choices | Videos | | Change in choices over quarter | Pre/post footprint calculation and survey |
Environmental Science 101 Teaching Practices and Curriculum Design Features
The following are a few examples of teaching practices and curriculum design features and the Intended Learning Outcomes they support. Many of these practices and features could be applied to other large lecture courses.
Readings, Lectures and the Testalator/Weekly Quizzes
At the end of each week, students have an opportunity to complete an online quiz and self-assessment based on that week’s readings and lectures. The results of this self-assessment give students an accurate understanding of their comprehension of the relevant scientific content of that section of the course.
Intended Learning Outcome Supported:
Describe the structure and function of significant environmental systems.
In-class Talk Shows
Periodically throughout the term students receive a series of take-home essay questions regarding current, contentious environmental issues. For extra credit, all students have an opportunity to post their responses to these questions on the online class bulletin board. The instructor selects 4-5 of the most thought-provoking and diverse responses and invites their authors to serve on an in-class panel or talk show during the next class session. This facilitated discussion between the panelists and the other students in the class provides students with opportunities to work toward several Intended Learning Outcomes.
Intended Learning Outcome Supported:
Use scientific reasoning to identify and understand environmental problems and evaluate potential solutions.
Critically evaluate arguments regarding environmental issues.
Ecological Footprint Calculator
This online interactive exercise asks students questions about their daily food, housing, transportation and other consumer activities and calculates the total land area required to support their lifestyle. By completing this exercise students are able to better understand the connection between their choices and changes in environmental systems.
Intended Learning Outcome Supported:
See the impact your way of life has on the environment.
Apply your understanding of environmental issues to your own choices.
Environmental Science 101 Selected Assessment Tool, Ecological Footprint Calculator:
The U.S. accounts for less than 5% of the Earth’s population but consumes more than 25% of many of its key resources. As a result, the average American requires almost six times as much of the Earth’s land area to produce the goods and services, and absorb the waste, resulting from his or her lifestyle than does the average person outside of the U.S. Quite simply, if everyone on Earth were to make the consumer choices that the average American makes, we would need the resources of nearly six planets.
As a result, it is critical that our students understand their Ecological Footprint, or total amount of land required to produce the raw materials and handle the waste products that they produce. The Ecological Footprint Calculator exercise described below will play a key role in allowing students to increase their awareness of the impact of their choices on environmental systems and, hopefully, to begin making different, more sustainable choices.
Students will complete the Ecological Footprint Calculator at the beginning of the quarter and record the total land area required to support their lifestyle. At the end of the quarter, students will recalculate their Ecological Footprint and answer the following questions:
- What was your Ecological Footprint at the beginning of the quarter?
- List the land area required to support each category (e.g. food, housing, transportation) of your consumer choices at the beginning of the quarter.
- What was your Ecological Footprint at the end of the quarter?
- List the land area required to support each category (e.g. food, housing, transportation) of your consumer choices at the end of the quarter.
- Did you make any deliberate changes in your lifestyle during the quarter to change your ecological footprint? If so, describe these changes and your reasons for making them. If not, describe your reasons for deciding not to make any changes.
- What did you learn about the impact of your choices on the environment during this quarter?
Because of the extremely large class size, it is not feasible to analyze the responses from every student. Therefore we will randomly select a group of students and analyze their responses to the preceding questions, and we will calculate the total impact of the class at the beginning and at the end of the quarter. If the cumulative class ecological footprint and the mean individual footprint decline significantly during the quarter, we will conclude that the Intended Learning Outcomes “see the impact your way of life has on the environment” and “apply your understanding of environmental issues to your own choices” have been achieved. We will use this information, together with the random sample of student responses to the questions above, to determine how successful our assessment tool has been.
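The pre/post analysis described above can be sketched in a few lines of code. The footprint values below are fabricated purely for illustration (the real numbers would come from students' calculator submissions), and the sample size of 30 is an assumption, not a figure from the course plan:

```python
import random
import statistics

# Hypothetical data: pre- and post-quarter ecological footprints
# (in acres) for each registered student. In practice these would
# come from the online Ecological Footprint Calculator records.
random.seed(42)
pre_footprints = [random.uniform(18, 30) for _ in range(475)]
post_footprints = [f - random.uniform(0, 4) for f in pre_footprints]

# Cumulative class footprint at the beginning and end of the quarter.
pre_total = sum(pre_footprints)
post_total = sum(post_footprints)

# Mean individual footprint before and after.
pre_mean = statistics.mean(pre_footprints)
post_mean = statistics.mean(post_footprints)

# Randomly select a manageable subset of students whose written
# responses will be read in full (the class is too large to read all).
sample_ids = random.sample(range(len(pre_footprints)), k=30)

percent_change = 100 * (post_mean - pre_mean) / pre_mean
print(f"Mean footprint: {pre_mean:.1f} -> {post_mean:.1f} acres "
      f"({percent_change:+.1f}%)")
print(f"Cumulative class footprint: {pre_total:.0f} -> {post_total:.0f} acres")
print(f"Students sampled for response analysis: {len(sample_ids)}")
```

A meaningful decline in both the mean and cumulative figures would support the conclusion that the two targeted Intended Learning Outcomes were achieved; deciding what counts as "significant" would still require a judgment call or a formal test.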
Connection to Ongoing Curricular Assessment
Analyzing the data generated over the quarter, including student responses to the questions listed above and the cumulative class ecological footprint, will enable us to understand the extent to which students have achieved two key Intended Learning Outcomes. By analyzing the reasons that some students decided against making lifestyle changes to reduce their ecological footprint we will be able to improve the curriculum in future quarters to increase student participation in our Ecological Footprint reduction exercise.
Additional unresolved issues may present themselves during the quarter, but at this time the key variables will likely be student participation in the ecological footprint calculator and the instructor’s ability to elucidate connections between personal consumer choices and environmental problems and their solutions.
Environmental Journalism, ESTU 481, is a four credit, writing intensive course required of all environmental journalism majors. It is also open to all Western students who are interested in environmental journalism. It is a small course, typically enrolling 15 upper division students.
The course description, according to the most recent WWU Bulletin is:
Goal is to equip students to report and write clearly, critically and constructively on environmental and natural resource issues. Emphasis on writing articles for publication involves reading, discussion and much research and writing.
The course purpose and objectives as of Winter 2003 were:
The purpose of this course is to enhance your ability to write effectively about environmental issues and to teach you the skills you need to ensure that your work is published and reaches a wide audience.
- To write environmental news and feature stories.
- To learn how to work as a freelancer and sell your work.
- To study and critique the work of other environmental journalists.
- To discuss techniques, business, ethics and other issues with working environmental journalists.
- To complete a final project emphasizing one aspect of environmental journalism of particular interest to you.
As a result of our reflection on this course, its strengths and areas for improvement, we have revised the existing learning objectives to reflect a learner-centered, assessment-oriented approach.
Revised learning objectives:
Upon successful completion of this course you should be able to:
- Write effective query letters.
- Conduct effective interviews and investigations.
- Craft compelling environmental stories built around real places and real people.
- Write effective environmental and natural resource-related news and feature stories.
- Sell your work as a freelance environmental journalist.
Environmental Studies 481 Teaching Practices and Curriculum Design Features
The Intended Learning Outcomes work we devoted to this course during the summer of 2003 was much narrower in scope than were our efforts related to ESCI 101 (above). The primary objective of our work on this course was to address the most common complaint evident in student evaluations of the course in recent years. While the course as a whole has received extremely positive course evaluations, many students have expressed frustration with the evaluation of their work and the difficulty they have had understanding and applying evaluative criteria. Students have cited unclear and insufficient evaluation criteria and a lack of clarity regarding faculty expectations.
Because defining “good writing,” beyond, of course, the fundamentals of sound composition, can be as challenging as defining “good food,” this is a common and perhaps expected complaint regarding such a writing-intensive course. In past years I have attempted to refine the evaluation criteria and express them in a variety of ways, but during the summer of 2003 I decided to draw upon the strengths of evaluation rubrics and democracy to develop a new evaluation paradigm and accompanying assessment tool for the environmental journalism course.
Environmental Studies 481 Selected Assessment Tool: The Stakeholder Driven Rubric
In past quarters, students were presented with a series of criteria that define “good writing” in the context of this course. These criteria have been presented in writing (both in and outside the syllabus), verbally in lectures, and through in-class critiques of student and professional writing. During the next offering of this course (Winter 2004) students will collaborate with faculty to develop and implement their own criteria through what we have decided to call Stakeholder Driven Rubric Development (SDRD).
The SDRD process will work as follows:
- During the first week of classes each student will find two examples of what they define as excellent works of environmental journalism.
- Students will bring these examples of excellence to class and discuss their reasons for classifying them as such.
- The instructor will facilitate an in-class discussion of the nature of excellence in environmental journalism and, out of this discussion, will assist the students as they define excellence in the context of measurable criteria.
- The students and instructor will come to consensus on the criteria defining excellence in environmental journalism and assemble a rubric that they will use to peer-edit drafts of assignments and that the instructor will use to grade final drafts of these stories.
Environmental Studies 481 Summary
It is our hope that by engaging the students themselves in the process of defining excellence and codifying criteria through Stakeholder Driven Rubric Development, student learning will be enhanced and students will better understand and implement the characteristics defining excellence in environmental journalism. The Stakeholder Driven Rubric that was developed during Fall Quarter 2003 is shown below:
Environmental Journalism Writing Rubric
Instructions for use:
Please use the following scale to evaluate drafts of your own work and the final articles and essays shared with you by your peer review partner. Please print this sheet out and rate each essay using these criteria and this scale.
Scale: 0 (fails to accomplish the criterion) through 5.

Criteria (score each on the 0-5 scale and add comments):
- Material is fair, balanced, well-researched and properly represented.
- The piece is creative, lively, innovative and inspires personal interest.
- The piece exhibits humanity and respect while educating the audience in a manner free of stereotypes.
- The piece tells a local story in a global context, conveys a sense of place, is timely and offers solutions to the problems addressed.
- The piece exhibits sound grammar, spelling, punctuation, organization, flow, cohesiveness and some modicum of journalistic style.
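As a sketch of how peer ratings on this rubric might be tallied, the snippet below averages each criterion's 0-5 scores across reviewers and then averages the criterion means into a single overall score. The rubric itself does not prescribe how scores are combined, so the equal weighting, and the sample ratings, are illustrative assumptions:

```python
from statistics import mean

# Abbreviated labels for the five rubric criteria.
CRITERIA = [
    "fair, balanced, well-researched, properly represented",
    "creative, lively, innovative, inspires personal interest",
    "humanity and respect, free of stereotypes",
    "local story in global context, timely, offers solutions",
    "grammar, organization, flow, journalistic style",
]

def score_draft(ratings_by_reviewer):
    """Average each criterion's 0-5 scores across peer reviewers,
    then average the criterion means into one overall score.
    `ratings_by_reviewer` is a list of five-element score lists."""
    per_criterion = [mean(scores) for scores in zip(*ratings_by_reviewer)]
    return per_criterion, mean(per_criterion)

# Hypothetical ratings from three peer reviewers for one draft.
ratings = [
    [4, 3, 5, 4, 4],
    [5, 4, 4, 3, 5],
    [4, 4, 5, 4, 4],
]
per_criterion, overall = score_draft(ratings)
for name, s in zip(CRITERIA, per_criterion):
    print(f"{s:.1f}  {name}")
print(f"Overall: {overall:.2f} / 5")
```

Keeping the per-criterion means visible, rather than reporting only the overall score, preserves the diagnostic value of the rubric for revision.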
Geology - Thor Hansen
Geology 101 is an introductory general education course with an enrollment of around 125 students. Most students in the course have never had a geology course and do not intend to take another one. Although it is attended by non-science majors, and in fact many of the students are science-phobic, many new geology majors are recruited from this course. I engaged in this learning outcomes/assessment project because I wanted to transform this course from a survey/content-rich/cover-everything approach to one where the students developed specific competencies in geology and left the course seeing the world from a new geological perspective.
As a result of this course, I want students to notice geological features through their car window or on the beach, be able to make appropriate geological observations and understand the origin or significance of the feature in question. I want them to be excited about these discoveries and tell their friends and family about them. In order to achieve this level of geological competency, the students need to learn appropriate geological content and get practice in applying this knowledge in novel situations. As they become confident in their ability to make geological interpretations, I hope they will discover the relevance of geology to their everyday life and enjoy it. My student partner, Michele Malone, and I developed six specific learning objectives for Geology 101, each accompanied by an activity through which students should be able to demonstrate skill.
- Recognize examples of the three major rock types (igneous, sedimentary and metamorphic) based on visible physical characteristics and explain how these rocks formed. (Look at a rock on the beach, or a photo of a rock, and formulate a brief geological history of it based on visible physical characters.)
- Understand the origin, distribution, and classification of volcanoes. (Analyze a map of a volcano and predict the regional risks of blast, mudflow, lava flow and ash fall in the event of an eruption, e.g. will the town of Glacier survive if Mt. Baker erupts?)
- Understand the physics, distribution, origin of and damage caused by earthquakes. (Assess the earthquake risk of a building based on type of bedrock, type of construction, and the nature and frequency of earthquakes in the region, e.g., what is the most earthquake-dangerous building in Bellingham?)
- Understand the role of plate tectonics as a general explanatory hypothesis for a variety of geological phenomena. (Predict the distribution of mountains, earthquakes, volcanoes and islands that you would find in an area based on the nature of the plate tectonic boundaries, e.g., why do we have earthquakes and volcanoes but New York does not?)
- Understand how air, water and gravity have shaped Earth's surface. (Describe the landscape out of the window of a car or airplane and construct a brief geological history of the area, e.g., how did the Grand Canyon form?)
- Understand the influence of environmental hazards (e.g. flooding, landslides) on society (e.g. resource use, voting decisions). (Evaluate the geological qualities and hazards of a potential building site and know what questions to ask about the geology of the area that are pertinent to building construction, e.g., is this a good place to build a house?)
Translating the learning objectives into active skills not only makes them inherently assessable, it also provides a guide to me for exactly what content should be taught and gives a rationale to the student as to why they have to learn this particular fact. For example, if the target skill is to be able to identify rocks on a beach, the students will need to know how to recognize visible minerals and structures in rocks. There is no reason for the professor to discuss non-visible microscopic features or rocks too unusual to normally be seen in nature. The content lesson will be immediately followed by application (“What is this rock?” referring to a photo in class or an actual specimen). The student sees the relevance of the material just learned and internalizes it by repetition and discussion with peers. Competency at an activity like this employs higher order thinking skills than simple memorization, because the student must evaluate and analyze rather than repeat facts. Although some content will be sacrificed from the survey approach normally used in Geology 101, far more of what they learn should be retained.
The structure of the class has been redesigned to both teach and assess the six competencies listed above. Take, for example, outcome/skill number 1 (recognize rock types). The lecture will cover common rock-forming minerals and the significance of the three major rock types. Student comprehension during this phase will be assessed daily using the “one-minute write” format. The lab will also have activities on identifying minerals and rocks using purchased lab specimens, each of which exemplifies a particular type of rock.
The rock identification activity will be personalized by giving each student a “pet rock” at the beginning of the course, collected at a local beach. They will be told to take this rock to both lecture and lab in order to find examples of other rocks like it and to help them identify it. It is important that this is a “wild” rock that has been collected locally and not a laboratory specimen chosen because it best characterizes a certain rock type. During the course the students are to identify their rock using the principles outlined in the lectures and lab and to tell its “story”: how it formed, what kind of tectonic regime it characterizes (e.g. ocean floor, continental crust, etc.), and how it might have arrived at the beach where it was found. The rocks will be numbered and each student will be registered with their rock. All the rocks will be pictured on a web site, and each student will log onto this site and tell the story of their rock. I will then comment on, ask for revision of, and/or approve their story on the web page. All rocks and comments will be available for all students to see, thus modeling the story-telling process. I will have the students share their stories in small groups, making sure that each group has several different rock types, thus promoting peer discussion and practice at rock story-telling.
I will bring closure to the exercise by showing a geological map of the region and helping them infer where their rock originated and how it was transported to the beach. The results can then be assessed by showing pictures of rocks from entirely different areas and assigning a one-minute write.
WWU Primer on Improving the Teaching, Learning, Assessment Cycle
History - Kathleen Kennedy
I am an Associate Professor of History at Western Washington University. I came to Western Washington from Texas, where I taught for three years in the School of Arts and Humanities at the University of Texas at Dallas. I completed my Ph.D. at the University of California, Irvine. I have written on state repression of dissent during wartime; amnesty and civil liberties movements in the United States; feminist theory; gendered and racial violence; anti-Catholicism; and Joan Jett and Xena, Warrior Princess. Despite spending my adult life west of the Mississippi, I was born in, and retain a primary allegiance to, the Northeast, especially in sports. I am optimistic that most of my students take history courses because they love history and believe that someday the Red Sox will win the World Series. When not teaching or writing, I spend my days watching sports and Babylon 5 reruns and playing with my dogs, Eliot and Sampson.
What my students say about me when they think I am not listening:
- She hangs upside down in her office
- Buffy could take Xena in a minute
- The Backstreet Boys have talent
- Cats are not vermin
- The Red Sox will never win the World Series
My primary reason for participating in this project was to align my individual assignments with my course goals. I was concerned that my assignments did not always measure the skills that they claimed to measure. I began this project as a skeptic and, like many faculty members, did not want assessment imposed on me by either legislative mandate or colleagues unfamiliar with the humanities. I anticipated that my participation would enable me not only to write better assignments but also to translate the useful literature on assessment for my colleagues in the humanities. By doing so, I hoped to retain control over the learning process in my classroom and to provide a language for my colleagues to address assessment requirements without compromising what they do in the classroom. It was my hope that once my colleagues were reassured that they would maintain this control, they would see assessment projects as an opportunity to improve their courses rather than a threat to their pedagogy.
Ed Chatterton, Student Research Partner:
My name is Ed Chatterton, and I am a graduate student in the History Department. I am what many would call a "returning" student; that is, a student returning after a career in a non-academic field. My decision to participate in this project was fueled by three distinct considerations. First, my respect for Dr. Kennedy and a realization that anything she was involved in would be excellent. Second, the opportunity to interact with teachers and fellow students all motivated by a similar goal: the improvement of their individual and corporate learning skills. I would be dishonest if I did not acknowledge that the opportunity to be paid for an experience I would gladly have purchased was also a factor.
My experience in the non-academic "real world" has convinced me that knowledge in any particular field is often "caught" rather than "taught." Teachers plan curricula, syllabi, and evaluation procedures well, but much of what is actually learned does not originate in the structured, traditional classroom paradigm. Clearly stated intended learning outcomes; carefully crafted learning opportunities (including lectures, labs, cooperative and collaborative learning methods, and individual student research); and realistic, innovative assessment tools are the primary goals of this Teaching and Learning group.
History 103 fulfills the General Education Requirement in the Humanities. It introduces students to the first half of American history (ancient Native American societies through the Civil War) and typically enrolls anywhere from seventy-five to one hundred twenty-five students. When I teach the course in the fall, most of these students are first-quarter freshmen, some of whom are enrolled in a FIG cluster. Like most history courses, it has a heavy writing component: members of the History Department use essay exams and generally assign anywhere from five to fifteen pages of writing in lower-division courses. The course has the dual burden of teaching students the basic content of early American history and introducing them to the practice of history.
When I first began including learning objectives on my syllabus, I used as a guide a rubric developed by the History Department at California State University, Long Beach. This rubric was the result of a project designed to examine and articulate what first-year history students should know. I have revised this rubric to make it specific to History 103 and to arrange it in accordance with Bloom's taxonomy. By using the language of Bloom's taxonomy, I hope to better articulate how these objectives move students from basic knowledge to critical thinking.
- Place in time key historical events and actors
- Identify and evaluate multiple perspectives and approaches to historical understanding
- Identify and evaluate the diversity of “American” experiences
- Formulate and defend an historical argument
- Write clearly, economically and persuasively about historical problems
- Develop a proper foot/end note
- Locate appropriate primary sources
- Distinguish between primary and secondary sources
- Interpret different types of evidence
- Detect and appraise bias and point of view
- Draw conclusions and inferences from examined evidence
- Formulate historical questions
- Create, organize and support an historical argument in written and oral presentation
- Assess and prioritize historical causes
These learning objectives are measured by three assignments:
- Two history papers that articulate a clear historical argument developed from a careful analysis of primary sources
- Two to three short essay exams that require students to define key concepts, identify and discuss the significance of key people and events and to place those concepts and events in historical time and context
- Participation in a discussion list in which students contribute to ongoing discussions and arguments about key historical events, their causes and meanings.
The assignment I will focus on here is the history paper. Because of class size, students write only two history papers of between 1,000 and 1,250 words. These papers are modified research essays. By modified I mean that students do not actually locate in the library the primary sources that they will use in the essay; instead, they are provided with a series of primary sources in an assigned reader. They are also provided with a topic and a series of questions that their essay must address. I needed an assessment tool that would enable me to measure the quality of their essays. I wanted a tool that would give students a sense of the overall quality of their essay and that would break the essay down so that students could understand how well they had addressed the various parts of historical writing. To this end, I chose a writing rubric that gives students a visual picture of my assessment. I have also found that the rubric leads to a more consistent assessment of student essays, as it ensures that I articulate clear standards.
WWU Primer on Improving the Teaching, Learning, Assessment Cycle
Psychology - Mike Mana
Greetings and Salutations! My name is Mike Mana. I am a physiological psychologist and an Assistant Professor in the Psychology Department at Western. I received my Ph.D. in Psychology from the University of British Columbia in 1990. I then spent 5 years as a Research Fellow in the Neuroscience Department at the University of Pittsburgh, and 4 years as an Assistant Professor in the Biology Department at Chatham College in Pittsburgh, before coming to Western in 1999.
My interest in the process and product of pedagogy dates back to my earliest TA experiences at the University of British Columbia; I have always loved the give-and-take atmosphere of a college classroom and the “buzz” that comes from sharing (as opposed to simply talking about) my interest in brain and behavior. My teaching philosophy emphasizes the notion that students learn best when their participation in the classroom is encouraged and expected…in the selection of topics included in the course syllabus; in the direction that a class discussion takes; in electronic chat rooms devoted to topics pertinent to the class; and in the application of their growing knowledge to issues of personal and/or public importance and interest.
My interest in the TOLO project is based, in part, on a desire to be able to better gauge the success of my teaching by careful assessment of what my students learn. It is always interesting . . . and humbling . . . to play “This is what I said, but this is what you remember?” with students, especially good ones. In this regard, I am very grateful for the opportunity to interact with two outstanding undergraduate students on the TOLO project, Kyle Nelson and Meghan Manaois.
My name is Kyle Nelson, and I am currently a junior at Western Washington University. I am originally from Tumwater, Washington, where I graduated from Black Hills High School. I plan to graduate in two years with a Psychology major and a Biology minor; my long-term goal is to continue with graduate training in psychology, with the eventual goal of opening a camp for teenagers with low self-confidence. I have taken an Introductory Physiological Psychology course from Dr. Mana and currently work in his research lab. I became interested in the TOLO project because of the opportunity to shape the nature of the courses offered in the Biopsychology area. I feel that lower-level classes need to better prepare students for upper-level courses, while upper-level classes need to better prepare students for real-life applications of their knowledge and skills. Through our work with the TOLO process this summer, I believe that we have moved closer to these goals, making the courses more interesting and also more applicable.
My name is Meghan Manaois and I am a recent graduate of Western Washington University. My involvement in the TOLO program stemmed from an interest in furthering the connection between student and professor; further, the opportunity to refine the syllabus for a class that was so influential in determining my future career plans has been quite a privilege for me. Having worked on the TOLO project, I believe that future students in Psychology 320 will be better informed about the area of physiological psychology in general, and better prepared if they choose a career in this area. The program, in general, seemed helpful in structuring a class to fit everyone's needs.
I am also very happy to have interacted with the other members of the TOLO group this summer. From the outset, I was interested in developing, adapting, adopting, borrowing or outright stealing assessment approaches that would test the gamut of student knowledge (rote detail to broad themes and connections), be interesting and challenging to the student, applicable to their future academic and/or professional endeavors…and perhaps equally important in a time of shrinking budgets and increasing class size, entail little additional faculty effort. Working with the other members of the TOLO group provided ample stimulation in all of these areas…thanks to one and all!
The course that we have chosen to highlight on the TOLO website is Psychology 320: Topics in Physiological Psychology. Designed for students with more than a casual interest in the area of brain and behavior, Psychology 320 is the second course in a sequence of courses that begins with Psychology 220 (Introduction to Physiological Psychology) and ends with a number of 400-level seminar courses on specific issues in the area of physiological psychology. Psychology 320 provides a focused and detailed understanding of brain/behavior issues that were introduced in Psychology 220. Its main goals are to provide students with a more detailed understanding of the biological bases of behavior; to familiarize students with the different research questions asked, and approaches used, by physiological psychologists; and to develop skills required to effectively evaluate, and communicate about, research in this area.
To meet these objectives, the course has traditionally included 2 multiple-format exams (multiple-multiple choice; modified true-false; fill-in-the-blank; identification; and/or short answer/essay); a series of “target article critiques” in which students learn to critically read primary research literature, and an end-of-term poster session in which they present an original research paper to the other members of the class. In our evaluation of the course and its assessment tools, my student colleagues and I decided to focus on the teaching objectives and learning outcomes associated with the poster presentation.
Psychology 320 focuses on various topics and issues that fall under the heading of physiological psychology. Its main goals are to provide a strong basic background in brain/behavior relationships; to familiarize students with different areas of research in physiological psychology; and to develop the skills required to evaluate, and communicate about, research in this area.
To this end, students in Psychology 320 have traditionally been assigned an end-of-term poster presentation in which they must read and evaluate an original research paper in an area of their choosing, and then design and present a poster describing this research. Students are prepared for their poster presentation in several ways:
- Approximately every 2 weeks, a class is devoted to the “shredding” of an original research paper. In each class, a single paper (distributed 1 week earlier) is read and analyzed in terms of the hypotheses that were tested; the research design, techniques, and analyses employed; its strengths and weaknesses; and the “future research ideas” that were generated. Students must involve themselves in an in-class discussion as well as submit a 2-page critique of the article under consideration.
- In the middle of the term, 1-2 classes are devoted to the nuts and bolts of designing and putting together a scientific poster, using the Poster Guidelines of the Society for Neuroscience.
The inclusion of a poster's preparation and presentation in Psychology 320 is intended to assess students' ability to read and understand research in physiological psychology, to integrate this information into a cogent whole, and to communicate this new understanding to their peers. We believe…and student evaluations support the idea…that the planning, preparation and presentation skills are practical ones for students who plan to go on to a graduate program in the behavioral and brain sciences, as well as students who will leave Western with a B.A. in psychology and join the workforce in some area completely unrelated to psychology.
Student evaluations indicated that the poster sessions were one of the most enjoyed, and valuable, assignments in Psychology 320. However, many students felt they lacked the background necessary to confidently speak in the think-on-your-feet, Q-&-A atmosphere of a poster session. In addition, many students missed the opportunity to develop their writing skills in an upper-level course. After careful consideration, my student colleagues and I decided that the completion of both an end-of-term paper AND an end-of-term poster presentation was cruel and unusual punishment. Instead, we elected to change the course requirements to include a mid-term paper and end-of-term poster, on a single topic. We hope that the novel (at least to us!) integration of a midterm paper with an end-of-term poster in our class assessment plan will provide a vehicle that will facilitate students' abilities to gain both breadth and depth of knowledge in a favored area of physiological psychology, while developing technical writing and presentation skills that will be useful to future academic and nonacademic endeavors.
In large part, the knowledge base and skills outlined in the learning objectives for Psychology 320 are slowly acquired during each class period over the entire quarter. With that said, my student colleagues and I have selected several key classes and assessment tools included in the syllabus that are particularly relevant to the intended teaching objectives and learning outcomes required for the paper-and-poster assignment that we have focused on for the TOLO website.
Critical Thinking/Public Speaking Skills. My student colleagues and I believe that the first formal (i.e., not happening every class) teaching objective/learning outcome to be assessed that is relevant to the end-of-quarter poster presentation will occur during the first “article critique”. In this class, we single out an original research paper and explore its strengths and weaknesses. Most students in Psychology 320 have just begun to appreciate that much of what is reported in science…especially the behavioral and brain sciences…is not dualistic (aka the Perry scheme's notion of right v. wrong) but instead is multiplistic (there are several possible answers to a question) and/or relativistic (the answer depends on the conditions). They are intimidated by the challenge of CRITICALLY reading the research of a “published scientist” and by the notion that they should question EVERY important claim made in that published article.
To get students past this hump early…to whet their appetites for the critical dialogue that should increasingly become a part of their academic life…I will schedule the first “paper shred” as early as possible in the quarter (i.e., within the first 2 weeks of class). In this way, students will begin to speak publicly about their own critical thinking as early as possible in the course. In addition, they will begin to appreciate the vagaries of scientific research and the advantages and pitfalls inherent to different approaches to physiological psychology research. Some students will leap at the chance to engage in this dialogue; other students will have to be “bird-seeded” with leading questions that encourage their participation. Regardless, each student will be expected to contribute to this 90-min discussion in some way.
Following this class, each student will receive feedback (written comments and a grade) about their contributions to the class discussion and about their 2-page critique of the research article. The first article critique will be worth half as much as subsequent critiques, so that students will not feel penalized in their first attempt at this new academic skill.
Critical Thinking/Novel Synthesis of Existing Knowledge/Technical Writing Skills. The second phase of our teaching objectives/learning outcomes to be assessed will begin at the start of the 3rd week of class, when students will use Blackboard to submit a 1-page outline of their midterm paper. This outline will describe the general area they have selected to write about and lay out the key sections of their paper; references for the articles that each student has gathered to date will also be included. It is our hope that this phase of the process will allow students to accumulate and integrate the background knowledge in their topic area that will allow them to more comfortably handle the Q-&-A environment of the end-of-term poster session.
These outlines will be reviewed for general organization, clarity of thought, breadth v. depth of topic, and quality of references; comments will be promptly returned (within 1 week), and students will be given until the end of the 5th week of classes (about 10 days) to complete and submit their mid-term paper. If they choose, students may submit a draft of their paper for further comments and revision.
Critical Thinking/Focused Description/Technical Presentation Skills. The third phase of our teaching objectives/learning outcomes will prepare students specifically for their poster presentations. During this class, which will occur during the 6th week of the quarter, I will talk to students about the differences between a poster presentation and the term papers that they have just completed, using the guidelines of the Society for Neuroscience as a starting point. The need to focus on a single piece of original research, brevity of text, importance of graphical data presentation, text size, poster layout, and other topics will be discussed using (mostly) model posters from previous classes to highlight key points. I will use Blackboard to provide students with links to sites that provide further instruction on the poster-making process and to provide a forum for students to seek/offer help on their poster presentation.
Following these classes, students will use PowerPoint, Word or a similar program to prepare a mock-up of their poster, including text and key figures, for submission to me via Blackboard by the end of the 7th week of class. Comments will be returned during the 8th week, with comments focused on the key points outlined above.
Think on Your Feet/Poster Preparation/Public Speaking and Presentation Skills. The fourth and final phase of teaching objective/learning outcome assessment comes during the final week of class when each student is responsible for a 5-10 min presentation of their poster. The forum mimics a real scientific meeting…during the in-class presentations, posters line the walls of the classroom, coffee is available so that brains are properly stimulated, and each poster presenter is allowed to “tell their story”, with the caveat being that they can be subjected to a “Q-&-A period” where classmates ask questions about the research under consideration at any point in time.
Assessment of the midterm paper and the end-of-term poster session will be done separately, so that two grades will be assigned. Midterm papers will be graded according to a rubric provided to students at the beginning of the quarter and available at the Blackboard site for the class. Each poster presenter will be graded on the quality of their poster according to a rubric that will be distributed during the class focusing on poster preparation and presentation and available at the Blackboard site for the class. In addition, each poster will be anonymously evaluated by other students in the class, using a rubric distributed to each student at the start of each poster session. Students who are not presenting will be graded on the quality of their participation in the Q-&-A sessions (thoughtfulness of question; clarity of question). At the end of the poster session, each student will receive a single grade based upon the design and content of their poster (40%), their presentation of the poster (30% my grade + 10% summary of peer evaluations), and their interactions during other poster presentations during the Q&A session (20%).
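For concreteness, the poster-grade weighting described above can be sketched as a simple calculation. This is only an illustrative sketch: the component names and the input scores are hypothetical placeholders, and only the weights (40% poster design and content, 30% instructor's presentation grade, 10% peer-evaluation summary, 20% Q-&-A participation) come from the grading scheme described here.

```python
# Sketch of the poster-session grade weighting described above.
# Component names and example scores are hypothetical; the weights
# (40/30/10/20) are the ones given in the grading scheme.

POSTER_WEIGHTS = {
    "design_and_content": 0.40,  # instructor's grade for the poster itself
    "presentation": 0.30,        # instructor's grade for the presentation
    "peer_evaluation": 0.10,     # summary of anonymous peer rubric scores
    "qa_participation": 0.20,    # questions asked during other presentations
}


def poster_grade(scores: dict) -> float:
    """Combine component scores (each on a 0-100 scale) into one weighted grade."""
    assert set(scores) == set(POSTER_WEIGHTS), "missing or extra components"
    return sum(POSTER_WEIGHTS[part] * scores[part] for part in POSTER_WEIGHTS)


# Example with made-up scores:
example = {
    "design_and_content": 90,
    "presentation": 85,
    "peer_evaluation": 95,
    "qa_participation": 80,
}
print(poster_grade(example))  # 0.4*90 + 0.3*85 + 0.1*95 + 0.2*80 = 87.0
```

Because the weights sum to 1.0, the combined grade stays on the same 0–100 scale as the component scores.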
At the end of the quarter, students will use Blackboard to complete a Student Assessment of their Learning Gains (SALG) survey, which will provide feedback on their perception of the success of the combined term paper/poster presentation.