Office of Educational Assessment

Other

Chemistry 142 Placement Exam Development. D. McGhee, D. Wiegard,  OEA Report 16-03, 2016. (215K PDF)

The UW Chemistry department, in collaboration with the Office of Educational Assessment, created two parallel forms of a 26-item exam to place students into Chemistry 142.  Development took place over the course of four academic quarters (Spring 2015 through Winter 2016), and the exam was first put into use in June 2016.  This report describes the process of development from item writing through forms validation.

UW Academic Challenge and Engagement Study (UW ACES). C. Beyer, A. Davis-Unger,  OEA Report 16-02, 2016. (485K PDF)

Using a “citizen science” model, researchers in the Office of Educational Assessment (OEA) asked departmental advisers if they would volunteer to interview a sample of seniors in their departments about the challenges those students experienced in the major.  Sixty-six academic advisers from 33 UW departments—about half of the undergraduate degree-granting programs on campus—volunteered to participate.  This report summarizes what UW undergraduates, in general, found the most challenging about their learning experiences at UW, using the 33 departments participating in the UW ACES as case studies.

Undergraduate Student Work Life at the University of Washington. C. Beyer, A. Davis-Unger, J. Elworth, N. Lowell, and D. McGhee, OEA Report 15-06, 2015. (573K PDF)

In fall 2007, the UW introduced the Husky Promise program and guaranteed that eligible Pell Grant/Washington State Need Grant students would receive at least enough funding through grants and scholarships to cover their tuition and fees. Since this program began, the number of students who qualify for need-based grants has increased, while allocated state funding has not kept pace with the number of students eligible for it, placing increasing demands on the UW’s institutional resources to fund the Husky Promise. This study follows a preliminary report describing what we know from the academic literature and from OEA’s previous research on students’ work lives. It summarizes responses to work-related questions added to a broad-based survey of undergraduates, and to a more directed survey and focus groups of currently enrolled Work Study students.

What We Know about the Place of Work in the Academic Life of UW Undergraduates. N. Lowell, C. Beyer, and J. Elworth, OEA Report 15-05, 2015. (182K PDF)

The University of Washington’s (UW’s) Enrollment Management Advisory Council (EMAC) has been asked by the President and Provost to investigate and recommend courses of action to address the effects of insufficient need-based funding on UW students and academic programs. This preliminary report identifies what is generally known about college employment from published research and information available on UW students. It summarizes information from the published literature, a recent survey of undergraduates at UW Seattle, and statistics on the UW Work Study program. The report concludes with a proposal for further study.

Diversity Symposium Spring 2015 Evaluation Summary. A. Davis-Unger, OEA Report 15-01, 2015. (267K PDF)

On January 30th, 2015, a brief online survey was conducted at the Diversity Symposium held in the Samuel E. Kelly Ethnic Cultural Center. The purpose of the survey was to collect feedback from participants regarding their experience at the Symposium. The majority of participants responding to the survey were UW staff members (55%) and faculty (25%). Overall, the Symposium received very high ratings from participants. Specifically, participants reported broader exposure to various perspectives on climate and diversity and said they gained useful strategies and tools to promote inclusion in their respective departments. Participants also provided a variety of suggestions for future sessions.

Mathematics Placement Test Performance Predicts Subsequent Math Course Success at Four-Year Universities. D. McGhee. OEA Report 10-04, 2010. (285K PDF)

This report describes the results of a study of the relationship between Mathematics Placement Test (MPT) scores and subsequent student performance in mathematics courses at four Washington universities. The results show that MPT total scores can be used to predict performance in college-level courses. In entry- and precalculus-level courses, all three tests produced scores that were significantly correlated with numeric course grade. At the calculus level, MPT-A total score was moderately to strongly related to course grade. The equations derived from logistic regression analyses can be used to estimate the MPT score likely to result in course success, which may help faculty reviewing or setting placement cut scores.
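To make the cut-score estimation concrete, the sketch below is a minimal, hypothetical Python example, not the study’s actual model or data: it fits a logistic regression of course success on MPT total score and then inverts the fitted equation to find the score at which the predicted probability of success reaches a chosen target (70% here, an arbitrary illustration).

    # Hypothetical sketch only: scores, success indicators, and the 70% target
    # are illustrative and do not come from the MPT study.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Illustrative data: MPT total scores and whether each student later
    # "succeeded" in the placed course (e.g., earned at least a 2.0).
    scores = np.array([[12], [15], [18], [20], [22], [25], [28], [30], [33], [35]])
    success = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

    model = LogisticRegression().fit(scores, success)
    b0 = model.intercept_[0]
    b1 = model.coef_[0][0]

    # The fitted model is P(success) = 1 / (1 + exp(-(b0 + b1 * score))).
    # Solving for the score at which P(success) equals the target yields a
    # candidate cut score for faculty to review.
    target = 0.70
    cut_score = (np.log(target / (1 - target)) - b0) / b1
    print(f"Score associated with a {target:.0%} predicted chance of success: {cut_score:.1f}")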

UW Seattle English Language Proficiency Requirement: Autumn 2009 Cohort. D. McGhee. OEA Report 10-03, 2010. (282K PDF)

Beginning Autumn quarter 2009, the University of Washington (Seattle) instituted new English Language Proficiency Requirement (ELPR) procedures for incoming undergraduates. The new (interim) policy requires that all entering undergraduate students (both freshmen and transfers) demonstrate English language proficiency prior to enrollment in classes. This report summarizes the English Language Proficiency status of the Autumn 2009 entering cohort as of January 2010.

Intermediate Mathematics Placement Test (MPT-I): Version 11 Development. D. McGhee and J. Peterson. OEA Report 10-02, 2010. (250K PDF)

This report describes the development of Version 11 of the Intermediate Mathematics Placement Test (MPT-I). The instrument pilot test was conducted during winter quarter 2010 at the University of Washington. Item and total score statistics indicated that MPT-I Version 11 is acceptable for use in the Academic Placement Testing Program. MPT-I Version 11 will be released at the beginning of the 2010-2011 testing season.

The University of Washington Senior Research Study 2009. J. Peterson, C. Beyer, S. Chang, and A. Giesbrecht. OEA Report 10-01, 2010.  (422K PDF)

The University of Washington’s Senior Research Study (UW SRS) was exploratory, designed to help us understand and assess the research that UW undergraduates are typically required to do as a normal part of progress through their UW experience. Seniors from 15 UW departments participated in focus groups, completed questionnaires, and brought a class paper or project that had required research which they considered challenging. Departmental results were analyzed and provided to department chairs. The 15 departments were then used as cases for this general report. The UW SRS found that, on average, students conducted research in close to half of the courses they took after their freshman year, with much of that concentrated in upper-level courses; that why and how students conducted research varied widely by disciplinary area and often by department within disciplinary areas; that students in most departments were not prepared by their high school research experience to use the research methods required of them at UW; and that the amount and structure of instruction in research methods and practices students reported varied widely by department. Finally, the study raised questions about how academic institutions define “research”.

General Mathematics Placement Test (MPT-G): 2009 Pilot Study. D. McGhee, N. Lowell, J. Gillmore, and J. Peterson. OEA Report 09-03, 2009. (214K PDF)

Over the course of the 2008-2009 academic year, the new MPT-G and revised MPT-I were administered to high school and college students throughout the state. Subsequent end-of-course grades were collected to assist in setting a common college readiness cut score. This report describes the pilot procedures and results.

2008 DELNA Screening Pilot. N. Lowell and D. McGhee. OEA Report 08-03, 2008. (359K PDF)

This report describes the methods and outcomes of a pilot administration of the Diagnostic English Language Needs Assessment (DELNA) Screening carried out during August and September of 2008. The DELNA Screening is a short (17-minute) online test consisting of Vocabulary and Speed-Reading subtests. It was completed by 1,158 students attending selected freshman, transfer, and international summer orientation sessions. Student test scores were combined with demographic and academic data from the UW Student Database and subjected to a series of analyses to determine whether the DELNA Screening would provide an appropriate measure to estimate English language proficiency of incoming students.

2008 Survey of Community and Technical College Testing Centers. A. Giesbrecht. OEA Report 08-02, 2008. (154K PDF)

In 2007 the Washington State legislature passed a bill requiring that the Mathematics Placement Tests (MPT) be revised to align with College Readiness Mathematics Standards (CRMS) created by Washington’s Transition Mathematics Project (TMP). This report describes a brief survey of test administrators at community and technical colleges to determine whether the newly developed General Math Placement Test (MPT-G) can play a dual placement/college readiness role at two-year institutions and, further, whether there is a good fit between the current test format and existing testing infrastructure. Survey results indicated that the MPT-G will be of limited usefulness at community and technical colleges because the majority of students require placement into below-college-level mathematics courses not addressed by the MPT-G. Additionally, the format of this test (paper-pencil, pre-scheduled, group) does not fit readily into the regular, daily testing process of two-year institutions, which rely heavily on computer-adaptive, walk-in, and individual testing. However, two-year schools currently administer tests of the same format as the MPT-G as date-specific administrations once or twice a year, and any given institution may elect to offer the MPT-G on the same basis. Students at all community and technical colleges are eligible for testing at existing APTP administrations whether or not the tests are available at their home institution.

General Mathematics Placement Test (MPT-G): Initial Test Development. D. McGhee, J. Peterson, J. Gilmore, and N. Lowell. OEA Report 08-01, August 2008. (462K PDF)

Over the past several years, there has been increasing concern at both the state and national level about mathematics preparation among high school students. In 2007 the Washington State legislature mandated that the Mathematics Placement Test offered by the Academic Placement Testing Program be modified to align with College Readiness Mathematics Standards developed by the Transition Math Project. This report describes the initial development of the General Mathematics Placement Test (MPT-G) to meet the requirements of House Bill 1906.

Classroom Learning Environment Questionnaire UW College of Education Pilot: AU 2006-SU 2007. D.E. McGhee. OEA Report 07-09, October 2007. (144K PDF)

This report details the second stage of development of the Classroom Learning Environment (CLE) questionnaire. The aim of the present study was to evaluate the statistical characteristics of the revised instrument using a large sample of classes.

Factors Related to Attrition and Retention of Under-Represented Minority Students: National and Regional Trends. S. Lemire and C. Snyder, OEA Report 06-08, 2006. (430K PDF)

This report summarizes analyses of selected data from files of the National Postsecondary Student Aid Study (NPSAS:04). The NPSAS:04, conducted by the National Center for Education Statistics (NCES) during the 2003-2004 school year, is a comprehensive nationwide study designed to determine how students and their families pay for postsecondary education and to describe characteristics of those enrolled. It captures extensive information on students’ educational circumstances, and the resulting dataset affords the opportunity to describe the national context relative to a variety of educational issues. Our purpose is to present a limited number of educational variables, both nationally and regionally, relating to attrition and retention of under-represented minority students.

The Classroom Learning Environment (CLE) Questionnaire: Preliminary Development. D.E. McGhee, N. Lowell, S. Lemire, and W. Jacobson, OEA Report 06-07, 2006. (386K PDF)

There is strong interest at the University of Washington in providing a positive environment for all faculty, staff, and students. Within the past few years, this Office has been asked to assist in administering two surveys of campus climate and, more recently, an extensive study of Leadership, Community, and Values has been initiated by our new Provost. It is in this context that we were asked by the Dean of the Office of Undergraduate Education to consider ways in which questions relating to issues of diversity could be integrated into ongoing course evaluations. The Office of Educational Assessment maintains a well-established course evaluation system used by most courses and all departments at UW Seattle. Our task was to determine whether and in what way we could capitalize on the capabilities of this system to obtain regular student assessment of classroom climate. In order to do this, we formed an Advisory Council made up of faculty and staff from a variety of programs and offices that work with diverse groups.

Mathematics Placement Test (MPT) Alignment with Washington State College Readiness Mathematics Standards (CRMS). J. Peterson, OEA Report 06-06, 2006. (212K PDF)

This report provides a basic description of the Math Placement Tests (MPT) currently in use in five of the six public baccalaureate institutions in Washington State. The recently developed College Readiness Mathematics Standards (CRMS) are also described, along with an analysis of the alignment of the MPT to those standards carried out by an external agency, Achieve, Inc. Achieve noted that the MPT were not closely aligned to the CRMS, as would be expected given that the tests are principally designed to place entering college students into first-year mathematics courses rather than to provide a comprehensive assessment of their K-12 mathematics education. Nevertheless, a mapping of the MPT items to the CRMS should be included in a deliberate discussion of the structure and content of future revisions of the tests. For this reason, we undertook a more detailed analysis of the MPT and CRMS alignment, finding more correspondence than reported by Achieve and identifying specific areas on which to focus test redevelopment efforts.

UW Academic Advising Self-Study: Preliminary Report. S. Lemire, C. Snyder, and L. Heuertz, OEA Report 05-03, 2005. (2,177K PDF)

In the summer of 2004, the University of Washington (UW) Board of Regents authorized funding to address advising issues at the UW, and the Office of Educational Assessment (OEA) was asked to undertake a self-study of all undergraduate advising activities at the UW. OEA contacted academic advisors, students, and administrators campus-wide to solicit feedback on their experiences with, and perspectives on, academic advising. The results of surveys, interviews, and reviews of existing records provided a rich array of both quantitative and qualitative data.

The Evaluation of General Education: Lessons from the USA State of Washington Experience. G. Gillmore, OEA Report 04-03, 2004. (274K PDF)

Assessment and accountability are presented as contrasting models for evaluating outcomes in higher education. While both models are concerned with quality and improvement, the accountability model stresses externally imposed evaluation goals, methods, and criteria and comparisons among institutions. The assessment model stresses faculty-determined, institutionally-specific goals and methods with a focus on improvement. This discussion is followed by a description of general education and outcomes that one should consider measuring. Two State of Washington studies, relevant to the evaluation of general education, are presented — one under an accountability model that used standardized tests, and one under an assessment model that used student writing from courses. The report concludes with six prerequisites for evaluating general education.

Some Assessment Findings. 2003 OEA Assessment Group, OEA Report 03-05, 2003. (79K PDF)

This is the first in a series of dynamic reports created by the OEA Assessment Group to highlight particular findings of interest. Throughout the year we undertake a variety of studies, both large and small, that address a wide range of topics. At times our research is very pointed and at times more general, but there is always the potential to learn something we didn’t expect, or to look again from a different angle at things we thought we knew. We will update this page periodically throughout the year to share our findings with you.

Writing at the UW: The First Year. C.H. Beyer, G.M. Gillmore, M.P. Baranowski, and N. Panganiban, OEA Report 03-03, 2003. (397K PDF)

This report focuses on the writing experience of undergraduates as they move through the writing required in their first year at the UW. The report presents results from interview and survey questions that UW SOUL participants answered in the 1999-00 academic year, and draws on two studies of undergraduate writing conducted for the Office of Educational Assessment in 1989-91 and 1994-96. Overall, the results showed the following:

  • The amount of writing students do in their first year at the UW — an average of 8.5 papers averaging 6 pages each — is nearly the same as the amount of writing first-year students reported doing five years ago.
  • In addition to writing papers, students at the UW are required to write short pieces that are usually designed to keep students actively engaged in course content.
  • Students’ most challenging writing at the UW is argumentative writing, and they have had little previous writing experience or instruction with the kinds of argument that they are required to write at the UW. The fact that writing is shaped by disciplines is news to freshmen, as well as to some transfer students.
  • Faculty assigning papers should be aware that the kinds of papers students consider challenging require students to make arguments consistent with arguments in the discipline and to use resources outside themselves as support. In addition, challenging papers require students’ time and thinking, are carefully graded, and are likely to have moved through a draft/feedback/revision process.
  • It appears that students have few opportunities to take papers through a draft/feedback/revision writing process after their freshman year.
  • Students believe their writing improves in their first year primarily along the lines of writing in the disciplines and argumentative writing, more often attributing the improvement to “practice,” rather than to “instruction.”

University of Washington Parents Survey 2001. G. Garson, OEA Report 02-04, 2002. (132K PDF)

The fourteenth annual Parents Phone Survey invited parents to express their views on “how things are going” for their daughters and sons. The survey is one part of a larger effort to assess the University’s success in establishing and maintaining effective lines of communication with parents through such means as Freshman Convocation and UW News, a publication for parents of University undergraduates. This report describes the methodology and presents the findings from the 2001 Parents Survey.

The UW Study of Undergraduate Learning: Representativeness of the Sample. G.M. Gillmore and C. Beyer, OEA Report 01-04, 2001. (55KB)

The University of Washington Study of Undergraduate Learning (UW SOUL) is a longitudinal study of students who entered UW as freshmen or transfers in the fall of 1999. Two groups of students comprise the sample studied. The purpose of this study is to assess the representativeness of these samples, relative to the entire population of undergraduate students who entered the UW in the fall of 1999. The two UW SOUL groups did not differ significantly on any of the variables analyzed. However, the combined UW SOUL group differed significantly from the remainder of the population on all variables except transfer credits. Even though these differences were statistically significant, the amount of total variance explained by group membership, as indexed by the eta-square statistic, was considerably less than one percent in all cases.
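For reference, the eta-square statistic mentioned here is the standard proportion-of-variance measure, the between-group sum of squares divided by the total sum of squares (this is the general definition, not a formula taken from the report):

    \eta^2 = \frac{SS_{\text{between}}}{SS_{\text{total}}}

A value below .01 therefore means that group membership accounts for less than one percent of the variance in a given variable.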

Eventual Majors of Students Who Enrolled in MATH 124 and CHEM 140: A Study of 1992 Entering Freshmen. G.M. Gillmore, OEA Report 99-04, 1999. (221K PDF)

The graduation majors of students who entered as freshmen in 1992 were categorized into those that required MATH 124 or CHEM 140, those that did not require the course, and those for which the course was an option. Only 33% and 38% of all graduates who passed Math 124 and Chem 140, respectively, majored in a field that required the course. Students who scored below 600 on SAT Math were much less likely to major in a field requiring either course, as were students who received a grade below 3.0 in the respective course. Thus, both courses continue to function as gate-keepers.

Student Views of the Calculus Sequence. K. Etzkorn and T. Taggart, OEA Report 98-3, 1998.

At the request of the Department of Mathematics, this study solicited student opinions on the introductory calculus sequence. Focus groups were conducted with a random sample of students currently enrolled in the calculus sequence, and a sample of students who had previously completed the sequence. Scripted questions were based on math faculty concerns. Focus group results were combined with quantitative course evaluations of 50 courses and 97 quiz sections over seven academic quarters. The results suggested several modifications to the calculus sequence, including the addition of condensed courses directed at students with previous calculus experience, improvements in instructional materials, and broadening of the calculus curriculum to include material related to a wider variety of majors.

Survey Response Rate and Request for Student Identification. N. Lowell, OEA Research Note 98-N3, 1998. (69K PDF)

When conducting survey research concerning student experiences and educational outcomes, it is often critical to link questionnaire responses to data collected from other sources. To this end, we request that students provide their student number as they complete their survey forms. This study examined differential response rates of students who were asked to provide identification and those who were not. Requests for identification did not appreciably affect rate of response, but a sizable proportion of students who returned completed surveys did not provide identification. It is recommended that surveys include requests for respondent identification as well as various demographic variables, even though the latter may be available from linked sources. The same pattern of response was found when results were examined by gender and transfer status, but, surprisingly, minority students were more likely to respond when asked to identify themselves than when they were not asked.

Prediction of English Composition Grades. G.M. Gillmore, OEA Research Note 98-N2, 1998.

The general purpose of this study was to determine the extent to which SAT verbal scores predict success in English 131, Freshman Composition. Test scores were compared to course grades for 10,613 students who received grades in English 131 from fall 1992 through winter 1998. All analyses pointed to the conclusion that SAT verbal scores do not effectively predict English 131 grades. High school grade point average is a better overall predictor.

Average Grades. G.M. Gillmore, OEA Research Note 98-N1, 1998. (55K PDF)

This report provides information on average grades at the UW for the 1996-97 academic year, updating OEA Research Note 95-N3. The earlier report noted that grades rose from 1975 to 1987 and then leveled off. Even so, faculty almost unanimously thought that grades were too high and over 80% felt that measures should be taken to reduce them. Students thought that grades were too high, though not to the extent that faculty did. Faculty and students tended to prefer the 4.0 grading system over the alternatives. The present study found that average grades for 1996-97 were essentially equivalent to those for 1994-95.

The Effects of Priority Registration on Missed Classes for University of Washington Softball Players. G.M. Gillmore and A. Few, OEA Report 97-07, 1997.

Student-athletes must be enrolled as full-time students to compete in intercollegiate sports, yet game schedules result in many missed class sessions. This study examined the effect of priority registration on reducing class absences due to conflicting game schedules. Student class schedules were matched with times of scheduled games and travel for 1996, before priority registration was implemented, and for 1997, after it was implemented. It did not appear that priority registration made a positive contribution to class attendance or student-athlete performance.

The University of Washington Teaching Portfolio Project. G.M. Gillmore, D. Hatch, and other contributors, OEA Report 97-05, 1997.

Currently, student ratings of instruction are the most widely used method by which teaching is evaluated. In addition, UW faculty are required to undergo periodic peer review. While these two methods are useful components of a thorough evaluation, they don’t provide a complete picture. This report summarizes attempts by ten campus units to develop teaching portfolios as an additional source of information on quality of instruction. Awards of $10,000 were made to participating units and this compilation summarizes their efforts.

Four Models of the Relationship between Assessment and Accountability. G.M. Gillmore, OEA Research Note 97-N4, 1997. (35K PDF)

Assessment and accountability are uneasy bedfellows. In a sense, the legislature has paid us to do assessment but will deduct payment if we fail to meet certain accountability goals. Our constituencies demand both, as well they should. This report outlines four models relating accountability and assessment, along with related underlying assumptions and major problems.

Evaluating a Writing Program Using Portfolios of Student Writing: A Theoretical Rationale and Plan for the College of Engineering at the University of Washington. C. Scott, C. Plumb, and J. Ramey, OEA Report 97-03, 1997.

Engineering students at the University of Washington take three communication courses: one English composition course, one introductory technical writing course, and one advanced technical communication course or a departmentally approved substitute. In addition, students complete writing assignments in many of their department courses. In May 1996, a subcommittee of the College’s Educational Policy Committee recommended that the College devise a procedure that could be used to evaluate the current writing program in the College and the effectiveness of that program in preparing engineering students to write at work. The purpose of this report is to lay the theoretical groundwork for such an evaluation and to recommend a procedure. The main recommendation of the report is that the College of Engineering embark on a portfolio project that will span three years and will include soliciting and selecting student participants, collecting evidence and compiling portfolios, maintaining and analyzing evidence, and creating and implementing performance-based outcomes for the writing program in the college. The overall goal of this evaluation is to establish a common approach for teaching and assessing writing that will prepare students for writing in the workplace.

Teaching, Learning and Technology (TLT) Pilot. N. Lowell. OEA Research Note 97-N3, 1997. (95K PDF)

The University of Washington UWired program, in conjunction with teachers from around the state, has developed the Teaching, Learning and Technology (TLT) Program to teach effective use of educational technology in K-12 classrooms. This report describes an assessment of a four-credit pilot course offered in August 1997, and includes links to the three web-based assessment instruments used. Although participants differed in their prior level of experience with computers, what was taught and how it was taught appeared to have more influence on learning than did previous experience. Lessons were also learned regarding the use of on-line surveys, in particular the importance of limiting the number and length of surveys administered.

The Freshman/Sophomore Writing Experience, 1994-96. C. Beyer, OEA Report 97-02, 1997. (5.21MB PDF)

This report presents the results of the second Freshman/Sophomore Writing Study, a project that tracked the writing experience of about 45 UW students between 1994-96, and compares those results with results of the first Freshman/Sophomore Writing Study, which tracked the writing experience of about 100 UW students between 1989-91. The second Writing Study, like the first, involved collecting all of the writing done by participants in their classes during a two-year period. Participants also completed reflective essays on their own writing at the end of each year and were interviewed about their courses at the end of every quarter. Results indicated that students in the second Writing Study wrote more arguments about non-literary topics as high school seniors than did students in the first Writing Study, but there was still a gap between the types of papers assigned in high school and those assigned in college. On the other hand, the writing experience of students entering the UW as freshmen in 1994 was nearly identical to that of students who entered in 1989.

A Decade of Formal Assessment at the University of Washington. G. Gillmore and L. Basson, OEA Report 96-07, 1996. (94K PDF)

This report describes the history, impact, and future directions of formal assessment at the University of Washington. It begins with a discussion of the background and history of the assessment movement in the State. Principles guiding assessment and implementation strategies are described, followed by an overview of the impact of assessment. Specific improvements in curriculum and courses guided or influenced by assessment research are described for departmental majors, writing, quantitative and symbolic reasoning, distribution requirements, special programs, diversity, graduation rates and time to degree, accreditation, and forging links among institutions. The report ends with a discussion of future directions seen as serving three major and interrelated goals: 1) the development and measurement of accountability or performance indicators, 2) the assessment of new and continuing programs to improve their effectiveness, and 3) the contribution of assessment to the University’s strategic planning by providing relevant data on quality and efficiency.

Cultural and Ethnic Diversity: The Need for a Requirement. G.M. Gillmore and P.J. Oakes, OEA Report 95-8, 1995.

The Cultural and Ethnic Diversity (CED) Task Force was appointed by the Faculty Senate in order to evaluate the need for and the feasibility of an ethnic and cultural pluralism requirement. This report addresses the need for a CED requirement based on data from three sources: a Cultural Diversity Questionnaire administered to students in twelve classes, questions about the proposed requirement that appeared on the 1994 and 1995 senior surveys, and simulations performed to determine how many 1993-94 graduates would have met the requirement had it been in effect during their tenure. Results from the questionnaire suggest that the factors considered important in educating students about cultural and ethnic diversity are addressed, and learned, in some classes at the UW. Responses to the senior survey questions indicated that roughly one-third of the respondents would have liked more attention paid to issues of pluralism and diversity in their classes, one-third would have liked less, and one-third were neutral. Finally, the simulations showed that between 60% and 80% of the graduates would have met the criteria for a CED requirement but many of those in the sciences and professional schools would not.

Grades. G.M. Gillmore, OEA Research Note 95-N3, 1995. (176K PDF)

During 1994-95, the Faculty Senate Council on Academic Standards considered the question of grading and the perceived continuation of grade inflation at the UW. The Office of Educational Assessment subsequently surveyed faculty on grading matters and included questions on grading in the annual survey of seniors. This research note provides some of the results from these surveys as well as data from the UW Registrar’s Office on trends in average grades over time, and from the UW Office of Institutional Studies on average grades in various academic units. These data indicate that average grades rose from 1975 to 1987 before leveling off. Both faculty and students think grades are now too high. Over 80% of faculty members feel that measures should be taken to reduce average grades.

Student Instructional Ratings at the University of Washington: Increased Usage and Averages. G.M. Gillmore, OEA Note 95-N2, 1995.

Usage data show a fairly steady increase in student ratings, from 525 classes rated during the 1959-60 academic year to 9,110 classes rated during the 1993-94 academic year. The median of the class averages has also risen, from 3.70 in 1976-77 to 3.97 in 1993-94 for the average of items 1–4. The ratings of teaching assistants (TAs) increased more sharply than those of other ranks, suggesting that TA training programs have been effective. Generally, there is some evidence that new generations of faculty may tend to be more effective teachers.

Portfolio Assessment of Student Writing at the Rising-Junior Level. C. Beyer and J. Graham, OEA Report 94-06, 1994. (4.17MB PDF)

This report presents results of three day-long portfolio assessment workshops that were designed to evaluate UW student writing at the rising junior level. Seventeen faculty from a wide range of disciplines and two administrators participated in the workshops. Participants were asked to evaluate the writing portfolios of eight students, written during their freshman and sophomore years. The criteria employed most often in evaluating portfolios were signs of improvement and students’ own evaluations of their work. All three workshops gave convincing evidence that the longitudinal view provided by portfolios is very different from the cross-sectional view provided by papers from one’s own classes and can be a catalyst for instructional improvement.

The Junior/Senior Writing Study 1991-93. C. Beyer and J. Graham, OEA Report 94-02, 1994. (10.51MB PDF)

The authors of this report collected all of the writing done by a sample of 119 UW students throughout their junior and senior years. In addition, the students were interviewed quarterly about writing and they wrote reflective essays about their writing experiences at the end of each year. Students were found to write an average of fourteen papers and to write in an average of 43% of their classes. Most of the papers were argumentative or informative, with about equal numbers of each. Evidence is presented that the writing students do is shaped extensively by their majors and a large majority felt they learned most about writing in their majors. The report concludes with five recommendations related to writing requirements, assignments, and instruction.

University of Washington Undergraduate Capstone Courses, Practica and Internship Opportunities, and Feedback from Employers. G.M. Gillmore, OEA Report 91-06, 1991. (13,842K PDF)

As a beginning point for end-of-program assessment, a survey was conducted to develop a compendium of existing capstone courses, internship and practica opportunities, and programs for feedback from employers. Each of these three major categories was further subdivided into subcategories, such as whether there are multiple raters of the final products for capstone courses and whether internships are required (either formal or informal).

The Quantitative and Symbolic Reasoning Courses of 1991 Graduates. G.M. Gillmore, OEA Report 91-5, 1991.

All UW students must pass one of 12 courses that satisfy the Quantitative and Symbolic Reasoning (QSR) requirement. The purpose of this study was to determine which courses students actually took to meet the requirement. The results showed that while overall 62% satisfied the requirement with a math course, there was a great deal of variability across colleges.

The Validity and Usefulness of Three National Standardized Tests for Measuring the Communication, Computation, and Critical Thinking Skills of Washington State College Sophomores: General Report. OEA Report 89-01, May, 1989. (3,296K PDF)

In its master plan (December 1987), the Washington State Higher Education Coordinating (HEC) Board recommended that both two-year and four-year institutions conduct a pilot study to evaluate the appropriateness of using standardized tests as a means for measuring the communication, computation, and critical thinking skills of sophomores. The purpose of such a testing program would be for institutions to: a) strengthen their curricula, b) improve teaching and learning, and c) provide accountability data to the public. Over 1,300 sophomore students from public four-year and two-year colleges were tested, with each student taking two of the three national tests studied. Additionally, more than 100 faculty members took shortened versions of the same tests and critiqued them for appropriateness of content and usefulness. The study concluded that the national tests did not provide an appropriate or useful assessment of the communication, computation, and critical thinking skills of sophomores, and added little reliable information about students’ academic performance beyond what was already known from admissions test data and student grades. The tests did not reflect specific aspects of the college experience such as credits earned and did not provide an adequate match with curricular content.