Assessment Forum

The Benefits of Self-Assessment: Measuring Historical Thinking Skills at UMass Boston

Jonathan Chu | Jan 1, 2015

When the history department at the University of Massachusetts Boston became aware of accreditation agencies’ growing pressure for documenting teaching effectiveness, it determined to assert its prerogative as the only body on campus qualified to measure students’ learning in history. In 2011, in anticipation of the 2015 New England Association of Schools and Colleges (NEASC) accreditation review, the provost instructed departments to develop an assessment plan as they came up for regularly scheduled self-studies. Because we had been wrestling with this problem for some time, we were prepared when it was announced we would be the first department in the College of Liberal Arts to undergo a review and to present an assessment plan. The timing was also useful because we had just completed the first significant revision of the major in 50 years and were in the midst of rebuilding the tenure-stream faculty from 12 to 21.

In the process of aligning the department's new research strengths with a diverse, urban, nonresidential student body and its postgraduation plans, we moved beyond a broad-spectrum content major to one focused on the historical research, analytical, and writing skills expected of students in either the 400-level Research and Methods seminar or the honors thesis.1 To focus more explicitly on instruction in historical thinking skills, an approach consistent with work previously reported in Perspectives on History,2 we developed a new introductory course, History 101: Historical Thinking, and organized it as a beginner's version of the research seminar. We agreed that the faculty member teaching the course would have discretion to select the course content, but the explicit learning objectives were to be skills unique to history; these became the foci of our assessments.

[Image: Panoramic view, University of Massachusetts Boston. Credit: Harry Brett]

Our next step was to ensure that faculty evaluations of student work were consistent across our courses. The central problem for our assessment plan was translating qualitative criteria into quantifiable terms, a problem social historians frequently encounter when they draw the boundaries of significance between few, some, and most. There was general agreement about what students should know and be able to do, but when we tried to translate that agreement into concrete criteria, we found we needed a shared understanding of how letter grades convert into levels of mastery in skills or learning objectives. Our conversations helped us define more precisely the levels of performance in specific skills that letter grades summarize. We articulated five essential skills: the understanding of change over time, the knowledge of specific historical content and its context, the ability to use content and sources effectively in historical analysis, the recognition of the distinction between primary and secondary sources, and the capacity for well-organized and clear writing. Coupled with gradations of performance, these skills led to the creation of a matrix (see sidebar).

Having developed the matrix, we found that the actual process of data collection and review was not terribly onerous. Each faculty member takes about 15 minutes at the end of a term to assess and tabulate student performance and to add any explanatory notes. A rotating assessment committee of two spends less than a day at the end of the academic year collecting the results and drafting a report to the department.

We conducted a trial run in academic year 2011–12, collecting data for one section each of two courses, Historical Thinking and Research and Methods, and using this data to establish the feasibility of our approach. The next year the assessment committee conducted a full-scale review of all Historical Thinking and Research and Methods sections and entered the data into an Excel spreadsheet, which eased collection and reporting while expanding our capacity for analysis. After analyzing the 2012–13 data, the committee recommended adding a category addressing historiography and noted that there had been little improvement in writing skills. Both findings reflect the central objective of assessment: demonstrating an ongoing process of review and action or, when the review does not warrant it, inaction.
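To make the tallying concrete, here is a minimal sketch, in Python rather than Excel, of the kind of matrix the spreadsheet holds. The learning objectives and performance levels come from the sidebar; the function name, the abbreviated section data, and the counts are hypothetical, invented solely for illustration.

```python
from collections import Counter

# The four performance levels from the department's matrix (see sidebar).
LEVELS = ["Mastery", "Strength", "Met Expectations", "Needs Work"]

def tally_section(ratings):
    """Collapse per-student ratings into counts per performance level.

    `ratings` maps each learning objective to the list of levels an
    instructor assigned, one entry per student. Only counts survive,
    so student identities never enter the data (compare note 3).
    """
    return {objective: Counter(levels) for objective, levels in ratings.items()}

# Hypothetical ratings for one small Historical Thinking section.
section = {
    "Understands change over time": ["Mastery", "Met Expectations", "Needs Work"],
    "Writes well-organized, clear sentences": ["Strength", "Needs Work", "Needs Work"],
}

for objective, counts in tally_section(section).items():
    row = "  ".join(f"{level}: {counts.get(level, 0)}" for level in LEVELS)
    print(f"{objective}: {row}")
```

Because such tallies add together straightforwardly, a committee aggregating a year's results would simply sum the per-section counts; the sketch's only point is that the matrix records distributions of performance, not individual grades.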

In response to our assessment of writing, the department instituted guidelines for minimum amounts of required writing at each level of instruction, requested dedicated history tutors from Academic Support, and used the data to support recommendations for limits on class size in Historical Thinking (in opposition to university pressure for larger introductory sections). The review of data from Historical Thinking also helped identify potential honors students early in their academic careers and supported further recommendations for action to the Undergraduate Curriculum Committee.

It is not a complicated plan; writing this essay may have taken longer than designing and implementing the recommendations did.

We are fully aware of the flaws in our statistics and methods. We cannot, for example, measure improvement among the same students; the scheduling and nature of students' lives preclude knowing which ones take Historical Thinking in their freshman year and reappear three years later in the Research and Methods seminar. Attempting to obtain that information would require extensive cross-checking of records for a small number of students and could trigger institutional research review while yielding little additional information.3

Our pool of students is small; our time, limited; our assignment of skill levels by faculty, subject to variation. We cannot audit our faculty’s assessments without interfering with their prerogatives and incurring costs that the university is not willing to undertake. We cannot assess what happens to students who are not majors.

We are, however, confident in our assessment because it has revealed unanticipated information, suggested reasonable change, and given us a window into how our curriculum functions as a unit. The primary goal of our plan is to keep the process of assessing history learning in the hands of historians, giving us an opportunity to reflect on how well we do, not merely as individual faculty members but as a department.

Jonathan M. Chu is professor of history at the University of Massachusetts Boston and a former associate and interim dean of its College of Education and Human Development. He is the author of Stumbling toward the Constitution: The Economic Consequences of Freedom in the Atlantic World (2012).

Notes

1. Our revisions reflect the issues and concerns discussed in Joel M. Sipress and David J. Voelker, “The End of the History Survey Course: The Rise and Fall of the Coverage Model,” Journal of American History 97 (2011): 1050–66.

2. See also Thomas Andrews and Flannery Burke, “What Does It Mean to Think Historically?” Perspectives on History (January 2007), bit.ly/1ttzURm, and AHA Staff, “Benchmarks for Professional Development in Teaching of History as a Discipline” (May 2003): 43, bit.ly/1ttzYQR.

3. The data collected from History 101 and Research and Methods is summative. That is, the instructor estimates each student's level of achievement from course work and tallies it on the matrix, thereby separating students' identities from the data. It is possible to link this data to individual students, but doing so would greatly escalate the complexity of data collection and could run afoul of student privacy protections.

Measuring Historical Thinking Skills

Each learning objective is rated at one of four performance levels: Mastery, Strength, Met Expectations, or Needs Work.

Learning Objectives:
- Understands the nature of change over time
- Knows course content and can place it in historical context
- Understands the distinction between primary and secondary sources
- Uses content and sources effectively in historical analysis
- Understands how scholars’ time and place influence how they ask questions or interpret past events [added after trial run]
- Writes well-organized, clear sentences


