While transcribing the 1853 illustrated edition of Uncle Tom’s Cabin today—so that I can compare it to another keyboarding—I was distracted by an article in Slate. According to its author, Ray Fisman, prosecutors and defense attorneys failed to carefully double-check sentencing guidelines, which led to errors in the terms of incarceration for over ten percent of Maryland prisoners.
This pattern of error was discovered by Emily Owens, who was working on a Ph.D. in Economics at the University of Maryland. She came to recognize the pattern after she struggled to work out “inconsistencies and errors” in the data from the Maryland State Commission on Criminal Sentencing Policy. But when Fisman contextualizes the consequences, he falls into the trap of assuming that academic work matters less than the real-world work of criminal justice. Here’s Fisman’s explanation:
With the stakes so high—months and years of freedom gained or lost—how could Maryland’s Sentencing Policy Commission have been so sloppy? For academic research—a matter trivial by comparison—it’s common to have data entered independently by at least two typists, whose output is then cross-checked for accuracy. Yet it turns out that complacent bureaucrats weren’t to blame for the sentencing mistakes. The work sheet had to be filled out by the state attorney prosecuting the case, with the final form signed and approved by the defense attorney (who, if he was doing his job properly, would have done the work sheet calculations independently). The commission had, by design, handed off the task of work sheet completion to parties that it assumed would have every incentive to get the numbers right, but it apparently never accounted for widespread incompetence in Maryland’s legal profession.
I am appalled for the real victims, those wrongly sentenced to longer terms. But I don’t understand why Fisman diminishes academic research—that’s what Owens did, after all—as a “matter trivial by comparison.” For there would be no story to tell—and the real-world consequence of these errors would be unknown—if doctoral student Owens had not acted like an academic and spent an inordinate amount of time checking data to explain inconsistencies. What Owens did is academic work, and no doubt poring over the data for inconsistencies must at times have bordered on the maddening, but the work’s real-world consequences—explaining the source of the inconsistencies in the data and the social costs that those inconsistencies represent—are not trivial. Sometimes, it is only through the study of matters trivial that matters nontrivial become visible.