FMCSA Defends Its CSA Crash Data After GAO Report Criticizes Methods

By Eric Miller, Staff Reporter

This story appears in the Feb. 24 print edition of Transport Topics.

A Government Accountability Office study critical of the way the government identifies crash-prone carriers is the result of a “philosophical disagreement” between regulators and GAO investigators, Federal Motor Carrier Safety Administration officials said.

The study, released earlier this month, concluded that FMCSA’s Compliance, Safety, Accountability data are not good predictors of crash risk because most regulations used to calculate a carrier’s safety score are “not violated often enough to strongly associate them with crash risk for individual carriers.”

The study said carriers lack sufficient safety performance data — roadside inspections and crashes — to ensure that FMCSA can reliably compare them with other carriers.

However, at a Feb. 12 presentation to FMCSA’s advisory CSA subcommittee, agency officials attempted to poke holes in the study, saying the recommendations would effectively eliminate 90% of carriers from federal regulation.

“What the GAO focused on is that a carrier should have 20-plus inspections before we make an observation,” said Joe DeLorenzo, director of FMCSA’s office of enforcement and compliance. “That being said, we all realize that more data is better. But that’s not the reality of what we’re dealing with.”

Rating only those carriers that receive 20 or more inspections would mean that 90% of the 520,000 active U.S. carriers would not show up on the agency’s regulatory radar, DeLorenzo said.

Bill Quade, FMCSA’s associate administrator for enforcement, told the subcommittee that the agency should not have to wait for 20 inspections to single out problem carriers.

“If your score is really bad, we need to intervene,” Quade said.

But Susan Fleming, director of GAO’s physical infrastructure team, said GAO’s study does not suggest any specific number of inspections needed to draw an accurate picture of a carrier’s safety performance.

“We used 20 as a way to illustrate,” she said. “We could have picked 10, we could have picked 30.”

Fleming added, “We don’t trash CSA or feel that it has to be scrapped. I think we very much have a consistent message saying that the CSA approach holds promise. We believe in a data-driven approach.”

But Fleming said the GAO report provides some “very important food for thought” about serious limitations in the reliability of the CSA safety measurement scores, especially because the scores could serve as a basis for the agency’s proposed safety fitness determination rule.

“We would regard this report as good timing in the sense that FMCSA is at a critical juncture,” Fleming said.

David Parker, the CSA subcommittee’s chairman, said that, for most carriers, FMCSA can collect only a limited number of “events” — roadside inspection violations and crashes.

“FMCSA officials are right,” Parker said. “It’s difficult to get 20 snapshots of some carriers. That’s part of the problem.”

As a result, Parker said, the scoring weights that the agency assigns to violations contained in the seven CSA rating categories, or BASICs, need to have an accurate relationship to crash predictability.

On that point, “the jury’s still out,” Parker said.

Rob Abbott, vice president of safety policy for American Trucking Associations, had a different take on the presentation.

“The agency continues to be in denial of the flaws and limitations in identifying the accuracy and reliability of their scores and identifying the safety performance of individual carriers,” he said.

Members of the subcommittee plan to meet again in April to discuss how FMCSA should use CSA data in its proposed safety fitness determination rule, expected later this year.

The rule would use CSA data to replace the current SafeStat rating system, which uses a compliance review to rate a carrier as satisfactory, conditional or unsatisfactory.