Getting a handle on fire response times and what they mean is anything but an exact science
Gilroy – The city’s fire response is beyond compare – literally.
With no common standard across fire departments, it’s tricky, if not impossible, to compare fire response between local cities. Gilroy isn’t unique: Across the country, residents can compare their crime rate, school performance and taxes to other cities, but not their fire protection.
City governments have resisted efforts to hold fire companies to a single standard, and questioned the optional standards set by national agencies. As a result, fire departments measure their response times in radically different ways, making cross-department comparison a dicey enterprise. Some start the clock when the call goes in to 911; some begin when firefighters roll out of the station. Each city aspires to a different standard, gathers its response times in different ways, and reports the data as it chooses. Some average their times; some report the percentage of times within a standard. Others don’t share the data at all.
“When you say ‘response time,’ is it the four minutes from when you start driving to the fire, the five minutes from when you’re told [about the fire], or the six minutes from when someone calls the dispatcher?” asked Carl Peterson, a National Fire Protection Association spokesman. “The public hears that it took four minutes, or five, or six. But they have no idea when that clock starts.”
National standards exist, developed by the NFPA and the Commission on Fire Accreditation. The NFPA recommends a six-minute response, spanning from when a dispatcher notifies firefighters to when the first vehicle arrives on scene. But that standard remains impractical for many fire departments, especially smaller departments with limited funds, said Jeff Clet, a former Gilroy fire chief and immediate past president of the California League of Cities’ Fire Chiefs department. A national analysis undertaken by the Boston Globe found that only 35 percent of fire departments met the NFPA standard in 2002.
Cities oppose a state-wide standard
Efforts to set a single standard for fire response have failed, met with intense opposition from the League of Cities. One such state bill, proposed by Assemblyman Rudy Bermudez (D-Norwalk) in 2005, was killed by the assembly’s Chief Clerk after cities lodged their complaints.
“It’s a one-size-fits-all approach to fire safety,” said Brian Heaton, a League spokesperson. “Would the city of Gilroy want to be held to the same standard as Los Angeles, or vice versa? … Setting one standard is going to create a huge liability factor.”
As a result, the NFPA standard has remained a recommendation, not a rule. Cities set their own standards for fire response, and choose how to measure and report the data to local governments. Ask four agencies for their response times, said Stewart Gary, a retired Livermore fire chief, and they can give you four different beginning points, averages versus percentiles, department-wide statistics or numbers for each station. Fudge factors such as whether firefighters record arrival times by pushing a button, noting a time or telling a dispatcher, contribute to the confusion.
“There has never been any federal or state prescriptive law on response times, either what to do, or how to measure it,” said Gary, who now works as Fire Practice Principal for Citygate Associates, which advises local governments on fire response. Trying to compare departments is a Herculean task, and Gary often dissuades his clients from doing it. “You’re comparing too many moving variables.”
Under local control, cities can determine their own fire protection needs, said Peterson. Dense urban areas might want quicker response than farther-flung rural areas or suburbs; richer areas can foot the bill for speedier service, while poorer districts may opt to put other needs first. But local control can also mean local manipulation, said Clet.
“The whole resistance [to a common standard] is that it’ll be used at the local level … to suggest that service levels are poor in an area, to try to get additional staff or to improve response times,” Clet said.
Gilroy has set a response standard of 5 minutes, 95 percent of the time, measured from when a dispatcher calls firefighters to when firefighters reach the scene. Thus far, that goal remains distant: In 2006, the department’s 95th-percentile time was just over seven and a half minutes. Fire analyst Dan Farnsworth said the department can’t meet the speed standard with its current resources. On a typical day, 10 firefighters and one relief worker staff Gilroy’s three stations.
Data can be manipulated, analysts say
By selecting how to present data, departments can skew the statistics, Gary said. Averages “are very misleading to the public,” but until a few years ago, most departments used them. Averages sound quicker: Gilroy firefighters reach 95 percent of their calls within 7 minutes and 33 seconds, but their average response time is a shorter 4 minutes and 21 seconds. Of the four departments that responded to the Dispatch’s data request, only one, Milpitas, supplied its response times as an average alone.
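The arithmetic behind that gap is easy to reproduce. A minimal sketch, using entirely made-up response times (none of these figures come from any department’s actual data), shows how the same set of calls can yield a reassuring average and a much longer 95th-percentile time:

```python
# Hypothetical example: the same calls, summarized two ways.
import math
import statistics

# Made-up response times in seconds for 20 calls -- most quick,
# a few long outliers, a pattern common in dispatch data.
times = [180, 200, 210, 220, 230, 240, 250, 260, 270, 280,
         290, 300, 310, 320, 330, 340, 360, 420, 540, 660]

# The average: the handful of slow calls barely moves it.
mean = statistics.mean(times)

# Nearest-rank 95th percentile: the time within which 95 percent
# of the calls were reached.
ranked = sorted(times)
p95 = ranked[math.ceil(0.95 * len(ranked)) - 1]

print(f"average: {mean / 60:.1f} minutes")
print(f"95th percentile: {p95 / 60:.1f} minutes")
```

Here the average works out to about five minutes while the 95th-percentile figure is nine, which is why analysts such as Gary consider averages alone misleading: they say little about the slowest calls, which are precisely the ones a standard is meant to limit.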
Citywide statistics also can mask lagging response times at individual stations, said Farnsworth, but local cities have avoided that pitfall. In quarterly reports, Gilroy notes response times measured by fire district, in addition to citywide numbers. Watsonville, Milpitas and the Santa Clara County Fire Department, which provides fire protection for Morgan Hill, also break down their response times by station, to zero in on problem areas.
When it comes to response times, the difference is often in such details – but big-picture problems loom as well. Many analysts complain that response times just aren’t a genuine measure of fire response.
“What we call response time isn’t truthfully response time,” said Vincent Dunn, a retired New York City fire chief and safety consultant to the National Institute of Occupational Safety and Health. In big cities, high-rise buildings can tack extra minutes onto firefighters’ work; thinly staffed departments may have to wait for a second unit to arrive before attacking the flames. Navigating complex buildings such as shopping malls and airports can also lengthen response times. “We should measure it from when the alarm goes off, to when they put water on the fire,” Dunn said.
Nor is speed the only factor in snuffing fire. Concentration, or the number of firefighters who arrive, is also crucial. In nearby Watsonville, one of the state’s densest cities, firefighters easily reach the scene within NFPA’s standard of six minutes, said Chief Mark Bisbee, but can’t get an effective firefighting force on scene within 10, another NFPA standard. Gilroy is still grappling with how to measure the ‘weight’ of its fire response.
“If response time was the only issue, you could put a kid on a motorcycle with a fire extinguisher,” joked Farnsworth. “He’d be there in no time at all.”
Unfortunately, he said, “the concentration just isn’t available in South County.” Gilroy relies on mutual aid and automatic aid from other fire departments to get a larger firefighting force on scene. At large fires in Gilroy, it’s not unusual to see engines from the county fire department and the California Department of Forestry. If multiple fires strike in one night, as happened one hectic September night last year, engines may be dispatched from as far as San Jose.
Plenty of info, no fair way to compare
California collects limited data on fires in the California All-Incident Reporting System, a program initiated in the 1980s, said Penny Nichols, CAIRS coordinator for the state fire marshal’s office. Fire departments must report incidents to the system to comply with the state’s Health and Safety Code, Nichols said, including alarm and arrival times, but the data haven’t been harvested to assess response times. The information is typically used to identify fire trends and consumer product failures.
Nationwide, fire departments “collect massive amounts of data,” said Clet. “But all the different components that make up response times aren’t consistently measured, and there aren’t structured ways to do it.”
Moreover, he added, “there’s been no funding source, so every agency, independently, has tried to collect it the best way they can, to report to local politicians.”
And that’s exactly as it should be, said Heaton of the League of Cities.
“Cities are the most accountable forms of government in the state,” Heaton said. “If the fire department isn’t doing something right, the city is going to hear about it.”
2006 fire response times
* Morgan Hill (Santa Clara County Fire Department): 91.88 percent in less than 7 minutes.
Measured from when dispatch rings firefighters to when they arrive on scene.
The goal is 7 minutes or less, 90 percent of the time.
* Watsonville: 93 percent in less than 7 minutes.
The average is 6 minutes, 39 seconds.
Measured from when dispatch rings firefighters to when they arrive on scene.
The goal is 5 minutes or less, 90 percent of the time.
* Gilroy: 95 percent in less than 7 minutes, 33 seconds.
The average is 4 minutes, 21 seconds.
Measured from when dispatch rings firefighters to when they arrive on scene.
The goal is 5 minutes or less, 95 percent of the time.
* Milpitas: The average is 4 minutes, 15 seconds.
Measured from when dispatch rings firefighters to when they arrive on scene.
The goal is 5 minutes or less.