May 14, 1999

 

 

Honorable Judge Frank J. Ochoa

Presiding Judge, Santa Barbara County Superior Court

1100 Anacapa Street

Santa Barbara, CA 93121

 

Subject: Response to 1998-99 Grand Jury Report on Performance Based Budgeting

 

Dear Judge Ochoa:

 

Thank you for the opportunity to respond to the 1998-99 Grand Jury’s Report entitled Performance Based Budgeting. Attached is the County Administrator’s response to that report.

 

The response was finalized after circulating a draft (prepared by me) to the Department Directors for input; their comments and suggestions are incorporated into the final product. The Sheriff’s input regarding Recommendation #5(b) (Attachment A) provides a valuable perspective on RPMs and their relationship to employee performance reviews. Also attached for informational purposes is a copy of an article from the May 1999 issue of Governing magazine entitled "Performance Phobia" (Attachment B).

 

Sincerely,

 

 

 

Michael F. Brown

County Administrator

 

MFB:bcc

 

Attachments

 

c: Gary Monson, Foreman, 1998-99 Grand Jury

Department Directors

Finding #1: Many recurring performance measures seem to be unrelated to the department’s significant activities and at the same time there is an absence of measures for significant departmental activities.

 

Response to Finding #1: Partially agree. It depends on the organizational level at which an activity is being measured. Performance measures at the departmental level (generally impact measures) should be directed at key departmental activities, while indicators, efficiency measures, and effectiveness measures should be directed at key cost center activities. Moreover, some departments’ significant activities are project related and therefore do not lend themselves to RPMs. The quality and selection of performance measures are uneven from department to department and, in some large departments, from division to division.

 

Recommendation #1: All departments should develop recurring performance measures that reflect activities that are key service differentiators, make up a high percentage of cost and are capable of being improved.

 

Response to Recommendation #1: The recommendation is being implemented on an ongoing basis. Now that the budget is structured so that performance measures can be aligned with the cost centers where work actually takes place (see response to Finding #2), we will focus more attention on improving performance measures over the next two years.

 

Finding #2: Numerous recurring performance measures for each department studied were developed and implemented without clearly observable links to department strategic goals and objectives.

 

Response to Finding #2: Partially agree. Some have observable links and some do not; we will gradually phase in a layered approach. For example, in preparing the 1999 Environmental Scan, participating departments worked extensively on strengthening the linkage between the Board of Supervisors’ goals and objectives and departments’ impact measures. However, not requiring departments to link performance measures to goals and objectives was a deliberate tactical decision made in implementing performance measurement in Santa Barbara County. Employees responsible for developing performance measures first need to learn how to conceptualize and write performance measures with some degree of rigor.

 

Moreover, at the time the County determined to utilize performance measures, the budget was not structured in a way that would make recurring operational performance measures particularly useful. For this reason, staff had to restructure the entire chart of accounts and map it to the actual divisions and subordinate program cost centers where work actually takes place. Adding further conceptual complexity early in the process by requiring that the measures be tied to some sort of goal and objective structure would have created too much confusion.

 

Recommendation #2: All departments should reevaluate their RPMs. The number of RPMs should be reduced and each one should be more selective, more challenging and more clearly linked to the mission statement of each department.

 

Response to Recommendation #2: The substance of the recommendation is being implemented on an ongoing basis. During quarterly operations review meetings (ORMs) and during the budget preparation cycle, performance measures are evaluated with respect to their usefulness, accuracy, and relevance. When performance measurement was introduced in Santa Barbara County, we made it clear that some measures originally tried would prove insufficient or not useful; there is a constant effort to upgrade some and replace others.

 

Recommendation #2 also suggests, however, that the number of RPMs should be reduced and made more challenging. We disagree with this aspect, and it will not be implemented. Some program cost centers exhibit an appropriate number and type of performance measures, but others have too few, and some (including some that represent significant budget and activity) have none; in these cases we actually need more performance measures. Attachment C (a chapter entitled "Measurement and Management in the Information Age" from the nationally celebrated book The Balanced Scorecard) explains why concentrating on too few measures is counterproductive.

 

Finally, Recommendation #2 suggests that the performance measures should be linked to the mission statement of each department. We believe this linkage should evolve over time, since forcing it now could produce a layered, time-consuming, and abstract attempt to fulfill an arbitrary logic structure. Instead, the performance measures are currently linked to program cost centers so that we can ascertain whether the level, efficiency, and effectiveness of the activities appear appropriate to the dollars and staffing requested.

 

For example, on page D-195 of the current budget, four performance measures are provided for the food stamp eligibility unit. This is a twelve-employee, $473,000 operation which assists qualified poor people in obtaining food stamps in a timely manner. The measures of success are speed and accuracy. If the unit were to perform its functions too slowly, children could go hungry, particularly over weekends and holiday periods. On the other hand, if the staff is careless, individuals who are not qualified could receive the benefit, wasting taxpayer dollars and diminishing the resources available for those who are truly in need.

 

The mission statement of the Social Services department is to "provide protective services, employment services and financial assistance which assist the residents of Santa Barbara in becoming productive and self sufficient contributors to the community." We agree that it is important for the Social Services department to measure its overall success in assisting "residents … in becoming productive and self sufficient contributors to the community." Reduction in the caseload, the number of former clients working in unsubsidized private sector employment, and reduction in recidivism (the number of clients who come back onto the welfare rolls) are all important impact measures related to the mission statement of the department. This notwithstanding, it is also important to measure the effectiveness of the food stamp eligibility unit itself. In fact, measures for the food stamp unit could be rephrased to include a lead-in such as "to promote self sufficiency …"

 

Finding #3: Current recurring performance measures largely focus on quantitative results, many of which are related to timely performance and are unnecessarily brief, lacking detail that could make them more understandable and meaningful, showing the true significance of what is being measured.

 

Response to Finding #3: Agree.

 

Recommendation #3(a): All departments should use recurring performance measurements that show the value of the activity in relation to the entire scope of data available. For example, if a department plans to increase an activity by 5%, it is necessary to understand what the prior level of the activity was and how significant that is in terms of the total universe for the activity.

 

Response to Recommendation #3(a): We absolutely agree with this recommendation and strive to implement it and teach it at every opportunity. It has been part of every training session, and we continually raise the issue during the quarterly operations review meetings (ORMs) to help users understand this fundamental requirement. The underlying baseline figures should also be included so that the reader can understand the magnitude of the operative issues being measured. As indicated in our response to Recommendation #1, we intend to focus on improving performance measures over the next two years.

 

Recommendation #3(b): All departments should consider each quantitative measurement in terms of potential negative side effects and develop RPMs that measure the qualitative aspects of those measures.

 

Response to Recommendation #3(b): The recommendation will be implemented on an ongoing basis, with particular emphasis during the next two years. Implementation techniques include making the measures more descriptive of the qualitative outcomes expected to be driven by the quantitative aspect. Alternatively, measures can be presented as twins: a header stating the expected qualitative outcome, followed by one sub-measure expressing the frequency, velocity or other numeric aspect and another sub-measure expressing the qualitative outcomes.

 

Recommendation #3(c): All departments should break out data for outcome and service-quality indicators to help staff in assessing the success and shortcomings of the recurring performance measure. This may mean the expansion of cost centers to more accurately reflect the department’s activities.

 

Response to Recommendation #3(c): The recommendation will be implemented on an ongoing basis, with particular emphasis during the next two years. In some cases the budgets are still expressed at too general a level to attach measurements in a meaningful way. Moreover, the Grand Jury is absolutely correct that most employees work in some sort of work unit which in many cases could easily be conceptualized as a cost center. For employees to really understand and utilize performance measures and the related scorekeeping, the particular working cost center must be identifiable with its measures. In addition, its staff and financial resources need to be explicit so that employees can see and understand the related input measures.

 

The annual United Way campaign provides a good example in this regard. Even though the United Way reports the total results of the fund drive from all sources, the campaign is broken down into subdivisions such as public sector, technology, banking, real estate, corporations of different sizes, education, health care, etc. Each of these divisions has its own separate goal, which is a subpart of the larger total community goal. Each division is in turn further broken down into component businesses, institutions and professional groups, each with its own target. Everybody at each level knows what the goal is and how they are doing against it, which is a powerful incentive for performance. Measures at each level include not only dollars raised, but calls made to the key opinion makers in a particular business or industry, events held to mobilize support, volunteers and contributions, and special contributions from higher paid executives by industry and company. Anyone who has participated in a United Way campaign is familiar with this very extensive layered system of performance measures.

 

Finding #4: Training for department heads and managers with respect to techniques for developing RPMs was reported to be insufficient. As a result, many recurring performance measures are not defined well enough to provide meaningful data over time.

 

Response to Finding #4: We disagree with the essence of the Finding, although it is difficult to agree or disagree with what "was reported" to the Grand Jury, and, as indicated in responses to previous Findings and Recommendations, we agree that the performance measures can be improved. It is not clear from the Report why the Grand Jury accepted anecdotal data that training was or is "insufficient" and then reported it as validated fact. The Grand Jury Report itself states that three departments were sampled; since it is not known which departments these were, it is not clear whether the sampling is valid. The Grand Jury points out that:

 

  1. "Departments were initially given theoretical training in identifying and developing recurring performance measures"

  2. "Training was directed at upper management, with little or no input from second line supervisors or line staff; however, this year there have been several continuing sessions for upper level management"

In fact, training has been extensive and ongoing:

    1. Spring 1997 – Introductory classes focused on the use of cost centers, criteria for selecting cost centers (significance, incidentally, is one of the key criteria), kinds of performance measures (indicators, efficiency, effectiveness, outcome/impact), unit measures, key elements of performance measures (action, problem, current state, proposed state, time interval during which the solution will occur), and actual examples.
    2. February 1998 – City/County Management Association performance training for both department heads and budget preparers.
    3. Spring and Summer 1998 – Training on inputting and using the performance measurement model of FIN.
    4. November 1998 – National performance management expert David Ammons conducted onsite training for about 90 executives and management staff.
    5. December 1998 – The Budget Director conducted a Performance Measurement Workshop.
    6. The 1999-2000 Budget Manual contains a detailed tutorial on performance measures.
    7. Several department heads and a number of staffers have been involved in the International City/County Management Association consortium on performance measurement.

 

Understanding of performance measurement should by now be generally pervasive throughout the organization.

 

Recommendation #4: The County Administrator should offer extensive additional training, including department specific training, for department staff on an ongoing basis after the introduction of performance based budgeting, especially in the area of developing RPMs. A significant portion of the training should occur as a working session, not as a general theory class.

 

Response to Recommendation #4: The recommendation has been implemented. In addition to the training already provided (see response to Finding #4), we have developed an Employee University course on performance measurement, which will be available to all County employees. The first session is scheduled for Thursday, June 17, 1999. The class is being developed directly by the County Administrator and a frontline supervisor so that frontline employees’ needs and experience can be incorporated into the training. The class will be offered at intervals, with the goal that several hundred staffers will have completed it within several years.

 

This class will also be provided on a custom basis at the request of specific departments.

 

Instructors will have to be certified by the Employee University and trained in correct methods of adult learning in order to properly engage the County workforce.

 

The goal of the course is that graduates will be able to design, write and evaluate recurring performance measures (RPMs) tied to budgetary program cost centers. Specific learning objectives provide that upon completion of the course participants will be able to:

 

 

Finding #5: Management staff was responsible for developing the initial RPMs and the county’s front-line workers have little knowledge or understanding of RPMs.

 

Response to Finding #5: Agree.

 

Recommendation #5(a): The County Administrator should provide staff training and workshops at all levels to create a "buy in" to recurring performance measures by staff as part of a team-building process.

 

Response to Recommendation #5(a): The recommendation has been implemented; see the response to Recommendation #4 above. However, in a layered organization of 4,200 employees, each department also has responsibility for making sure that each subpart of its organization is actively engaged, and Department Directors have been asked to do so.

 

Recommendation #5(b): All departments should include discussion of relevant RPMs in each employee’s annual performance review.

 

Response to Recommendation #5(b): This will not be implemented for general employees at this time, although it could become a factor in the future. In some cases, RPMs may be used to assist and facilitate discussions among general employees about the performance of departmental services and functions. (See the Sheriff’s response, Attachment A.)

 

Currently, the use of RPMs is included in the annual performance reviews of all executives who are subject to the control of the Board of Supervisors and the County Administrator; an entire section of the executive performance review relates to the use of RPMs and related tools. The County Administrator has encouraged department heads to use the executive performance review format with their deputies, division heads, and higher level management and analytical employees.

 

Finding #6: Quarterly reviews are held with each department to evaluate actual performance (degree to which the measure is being met) but little time is spent discussing the reliability or value of the measure itself as it is written. In addition, there is no stated plan for benchmarking with other agencies using performance measure information.

 

Response to Finding #6: Partially agree. We agree that there is no stated plan for benchmarking with other agencies using performance measure information. However, a set of ten comparable counties has been developed and is displayed in the 1998-99 budget. These counties are similar in size, budget, population, and the percentage of population receiving municipal-type services from the county rather than from incorporated cities. The Public Health Department has developed an extremely powerful and attractive professional performance report depicting environmental and performance impact data among the ten comparable counties. The next step could be to relate expenditures, staffing, and performance measures relevant to each of the indicators to determine whether any relationship exists.

 

The County Administrator will encourage each department to develop a parallel report. Also, the Sheriff’s Department, Fire Department, Parks Department, Planning and Development Department, General Services Department, Personnel Department, and Public Works Department have been requested to participate in the International City/County Management Association Comprehensive Performance Measures Consortium.

 

We disagree that little time is spent discussing the reliability or value of measures during the quarterly reviews. In fact, the County Administrator and the Department Heads spend considerable time at each ORM discussing this very subject and working on improvements, eliminating bad or irrelevant measures, and seeking ways to present measures in a better manner.

 

Recommendation #6: The County Administrator should develop a comprehensive plan to address continued refinement of the RPMs themselves and benchmarking with other agencies using performance measure information.

 

Response to Recommendation #6: The County Administrator will include a project in the 1999-2000 Budget to undertake this effort.

 

Finding #7: The County has tied salary increases to performance measures that do not directly improve specific departmental processes.

 

Response to Finding #7: Disagree. The County has not tied salary increases to measures that do not directly improve specific departmental processes; rather, it has tied potential salary increases to achievement of reduced absenteeism on a bargaining-unit-by-bargaining-unit basis. The Grand Jury report and the desired results on this subject are not clear. Lost time (absenteeism) costs the County an estimated $10.3 million per year. The County is primarily a service organization, and when employees are not present many assignments must be backfilled by other employees at premium overtime pay of time and one half; in other cases work is deferred as colleagues fill in. As has been reported to the Board of Supervisors, workers’ compensation costs are a growing problem, and the measurement of this issue and the provision of incentives and disincentives to address it are imperative.

 

There are no salary increase provisions tied directly to improving specific departmental processes.

 

Recommendation #7: The County Administrator and all individual departments should refrain from introducing significant changes or incentives associated with specific recurring performance measures until the measures are consistently and reliably measuring the primary processes within departments.

 

Response to Recommendation #7: In effect, Recommendation #7 is implemented, as we have not and would not tie pay incentives to performance measures that are unreliable or unimportant.