SA Crime Quarterly no 38 • December 2011

MEASURING OUTPUTS, NEGLECTING OUTCOMES

David Bruce*
davidbjhb@gmail.com

Performance monitoring, often based on the use of performance indicators, has become a central aspect of the work of government departments in South Africa. Even though the South African Police Service (SAPS) is regarded as one of the leading government departments in the use of performance monitoring systems, it does not use performance information in a critical enough manner, particularly given the risk that the introduction of performance measures will lead to perverse incentives. Given that the SAPS is one of the largest police services in the world, the centralised reporting on organisational performance in the annual report is ineffective. It obscures much more than it reveals about what is being achieved by the organisation. Since 2005 the Auditor General of South Africa has been phasing in a ‘predetermined objectives’ audit that involves checking on the reliability of the performance information presented by the SAPS. Though he has limited capacity to do so, the AG also carries out what are called ‘performance audits’, which involve deeper and more focused scrutiny of the functioning of government departments. A 2008-09 performance audit on service delivery at police stations and 10111 call centres highlights the type of scrutiny that the SAPS needs to be subjected to if information on its performance is to become more meaningful.

Since the early 1980s an increasing number of governments have invested in systems of performance management, incorporating the use of performance indicators and the implementation of systems of performance assessment. Behind the drive towards improving performance management has been a concern with the efficiency and effectiveness of government expenditure. Governments increasingly want answers to the question: is our money being used productively?
The Auditor General’s role in SAPS performance assessments

* David Bruce is an independent researcher. This paper was written on behalf of the African Policing Civilian Oversight Forum (APCOF). Thanks are due to Sean Tait of APCOF, Chandré Gould of the ISS and an anonymous reviewer for assistance and direction with this article.

Considering the massive investments of public money into public sector organisations, it is hard to argue that this is not a necessity. Nevertheless, performance measurement is in some ways a mixed blessing. This is partly because, unless the government agency in question is involved in an activity that is fairly simple (the provision of water, perhaps), it is often extremely difficult to find indicators or measures that adequately capture or do justice to the activities that the agency is supposed to be engaged in. Good performance measures should reflect the key goals that organisations hope to achieve. Even assuming these are clearly enough defined, it may be difficult to find meaningful ways of measuring what an organisation achieves (the outcomes), rather than just what it does (the outputs). In the field of policing this is a notorious difficulty. Speaking about the use of indicators internationally, Bayley1 says:

Most performance indicators focus, unfortunately, on outputs rather than outcomes, with the result that police officers give more attention to reporting what they do rather than what they achieve. This causes them to become pre-occupied with meeting norms of activity rather than adapting their activity to produce desired results, which in turn discourages innovation and reduces operational flexibility.

The construction of appropriate measures is therefore something of a high art. One of the inherent risks of indicators is that they have unintended consequences such as ‘promoting inappropriate behaviour or malicious compliance’.2 This has been illustrated in South Africa with evidence of performance management related targets and systems contributing to a pattern of non-recording of cases reported at police stations.3 Performance measures are also necessarily quantitative, with the consequence that the business of government and the work of public servants become dominated by a pre-occupation with meeting certain numeric targets. This has the potential of compromising the overall quality of the service provided. Another factor is that the implementation of performance management systems is expensive and involves a massive investment of resources and time by government personnel, particularly those in managerial positions.

In South Africa, no less than in other countries, government has emphasised performance monitoring and performance management. The emphasis on performance management systems may be understood as derived, at the end of the day, from the principles of accountability embodied in the Constitution. Legislative provisions that form the basis for the implementation of performance measurement systems include Section 38(1)(b) of the Public Finance Management Act.4 This Act requires that ‘accounting officers’ for government departments or other government entities are ‘responsible for the effective, efficient, economical and transparent use of the resources of the department…’ Performance management and, more importantly, measurement, is intended to be a way of evaluating whether departments are delivering on their obligations effectively, efficiently and economically. More specific provisions such as Section 27(4) of the Public Finance Management Act (of 1999) require accounting officers to submit ‘measurable objectives’ to Parliament with their draft budgets. Treasury regulations also require that the strategic plan include ‘the measurable objectives, expected outcomes, programme outputs, indicators (measures) and targets of the institution’s programmes’.5

PERFORMANCE MEASUREMENT WITHIN THE SOUTH AFRICAN POLICE SERVICE (SAPS)

The SAPS has used systems of performance measurement, in compliance with government policies, from at least the late 1990s. An annual performance plan is published each year.6 This provides indicators and related measures for assessing the performance of each programme and sub-programme of the department. As with other departments, the mechanism for reporting on performance is the annual report. A report for the immediately preceding financial year (to end March) is presented to Parliament and published in September or October of each year.7 Over the years the SAPS use of indicators and measures has steadily become more sophisticated, and the SAPS is regarded as one of the leading government departments in terms of its use of performance measures.8

Since 2003, as part of its drive to improve its ability to assess performance, the SAPS has also introduced what is known as the ‘performance management chart’. This is an information technology-based system for monitoring and comparing the performance of police stations. The performance chart partly relies on information recorded on the Crime Administration System regarding levels of recorded crime (as measures of crime prevention) and on detection rates and the percentage of cases that go to court (as measures of crime investigation). The system compares the performance of a station against its own previous performance. Stations are then ranked against each other according to the level of improvement in performance that they have achieved (relative to their past performance).
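Neither this article nor the plan it describes spells out the chart's scoring formula, so the following is only a minimal sketch of the general idea — scoring each station against its own past performance, then ranking stations on the size of the improvement — with the metric (detection rate) and the station names invented purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class StationRecord:
    station: str
    rate_last_year: float  # e.g. last year's detection rate, in per cent
    rate_this_year: float  # this year's detection rate, in per cent

def rank_by_improvement(records: list[StationRecord]) -> list[tuple[str, float]]:
    """Score each station against its own previous performance, then
    rank stations against each other by the size of the improvement."""
    scored = [(r.station, r.rate_this_year - r.rate_last_year) for r in records]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

records = [
    StationRecord("Station A", 30.0, 36.0),  # improved by 6 points
    StationRecord("Station B", 55.0, 54.0),  # declined by 1 point
    StationRecord("Station C", 40.0, 43.0),  # improved by 3 points
]
print(rank_by_improvement(records))
```

Note that the hypothetical Station B still outperforms the others in absolute terms: ranking on improvement rewards trajectory rather than level, which appears to be the point of comparing a station ‘against its own previous performance’.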
The aim is to encourage police leaders and members to focus on continuously improving their activities and operations in line with the strategy of the SAPS.

Not unexpectedly, several of the performance indicators that tend to be the focus of attention by politicians, the media, the public and the police themselves relate to crime reduction targets. In the 2011/12 performance plan, for instance, the SAPS established targets to reduce the number of ‘reported serious crimes’ and ‘reported serious crimes within the rural environment’ by two per cent each, and to reduce the number of contact crimes9 and trio crimes10 by between four and seven per cent each. Yet within the criminal justice field, as well as amongst the police themselves, there is ambivalence as to whether levels of crime should be used as a measure of police performance. Many argue that levels of violent crime are driven by factors over which the police themselves have no control.11 The SAPS itself has consistently argued that certain categories of violent crime (amongst them murder, assault and rape) should be seen as ‘less policeable crime’ on which they cannot be expected to have a significant impact.12

What is important is that the link between changes in the crime rate in South Africa and police action or strategies is never self-evident. The SAPS is a ‘policing behemoth’,13 one of the largest police services in the world, with close to 200 000 employees. A national level figure indicating that ‘recorded contact crime’ has gone down by a certain percentage in many ways mystifies the issue rather than actually answering any questions. The key question is ‘Can the police demonstrate that their primary crime prevention strategies are making a difference?’14 This would require a more systematic evaluation of the strategic and operational practice of police stations and other SAPS units. It may indeed be true that the SAPS is having a substantial impact on levels of crime.
But the way in which indicators are currently used does not really tell us whether or not this is so.

A second problem with these types of indicators is that they can create perverse incentives. Over recent years concerns have been raised that a major contributor to reductions in recorded violent crime has been a systematic pattern of non-recording of crime. Between 2008 and 2010 allegations relating to such manipulation were made in relation to eleven police stations in the Western Cape and four in Gauteng, as well as stations in KwaZulu-Natal and Limpopo,15 though no formal investigation was conducted into the matter. The reliability of crime statistics as an accurate record of reported crime depends on the integrity of police practice in recording crime. The allegations above suggest that in recent years the pre-occupation with indicators of performance has taken a toll on this type of integrity. These issues are consistently not acknowledged in the annual report or performance plan, even though they have a profound impact on the credibility of the information presented.

It needs to be acknowledged that the emphasis on crime reduction targets is in many ways politically driven. The SAPS has been under enormous pressure not only to reduce crime, but also to show that crime is going down. What is important to recognise, however, is that the crime figures that are presented in the crime statistics each year are not simply a representation of the work of the SAPS. Potentially they represent the impact of a number of factors, including socio-economic factors, the work of the SAPS, the impact of other security role players (other components of the criminal justice system and the private security industry), trends in public reporting, and SAPS practice in recording reports or allegations of crime.
The issue of the opacity of performance information does not only apply to performance indicators relating to recorded crime. The detective component of the SAPS alone, for instance, numbers over 28 000 people.16 One of the dilemmas that the SAPS faced in constructing its performance indicators was whether to identify itself as responsible for conviction rates. The way in which the SAPS has resolved this in recent performance plans has been to focus primarily on two types of indicators, measured independently for various sub-categories of crime. These indicators are the ‘detection rates’, described as:

the total number of charges referred to court plus charges withdrawn before court and charges closed as unfounded ... divided by the total number of charges investigated17

and the ‘percentage of court ready case dockets’, described as:

the provision of a fully investigated case docket, whether it includes one or more charges (investigations finalised), which can be utilized by the National Prosecuting Agency (NPA) for the prosecution of an offender/s on the charges linked to the docket. Fully investigated entails that there is no outstanding information which requires further investigation by a detective, and that all evidence (e.g. statements, DNA, toxicology) has been obtained.18

Again, however, the presentation of single figures on ‘detection rates’ or dockets that are said to be ‘court ready’ obscures far more than it reveals. Undoubtedly there are major variations in the quality of these ‘detections’ and ‘court ready’ dockets. The pressure to improve detection rates potentially has its own set of unintended consequences and perverse incentives.
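Expressed as arithmetic, the detection rate definition quoted above counts withdrawn and ‘unfounded’ charges as detections. A small illustration with invented figures (these are not SAPS data) shows how withdrawals alone can lift the rate without a single additional case reaching court:

```python
def detection_rate(referred_to_court: int,
                   withdrawn_before_court: int,
                   closed_as_unfounded: int,
                   total_investigated: int) -> float:
    """Detection rate per the definition quoted above, as a percentage:
    (referred + withdrawn + unfounded) / total charges investigated."""
    if total_investigated <= 0:
        raise ValueError("total_investigated must be positive")
    detected = referred_to_court + withdrawn_before_court + closed_as_unfounded
    return 100.0 * detected / total_investigated

# Hypothetical station figures for one category of crime
print(detection_rate(40, 20, 5, 200))   # 32.5
# Forty additional withdrawals, same number of cases in court: a 'better' rate
print(detection_rate(40, 60, 5, 200))   # 52.5
```

This is the mechanism behind the concern, discussed below, that officers may be incentivised to encourage complainants to withdraw cases.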
The performance plan indicates that detection rates are calculated partly on the basis of cases that are withdrawn by complainants.19 This increases the possibility that police officers may play an active role in encouraging complainants, for instance in domestic violence cases, to withdraw cases. The pressure to improve detection rates and the number of court ready dockets also increases the possibility that detectives will be motivated to take ‘short cuts’, such as using coercive measures20 or fabricating evidence when investigating cases, and increases the risk of wrongful convictions.

This is not to say that the police should not use performance indicators, but to raise questions about their design and implementation. One question here concerns the degree to which current performance measures actually tell us anything about the contribution that the SAPS is making towards achieving its key goals. The use of performance measurement systems also needs to be accompanied by a parallel process that seeks to ensure that performance is achieved not only against measures of quantity but also against measures of quality. But whilst the SAPS can speak with some pride of its framework of performance indicators and measures, it cannot do the same in relation to its mechanisms for ensuring the quality of its work. For instance, one of its principal mechanisms for doing so, the SAPS National Inspectorate, has for some years failed to execute its mandate ‘owing to poor leadership and unwarranted reorganisations’.21

Though the framework of performance indicators is, by many standards, of good quality, there are also important dimensions of policing that are not addressed. The SAPS in its annual or other reports, for instance, does not monitor, and therefore cannot provide, information on the use of force, including lethal force, by its members.
Though statistics on killings by police are presented in the annual report of the Independent Complaints Directorate (ICD), the SAPS makes no mention of these figures in its own reports. The performance plan does include an indicator relating to the investigation of cases of corruption,22 but the plan makes no mention of the issue of the use of force or, for that matter, police safety. There are also profoundly important aspects of democratic policing which are not, and potentially cannot be, represented by this type of performance information, such as the necessity that police should not serve specific party political interests.23

THE AUDITOR GENERAL’S ROLE IN ASSESSING POLICE PERFORMANCE

Regularity audits

The key auditing function performed by the Auditor General24 is often seen to be the audit of financial statements. In this respect the SAPS has fared well. In a presentation to the Portfolio Committee on Police in August 2009, the AG identified a number of good practice benchmarks that the SAPS had met, including:

• A clear trail of supporting documentation that is easily available and provided timeously
• The quality of financial statements and management information
• The timeliness of financial statements and management information
• Availability of key officials during audits
• Compliance with risk management and good internal control and governance practices25

Related to this, the significance of the AG’s report is often seen to be the ‘opinion’ or ‘conclusion’ that is reached on whether the statements reliably reflect the financial position of the department. However, the role of the AG is in fact more broadly defined. In line with Section 20 of the Public Audit Act26 the audit of financial statements is merely part of a broader process of ‘regularity’ auditing that includes two other important facets.
The first of these examines compliance with key legislation, including legislation pertaining to ‘financial matters, financial management, and other related matters’.27 The second is known as the ‘predetermined objectives’ audit (until quite recently this was referred to by the AG as the audit of performance information, abbreviated as ‘AoPI’). This involves scrutiny of performance measurement systems and data.

Predetermined objectives

The audit of ‘programme performance against predetermined objectives’ is referred to in Section 20(2)(c) of the Public Audit Act28 as well as Section 40(3)(a) of the Public Finance Management Act.29 Essentially this component of the audit can be seen as an assessment of whether the government department is using performance indicators and measures in the prescribed way.30 It therefore involves an assessment of systems and controls relating to the management, collecting, monitoring and reporting of performance information. This includes checking on the consistency of performance information between the strategic/annual performance plan, quarterly reports and annual performance report. It also involves comparing reported performance information to relevant source documentation and conducting procedures to ensure the validity, accuracy and completeness of reported performance information.31

Since 2005-06 the AG, in cooperation with the National Treasury, has ‘phased in’ the predetermined objectives component of the audit.32 In 2008 the AG announced that the ‘phasing in approach to compliance’ would continue ‘until such time as the environment shows a state of readiness to provide reasonable assurance in the form of an audit opinion or conclusion’. The phasing in period would involve a ‘review of the policies, systems, processes and procedures for the managing of and reporting on performance against predetermined objectives’.
During the phasing in period the AG would not provide a formal ‘opinion’ or ‘conclusion’ relative to the audit of predetermined objectives. Nevertheless, the AG does report on ‘material shortcomings in the process, systems and procedures of reporting against predetermined objectives’. These are reported under the ‘Other reporting responsibilities’ (or for instance ‘Other legal and regulatory requirements’) section of the audit report.33 (A similar approach is followed in relation to the ‘compliance’ component of the audit.)

The AG’s findings in relation to the SAPS

In 2007 and 2008 the AG’s report indicated that there were ‘no matters observed that require inclusion in the report’ or ‘no significant findings’ in relation to the audit of predetermined objectives. However, in subsequent reports a number of issues were raised. For instance, both the 2009 and 2010 reports of the AG raised concerns about compliance with regulatory requirements,34 stating that:

The Department of Police did not in all instances maintain an effective, efficient and transparent system and internal controls regarding performance management, which describe and represent how the institution’s processes of performance planning, monitoring, measurement, review and reporting will be conducted, organised and managed, as required in terms of section 38(1)(a)(i) and (b) of the PFMA.

Another issue that was raised in the 2009 report concerned the ‘usefulness’ of reported performance information. This is essentially an assessment of the performance measures used by the department.35 The report indicated that certain key targets used by the department were not ‘specific’, ‘measurable’ or ‘time bound’. The issue was not raised in the 2009-10 report, however, perhaps indicating that, in the view of the AG, there had been an improvement in the usefulness of the targets.
The AG’s reports from 2009, 2010 and 2011 also questioned the ‘reliability’ of the performance information. For instance, in the 2009 report the AG stated that:

The accuracy and completeness of reported performance information could not be verified for programme 2: Visible policing and programme 3: Detective services. This was as a result of performance information that has either been captured inaccurately or incomplete from source documentation such as case dockets.36

In a presentation made in October 2011 the AG identified these questions about the reliability of SAPS performance information as being related, amongst other things, to the unavailability of supporting documentation at stations/units, systems for collating information not being updated, inaccurate information being sent through, and inconsistencies between data in supporting documentation or data systems.37

As indicated, due to the ‘phasing in’ approach the issues that have been raised do not have the weight of formal ‘opinions’ or ‘conclusions’.38 However, they do indicate that there continue to be matters of substantial concern to the AG in the use by the SAPS of performance information, notwithstanding the fact that this is regarded as being of a relatively high standard. In the words of a framework document from the Presidency, the AG’s focus, by means of the audit of predetermined objectives, is ‘mainly on validating the credibility of performance information’.39 In other words, it looks at the robustness of the relationship between indicators, performance information and supporting documentation, so that there can be some level of confidence that the performance information that is reported is correlated with actual performance. The predetermined objectives component of the audit therefore provides some indication as to whether the SAPS can actually back up some of its performance claims with supporting information.
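The validity, accuracy and completeness checks described above amount, in essence, to reconciling a reported figure against its source records. The following sketch of that reconciliation logic is purely illustrative — the field names and structure are invented, not the AG's actual working method:

```python
def reconcile(reported_total: int,
              source_docket_ids: set,
              captured_docket_ids: set) -> dict:
    """Compare a reported performance figure with source documentation.

    - completeness: source dockets never captured in the reporting system
    - validity: captured records with no supporting source docket
    - accuracy: does the reported total match the captured records?
    """
    missing_from_capture = sorted(source_docket_ids - captured_docket_ids)
    unsupported_captures = sorted(captured_docket_ids - source_docket_ids)
    return {
        "accurate": reported_total == len(captured_docket_ids),
        "missing_from_capture": missing_from_capture,
        "unsupported_captures": unsupported_captures,
    }

# A station reports 3 'detections'. Dockets D1-D3 exist on paper, but the
# system holds D2, D3 and an unsupported D9 (all identifiers hypothetical).
finding = reconcile(3, {"D1", "D2", "D3"}, {"D2", "D3", "D9"})
print(finding)
```

Note that in this example the total ‘checks out’ (three reported, three captured) even though the underlying records do not match, which is why record-level comparison against source documentation, rather than a comparison of totals, is needed to establish credibility.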
Performance audits

The predetermined objectives component of the audit was previously referred to as the ‘audit of performance information’ (AoPI). There is therefore a high risk that people who are unfamiliar with the framework to which the AG adheres will confuse this with what are termed ‘performance audits’. But the two are substantially different. As opposed to the audit of predetermined objectives that is referred to in Section 20(2)(c) of the Public Audit Act, performance audits are carried out in terms of Section 20(3), which provides that ‘the Auditor-General may report on whether the auditee’s resources were procured economically and utilised efficiently and effectively’.40 One feature that both audits have in common is that they include a focus on questions of compliance with laws and regulations.41 Performance audits are however not linked to the financial year ‘and can cover more than one financial year’.42

Performance audits are therefore audits of a different kind. They are not part of the ‘regularity audit’ or necessarily linked to indicators, but an ‘independent auditing process to evaluate measures instituted by management to ensure resources are procured economically and are used efficiently and effectively’.43 ‘The primary objective of performance auditing is to confirm independently that these measures do exist and are effective; and to provide management, Parliament and other legislative bodies with information … on shortcomings in management measures and examples of the effects thereof.’44 In line with the general role of the AG these audits are not intended to question policy45 but are concerned with how resources that are procured with public money are being used in practice.
An example that illustrates the distinction between the two types of audits is that the regularity audit may examine whether the paper trail indicates that proper tender procedures were followed when vehicles were acquired. A performance audit on vehicle acquisition might ask: Was the acquisition of vehicles necessary? Was it the most economical acquisition? Are the vehicles in fact being utilised for the purpose that they were intended for?

Performance audits are therefore a mechanism enabling the AG to carry out examinations of the functioning of governmental agencies that are more focused and deeper than is generally possible by means of the audit of predetermined objectives. However, the performance audit unit is very small and the AG’s capacity to perform these audits is therefore limited. As a result these audits are undertaken selectively. The current approach of the AG to resolving the issue of limited capacity is to focus on ‘transversal audits’, which examine a specific issue across a number of key government departments. For instance, during 2011 a transversal audit was being conducted in relation to the use of consultants by a number of departments, including the SAPS.

Audit of SAPS service delivery

One example of the type of assessment of the police that can be obtained through performance auditing was an audit on service delivery at police stations and the functioning of the 10111 call centres, carried out in 2008-09. (A ‘limited performance audit’ on border management by the SAPS was also carried out in 2007.46) The audit was carried out by the AG at the request of the Portfolio Committee on Police.
The audit was carried out at five police stations and one 10111 call centre per province, with some effort made to select a sample that represented the diversity of stations in terms of ‘nature of operations, risk profiles, size and location’.47 The table below gives examples of issues that were the focus of the audit as well as questions that were addressed in relation to each focus area.

Table 1: Performance audit of service delivery at police stations and 10111 call centres (focus of audit, followed by key questions addressed)

Sector policing
• Is there an approved policy for sector policing?
• Is there a sector commander profile?
• Are minutes of sector crime forum meetings compiled?

Vehicle management
• Do operational members have drivers’ licences?
• Do operational members have authority to drive a state vehicle?
• Do stations have the capacity to make full use of vehicles that are available?
• Does management have accurate statistics on members with/without licences?
• How consistent is the SAPS in implementing systems for controlling the use of vehicles?
• Are vehicles repaired efficiently?

Training
• Can management accurately assess what number or proportion of operational members have attended training?
• What proportion of members have received specific in-service training?

Community service centres (CSC)
• Does the design of CSCs cater for physically disabled people?
• Are basic services such as water, electricity and sanitation functioning?
• Are there proper identification parade rooms?
• Is information for the public (‘service charters’) displayed as required?
• What is the condition of holding cells?

Domestic violence
• Is there adherence to procedures provided for in national instructions?

Bullet proof vests
• Is there adherence to procedures?
• Is safety equipment (bullet proof vests) provided as required?

10111 call centres
• Do all areas have 10111 call centres?
• Is equipment adequate and in proper working order?
• Is sophisticated call centre technology (including technology for locating police vehicles) being utilised?
• What is the level of service (such as the time taken to respond to calls) provided by the call centres?
• Are reaction times (i.e. the time taken from when a call is received to when a vehicle is dispatched) in line with prescribed SAPS standards?
• What oversight is there by management on adherence to standards?

Despite the small sample, the report on the performance audit is informative. While presented as an audit of service delivery, in practice much of the audit is focused on questions of resourcing and systems that have been put in place in order to enable the police to achieve their service delivery objectives, including:

• The quality of management information and of management oversight. For instance, under the training section the audit indicated that ‘the PERSAP48 was not accurately updated to reflect all training courses attended by members, thereby compromising the accuracy of management information’ on the number of members who had attended specific training modules.49

• The availability and implementation of policies. For instance, in examining how stations were dealing with domestic violence the audit ‘revealed various instances’ where the requirements of the relevant National Instruction regarding the police response to domestic violence cases were not adhered to. At one police station, it was noted that in 15 (75%) of the cases examined, ‘the SAPS 508(a) (reports of domestic violence incidents) was not completed by officers in the CSC and the OB [Occurrence book] number was not recorded in the domestic violence register’. The report concluded that this is likely to contribute to ‘a risk of a lower percentage of incidents being recorded and culminating in criminal charges’.50

• The adequacy and utilisation of resources. For instance, in Gauteng a sophisticated 10111 call centre was opened in 2007. However, the audit found that state-of-the-art AVL (automated vehicle location) technology that was installed at the centre and in over 4 000 vehicles was not in fact being used.51

• Whether SAPS members have the skills or qualifications to carry out certain key functions. For instance, at one police station the audit found that 74% of operational members did not have a valid driver’s licence.52

Though the focus is not primarily on service delivery, where the resourcing or systems do not appear to be adequate the audit sometimes spells out the implications. For instance, in relation to SAPS members without drivers’ licences it not only implicitly raises questions about the recruitment and training systems but also points out that:

Operational members without drivers’ licences increase the risk of shifts not being manned by trained drivers, resulting in service delivery inefficiencies in, inter alia, sector policing and reaction time to crimes reported.53

The report is not focused on issues of policy and therefore reflects issues of ‘implementation’ rather than ‘design’. This particular performance audit also does not tackle some of the questions that might be seen to be relevant to an assessment of service delivery at police stations. For instance, the consideration of sector policing and domestic violence only deals partially with questions about the deployment of visible policing units and the nature of their role in crime prevention. It does not question the quality of investigations carried out by detective units. The report also does not ask, or answer, important questions about the nature of police-community relationships, or, for that matter, measures to ensure adherence by police at station level to standards of integrity and human rights. Despite these shortcomings, the performance audit is a deep audit that provides an illuminating picture of the operation of the SAPS. It provides a compelling illustration of the kind of scrutiny that is necessary if one is to be able to develop an appropriately nuanced picture of the operation of the SAPS in practice.

CONCLUSIONS

In a democratic society the central task of the police is to serve and protect members of the public. In the words of policing scholar David Bayley:

The most dramatic contribution police can make to democracy is to become responsive to individual citizens’ needs. […] A police force whose primary business is serving the disaggregate public […] [demonstrates] daily and practically that the authority of the state will be used in the interests of the people.54

The existence of police stations, of which there are currently 1 122, is then in itself apparent evidence of the intention to ensure that police are deployed in locations that will enable them to be accessible to and serve the public. One question that the AG could assess is whether police stations in South Africa are in fact located and resourced in such a manner as to ensure a reasonable level of equity in access to policing services. In assessing more directly the service that is provided by police stations, and the police more generally, to members of the public, some of the key questions that might be asked would include:

• Do they in fact reduce crime, disorder and fear, and promote public safety?
• What is their effectiveness in bringing offenders to justice?
• How promptly do they respond to emergency calls?
• Do they communicate with and serve members of the public in a professional manner?
• Are they responsive to vulnerable groups?
• Do they cooperate with other agencies and groups in order to enhance broader crime prevention activities?
• Do they follow professional standards in recording and reporting information on crime?

The use of performance indicators and measures by the SAPS is regarded as being of a high standard when compared to that of most government departments in South Africa, and perhaps even by international standards. But the information that the SAPS presents on its own performance by means of its annual report is in many ways highly opaque and at best represents a proxy for answers to some of these key questions. This is partly because of the size of the SAPS. A set of figures intended to represent the ‘performance’ of an organisation of close to 200 000 people invariably hides much more than it reveals. The emphasis placed on these targets also carries with it substantial dangers, for example when government departments become preoccupied with presenting a satisfactory set of numbers, resulting in the production of perverse incentives and a reduced emphasis on other factors that are important but do not directly affect the performance targets. In so doing performance measurement may potentially reduce police responsiveness to the actual concerns of communities. Perhaps it is for these reasons that a new British government discussion paper proposes ‘replacing bureaucratic accountability with democratic accountability’,56 inter alia by ‘removing Government targets, excessive centralized performance management and reviewing the data burden that is placed on forces – but ensuring that data are still available to local people’57 and restoring ‘professional judgement and discretion to the police’.58 Performance data have value.
But our ability to interpret data can only be enhanced by more detailed information about what is actually taking place at police stations. The process of holding police accountable requires mechanisms that can produce a deeper level of scrutiny. Performance audits carried out by the AG currently represent something of a model of the type of scrutiny that can provide meaningful information about the police. At the same time, the unit of the AG's office responsible for these audits does not have the capacity to carry out audits focused on the SAPS on a sustained basis. But it is not only the office of the Auditor General that has an interest in carrying out these types of assessments. Other agencies, including the national Civilian Secretariat for Police, the provincial secretariats, and potentially also the SAPS National Inspectorate, have comparable mandates. Subjecting the police to meaningful scrutiny will require a greater investment in developing the capacity within government to carry out this type of information-gathering exercise.

To comment on this article visit http://www.issafrica.org/sacq.php

NOTES
1. David Bayley, Democratising the police abroad: What to do and how to do it, Washington DC: National Institute of Justice, United States Department of Justice, 24, https://www.ncjrs.gov/pdffiles1/nij/188742.pdf (accessed 21 October 2011).
2. Presidency, Improving Government Performance: Our Approach, Pretoria: The Presidency, 2009, 12.
3. David Bruce, 'The ones in the pile were the ones going down': the reliability of current crime statistics, SA Crime Quarterly 31, 2010, 9-19.
4. Public Finance Management Act (Act 1 of 1999).
5. National Treasury, Republic of South Africa, Treasury regulations for departments, trading entities, constitutional institutions and public entities, 2005, 15, http://www.treasury.gov.za/legislation/pfma/regulations/gazette_27388.pdf (accessed 21 October 2011).
6. See for instance: South African Police Service, Annual Performance Plan 2011/2012, Pretoria: SAPS Strategic Management (Head Office), 2011, http://www.saps.gov.za/saps_profile/strategic_framework/strategic_plan/2011_2012/annual_perf_plan_2011_2012.pdf (accessed 22 September 2011). Note that the performance plan is different from the strategic plan, which is a multi-year plan (the strategic plan currently in operation is that for 2010-2014).
7. See for instance South African Police Service, Annual report of the South African Police Service for 2010/2011, Pretoria: SAPS Strategic Management (Head Office), 2011, http://www.saps.gov.za/saps_profile/strategic_framework/annual_report/index.htm (accessed 21 October 2011). An overview of measures, indicators and performance is also presented in the Estimates of National Expenditure published by Treasury as part of the annual budget at the beginning of each year.
8. This observation was made by an interviewee. Also see David Bruce, Gareth Newham and Themba Masuku, In service of the people's democracy – an assessment of the South African Police Service, Johannesburg: Centre for the Study of Violence and Reconciliation & Open Society Foundation for South Africa, 2007, 53, http://www.csvr.org.za/wits/papers/papsaps.htm (accessed 21 October 2011).
9. Violent crime against the person (i.e. excluding, for instance, malicious damage to property).
10. Vehicle hijacking, residential robbery and business robbery (robbery at non-residential premises).
11. See for instance the sources cited in David Bayley, Changing the guard: Developing democratic police abroad, Oxford: Oxford University Press, 2006, 80.
12. See for instance South African Police Service, The Crime Challenge Facing the SAPS, 2011, 9-10, http://www.saps.gov.za/saps_profile/strategic_framework/annual_report/2010_2011/10_crime_challenge_saps.pdf (accessed 19 October 2011).
13. Bruce, Newham and Masuku, In service of the people's democracy, 23.
14. Bayley, Changing the guard, 79.
15. Bruce, 'The ones in the pile were the ones going down'. See also Auditor General South Africa, Report of the Auditor General to Parliament on a performance audit of service delivery at police stations and 10111 call centres at the South African Police Service, 2009, 18.
16. The number includes 25 605 detectives based at police stations and another 2 617 who are part of the Directorate for Priority Crime Investigation (DPCI). The Detective Service division also includes 3 967 personnel in the Criminal Record Centre and 1 462 at Forensic Science Laboratories (SAPS, Annual Performance Plan 2011/2012, 18-19).
17. South African Police Service, Annual report of the South African Police Service for 2010, 86.
18. SAPS, Annual Performance Plan 2011/2012, 41.
19. Ibid.
20. The ICD, for instance, reported 45 allegations of torture in its statistical report for 2010-2011, the highest number reported since 1999. Graeme Hosken, Torture, murder, kidnap rife in cop ranks, Pretoria News, 30 September 2011, 1.
21. Bilkis Omar, 'Inspecting' the SAPS National Inspectorate, Pretoria: Institute for Security Studies, 2009, 11, http://www.iss.co.za/uploads/Paper207.pdf (accessed 29 September 2011).
22. The indicators for the detective service include one for 'Percentage of court ready case dockets for fraud and corruption by individuals within the JCPS Cluster' (SAPS, Annual Performance Plan 2011/2012, 41).
23. David Bruce and Rachel Neild, The police that we want: A handbook for oversight of police in South Africa, Johannesburg: Centre for the Study of Violence and Reconciliation, 2005, 20, http://www.csvr.org.za/docs/policing/policewewant.pdf (accessed 21 October 2011).
24. The Auditor General South Africa (AGSA) is referred to in this report merely as the Auditor General (AG).
25. Auditor General South Africa, Portfolio Committee on Police Strategic Planning Workshop, 12 August 2009 (PowerPoint presentation), 4.
26. Public Audit Act (Act 25 of 2004).
27. Auditor General South Africa, Public Audit Act (25 of 2004): General directive, General Notice 616 of 2008, Government Gazette, 15 May 2008, 4, http://www.info.gov.za/view/DownloadFileAction?id=81536 (accessed 21 October 2011).
28. See also Section 28(1)(c).
29. See also Section 55(2)(a) of the Act.
30. Some of the key guidelines in this regard are set out in National Treasury, Framework for Managing Programme Performance Information, May 2007, http://www.treasury.gov.za/publications/guidelines/FMPI.pdf (accessed 19 October 2011).
31. Auditor General South Africa, Department of Police: Audit of Performance Information (AoPI), AoPI National Strategic Management Workshop (PowerPoint presentation), 18 February 2010, 14.
32. Ibid, 13.
33. Auditor General, Public Audit Act (25 of 2004): General directive, General Notice 616 of 2008, 5.
34. By implication, questions of 'compliance' are not only addressed as a separate component of the 'regularity audit' but may also be addressed within one of the other components.
35. One set of criteria for selecting performance targets is the SMART criteria, which address whether targets are: Specific: the nature and the required level of performance can be clearly identified; Measurable: the required performance can be measured; Achievable: the target is realistic given existing capacity; Relevant: the required performance is linked to the achievement of a goal; Time-bound: the time period or deadline for delivery is specified (National Treasury, 2007, 10).
36. Report of the Auditor General as presented in South African Police Service, Annual report of the South African Police Service for 2008/2009, Pretoria: SAPS Strategic Management (Head Office), 2009, 149.
37. Auditor General South Africa, Report to the Portfolio Committee on Police, PFMA Outcomes 2010/11 (PowerPoint presentation), 11 October 2011, http://www.pmg.org.za/report/20111011-preparatory-workshop-%E2%80%93-south-african-police-services-annual-report-pa (accessed 14 October 2011).
38. Auditor General South Africa, Department of Police: Audit of Performance Information (AoPI), AoPI National Strategic Management Workshop, 18 February 2010, 7.
39. Presidency, Improving Government Performance, 16.
40. Section 5(3) is also understood to provide authority for this kind of audit. See also Section 28(2)(a).
41. Auditor General South Africa, Department of Police: Audit strategy for the year ending 31 March 2010, 22 January 2010 (PowerPoint presentation), 4.
42. Ibid.
43. Auditor General South Africa, Department of Police: Audit strategy for the year ending 31 March 2010, 6.
44. Auditor General South Africa, Report of the Auditor General to Parliament on a performance audit of service delivery at police stations and 10111 call centres at the South African Police Service, 2009, 4.
45. Auditor General South Africa, Report of the Auditor General on a performance audit of border control at the South African Police Service, 2008, 1.
46. Ibid.
47. Auditor General, Report of the Auditor General to Parliament on a performance audit of service delivery at police stations and 10111 call centres at the South African Police Service, 2.
48. PERSAP is the SAPS main personnel database.
49. Auditor General, Report of the Auditor General to Parliament on a performance audit of service delivery at police stations and 10111 call centres at the South African Police Service, 13.
50. Ibid, 18.
51. Ibid, 21.
52. Ibid, 9.
53. Ibid.
54. Bayley, Democratising the police abroad, 13-14.
55. The questions are based on Bruce and Neild, The police that we want, 31 onwards. Also see Bruce, Newham and Masuku, In service of the people's democracy, especially from page 77 onwards.
56. Home Office, Policing in the 21st Century: Reconnecting police and the people, 2010, 3, http://www.homeoffice.gov.uk/publications/consultations/policing-21st-century/policing-21st-full-pdf?view=Binary (accessed 28 September 2011).
57. Ibid, 19.
58. Ibid, 20.