Article Index

Report C - Analysis of the Annual Report 2010-2011 of the SPSO

(Report prepared by ACCOUNTABILITY SCOTLAND)

There is much of interest in this report, but what we comment on here relates to two issues of concern:

  1. The lack of accountability of the SPSO in regard to the justness of rulings.
  2. The lack of clarity and informativeness of the Report, especially with regard to statistics.

The lack of accountability of the SPSO

Referring to public service providers on page 15, the CSA Manager writes:

“The emphasis of its framework is firmly on timely, simple and streamlined complaints handling. This involves a two-stage internal process with local, early resolution by empowered and well trained frontline staff followed by a one-off investigation within consistent timescales. The removal of the ‘safety net’ of subsequent tiers of review or appeal will encourage complaints handlers to get it right first time.”

The final sentence is curious and illogical: an absence of checks and quality control usually leads to laxity. Complaints handlers would be more likely to get it right first time if they knew that a review, and possibly an appeal, would follow.

Is this statement to be taken as justification for the SPSO’s lack of accountability in regard to the justness of its rulings?

Pages 55-56 concern the Independent Service Delivery Review.

The Independent Service Delivery Reviewer’s role “is purely to look at complaints about service delivery within the SPSO”. It is thus concerned with procedures, having no power to review the correctness or fairness of decisions. It is justice that matters to complainants and to society.

Only nine cases were referred to the Independent Service Delivery Reviewer in 2010-11. The criteria for referral are not stated in the Annual Report, so no quantitative conclusions can be drawn.

An indication of criteria for referral is to be found in the ‘Invitation to Tender For Service Delivery Reviewer Consultancy Services’ which states:

“The purpose of this work will be to investigate complaints about the way in which the SPSO carries out its work in cases where the SPSO itself has been unable to resolve the matter under its internal complaints procedure. The Service Delivery Reviewer will look at the manner in which the complaint has been handled.” 

The Independent Service Delivery Reviewer lists a number of issues investigated, but does not say which were upheld or why. The issues included such items as refusal to explain decisions clearly, refusal to answer a letter and excessive delays.

Puzzlingly, the Independent Service Delivery Reviewer draws a number of reassuring and optimistic general conclusions about the work of the SPSO that cannot be justified by examination of only nine cases, and such conclusions must be beyond her remit. No mention is made of other evidence.

One particular unexplained ‘conclusion’ gives emphasis to the restricted role of the Independent Service Delivery Reviewer:

“Inevitably, some complainants will try to use the Independent Review process to reopen their case and have ‘another bite of the cherry’. These attempts must be resisted.”

It is unclear whether this ‘conclusion’ is a general condemnation of case re-examination, but this statement does accord with the SPSO’s lack of accountability in regard to the justness of its rulings.

As detailed on the SPSO website, a complainant can seek a review, but only on very limited grounds (that a decision was based on inaccurate facts, or that there is new and relevant information).

However, Jim Martin has given a seriously misleading description of the situation. In a letter to Fergus Cochrane, Clerk to the Public Petitions Committee (1 October 2010), he wrote:

“Individuals who are unhappy with the outcome of their complaint have the opportunity to express their dissatisfaction through our challenge to a decision process. The final external step of that process is judicial review.”

This is misleading because judicial review examines only the lawfulness of the process followed, not the merits of a decision, so it offers no remedy to a complainant who believes a ruling was unjust.

The lack of clarity and informativeness of the Report

The general impression given by the Report is that it is largely a public relations document. This is strikingly demonstrated by the recorded comments of complainants. The Report highlights six glowing tributes to the SPSO written by complainants, each occupying a full page. Pleasing as they are individually, they are not representative of the comments elicited in past complainant surveys: according to the Craigforth report for 2009-10, 70-77% of such comments expressed dissatisfaction with the SPSO. The Annual Report makes no mention of any later customer survey, and an FOI enquiry confirmed that none has been carried out, though it did not reveal why the practice was abandoned.

The quotation from page 15 given above includes the sentence:

“This involves a two-stage internal process with local, early resolution by empowered and well trained frontline staff followed by a one-off investigation within consistent timescales.”

This statement is incomprehensible for two reasons. Firstly, it is illogical to imply that resolution can precede investigation. Secondly, the phrase “within consistent timescales” conveys nothing here.

Especially lacking in clarity is the statistical table on pages 60-61: “Cases determined in 2010-2011 by sector, stage and outcome”.

(This is best viewed in the printed version as it is formatted in a way that is hard to analyse on a computer screen.)

This list gives the stages of complaint handling as:

  • Enquiries – advice and signposting
  • Complaints – advice
  • Complaints – early resolution 1
  • Complaints – early resolution 2
  • Complaints – investigation 1
  • Complaints – investigation 2

Note first that the term ‘Early resolution’ is confusing, in that most cases in that category could not be resolved.

The distinction between ‘Early resolution 1’ and ‘Early resolution 2’ is not explained in the Annual Report or on the SPSO website, nor is the distinction between ‘Investigation 1’ and ‘Investigation 2’. In answer to an FOI request, the SPSO said that it does not hold a record of the definitions of the latter two terms. This greatly reduces the value of the table.

As is illustrated below, it is possible to work out what these terms mean, but only by comparing numbers in a way that should not be expected of readers.

According to page 10, there were 673 complaints investigated in depth: 612 with decision letters and 59 through 58 public reports.

The table (pages 60-61) shows that the number of cases classified as ‘Investigation 2’ was 61, of which two are shown as ‘no decision reached’. The difference (61 – 2) is 59. This suggests that the category ‘Investigation 2’ corresponds to the issuing of public reports.
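For readers who wish to verify this reconciliation, the arithmetic can be written out as a short script (a minimal sketch; the numbers are simply those quoted from pages 10 and 60-61 of the Annual Report):

```python
# Figures quoted from the Annual Report.
investigation_2_cases = 61     # tabulated as 'Investigation 2' (pages 60-61)
no_decision_reached = 2        # of those, shown as 'no decision reached'
public_report_complaints = 59  # complaints dealt with through public reports (page 10)

# If 'Investigation 2' corresponds to public reports, the cases that
# did reach a decision should match the public-report figure.
decided = investigation_2_cases - no_decision_reached
print(decided)  # 59
assert decided == public_report_complaints
```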

Other statistics are given as pie charts on pages 10 and 11. Information on page 11 may be combined with the tabulated data (pages 60-61) to draw the following conclusions.

1) According to page 11, 61 cases reached the stage of investigation. That is the number entered under ‘Investigation 2’ in the table.

This implies that those tabulated as ‘Investigation 1’ did not reach the stage of investigation.

Such inconsistency of language is confusing.

2) Detailed comparison of figures in the upper pie chart on page 11 for ‘Decision letter outcomes 2010-11’ with the table on pages 60-61 shows that ‘Early resolution 2’ and ‘Investigation 1’ must account for the decision letters.

According to page 11, 59 ‘decision letters’ did not include a decision (!). The exact breakdown of this number is not clear from the table, but it must be made up of the 53 complaints tabulated as ‘No Decision Reached’ (= 31 + 22) and the 6 cases tabulated as ‘outcome not achievable’, since 53 + 6 = 59.
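The same kind of check confirms that these two tabulated categories exactly account for the 59 letters (again a minimal sketch using only the figures quoted above):

```python
# Figures quoted from page 11 and the table on pages 60-61.
letters_without_decision = 59  # 'decision letters' with no decision (page 11)
no_decision_reached = 31 + 22  # 'No Decision Reached' rows in the table
outcome_not_achievable = 6     # 'outcome not achievable'

total = no_decision_reached + outcome_not_achievable
print(total)  # 59
assert total == letters_without_decision
```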

Errors in the Annual Report:

According to page 9, there were 3489 complaints in the year, but both the table and page 11 give 3351.

In the upper pie chart on page 11, 76 cases are described as “some upheld”. This should presumably read “partially upheld”.

On page 11 the lower pie chart (for Investigation report outcomes) is incorrectly headed ‘Decision letter outcomes 2010-11’. The chart actually corresponds to ‘Investigation 2’ outcomes.

Yet, addressing the Local Government and Regeneration Committee on 16 November 2011, Jim Martin twice said of the Annual Report that “our statistics stand up to scrutiny”.