Article Index

Report A - Two Reports Produced by Craigforth for the Scottish Public Services Ombudsman

These reports, both based on questionnaires, are:

  1. The SPSO ‘Complainant Survey Results 2009 & 2010’. (Access to this on the SPSO website was denied for a time.)
  2. The SPSO Service and Awareness Survey of Bodies Under Jurisdiction 2008/2009 (August 2009).

This report was prepared by members of Accountability Scotland, including scientists who routinely analyse data and statistics. It is not a complete summary of the Craigforth reports.

BACKGROUND

The Craigforth reports are based on postal surveys using self-completion questionnaires devised by the SPSO. These were mainly concerned with ratings of satisfaction or dissatisfaction with the SPSO and its performance, although the complainant surveys also gathered information on characteristics of the respondents such as age and ethnicity.

It was not in Craigforth’s remit as specified by the SPSO to present other data, however important and relevant, that was not obtainable from the questionnaires – such as the very small proportion of complaints investigated and the large proportion dismissed at the SPSO’s discretion. (See the statistics from the SPSO website (www.spso.org.uk/statistics) that are analysed in our Report B.)

The Craigforth reports do not interpret the data or suggest improvements in the questionnaires.

SOME KEY FINDINGS

These are stated briefly here and elaborated on subsequent pages; the relevant sections are indicated by capital letters.

  1. Addressed here in alphabetised sections are various parts of the two reports that are unclear (sections C, D, H5, H7), unreliable (section H3), inadequate (section H6), or even clearly wrong on the basis of evidence within the reports themselves (sections B, F).
  2. In both 2009 and 2010, analysis shows that the SPSO was perceived by complainants as particularly unsatisfactory in regard to Local Authority issues as compared with Health issues (G).
  3. Complainant dissatisfaction increased from 2009 to 2010. This is illustrated by a graph in the Complainant Survey and is therefore not discussed further here.

CONCLUSIONS

  1. In some respects the Craigforth reports produce an unclear and unreliable interpretation of SPSO performance.
  2. Nevertheless, they do show that the performance of the SPSO is highly unsatisfactory.
  3. The body responsible for analysing the performance of the SPSO should in future be appointed by the SPCB and not by, or in collaboration with, the SPSO.
  4. The questionnaires should be re-designed to elicit fuller and less ambiguous information. Essential information not elicited was whether or not complaints were upheld.
  5. Questionnaires should be analysed for every quarter of the year, not just for a single quarter.
  6. Complainants should be encouraged more strongly to participate.

The SPSO ‘Complainant Survey Results 2009 & 2010’

 A. Craigforth recognised the relatively small scale of the survey, which is based on a total of 229 returns, and noted that:

‘the results should be considered as indicative rather than statistically robust’.

The Ombudsman himself chose the data to be analysed (the first quarter only of each of the two financial years, whereas all quarters were included in 2008) and thus the sample sizes. This could have biased the results, as could the fact that people with strong opinions are more likely to complete questionnaires. This is inappropriate methodology for a survey of people who may have been worn down by unsuccessfully pursuing a complaint through the SPSO procedure. That many respondents did not answer questions, or ticked the “neither satisfied nor dissatisfied” boxes, raises questions about how conscientiously the survey forms were filled in.

Nevertheless, clear evidence emerges for poor performance by the SPSO.

B. Key findings. Overall, in 2010, 33% of respondents indicated that they were very satisfied with the service they received (up from 23% in 2009 and 22% in 2008), and 17% indicated that they were satisfied. However, 27% indicated that they were very dissatisfied (compared with 24% in 2009 and 23% in 2008) and 13% indicated that they were just dissatisfied (making a total of 40% dissatisfied or very dissatisfied). These percentages are based on a table on page 12.

That 33% of people indicated that they were very satisfied cannot mean that 33% were very satisfied in all respects, because only 21% indicated that they were very satisfied with the clarity of the SPSO website (p 5), only 24% indicated that they were very satisfied with the thoroughness of the SPSO’s examination (p 8), only 27% indicated that they were very satisfied with the clarity of the SPSO’s explanation of the outcome (p 8), etc. Similarly, more than 27% (i.e. 30%) indicated that they were very dissatisfied with the thoroughness of the SPSO’s examination (p 8).

The finding of the survey that 33% of people rated themselves as very satisfied with the service they received should therefore be interpreted cautiously and should have been qualified by the writer.

C. There is a table on page 22 that is reproduced below. It has the brief heading “Aspects Respondents Particularly Satisfied/Dissatisfied With” instead of the fuller explanation given for the similar table in 2008. From the latter one can work out that ‘More face to face contact or site visits’ represents dissatisfaction and that the “Base” numbers are for responses, not people. In 2008 the number of respondents was also given. Although the figures are damning as they stand, it would be more revealing to be given specific reasons for dissatisfaction. Indeed, detailed knowledge of how even a few particular cases were handled would be more telling than abstract numbers – but that was not in Craigforth’s remit. The data for 2010 indicate that 21% of the cases in question were not investigated adequately and effectively. At least 70% of the responses indicated dissatisfaction with the SPSO.

Aspects that Respondents were Particularly Satisfied/Dissatisfied With

| Item | 2008 (n) | 2008 (%) | 2009 Q1 (n) | 2009 Q1 (%) | 2010 Q1 (n) | 2010 Q1 (%) |
|---|---|---|---|---|---|---|
| SPSO biased/ineffective | 106 | 27% | 26 | 32% | 14 | 25% |
| Complaint not investigated thoroughly | 70 | 18% | 7 | 9% | 12 | 21% |
| Poor communication / overly complicated process | 45 | 11% | 6 | 7% | 4 | 7% |
| Too slow a process | 41 | 10% | 5 | 6% | 6 | 11% |
| SPSO effective / good communication | 39 | 10% | 7 | 9% | 4 | 7% |
| SPSO staff professional and helpful | 26 | 7% | 15 | 18% | 9 | 16% |
| SPSO advised poorly / lack of expertise | 11 | 3% | 2 | 2% | 2 | 4% |
| Non SPSO issue | 20 | 5% | 7 | 9% | 0 | 0% |
| More face to face contact or site visits | 8 | 2% | 0 | 0% | 2 | 4% |
| Lack of alternatives offered | 5 | 1% | 0 | 0% | 0 | 0% |
| Other | 24 | 6% | 7 | 9% | 4 | 7% |
| Total | 395 | | 82 | | 57 | |
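
The percentages in the table can be reproduced from the counts and the “Base” totals of responses (not respondents). A minimal sketch of that calculation, using the 2010 Q1 column as an example:

```python
# Each percentage in the table is a count divided by the column's "Base"
# (total responses for that period), rounded to the nearest whole percent.

counts_2010 = {
    "SPSO biased/ineffective": 14,
    "Complaint not investigated thoroughly": 12,
    "Poor communication / overly complicated process": 4,
    "Too slow a process": 6,
}
base_2010 = 57  # total responses in 2010 Q1

for item, n in counts_2010.items():
    print(f"{item}: {round(100 * n / base_2010)}%")
# 14/57 rounds to 25%, 12/57 to 21%, matching the table.
```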

D. The document refers on several pages to four “steps” in the SPSO processes, but these are not defined. They are not the same as the ‘stages’ tabulated on the SPSO website. This means that a number of tables are hard to interpret.

E. The questionnaires include no questions on the respondents’ perceptions of the factual accuracy of the SPSO reports. This is a crucial omission.

F. The tables on pages 34 and 38 are wrong. In the ‘Your complaint’ table on page 34, the numbers in the ‘Very dissatisfied’ column must be wrong, as they are the same as the ‘base numbers (n)’, meaning the totals. Moreover, the numbers ‘satisfied’ exceed these total numbers. Likewise the top table on page 38.

These errors are less important than the fact that the SPSO paid too little attention to the report to notice them.

G. The report includes numerous tables recording numbers of complainants that were very satisfied, satisfied, neutral, dissatisfied and very dissatisfied in regard to such things as clarity of explanation, and the courteousness, impartiality and helpfulness of SPSO staff. The data were broken down according to the Bodies under Jurisdiction involved (‘subject groups’). In both 2009 and 2010 the SPSO proved particularly unsatisfactory with Local Authority issues and much better with Health issues. These are the two largest subject groups. The difference is evident from simple inspection of the tables on pages 23-38. However, we have calculated, separately for ‘Health’ and for ‘Local Authority’, the percentages of respondents that were either dissatisfied or very dissatisfied. The results are compared in the following graph, which shows ‘Health’ percentages plotted against ‘Local Authority’ percentages. What it shows, in words, is that, in almost all respects, the SPSO was rated as even more unsatisfactory in regard to Local Authority issues than in regard to Health issues.

The various points in the graph represent different aspects of dissatisfaction or satisfaction that are tabulated separately in the report.

Key: squares 2009; crosses 2010

As an example, the cross marked with an arrow indicates that, for a particular topic (the level of information received with updates), dissatisfaction was 23% for Local Authorities and 33% for Health.

The diagonal line corresponds to equality. That most of the points lie below this line means that, for 2010, the SPSO was judged particularly unsatisfactory in regard to Local Authority issues and much less so in regard to Health issues. This could relate to the common perception that the SPSO is not impartial towards local authorities.
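
The percentage calculation behind each plotted point can be sketched as follows. The counts here are hypothetical, for illustration only; the real figures are in the tables on pages 23-38 of the report.

```python
# For each aspect, combine the "dissatisfied" and "very dissatisfied" counts
# into a single percentage, separately for Health and Local Authority.
# The counts below are hypothetical, chosen only to illustrate the method.

def pct_dissatisfied(counts):
    """Share of respondents who were dissatisfied or very dissatisfied."""
    negative = counts["dissatisfied"] + counts["very dissatisfied"]
    return 100 * negative / sum(counts.values())

health = {"very satisfied": 10, "satisfied": 8, "neutral": 6,
          "dissatisfied": 7, "very dissatisfied": 9}            # 16/40 negative
local_authority = {"very satisfied": 4, "satisfied": 5, "neutral": 5,
                   "dissatisfied": 10, "very dissatisfied": 16}  # 26/40 negative

h, la = pct_dissatisfied(health), pct_dissatisfied(local_authority)
print(h, la)  # 40.0 65.0
# A point with h < la lies below the diagonal of equality: the SPSO was
# rated worse on the Local Authority aspect than on the Health one.
```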

As explained in the SPSO’s Annual Report 2010-2011 (page 27), there are clear reasons why a higher proportion of health cases reach the investigation stage. However, this does not explain why so many complainants with Local Authority issues are not persuaded that their cases have been handled satisfactorily.

SPSO Service and Awareness Survey of Bodies Under Jurisdiction 2008/2009, report by Craigforth, August 2009

In March 2009 the Scottish Public Services Ombudsman commissioned Craigforth to undertake a service and awareness survey of Bodies under Jurisdiction (BUJs).

(The BUJs then were Health, Housing Associations, Local Authority, Further and Higher Education, Scottish Government.) 

Herewith the link to The SPSO Service and Awareness Survey of Bodies Under Jurisdiction 2008/09 report. It refers to accompanying appendices that include a copy of the survey, but these are not available at this website.

 H. Questionnaires were sent to all BUJs that had received a decision from the SPSO in the previous six months. The returned questionnaires paint a rosy picture of BUJ satisfaction.

However:

  1. Only 54% of questionnaires were returned. (The value of the survey is compromised if responses came mainly from those BUJs that took complaint handling seriously.)
  2. The questionnaires were written by the SPSO itself (and, as noted above, are not available).
  3. Only one questionnaire was filled in per BUJ, but some of these had received more than 20 decisions from the SPSO. Many people in the BUJ would then have been involved, so that no one person could have given reliable general responses.
  4. The questionnaire dealt with distinctly different aspects of satisfaction with the SPSO:
    1. matters to do with the specific complaints
    2. SPSO publications, seminars etc.
  5. Item 5.3 on page 11 categorizes in a table the different areas of particular satisfaction or dissatisfaction. However, one cannot tell from the table whether particular responses indicated satisfaction or dissatisfaction (e.g. ‘Other’, ‘impartiality of SPSO’).

Here is the table: Areas of Particular Satisfaction or Dissatisfaction

| Comment | Number |
|---|---|
| Helpful staff / good quality service | 22 |
| Need improved contact / updates | 8 |
| Too long a process | 7 |
| Value informal discussions | 5 |
| Excellent guidance | 2 |
| Impartiality of SPSO | 1 |
| Other | 5 |
| Total | 40 |

  6. There is no indication anywhere as to whether rulings were or were not in favour of the BUJs – or how many. That is relevant, for example, to Item 5.3.
  7. The ‘Decisions’ table on page 7 includes the statement “Decisions made by SPSO are consistent”, with which respondents are asked to indicate their degree of agreement. We are left wondering what the decisions might be consistent with and how the respondents interpreted the statement.
  8. BUJs could sometimes be more reluctant than complainants to criticise the SPSO, since they might wish to remain on good terms with it.