Article Index

  Submission to the Scottish Parliamentary Corporate Body

from Accountability Scotland

concerning the Scottish Public Services Ombudsman

The primary role of the SPSO is to investigate and deliver administrative justice. No one has ever investigated whether the SPSO delivers justice to the complainants or to the employees of the organisation complained about. We have evidence from both groups that the SPSO has failed to investigate cases adequately and effectively and to produce quality outcomes.

Summary of main points

  1. The SPCB recommended reappointment of the SPSO without being informed of the full factual situation.
  2. Parliament, its committees and its staff are reliant on full and accurate facts from the bodies under their supervision; the SPSO demonstrably uses positive appraisal, emphasizing the positive and ignoring the negative.
  3. The SPSO removed the only (partly) independent check and balance in the system by not re-employing the Craigforth company to analyse and report on the work of the SPSO.
  4. In the Craigforth survey for 2010 at least 70% of the volunteered responses indicated dissatisfaction with the SPSO, with 21% of the cases in question regarded as not being investigated thoroughly.
  5. Like complainants, bodies under jurisdiction are also critical of the quality of the SPSO’s investigations and subsequent reports.
  6. The SPSO has four elements in the reporting of procedure, governance and supervision, but no quality control or supervision of either the adequacy or the effectiveness of its investigations in the delivery of administrative justice in its rulings.
  7. In contrast, Criminal Justice has many more stages of checks and balances in the delivery of justice – from trained police officers, to superior officer, to procurators fiscal, sheriffs, defence and possible appeal.
  8. Audit Scotland has not investigated the adequacy, effectiveness and justice of the SPSO’s rulings since the SPSO was set up ten years ago. Nobody has.

The content of the letter from The Secretary, Accountability Scotland, dated 30th January 2012

Accountability Scotland (formerly called Integrity for Scotland) was invited by previous members of the SPCB (Alex Johnstone MSP and Mike Pringle MSP), through Peter Stewart-Blacker, to submit the accompanying reports concerning the SPSO. Our points of contact have been Janice Crerar and Huw Williams, who have been extremely helpful. We understand the difficulties they have had in drawing conclusions from the Craigforth statistics as they have no formal training in this subject and are wholly reliant on Craigforth and the SPSO’s interpretations.

In essence, we say that:

  • The SPSO, while clearly following efficient procedures, fails adequately and effectively to investigate and report on a large proportion of cases. This aspect of the SPSO’s work has never been investigated since the SPSO was set up ten years ago.
  • As a result, Parliament cannot be well informed on the justice of the SPSO’s rulings.
  • Furthermore, Jim Martin himself lacks the necessary feedback to judge this aspect of his organization’s performance and so report on it reliably.
  • The SPSO does lay investigation reports before Parliament, but no part of Parliament or the Scottish Executive formally scrutinises these reports in a way that could deliver accurate, quality information on the SPSO’s general performance. An independent investigation into the quality of the reports therefore needs to be commissioned.

It is important to note that the evidence we report is from the SPSO’s own sources. It is presented in three reports, A, B and C, together with a cover document. These are summarized in the following table. Some other information, referenced by Roman numerals, is given in the Appendix.



Main Findings / Conclusions

Cover Document

Introduction to reports A-C, with Commentary

There is a clear need for adequate and effective external scrutiny of the SPSO in regard to the fairness and justice of its rulings.

Report A

Analysis of Craigforth surveys of complainants and bodies under jurisdiction

The survey reports (now discontinued) provided an inadequate analysis of SPSO performance, but did show that no more than 50% of complainants had recorded satisfaction with it.

40% recorded dissatisfaction.

Dissatisfaction was much greater with local authority issues than with health issues.

Report B

Analysis of Statistics from the SPSO website

The number of submissions accepted as ‘complaints’ each year doubled, yet the number investigated declined from 18% to 4% (and only 0.6% for housing associations).

Each year the percentage of complaints reaching the investigation stage for Health Authorities is twice that for Local Authorities.

Report C

Commentary on SPSO Annual Report 2010-2011

In important respects the Report lacks clarity and informativeness, because of obscure terms and errors.

It demonstrates, and argues for, the SPSO’s lack of accountability in respect of the justness of its rulings.

Our primary objective in submitting this evidence is to persuade the SPCB that they should look closely at the functioning of the SPSO – in regard, not so much to processes and procedures, but to the outcome that matters most, the delivery of transparent justice. In the words of Lord Hewart:

"Not only must Justice be done; it must also be seen to be done."

We believe that there is a clear and urgent need to establish an independent mechanism for monitoring the adequacy, effectiveness and justice of the rulings of the SPSO, as recommended by Dr. Bernard Kingston in 2006 (II).

It is as if the Heinz Company were to run a bean cannery that is well monitored for mechanical efficiency and cost-effectiveness, but without any checks on taste and quality.

The SPSO has given the impression that his critics are just serial or vexatious complainants, but we are aware that one body under his jurisdiction, a Health Board, has suffered similar problems, with catastrophic effects on a GP’s personal and professional life and with far-reaching financial implications for the Health Board (III*). The reports (A-C) accompanying this letter were researched by two scientists well-acquainted with statistics, neither of whom has been personally disadvantaged by rulings of the SPSO. One submitted two complaints to the previous Ombudsman, on behalf of the community and not of himself. The other submitted a complaint, again on behalf of the community, that was actually successful.

In researching evidence we are aware of the possibility that the SPSO’s words might be subject to some positive appraisal (IV). Jim Martin has stated (IV) that:

“Individuals who are unhappy with the outcome of their complaint have the opportunity to express their dissatisfaction through our challenge to a decision process.”

However, there are only very restricted grounds for challenge or appeal (IV, V, VI). Currently, there is external scrutiny only of procedural service complaints by the Independent Service Delivery Reviewer (and very few of these, chosen by the SPSO). She was appointed by the SPSO (for her customer service experience) and is therefore not fully independent. Moreover, she has a very limited remit that specifically excludes case reviews (Report C). Surveys previously carried out by the social research company Craigforth have been discontinued by the SPSO, but in any case they produced little more than statistics on the satisfaction and dissatisfaction of complainants and of bodies under the SPSO’s jurisdiction. Like the Independent Service Delivery Reviewer, Craigforth was chosen and appointed by the SPSO. Experience elsewhere shows that such appointees are loath to bite the hand that feeds them.

An obvious source of evidence on the adequacy and effectiveness of the SPSO’s performance, and on their ability to produce just decisions, lies in the experiences of individual complainants. However, such evidence tends to be complicated and voluminous and may, in the end, be regarded with suspicion as ‘anecdotal’ or unrepresentative. This is one reason for not presenting such evidence here. For the future, we suggest an approach that does not involve mastering all the intricacies of particular cases. This is to pick out some of the many clear-cut and demonstrable examples in SPSO correspondence and rulings of such things as high-handed dismissal of evidence, lack of impartiality, faulty logic and arbitrary decisions. Accountability Scotland would be pleased to prepare such evidence. For now we simply note the experiences of Murdo Fraser MSP (VII) and the damning comment of an English Ombudsman (VIII).

Another issue that needs to be addressed is the knowledge, skills and experience of the SPSO staff. That only about 50% of complainants are satisfied with the performance of the SPSO may be partly due to the selection and training of the staff. Case workers have been recruited to the SPSO because of their experience in ‘customer service’ and not for proven abilities as investigators. ‘Customer service’ involves activities designed to enhance the level of customer satisfaction, namely the feeling that a product or service has met customer expectations. Although that is useful, the investigators appear to lack the investigatory experience that comes from training and working in inspection, review, consultation and audit services. Furthermore, despite the customer service skills, past complainant surveys revealed high levels of customer dissatisfaction. The SPSO should at least have been able to persuade complainants that they had been properly treated. An SPSO case worker needs a skill set for discovering the truth and delivering justice.

Even Jim Martin was himself criticized by the SPSO for his procedures when Police Complaints Commissioner (IX). When the SPCB considered his re-appointment he was tested at interview on several criteria, but apparently not on his knowledge of how to carry out adequate and effective investigations (X).

We would be delighted if the Corporate Body would like us to appear before them to answer questions, to present the facts and discuss potential solutions within the limitations of the SPCB’s powers. We do also have a more forensic report on the detailed management of the SPSO.

Yours sincerely,

Dr. Richard Burton, Secretary

This is to supplement comments in the above letter from Accountability Scotland to the Scottish Parliamentary Corporate Body. Roman numerals in the letter refer to items I-XI below.

I - About ‘Accountability Scotland’

Accountability Scotland is a democratic organization formed by people from all walks of life who, due to their experience with the SPSO, have come together to campaign for transparent public accountability in Scottish governance. As of September 2011 we have a published constitution and an elected committee. Our members are highly qualified in various fields and include university research scientists with doctorates and many published papers and books, a medical doctor, a researcher and co-ordinator in international public health projects across Europe and the US, a civil engineer, an electronics engineer, a head teacher, a contract advisor (construction, engineering, building), a voice dialogue facilitator, and specialists in estate management, law and legal practice. The committee's expertise covers agriculture, business, biological science, occupational psychology and work-stress, campaigning, politics, care work, software development, statistics, journalism, negotiation, commerce, engineering, tourist board directorship, and experience in assisting complaints to local authorities and the SPSO and in drafting petitions to the Scottish Parliament.

II - Opinion of an SPCB’s Independent Assessor

Dr Bernard Kingston, the SPCB’s Independent Assessor, was involved in the re-appointment of Professor Alice Brown. He wrote:

“In addition, this re-appointing process has highlighted, once again, my concern regarding the absence of regular formal appraisals of Crown appointees nominated by The Parliament.

This runs counter to good practice which requires that public bodies must have in place regular and transparent performance assessment procedures to provide necessary and robust evidence when considering re-appointments.

I recognize the argument put that it is difficult to independently investigate the independent investigator but I do not accept that this is insurmountable. I strongly recommend, therefore, that The Parliament continues to give urgent consideration to this matter for these and similar appointments.

That said, I well recognise that this is new territory for us all and one that requires in-depth consideration, wide consultation and due diligence and sensitivity to get it right. A number of other good practice elements emerged during this re-appointments process and I will write an additional note on these for future consideration and action.”

Mr Murray Tosh was the independent Assessor for Jim Martin. He only looked at governance issues, not at the quality of case outcomes. 

III - Inadequate investigation of a Body under Jurisdiction

Case 201002536: Greater Glasgow and Clyde NHS Board. (Reported 20 August 2011)

The essence of this case in the present context is (1) that it is a body under SPSO jurisdiction that is aggrieved (and seriously affected), not a complainant, (2) that the evidence, as reported, is inconclusive and (3) that the general practitioner involved was not interviewed either by the SPSO or by the SPSO’s medical adviser appointed to the case. The SPSO has confirmed that the SPSO did not interview the GP and that it is not the SPSO’s routine practice to conduct interviews.

The latter contrasts with the police practice of always conducting interviews.

Details follow, but, for the sake of Mrs C, we would not wish publicity to be given to this case.

The complainant (Mrs C) raised concerns about the care and treatment provided by a GP from the out-of-hours service to her husband on 2 August 2010. She complained that the GP failed to diagnose Mr C with ischaemic heart disease and admit him to hospital. Mr C died of a heart attack several hours after the GP's visit.

The complaint, that the Board failed to provide reasonable care and treatment to Mr C on 2 August 2010, was upheld.

The SPSO’s complaints reviewer looked at clinical records and the complaint correspondence from the Board, obtained advice from a retired GP (the Adviser), who is unlikely to have experience of the out-of-hours service, and interviewed Mrs C by telephone. The complaints reviewer also listened to the recording of the telephone call Mrs C made to NHS 24.

Quoting the Report:

“The Board responded formally to Mrs C's complaint on 31 August 2010. The Board said that having looked at the evidence, they were in no doubt that the GP had attended Mr C in good time, took a full history, carried out an appropriate clinical examination and formulated a management plan based on his carefully considered clinical assessment. Mr C had presented with symptoms which only with the benefit of hindsight were indicative of a cardiac problem. The Board said that having discussed the matter with the GP, they were certain he took his time and used his skill and history taking to try to determine whether Mr C's symptoms were serious and life-threatening. The GP's clinical impression was that Mr C was well and did not have any serious underlying condition, which was reinforced by his examination. The Board said that had the GP suspected cardiac pain, he would have admitted Mr C immediately to hospital and had he done so, the outcome may have been different. Despite the GP's best efforts, he did not predict such a tragic outcome. The Board apologised that the GP was unable to make a diagnosis that would have led to Mr C's immediate admission to hospital.”

The Ombudsman wrote:

“I have decided that there were failures in the care and treatment provided to Mr C by the GP. In all the circumstances, I uphold the complaint.“

IV - Examples of positive appraisal by Jim Martin

Positive appraisal is evident in the Annual Report, as pointed out in our Report C. Notable are the whole-page glowing testimonials that fail to reflect the balance of opinion demonstrated in the Craigforth reports.

Here are further examples of positive spin, in a letter to Fergus Cochrane, Clerk to the Public Petitions Committee (1 October 2010). The relevant passage is followed by our numbered comments, of which the third is especially notable. (The underlining is ours.)

“What mechanisms do you have in place to examine public dissatisfaction at the SPSO in managing complaints raised by members of the public?

I would contest the implication in the question that there is widespread public dissatisfaction with the SPSO in managing complaints raised by members of the public. There is, of course, some dissatisfaction, but it is by a minority and is entirely in keeping with the nature of our business. It is true of all Ombudsman services that some complainants, in particular those who have not had their grievance upheld, will criticise the body for making a decision that was not in their favour.

The SPSO actively seeks the views of all the people who use its service through our satisfaction surveys. We openly publicise their feedback on our website, and use their views to inform improvements to our service.

Individuals who are unhappy with the outcome of their complaint have the opportunity to express their dissatisfaction through our challenge to a decision process. The final external step of that process is judicial review.“

  1. “-- some dissatisfaction, but it is by a minority”. This is literally true, but the situation can be expressed less positively. The Craigforth statistics (2010) indicate no more than 50% satisfaction. This must refer to some kind of general satisfaction, because only 41% of respondents were satisfied with the thoroughness of the SPSO’s examination of their complaints.
  2. “We openly publicise their feedback on our website”. This is true up to a point, but the most informative part of the feedback (specific comment) is withheld (see Report A).
  3. “Individuals who are unhappy with the outcome of their complaint have the opportunity to express their dissatisfaction through our challenge to a decision process.”

This misrepresents the true situation. The grounds for review are very limited (see item V below). Complainants do try to challenge the decision process on other grounds, but are then just told that that is impermissible. The SPSO publishes Jim Martin’s replies on the website, but with all potentially interesting detail redacted.

  4. “The final external step of that process is judicial review“. This bald statement conveys the impression that judicial reviews are a useful back-up, but so far there has only been one judicial review (see item VI below).

V - Limitations set on SPSO’s reviewing of disputed decisions

The circumstances under which a complainant can request a review of a disputed decision are clearly stated on the SPSO website:

Asking for a review

The grounds on which you can ask for a review are limited: you can only ask for a review if you consider that:

  • we made our decision based on important evidence that contained facts that were not accurate, and you can show this using readily available information.
  • you have new and relevant information that was not previously available and which affects the decision we made.  (When sending ‘new and relevant information’ to us, please tell us if the body you complained about has been given the opportunity to consider the information and if possible, please include the organisation’s updated response to that).

We will not accept a request for a review just because you disagree with the outcome of your complaint. 

VI.  Judicial reviews

These are too costly and difficult for most dissatisfied complainants.

In 2007 the SPSO faced its only completed judicial review. Argyll and Bute Council were contesting a decision of the Ombudsman. The judicial review ruled in favour of the local authority and against the SPSO.

Recently a second judicial review has been initiated. See “Draft judicial review of the Scottish Public Services Ombudsman’s decision in Scottish Enterprise fraud case” (24 June 2011).

VII. Evidence to the review of SPCB supported bodies committee from Murdo Fraser MSP (RSSB27, 16 January 2009)

Here are some excerpts from this document:

“In my role as an MSP, I have worked with a number of constituents on their cases that have been considered or investigated by the SPSO office. In the cases that I have been involved in, there seems to be several similar criticisms that my constituents have noted regarding their case. These can be summarised as follows:

    1. the length of time taken to decide whether or not to take the complaint to an investigation;
    2. the length of time taken to undertake the investigation;
    3. the quality of the investigation;
    4. the quality of the final report;
    5. the lack of dialogue and opportunity to change the draft report once it has been completed; and
    6. the way that the complaint was generally handled by the SPSO office.

“These underlying problems of the SPSO office in relation to a complainant’s case must be resolved in order to have an effective Ombudsman.

"Another issue is lack of accountability. I believe that there should be a more systematic way of ensuring that the Ombudsman is held to account more regularly by the Parliament. I understand that the SPSO has to be independent of Parliament. However, apart from judicial review, there is no way for a complainant to call into account the work of the SPSO. Stronger accountability is required so that MSPs can question any reports that constituents have concerns with and can also question the conduct of the office.

“There is a real concern that much of the investigation work by the SPSO is based on information handed to them by the complainant and the body that has a complaint lodged against them. This information is useful for a primary stage, but there seems to be a lack of going out in the field and interviewing organisations in order to develop and question the information given to them. Investigations can not just be a ‘paper-trail’.

“In one example of a complaint that I helped work on, after the draft report was submitted by the SPSO, it was clear that there were some basic inaccuracies as well as some important and quite central information not included. There were basic errors and the complainant was never given an explanation for the conclusions. Furthermore, there were some statements in the report that appeared to conflict with other statements in the same report. It is quite right that a complainant will not accept or be happy with a report if the report published gives the instant impression that the issue has not been fully investigated or understood.”

VIII. Comment from an English Ombudsman on an SPSO case

Here is a comment on an outstanding test file sent by Jim Martin to an English counterpart, Jerry White, a Local Government Ombudsman (29 September 2009): The report was laid before Parliament.

“I conclude that the SPSO’s handling of the complaint was characterised by very considerable delay and confusion. Bluntly, it is the worst case of complaint handling by an Ombudsman’s office that I have seen.”

IX. Criticism of Jim Martin as Police Complaints Commissioner for Scotland

In 2008 the SPSO investigated a complaint against Jim Martin as Police Complaints Commissioner for Scotland (Case 200702044). The complaint was not upheld, but the Ombudsman wrote:

“However I am concerned that given the sequence of events described above, at the time Mr C made his complaint to the PCCS, he could reasonably have expected to have been given a draft report upon which to comment. This did not happen and Mr C did not have the opportunity to bring his concerns to the PCCS' attention before publication. I criticise the fact that he was not made aware of the change in the PCCS’s policy in this respect and also of the fact that the related advice leaflet was only updated following this Office’s enquiries in 2008.”

X - Inadequate criteria for the re-appointment of Jim Martin

Here is an excerpt from the paper prepared by the SPCB to inform debate on consideration of the SPSO, Jim Martin for reappointment.

“21. In keeping with the criteria used for the independent evaluation mentioned above, the panel agreed that the Ombudsman should be tested at interview on the following criteria and scored against each of them:

    • Managing the day to day running of the office;
    • Fulfilling the functions of the post;
    • Leading any change programme impacting on the Ombudsman’s office in a positive and supportive way;
    • Communication;
    • Leadership and motivation skills;
    • Forward planning; and
    • Achievements.”

There was apparently no consideration of the adequacy and justice of the Ombudsman’s rulings.

XI - Comparison with other Public Sector Bodies in Scotland

Despite the absence of comparable bodies in Scotland, many aspects of the SPSO’s work are replicated in other organisations. For example, Audit Scotland undertakes investigations, but unlike the SPSO uses the Quality Framework to monitor the quality of its decisions.

This framework is used extensively across the public sector, yet the SPSO, who claim to share the learning from their work in order to improve the delivery of public services in Scotland, do not use it and appear unwilling or unable to learn from others.

Note that the framework includes post project appraisal for quality, whereas the SPSO’s Annual Report argues against having a ‘safety net’ of review and appeal (see Report C).

Evidence on the Performance of the Scottish Public Services Ombudsman

The output of the SPSO is not subject to adequate and effective external scrutiny in regard to the fairness and justice of its rulings, though it does give much attention to its procedures in terms of their speed, politeness and cost.

The SPSO’s external complainant survey, which produced evidence of serious and widespread dissatisfaction, has recently been discontinued.

Accompanying these cover pages are three reports prepared by ‘Accountability Scotland’. Based on the SPSO’s own publications and official statistics, they demonstrate a need for more adequate and effective external scrutiny of the SPSO to ensure transparent public accountability.

Report A analyses two reports produced by Craigforth for the SPSO:

  1. Scottish Public Services Ombudsman: Complainant Survey Results 2009 & 2010.
  2. SPSO Service and Awareness Survey of Bodies Under Jurisdiction, 2008/09.

Report B analyses statistics from the SPSO website.

Report C is a commentary on the SPSO’s Annual Report 2010-2011. It concerns (1) the lack of accountability of the SPSO and (2) the lack of clarity and informativeness that renders parts of the Annual Report incomprehensible.

The material analysed in reports A and B was available to inform MSPs at the time of Jim Martin’s reappointment as SPSO and the Local Government and Communities committee when they closed the eight public petitions against the SPSO without public discussion (though not without dissent).

However, MSPs do not have the time to analyse detailed statistics, to look at questionnaires and raw data and to read voluminous documents. They should be able to assume that any commissioned report is a clear, objective and accurate statement of facts.

We note in passing that the Ombudsman, like the SPCB, cannot respond adequately to flawed evidence.

The existence of the petitions reflects a widespread and serious concern over the performance of the SPSO that is reflected in the Craigforth statistics and already known to a few MSPs. These statistics reveal considerable dissatisfaction with the SPSO. However, a full, objective and fair analysis of this malaise based on individual case histories, yet to be accomplished, must inevitably be long and detailed. All the more important, therefore, is a dispassionate analysis of official statistics.

Because the Craigforth reports are accepted as authoritative documents, it is essential that they be produced to the standards of scientific papers – based on the fullest available data, checked for errors and lack of clarity.

The Craigforth reports do not match acceptable scientific standards.

The questionnaires sent to complainants by Craigforth elicited many detailed comments relating to dissatisfaction and satisfaction with the SPSO. These responses could have been especially revealing, but it was not in Craigforth’s remit to do more than classify and count these detailed comments; they are reduced just to numbers in the Craigforth report, in the table on page 22 (reproduced in Report A). The information is more obscurely presented than in previous years, but it is evident that 70-77% of the responses in 2010 represented dissatisfaction. (The imprecision in the percentage reflects ambiguity in the table.)

It is regrettable that the written comments were not better utilised. The SPSO quotes four such comments on their website (and the later Annual Report quotes six), but these are all very appreciative, and therefore unrepresentative. They are clearly a public relations gimmick.

SPSO staff are encouraged to form good relationships with bodies under their jurisdiction, such as health boards and local councils. This can be likened to encouraging the police to form friendships with criminals to the detriment of their victims. In other areas this has been shown to bias responses and could be a reason for the very low complainant satisfaction with the impartiality of the SPSO.

Not all complainants can obtain their desired outcome (because of inadequate evidence, limitations in the powers of the SPSO etc). They should, however, end up satisfied that the SPSO has done its best for them. The Craigforth data show this not generally to have been the case (see Report A).

Public access to Craigforth’s ‘December 2010 service user survey results’ appears to have been denied for a period, but has now been reinstated on the SPSO website. A later survey has not been commissioned.

Report A - Two Reports Produced by Craigforth for the Scottish Public Services Ombudsman

These reports, based on questionnaires, are:

  1. The SPSO ‘Complainant Survey Results 2009 & 2010’. (Access to this on the SPSO website was denied for a time.)
  2. The SPSO Service and Awareness Survey of Bodies Under Jurisdiction, 2008/2009 (August 2009).

This report was prepared by members of Accountability Scotland, including scientists who routinely analyse data and statistics. It is not a complete summary of the Craigforth reports.


The Craigforth reports are based on postal surveys using self-completion questionnaires devised by the SPSO. These were mainly concerned with ratings of satisfaction or dissatisfaction with the SPSO and its performance, although the complainant surveys also gathered information on characteristics of the respondents such as age and ethnicity.

It was not in Craigforth’s remit as specified by the SPSO to present other data, however important and relevant, that was not obtainable from the questionnaires – such as the very small proportion of complaints investigated and the large proportion dismissed at the SPSO’s discretion. (See the statistics from the SPSO website that are analysed in our Report B.)

The Craigforth reports do not interpret the data or suggest improvements in the questionnaires.


These are stated briefly here, but are elaborated on subsequent pages, the links being indicated by capital letters.

  1. Addressed here in alphabetised sections are various parts of the two reports that are unclear (sections C, D, H5, H7), unreliable (section H3), inadequate (section H6), or even clearly wrong on the basis of evidence within the reports themselves (sections B, F).
  2. In both 2009 and 2010, analysis shows that the SPSO was perceived by complainants as particularly unsatisfactory in regard to Local Authority issues as compared with Health issues (G).
  3. Complainant dissatisfaction increased from 2009 to 2010. This is illustrated by a graph in the Complainant Survey and is therefore not discussed further here.

Conclusions and recommendations


  1. In some respects the Craigforth reports produce an unclear and unreliable interpretation of SPSO performance.
  2. Nevertheless, they do show that the performance of the SPSO is highly unsatisfactory.
  3. The body responsible for analysing the performance of the SPSO should in future be appointed by the SPCB and not by, or in collaboration with, the SPSO.
  4. The questionnaires should be re-designed to elicit fuller and less ambiguous information. Essential information not elicited was whether or not complaints were upheld.
  5. Questionnaires should be analysed for every quarter of the year, not just for a single quarter.
  6. Complainants should be encouraged more strongly to participate.

The SPSO ‘Complainant Survey Results 2009 & 2010’

 A. Craigforth recognised the relatively small scale of the survey, which is based on a total of 229 returns, and noted that:

‘the results should be considered as indicative rather than statistically robust’.

The Ombudsman himself chose the data to be analysed (the first quarter only of each of the two financial years, whereas all quarters were included in 2008) and thus the sample sizes. This could have biased the results, as could the fact that people with strong opinions are more likely to complete the questionnaires. This is inappropriate methodology for a survey of people who may have been “worn down” through unsuccessfully pursuing a complaint through the SPSO procedure. That many respondents did not answer questions, or ticked the “neither satisfied or unsatisfied” boxes, raises questions about how conscientiously the survey forms were filled in.

Nevertheless, clear evidence emerges for poor performance by the SPSO.

B. Key findings. Overall, in 2010, 33% of respondents indicated that they were very satisfied with the service they received (up from 23% in 2009 and 22% in 2008), and 17% indicated that they were satisfied. However, 27% indicated that they were very dissatisfied (compared with 24% in 2009 and 23% in 2008) and 13% indicated that they were just dissatisfied (making a total of 40% dissatisfied or very dissatisfied). These percentages are based on a table on page 12.

That 33% of people indicated that they were very satisfied cannot mean that 33% were very satisfied in all respects, because only 21% indicated that they were very satisfied with the clarity of the SPSO website (p 5), only 24% indicated that they were very satisfied with the thoroughness of the SPSO’s examination (p 8), only 27% indicated that they were very satisfied with the clarity of the SPSO’s explanation of the outcome (p 8), etc. Similarly, more than 27% (i.e. 30%) indicated that they were very dissatisfied with the thoroughness of the SPSO’s examination (p 8).

The finding of the survey that 33% of people rated themselves as very satisfied with the service they received should therefore be interpreted cautiously and should have been qualified by the writer.

C. There is a table on page 22 that is reproduced below. It has the brief heading “Aspects Respondents Particularly Satisfied/Dissatisfied With” instead of the fuller explanation given for the similar table in 2008. From the latter one can work out that ‘More face to face contact or site visit’ represents dissatisfaction and that the “Base” numbers are for responses, not people. In 2008 the number of respondents was also given. Although the figures are damning as they stand, it would be more revealing to be given specific reasons for dissatisfaction. Indeed, detailed knowledge of how even a few particular cases were handled would be more telling than abstract numbers – but that was not in Craigforth’s remit. The data for 2010 indicate that 21% of the cases in question were not investigated adequately and effectively. At least 70% of the responses indicated dissatisfaction with the SPSO.

Aspects that Respondents were Particularly Satisfied/Dissatisfied With

Item                                          2008        2009 Q1     2010 Q1
SPSO biased/ineffective                       106 (27%)   26 (32%)    14 (25%)
Complaint not investigated thoroughly          70 (18%)    7 (9%)     12 (21%)
Poor communication / overly complicated        45 (11%)    6 (7%)      4 (7%)
Too slow a process                             41 (10%)    – (6%)      6 (11%)
SPSO effective / good communication            39 (10%)    7 (9%)      – (7%)
SPSO staff professional and helpful            26 (7%)    15 (18%)     9 (16%)
SPSO advised poorly / lack of expertise        11 (3%)     2 (2%)      2 (4%)
Non SPSO issue                                 20 (5%)     7 (9%)      0 (0%)
More face to face contact or site visits        8 (2%)     0 (0%)      2 (4%)
Lack of alternatives offered                    5 (1%)     0 (0%)      0 (0%)
Other                                          24 (6%)     7 (9%)      4 (7%)
Total                                         395         82          57

(Counts marked ‘–’ are absent from the source table; the percentages are as given there.)

D. The document refers on several pages to four “steps” in the SPSO processes, but these are not defined. They are not the same as the ‘stages’ tabulated on the SPSO website. As a result, a number of tables are hard to interpret.

E. The questionnaires include no questions on the respondents’ perceptions of the factual accuracy of the SPSO reports. This is a crucial omission.

F. The tables on pages 34 and 38 are wrong. In the ‘Your complaint’ table on page 34, the numbers in the ‘Very dissatisfied’ column must be wrong, as they are identical to the ‘base numbers (n)’, i.e. the totals. Moreover, the numbers ‘satisfied’ exceed these totals. The same applies to the top table on page 38.

These errors are less important than the fact that the SPSO paid too little attention to the report to notice them.

G. The report includes numerous tables recording numbers of complainants that were very satisfied, satisfied, neutral, dissatisfied and very dissatisfied in regard to such things as clarity of explanation, and the courteousness, impartiality and helpfulness of SPSO staff. The data were broken down according to the Bodies under Jurisdiction involved (‘subject groups’). In both 2009 and 2010 the SPSO proved particularly unsatisfactory with Local Authority issues and much better with Health issues. These are the two largest subject groups. The difference is evident from simple inspection of the tables on pages 23-38. However, we have calculated, separately for ‘Health’ and for ‘Local Authority’, the percentages of respondents that were either dissatisfied or very dissatisfied. The results are compared in the following graph, which shows ‘Health’ percentages plotted against ‘Local Authority’ percentages. What it shows, in words, is that, in almost all respects, the SPSO was rated as even more unsatisfactory in regard to Local Authority issues than in regard to Health issues.

The various points in the graph represent different aspects of dissatisfaction or satisfaction that are tabulated separately in the report. Regarding the symbols on the graph, the squares are for 2009 and the crosses are for 2010.


As an example, the cross marked with an arrow indicates that, for a particular topic (the level of information received with updates), dissatisfaction was 23% for Local Authorities and 33% for Health.

The diagonal line corresponds to equality. That most of the points lie below this line means that, for 2010, the SPSO was judged particularly unsatisfactory in regard to Local Authority issues and much less so in regard to Health issues. This could relate to the common perception that the SPSO is not impartial towards local authorities.

As explained in the SPSO’s Annual Report 2010-2011 (page 27), there are clear reasons why a higher proportion of health cases reach the investigation stage. However, this does not explain why so many complainants with Local Authority issues are not persuaded that their cases have been handled satisfactorily.

SPSO Service and Awareness Survey of Bodies Under Jurisdiction, 2008/2009 (report by Craigforth, August 2009)

In March 2009 the Scottish Public Services Ombudsman commissioned Craigforth to undertake a service and awareness survey of Bodies under Jurisdiction (BUJs).

(The BUJs then were Health, Housing Associations, Local Authority, Further and Higher Education, Scottish Government.) 

Herewith the link to The SPSO Service and Awareness Survey of Bodies Under Jurisdiction 2008/09 report. It refers to accompanying appendices that include a copy of the survey, but these are not available at this website.

 H. Questionnaires were sent to all BUJs that had received a decision from the SPSO in the previous six months. The responses paint a rosy picture of BUJ satisfaction, but several caveats apply:


  1. Only 54% of questionnaires were returned. (The value of the survey is compromised if responses came mainly from those BUJs that took complaint handling seriously.)
  2. The questionnaires were written by the SPSO itself and, as noted above, are not publicly available.
  3. Only one questionnaire was filled in per BUJ, but some of these had received more than 20 decisions from the SPSO. Many people in the BUJ would then have been involved, so that no one person could have given reliable general responses.
  4. The questionnaire dealt with distinctly different aspects of satisfaction with the SPSO:
    1. matters to do with the specific complaints
    2. SPSO publications, seminars etc.
  5. Item 5.3 on page 11 categorizes in a table the different areas of particular satisfaction or dissatisfaction. However, one cannot tell from the table whether particular responses indicated satisfaction or dissatisfaction (e.g. ‘Other’, ‘impartiality of SPSO’).

Here is the table: Areas of Particular Satisfaction or Dissatisfaction

  • Helpful Staff / Good Quality Service
  • Need improved contact / updates
  • Too long a process
  • Value informal discussions
  • Excellent Guidance
  • Impartiality of SPSO

  6. There is no indication anywhere as to whether rulings were or were not in favour of the BUJs, or how many were. That is relevant, for example, to Item 5.3.
  7. The ‘Decisions’ table on page 7 includes the statement “Decisions made by SPSO are consistent”, with which respondents are asked to indicate their degree of agreement. We are left wondering what the decisions might be consistent with and how the respondents interpreted the statement.
  8. BUJs may be more reluctant than complainants to criticise the SPSO, since they might wish to remain on good terms with it.

Report B - Analysis of some statistics from the SPSO website

A report prepared by Accountability Scotland based on SPSO website data

The data are for the years 2006/7, 2007/8, 2008/9 and 2009/10.

These are obtainable from

Because the data are presented in tables that have varied in format, the year-by-year trends are hard to see. A detailed analysis of some of the data is therefore necessary.

Notable conclusions

Over these four years the number of submissions to the SPSO accepted as ‘complaints’ each year has doubled. However, the percentage of these reaching the stage of investigation has declined from 18% to 4%.

Each year the percentage of complaints reaching the stage of investigation has been about twice as high for Health Authorities as for Local Authorities. It has been very much lower for Housing Associations, falling to 0.6% in 2009/10.

Whatever the explanation for these statistics, they are not evidence for improved usefulness of the SPSO. They do suggest a significant reduction in associated workload to the detriment of some complainants.

The following points relate to the years 2006/7, 2007/8, 2008/9 and 2009/10.

  1. The formats of the tables presenting the data are inconsistent from year to year, making it hard for the reader to discern trends in the very numerous data.
  2. The trend in the number of complaints has been mostly upwards: in the four years it was successively 1826, 2881, 2875 and 3524. However, the number of complaints reaching the stage of investigation has declined, being successively 329, 426, 201 and 143.

For the latter, the respective percentages are 18.0%, 14.8%, 7.0% and 4.1%.

Note that the number of what are called ‘complaints’ does not include enquiries and the submissions that are withdrawn or else rejected at the outset as inappropriate. 
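As a check on the percentages quoted above, the calculation can be sketched in a few lines of Python, using only the figures given in this report:

```python
# Complaints accepted by the SPSO, and complaints reaching the
# investigation stage, for 2006/7 through 2009/10 (figures quoted above).
years = ["2006/7", "2007/8", "2008/9", "2009/10"]
complaints = [1826, 2881, 2875, 3524]
investigated = [329, 426, 201, 143]

for year, total, inv in zip(years, complaints, investigated):
    pct = 100 * inv / total
    print(f"{year}: {inv}/{total} investigated = {pct:.1f}%")
# → 18.0%, 14.8%, 7.0% and 4.1%, matching the percentages stated above
```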

  3. Each year the percentage of complaints reaching the investigation stage has been roughly twice as high for Health as for Local Authorities. These percentages have been even lower for Housing Associations.

These points are illustrated in the following graph, which shows, for each year and for each of those sectors, the percentage of complaints that are investigated.

The explanation for the differences between sectors is unclear, but, in addressing the Local Government and Regeneration Committee on 16 November 2011, Jim Martin wondered “whether the high number of upheld health complaints is because [he has] interventional powers in health”, but not in relation to local authorities.

The difference between health and local authorities could be responsible for the much lower levels of complainant satisfaction in regard to local authorities that we have revealed by analysis of the Craigforth report ‘Complainant survey results 2009 & 2010’ (see our Report A).

Report C - Analysis of the Annual Report 2010-2011 of the SPSO


There is much of interest in this report, but what we comment on here relates to two issues of concern:

  1. The lack of accountability of the SPSO in regard to the justness of rulings.
  2. The lack of clarity and informativeness of the Report, especially with regard to statistics.

 The lack of accountability of the SPSO

 Referring to public service providers on page 15, the CSA Manager writes:

“The emphasis of its framework is firmly on timely, simple and streamlined complaints handling. This involves a two-stage internal process with local, early resolution by empowered and well trained frontline staff followed by a one-off investigation within consistent timescales. The removal of the ‘safety net’ of subsequent tiers of review or appeal will encourage complaints handlers to get it right first time.”

The final sentence is curious and illogical in that an absence of checks and quality control usually leads to laxity. Complaints handlers would be more likely to get it right if they knew that there would be a review and possible appeal.

Is this statement to be taken as justification for the SPSO’s lack of accountability in regard to the justness of its rulings?

Pages 55-6 concern the Independent Service Delivery Review.

The Independent Service Delivery Reviewer’s role “is purely to look at complaints about service delivery within the SPSO”. It is thus concerned with procedures, having no power to review the correctness or fairness of decisions. Yet it is justice that matters to complainants and to society.

Only nine cases were referred to the Independent Service Delivery Reviewer in 2010-11. The criteria for referral are not stated in the Annual Report, so no quantitative conclusions can be drawn.

An indication of criteria for referral is to be found in the ‘Invitation to Tender For Service Delivery Reviewer Consultancy Services’ which states:

“The purpose of this work will be to investigate complaints about the way in which the SPSO carries out its work in cases where the SPSO itself has been unable to resolve the matter under its internal complaints procedure. The Service Delivery Reviewer will look at the manner in which the complaint has been handled.” 

The Independent Service Delivery Reviewer lists a number of issues investigated, but does not say which were upheld or why. The issues included such items as refusal to explain decisions clearly, refusal to answer a letter and excessive delays.

Puzzlingly, the Independent Service Delivery Reviewer draws a number of reassuring and optimistic general conclusions about the work of the SPSO that cannot be justified by examination of only nine cases and these conclusions must be beyond her remit. No mention is made of other evidence.

One particular unexplained ‘conclusion’ gives emphasis to the restricted role of the Independent Service Delivery Reviewer:

“Inevitably, some complainants will try to use the Independent Review process to reopen their case and have ‘another bite of the cherry’. These attempts must be resisted.”

It is unclear whether this ‘conclusion’ is a general condemnation of case re-examination, but this statement does accord with the SPSO’s lack of accountability in regard to the justness of its rulings.

As detailed on the SPSO website, a complainant can seek a review, but only on very limited grounds (if a decision is based on inaccurate facts, or if there is new and relevant information).

However, Jim Martin has given a seriously misleading description of the situation. Thus, in a letter to Fergus Cochrane, Clerk to the Public Petitions Committee (1 October 2010), he wrote:

“Individuals who are unhappy with the outcome of their complaint have the opportunity to express their dissatisfaction through our challenge to a decision process. The final external step of that process is judicial review.”

The lack of clarity and informativeness of the Report

The general impression given by the Report is that it is largely a public relations document. This is strikingly demonstrated by the recorded comments of complainants. Thus the Report highlights six glowing tributes to the SPSO written by complainants, each occupying one page. Pleasing as they are individually, they are not representative of the comments elicited in past complainant surveys. In fact, according to the Craigforth report for 2009-10, 70-77% of such comments expressed dissatisfaction with the SPSO. The SPSO’s Annual Report makes no mention of a later customer survey, and a FOI enquiry elicited the fact that there has not been one, but not the reason for abandoning the practice.

In the above quotation from page 15 is the sentence:

“This involves a two-stage internal process with local, early resolution by empowered and well trained frontline staff followed by a one-off investigation within consistent timescales.”

This statement is incomprehensible for two reasons. Firstly, it is illogical to imply that resolution can precede investigation. Secondly, the phrase “within consistent timescales” conveys nothing here.

Especially lacking in clarity is the statistical table on pages 60-61: “Cases determined in 2010-2011 by sector, stage and outcome”.

(This is best viewed in the printed version as it is formatted in a way that is hard to analyse on a computer screen.)

This list gives the stages of complaint handling as:

  • Enquiries - advice and signposting
  • Complaints – advice
  • Complaints – early resolution 1
  • Complaints – early resolution 2
  • Complaints – investigation 1
  • Complaints – investigation 2

Note first that the term ‘Early resolution’ is confusing, in that most cases in that category could not be resolved.

The distinction between ‘Early resolution 1’ and ‘Early resolution 2’ is not explained in the Annual Report or on the SPSO website, nor is the distinction between ‘Investigation 1’ and ‘Investigation 2’. In answer to a FOI request, the SPSO said that they do not hold a record of the definitions of the latter two terms. This greatly reduces the value of the table.

As is illustrated below, it is possible to work out what these terms mean, but only by comparing numbers in a way that should not be expected of readers.

According to page 10 there were 673 complaints investigated in depth, 612 with decision letters and 59 through 58 public reports.

The table (pages 60-1) shows the number of cases classified as ‘Investigation 2’ was 61, of which two are shown as ‘no decision reached’. The difference (61 – 2) is 59. This suggests that the category ‘Investigation 2’ corresponds to the issuing of public reports.

Other statistics are given as pie charts on pages 10 and 11. Information on page 11 may be combined with the tabulated data (pages 60-61) to draw the following conclusions.

1) According to page 11, 61 cases reached the stage of investigation. That is the number entered under ‘Investigation 2’ in the table.

This implies that those tabulated as ‘Investigation 1’ did not reach the stage of investigation.

Such inconsistency of language is confusing.

2) Detailed comparison of figures in the upper pie chart on page 11 for ‘Decision letter outcomes 2010-11’ with the table on pages 60-61 shows that ‘Early resolution 2’ and ‘Investigation 1’ must account for the decision letters.

According to page 11, 59 ‘decision letters’ did not include a decision (!). The exact breakdown of this number is not clear from the table, but it must comprise the 53 complaints tabulated as ‘No Decision Reached’ (= 31 + 22) and the 6 cases tabulated as ‘outcome not achievable’.
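This reconciliation can be checked with a short sketch. The figures are quoted from the Annual Report as cited above; the grouping is our inference from comparing the table with the pie chart, not a breakdown published by the SPSO:

```python
# Figures quoted from the Annual Report 2010-2011 (pages 11 and 60-61).
# The grouping below is our own inference, not the SPSO's published breakdown.
no_decision_early_resolution_2 = 31  # 'No Decision Reached' (Early resolution 2)
no_decision_investigation_1 = 22     # 'No Decision Reached' (Investigation 1)
outcome_not_achievable = 6           # tabulated as 'outcome not achievable'

letters_without_decision = (no_decision_early_resolution_2
                            + no_decision_investigation_1
                            + outcome_not_achievable)
print(letters_without_decision)  # → 59, matching the figure on page 11
```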

Errors in the Annual Report:

According to page 9, there were 3489 complaints in a year, but the table, and page 11, give 3351.

In the upper pie chart on page 11, 76 cases are described as “some upheld”. This should presumably read “partially upheld”.

On page 11 the lower pie chart (for Investigation report outcomes) is incorrectly headed ‘Decision letter outcomes 2010-11’. The chart actually corresponds to ‘Investigation 2’ outcomes.

Yet, addressing the Local Government and Regeneration Committee on 16 November 2011 Jim Martin twice said of the Annual Report "our statistics stand up to scrutiny".

ADDENDUM I   (16 February 2012) to the submission of Accountability Scotland to the SPCB

  1. Independent investigation of the quality and effectiveness of the SPSO’s rulings (for example by Audit Scotland, or an independent enquiry led by Professors Adler and Mullen, who are experts in the field of administrative justice) need not undermine his independence of Parliament. This is because the SPSO would not be bound to respond to the resulting report. Nevertheless, the feedback should be welcomed, as it would give the public assurance that the SPSO does deliver justice as claimed.
  2. We would welcome an investigation similar to that carried out in England [The Law Commission (LAW COM No 329) Public Services Ombudsmen, July 2011]. There is clearly a need to bring the SPSO into a more coherent system of administrative justice in Scotland.
  3. We are extremely concerned that the SPSO has totally abandoned any collection of data on complainants’ views, whether on service or on quality of investigation. Craigforth has not been re-employed, nor has anyone else. The SPSO therefore has no external or internal guidance as to his own performance in terms of service and quality of justice, despite the two statements below.

In 2010 the SPSO wrote in regard to future surveys:

On the advice of Craigforth, we are exploring the possibility of forming a focus group of service users to provide us with more qualitative feedback. We expect to progress this in the next financial year, when our new business process has had an opportunity to settle into place.

On the SPSO's Research and surveys page, simply click on “December 2010 survey, findings and future action”.

From Annual Report 2009-2010:

In 2009–10 we began a project to begin collecting data on complainants’ views of our service in the first quarter of the year. The data will be compared with the feedback from complainants in the first quarter of 2010–11.

For your information we attach our recent submissions to the AJTC.



Letter to the SPCB from Accountability Scotland dated 20th February 2012

We are aware that the SPSO has just submitted his Draft Strategic Plan 2012-2016.

We note that the SPCB is considering this well before the deadline for responses in the SPSO’s public consultation process. The plan was published on Thursday and we learnt of it on Sunday evening. So we have had little time to consider it.

Our concern, as already expressed, is that there is no check on the quality, effectiveness and justice of the SPSO’s rulings on complaints and that, in this regard, the SPSO is accountable to no-one.

The optimistic tone of the Draft Strategic Plan disguises this fact.

The Draft Plan gives the impression that the external quality assurance audit examines the quality of the work produced by the SPSO. In fact, in common with most British Standard quality assurance schemes, the audit looks only at the quality of the paper trail. The impression given, however, is that quality assurance covers all the Ombudsman’s actions.

The Internal Audit Report for the Scottish Public Services Ombudsman (Annual Assurance:Quality Assurance Scheme), October 2011, makes clear that the terms of its audit engagement were only to appraise the controls governing the SPSO’s actions, not the actions themselves. In other words, the quality assurance looks at only a tiny part of the SPSO’s work.

We note that, except in one aspect irrelevant here, the SPSO does “not anticipate any significant change to how we carry out our work”.