The book is now available from Gower Publishing.
In his first book, Mike questions the value of public inquiries. Every day, we hear about another inquiry being set up, or why the last one failed. Time and money are spent on inquiries and on implementing their recommendations, but they do not lead to the learning they should.
Based on research into high profile inquiries and commissions, It Should Never Happen Again focuses on the gaps between what is known, the knowledge used by practitioners, and the knowledge used by those who judge them. It contrasts the judicial perspective of those who inquire; the academic perspective of those who know; and the practical perspective of those who are required to act.
The difference between these perspectives creates barriers that impede others from learning from inquiries. Crucially, inquiry outcomes do not assist the leadership of organisations to improve risk governance. Mike offers new models for understanding risk and its governance.
There are two practical offerings that readers may wish to consider.
The first is a suggestion of what should be included in the Terms of Reference for any inquiry. These are set out in the table below:
For political reasons, recommendations of public inquiries or inquests are often accepted, at the time of publication, with little critique or debate. However, as we can see from the series of inquiries into the events at Hillsborough on 15 April 1989, many fail to survive the test of time. This book suggests a series of tests that should be applied to any recommendation before it is adopted.
In brief, for recommendations to be justified they must:
Contain applicable corrective action or indicate deficiencies in knowledge (in this latter case further research may be recommended).
Be based on reasoning that flows directly from and is cross-referenced to the findings.
Show a clear design strategy (rather than being just a list of individual actions) which explains how the future will be improved rather than just explaining the past.
Show a thorough understanding of the system (preferably referring to a model of the system) and how the recommendation will not encourage further sub-optimisation of the system or other unintended consequences.
Derive from the integrity and credibility (expertise) of investigators: this goes back to Adams’ model where, if we cannot see, feel, touch or measure something, we have to trust the person or people giving the advice.
Be peer reviewed: to learn we have to trust; we therefore need to see that an opinion is not only fully justified but has also been fully tested.
Involve and communicate with the appropriate stakeholders; again we see tension between the need and desire to learn and any political or judicial goals that the inquiry may attract. It is clear that learning and enacting lessons can only be enhanced by engagement with the other relevant stakeholders.
Articulate (1) the perspective adopted, (2) any conflicts of interest and (3) sources of bias or any other analytical limitations. As has already been said, trust in those giving advice is a very important factor in whether the advice will be heeded. Whatever can be done to build that trust is worthwhile.
Finally, I suggest that recommendations also need to be structured so that they:
Are self-contained (as many people will only read the recommendations); that is, they can be understood as stand-alone statements when extracted from the context of the report. They are clear and unambiguous about:
- the action required, and how success should be judged,
- who is responsible,
- how the system will affect and be affected by the changes,
- the relationships and interaction between recommendations,
- the risks involved in taking these actions.
Differentiate between Macro, Mezzo and Micro recommendations.
Set realistic time limits for response and “follow-through” which indicate that they appreciate the size of the task involved.
While applicability may be easy to see in hindsight, application in foresight is much more difficult. If those conducting inquiries condemn practitioners for any perceived failure of foresight, they should be prepared to recommend how the lesson learnt may be applied more generally and how the danger of unintended consequences can be avoided.