‘Reliability’, like ‘validity’, is a term with multiple meanings. In non-profit settings (amongst others) there are several contexts in which the concept of ‘reliability’ is expressed differently. This post reflects only some of these.
A dictionary definition sounds straightforward enough – ‘that which can be relied upon, fit to be depended upon, trustworthy’. When we rely on something or someone, we depend on that person or thing with full trust and confidence, and we attach faith to that dependability.
Some of the other meanings associated with ‘reliability’ are illustrated in the selection of quotes presented below:
Just as governments have an obligation to minimise uncertainty in economic, social, and political environments, non-profit boards likewise have an obligation to minimise uncertainty within their organisations and amongst their stakeholders. Reliable policymaking, program delivery, and fiscal management need to be augmented by reliable strategic review measures, ensuring that the organisation is responsive to changing stakeholder needs.
One of the key reliability tools used by boards is the development of organisational policies. When involved in decision-making, a key question for directors is whether the situation might be one that could arise again in some form. Where the need for consistency in approach has been identified, the principles underpinning the decision on the current matter can be extracted for adoption as a policy to guide future decisions and actions.
When used in the context of measurement and evaluation, such as you might use for governance oversight, reliability connotes consistency and repeatability. While these qualities may also generate some level of trust, they add some further dimensions to the term, especially when linked with ‘partner’ concepts like validity. [Further reflections on this pair of concepts will be offered in a future post]
When we refer to the reliability of data, we mean the degree to which the result of a measurement, calculation, or specification can be depended on to be accurate.
Quality assurance and control
The American Society for Quality defines reliability as:
“the probability that a product, system, or service will perform its intended function adequately for a specified period of time, or will operate in a defined environment without failure”.
https://asq.org/quality-resources/reliability
Quality standards like ISO 9000 promote the importance of consistency in executing procedures and workflows. The repeatability of documented procedures is equated to the maintenance of quality, along with other quality control measures, such as those required for security, or health and safety.
Reliability and risk controls
Risk assessment, and the controls identified to address key risks, are key areas for consideration of reliability issues. The board’s role in ensuring conformance with laws, regulations, contractual obligations, and standards also requires reliable systems to check and assure compliance.
Organisations with dependencies on certain assets (e.g. vehicles, information and communication technology or ICT, specialised equipment, etc.) usually devise an Asset Maintenance or Reliability Strategy and plan. Essentially, this is a specialised risk management plan designed to address continuity of services that are dependent on infrastructure which may be subject to wear and tear, malfunction, or failure. Given our universal dependency on IT systems, it seems reasonable to expect that every non-profit’s risk management plan will include preventive and response measures relating to ICT equipment and systems (along with cyber-risk of course).
Reliability engineering resources, such as Accendo Reliability, suggest that:
“the reliability plan is the list of tasks and events that enable the team to understand reliability risks and accomplishments well enough to make the right decisions during the (project) development process.”
https://accendoreliability.com/select-tasks-reliability-plan/
High Reliability Organisations (HROs)
Non-profit organisations are not immune from bad news. Associations, charities, and for-purpose organisations anticipate that something may go wrong when they devise risk management plans. Ensuring that potential trouble is spotted early requires good reporting systems and an expectation that reports will not only highlight positive developments.
Emeritus Prof. Andrew Hopkins’ compact monograph, A Practical Guide to Becoming a High Reliability Organisation (AIHS, 2021), notes that:
“Researchers have described HROs as mindful organisations, constantly aware of the possibility of failure. They seek out localized and small-scale failures and generalize from them.” …
“Mindfulness is not just a characteristic of organisations. It is also a characteristic of their leaders. Mindful leaders are very aware that their systems may not be working as well as intended, nor as well as they are being told by their subordinates. They are suspicious of a steady stream of good news and are forever probing for the bad news that they know lies beneath the surface.”
https://www.aihs.org.au/sites/default/files/A%20Practical%20Guide%20to%20becoming%20a%20High%20Reliability%20Organisation%20-%20Andrew%20Hopkins.pdf
Prof. Hopkins lists 10 principles for a ‘bad news’ reporting system, which you may find helpful when assessing whether your organisation could be described as “highly reliable”.
Credentialling and probity checks are part of the trust system used as a foundation for the engagement of new hires. Later, as onboarding and induction processes give way to the employee being ‘on board’, progressively greater trust is invested in their capacity to take on responsibility as they reliably demonstrate that capacity. An employee’s scope of duties and level of responsibility is adjusted (or not) according to their demonstrated skills and abilities. These may be increased (with or without a work value adjustment to remuneration), or become the subject of performance management measures where they experience difficulties.
Dependable and trusted staff are highly desirable for organisational effectiveness, but we all recognise that the perfect state is aspirational, and so we must allow for human failings. Some degree of fault tolerance is required to handle misunderstandings, unfamiliarity with procedures, competing pressures, and personal distractions and tensions.
Interestingly, while we aim to build trust, the IT security trend is to move to a ‘zero trust’ model. One definition of zero trust is “a security model based on the premise that no one is blindly trusted and allowed to access company assets until they have been validated as legitimate and authorized”. See this Wikipedia entry for some useful background on the concept.
https://wiki2.org/en/Zero_trust_security_model
As justifiable as this model may be, it is bound to conflict at some level with team members wanting to be trusted and autonomous in their use of technology. Implementing models such as this requires good communication about the purposes and benefits of compliance and the importance of reliable control systems for safe operation. Ideally, these control systems offer peace of mind for both those implementing them, and for directors overseeing management and operations.
According to Washington Post opinion writer Barton Swaim, the maxim “Trust, but verify” entered common usage when President Ronald Reagan’s adviser on Russian affairs, Suzanne Massie, was preparing him for talks with Mikhail Gorbachev in 1986. She suggested that it could be helpful for Reagan to learn a few Russian proverbs, and the one he preferred was “Doveryai no proveryai” — trust, but verify. Apparently, he liked it too much, provoking Gorbachev’s annoyance when he used it at every meeting.
Claiming to use evidence-based decision-making without having considered your standards of evidence and data quality issues (verification) may leave your board open to challenge if it is ever required to account for a decision before a court, or the media.
Deborah Richards, in her AICD article on directors’ seven deadly boardroom sins, highlighted the Centro case, in which the judge found that the directors had failed to notice an error in the financial reports. This involved $2b in current liabilities being wrongly classified as non-current. While the directors argued that they had relied on an external auditor, the judge found they breached their duties by not checking the figures.
Beyond their innate curiosity, therefore, all directors are encouraged to exercise a constant (healthy) scepticism when reviewing management reports and board proposals. That disposition is as fundamental for evidence-based decision-making as empiricism is for science. Try substituting ‘governance’ for ‘science’ and ‘director/s’ for ‘scientist/s’ in the following quote from Carlo Rovelli:
“The very foundation of science is to keep the door open to doubt. Precisely because we keep questioning everything, especially our own premises, we are always ready to improve our knowledge. Therefore a good scientist is never ‘certain’. Lack of certainty is precisely what makes conclusions more reliable than the conclusions of those who are certain: because the good scientist will be ready to shift to a different point of view if better elements of evidence, or novel arguments emerge. Therefore certainty is not only something of no use, but is in fact damaging, if we value reliability.” (Emphasis added)
Carlo Rovelli