Quarkside

29/10/2011

Low Confidence in Government ICT Strategy

We should be pleased that the Government has published the “Government ICT Strategy – Strategic Implementation Plan”.  It is evidence of a controllable top-down approach and provides an easily digestible 77 pages of prose that should give us all confidence that the full programme will be delivered.  There are 19 Objectives and 19 Programme Key Milestones in the document.  Looking deeper, each of the objectives has a project team and its own set of Key Milestones, giving a total of about 60 Key Milestones.  So, bottom-up, people are working diligently.  However, there is a risk that they may be too constrained to look at the overall programme.  By observation, it’s the people at the bottom who know what is really going on – but they are rarely asked their opinion.  Quarkside recommends that a cross-section of staff be interviewed on their level of confidence that the overall programme objectives will be met.  The initial impression is that there are too many objectives with low confidence of success.

Objectives

This is the list of Objectives from the table of contents.

Objective 1: Reducing Waste and Project Failure, and Stimulating Economic Growth

  1. Asset and services knowledgebase
  2. Open source
  3. Procurement
  4. Agile
  5. Capability
  6. Open standards for data
  7. Reference architecture
  8. Open technical standards
  9. Cloud computing and applications store

Objective 2: Creating a common ICT infrastructure

 10. Public services network (PSN)
 11. Data centre consolidation
 12. End user device strategy
 13. Green ICT
 14. Information strategy
 15. Risk management regime

Objective 3: Using ICT to enable and deliver change

 16. Channel shift
 17. Application Programme Interfaces (APIs)
 18. Online government consultation
 19. Social media

Milestones

Quarkside has mapped the Key Milestones (M) to the Objectives (O):

| M | O | Key Milestone | Date |
|---|---|---|---|
| 1 | 1 | 100% of central departments have access to the ICT Asset and Services Knowledgebase and can input, discover and output data | September 2011 |
| 2 | 9 | Cloud Computing Strategy published | October 2011 |
| 3 | 12 | End User Device Strategy published and delivery programme commenced | October 2011 |
| 4 | 13 | Green ICT Strategy published | October 2011 |
| 5 | 5 | ICT Capability Strategy published | October 2011 |
| 6 | 2, 6, 8 | First release of a draft suite of mandatory Open Technical Standards published | December 2011 |
| 7 | 7 | First draft of reference architecture published | December 2011 |
| 8 | 14 | Publication of cross-government information strategy principles | December 2011 |
| 9 | 15 | High-level information risk management governance process designed and agreed | December 2011 |
| 10 | 9 | Roll-out of ‘lean’ sourcing process | January 2012 |
| 11 | 11 | Data Centre standards published | February 2012 |
| 12 | 10 | Core PSN capabilities delivered and services available to allow sharing of information between customers regardless of whether they are on the new PSN or legacy environments | March 2012 |
| 13 | 8 | A set of open standards for data adoption established and progressed by government departments, driven by the Open Standards Board | June 2012 |
| 14 | 9 | 50 accredited products on the Government Application Store | December 2012 |
| 15 | 12 | Full implementation of End User Device Strategy commences | January 2013 |
| 16 | 4 | Agile techniques used in 50% of major ICT-enabled programmes | April 2013 |
| 17 | 9, 10 | 80%, by contract value, of government telecommunications will be PSN compliant | March 2014 |
| 18 | 9 | 50% of central government departments’ new ICT spending will be transitioned to public cloud computing services | December 2015 |
| 19 | 11 | Cost of data centres reduced by 35% from 2011 baseline | October 2016 |

It is pure coincidence that there are 19 items in the Objectives list and 19 Key Milestones.  That some Milestones support several Objectives is fine, and not an issue.  However, none of the four Objectives for “Using ICT to enable and deliver change” seem to get a mention, namely:

 16. Channel shift
 17. Application Programme Interfaces (APIs)
 18. Online government consultation
 19. Social media

This is not mischievous armchair auditing.  It demonstrates one of the duties of a risk manager: to check that ALL objectives are visibly targeted by projects.  The Programme should either drop these objectives under change control or revise the implementation plan.  The latter is obviously preferable, because they are the only ones that have an impact on citizen services.  The first 15 objectives are largely internal and focus on efficiency, not effective service delivery.
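For illustration, a short Python check (not part of the Plan; the milestone-to-objective mapping is taken from the table above as transcribed) can flag any objective that no Key Milestone targets:

```python
# Milestone number -> objectives it supports, taken from the milestone table above.
MILESTONE_OBJECTIVES = {
    1: {1}, 2: {9}, 3: {12}, 4: {13}, 5: {5}, 6: {2, 6, 8}, 7: {7},
    8: {14}, 9: {15}, 10: {9}, 11: {11}, 12: {10}, 13: {8}, 14: {9},
    15: {12}, 16: {4}, 17: {9, 10}, 18: {9}, 19: {11},
}

ALL_OBJECTIVES = set(range(1, 20))
covered = set().union(*MILESTONE_OBJECTIVES.values())
print("Objectives with no Key Milestone:", sorted(ALL_OBJECTIVES - covered))
```

Any objective that falls out of such a check should either be traced to a new milestone or be formally dropped under change control.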

Confidence in Outcomes

[Chart: Confidence in ICT Strategy Implementation]

Quarkside has used the Confidence Management method to review the Milestone Plan.  The Confidence Chart dramatically shows the impact of omitting four of the Strategy objectives.  How is it possible to give any level of confidence that they will be achieved when they are not related to any Milestones?  Admittedly, the results were obtained from a sample of one person; just think how much more useful this would be if more people were sampled at each level in the programme teams.  There would be some interesting results that would make immediate sense to Ministers – who are reliably expected to look at only one sheet of paper.  The average level of confidence in the overall programme is just 50%.  Surely this would be important to Ministers if it were a true reflection of what the Civil Servants think is likely to happen in the future.  It is just like a doctor sticking a thermometer in a patient’s mouth – not a diagnosis.
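If a wider sample were taken, the aggregation itself is trivial.  Here is a minimal sketch, using invented scores rather than real survey data, that averages confidence per objective and flags anything below 50% for the risk log:

```python
# Hypothetical confidence scores (0-100) per objective from three interviewees.
responses = {
    "Objective 1: Reducing waste and project failure": [60, 55, 40],
    "Objective 2: Common ICT infrastructure": [70, 65, 75],
    "Objective 3: Using ICT to enable and deliver change": [30, 20, 35],
}

for objective, scores in responses.items():
    average = sum(scores) / len(scores)
    flag = "  <-- below 50%: add to risk log" if average < 50 else ""
    print(f"{objective}: {average:.0f}%{flag}")
```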

The Governance structure includes a Director of ICT Futures, Liam Maxwell.  He has been appointed and has begun work to horizon scan and to improve the capability to identify risks and exploit new technologies (Action 28).  Liam Maxwell should be made aware of this innovative method devised by an SME.  Any objective with less than 50% confidence should surely have a prominent place on the risk log.  It is far more effective to sort out any issues at this early stage than to wait until it is too late to correct them.  The end results of all large programmes can be predicted during the first 30% of their planned time period.  Liam Maxwell should try to ensure that each of the project implementation teams’ risk logs can be compared objectively.  The current sets of top three risks in each sub-delivery area are purely qualitative; they cannot be compared to identify which are truly the biggest programme risks.  Educating project teams to use the Crilog Risk Index is one possible way of ranking every risk in the Programme.

24/10/2011

Policies need Confidence

All organisations need policy.  How confident are they that policy (aka strategy) will be followed and that the desired outcomes will be achieved?  Because policies are always top-down, confidence in success exudes from the top; but apathy, indifference and scepticism are the normal response from the bottom.  Let’s see how Confidence Management could lead to more realistic expectations and implementations.

Here is an example from a membership organisation of managers working in the information domain.  The following principles, capabilities and issues have been developed top-down:

Three core principles 

  • Collaborate, share and re-use assets
  • Redesign services to simplify, standardise and automate
  • Innovate to empower citizens and communities

Six strategic capabilities

  1. Leadership from CIOs
  2. Governance
  3. Organisational change
  4. Strategic commissioning and supplier management
  5. Shared services
  6. Professionalism

Six key issues around information and technology

  1. Information governance
  2. Information management, assurance and transparency
  3. Digital access and inclusion
  4. Local public services infrastructure
  5. Business change
  6. Integration of central government services with local public services delivery

The document had a successful launch: Stage 1, on time and on budget.  Stage 2 is for the individual members to plan implementing these policies (aka strategies) back in their own organisations.  Stage 3 is to convert those plans into changed practices throughout the UK.

The Confidence Management Process converts the lists above into fifteen questions about how confident each interviewee is that progress will have been achieved within five years, by 2016.

How confident, on a scale of 0% to 100%, are you that the following targets will be achieved?

  1. The organisation and partners have collaborated, shared and re-used information assets, with pooled budgets and staff for all services.
  2. All services have been redesigned to simplify, standardise and automate business processes.
  3. Citizens and communities have been empowered by innovative methods in every service area.
  4. CIOs have demonstrated leadership and delivered more efficient and fairer public service outcomes, evidenced by KPIs.
  5. ICT Governance processes have been managed by a Programme Office, administering a portfolio of business change programmes and projects with strong change and risk management.
  6. Outcome-focused organisational change methods have been employed successfully in all services.
  7. Strategic commissioning and supplier management has proved to be more effective and reduced costs by at least 25%, with better service provision.
  8. Shared services have been operated 25% more efficiently through partnership arrangements.
  9. ICT staff have been assessed under the SFIA framework and improved their skills to operate as certified professionals.
  10. An information governance framework has been implemented and best practice is being followed.
  11. Standard information management, assurance and transparency processes have been instituted that control data throughout the lifecycle – providing a single version of the truth.
  12. Multi-channel digital access has been extended to 80% of service transactions, with special provision for digitally excluded citizens.
  13. The infrastructure has employed the Public Sector Network for cloud services, shared data centres and shared applications.
  14. Business change projects have delivered measurable outcomes and benefits across organisational boundaries.
  15. All central government departments have closely integrated ICT systems for local public services delivery, e.g. Education, Health, Justice, DWP and HMRC.

This would enable a bottom-up, and middle-out, perspective of the policy.  It should help to identify the critical success factors (CSFs).  In the end it is the foot soldiers who have to implement policy.  They are likely to recognise similar initiatives from many years earlier, and carry on with business as usual.  Leadership must focus on CSFs or lose the opportunity for another five years.  Hard times call for tough decisions.

20/10/2011

7DIG: Ranking Risk

Filed under: Governance,Risk — lenand @ 9:09 am

The third and final part of the Quarkside risk method solves the problem of ranking, or prioritising, risk across any selected domain, large or small.  Other risk ranking methods, e.g. a Risk Matrix, are too subjective and can miss important risks.

Crilog Risk Index

The Crilog Risk Index (CRI) evaluates risks consistently across whole organisations.  On a scale from 0 to 100, it gives a single value for any risk impact between £1,000 and £10 billion.  The absolute value is not significant; it is there to allow ranking and the setting of priorities for risk reduction action plans.  A CRI value is calculated for every entry in a risk log.  It is a logarithmic function of the probability and impact of the risk.  Impact assessments only need to be “order of magnitude”: the number of zeroes is all that really matters, and people are comfortable saying that a risk is somewhere between £100k and £1 million, for example.
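The exact formula is not published here, so the sketch below is only an assumption: it scales the logarithm of the expected loss (probability × impact in pounds) onto 0 to 100, so that a certain £10 billion loss scores 100 and order-of-magnitude impact estimates are precise enough.

```python
import math

def crilog_risk_index(probability: float, impact_gbp: float) -> float:
    """Illustrative CRI: 10 x log10 of the expected loss in pounds,
    clamped to the 0-100 range (a certain GBP 10 billion loss scores 100)."""
    expected_loss = max(probability, 1e-6) * max(impact_gbp, 1.0)
    return max(0.0, min(100.0, 10.0 * math.log10(expected_loss)))

# Order-of-magnitude inputs are enough: somewhere between GBP 100k and GBP 1m.
print(round(crilog_risk_index(0.5, 500_000)))         # mid-band value
print(round(crilog_risk_index(0.1, 1_000_000_000)))   # low probability, huge impact
```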

After calculating the CRI, each risk can be assigned to a band and given a “Traffic Light” colour.  The ranges are arbitrary and can be modified to suit the risk appetite of an organisation.

| Colour | Description | Action | CRI |
|---|---|---|---|
| Red | Potential disaster | Act now – advise directors | 64 to 100 |
| Yellow | High risk | Manage down | 32 to 64 |
| Green | Manageable risk | Monitor monthly | 0 to 32 |

The major benefit of using this single-value index is that it can be applied to completely independent risk logs, whether they cover corporate, departmental or project risks.  At any time it is possible to merge risk logs and present a consolidated view of importance.
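A sketch of how banding and merging might look in code, reusing the assumed index formula from the earlier sketch; the risk log entries are invented:

```python
import math

def crilog_risk_index(probability, impact_gbp):
    # Same illustrative formula as the earlier sketch (an assumption).
    return max(0.0, min(100.0, 10.0 * math.log10(max(probability, 1e-6) * max(impact_gbp, 1.0))))

def traffic_light(cri):
    # Band boundaries from the table above; adjust to the organisation's risk appetite.
    if cri >= 64:
        return "Red"
    if cri >= 32:
        return "Yellow"
    return "Green"

# Independent risk logs: (name, probability, impact in GBP) - entries are invented.
corporate_log = [("Data centre outage", 0.05, 50_000_000)]
project_log = [("Supplier slippage", 0.6, 250_000), ("Key staff loss", 0.3, 80_000)]

# Merge and rank by CRI to get one consolidated view of importance.
merged = corporate_log + project_log
for name, p, impact in sorted(merged, key=lambda r: crilog_risk_index(r[1], r[2]), reverse=True):
    cri = crilog_risk_index(p, impact)
    print(f"{cri:5.1f}  {traffic_light(cri):6}  {name}")
```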

Crilog Risk Cube

Quarkside recommends adding an additional dimension to the traditional 2-dimensional risk matrix – to make a Risk Cube.

The Probability axis is linear from 0% to 100%, and the Impact axis is a logarithmic scale that can easily show £billions, or even £trillions, alongside £thousands.  The third axis is the Risk Index, coloured by traffic light and with a size proportional to its value.
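One possible way to draw such a cube, using matplotlib with the same assumed index formula; the example risks, marker sizes and colours are illustrative choices, not part of the Crilog method:

```python
import math
import matplotlib.pyplot as plt

def cri(p, impact_gbp):
    # Illustrative index, as in the earlier sketches (an assumption).
    return max(0.0, min(100.0, 10.0 * math.log10(max(p, 1e-6) * max(impact_gbp, 1.0))))

def colour(value):
    return "red" if value >= 64 else "gold" if value >= 32 else "green"

# Invented example risks: (name, probability 0-1, impact in GBP).
risks = [
    ("Routine overrun", 0.9, 200_000),
    ("Contract dispute", 0.3, 5_000_000),
    ("Catastrophic failure", 0.02, 2_000_000_000),
]

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
for name, p, impact in risks:
    value = cri(p, impact)
    # Linear probability axis, logarithmic impact axis, index as the third axis.
    ax.scatter(p * 100, math.log10(impact), value, color=colour(value), s=20 + 3 * value)
    ax.text(p * 100, math.log10(impact), value, name, fontsize=8)
ax.set_xlabel("Probability (%)")
ax.set_ylabel("Impact (log10 GBP)")
ax.set_zlabel("Crilog Risk Index")
plt.show()
```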

Compared to the traditional matrix, the Cube graphically draws attention to important outliers, such as:

  • Risks that are almost certain to impact the bottom-line performance of a company; very high probability of loss
  • Risks that could potentially destroy a company; very low probability but very high impact.

Although it requires a culture change for people to allocate values to both the impact and probability of entries in the risk log, in practice it works well.  It produces a clear method for ranking any magnitude of risk across different divisions or programmes.  It fits well with confidence measurement, in a non-threatening environment, as a small step towards using percentages for risk probabilities and orders of magnitude for risk impacts.

Monthly monitoring of changes to a Risk Index, at least for the high values, shows the health of projects and programmes.  It would cost a minuscule proportion of total expenditure for the benefits achievable.  Organisational leadership must then ensure that action takes place to remove, mitigate or minimise risk.

19/10/2011

7DIG: Identify Risk with Confidence

Filed under: Governance,Risk — lenand @ 10:35 am

Using Confidence to Identify Risks

Rather than starting the search for risks in a negative way, Quarkside recommends using reverse psychology and asking about success: “How confident are you that targets will be met?”  Asking about confidence levels not only helps to identify risks, it presents a positive view rather than a negative one.  Simple bar charts can be used to show targets, changes of opinion and alerts for areas of concern felt by staff.

[Chart: Average Confidence]

The chart above may look gloomy, but this was the state of play in a large public sector project. It was the result of interviewing many levels of staff.

  • Anything below 50% confidence is worth further investigation and should be entered in the risk log.  Immediate risk reduction action should be taken.
  • For intermediate levels of confidence, say between 50% and 75%, risk log entries should reflect the reduced level of risk.  The root cause of the reduced confidence should be investigated.
  • Even if there is high confidence of success, greater than 75%, there should still be a risk log entry if the impact of failure is high.

Without claiming intellectual rigour,

Risk Probability% = 100% – Confidence%

Managers are comfortable with this concept – high confidence equates to low risk and vice versa.  Discussion helps people to accept that simple quantification of risks is neither difficult nor threatening.
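A minimal sketch of those rules in code (the thresholds come from the bullets above; the function names are invented):

```python
def risk_probability(confidence_pct: float) -> float:
    # Risk Probability % = 100% - Confidence %
    return 100.0 - confidence_pct

def risk_log_action(confidence_pct: float, high_impact: bool = False) -> str:
    if confidence_pct < 50:
        return "Enter in risk log and take immediate risk reduction action"
    if confidence_pct <= 75:
        return "Enter in risk log at reduced level; investigate root cause"
    return "Enter in risk log (high impact)" if high_impact else "Monitor"

for confidence, high_impact in [(30, False), (60, False), (90, True)]:
    print(confidence, risk_probability(confidence), risk_log_action(confidence, high_impact))
```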

Confidence Management Process


It may be old-fashioned, but Quarkside is a proponent of managing to a baseline, or vision, or goal, or whatever you want to call it.  Let’s also call them strategic objectives.  The main point is that they are organisation-wide, and that leadership has ensured that everybody understands and has bought into them.

Interviews are carried out using a one-page questionnaire that records levels of confidence.  A five-point scale ranges from totally confident to minimally confident, and values from 90% to 10% are then allocated.  The analyst can also select extreme values, say 100% or 0%, if the interviewee expresses strong opinions during the course of an interview.  Comments on the reasons for low values are welcomed – they highlight the root cause of a risk if raised by several interviewees.
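For illustration, the mapping from the five-point scale to percentages might look like the sketch below; only the two end-point labels and the 90%, 10% and extreme values come from the text, and the intermediate labels are assumptions:

```python
from typing import Optional

# Assumed five-point scale; only the end-point labels are named in the post.
SCALE = {
    "Totally confident": 90,
    "Very confident": 70,
    "Fairly confident": 50,
    "Slightly confident": 30,
    "Minimally confident": 10,
}

def record_response(label: str, analyst_override: Optional[int] = None) -> int:
    """Return the confidence % for a response; the analyst may substitute an
    extreme value (100 or 0) when the interviewee expresses a strong opinion."""
    return analyst_override if analyst_override is not None else SCALE[label]

print(record_response("Fairly confident"))        # 50
print(record_response("Totally confident", 100))  # analyst override
```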

To encourage open and frank responses, an independent interviewer asks questions in confidence and ensures that comments are not attributable to a specific person.  Interview data is analysed and presented in a report.  The contents include commentary on areas of high and low confidence and references to the risk log.

After several months, second and subsequent reports discuss the change in confidence levels since the previous report.  A change chart graphically indicates the effect of risk reduction since the previous review.  The process provides feedback into the risk management control loop.

Most importantly it supports the risk management process by flushing out risks that may not have been formalised.  In extreme circumstances, it could contribute to a decision to change the baseline business or project targets.

Experience

The method has shown benefits in £billion programmes – but it could be applied to any form of project, even Agile ones.  Some key findings were:

  • Confidential, non-attributable interviews help to open up discussions and identify the root causes of problems.  They allow comment at peer level that might not surface in the presence of overbearing managers.
  • The initial interview requires a few minutes to explain the concepts and establish understanding of the business objectives.  Subsequent interviews are quicker to execute, and frank answers are obtained in less than one hour.
  • The questioning technique encourages managers to think more quantitatively about business targets and the probability of achieving them.  They feel comfortable that 90% confidence carries a residual 10% risk, and that it is fair to include it in a risk log.
  • Levels of confidence can diverge widely between interviewees.  Whether due to lack of communication or “head in sand”, this is useful data worthy of further investigation.
  • In programmes experiencing difficulties, the results provide a focus for debate at board level.  One organisation used the results to renegotiate a major contract.
  • Even with generally satisfactory levels of confidence, it is worth investigating the target with the lowest confidence.  One internal audit team raised a security risk with an impact greater than £1 billion; procedures were tightened.  This is the sort of company-threatening risk that is missed by traditional risk matrices, and it led to the Risk Index described in the final part of this series.

Looking to the future, the method should be used on all public sector programmes that rely on computer information for success, e.g. Universal Credit, Health Service ICT, Individual Electoral Registration, the Government ICT Strategy and Identity Management.

            

18/10/2011

7DIG Risk Revisited: The Problem of Risk Matrices

Filed under: Innovation,Risk — lenand @ 10:09 pm

Several years ago, project risk was high up the agenda, especially in PRINCE2 projects.  But it was always viewed negatively.  People are naturally reticent about exposing risk in their area of responsibility.  Much more positive responses were achieved by reversing the questions and asking for levels of confidence.  Let’s call it Confidence Management, rather than Risk Management.  There are examples of success.

The method starts with interviews that assess levels of confidence in achieving business targets.  Confidence levels are used to generate data in risk logs, which quantify business risk as a company-wide, or even global, risk index.  It gives a single value for risk impacts that range from a few pounds to billions.

A 2X2 risk matrix is commonly used.

[Image: 2X2 Risk Matrix]

It does not give much information, but it is better than nothing.  More complicated versions with colour coding can alert senior executives.

[Image: 5X5 Risk Matrix]

The main problem with such matrices is that they mask the importance of very low probability but very high, or catastrophic, risk impacts.  The undersea blow-out of a BP oil rig is a recent example.  Some risk impacts have to be measured in billions, but such crude methods do not alert people adequately.  The axes are often not given numerical values, so it is impossible to correlate projects across a programme.

The next blogs show a new way of identifying risks and a quantitative method for prioritising risks across complex programmes.
