We should be pleased that the Government has published the “Government ICT Strategy – Strategic Implementation Plan”. It is evidence of a controlled top-down approach, and its easily digestible 77 pages of prose should give us all confidence that the full programme will be delivered. The document contains 19 Objectives and 19 Programme Key Milestones. Looking deeper, each objective has a project team and its own set of Key Milestones, giving a total of about 60 Key Milestones. So, bottom-up, people are working diligently. However, there is a risk that they are too constrained to look at the overall programme. By observation, it is the people at the bottom who know what is really going on – but they are rarely asked their opinion. Quarkside recommends that a cross-section of staff are interviewed on their level of confidence that the overall programme objectives will be met. The initial impression is that too many objectives carry a low confidence of success.
Objectives
This is the list of Objectives from the table of contents.
Objective 1: Reducing Waste and Project Failure, and Stimulating Economic Growth
1. Asset and services knowledgebase
2. Open source
3. Procurement
4. Agile
5. Capability
6. Open standards for data
7. Reference architecture
8. Open technical standards
9. Cloud computing and applications store
Objective 2: Creating a common ICT infrastructure
10. Public services network (PSN)
11. Data centre consolidation
12. End user device strategy
13. Green ICT
14. Information strategy
15. Risk management regime
Objective 3: Using ICT to enable and deliver change
16. Channel shift
17. Application Programme Interfaces (APIs)
18. Online government consultation
19. Social media
Milestones
Quarkside has mapped the Key Milestones (M) to the Objectives (O):

| M | O | Key Milestone | Date |
|---|---|---|---|
| 1 | 1 | 100% of central departments have access to the ICT Asset and Services Knowledgebase and can input, discover and output data | September 2011 |
| 2 | 9 | Cloud Computing Strategy published | October 2011 |
| 3 | 12 | End User Device Strategy published and delivery programme commenced | October 2011 |
| 4 | 13 | Green ICT Strategy published | October 2011 |
| 5 | 5 | ICT Capability Strategy published | October 2011 |
| 6 | 2, 6, 8 | First release of a draft suite of mandatory Open Technical Standards published | December 2011 |
| 7 | 7 | First draft of reference architecture published | December 2011 |
| 8 | 14 | Publication of cross-government information strategy principles | December 2011 |
| 9 | 15 | High level information risk management governance process designed and agreed | December 2011 |
| 10 | 9 | Roll-out of ‘lean’ sourcing process | January 2012 |
| 11 | 11 | Data Centre standards published | February 2012 |
| 12 | 10 | Core PSN capabilities delivered and services available to allow sharing of information between customers regardless of whether they are on the new PSN or legacy environments | March 2012 |
| 13 | 8 | A set of open standards for data adoption established and progressed by government departments, driven by the Open Standards Board | June 2012 |
| 14 | 9 | 50 accredited products on the Government Application Store | December 2012 |
| 15 | 12 | Full implementation of End User Device Strategy commences | January 2013 |
| 16 | 4 | Agile techniques used in 50% of major ICT-enabled programmes | April 2013 |
| 17 | 9, 10 | 80%, by contract value, of government telecommunications will be PSN compliant | March 2014 |
| 18 | 9 | 50% of central government departments’ new ICT spending will be transitioned to public cloud computing services | December 2015 |
| 19 | 11 | Cost of data centres reduced by 35% from 2011 baseline | October 2016 |
It is pure coincidence that there are 19 items in the Objectives list and 19 Key Milestones. That some Milestones support several Objectives is fine, and not an issue. However, none of the four Objectives under “Using ICT to enable and deliver change” gets a mention, namely:
16. Channel shift
17. Application Programme Interfaces (APIs)
18. Online government consultation
19. Social media
This is not mischievous armchair auditing. It demonstrates one of the duties of a risk manager: to check that ALL objectives are visibly targeted by projects. The Programme should either drop these objectives under change control or revise the implementation plan. The latter is clearly preferable, because they are the only objectives that have an impact on citizen services. The first 15 objectives are largely internal and focus on efficiency, not effective service delivery.
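The coverage check described above is mechanical enough to automate. Below is a minimal Python sketch, using the milestone-to-objective mapping exactly as tabulated in this post; which objectives it reports as unsupported depends entirely on that transcribed mapping.

```python
# Sketch: check that every strategy objective (items 1-19) is supported
# by at least one Key Milestone. The M -> O mapping is transcribed from
# the milestone table above.
MILESTONE_OBJECTIVES = {
    1: [1], 2: [9], 3: [12], 4: [13], 5: [5],
    6: [2, 6, 8], 7: [7], 8: [14], 9: [15], 10: [9],
    11: [11], 12: [10], 13: [8], 14: [9], 15: [12],
    16: [4], 17: [9, 10], 18: [9], 19: [11],
}

ALL_OBJECTIVES = set(range(1, 20))  # objective items 1..19

# Every objective that appears against at least one milestone.
covered = {o for objs in MILESTONE_OBJECTIVES.values() for o in objs}

# Objectives with no supporting milestone: risk-log candidates.
uncovered = sorted(ALL_OBJECTIVES - covered)
print(uncovered)
```

A five-minute script like this is all a risk manager needs to make the gap visible before the programme governance board.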
Confidence in Outcomes
Quarkside has used the Confidence Management method to review the Milestone Plan. The Confidence Chart dramatically shows the impact of omitting four of the Strategy objectives: how is it possible to give any level of confidence that they will be achieved when they are not related to any Milestone? Admittedly, the results were obtained from a sample of one person; just think how much more useful this would be if more people were sampled at each level in the programme teams. There would be some interesting results that would make immediate sense to Ministers – who can reliably be expected to look at only one sheet of paper. The average level of confidence in the overall programme is just 50%. Surely this would matter to Ministers if it were a true reflection of what the Civil Servants think is likely to happen. It is like a doctor sticking a thermometer in a patient’s mouth – a reading, not a diagnosis.
The Governance structure includes a Director of ICT Futures, Liam Maxwell. He has been appointed and has begun work to horizon-scan and improve capability to identify risks and exploit new technologies (Action 28). Liam Maxwell should be made aware of this innovative method devised by an SME. Any objective with less than 50% confidence should surely have a prominent place on the risk log; it is far more effective to sort out issues at this early stage than to wait until it is too late to correct them. The end results of all large programmes can be predicted during the first 30% of their planned time period. Liam Maxwell should also try to ensure that each project implementation team’s risk log can be compared objectively. The current sets of top three risks in each sub-delivery area are purely qualitative; they cannot be compared to identify which are truly the biggest programme risks. Educating project teams to use the Crilog Risk Index is one possible way of ranking every risk in the Programme.
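The triage rule above – any objective below 50% confidence goes on the risk log – is simple to express. The sketch below uses invented, illustrative confidence scores, not the actual survey results; only the 50% threshold comes from this post.

```python
# Sketch: place any objective whose confidence of success falls below a
# threshold on the risk log. The scores here are hypothetical
# placeholders, NOT real survey data.
THRESHOLD = 50  # per cent, the cut-off suggested in the post

confidence = {  # objective number -> confidence of success (%), illustrative
    1: 70, 4: 60, 9: 55, 10: 45,
    16: 10, 17: 10, 18: 10, 19: 10,
}

# Objectives needing early attention, in objective-number order.
risk_log = sorted(o for o, pct in confidence.items() if pct < THRESHOLD)

# A one-number programme summary of the kind a Minister might read.
overall = sum(confidence.values()) / len(confidence)

print(risk_log)
print(f"{overall:.0f}% average confidence")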