Quarkside

07/01/2013

Who should trust “Trusted Computing”?

Most PC devices can connect to network resources, either on a public (Internet) or private network.  The Trusted Computing Group has developed standards for “Trusted Computing”, which has a specialised meaning.  With “Trusted Computing”, PC behaviour is enforced by standards and technologies that shift the root of trust from software to hardware embedded in the device.  For example, PCs with a TPM (Trusted Platform Module) could have hardware preset by the supplier:

  • to restrict the operating system versions;
  • to restrict software to specific versions;
  • to provide encrypted access to data stores;
  • to measure and report on the integrity of the platform, including the BIOS, disk MBR, boot sector, operating system and application software.
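The integrity measurement in the last bullet rests on a simple hash-chaining operation: each boot component is hashed and "extended" into a Platform Configuration Register (PCR), so the final register value depends on every component and on the order in which they ran. A minimal Python sketch of the idea, using illustrative byte strings rather than real firmware images:

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style extend: new PCR = SHA-256(old PCR || hash of measurement)."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

# PCRs start from a reset value (all zeros for a SHA-256 bank).
pcr = bytes(32)

# Stand-ins for the components measured during boot.
boot_chain = [b"BIOS image", b"disk MBR", b"boot sector", b"OS kernel"]
for component in boot_chain:
    pcr = pcr_extend(pcr, component)

print(pcr.hex())  # the value a verifier would compare against an expected state
```

Because the chain is one-way, changing, removing or reordering any single component yields a completely different final value, which is what lets a supplier or remote verifier detect a platform that does not match the approved configuration.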

In a corporate environment with a private network, it increases confidence in the integrity of the system.  Hardware identification of the end-points raises levels of information assurance, a benefit that could justify the complex processes of TPM deployment.  There is a case for extending the use of TPMs into more mobile devices.  Such reasons have led to TPMs being mandated for business communications with the US DoD.

However, before anybody thinks of mandating “Trusted Computing” for the general public, let’s look at some of the implications.  Depending on how the standards are deployed, both PCs and mobile devices could lose flexibility, freedoms and privacy.  The superficial attraction of improved security on private networks could constrain the use of public networks and deter innovation.

Many areas of concern are reviewed in Wikipedia:

  • In order to trust anything that is authenticated by or encrypted by a TPM … , one has to trust the company that made that chip, the company that designed the chip, those companies allowed to make software for the chip, and the ability and interest of those companies to not compromise the process;
  • “Trusted Computing” (TC) would have an anti-competitive effect in the IT market;
  • TC can support remote censorship;
  • Software suppliers can make it much harder for users to switch to competitors’ products;
  • TC-protected documents may be unreadable by competitive software;
  • Digital rights management technology could prevent users from freely sharing and using potentially copyrighted files without explicit permission;
  • A user who wanted to switch to a competing program might find that it would be impossible for that new program to read old data;
  • With remote attestation, a website could check the Internet browser being used and refuse to display on any browser other than the specified one;
  • The migration section of the TPM specification requires that it be impossible to move certain kinds of files except to a computer with the identical make and model of security chip;
  • Users unable to exercise legal rights, under headings such as fair use, public interest or whistle-blowing;
  • Users vulnerable to vendor withdrawal of service;
  • Users unable to override restrictions: even an owner confirmed to be physically present cannot allow the computer to bypass the secure I/O path to another user;
  • Loss of anonymity could have a chilling effect on political free speech, the ability of journalists to use anonymous sources, whistle blowing, political blogging and other areas where the public needs protection from retaliation through anonymity;
  • TPM hardware failure creates the possibility of a user being irrevocably cut off from access to private information.

The risk is that “Trusted Computing” becomes mandated without a full debate of these concerns before incorporation into government policy.  When you look at the list of financial supporters of “Trusted Computing”, many have a lot to gain from anti-competitive application of the standards.  Arguments for policies and regulations that might curtail current liberties should be balanced against those from less well-financed champions of Open Source, Open Rights, Open Internet and Civil Liberties.  We must avoid the scenario where honest citizens, without TPMs installed in their communications devices, could be locked out of using digital public services.

31/05/2012

Open Source: Start looking

Filed under: Standards — lenand @ 12:20 pm

After decades of successful use of open source by industry giants such as Google, Facebook and Apple, surely there is sufficient evidence to convince ICT managers that it should always be assessed.  Even the most conservative of departments, the Home Office, is claiming benefits in £millions.

Cash-strapped local government and voluntary sector agencies must start looking for savings by retiring legacy proprietary systems and moving to open source and open standards.  Open Source and Open Standards are not the same thing, but they are often conflated – and sometimes with Open Data.  The key things to remember are:

  • Open Source is free computer source code that is reusable and improvable.  Most users do not change the source code.
  • Open Standards are the result of agreements of interested parties and encourage interoperability between systems.  Businesses should specify open standards in procurements, even for proprietary software.
  • Open Data is free to use, and should be defined as complying with an Open Standard.  It can be processed by Open Source or proprietary software.

Prepare for the evolution.  Read all about the Open Source Summit on May 30th 2012.

31/10/2011

Liam Maxwell: One year later

It is more than a year since Liam Maxwell’s “Better for Less” was published.  What has been achieved from the 69 pages of ideas?  It obviously made the right impression, because he is now working in the Cabinet Office in a one-year appointment from September 2011.

He did ask “WHAT WILL SUCCESS LOOK LIKE?

Our goal should be to deliver to the online population frontline public services with minimal, possibly zero, administrative cost, freeing up cash for more effective, intermediary-based, service delivery for those not online, and also as savings. This is already happening in some areas of local government and driving taxes down. It is happening in other countries, making service delivery better. It is time the biggest component of the British economy, its bloated state, started to learn these lessons.

How does it work? 5 principles underlining all IT in government

We base our approach on a small number of core principles:

1) Openness

a. Open Data – government data must be transparent
b. Open Source works – its concepts should be applied to processes as much as to IT
c. Open Standards will drive interoperability, save money and prevent vendor lock-in
d. Open Markets – competition creates efficient market-based solutions.

2) Localism – the centre may set the standards, but local deployment is best.

3) Ownership and Privacy

a. It’s our data, government can have access but not control over personal data.
b. Government should be accountable for data protection and proper use.

4) Outcomes matter more than targets.

5) Government must be in control of its programmes, not led by them.”

Let’s look to see how successfully the principles have been incorporated into the Government ICT Strategy.

1. Open data, open standards and open source are clearly stated objectives. And open markets are part of the procurement objective.

2. Localism does not get a mention, according to word search. This is a gaping hole, but perhaps Liam will explain this when he speaks at the SOCITM conference in November.

3. Alarmingly, neither privacy nor data protection appears anywhere in the strategy.  The objective for “Risk Management Regime” has implied elements of both, but the metrics concentrate on system security – not anything based on citizen data protection.

4. Outcomes are potentially the most important gap in the strategy. There’s too much concentration on internal, central government processes. The four objectives for using ICT to enable and deliver change are not really focussed on citizen outcomes.

5. Governance of programmes is an implicit role for the “Public Expenditure (Efficiency and Reform) Cabinet sub-committee (PEX(ER))“. There are twelve senior people named, with representatives from MOD, MOJ, HMRC, HO, DoH, DWP and Cabinet Office.   That  should be enough people. However, Quarkside thinks that UK plc should also have representation from departments with responsibility for improving the ICT skill base of the country. Shouldn’t DfE and BIS have something useful to contribute? And if localism is really important, why doesn’t DCLG have a place on the high table?

Quarkside gives “Better for Less” 40 marks out of a possible 100 for influencing the agenda. In the old days, this was a ‘Pass’ at A level. So not too bad. However, it would not have secured you a place in one of the top universities.

Blog at WordPress.com.