Kemuri: IoTLondon Showcase

Filed under: Innovation,Risk,Social Care,Technology,Wellbeing — lenand @ 3:26 pm

The buzz at the IoTLondon showcase demonstrated the huge energy of entrepreneurs.  Kemuri showed tangible progress since the original presentation in February.  The strapline, “Granny Monitor”, was a bit tacky, but better suggestions are welcome.

People were interested in poking their fingers into the working demonstrator; thankfully it was in battery-powered mode and not connected to the mains.  It contains:

  • Temperature sensor – for hypothermia risk
  • Power sensor – for dehydration or nutritional risk
  • Motion sensor – for immobility or fall risk
  • Controller board – to schedule data transmissions
  • Communications board and aerial
  • Power socket

All of it fits into a UK standard double-socket enclosure.  Mission accomplished.
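As a rough sketch of the scheduling idea (all names and values here are hypothetical; Kemuri's actual firmware is not described in this post), the controller board might batch readings before handing them to the communications board:

```python
import time

def read_sensors(now=None):
    """Return one set of placeholder readings; a real build would poll the hardware."""
    return {
        "temperature_c": 19.5,   # for hypothermia risk
        "power_w": 0.0,          # socket usage, a proxy for hydration/nutrition
        "motion": False,         # PIR movement, for immobility or fall risk
        "timestamp": now if now is not None else time.time(),
    }

def batch_readings(n_samples):
    """Collect several readings so the comms board can send them in one burst,
    which is cheaper on power than transmitting every reading immediately."""
    return [read_sensors() for _ in range(n_samples)]
```

Batching is the point of having a controller at all: the radio is the hungriest component, so waking it on a schedule rather than per reading is what makes battery operation plausible.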

We had genuine enquiries about selling to the general public.  The answer has to be “No”.  The equipment must be professionally installed, and the first units will go to telecare companies or care agencies.  Furthermore, it was just a demonstrator; it has to go through prototype, certification and pilot phases before it can be introduced at scale.

As pointed out in the Marketing Flyer, we are “seeking supporters, collaborators and funders” before taking the next steps.  Tell your friends.


Funding a Social Enterprise

Filed under: Health,Innovation,Risk,Social Care,Wellbeing — lenand @ 8:43 am

Ideas are free.  Concepts are beguiling and an excuse for inaction.  Improving the wellbeing of an ageing population seems to be a noble social enterprise.  People don’t want intrusive monitoring – but digital technology can help.  Kemuri has taken the first steps in providing a passive wellbeing monitoring service that will cost less than £1000 in the first year and less than £500 in subsequent years.  One year in a residential care home costs at least 10 times this amount, or more likely, 20 or 30 times.

The technology for Kemuri’s simple well-being monitor uses the Internet of Things.  It may be a buzzword, but when Google invests £2 billion in such products, it is not just hype.  Adrian McEwen’s book Designing the Internet of Things has helped to identify the actions needed for a start-up.  The chapter on the Business Model Canvas provides a nine-point process for showing what must come next.  The benefits are clear and the costs are containable, so how can funds be raised to deliver at scale by 2016, when the impact of the Care Bill becomes clear to the general public?

The proposition is that funding should be forthcoming if there are sufficient provisional orders.  Provisional orders would be financially risk-free for care commissioners.  If the products and services do not meet pre-determined conditions, then the order need not be fulfilled.  The financial risk is taken by the funders.  The big question is the size of the market at the price point selected.

In the UK, there are approximately 2.5 million people aged over 75 living alone in their own homes, sheltered housing or care homes.  Any one of them is a potential beneficiary of passive wellbeing monitoring.  2,500 provisional orders would be just 0.1% of the potential market, and would represent a turnover of £2.5 million.  Is this attractive to Angel funders or Venture Capitalists?  Time will tell.
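The arithmetic can be checked directly.  Only the per-order price is inferred here, by dividing the stated £2.5 million turnover by 2,500 orders, which matches the "less than £1000 in the first year" figure above:

```python
# Figures taken from the post; the £1,000 first-year price is inferred
# from the £2.5 million turnover divided by 2,500 orders.
population = 2_500_000       # people aged over 75 living alone (approx.)
orders = 2_500               # provisional orders sought
first_year_price = 1_000     # £ per unit in the first year

market_share = orders / population      # 0.1% of the potential market
turnover = orders * first_year_price    # £2,500,000
```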


‘Man in the middle’ attacks for dummies

Filed under: Risk,Security — lenand @ 3:49 pm

Public WiFi security risks are real, not imaginary.  This paper from Royal Holloway, University of London highlights the security risks associated with the use of public Wi-Fi hotspots.

Should Quarkside publicise this paper?  The advice works both ways.  It gives as many clues to potential criminals as it does to those who should take more care.


Predictive Monitoring: Heart Attack and Stroke?

Filed under: Innovation,Risk — lenand @ 6:12 am

Machine learning via neural networks produces impressive results.  The Blood Glucose prediction hackathon used three data streams: historical blood glucose, insulin dosage and carbohydrate consumption.  Together, these explain approximately 50% of the variation in predicted blood glucose.

[Figure: Blood Glucose Predictive Power]

Adding three more streams – other nutrition, activity and other lifestyle factors – would increase this to 85%.  All of these could easily be collected from mobile devices and used in other health prediction applications.  It shows the value of monitoring data streams for multiple purposes.  The data could also be analysed alongside logs of blood pressure and heart rhythms.
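The effect of adding streams can be illustrated with a toy regression on synthetic data.  This is not the hackathon's actual neural network; every number below is made up purely to show that a model fitted on six streams explains more variance than one fitted on the first three:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for six streams: glucose history, insulin dosage,
# carbohydrates, plus nutrition, activity and lifestyle.  Entirely illustrative.
streams = rng.normal(size=(n, 6))
weights = np.array([1.0, 0.8, 0.9, 0.7, 0.6, 0.5])
target = streams @ weights + rng.normal(scale=1.0, size=n)

def r_squared(X, y):
    """Ordinary least squares fit; return the proportion of variance explained."""
    X1 = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1 - resid.var() / y.var()

r2_three = r_squared(streams[:, :3], target)  # glucose, insulin, carbs only
r2_six = r_squared(streams, target)           # all six streams
```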

Just think of the value to people who are at risk of heart attack or stroke.  Real-time predictions of heart rate and blood pressure could trigger alarms that would moderate a person’s behaviour.  An impertinent machine telling you to “Stop driving!”, “Sit down!” or “Have a rest!” may upset your plans – but it is better than risking your own life or another’s.
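The alarm side is far simpler than the prediction side.  A minimal sketch, with placeholder thresholds that are in no sense clinical guidance:

```python
def alert(heart_rate_bpm, systolic_mmhg, hr_limit=140, bp_limit=180):
    """Return a behavioural prompt when predicted vitals cross the limits.
    The thresholds are illustrative placeholders, not clinical guidance."""
    if heart_rate_bpm > hr_limit and systolic_mmhg > bp_limit:
        return "Stop driving and have a rest!"
    if heart_rate_bpm > hr_limit or systolic_mmhg > bp_limit:
        return "Sit down and rest."
    return None
```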

Machine learning is not rule-based – it calculates the rules.


£21bn Cyber Crime cost contradicted

Filed under: Governance,Politics,Risk,Security — lenand @ 1:59 pm

Cyber Crime (eCrime) is global, but it needs local solutions.  It needs political will to engineer a major reduction in eCrime.  The problem is that politicians need credible evidence to recite in order to justify expenditure to their key constituents, e.g. citizens and small businesses.  Such evidence is reviewed in a report published by Cardiff University, “eCrime Reduction Partnership Mapping Study”.

One of the authors, Dr Michael Levi, launched the review in Parliament yesterday.  In wanting to avoid headline-catching assertions, he obliges you to read the 80 pages to extract any gems:

  • Estimated losses to UK business of £21 billion (Detica and Cabinet Office) do not “meet acceptable quality standards”;
  • The size and scale of eCrime is unknown, and good data is not collectible;
  • Criminals profit from eCrime, with tax and welfare fraud being the greatest source of income;
  • SMEs and individual victims do not get any justice response, nor do the Police plan to provide one.  Malware, phishing and illegal copying are not on the radar.

On this evidence it is difficult to imagine that many politicians will be inspired enough to lead on promoting local eCrime reduction partnerships formed from police, business, government and local authorities.  Self-help may be the way forward, but how do you inform people of the true risks and methods of avoidance?  It may be practical to initiate a scheme like Neighbourhood Watch, but sustaining success would depend on charismatic leadership – not on bureaucratic data collection and dissemination.  It is a fact that the offer of advice creates fear, and the perception that things are worse than the evidence suggests.


Central must go Local

Filed under: Governance,Local Government,Risk — lenand @ 7:22 pm

“With pressure to reduce spending, it is more important than ever that central government engages effectively with local government to draw on its expertise and capability in designing and delivering good quality, efficient public services.”

The Great E-mancipator led me to this quote from a National Audit Office report, “Central government’s communication and engagement with local government”.

They were not pulling punches when they wrote: “insufficient engagement with fire and rescue authorities was one factor that led to a major project to replace control rooms being cancelled in 2010.  The project did not have the support of the majority of the end-users essential to its success, which wasted a minimum of £469 million”.

DCLG are doing more now, but are other big central government departments actively improving communications and engagement?  What is the risk of similar wastage on Universal Credit, Individual Voter Registration and all things connected to e-Identities?  They should check their risk registers this week – and make sure they have enough LA engagement.




Cabinet Office eID follows Quarkside?

Filed under: Governance,Politics,Risk — lenand @ 12:40 pm

At the end of May, the Cabinet Office reported that Identity Assurance goes to Washington.  They seem to have taken heed of January’s Quarkside support of the OIX standard for eIDs.  This is the Open Identity Trust Framework (OITF) Model that does not require a central hub.  Perhaps the headline claim is a little strong, since there is no evidence that anybody there has read the blog!  Nevertheless, given that a central eID scheme has been ruled out by Government policy, it is a small step in the right direction.  Although a central scheme would be the most efficient to operate and implement, federation of eIDs is technically feasible.

Now for the next set of issues:

  • Can the Government use a current implementation of OIX that prevents identity fraud, such as duplicate identities or impersonation?
  • Will private sector identity providers, such as Google, provide eIDs at a price that makes commercial sense to themselves or citizens?
  • Will the scheme be ready in time for Universal Credit with sufficient trust in electronic credentials?

With a risk manager’s hat on, the answer to all of these is probably “No”, i.e. a greater than 50% chance of missing targets.  Failure of Quality, failure of Cost and failure of Time: the fundamental triumvirate of project management.  Will this be another ill-fated YAGIF (Yet Another Government IT Failure) – which is actually a Governance failure, not an ICT one?

The OIX framework does not obviously include the high levels of trust that public sector agencies will need to dispense £billions through on-line transactions.  Something akin to an Identity Trust Matrix may be necessary, tailored to the specific needs of service providers such as schools and the NHS.


7DIG: Identity Trust Matrix

Filed under: Innovation,Risk,Security — lenand @ 2:49 pm

Here is a suggestion for a visual chart that will help people to understand the risk they are taking in electronic transactions between customers and suppliers of goods or services.  How much should a supplier trust a potential customer – and vice versa?

[Figure: Identity Trust Matrix]

Suppliers look at the vertical axis and decide how much value is at risk if the customer avoids paying for what is delivered – whether by deliberate fraud or by accident.  In other words, what level of trust can be placed in customers’ electronic credentials?

Customers look at the horizontal axis to gauge whether the Trust Level of a credential is sufficient to meet the value of potential purchases.

Each transaction is given a Trust Index with a calculated value from 0 to 100.  These are split into three ranges of high, medium and low risk – from a supplier’s perspective.

  • Red – High Risk – Trust Index 10–100 – Unwise to proceed
  • Amber – Medium Risk – Trust Index 2.5–10 – Proceed with caution
  • Green – Low Risk – Trust Index <2.5 – Sufficient eID to proceed

The amber area of the matrix is where the reputation of each party should be considered in addition to the trust level.  What is the trading history of both customer and supplier?  eBay traders understand this principle.

Clearly, the matrix depends on agreeing the Trust Levels of credentials.  Quarkside has not developed a firm proposal, but here are some starting suggestions for four ranges (with the maximum low-risk value in brackets):

  • 1 (£10) – Username and password;
  • 2–3 (£100) – Additional personal secrets;
  • 4–6 (£1,000) – Documentary evidence of identity, such as banks’ “Know Your Customer” requirements.  Inclusion of credit agency data.  Face-to-face interviews by the enrolment agency may be needed.  Sufficient to obtain a passport;
  • 7–10 (£10,000) – Biometrics necessary to complete transactions.  The highest levels would have government security vetting and very strong protection against counterfeit credentials.
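The post does not define how the Trust Index is actually calculated, so the following is a purely hypothetical sketch.  It assumes the index scales the transaction value against the credential level's low-risk limit, calibrated so that a transaction exactly at the limit scores 2.5, the Green/Amber boundary:

```python
# Maximum low-risk transaction value (£) for each credential trust level,
# taken from the four suggested ranges above.
LOW_RISK_LIMIT = {1: 10, 2: 100, 3: 100,
                  4: 1_000, 5: 1_000, 6: 1_000,
                  7: 10_000, 8: 10_000, 9: 10_000, 10: 10_000}

def trust_index(value_gbp, trust_level):
    """Hypothetical formula: scale the value at risk so that a transaction at
    the level's low-risk limit scores exactly 2.5, the Green/Amber boundary."""
    return min(100.0, 2.5 * value_gbp / LOW_RISK_LIMIT[trust_level])

def rag(index):
    if index < 2.5:
        return "Green"   # sufficient eID to proceed
    if index <= 10:
        return "Amber"   # proceed with caution; consider reputation too
    return "Red"         # unwise to proceed
```

Under this assumption, a £20 purchase on a level-1 username-and-password credential lands in Amber, while the same £500 purchase moves from Red on level 1 to Green on a level-7 biometric credential.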

If this sparks any interest, suggestions to help definition of trust levels will be considered.  The background to the need for an Identity Trust Matrix will be the subject of future posts, following the 7DIG framework.


IER: Matching Mayhem

The Cabinet Office recently published a number of papers on the Introduction of the Electoral Registration and Administration Bill.  With barely a third of electors bothering to vote in local elections, what is the likely impact of any new processes on increasing the number of voters?  It does not feature in any of the documents.  Reducing fraud and increasing accuracy seems to be the driver – not democratic accountability.

  • It is a widely held view that the current system for registration is vulnerable to fraud, and there is a public perception that this allows electoral fraud to occur.
  • Individual Electoral Registration (IER) should therefore improve the accuracy of the register and allow people to register in different ways. 

The preferred option is to pre-populate the electoral register with electors who can be validated against public data sources in 2014/15 and then require the remaining electorate, future house movers, and new voters to register (and have their registration validated) from 2014/15 onwards.

As previously reported by Quarkside, the scary part is the data matching against public data sources by 400+ local authority Electoral Registration Officers (EROs).  “…confirmation is expected to pre-populate the register with 57% of the eligible electorate”, leaving 43% to be found by other means.  This assumption is derived from the 2012 Electoral Commission report on Data matching schemes.  Delving deeper, we find the startling admission that

  • The pilots did not follow processes, in terms of the IT systems and matching arrangements, which would be used for nationwide data matching. The evaluation cannot therefore draw conclusions about how the costs of these pilots would translate to a national roll-out.

Worse still, the matching data itself was of poor quality:

  • … the average match in the pilot areas using Department for Work and Pensions data was 66%.

And everybody knows that the highest costs come from fixing poor data quality.  Has this been factored into the Cabinet Office calculations?

  • The process, as tested in these pilots, was labour intensive with significant work required to analyse the data. Those involved felt that the level of work required would not be sustainable in the future.

The prognosis is not good – but we shall battle on regardless of all the warnings emanating from Local Government EROs and computer service departments.  We need a secure, consistent, governance framework that can be followed by all Councils – at a price the nation can afford.
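Why raw matching against sources like DWP data yields only 66% is easy to demonstrate.  Exact matching fails on trivial differences of case, spacing and date format unless records are normalised first.  The field names and rules below are illustrative, not the pilots' actual matching arrangements:

```python
def normalise(record):
    """Canonicalise fields before matching.  The field names and rules are
    illustrative; the pilots' actual matching arrangements were not published."""
    return (
        record["surname"].strip().upper(),
        record["forename"].strip().upper(),
        record["dob"].replace("/", "-"),
        record["postcode"].replace(" ", "").upper(),
    )

def match_rate(register, reference):
    """Share of register entries with an exact normalised match in the reference data."""
    reference_keys = {normalise(r) for r in reference}
    return sum(normalise(r) in reference_keys for r in register) / len(register)

register = [
    {"surname": "smith", "forename": "ann", "dob": "01/02/1940", "postcode": "ab1 2cd"},
    {"surname": "Jones", "forename": "Tom", "dob": "03-04-1935", "postcode": "EF3 4GH"},
    {"surname": "Brown", "forename": "Sue", "dob": "05-06-1938", "postcode": "IJ5 6KL"},
]
reference = [
    {"surname": "SMITH", "forename": "ANN", "dob": "01-02-1940", "postcode": "AB12CD"},
    {"surname": "JONES", "forename": "TOM", "dob": "03-04-1935", "postcode": "EF34GH"},
]
rate = match_rate(register, reference)  # two of the three register entries match
```

Even with normalisation, genuine spelling variants, maiden names and house moves defeat exact matching, which is why the residue is so labour-intensive to resolve.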


Right to be forgotten: Is it practical?

Filed under: Governance,Politics,Privacy,Process,Risk — lenand @ 8:08 am

The reform of the EU’s data protection framework has an explicit requirement that obliges online social networking services (and all other data controllers) to minimise the volume of users’ personal data that they collect and process.  Furthermore, data controllers must delete an individual’s personal data on request – assuming there is no other legitimate reason to retain it.

One wonders if this also applies to back-up and archive files.  The best organisations may be able to trawl through history, selectively remove personal records and produce an audit trail to prove it.  It may start messing up statistical reports – but that is a minor problem when most public sector organisations do not have information governance processes capable of tracing individuals – let alone removing all traces of them.
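The audit trail raises its own puzzle: proving a person was deleted without keeping their identifier, which would re-create the very data that was supposed to go.  One common approach, sketched here as a hypothetical illustration rather than any organisation's actual process, is to log a salted hash of the identifier instead:

```python
import hashlib
import time

def erase(records, person_id, audit_log):
    """Remove one individual's records and append a hashed token to the audit
    log as proof of erasure.  Hashing with a salt avoids re-creating the very
    personal data the trail is meant to prove was deleted.  Illustrative only."""
    token = hashlib.sha256(f"audit-salt:{person_id}".encode()).hexdigest()
    kept = [r for r in records if r["person_id"] != person_id]
    audit_log.append({"erased": token,
                      "records_removed": len(records) - len(kept),
                      "at": time.time()})
    return kept
```

The same token can later be recomputed from a deletion request to verify that the request was honoured, without the log itself ever holding the identifier.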

