Tuesday, December 18, 2007

Time for an IT Department?

Having digested the news that Ruth Kelly announced to the House yesterday, it seems clear that in Government and its agencies the principle of least privilege is dead (if, I should add, it was ever alive). You may be wondering how exactly this latest data loss occurred. How was it that 3 million personal data records were in Iowa? Who the hell is this company that had the data? Well, if you read this blog regularly you will know the answer, but for those that don't it goes like this.

The EU, and therefore the UK, has, at least on paper, some of the strictest data protection laws in the world. However, in a world of global trade this poses a problem for any company, or as it transpires Government, that wishes to purchase services requiring the transfer of data outside Fortress Europe.

In the case of the USA, the federal government acknowledged this and decided, working in 'partnership' with the EU, to draw up a 'framework' that would allow US companies to take data and 'satisfy' EU minimum requirements. The system that was set up is known as Safe Harbor, and it works in such a way that a US company wishing to handle data from the UK has to be assessed against the data protection legislation.

Who does the assessment? Well, according to the Safe Harbor website, it is the registering company itself. They download the relevant form, tick the boxes and get listed on the site. Once on the site, a UK-based organisation, public or private, can send the company data without worrying about the law any more. Safe Harbor also, if my own reading of the site is correct, acts as a form of indemnity protection for the US company should anything go wrong.

Pearson Vue (NCS Pearson) are such a company. They specialise in software for testing and assessment, and they are, as we learned yesterday, contracted to a sub-office of the DVLA and the Department for Transport. They are also, as I revealed last week, the company responsible for taking data on a daily basis from the Teacher Development Agency, and, unlike with the learner driver issue, the TDA does send date of birth details as well as other specific personal data to the US.

The real problem here though, as I said above, is that the principle of least privilege is seemingly dead in Government agencies. The principle dictates that only those who require access to data should have it; think of it like 'need to know'. This, however, poses difficult questions when it comes to software development.

After all, if you are developing and maintaining a system that is already in production you need to have some sort of production-like data set to test against. Performance testing, for example, is something that can only really be achieved against a proper data set, lest you go for linear extrapolation and take the risk of missing a potential clanger of a bug.

Thus, when the Government has a system it will, on occasion, need to have full data available for development purposes. But what do you do when your developer is not in the EU but in the US or some other country?

Effectively you find yourself in a situation where you have to breach least privilege and transport your data into an unknown state, both geographically and conceptually. It is at this point that the system breaks down, because it relies entirely on a paper procedure and promises that everything will be OK.

So what is the solution? Well, for a start, it is time for Government IT to be brought properly in-house. As with the need for a Whitehall-wide ministerial position for information security, there needs to be a ministerial position and departmental responsibility for IT across Government.

A proper technology ministry responsible for all IT and security. A department through which all other departments resource their IT systems, and which is based in the UK. The bottom line is this: under no circumstances should any personal data be sent out of the country by Government.

Now some people might say: what about private industry? Am I suggesting the same should be true for them? The answer is no, because the private sector is already heavily governed, and heavily punished when it makes serious mistakes with data security. Unlike the Government, the private sector is already heavily curtailed by the law.

The Government's proposal of jail time for anyone breaching data security is a misdirected solution as well. Putting a Band-Aid over a gaping gash will not stop the blood from leaking. It is the system that is flawed, and heavily punishing those working within a flawed system will not stop the problems occurring.

As long as we have a disconnected system of IT development and systems in Government, there will always be someone else to blame. It's time for the Government to realise that the buck must stop with Government when Government systems fail. If that means removing responsibility for IT from the many and giving it to the few, so that there is a place for the buck to stop, then so be it.

19 comments:

Shirley said...

yes but no but...
Given the insulting state of SROs (see NAO report) and the complete lack of understanding of IT shown by the mania for big systems...
Perhaps a few years with card index systems till they can demonstrate at least CMM Level 2?

John Sandell said...

The plot thickens. The Telegraph reports that "Pearson won a seven-year contract to administer and process the test for the Driving Standards Agency in November 2003... Pearson Driving Assessments Limited is a subsidiary of Pearson VUE"

But Pearson's website says "Pearson VUE is the trading name of Pearson Driving Assessments Limited, a company registered in England and Wales with registration number 04904325, whose registered office is located at Hellaby Business Park, Hellaby Lane, Hellaby, Rotherham, South Yorkshire S66 8HN. VAT No GB 830 0666 55"

So the Government gave the data to a UK company, which then sent it to the US.

dreamingspire said...

Not until Crown Immunity is lifted should we let govt run massive admin systems.

Dark_Heretic said...

How long is it before everyone realises that Government and massive data control / IT should NEVER mix?

Kafka said...

What is particularly galling about all this is the way they try & lord it over us commercial IT hacks with all their government-inspired systems & processes. SSADM, PRINCE, ITIL etc. All that guff & they still couldn't organise the proverbial piss-up in a brewery.

From the viewpoint of someone working in IT compliance, dealing with SOX & PCI just for starters, they're a bunch of amateurs, from the bottom to the very top.

James Barlow said...

I'm not convinced the answer to the problem is more civil servants, nor a further centralisation of procurement.

Ed said...

Is "more government" ever the solution? Look at how well the government runs our schools'n'hospitals.

Newmania said...

That's an interesting idea Dizzy. So many blunders in general have been IT-related, but it did not occur to me to look specifically at the way IT is handled in Government.


You know, I think this is about the only post I have read where I can see a really useful policy coming from it.

kAFKA said...

This comes under the heading of adding insult to injury ....

But good for a laugh anyway -- check out the programme -- even more fatuous than most one-day conferences!!

www.govnet.co.uk/govit

From today's CW.

Mrs Smallprint said...

Allowing data offsite with sub-contractors, whether in the UK or elsewhere, is a disaster waiting to happen. Secure in-house systems and testing are the only answer. I am astounded that anyone could think otherwise. Yet they still expect us to believe our data will be safe with them when they start making us pay for ID cards, hah!

Deadbeat Dad said...

This looks like a perfectly reasonable suggestion, Dizzy. The only problem is that the buck never does seem to stop (in any area of Government), does it?

And when, as has been demonstrated time and again with major IT projects, the Government can't even manage a contract properly, what hope is there of maintaining a competent in-house resource?

On the subject of data and development: it ought to be easy enough to supply realistic datasets to developers without compromising security, by transposing fields and scrambling numeric data for example.
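That sort of transposing and scrambling can be sketched in a few lines. This is a Python sketch only; the field names and record shape are invented for illustration, not taken from any real system:

```python
import random

def scramble(records, seed=0):
    """Anonymise a list of record dicts for test use: transpose the
    name and address fields across records, and randomise the digits
    of each identifier. Field names here are illustrative only."""
    rng = random.Random(seed)
    out = [dict(r) for r in records]
    # Transpose fields: shuffle names and addresses independently,
    # breaking the link between a person and the rest of their data.
    for field in ("name", "address"):
        values = [r[field] for r in out]
        rng.shuffle(values)
        for r, v in zip(out, values):
            r[field] = v
    # Scramble numeric data: replace every digit with a random one,
    # keeping length and format so the test data stays realistic.
    for r in out:
        r["id"] = "".join(
            str(rng.randrange(10)) if ch.isdigit() else ch
            for ch in r["id"]
        )
    return out
```

Shuffling fields across records keeps the volume and shape of the data realistic for development while breaking the link to any real person, though a real implementation would still need care over rare values that remain identifying on their own.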

Morus said...

No - Dizzy is entirely right about this. A Sec of State-led Department for Governmental IT and Information Security needs to be established as soon as possible.

The centralised procurement aspect already occurs through the OGC, but without the necessary expertise within the governance structures of major IT projects. A Dept for Gov IT would solve this lack of capability.

Also, the reason there is no accountability in government itself when IT projects fail is that they are inevitably cross-department, or only part of a department: the gvt says "why should the Health Secretary resign over an IT issue?" or "it is not the major area of her job" or "three depts were involved - we can't lose three Secs of State, and it would be unfair to penalise one only", and she escapes with four apologies to the HoC.

A centralised IT department, overseeing all gvt IT projects, with a Sec of State in charge and accountable, could be resourced (both with respect to staff and money) through recharges to other government departments. The synergies would actually save money, and without blatantly raising existing departmental budgets there is no reason to imagine extra cost or headcount. This is a stellar idea that needs to be implemented. I would strongly advise the Tories to put this in their manifesto.

Start a campaign, Dizzy

Regards

MORUS

mitch said...

I can almost hear the government announcement. It goes like this:
"A new ministry of IT will be created to bring all IT under one roof, using the best talents from the civil service and the private sector, answerable to the PM".
We are truly screwed!!

Tom FD said...

If they do bring IT development in-house then I hope the contractors' needs will be considered in development. For a few weeks last year I worked in an office for a council house maintenance company where they were manually entering the exact same data into two systems concurrently: the system provided by the council, and the system belonging to the contractor. This created so much work that this previously fully able department of two had to take on two temps to clear a huge backlog - of which I was one. As much as I appreciated the opportunity to work (although doing nothing but enter the same data into two systems every day didn't do my already-fractured mind any favours), it was overshadowed by the guilty feeling that my job really shouldn't have existed in the first place...

Oxymoron said...

I remember going to a recruitment evening a few years ago by a leading management consultancy recruiting IT staff to work on the NHS records project and I actually walked out within a few minutes because they clearly had no idea what they actually wanted...

dreamingspire said...

OGC has only been about process, not about content. Last year it made a few forays into project content examination, of which the only result that I have seen is the re-jig of the ID Card project (now the National Identity Scheme, using existing data where possible rather than registering people from scratch).

Anonymous said...

Many years ago I was head of a joint department and inherited a system under which one side used WordPerfect and the other Microsoft Office. Naturally they could not communicate, and it was double work all round. Crass...

Anonymous said...

I know a guy in IT who works as an independent contractor (currently on a government project). It is clear from what he tells me that the civil service don't have a clue, are incapable of assessing tenders, and are getting ripped off by incompetent international IT companies run by over-promoted graduates who don't understand what they are doing.

ITIL MAN said...

Dizzy:

After all, if you are developing and maintaining a system that is already in production you need to have some sort of production-like data set to test against. Performance testing, for example, is something that can only really be achieved against a proper data set, lest you go for linear extrapolation and take the risk of missing a potential clanger of a bug.

As someone who led the design and development team for a core IT system in a Government department back in the '80s, this is a no-brainer. You simply jumble the core identifiers within the data before the data is sent outside the secure area.

So Fred Bloggs becomes Daphne Bloggs and no longer lives in Manchester, he lives in London, and his ID number isn't ABC123, it's MCD716. All it needs is one simple batch programme with a few simple algorithms to jumble the data. Simple and low cost. What's the problem?
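A sketch of one such simple algorithm, in Python, using a keyed hash so the jumbling is repeatable but not reversible. The key, function name and ID format are invented for illustration:

```python
import hashlib
import hmac

# Illustrative key: in practice it would never leave the secure area.
SECRET_KEY = b"never-leaves-the-building"

def jumble_id(real_id: str) -> str:
    """Map a real identifier to a stable fake one in the same
    letters-then-digits shape (e.g. ABC123 -> something like MCD716).
    A keyed hash means the same input always gives the same output,
    so links between records survive the jumbling, but the mapping
    cannot be reversed without the key."""
    digest = hmac.new(SECRET_KEY, real_id.encode(), hashlib.sha256).hexdigest()
    # Three letters from the first three byte-pairs of the digest...
    letters = "".join(
        chr(ord("A") + int(digest[2 * i:2 * i + 2], 16) % 26) for i in range(3)
    )
    # ...and three digits from the next three hex characters.
    digits = "".join(str(int(digest[6 + i], 16) % 10) for i in range(3))
    return letters + digits
```

Run over every identifier column in a nightly batch, something of this shape would give development databases that behave like production without containing anyone real.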

As I say, I've done this and it works. We used to jumble data for test and development databases all the time.

The reason it is not done is because the powers that be are penny-pinching at the expense of security, or lazy IT personnel can't be bothered. That's all.

There is no excuse for the sort of incompetence that we are currently seeing.

You want to stop it? Start sacking people for gross negligence at the top, and start implementing and enforcing penalty clauses in third parties' contracts.

Cheap and effective.