Infosecurity – the GCHQ way

Balancing privacy and security requires highly developed information security policies and, of the UK intelligence agencies, GCHQ has taken the lead

In a perfect world there would be no need for security and intelligence agencies. But in an imperfect world, where such agencies are required, arguably the best way to balance security and privacy is to minimise their ability to abuse their powers without stopping them from doing their jobs.

Doing so requires agencies, regulators and politicians to create highly developed information security policies and practices. The UK has a relatively high tolerance of state surveillance, partly based on the agencies’ Second World War reputations, but also on the public’s assumption that they use their present-day powers properly. Maintaining this is in the interest of the agencies as well as the public, as support for legislation such as the Investigatory Powers Bill would be severely threatened by evidence of corruption or sloppy practice.

Of the three agencies, GCHQ has to take the lead on information security. It has the most staff – 5,564 as of March 2015, compared with 4,047 at the security service MI5 and 2,479 at the secret intelligence service MI6. It probably spends the biggest part of the £2.6bn intelligence account, although the budgets for each agency are secret. It certainly has the greatest powers to gather information, being the only one of the three agencies that can wield all four kinds of ‘bulk power’, the controversial surveillance abilities that gather data on large numbers of mostly innocent people.

In his recent report on such powers, the independent reviewer of terrorism legislation David Anderson wrote: “Bulk powers, by definition, involve potential access by the state to the data of large numbers of people whom there is not the slightest reason to suspect of threatening national security or engaging in serious crime. Any abuse of those powers could thus have particularly wide-ranging effects on the innocent.”

Safest pair of hands

Also, the other agencies outsource some of their work in this area to GCHQ. MI6 relies on GCHQ’s bulk interception of communications to provide targeted information, while both MI5 and MI6 analysts use GCHQ’s system for bulk personal data on travel.

Fortunately, there is some evidence that GCHQ is the safest pair of hands among the agencies on information security. In his most recent annual report, the intelligence services commissioner Sir Mark Waller said that GCHQ had reported three errors during 2015, compared with 11 at MI6 and 67 at MI5. In MI5’s defence, Waller noted that it obtains significantly more warrants than the other agencies, “and their error rate is in fact low as a proportion of authorisations”.

Other external checks have called MI5’s information security into question, including its staff contravening rules for accessing communications data on 210 occasions over five years. David Anderson’s review said that GCHQ had reported no errors over bulk communications data during the same period, while on bulk personal datasets the investigatory powers tribunal has heard that between June 2014 and February 2016 there were six “instances of non-compliance” at MI5, with two members of staff disciplined, and five instances at MI6, with three staff disciplined. There were just two such incidents at GCHQ, neither involving individual non-compliance.

UK IT security role

GCHQ has another advantage: it is responsible for strengthening the information security of UK organisations through its CESG arm. It has started publishing advice, such as guidance on passwords, and is expanding this work through the establishment of a new National Cyber Security Centre.

Apart from this, it has previously been difficult for organisations to learn from GCHQ’s own information security experience. But documents from Privacy International’s investigatory powers tribunal case against the intelligence and security agencies, as well as David Anderson’s review, have included details on how GCHQ secures its own information.

Much of its work focuses on individual users. All employees go through a three-month vetting as part of recruitment, but the tribunal documents show this is just the start of the process. This people-centric approach is outlined in Boiling Frogs, a research paper released by GCHQ in May 2016. Writers Russ B, Mike M and Steve H argue against a model of security that prevents people doing their jobs – not least because it may lead to a growth in shadow IT workarounds – preferring one that offers permission and enables staff. “People-centric security is a strategic approach to information security that emphasises individual accountability and trust, and that de-emphasises restrictive, preventative security controls,” they wrote.

In a witness statement to the investigatory powers tribunal, the unnamed deputy director for mission policy at GCHQ revealed some of the ways in which the agency attempts to put this model into practice:

  • A business case for access: Unless they have an up-to-date qualifying skill – generally gained only by using the system in question – GCHQ staff have to prepare a business case to access its bulk telephony and internet data tool. This includes demonstrating a requirement for the data and confirming there are colleagues who will support the person in using the system. Applications have to be approved by a local manager and the system’s senior user community. If approved, the new user has to read a “defensive brief” for the system which covers the proportionality of the tool’s use and the policy requirements, along with advice and contacts for support. Only then can the user apply for an account.
  • Compulsory training: An account does not lead directly to access. New users have to undergo online training courses and tests, including a legal overview course and, for staff who have access to operational data, an advanced mission legalities course. Both must be re-taken every two years to retain access. The agency also runs an hour-long e-learning training course in using bulk personal datasets with an additional module for travel data, both of which have to be completed before access is granted to that system.
  • Multiple levels of access: For GCHQ’s bulk telephony and internet data tool there are three levels of access: level 1, 1+ and 2, with the last allowing access to communications content. Some bulk personal datasets are limited to a handful of people, at least initially: in June 2013, one financial dataset acquired from MI6 was accessible only by two people, while in April 2015 a dataset acquired for a time-limited trial was used only by about 10 analysts.
  • A written reason for access: Use of systems is tracked, but, along with MI5 and MI6, GCHQ also requires staff to confirm that they require access each time, selecting one of three legal justifications: national security, economic well-being or to support the prevention or detection of serious crime. GCHQ aims to stop this from being purely a menu choice exercise by requiring a free text justification for the search.
  • Use it or lose it: As well as requiring users to jump through hoops to get access, staff with access to the bulk telephony and internet data tool have to use it at least once every six weeks, “or it will expire and a new application will be needed”.

In addition to person-centric security, GCHQ uses a range of auditing techniques, both internal and external. These include measures that hold data and datasets for limited periods, with the latter needing reauthorisation if they are to be kept:
  • Limited retention periods: GCHQ holds bulk communications data for a year, as does MI5, after which the information is deleted automatically. On bulk personal datasets, since 2010 an internal GCHQ panel has reviewed their use twice a year. The panel authorises retention for a limited period and owners must request extensions, with the usefulness of each dataset tracked through technical data sheets, which analysts fill in to state their sources while drafting reports. If the owner does not request retention, or the panel does not grant it, the owner has to provide evidence that the dataset has been deleted.
  • External auditing: As well as internal reviews, since December 2010 the intelligence services commissioner has carried out twice-yearly checks on GCHQ’s use of bulk personal datasets, on the orders of then-prime minister David Cameron. Sir Mark Waller, in his October 2015 inspection, spotted incomplete paperwork for retaining one bulk personal dataset, where the original authorising document was unsigned. As a result, GCHQ’s mission policy department checked the paperwork for all such datasets.
  • Admit errors: The agency has raised its own errors with the commissioner, such as when a dataset containing the names and photos of several thousand alleged intelligence agency officers was released online, which GCHQ downloaded to check for its own staff. The file was then deleted, so the agency helped MI5, MI6 and an overseas Five-Eyes partner agency (from the US, Canada, Australia or New Zealand) check against their own employee lists. Permission had not been sought for external sharing, but Waller decided it was justifiable as it aimed to defend employees.
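
The people-centric controls above – access that lapses through disuse, and a per-query legal justification with mandatory free text – can be sketched in a few lines of code. This is an illustrative model only: the class and field names are invented, not a description of GCHQ’s actual systems.

```python
from datetime import datetime, timedelta

# Illustrative model of the access rules described above: accounts on the
# bulk-data tool lapse after six weeks without use, and every query must
# cite one of three legal bases plus a free-text justification.
LEGAL_BASES = {"national security", "economic well-being", "serious crime"}
EXPIRY = timedelta(weeks=6)

class BulkDataAccount:
    def __init__(self, user, granted):
        self.user = user
        self.last_used = granted  # access lapses six weeks after last use

    def is_active(self, now):
        return now - self.last_used < EXPIRY

    def query(self, now, legal_basis, free_text):
        if not self.is_active(now):
            raise PermissionError("access expired; a new application is needed")
        if legal_basis not in LEGAL_BASES:
            raise ValueError("must cite one of the three legal bases")
        if not free_text.strip():
            raise ValueError("a free-text reason is required, not just a menu choice")
        self.last_used = now  # each audited query counts as use
        return True
```

The point of the sketch is that the audit trail and the access control are the same mechanism: a query either carries a recorded justification or it does not run.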

There is always more that GCHQ and the other agencies could do. In his report on bulk powers, David Anderson recommended that the government appoints a technical advisory panel, partly to help the agencies “reduce the privacy footprint of their activities”. But on the basis of the recently published documents, other organisations could benefit from considering GCHQ’s practices on information security.

First published by, 29 September 2016

Dev-Olympics Rio 2016 medal table: East of England triumphs

Team GB’s medal-winners from Rio 2016 come from all over the country and beyond, as this interactive map of those winning individual medals shows. (Click on a circle for data on each medal-winner.)

Continue reading “Dev-Olympics Rio 2016 medal table: East of England triumphs”

Unhealthy valleys: Wales’ problem with ill-health

Greater Glasgow gets a lot of coverage for its poor health through having the lowest average lifespans in the UK. Although residents of the Welsh Valleys – the post-industrial areas north of Cardiff – don’t have such short lives, they are the most likely to be living with poor health. The three UK council areas where more than 10% of adults say they are in bad or very bad health are in south Wales: Neath Port Talbot (10.5%), Blaenau Gwent (10.7%) and Merthyr Tydfil (11.1%). Continue reading “Unhealthy valleys: Wales’ problem with ill-health”

How software drives safety in aerospace, healthcare, oil and gas

Safety-focused industries have developed safety-critical software and services from which other sectors can learn

Software is used to maintain safety in many high-risk fields, including aerospace, railways, automotive, nuclear power and healthcare.

“If you look at many sectors, at least 85% of the functions seen by users, whether a car driver or an aircraft pilot, are, to some extent, enabled by software,” says John McDermid, professor of software engineering at the University of York.

This is particularly true of aerospace, where the use of software in safety has been growing slowly over a long period. Until 10-15 years ago, aircraft depended much more heavily on software than products in other engineering domains did, but those areas have been catching up rapidly, says McDermid.

Particularly in automotive, medical and consumer technologies, “it has been increasing at an incredible rate”, he adds.

This has left different industries regulating safety-critical software in different ways, says McDermid, with aerospace and nuclear power using a regulator to assess software and decide whether it complies with standards. “In some other sectors, for example automotive, in effect there is self-regulation,” he adds.

McDermid says there is a tension over whether or not safety-critical software should be formally approved. Regulatory checks are expensive and often quite slow, but aerospace software has an excellent record. “It’s a question of what balance of risk you take,” he says.

The US Federal Aviation Administration is currently looking at the cost of assuring aviation software, to see if this can be reduced, says McDermid. “As we develop more autonomy in road vehicles, I think we’ll find that the standards in automotive get more stringent,” he adds.

Medical devices have rapidly made more use of software, says McDermid, and although the US Food and Drug Administration has some regulatory power, more rigour may be required, given numerous examples of failures.

“I think that would be the one obvious area where more care and attention is needed,” he says. “We’re going from individual devices, such as the pacemaker, to connecting lots of devices in hospitals and also trying to support people in their homes more. We need to do much more to understand the interactions of these systems.”

One way in which healthcare providers are using software for safety is by analysing the data staff already collect. Some hospitals run an algorithm on basic patient observations to allocate risk levels, with people in more danger being checked more often by more senior staff as a result.

Digitising the National Early Warning Score

Many NHS hospitals use the Royal College of Physicians’ National Early Warning Score for this, calculated from measures such as temperature, blood pressure and level of consciousness.

The calculation can be worked out by hand, but this is time-consuming and prone to error. In research published in the Elsevier journal Resuscitation, Portsmouth Hospitals NHS Trust and the universities of Portsmouth and Bournemouth found that moving the calculation from paper to electronic entry via iPods cut the average time from 67 to 43 seconds, while the resulting incorrect clinical actions dropped from 14% to 5%.
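
The calculation being moved off paper is simple band-based scoring. As a sketch, here is a partial version using the published NEWS (2012) bands for three of the measures mentioned above – temperature, systolic blood pressure and level of consciousness. The full score also includes respiratory rate, oxygen saturation and pulse rate; this is an illustration of the algorithm, not clinical software.

```python
# Partial National Early Warning Score, using a subset of the Royal
# College of Physicians' 2012 bands. Each vital sign contributes 0-3
# points; the points are summed into a single risk score.
def news_subscore(temp_c, systolic_bp, conscious):
    """Partial NEWS total from temperature (deg C), systolic blood
    pressure (mmHg) and consciousness ('A' for alert, else V/P/U)."""
    score = 0
    # Temperature bands
    if temp_c <= 35.0:
        score += 3
    elif temp_c <= 36.0:
        score += 1
    elif temp_c <= 38.0:
        score += 0
    elif temp_c <= 39.0:
        score += 1
    else:
        score += 2
    # Systolic blood pressure bands
    if systolic_bp <= 90:
        score += 3
    elif systolic_bp <= 100:
        score += 2
    elif systolic_bp <= 110:
        score += 1
    elif systolic_bp <= 219:
        score += 0
    else:
        score += 3
    # Level of consciousness: anything other than Alert scores 3
    if conscious != "A":
        score += 3
    return score
```

Worked by hand across six or seven vital signs, it is easy to see how transcription and arithmetic errors creep in; in software, the bands are applied the same way every time.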

The score, which includes recommended action, also makes it easier for nurses to provide a measure of severity to colleagues, rather than simply saying they are worried about a patient. “By having a score they can refer to, it makes that communication much quicker and slicker – there’s a common language,” says Paul Schmidt, a Portsmouth Hospitals consultant in acute medicine.

Portsmouth Hospitals began developing the VitalPAC software it uses for this in 2005, and deployed it across the hospital in 2009. It was originally intended to capture nurses’ observations for research on patient deterioration, but the trust realised it could use the data to help slow or prevent such deterioration.

Electronic records help reduce outbreaks of norovirus

One example has been to greatly reduce outbreaks of the norovirus winter vomiting bug. Portsmouth Hospitals cut the number of cases by 91% between 2009-10 and 2013-14, far more than the 28% drop recorded across England, according to a 2015 paper for BMJ Quality and Safety.

Through functionality developed with The Learning Clinic to record nausea and vomiting electronically, the trust was able to rapidly notify its infection prevention and control team, which could move patients into isolation, increase hygiene measures and order intensive cleaning.

“Our responses are much more targeted and much quicker,” says Schmidt. “The consequence is to virtually eradicate ward closures.”

The trust has also used the VitalPAC system to reduce mortality rates from cardiac arrests suffered by patients while in hospital and is using it to compare the frequency of patient observations on different wards.

Schmidt says standard electronic patient record systems in hospitals simply digitise what healthcare professionals used to write on paper. “That is not necessarily transformative,” he says. “If you look at how industry uses this technology, it’s for process control.

“People aren’t widgets, but they are subject to harm,” he adds, with 30% suffering some kind of damage during a hospital admission. “What we’re bringing to healthcare is the kind of technology that industry and airlines use.”

Safety software in oil and gas

Some of these industries are expanding their use of software for safety by extending it to staff training. In oil and gas extraction, certificates (and a passport) are used to check people boarding helicopters to rigs on the UK continental shelf, with similar systems operating in other countries.

The certificates show that an individual has up-to-date training on everything from escaping from a crashed helicopter underwater to medical checks.

Some certificates are permanent, while others last between six months and three years. Handling these certificates is a big administrative task – but an essential one.

“If you arrive at the gate to be mobilised without the correct certificate, you won’t be allowed to travel,” says Kevin Coll, managing director of Solutions Aberdeen, an oil industry-focused software and consultancy firm.

Coll, whose firm has provided IT services to oil and gas firms for 25 years, noticed many clients had tried to build systems to manage these certificates, or managed them through a set of spreadsheets. This led Solutions Aberdeen to develop Onboard, a web-based software suite that records certification and levels of competency. It can also track staff and contractor contact details, their location and availability, and integrate with other software when customers want to retain existing applications.
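
The core gate check such a system performs can be sketched as follows. The class and function names here are hypothetical, not Onboard’s actual data model: the logic is simply that a worker may only be mobilised if every certificate the rig requires is either permanent or still in date.

```python
from datetime import date, timedelta

# Hypothetical sketch of an offshore mobilisation gate check: some
# certificates are permanent, others expire after a fixed validity period.
class Certificate:
    def __init__(self, name, issued, valid_for=None):
        self.name = name
        self.issued = issued
        self.valid_for = valid_for  # None means the certificate is permanent

    def in_date(self, on_day):
        if self.valid_for is None:
            return True
        return on_day <= self.issued + self.valid_for

def may_travel(certs, required, on_day):
    """True only if the worker holds an in-date certificate for every
    requirement, e.g. helicopter underwater escape training."""
    held = {c.name: c for c in certs}
    return all(name in held and held[name].in_date(on_day) for name in required)
```

A missing or lapsed certificate fails the whole check, which matches the rule Coll describes: without the correct paperwork, you do not board the helicopter.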

Features particularly useful in the North Sea include nicknames, so that someone commonly known as Jock can be found in the system under that name, as well as under his real name; and the ability to hold different contact details for emergency and routine queries. The system covers mobilisations on 59 of the 146 manned platforms in the North Sea.

Norwegian oilfield services firm Archer has 1,400 people on the system, gathering data previously held on separate systems and spreadsheets. “No longer did we need to lose time tracking down data – it was right there and we could access it when needed,” says Archer operations manager Mark Cowieson, quoted in a University of Aberdeen Business School case study.

Getting certification wrong can have a major impact on safety. If someone is blocked from travelling to a rig, that may leave it short of staff. And if they manage to get there without the right training – including on something specific to that rig – the consequences can be dire. Different rigs employ identical equipment, such as pipeline valves, but with different settings – one may be set to 150psi while another is at 1,500psi.

“If you go out on a rig using a piece of equipment that could blow up, you have to be trained and competent in its use for everyone’s safety there,” says Solutions Aberdeen’s Coll.

Money-saving opportunities

Onboard can also be used to avoid wasting money on safety issues. “An operator might insist that all of a group of personnel are trained on forklifting, but having reporting visibility would allow you to push back on the operators and query why it is needed when only a select few actually ever use them,” says Coll. “We could then save £250,000 a year by not training people who are never going to be allowed to operate the forklift anyway.”

The spare skills capacity adds little to the safety of a rig, because such skills require regular practice as well as training, he adds.

Similar software could be used in other regulated industries, such as in the health service to fill gaps in rotas and show the availability of locums, says Coll. “There is no high-level nationwide system that we are aware of that says who is available and what skills they have,” he adds. “That used to be the case in the North Sea.”

There is also potential in the nuclear industry and for providers of public-service vehicles, he points out.

Danger of over-reliance on software

But is there a danger of over-reliance on software to avoid disasters?

“There are a lot of things that computers and software are better at doing than human beings. They are much better at being consistent and reliable, and so on. But human beings are very much better at dealing with unanticipated things,” says the University of York’s McDermid.

However, this leads to the paradox of automation: evidence has mounted over a number of years that, as aircraft are increasingly run by software, pilots are not dealing with problems as well as before. “When something goes wrong, they don’t necessarily have the seat-of-the-pants flying skills or the understanding of the aircraft that they used to,” adds McDermid.

Software can also provide part of the solution, by improving training for disasters through simulations. This may not quite be real world, but there are clear advantages to pilots and others gaining experience of unsafe events without risking any lives.

First published by, 19 January 2016

Genomics and big data; and I ♥ Milton Keynes

I attended a recent conference run by the Sanger Institute and supplier DDN on genomics and big data, which involved a visit to the Sanger’s famous laboratories and data centre. Genomics could produce between two and 40 exabytes of data annually by 2025; astronomy, which churns out data, is expected to produce just one exabyte. A decent-sized PC hard-drive holds a terabyte of data, roughly a million megabytes; an exabyte is roughly a million terabytes. A lot. The resulting article for is here.
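
The unit arithmetic above can be sanity-checked in a few lines (using decimal units, where a terabyte is 10^12 bytes and an exabyte is 10^18):

```python
# Storage scales mentioned above, in decimal units.
MB = 10**6   # megabyte, in bytes
TB = 10**12  # terabyte: a decent-sized PC hard drive
EB = 10**18  # exabyte

assert TB // MB == 1_000_000  # a terabyte is a million megabytes
assert EB // TB == 1_000_000  # an exabyte is a million terabytes

# Genomics at the top estimate of 40 EB a year would fill
# 40 million terabyte drives annually.
drives_per_year = 40 * EB // TB
```

So at the upper estimate, genomics alone would fill 40 million terabyte drives a year.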

The massive scale of genomics data is forcing those providing its IT to rediscover old efficiency techniques. It is also seeing institutions working to upgrade their facilities. This includes University of Oxford, which is working on a new Big Data Institute near the city’s hospitals in Headington. Continue reading “Genomics and big data; and I ♥ Milton Keynes”