‘Too expensive’ to delete police mugshots of innocent people

April 19th, 2018

The painstaking task of weeding out potentially hundreds of thousands of custody photographs of innocent people is likely to be deemed too expensive and onerous by cash-strapped constabularies, a minister has warned.

Innocent people whose mugshots are kept on file by the police have the right to request that such images be deleted in their entirety – a right that only a handful of people have exercised so far.

However, the way in which these images are stored means that deleting them is a complicated, clunky process that cannot be automated without a major upgrade to existing police computer systems.

Baroness Williams of Trafford, the Minister for Countering Extremism, told MPs on the Science and Technology Committee that there were “technical challenges” to automating the deletion of facial images, making deletion a “complex” manual exercise.

She added that “any weeding exercise will have significant costs and be difficult to justify given the off-setting reductions forces would be required to find in order to fund it”.

In total around 21 million facial images are held on the Police National Database (PND) and on local force systems. The arguably unlawful retention of at least some of these pictures has been a longstanding bugbear for pro-privacy groups such as Big Brother Watch.

Some senior law enforcement insiders have privately claimed that the retention of images of unconvicted people could help constabularies to fight crime in the long run. There have even been rumours that artificial intelligence-type tools are currently being developed to make links between police mugshots and visual clues buried in illegal material online.

In a letter to the chairman of the Science and Technology Committee, Baroness Williams explained that custody images are first stored on the policing system of the arresting force, of which there are 43 in England and Wales, and are then copied from these systems onto the PND.

“These records are structured around a person’s contacts with the police, rather than conviction status, and so there may be multiple images across several systems relating to a particular individual,” she wrote. “If a record is deleted from a local custody system it will also be deleted from the PND.

“However, deletion from the PND will not lead to an automatic deletion from the local police system, as there is no link back from the PND to local systems. In order to delete custody images automatically, it would be necessary to upgrade all 43 local systems and the PND.”

Committee chairman Norman Lamb called for an urgent review.

“Innocent people should rightly expect that images gathered of them in relation to a crime will be removed if they are not convicted,” he said. “This is increasingly important as police forces step up the use of facial recognition at high profile events, including the Notting Hill Carnival for the past two years.”

The row came as the Home Office cited compliance with data protection laws as the reason for the controversial destruction of landing cards that could have helped Windrush arrivals prove their right to stay in the UK. The issue is at the centre of a scandal over apparent threats to deport some children of immigrants from the West Indies who came to Britain immediately after the Second World War.

In 2015 it emerged that police were struggling to scrupulously review all material held on a counterterrorism database in order to comply with new privacy legislation.

At the time Clive Reedman, a biometrics consultant and expert in criminal justice, said: “To remove something from a criminal database of this size, whatever type of profile it is, is difficult if not impossible.

“Deleting data – things like biometric profiles and DNA profiles – is a very difficult thing to do. It’s actually also quite a nonsensical thing to do in many cases.”

A year later it was revealed that hundreds of DNA or fingerprint records relating to persons of interest to counter-terrorist police had either mistakenly been deleted or would need to be destroyed owing to oversights in applying for authorisation to legally retain such data.

Japanese carbon capture plant claims higher efficiency and lower cost

April 19th, 2018

The carbon capture and storage (CCS) test site, located near Tomakomai port on Hokkaido, Japan's northernmost main island, is reportedly cutting energy costs by as much as two-thirds compared with other projects, according to Reuters.

These claims have yet to be commercially verified but, if true, could reignite interest in CCS, which has waned in recent years because of the technology's high cost.

The UK government was accused of wasting approximately £168m of taxpayer money last year after two initial attempts to develop CCS technology were abandoned: funding was pulled and the projects never got off the ground. The government was unwilling to spend the billions it would have taken to keep the projects going.

While the $300m (£211m) site at Tomakomai port represents just a small portion of the $20bn invested in CCS worldwide, it has potential for easing CO2 emissions from industries such as gas processing and cement and chemical production.

Pipe for transporting CO2 pictured at the CCS test site in Tomakomai

Most investment in CCS has focused on capturing carbon from power plants fired by coal and other fossil fuels – the largest source of CO2 emissions – but there have been big setbacks and some projects have been cancelled.

“Tomakomai is an exciting development. Progress on CCS has been far too slow and projects like that are very encouraging,” said Graham Winkelman, climate change lead at BHP.

The focus of CCS is now on industrial applications such as the one being tested at the Tomakomai site, he said.

BHP is the world’s largest exporter of coal for steel-making, a fuel and industry often marked as big sources of climate-warming emissions.

A history of failed projects has plagued the development of CCS. British utility Drax won a €300m (£260m) EU grant to develop the technology in 2014.

But just a year later the government withdrew CCS funding support, reversing pledges made in the Conservative Party's 2015 election manifesto, and the facility closed.

Southern Co’s Kemper power station in the United States was to use CCS in an attempt to get clean power from coal, but was abandoned after billions of dollars of investment.

Chevron Corp also delayed the world’s largest CO2 injection operation in Australia, after spending A$2.5bn (£1.4bn) on the project at its Gorgon liquefied natural gas plant, itself beset by many problems.

CCS involves separating CO2 from other materials and gases and injecting it underground to prevent it from escaping into the atmosphere or using it to create pressure to push oil to the surface as wells deplete.

At Tomakomai, by-product gas is piped from a nearby Idemitsu Kosan refinery and CO2 pulled out as it passes through an amine solution. By using the remaining gases to generate power and recycling heat, energy costs are cut to between half and a third of a typical extraction plant, the company said.

In February a report claimed that CCS would only have “limited realistic potential” to stymie climate change and that deep and rapid cuts to the amount of greenhouse gases being put into the atmosphere would need to be made to stick to pledges in the 2015 Paris Agreement. 

Gas rocks! Mineralising carbon dioxide into rock

April 19th, 2018

Late in 2016, an international consortium of scientists behind a project called CarbFix announced that it had successfully begun the industrial-scale capture and mineralisation of carbon dioxide (CO2) from the emissions of Hellisheiði, Iceland’s largest geothermal power station. They had turned an unwanted and harmful greenhouse gas into a benign and solid mineral contained within bedrock, and they had done it on a timescale previously thought impossible. CarbFix had turned gas into rock in just two years.

Carbon capture and storage (CCS) is not a new idea. The first large-scale CCS facility started operations in the USA in 1972. Today there are 17 CCS projects operating in various countries around the world, from Brazil to Saudi Arabia, with a combined CO2 capture capacity of around 30 million tonnes per annum. According to the Global Carbon Project, more than 36 gigatonnes of CO2 is emitted into the global atmosphere every year as a consequence of fossil fuel use and industry alone.
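To put those two figures side by side, a quick back-of-envelope calculation (a sketch using only the numbers quoted above, not independent data) shows how small current capture capacity is relative to annual emissions:

```python
# Figures from the article: combined capture capacity of today's 17 operating
# CCS projects vs annual global CO2 emissions from fossil fuels and industry.
capture_capacity_mt = 30    # million tonnes CO2 per year
annual_emissions_gt = 36    # gigatonnes CO2 per year (Global Carbon Project)

annual_emissions_mt = annual_emissions_gt * 1000
share_captured = capture_capacity_mt / annual_emissions_mt
print(f"Current CCS capacity covers {share_captured:.2%} of annual emissions")
# → roughly 0.08%, illustrating how far deployment would need to scale
```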

In order to limit the global warming caused by this high rate of emissions to well below 2°C – a threshold beyond which experts from across the scientific community agree environmental catastrophe would become irreversible – total global emissions of all greenhouse gases (of which CO2 makes up 65 per cent) will need to be at least halved by 2050. The Global CCS Institute thinks CCS is key to reaching this target, claiming that "it will be impossible to deliver the 'well below' 2°C climate goal if CCS is not adopted as a key mitigation option within five to seven years".

During conventional CCS, CO2 is captured from the emissions produced by fossil fuel power plants, heavy industry or refineries using a chemical process. The CO2 is liquefied under pressure so that it can be transported and injected into underground rock formations, often an old oil or gas field, at depths of up to 5km. Successful storage can only be achieved if there is an overlying layer of impermeable rock to stop the still-buoyant CO2 escaping to the surface and eventually returning to the atmosphere. Given the right chemical composition of rock in the storage formation, it is thought that the CO2 will eventually transform into limestone over a timescale of hundreds to thousands of years.

Such long timescales, prohibitive costs and the sheer quantity of energy required (for every three coal power plants undertaking CCS, a fourth is needed to provide the power) are thought to have held back the expansion of CCS. “The process does need improving,” admits John Scowcroft, executive adviser to the Europe region at the Global CCS Institute. “The thing people worry about is whether the CO2 is going to come back – and the subsidiary worry, whether it is dangerous.”

CarbFix has the potential to address these worries. “The importance of CarbFix, in my opinion, is that it is showing definitively, in fast-forward, that all CO2 sequestered will turn out like this, that the CO2 will be rock and it will stay there,” Scowcroft says.

Currently ranked second in the Environmental Performance Index, Iceland regularly tops global ‘green lists’. This is largely due to its use of alternative energy, particularly geothermal. Generally considered a ‘clean’ technology, geothermal energy does produce waste emissions of both CO2 and hydrogen sulphide. An investigation into whether CCS techniques could provide a solution to unwanted emissions evolved into what has become the $10m CarbFix project.

“We wanted to mimic the natural process that is happening in nature here in Iceland,” explains project manager Edda Aradóttir at Orkuveita Reykjavíkur (Reykjavik Energy), one of the principal CarbFix partners. “We expected that the process would be fairly rapid and we knew it would be quicker than conventional CCS, but this was the fastest outcome we could have expected.” 

The apparent alchemy of permanently mineralising CO2 in just two years was achieved through subtle deviations from standard CCS techniques. Rather than injecting liquefied CO2, CarbFix dissolves the gas in water and injects the resulting carbonated solution to depths below 1,000m. The solution has a CO2 concentration of 0.823 mol/kg and a slightly acidic pH of 3.85 at 20°C. Not only do interactions between the carbonated solution and the host rock begin immediately, the speed of the reaction is aided by the acidity, which helps to break down the host rock to release elements that combine with the carbon and oxygen of the CO2.

Another important benefit is that CO2-saturated water is denser than liquefied gas, eradicating the problem of buoyant CO2 escaping back to the surface. An impermeable cap rock over the sequestration site is no longer needed.

However, the real magic comes from the selection of the host rock: basalt. Basalt is highly reactive because it contains relatively large amounts (up to 25 per cent by weight) of the more readily interactive elements such as calcium, magnesium and iron. These are the elements that react with the injected fluid to form carbonate minerals, particularly the solid crystalline mineral calcite.

“We are scientists, so we are always sceptical,” says Aradóttir. “But we got indications that the process was working very quickly. We labelled the injected gas with different tracers and tracked it from nearby monitoring wells. By taking samples of residual water, we could see the CO2 was not coming through. Later we drilled cores and we could detect our tracers in the calcite we found.”

‘We have to find a way of building the needed infrastructure and incentivising the use of CCS.’

Edda Aradóttir, Orkuveita Reykjavíkur (Reykjavik Energy)

The deposits appear as white-ish veins in the test cores. Since pilot injections began in 2010, CarbFix has achieved over 95 per cent permanent mineral CO2 sequestration in under two years, compared to the centuries required by other methods.

Approximately 10 per cent of the planet’s continental surface and most of the ocean floor is basalt. So in theory, CarbFix could be widely replicated. It has been calculated that while mineral storage capacity in Iceland could be as much as 400 gigatonnes of CO2, the ocean ridges could exceed 250,000 gigatonnes – more than enough to account for the safe sequestration of all CO2 emissions resulting from the use of all fossil fuel resources on Earth.
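The scale of those storage estimates is easier to grasp expressed in years of current emissions. This sketch uses only the capacity and emission figures quoted in the article:

```python
# How many years of current global CO2 emissions could each basalt reservoir
# hold, at the storage capacities estimated above?
iceland_capacity_gt = 400          # gigatonnes, Iceland mineral storage estimate
ocean_ridge_capacity_gt = 250_000  # gigatonnes, ocean ridge estimate
annual_emissions_gt = 36           # gigatonnes CO2 emitted per year

for name, cap in [("Iceland", iceland_capacity_gt),
                  ("Ocean ridges", ocean_ridge_capacity_gt)]:
    print(f"{name}: ~{cap / annual_emissions_gt:.0f} years of current emissions")
# Iceland alone holds roughly a decade's worth; the ocean ridges,
# several millennia – hence "more than enough" for all fossil fuel reserves.
```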

While this tantalising potential is hard to ignore, critics are quick to point out that CarbFix has yet to prove itself viable on a huge scale. CarbFix is injecting 10,000 tonnes of CO2 annually at the Hellisheiði site. This is alongside its sister project, SulFix, which injects 7,000 tonnes of hydrogen sulphide per year into basalt to form iron pyrite. Larger-scale testing has been impossible in Iceland as there are simply not enough emissions. Geothermal Hellisheiði produces only 5 per cent of the emissions generated elsewhere by fossil-fuel plants.

Other criticisms include concerns that mineralisation may not be permanent if deposited rock is subjected to substantial heating, but Aradóttir rejects this. "You can reverse the process by dissolving the carbonate in acid, but we don't intend to do that." Then there is the amount of water required to dissolve the unwanted gases before injection. CarbFix uses 22 tonnes of water for every tonne of CO2 dissolved. Seawater can be used instead of freshwater, but because salinity influences the process, around 30 tonnes of seawater is required to dissolve a tonne of CO2. Aradóttir is quick to point out that, along with CO2, all heavy metals precipitate out of the injected solution into the mineral deposits, resulting in purified groundwater that is immediately potable and could be re-used. "Any water can be purified," she says, "even waste water."
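The water demand scales directly with injection volume. As an illustration (ratios from the article; applied here to CarbFix's own annual injection rate at Hellisheiði):

```python
# Water required to dissolve CO2 before injection, per the ratios quoted above.
water_ratio_fresh = 22   # tonnes of freshwater per tonne of CO2
water_ratio_sea = 30     # tonnes of seawater per tonne of CO2 (salinity penalty)
co2_injected_t = 10_000  # CarbFix's annual CO2 injection at Hellisheiði

print(f"Freshwater needed: {co2_injected_t * water_ratio_fresh:,} t/yr")
print(f"Seawater needed:   {co2_injected_t * water_ratio_sea:,} t/yr")
# → 220,000 t and 300,000 t of water per year respectively, which is why
# re-use of the purified groundwater matters.
```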

The price of CarbFix at the Hellisheiði site is $30 per tonne of CO2. This compares favourably with conventional CCS methods, which cost between $60 and $130 per tonne, but Aradóttir believes cost is still the biggest drawback. “Within Europe, industry can buy its CO2 quota cheaply, at approximately €6 per tonne,” she says. “The politicians have to be involved.” Scowcroft at the Global CCS Institute agrees: “The CCS community has a lot of work to do in getting the message across and the legislation in place. We have to find a way of building the infrastructure and incentivising the use of CCS.”
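A rough comparison built from the per-tonne figures quoted above makes Aradóttir's point concrete (the emissions quantity below is an invented example, not a real facility's output):

```python
# Cost comparison at the per-tonne prices quoted in the article.
carbfix_cost = 30          # $ per tonne CO2 at Hellisheiði
conventional = (60, 130)   # $ per tonne CO2, conventional CCS range
eu_quota_eur = 6           # € per tonne, approximate EU allowance price

tonnes = 100_000           # hypothetical annual emissions to sequester
print(f"CarbFix:      ${carbfix_cost * tonnes:,}")
print(f"Conventional: ${conventional[0] * tonnes:,}-${conventional[1] * tonnes:,}")
# Even the cheaper mineralisation route costs several times the ~€6/t quota
# price, so without policy incentives industry has little reason to pay for it.
```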

Nevertheless, he sees great value in the CarbFix project beyond the surprising speed of mineralisation. “They’ve found a more efficient storage method that provides greater reassurance to the public. We talk a lot about technology, but public engagement is as important. A method can be as efficient as you like but if the public don’t like it, it won’t succeed.”

A new phase of the project, CarbFix2, was launched late last year, aiming to move the technology from the demonstration phase to an economically viable complete CCS chain that can be used in Europe and around the world.

So, is CarbFix the climate solution miracle we’ve all been waiting for? “There is no one climate miracle,” says Aradóttir. “It’s very risky if everyone thinks they don’t have to worry about their emissions now. We have to apply every available solution and everyone has to play a part.”

Dounreay destined for demolition: endgame for an atomic icon

April 18th, 2018

The iconic sphere nestled among a squat series of buildings at Dounreay marks the location of a pioneering site in 20th century nuclear technology. This part of the north coast of Caithness, Scotland, was the home of the UK’s fast-breeder reactor programme. It was created primarily to make electricity and plutonium, but was also used for reprocessing nuclear material and carrying out cutting-edge scientific and engineering experiments.

The sphere – inevitably dubbed the ‘golf ball’ in this, the ancestral home of the good walk spoiled – housed the Dounreay Fast Reactor (DFR), hailed for its pioneering position in global nuclear engineering. In 1962, it was the first fast reactor in the world to supply electricity to the national grid. Now, however, it is one of the most challenging nuclear sites in the world to decommission.

The pioneering efforts of the atomic scientists and their willingness to push boundaries have left a legacy of complicated, hazardous decommissioning work. Dismantling radioactive machinery is fraught with difficulty, and today's engineers must negotiate ageing infrastructure in inaccessible, hazardous areas.

Old nuclear sites were built, perhaps, with the dim assumption that the geniuses of the future would have figured out how to demolish them. But simple demolition is not enough at Dounreay. Many parts of the site hold dangerous nuclear material, with levels of radiation that mean workers must be protected. Today's security climate also means that restrictions are in place on moving nuclear material around.

The design, use and piecemeal adaptation of old nuclear sites like Dounreay have given today’s engineers complex problems to overcome, and a keen understanding of the need to think ahead. The use of proven technology is encouraged, as well as simple processes wherever possible.

Dounreay has a chequered history when it comes to radioactive waste management but the site is now a popular venue for international engineers – among them recently a Japanese delegation – to learn how to dismantle specialised atomic sites.

The nature of the experiments that took place in DFR meant that when it was shut down in 1977, some of its fuel was stuck in the reactor and remains there today. Dealing with the reactor’s volatile liquid metal coolant was also a big challenge, which has since been overcome, making DFR less hazardous for those dealing with the reactor contents.

The heart of the DFR sphere is the reactor, now switched off but still very radioactive. Before the reactor can be dismantled and the outer sphere shell taken down, workers must figure out how to remove hundreds of elements that have remained in place since the shutdown.

The reactor is essentially a large metal bucket set into the floor inside the sphere. The bucket contains a very long honeycomb in which hundreds of smaller channels held metal rods with an outer metal cladding. The rods served different purposes in the reactor: breeder rods to make plutonium, reflectors to bounce energy back into the nuclear chain reaction, and fuel rods.

There has been no power generation at Dounreay for over 40 years, but the site remains off limits

Image credit: Getty Images, Alamy

Senior project manager Ron Hibbert is responsible for the complex engineering work required to empty the reactor. He explains that the work is aligned with the Nuclear Decommissioning Authority’s Magnox Operating Programme (MOP), which sets a timeline for when the reactor should be emptied and its nuclear load transported to Sellafield in England.

Even getting to this point of removing the fuel has been a very long journey – the removal of the reactor’s liquid metal coolant, composed of sodium and potassium, was a significant challenge given its hazardous, volatile nature and its location in a tangled knot of complex pipework. Some of it had solidified and a special chemical reaction process had to be developed and tested before work could begin on the real thing.

The decommissioning of DFR has involved demolishing buildings and putting up new ones, as well as installing safer, more reliable services. Even so, engineers often encounter new problems as work proceeds.

Issues emerged in the Dounreay golf ball around the complicated sequence of operations needed to remove the fuel. The original circular metal rail from the 1950s, upon which the huge 25-tonne DFR crane runs, had moved out of position, requiring a new approach to moving the crane.

Despite this and other issues, Hibbert remains optimistic that the project can meet the timeline required by the MOP.

He says: “On the breeder project we’d had a really good year but we then had some problems with previously installed equipment. We had a number of difficulties trying to set to work with previous infrastructure.”

The knock-on effect is that there is now a plan to rotate the polar crane through a different angle in order to access the reactor elements and move the sphere’s special element transfer flask from that point. Issues have also come to light with a piece of equipment where cables became tangled during various configurations needed to access the breeder elements through the reactor’s rotating shield. A new tool has been devised to tackle the problem.

Hibbert comments that the aim for the new replacement was to keep to simple engineering principles and produce a locally built tool rather than design a complicated, expensive solution.

“We have replaced the mast with a very simple retrieval system. One of the things we are most proud of is that everything you see mounted on the reactor top apart from the retrieval cell has been designed by the project team and supplied by local contractors.

“We are very proud of the fact that there is local capacity to do this; there are highly skilled local companies and knowledge within the site that helps us work out these problems.”

Radiation and contamination always complicate dismantling work as there are risks to staff and the creation of more nuclear waste to consider.

“What I learned, and it stuck with me, is that if you are doing work in the reactor you have to be very confident of what you are doing before you introduce equipment because there will be very little chance to modify things later,” says Hibbert.

“As soon as you do anything in the reactor, you are potentially contaminating things and there is no chance of getting the equipment back off site. Modifying on-site is difficult because you then need a radiation-controlled area, welding restrictions and so on.

“Your best shot is to test before you put it in. So I have used that approach, which I learned on various projects at the Dounreay Prototype Fast Reactor. We set up a trial facility at the T3UK site. What it did was replicate the top function of the reactor, made sure the heights were right and had a simulated section of the core area.”

T3UK is an engineering test facility and educational campus in the north of Scotland aimed at the nuclear, marine and energy markets. The test structure was a mock-up of a section of the DFR reactor with a test rig above it.

He continues: “We developed our replacement design, built a trial version of it, and we evolved that. It was needed to give us the confidence that where we were going with the design was right. We needed to be able to reach everything while it functioned as intended.

“We kept that at T3UK for various reasons including as a ‘hot standby’ so that if there is a problem we can quickly study it offsite, come up with a solution and then replicate that in the actual reactor.

“There is damaged fuel in the reactor so we will retain the rig there for the further tooling and equipment needed. When we take out the 500 or 600 easier elements, that gives us time to develop solutions for the challenging ones – that’s the philosophy.”

Using the new design is part of a complex 3D jigsaw where some parts, such as the crane and reactor shields, rotate in a circle and others, such as the element transfer container, need to move to the correct position horizontally and vertically to take elements from the core to a special pipe where they are sent downwards for inspection. After this, they reach the building where they are cut up in shielded cells for packaging and transport.

Hibbert continues: “The new design has worked very well for us. That was a challenge in itself with the restrictions on the dimensions, and it has to work in two halves; it’s like one shell inside another. But at the same time, half that diameter is still a significant size, it is a bit heavy and fragile. The transport frame had to come in here and go into position so that we could, safely and in a controlled manner, lift it into there after a careful assembly procedure.

“You can say it in a sentence or two but the actual work it took to do that was significant. At the time it was a contaminated area, so the last thing we wanted was to have people spending longer there than they needed to. That is why we did all the work off-site, so that what you have to do is come in, set it in place and bolt it down. That saved a huge amount of radiological exposure.”

‘A lot of effort has been put into inactive commissioning of this system. If anything breaks, we have demonstrated that we can replace or repair it remotely.’

Chris Wratten, DFR

To make the jigsaw even more complicated, some of the rods are stuck in the reactor channels. A ‘top plate’ metal structure in the reactor has been cut to free up some elements, and more needs to be cut to access headless rods that are shorter than the others. Different grabs are needed for the new tool.

Some of the most damaged rods were affected by overheating in the reactor during experiments decades ago. In some instances the reactor coolant was supposed to keep temperatures under control but, in places, elements got too hot to stay in the right place and shape. Late in the life of the reactor, materials were deliberately subjected to extreme conditions, including coolant boiling. Localised parts of the reactor may have reached up to 900°C, causing fuel failures.

Preliminary work has shown that some elements are not in a bad condition, so the current aim is to focus on taking these out, while work in the neighbouring building where elements are cut up and packaged for transport is under way.

Hibbert says: “We’re initially using a simple philosophy to extract them. In reality, we think that between 500 and 600 elements are straightforward to remove. What we found when we worked in the reactor last year is that the elements further from the centre should be relatively straightforward to take out. Because the sphere work is quicker than what we have to do in the breeder building, there is an opportunity to do other work between times.

“To underpin our assumptions, we went in and checked this. This tested our assumption and proved some elements are not stuck at the bottom. The elements might not come right out to the top when you pull them, but all the indications are that the initial survey was right.”

Additional work has also been needed to move the project forward, in the form of using the T3UK engineering facility to look at why a transfer flask wasn’t working as expected. By using a deep pit in the test centre, workers were able to see that a grab was twisting inside the flask when it should not have been.

“You can’t legislate for ageing infrastructure,” says Hibbert. “What assumptions can you make? What’s going to be good? What’s going to be bad? We go through a thorough maintenance regime but these types of challenges happen and, to be honest, there are other people going through similar issues at other sites.”

Sharing experiences with other sites has shown the team that engineering problems are faced at many industrial locations that require simple, problem-solving approaches.

After travelling down a pipe and being inspected, the elements are stripped of their cladding and put into shielded packaging. By then they should be the right length, width, weight and composition for the heavily shielded flask, which is used for transport off site.

Fuel from the Dounreay reactor is taken to Sellafield by rail for reprocessing


Image credit: Getty Images, Alamy

Project manager Chris Wratten explains that the work is carried out in shielded concrete cells with thick lead glass windows and a nitrogen atmosphere for safety reasons. Remote grabbing arms and a camera system are used to carry out the work. One cell is where small amounts of liquid metal coolant are still present on the radioactive metal pieces; an adjacent cell is used after it has been cleaned and is being packaged for transport. The DFR material is destined for the reprocessing plant at Sellafield.

“The long-term plan for the cladding is that it goes into the intermediate-level waste store and it remains there until the radioactivity levels go down. All the waste stays on site and all the fuel goes to Sellafield,” Wratten explains.

He says the current approach means that in newer buildings, the services such as the electrical supply can all be accessed and maintained easily away from the radioactive areas. The end result is that when this newer plant needs to be decommissioned, it will be easier to take apart and there will be less nuclear waste to deal with.

Wratten’s approach echoes that taken in the sphere work, of testing equipment first before any radioactive material is involved.

“There has been a lot of effort put into inactive commissioning of this system so that we don’t have to then go back in and make any changes. With every piece of kit in the cell, we have demonstrated that if it breaks, we can replace it or repair it remotely. We know it’s robust and maintainable. We don’t then have to put people into a hazardous area. Something that is easy to do by hand is not easy with the remote grabs. It’s about making things simple.”

For older shielded cells on site, a system of periscopes and mirrors was used to view the nuclear materials being worked with, as well as special windows. Today, standard cameras are used to get a better view than the slightly distorted 2D view through the lead glass window.

“We use standard, cheap, off-the-shelf cameras. You could specify expensive, radiation-tolerant cameras but they’re not as good and don’t have the same functionality as the normal cameras.

“Because the radiation doses aren’t that high in this cell, we can use standard cameras from other industries and they don’t have the added cost of being radiation-tolerant. If they do fail, they’re easily replaceable. Even then, we haven’t actually had one fail yet in the lifetime of the plant, and you can see fine detail from them.”

Removing the elements from the ‘golf ball’ sphere is one of the last jobs to be done at DFR before the reactor is taken apart and the metal shell taken down. Last year, the site operators submitted a planning application to the local council for a series of projects from 2018 towards the site shutdown. These jobs include some major changes to the Dounreay skyline, which will be cleared of its Cold War icon.

Mobile maps: mapping live data with the help of mobile networks

April 17th, 2018 no comment

In 2009, the German newspaper Die Zeit published an animated map of six months in the life of Malte Spitz, a Green Party politician, using his mobile call detail records (CDRs). These are the logs of how, when, where and with whom he communicated, collected by his phone supplier for billing purposes.

By matching these records to mentions of Spitz’s political life on websites and blogs, Die Zeit was able to pinpoint the places the politician visited, the routes he took, how long he’d stayed in each location, and the people he’d texted and spoken to on the phone at the time.

Die Zeit and Spitz wanted to reveal how potentially intrusive the retention of CDRs can be in the way that they can track individuals from one cellular antenna tower to the next. Now, nearly a decade later, CDRs are being used to map the movements of millions of people at once.

These new population-scale movement-mapping techniques are based on anonymised aggregations of CDRs. Once user IDs and location data are irreversibly scrambled, the argument goes, CDRs become statistical tools that can be used for the good of society.

CDRs en masse can show patterns of human ebb and flow across cities, countries, and even continents. Linked to other datasets about climate or disease, they can reveal the influence of humans on epidemics, pollution and more.

The testing lab for ‘social good’ projects has largely been the developing world, which is where most of the world’s five billion mobile phone owners live.

Over the last few years, studies have used CDRs to model the spread of dengue fever in Pakistan, Ebola in West Africa, and malaria in Kenya, with encouraging results.

The Kenya malaria project was one of the first. A team from Harvard School of Public Health (HSPH), together with seven other institutions, mapped every call or text made over a year (June 2008 to June 2009) by each of 15 million Kenyan mobile phone subscribers to one of 11,920 cell towers.

Malaria is caused by parasites transmitted through mosquito bites. It infects over 200 million people a year, and kills around half a million, mostly children in sub-Saharan Africa.

Researchers were curious about how humans ‘importing’ infections might contribute to the transmission of parasites at distances beyond where mosquitoes might go.

Every time an individual left home (in these studies, home is assumed to be near the antenna to which most of a user’s night-time calls connect), the researchers calculated where they went and the length of the trip. They estimated the disease’s prevalence in each location using a 2009 malaria prevalence map from the Kenya Medical Research Institute (KEMRI) and the Malaria Atlas Project.
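The home-and-trips logic described above can be sketched in a few lines of code. This is a minimal illustration rather than the HSPH team’s actual pipeline: the 8pm–6am night window, the `(timestamp, tower_id)` record format and the helper names are all assumptions made for the example.

```python
from collections import Counter
from datetime import datetime

def infer_home_tower(records):
    """Pick the tower that handles the most night-time (8pm-6am) events.

    `records` is a list of (timestamp, tower_id) tuples, one per call or SMS.
    """
    night = [tower for ts, tower in records if ts.hour >= 20 or ts.hour < 6]
    if not night:
        return None
    return Counter(night).most_common(1)[0][0]

def trips_away_from_home(records, home):
    """Return (tower, first_seen, last_seen) spans spent away from home."""
    trips = []
    current = None
    for ts, tower in sorted(records):
        if tower != home:
            if current and current[0] == tower:
                current = (tower, current[1], ts)  # extend the current stay
            else:
                if current:
                    trips.append(current)
                current = (tower, ts, ts)  # start a new stay
        else:
            if current:
                trips.append(current)
                current = None
    if current:
        trips.append(current)
    return trips
```

Aggregated over millions of subscribers, trips like these become the origin–destination flows that the malaria study combined with prevalence maps to estimate infection risk.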

From this, they inferred each resident’s risk of being infected and the daily risk that visitors to particular areas would become infected.

It turned out that most infections carried by people end up in the capital, Nairobi, after they have returned from malaria hotspots such as Lake Victoria.

The conclusion of the team’s 2012 paper in the journal Science was that preventative schemes could target these volumes of human traffic between regions.

One of the most ambitious disease-mapping initiatives is under way in India using CDR data from 280 million people to understand and control the spread of TB. Tuberculosis killed 423,000 Indians in 2016, which is a third of the world’s TB death toll.

A team from the GSMA (a trade body that represents the interests of mobile network operators worldwide), the mobile phone firm Bharti Airtel and Be He@lthy, Be Mobile (a collaboration between the WHO and the ITU) will be mining the CDR data to identify potential TB hotspots. The government will use the data to target vaccinations and campaigns as part of its plan to eradicate TB in India by 2025.

Big Data for social good


The GSMA, the body that looks after the interests of mobile phone operators, launched its Big Data for Social Good initiative in February 2017. The idea is to use mobile call records to address epidemics and humanitarian crises.

The United Nations Foundation is a supporting partner, and mobile firms across the world are backing the initiative.

These include Bharti Airtel, Deutsche Telekom, Hutchison, KDDI, KT Corporation, Megafon, Millicom, MTS, NTT Docomo, Orange, Safaricom, SK Telecom, Telefónica, Telenet, Telenor Group, Telia, Turkcell, Vodafone and Zain.

In Europe, the mobile operator Telenor is developing a predictive model of how seasonal flu spreads in Norway using phone records and case data from the Norwegian Institute of Public Health (NIPH).

According to WHO statistics, seasonal flu epidemics around the world cause three to five million cases of severe illness and kill 290,000 to 650,000 people each year.

“We want to understand how people have travelled in the past, and match this to case-study data of how flu has spread throughout Norway. Based on that information, we would like to simulate intervention strategies to see if we can reduce the spread of flu,” explains Kenth Engø-Monsen, a senior research scientist at Telenor Research in Norway.

A team at the University of Oslo will be building the computer model, which they will tune to predict the spread of flu during the actual flu season.

“If we can come up with reliable interventions that demonstrate in simulation that we can slow down the spread of flu, then the NIPH would be ready to implement these,” says Engø-Monsen.

Potentially, flu vaccinations could be targeted geographically, based on travel patterns, he says.

Environmental monitoring is another ‘social good’ application that is capturing the imagination of research teams and mobile companies.

A couple of years ago an MIT study, led by Marguerite Nyhan, changed our view of air pollution in cities.

Using CDR data from 8.5 million people in New York City, Nyhan and colleagues showed that the daily movements of people around 71 districts had a major influence on air quality. They were interested in small particles less than 2.5µm in diameter (PM2.5) associated with the worst health effects.

Comparing active population exposure with home exposure (which assumed a static population), they found that districts that contributed most to overall PM2.5 exposure were those with most residents. Districts with higher relative influence on exposure tended to be clustered in the areas where New Yorkers work and socialise (lower regions of Manhattan and centralised areas of Brooklyn and Queens).
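The core calculation behind this comparison is a population-weighted average of exposure, evaluated once with everyone assumed to be at home and once with people redistributed to where their phones place them. The sketch below uses made-up district values; the real study worked with hourly PM2.5 estimates and far finer mobility data.

```python
def mean_exposure(pm25, population):
    """Average PM2.5 exposure, weighting each district's concentration
    by the number of people located there.

    pm25: dict of district -> PM2.5 concentration (ug/m3)
    population: dict of district -> head count under this scenario
    """
    total = sum(population.values())
    return sum(pm25[d] * population[d] for d in population) / total

# Illustrative numbers only. The 'home' scenario assumes a static
# population; 'active' moves people to where CDRs place them by day.
pm25 = {"lower_manhattan": 14.0, "outer_borough": 9.0}
home_pop = {"lower_manhattan": 200_000, "outer_borough": 800_000}
active_pop = {"lower_manhattan": 500_000, "outer_borough": 500_000}

home_exposure = mean_exposure(pm25, home_pop)      # 10.0
active_exposure = mean_exposure(pm25, active_pop)  # 11.5
```

The gap between the two figures is the effect the MIT team measured: daytime movement into polluted work districts raises the exposure that a static, residence-based estimate would miss.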

Mobile operator Telefonica has taken this idea a step further in Sao Paulo, Brazil’s largest city, as part of the GSMA’s Big Data for Social Good Initiative.

Combining mobile phone data, machine learning and data from weather, air quality and traffic sensors, Telefonica is predicting air quality across the city up to 48 hours in advance.

It’s a cheaper alternative to air-quality monitoring that allows local government to take further preventive actions, says Jeanine Vos, head of the GSMA’s Big Data for Social Good Initiative. “Using this predictive approach, you can see areas of high air pollution and take preventative steps like rerouting traffic, and giving warnings to people with health conditions such as asthma.”

In a more unusual vein, scientists at the Norwegian Institute for Water Research (NIVA) have been using CDRs and sewage analysis to understand the extent of drug use in Oslo.

Everything humans eat and drink, including drugs (legal and illegal), leaves a chemical signature in sewage that can be measured at sewage treatment facilities.

‘The dynamic population shifts we can measure in this way are excellent.’

Kevin Thomas, University of Queensland and NIVA

Population is the biggest uncertainty for these measurements because people take drugs on different days and the numbers of people in a sewage catchment area will vary.

“At NIVA, we had been studying Oslo quite intensively in terms of its drug use and I noticed as I sat on the Metro every day that far more people go in and out of the city on weekdays than weekends,” explains Kevin Thomas, director of the Queensland Alliance for Environmental Health Sciences at the University of Queensland and a research scientist at NIVA.  

Thomas coordinates a network of laboratories across the world measuring drug use by taking sewage samples, as part of the international SCORE network.

In NIVA’s research they had seen big changes in drug and alcohol consumption at weekends, and Thomas was concerned that they were missing part of the picture.

It took Thomas three years to persuade a mobile phone company to agree to work on this. The project went ahead in 2016, using anonymous phone data from Telenor customers collected across a holiday period in June to July 2016.

They found that the number of people in the sewage catchment area ebbed and flowed dramatically: 469,000 people were counted at 9am on 5 June, for example, rising to 670,000 by 2pm the next day. Anyone tracking drug use at the time would have assumed there was a massive spike, when in fact there were just more people.

Over the testing period, illicit drug use rose, with Ecstasy spiking at weekends. These results suggest that mobile data could help public health agencies, law enforcement and epidemiologists refine their understanding of drug use trends.

“The dynamic population shifts we can measure in this way are excellent,” says Thomas, who is now repeating the study over 12 months in Oslo as part of a European project looking at drugs markets.

In the future, says Thomas, these methods could be used to monitor the general health of a city, something he is researching in Australia.

“Sewage monitoring can tell us about people’s vitamin intake, alcohol intake, and you could look at very specific markers such as grain intake, how much fruit and vegetable [based on beta-carotene levels], and exposure to pesticides, fire retardants and plasticisers. In fact, all the chemicals you are exposed to in the home,” he elaborates.

How signals track users from tower to tower


Mobile phone networks are a honeycomb of overlapping radio coverage areas (cells) generated by thousands of base-station towers spaced anything from a couple of hundred metres to a few kilometres apart.

The base stations connect your phone to other phone users or to the internet. They also generate the call detail records (CDRs) that log all communication activity for billing purposes.

CDRs detail the unique IDs of the caller and ‘callee’, and also those of their phones (international mobile equipment identity, or IMEI). Calls or SMS are time-stamped with a call-start and call-end, and the cell IDs and locations of the base stations involved are also logged. An extended version of these records, called xDR, also includes records of mobile data.
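As a rough illustration, a single CDR row of the kind described above might be modelled as follows. The field names and types are assumptions for the example; real operator schemas differ, and xDRs add further data-session fields.

```python
from dataclasses import dataclass

@dataclass
class CallDetailRecord:
    """Illustrative CDR fields; real operator schemas vary."""
    caller_id: str      # subscriber ID of the caller
    callee_id: str      # subscriber ID of the called party
    caller_imei: str    # handset ID (IMEI) of the caller
    callee_imei: str    # handset ID (IMEI) of the callee
    call_start: float   # Unix timestamp, start of call or SMS event
    call_end: float     # Unix timestamp, end of call
    cell_id: str        # ID of the base station serving the caller
    cell_lat: float     # base-station latitude
    cell_lon: float     # base-station longitude

    @property
    def duration(self) -> float:
        """Call length in seconds, derived from the two timestamps."""
        return self.call_end - self.call_start
```

Because each record carries a base-station location rather than a GPS fix, a subscriber’s position is only ever known to the resolution of the serving cell, which is what gives the roughly 200m urban accuracy discussed below.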

In urban environments, these details can locate individuals to an accuracy of around 200m, making it easy to track their approximate trajectory from tower to tower across a network.

By cross-referencing to road and rail maps, it is possible to infer more precisely users’ modes of transport and routes.


CDR data sets used in many of these social projects are based purely on SMS and call logs, generated only when people use those services. For applications that need more frequent location updates (every 15 to 30 minutes), there are xDRs (extended detail records), which include all the signals generated by smartphones transmitting data packets.

“When you turn on your phone, you send a signal, when you turn it off, you send a signal, when you change your location, it sends a signal, and so on. Everything you do with your phone generates a signal, which is read by the cellular network and collected as the xDR,” explains Arturo Amador, a senior consultant at the ICT consultancy Acando, who has been involved in developing Telenor’s Mobility Analytics platform since 2015.

However, these larger data sets are ‘noisier’, making it harder to find signals of interest, points out Telenor’s Engø-Monsen.

“I’d prefer 10Mbytes or 100Mbytes of nicely curated mobility data on a daily resolution with a fairly good spatial resolution, than 20 terabytes of detailed browsing history,” he comments.

Commercial applications of movement monitoring in tourism, retail and transport are likely to benefit most from these larger (if noisier) sets of phone-use data.

A recent trial on roads around Dublin, carried out by Vodafone Ireland and the mobile analytics firm Cell Mining, is a good example of a transport application that uses frequent signal updates.

Using these records, they were able to distinguish fast-moving phone subscribers travelling in trains and cars. By measuring the number of phone calls cut short or data sessions dropped along particular road or rail routes, they created ‘mobile experience’ maps marking each cell site along a route in ‘traffic light’ colours, from red to green.
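A minimal sketch of such a ‘traffic light’ classification might look like the function below. The drop-rate thresholds are illustrative assumptions; the trial’s actual metrics and cut-offs are not published here.

```python
def experience_colour(drops: int, sessions: int,
                      amber: float = 0.02, red: float = 0.05) -> str:
    """Classify a cell site by its session drop rate.

    drops: calls cut short or data sessions dropped at this site
    sessions: total calls/data sessions observed at this site
    amber, red: assumed drop-rate thresholds for the colour bands
    """
    if sessions == 0:
        return "grey"  # no traffic observed, nothing to classify
    rate = drops / sessions
    if rate >= red:
        return "red"
    if rate >= amber:
        return "amber"
    return "green"
```

Applied to every cell site along a road or rail route, the resulting colours form exactly the kind of route-quality map the trial produced.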

This trial has one eye on a future of autonomous cars, when we will want to plan our journeys around roads with the best mobile network quality, instead of those that get us to our destinations the fastest.

Worries about privacy (bearing in mind that many of these applications involve phone companies giving customer data to third parties) remain the single strongest limitation of this technology, which brings us back to Malte Spitz and his map.

The solution may be (as in Norway) to mandate that all processing is carried out on a secure platform on the mobile operator’s premises. “We cannot upload anything to the cloud and we cannot ship anything outside Norway,” explains Acando’s Arturo Amador.

Likewise, mobile companies should use state-of-the-art anonymisation techniques including hashing and encrypting all user identifiers and sensitive fields such as the location of antenna towers.

Path obfuscation is an additional privacy enhancer that adds random ‘noise’ to the location fields, according to Amador. “Instead of a person starting their journey in location A, it becomes A+delta where delta is a small random number. And instead of ending up in location B, it becomes B+delta. Across a population, the results contain enough randomness to protect privacy,” he explains.
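In outline, the two safeguards described here, keyed hashing of identifiers and noisy locations, might be combined as below. The HMAC-SHA256 construction, the secret key and the 0.002-degree delta are assumptions for illustration, not details of Telenor’s platform.

```python
import hashlib
import hmac
import random

def pseudonymise(user_id: str, secret_key: bytes) -> str:
    """Replace a subscriber ID with a keyed hash (HMAC-SHA256).

    A keyed hash is used rather than a plain one so IDs cannot be
    recovered by brute-forcing the small space of phone numbers.
    """
    return hmac.new(secret_key, user_id.encode(), hashlib.sha256).hexdigest()

def obfuscate_point(lat: float, lon: float, max_delta: float = 0.002):
    """Add a small uniform random offset (in degrees) to a location.

    0.002 degrees is roughly 200m at mid-latitudes; the value here
    is illustrative only.
    """
    return (lat + random.uniform(-max_delta, max_delta),
            lon + random.uniform(-max_delta, max_delta))
```

The keyed hash keeps records linkable (the same subscriber always maps to the same pseudonym, so trajectories survive) while the coordinate noise blurs exactly where each trip started and ended.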

If mobile operators get the privacy issues right, mapping population movements in this way could change the way governments and international agencies create and implement health and social policy.  

Unlike surveys or censuses, which take years and cost millions, mobile movement maps can be generated quickly and cost-efficiently, and updated frequently, sometimes even in real time. Moreover, their diagnostic nature means that our mobile-tracked movements could become part of a series of huge policy feedback loops.


Exposure to air pollution in New York

Comparing active population exposure to air pollution with home exposure in New York. Districts with higher relative influence on exposure tended to be clustered in the areas where New Yorkers work and socialise (lower regions of Manhattan and centralised areas of Brooklyn and Queens).
Credit: ‘Exposure Track – The Impact of Mobile-Device-Based Mobility Patterns on Quantifying Population Exposure to Air Pollution’ by Marguerite Nyhan et al; Environ. Sci. Technol., 2016, 50 (17).


Spread of malaria in Kenya

Density maps showing sources (red) and sinks (blue) of human travel and total parasite movement in Kenya, where each settlement was designated as a relative source or sink based on yearly estimates.
(A): Travel sources and sinks. (B): Parasite sources and sinks.
Credit: Quantifying the Impact of Human Mobility on Malaria; Amy Wesolowski et al. Science 338, 267 (2012)


Mapping poverty with call data records

Neeti Pokhriyal at the State University of New York at Buffalo has created ‘poverty maps’ of Senegal using CDRs from 9 million Sonatel customers. The thickness of each link indicates the volume of calls and texts exchanged between regions; the size of the circles indicates total incoming and outgoing calls and texts. The level of connectivity turns out to be a good proxy for the Multidimensional Poverty Index (MPI) of a region, which is a composite of 10 indicators.

Book review: ‘Taming the Sun’ by Varun Sivaram

April 12th, 2018 no comment

Headline figures suggest this is a boom time for solar power. A recent UN report shows that it made a massive contribution to a global increase in investment in renewable energy infrastructure during 2017 that far outstripped growth in fossil-fuel generation in the same period.

However, the big picture could temper solar supporters’ optimism. China alone was responsible for nearly half of new capacity; at the same time, investment declined in the US and Europe.

Solar may finally be at a point where costs are low enough to make it attractive as the world’s fastest-growing power source. However, without sustained effort, Varun Sivaram warns, there is a risk the current level of enthusiasm will rapidly cool.

Countries may be installing cheap panels by the acre, he argues, but they aren’t investing in the innovation that solar will need to make the crucial leap from the 2 per cent of global electricity it’s responsible for today to providing at least a third by mid-century.

‘Taming the Sun: Innovations to Harness Solar Energy and Power the Planet’ by Varun Sivaram (MIT Press, £24.95, ISBN 9780262037686) is both an overview of the current state of solar and a manifesto making the case for sustaining current levels of growth through three kinds of innovation – financial, technological and systemic.

Growing photovoltaic capacity by an order of magnitude will need massive amounts of new capital investment and Sivaram looks at how this can be achieved. In parallel, he acknowledges, although installations based on existing silicon technology are likely to continue expanding over the next decade, beyond that point their growth could hit a ceiling that can only be broken through by a combination of ‘dirt-cheap’ generation, new materials and cost-effective storage.

The book also serves as a stark warning that science alone isn’t enough. Solar’s future will require visionary public policy and politics has a part to play. Sivaram laments the way in which the current US administration’s policies threaten to undermine the role the country has traditionally enjoyed as a pioneer in energy innovation.

‘Taming the Sun’ is an even-handed untangling of a situation that can appear a mess of contradictions, with scientists despairing that commercial technology is stagnating while the industry trumpets its progress. Although no one would argue that solar will power the entire planet by itself in the foreseeable future, ‘Taming the Sun’ is a convincing argument for taking steps to make it the centrepiece of a global clean-energy revolution.

Fault Current Limiting Technology & Applications

April 6th, 2018 no comment

Rising fault levels, caused by generator connections, transformer upgrades, rising upstream fault levels or network interconnection, can mean that switchgear and other plant needs replacing with higher-rated equipment. A fault current limiter can provide a rapidly deployed and cost-effective solution. This webinar will describe the applications, deployment, functionality, monitoring, control and performance of this product.

Key learning outcomes from the webinar:

  • Discover why you need a Superconducting Fault Current Limiter (SFCL) 
  • Learn about how SFCLs are delivered and what they look like in a substation
  • Hear about a specific fault current limiting technology which has already been trialled in a UK network for 3 years, limiting at least 9 faults

Register for this on-demand webinar

Hands-on review: Tregren T-series hydroponic planter

April 4th, 2018 no comment

This high-tech kitchen garden combines home hydroponics and Internet of Things app control for a fast, low-maintenance growing experience. The planter has built-in LED lighting and automatic watering.

All you do is tell the app what seeds or plants you’re growing and replace the nutrient-laden water when the app prompts you to, which is every three weeks or so. You don’t need to think about the watering and lighting. It’s automatic and optimised, so plants grow around three times faster than normal.

The T-series comes in three sizes: T3 for up to three plants, T6 for up to six and T12 for – you guessed it – up to a dozen. And it comes in three colours: white, grey or black. We tested the T3 in white.

Assembly is very simple. Then you dissolve a sachet of nutrients in a litre of water and pour it into the reservoir that sits underneath. The water is pumped up via a mini fountain, at an interval that depends on what you’ve planted or sown – you tell it via the app. The timings for the grow light are also set via app.

There are two types of nutrient sachets: one for herbs and salad, another for fruits and flowers. €12 buys eight sachets and you’ll use one sachet every three to four weeks.

You can also buy seed pods made of compressed peat from Tregren, with or without seeds. We tried a pack of six Italian herb seed pods: basil, marjoram, mint, oregano, parsley and thyme. You get six discs of compressed peat plus six circles of filter paper. You soak the peat discs in water for 10 minutes, during which time they get much taller. Then you push the papers (which contain the herb seeds) firmly on top, so they get moist.

The T3 is just large enough for the six pods but it would only fit a couple of supermarket herb pots, which is the other obvious use for it.

We put the prepared pods in the T3, being careful to tell the app that it was in seed-growing mode, put a litre of tap water in the reservoir and waited. You don’t add nutrients to the water until the seeds start to sprout. Only two of the six papers had the name of the herb printed on them; the others were a guessing game.

The app is simple and minimal. You just tell it what you have planted and when, and it does the rest – communicating with the T3 via the Internet of Things (so the first time you set the garden up, you must connect it to the app and then give it the password for your Wi-Fi router, all of which went smoothly).

The seeds took a week to sprout, at which point we added the nutrients. Two weeks later, four of the six were thriving and getting big enough to see that basil and parsley were in the lead. It’s suggested that you can harvest after four to eight weeks.

It’s hard to say how much faster the process was than straightforward planting, but it’s very clear that it was effortless. We were almost willing the other two pods to fail because there’s no end of seeds we’d like to try in the T3. Cut-and-come-again salads like rocket, sorrel and lettuce are top of the list. We soon found ourselves pining for the extra growing space of the T12.

So we experimented on the side. Literally. We’d heard that one of the best and easiest ways to use a Tregren garden is simply to extend the life of supermarket herb pots. There was no space left inside the T3 for our cheap-as-chips pot of basil from Aldi, which was on its last legs, so we parked it right next to the T3 and poached its light.

Bingo! The basil came back to life and very soon began to thrive. It needed watering regularly, because it didn’t benefit from the Tregren’s automatic watering and nutrients, but the bright and constant LED light did our budget basil the world of good… and we were sold on the benefits of the T3.

The successfully Kickstarted T-series aren’t Tregren’s first products. They were preceded by Herbie and Genie, which boasted similar beautiful design but no app control. The T-series is more refined all round.

Home hydroponics is a technology long overdue for reclamation. It’s primarily known for cultivating marijuana, and the association is so strong that very few companies have looked to bring it into our kitchens where it belongs. Farming food more efficiently and locally is a growing concern and it doesn’t get more local than this. Forget food miles, this is food millimetres.

Buying hot-housed supermarket herb pots, only to watch them wither after a week, is a miserable waste of resources when they could thrive for months. If you love herbs that aren’t hardy enough to thrive in the garden, a Tregren T-series smart garden is well worth considering.

from €90 tregren.com


Click and Grow gardens

Its smaller hydroponic growing stations have industrial design that’s uncannily similar to Tregren, but Click and Grow also offers entire “wall farm” vertical gardens along the same lines.

from €100 eu.clickandgrow.com

Grobo One

This tall hydroponics kit from Canada is aimed at the medical and recreational marijuana markets. As well as app control of lighting and nutrients, there’s a lock and a carbon filter to nix smells.

$1,993 grobo.io

Niwa One

Available to pre-order from the US, this does all of the above but also controls heat and ventilation. Available in three sizes or you can buy a $259 kit and build the plywood casing yourself.

from $429 getniwa.com

Saudi Arabia and Japan’s SoftBank create powerful solar-energy firm

March 28th, 2018 no comment

The projects are expected to ultimately produce up to 200 gigawatts (GW) by 2030, according to SoftBank chief executive Masayoshi Son.

Added to existing capacity, this would take globally installed solar power to around 400GW, comparable to the world’s total nuclear power capacity of around 390GW as of the end of 2016.

The agreement will start with the construction of Saudi Arabia’s first two solar-generation projects with 3GW and 4.2GW of solar capacity respectively.

The plan commits the parties to develop and manufacture solar panels in Saudi Arabia, and to build between 150GW and 200GW of solar power generation capacity by 2030.

By investing in solar power, Saudi Arabia, the world’s biggest oil exporter, can reduce the amount of crude it currently uses to generate power and increase its overseas shipments.

The final investment total for the 200GW of generation, including the solar panels, battery storage and a manufacturing facility for panels in Saudi Arabia, will eventually total around $200bn, Son said.

The agreement also commits the parties to explore the manufacture and development of energy storage systems and establish joint ventures for research and development.

The initial phase of the project for 7.2GW of solar capacity will cost $5bn, with $1bn coming from SoftBank’s Vision Fund and the rest from project financing, he said.

Saudi Arabia’s Vision 2030 reform plan, which aims to reduce the country’s economic dependence on oil, was a good match for the fund’s long-term vision for innovation, said Son.

“These two visions have come together to create clean, sustainable, low-cost and productive renewable energy,” he said. “The Kingdom has great sunshine, great size of available land and great engineers.”

Despite being one of the world’s sunniest countries, Saudi Arabia generates most of its electricity from oil-fired power plants.

Saudi’s entire installed power capacity is currently around 60GW. Adding 200GW would create enormous excess capacity that could be exported to neighbours or used by industry, although the kingdom will still require other forms of power generation for night-time back-up.

The country began an official tender process for its first nuclear reactors last year, making it only the second Arab nation to turn to nuclear power. 

Industry estimates say around 300,000 to 800,000 barrels per day of crude oil are burnt for Saudi power generation.

Exporting that oil could increase Saudi’s annual oil revenues by between $7bn and $20bn, at the current price for benchmark Brent oil of almost $70 per barrel.

In 2016, the single largest solar farm in the world was constructed in the Indian state of Tamil Nadu, taking the record from a facility in California. 

Japan’s first electric car battery recycling plant to sell old batteries at half price

March 27th, 2018 no comment

The project will be led by 4R Energy Corporation, a joint venture between Nissan and Sumitomo Corporation, and will give the costly batteries new life after they pass their peak performance.

With the rapidly rising number of electric cars on the road, the availability of used lithium-ion batteries is expected to increase significantly in the near future as buyers of the first generation of electric cars look to replace their vehicles.

The recycling and refabrication of such batteries is expected to have a substantial impact on the battery industry, affecting demand for new battery materials, and on the environment and society as a whole.

Global automakers are looking for ways to make cheaper EVs and prolong the life of their batteries, which can account for up to one-fifth of each vehicle’s cost and are made from increasingly costly materials, including cobalt and nickel.

A recent study found that worldwide supplies for lithium and cobalt – key elements for producing batteries – could become critical by 2050.

The new factory will begin selling rebuilt replacement lithium-ion batteries for the first-generation Nissan Leaf.

The batteries will be produced at the new factory in the town of Namie by reassembling high-performing modules removed from batteries whose overall energy capacity has fallen below 80 per cent.

They will be sold in Japan for 300,000 yen (£2,015), roughly half the price of brand-new replacement batteries for the world’s first and best-selling mass-marketed, all-battery EV.

“By reusing spent EV batteries, we wanted to raise the (residual) value of EVs and make them more accessible,” said 4R chief executive Eiji Makino.

The new plant has opened around 5km north of the site of the Fukushima nuclear disaster and it is hoped it will be an economic boon to Namie, which has struggled in the wake of the disaster.

The flat, rectangular battery packs that line the bottom of each Leaf chassis are trucked into the plant, where each module is assessed.

Sumitomo has come up with a way to analyse all 48 modules contained in each battery pack in four hours, a huge time saving over the 16 days Nissan engineers previously used for similar measurements.

Modules with capacities above 80 per cent are assigned for use in replacement Leaf batteries; lesser modules are reassembled and sold as batteries for forklifts, golf carts and lower-energy applications such as streetlamps.

The plant can process 2,250 battery packs a year and initially plans to refabricate “a few hundred” units annually, Makino said, adding that 4R would see whether the process could also be used for batteries from the latest Leaf model, which uses a different battery chemistry.

Makino said it would be difficult for 4R to completely break down and recycle EV batteries on its own, but that the company may consider partnering with another firm to retrieve reusable materials, a process industry experts say is key to sustainable EV battery production.