Surviving the flood

Published in Physics World, 1 Oct 2014

Planned big-science facilities are set to generate more data than all the global Internet traffic combined. Jon Cartwright finds out how scientists will deal with the data deluge

When the €2bn ($2.6bn) Square Kilometre Array (SKA) sees first light in the 2020s, astronomers will have an unprecedented window into the early universe. Quite what the world’s biggest radio telescope will discover is of course an open question – but with hundreds of thousands of dishes and antennas spread out across Africa and Australasia, you might think the science will be limited only by the telescope’s enormous sensitivity or its field of view.

But you would be wrong. “It’s the electricity bill,” says Tim Cornwell, the SKA’s head of computing. “While we have the capital cost to build the computer system, actually running it at full capacity is looking to be a problem.” The reason SKA bosses are concerned about electricity bills is that the telescope will require the operation of three supercomputers, each with an electricity consumption of up to 10 MW. And the reason that the telescope needs three energy-hungry supercomputers is that it will be churning out more than 250 000 petabytes of data every year – enough to fill 36 million DVDs. (One petabyte is approximately 10^15 bytes.) When you consider that uploads to Facebook amount to 180 petabytes a year, you begin to see why handling data at the SKA could be a bottleneck.

This is the “data deluge” – and it is not just confined to the SKA. The CERN particle-physics lab, for example, stores around 30 petabytes of data every year (and discards about 100 times that amount) while the European Synchrotron Radiation Facility (ESRF) has been annually generating upwards of one petabyte. Experimental physics is drowning in data, and without big changes in the way data are managed, the science could fall far short of its potential. […]

To read the rest of this article, please email for a pdf.

Quantum data are compressed for the first time

Published in Physics World, 29 Sep 2014

A quantum analogue of data compression has been demonstrated for the first time in the lab. Physicists working in Canada and Japan have squeezed quantum information contained in three quantum bits (qubits) into two qubits. The technique could pave the way for a more effective use of quantum memories and offers a new method of testing quantum logic devices.

Compression of classical data is a simple procedure that allows a string of information to take up less space in a computer’s memory. Given an unadulterated string of, for example, 1000 binary values, a computer could simply record the frequency of the 1s and 0s, which might require just a dozen or so binary values. Recording the information about the order of those 1s and 0s would require a slightly longer string, but it would probably still be shorter than the original sequence. […]
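The classical counting idea described above can be sketched in a few lines of Python. This is a toy illustration of the principle only, not the quantum protocol reported in the article; the function names are invented for this example.

```python
def frequency_summary(bits):
    """When order doesn't matter, a 1000-bit string can be summarized
    by just its length and its count of 1s."""
    return (len(bits), sum(bits))

def run_length_encode(bits):
    """Recording order information compactly: store (value, run length)
    pairs instead of the raw sequence."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1][1] += 1   # extend the current run
        else:
            runs.append([b, 1])  # start a new run
    return runs

bits = [1] * 600 + [0] * 400
print(frequency_summary(bits))   # (1000, 600)
print(run_length_encode(bits))   # [[1, 600], [0, 400]]
```

For a highly ordered string like the one above, both summaries are far shorter than the original 1000 values; for a maximally disordered string, the run-length version can be longer, which is why real compressors fall back to the raw data in that case.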

The rest of this article is available here.

Neural network cuts false-positive recalls

Published in MPW, 15 Aug 2014

Researchers in the US have developed an image-analysis technique that could cut the number of false-positive results in routine screening mammography. The technique, which uses a neural network to simultaneously analyse four different images, could save on costs associated with recalls, and relieve patients of undue worry (Phys. Med. Biol. 59 4357).

Routine mammography has been shown to significantly reduce mortality associated with breast cancer, which according to the World Health Organization is by far the most common cancer in women worldwide. Unfortunately, searching for true cases of malignant tumours among a majority of healthy women is a difficult task. Doctors naturally err on the side of caution, which means that of the 6 to 12% of women who are recalled, just one-tenth or fewer are truly in need of treatment. According to one study, in 10 years of screening more than half of women will experience a false-positive recall. […]

The rest of this article is available here.

Solar variation has little impact on cloud nuclei

Published in ERW, 17 Jul 2014

The Sun’s 11-year cycle of activity affects the formation of cloud condensation nuclei on Earth by less than 1%, according to a study performed in the US. The amplifying effect, although larger than some previous estimates, is still too small to account for the observed solar signature on the Earth’s temperature record.

The brightness of the Sun varies by about 0.1% over its 11-year cycle. Although this difference has left an observable signature on the Earth’s temperature over time, it is far too small to account for the majority of climate change on Earth. Confusingly, however, calculations show that the amplitude of the signature is 2.5 to 3 times too large for it to be caused by a direct heating effect. […]

The rest of this article is available here.

‘Quadrapeutics’ magnifies chemoradiation

Published in MPW, 17 Jul 2014

Nanoscopic explosions triggered by a laser can enhance the effectiveness of traditional cancer therapies by 10 to 100 times, according to scientists in the US who are pioneering the technique. “Quadrapeutics” appears to kill cancer cells while leaving healthy cells intact, and could prove especially good at targeting drug-resistant tumours and those in children (Nature Medicine 20 778).

Developed by biochemist Dmitri Lapotko at Rice University in Houston, TX, and colleagues, quadrapeutics is so called because it relies on the administration of four different components: chemotherapy drugs, gold nanoparticles, a laser pulse and X-ray radiation. At its heart, however, is the idea that tiny mechanical explosions – which the researchers call nanobubbles – amplify the effect of chemotherapy and radiotherapy, but only in cancerous cells. […]

The rest of this article is available here.

Cosmology surge keeps Europe ahead in dark matter research

Published in Horizon, 7 Jul 2014

A new wave of researchers is setting out to help shed light on the unseen forces shaping the universe, and keep Europe at the forefront of experimental cosmology.

Earlier this year, Dr Alexey Boyarsky at Leiden Observatory in the Netherlands discovered what could be indirect evidence of dark matter, while researchers at the European Space Agency (ESA) are now developing a probe that will try to measure dark energy, the unseen force believed to be driving the accelerated expansion of the universe. […]

The rest of this article is available here.

Rise of shine

Published in New Scientist, 5 Jul 2014

The solar cell of the future will be flexible, highly efficient and oh-so cheap – just as long as we can make it work in the rain

GOOD things come to those who wait, and Tsutomu Miyasaka had waited a long time. Knowing that a solar cell can be made using just about any pigment – coffee, chlorophyll, red wine – the Japanese physicist had spent years testing all sorts of colourful substances in the hope of finding one as efficient as it was cheap. Then, one day in April 2007, a student walked into his lab at the University of Tokyo, carrying a lump of an unremarkable mineral called perovskite.

It proved to be the start of something entirely remarkable. When Miyasaka reported the results from his first perovskite solar cell in 2009, it converted just 4 per cent of the sunlight’s energy to electricity. By 2012, the figure was over 10 per cent, and others were beginning to take notice. With groups around the world now on the case, efficiencies are touching 20 per cent, beating most solar cells currently on the market. “The field is moving so quickly – everyone is jumping on board,” says physicist Michael Johnston of the University of Oxford. Is this the solar technology everyone has been waiting for? […]

The rest of this article is available here.

Pooling funding to accelerate the move to smart electronics

Published in Horizon, 13 June 2014

If we want Europe to gain market share in developing technology for smart devices like phones and tablets, we must embrace public–private partnerships, according to Dr Andreas Wild, executive director of the EU’s new Electronic Components and Systems for European Leadership (ECSEL) Joint Technology Initiative.

Why is electronics so important to the EU?

‘Back in June 2011 the European Commission’s High-Level Group on Key Enabling Technologies recognised certain technologies as being important for Europe, and one of these technology areas was micro- and nanoelectronics, including semiconductors. In a nutshell, nanoelectronics provides the ‘smart’ in everything. If you have a smartphone, it’s smart because there’s a chip inside with software embedded in it and functionalities integrated around it. A smart card is not smart because of the fancy plastic, but because behind the gold pattern there is a chip with embedded software and integrated functionalities. Likewise, every smart object is smart because of the chip inside – and that’s micro- and nanoelectronics. There are economic implications of being able to make smart products, and there are also strategic implications in safety, security, autonomy and independence.’ […]

The rest of this article is available here.

Functional MRI tracks neurotransmitters

Published in MPW, 3 June 2014

Researchers in the US have for the first time used MRI to follow the dynamics of neurotransmitters with molecular precision. They have demonstrated the technique on dopamine, a neurotransmitter that mediates reward and motivation in the brain.

Neurotransmitters are the chemicals released at the end of nerve fibres to communicate signals to other nerve fibres in the vicinity. An understanding of their dynamics is important for a more general understanding of brain function, yet they are hard to study. In the past, scientists have resorted to PET, an imaging technique that relies on a radioactive tracer being inserted into the body so that its path can be monitored by emitted gamma rays. But PET can only supply images that are spatially accurate to within a few millimetres and temporally accurate to within a few minutes. […]

The rest of this article is available here.

Winds of change

Published in Physics World, 1 June 2014

The future of the wind industry is looking brighter thanks to a decades-old laser technology. Jon Cartwright explains how laser anemometry could cut the cost of wind energy and boost its share of the world’s energy market

On a coastal plain in Østerild, north Denmark, a gargantuan white structure turns solemnly in the breeze. The latest wind turbine designed by the Danish manufacturer Vestas, the V164, is the biggest yet: at 220 m, it is well over twice the height of the Statue of Liberty. And when it was finally tested in Østerild at the beginning of 2014, it also proved to be the world’s most powerful – capable of generating 8 MW of power, enough to provide electricity for some 7500 homes.

The V164 is a symbol of the wind industry’s recent success. Over the past 14 years, the amount of installed wind capacity across the world has risen dramatically, from just 17 000 MW in 2000 to nearly 320 000 MW last year – corresponding to about 4% of the world’s electricity demand, according to the Global Wind Energy Council. The boom has been due partly to a surge in the construction of turbines in China, but many smaller countries are also adopting the technology. The UK, for example, generated 10% of its electricity from wind power last year, and it has more offshore wind capacity than the rest of the world combined.

Despite this success, however, the industry has sometimes struggled politically – not least because of a conflict between the cost and location of wind farms. Onshore wind power is relatively cheap: it costs about $87 per megawatt-hour, midway between natural gas ($66/MWh) and coal ($100/MWh), according to a 2013 report by the Energy Information Administration (an agency of the US Department of Energy). Plans for new onshore wind farms often face strong local opposition, however, which is why politicians frequently look offshore for new opportunities. But offshore wind is far more costly: the same 2013 report rates it as more expensive than nearly any other energy technology – renewable or otherwise – at about $222/MWh. The high cost of offshore wind was highlighted in March this year when Scottish and Southern Energy, a UK gas and electric company, announced that it would cut its investment in offshore turbines in order to assure a two-year price freeze for its customers. […]

To read the rest of this article, please send an email for a pdf.