From MIT Technology Review:
In some ways, science is suffering from too much data. Experiments and computer simulations analyzing everything from the dynamics of climate change to the precise details of protein folding can churn out billions of numbers describing these physical phenomena. Making sense of all this data remains a challenge.
Recently, however, researchers at the University of California, Davis, and Lawrence Livermore National Laboratory announced that they have developed software that makes analysis and visualization of huge data sets possible without the aid of a supercomputer. The researchers’ algorithm slices the data into manageable chunks, then stitches the results back together on the fly, so the full data set can be manipulated in three dimensions on a computer with the power and capacity of a high-end laptop. [Full Article]
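The article gives no implementation details, but the chunk-and-stitch idea maps onto a familiar out-of-core pattern: memory-map the data set, pull one bounded slab into RAM at a time, reduce it, and merge the per-chunk results. Below is a minimal sketch of that generic pattern, not the researchers' actual algorithm; the file path, dtype, and chunk size are illustrative assumptions.

```python
import numpy as np

CHUNK = 1_000_000  # elements held in RAM at once; tune to available memory


def chunked_minmax(path, dtype=np.float32):
    """Scan a huge flat binary array chunk by chunk, never loading it whole."""
    data = np.memmap(path, dtype=dtype, mode="r")  # lazy, OS-paged view of the file
    lo, hi = np.inf, -np.inf
    for start in range(0, data.size, CHUNK):
        block = np.asarray(data[start : start + CHUNK])  # copy one slab into RAM
        lo = min(lo, block.min())                        # reduce the slab...
        hi = max(hi, block.max())
    return lo, hi                                        # ...and stitch results together


# Usage (assumes "simulation.dat" is a raw float32 dump from a simulation):
# lo, hi = chunked_minmax("simulation.dat")
```

Because each chunk is bounded, peak memory stays roughly constant no matter how large the file grows, which is presumably how laptop-class hardware can cope with data that would otherwise demand a supercomputer's memory.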