
Government Spends Millions On Computers To Wrangle Big Data

The U.S. Department of Energy is commissioning some of computing's biggest names to create two supercomputers capable of handling Big Data demands.

By Erik Shute | November 14, 2014

IBM, NVIDIA and other tech brands are cooking up two supercomputers for the U.S. Department of Energy.

The initiative will produce Summit and Sierra, expected to be the world's fastest computers when they arrive by 2017. Summit will be an open resource for scientists and everyday research, while Sierra will be used strictly for nuclear security at Lawrence Livermore National Laboratory.

The machines will be able to process at speeds of up to 300 and 100 petaflops, respectively. A petaflop is a measure of how quickly a computer processes information: one quadrillion (10^15) calculations per second. (Video via NVIDIA)
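
To put those speeds in perspective, here is some back-of-the-envelope arithmetic. The workload size is made up purely for illustration; the peak petaflop figures are the ones cited above:

```python
# One petaflop = 10**15 floating-point operations per second.
PETAFLOP = 10**15

# A hypothetical workload of one quintillion operations (illustrative only).
workload = 10**18

# Peak speeds cited in the article, plus a one-petaflop 2008-era machine.
for name, petaflops in [("Summit", 300), ("Sierra", 100), ("2008 machine", 1)]:
    seconds = workload / (petaflops * PETAFLOP)
    print(f"{name} ({petaflops} petaflops): {seconds:,.1f} seconds")
```

At these rates the same job that ties up a one-petaflop machine for well over fifteen minutes finishes on Summit in a few seconds.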


Considering IBM built the first supercomputer to break the one-petaflop barrier back in 2008, we've come a long way. So why the need for speed? It goes beyond headlines about the hundreds of millions of dollars spent. The DOE's official answer:

"High-performance computing is an essential component of the science and technology portfolio required to maintain U.S. competitiveness and ensure our economic and national security."

VentureBeat adds: "The unstated goal: We must have more computing power for scientific research than other countries can marshal, because data is power."

More importantly, harnessing that power will rely on an approach known as "data-centric computing," designed for the world of Big Data.

According to IBM’s press release, we generate 2.5 billion gigabytes of data every day. That includes every Facebook post and every weather pattern, and much of it is lost to today’s poor processing methods. (Video via IBM)
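
IBM's figure is easier to grasp with a quick unit conversion (the arithmetic below is an illustration, not from the press release):

```python
# IBM's stated figure: 2.5 billion gigabytes of data generated per day.
gb_per_day = 2.5e9

# 1 exabyte = 1 billion gigabytes, so that is 2.5 exabytes per day.
eb_per_day = gb_per_day / 1e9

# Spread over 86,400 seconds, that is roughly 29,000 GB every second.
gb_per_sec = gb_per_day / 86_400

print(f"{eb_per_day} exabytes/day")
print(f"{gb_per_sec:,.0f} GB/second")
```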

IBM’s new supercomputers are designed to shuttle less data through the computer. Engadget offers a helpful analogy: imagine the processor as a hydroelectric dam and the data as water. Normally we squeeze millions of gallons of water through the dam. But what if we didn’t move the water to process it? What if we moved the dam, the processor, instead? That’s where the efficiency lies.
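
The dam analogy can be sketched in a few lines of Python. This is a conceptual illustration of data-centric computing, not IBM's actual architecture: instead of shipping every record to one central processor, each node computes where its data lives and ships back only a tiny partial result.

```python
def process_centrally(nodes):
    # Traditional model: move every record to one place, then compute.
    all_data = [x for node in nodes for x in node]  # heavy data movement
    return sum(all_data)

def process_in_place(nodes):
    # Data-centric model: each node computes over its own local data,
    # so only small partial results cross the interconnect.
    partials = [sum(node) for node in nodes]  # compute at the data
    return sum(partials)

# Hypothetical data spread across three nodes; both models agree,
# but the second moves three numbers instead of six records.
nodes = [[1, 2, 3], [4, 5], [6]]
print(process_centrally(nodes))  # 21
print(process_in_place(nodes))   # 21
```

The result is identical either way; the difference is how much data has to travel, which is exactly what the dam analogy is getting at.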

The DOE expects the two new supercomputers to deliver better hurricane prediction, improved biofuels and other benefits to national infrastructure by processing Big Data.

This video includes images from the Lawrence Livermore National Laboratory and the U.S. Department of Energy.
