Pushing Data to the Atomic Level

In Big Data by Daniel Newman

Fiber optic cables connected to optical ports

This post is sponsored by Samsung Business. All thoughts and opinions are my own.

Miniaturization is an important goal in computer technology. The smaller we can make computer components, the more we can fit into a single device and the more powerful it will be. Data storage is no exception. Scientists dream of creating data storage centers that can store the entire library of human knowledge in an area the size of a pinhead.

We aren’t quite at that point yet, but we are getting closer. Right now, I own a one-terabyte external hard drive that’s about three inches in diameter. Scientists recently created an atom-based storage device that can store around 500 terabits in a single square inch. That impressive creation makes a once top-of-the-line purchase look like something from the Stone Age. Still, the technology hasn’t reached the level scientists dream of. Our need for storage space continues to climb rapidly, and scientists are racing to stay ahead of it.

We Need More Storage

Why do we need to store so much information in such a small area? Well, the amount of information the human race is creating, and putting to use, is constantly increasing. In fact, our information output is growing at an exponential rate. Scientists’ best estimates put the amount of information the human population produces every single day at around 2.5 quintillion bytes of data.

Currently, Google handles more data than any other company in the world. Its servers process 3.5 billion requests each day and store 10 exabytes (10 billion gigabytes) of data. Facebook, Microsoft, and Amazon are all on Google’s heels in terms of how much data their servers store.

Speaking of servers, Amazon, Google, and Microsoft all require more than a million servers to hold their vast stores of data. Imagine how much physical space three million servers take up, and how much power they need to run and to stay cool. Now imagine devices only a few inches (or maybe a few feet) in diameter replacing all those servers. That’s what storing data on an atomic level promises us. The amount of energy we could save is staggering. So too are the implications of being able to store so much data.

In 2013, the internet contained roughly 500 exabytes of data. By 2020, scientists estimate that number will have skyrocketed to 40 zettabytes. That is 40 trillion gigabytes! If the amount of data we produce continues to grow at this rate, atomic storage will not only be useful, it will be an absolute necessity.
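The unit math gets slippery at this scale, so here is a quick back-of-the-envelope check of the figures above, written as a short Python sketch. The constants are simply the decimal (SI) definitions of the storage units; the data figures are the estimates quoted in this post, not fresh measurements.

```python
# Decimal (SI) definitions of the storage units used in this post.
GIGABYTE = 10**9    # bytes
EXABYTE = 10**18    # bytes
ZETTABYTE = 10**21  # bytes

daily_output = 2.5 * 10**18      # ~2.5 quintillion bytes produced per day
google_storage = 10 * EXABYTE    # ~10 exabytes held by Google
internet_2020 = 40 * ZETTABYTE   # projected size of the internet by 2020

print(daily_output / EXABYTE)    # 2.5   -> about 2.5 exabytes per day
print(google_storage / GIGABYTE) # 1e10  -> 10 billion gigabytes
print(internet_2020 / GIGABYTE)  # 4e13  -> 40 trillion gigabytes
```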

Creating Quantum Computers

Storage is not the only use for atoms. Computer scientists have theorized about quantum computers for the last three decades. The idea behind quantum computing is to use atoms and molecules not only to store data but also to perform processing and other memory tasks.

In fact, scientists have already built rudimentary quantum computers that can perform simple calculations. The technology these machines require is still extremely rare, and scientists’ ability to manipulate qubits (quantum bits, the medium on which quantum computers store information) is still very limited. To date, the most advanced quantum computer has only been able to control 16 qubits at one time. Practical uses for quantum computers are still a long way off, but they are definitely coming.
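To give a rough sense of why controlling even 16 qubits matters: each added qubit doubles the number of basis states a quantum register can occupy in superposition, so capability grows exponentially with qubit count. The short Python sketch below only illustrates that scaling; it is not how a real quantum computer is programmed.

```python
# Number of basis states an n-qubit register can hold in superposition.
def state_space(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (1, 8, 16, 50):
    print(f"{n:>2} qubits -> {state_space(n):,} basis states")
# 16 qubits already span 65,536 states; 50 would span over a quadrillion.
```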

Our data production grows at a truly staggering rate. So does the need to store that data for analysis so that we might learn from it. Scientists are striving to create storage devices that can hold more data in much smaller spaces. Atomic storage devices are in development. The near future may see these types of storage centers replacing massive server warehouses.

For more content like this, follow Samsung Business on Insights, Twitter, LinkedIn, YouTube and SlideShare.

Photo Credit: Universidad Politécnica de Madrid via Compfight cc

Daniel Newman is the Principal Analyst of Futurum Research and the CEO of Broadsuite Media Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise. From Big Data to IoT to Cloud Computing, Newman makes the connections between business, people and tech that companies need to benefit most from their technology projects, which leads to his ideas regularly being cited in CIO.com, CIO Review and hundreds of other sites across the world. A five-time best-selling author, most recently of “Building Dragons: Digital Transformation in the Experience Economy,” Daniel is also a Forbes, Entrepreneur and Huffington Post contributor. An MBA holder and graduate adjunct professor, Daniel is a Chicago native whose speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
