Humanity has a data storage problem. By 2025, IDC predicts we will be producing over 160 zettabytes of data per year. All of that data must be stored somewhere, whether on-premises, in the cloud, or a combination of the two. Despite some claims to the contrary, physical infrastructure will remain critical to data storage.
The last significant disruption to enterprise storage was flash, a revolution driven by vendors who made the technology simple to use, fast, evergreen, and more cost-effective than disk and tape. As a result, flash storage has drastically reduced the real estate and power that data centers require, while addressing a host of operational, reliability, and performance issues.
But What Exactly Is the Next Frontier?
According to MarketsandMarkets, the next-generation data storage market is expected to reach USD 102.2 billion by 2024. This growth will be driven by the massive increase in digital data volume, the proliferation of smartphones, laptops, and tablets, the expansion of the IoT market, and the increasing penetration of high-end cloud computing.
Several challenges will accompany this continued explosion of data:
Scalability. IDC predicts that by 2020, 90% of large enterprises will be generating revenue from data-as-a-service. Businesses everywhere will be looking for ways to scale their operations up and down based on customer demand for data services at any given time.
Rather than purchasing all of that storage at once and absorbing massive CapEx costs, or facing a huge forklift upgrade every few years, businesses need intelligent (or evergreen) scalability. Moving storage to an OpEx model drives uncertainty out of the equation: maintenance becomes predictable and fair; power and cooling operate on an as-needed basis; floor space is maximized; performance, scale, and feature upgrades are automatic; and capacity surprises become a thing of the past, ensuring your storage investment is protected.
Density. According to the Advanced Storage Technology Consortium, density will remain a top priority for businesses, helping to reduce the burden and cost of underutilized space and energy in data storage.
All-flash storage is making leaps and bounds in delivering this density. With all-flash, you don't need to buy extra hardware just to get the performance you need. Flash has increased the density and usable capacity of storage arrays, enabling organizations to reduce entire racks of equipment to a handful of rack units dedicated to storage, which directly reduces data center costs.
The edge. IDC expects that by 2025, nearly 30% of the world's data, much of it generated by Internet of Things (IoT)-connected devices, will need real-time processing. Naturally, this drives the continued movement of processing power to the edge.
If you’re not yet familiar with edge computing, it refers to the layer where the IoT connects the cloud to the physical world. In short, data is processed close to the device that generates it, rather than traveling “all the way” back to the cloud.
Edge computing for IoT data storage allows for more robust connections and helps businesses process information as quickly as their customers want it – i.e., now. The other key benefit of edge computing relates to security. Devices at the edge typically collect data that is more personal in nature – whether it’s a smart car tracking a user’s daily journey, or a voice assistant listening to your conversations – and ultimately, we want to know this information is secure. Edge storage and processing allow organizations to scrub away personal details and collate only the relevant data before it is ever sent to the cloud, thereby reducing the volume of data that needs to be stored.
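The scrub-and-collate pattern described above can be sketched in a few lines of code. This is purely an illustrative sketch, not any vendor's implementation: the field names, the list of "personal" attributes, and the aggregation scheme are all hypothetical.

```python
# Illustrative edge-side preprocessing: strip personally identifiable
# fields from raw device readings and aggregate them locally, so only
# a compact, anonymized summary is ever sent to the cloud.
# Field names and the aggregation are hypothetical examples.

PERSONAL_FIELDS = {"user_id", "location", "voice_sample"}

def scrub(reading: dict) -> dict:
    """Remove personally identifiable fields from a raw reading."""
    return {k: v for k, v in reading.items() if k not in PERSONAL_FIELDS}

def collate(readings: list) -> dict:
    """Reduce many scrubbed readings to one summary record for upload."""
    temps = [r["temperature"] for r in readings]
    return {
        "count": len(readings),
        "avg_temperature": sum(temps) / len(temps),
        "max_temperature": max(temps),
    }

raw = [
    {"user_id": "u42", "location": "51.99,4.38", "temperature": 21.5},
    {"user_id": "u42", "location": "51.99,4.39", "temperature": 23.5},
]
summary = collate([scrub(r) for r in raw])
# Two full readings collapse into one small, anonymized record,
# shrinking both the privacy exposure and the volume stored in the cloud.
```

The design point is simply that the reduction happens before transmission: the cloud never receives the personal fields at all, rather than receiving and then deleting them.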
Beyond the Next Frontier
For years, researchers have discussed the potential of data storage at a molecular level. For example, researchers from the Delft University of Technology in the Netherlands have developed a method of storing data by manipulating single atoms. Their testing revealed atomic storage can deliver 500 terabits per square inch – hundreds of times denser than today's hard drives. These sorts of studies have given way to conversations about the potential of DNA (deoxyribonucleic acid) as the next-generation storage medium – after all, it's dense, easy to replicate and scale, and stable. But we aren't there yet.
In this new age of data, every organization is more than the product or service it offers. Every organization is also an information business and should be looking to optimize its data storage in ways that deliver value to customers, mitigate risk, and drive higher return on investment. By considering three key challenges – scalability, density, and the edge – organizations will be well placed to embrace the next frontier of data storage.
Mark Jobbins, VP & Field Chief Technology Officer APJ, Pure Storage, wrote this article.
The views and opinions expressed in this article are those of the author and do not necessarily reflect those of CDOTrends.