Keeping pace with the never-ending demand for storage space in the enterprise is a daunting task. While the direct-attached (DAS) and network-attached (NAS) storage of yesterday has largely been replaced by SANs hosting virtualized infrastructure, one constant remains: the traditional hardware refresh cycle as the means of addressing organizational storage needs.
Any IT professional responsible for storage planning will be familiar with this exercise. With each hardware refresh cycle, IT planners ask themselves…
Until recently, this approach may have sufficed. Enter big data.
Big data is disrupting the hardware refresh cycle, and nowhere more so than in data storage.
In response, IT professionals and organizations alike need a more dynamically responsive approach to storage provisioning.
Some organizations have turned to cloud storage architectures to address this challenge head on. With their elastic nature and scale-out capability, public, private, and hybrid cloud storage solutions are likely to become the cornerstone of organizational IT infrastructure.
In an article by Computerworld, analysts predict that within the next 8 years digital data will exceed 40 zettabytes (a zettabyte is one trillion gigabytes (GB)), or about 5,200 GB per person on earth. Emerging markets will likely become the dominant data generators, with their share rising from 36% to 62%. However, the research suggests that this data will mainly be produced by computers, not humans.
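As a back-of-the-envelope check of those figures, the per-person number follows directly from the total. The decimal definition of a zettabyte (10^12 GB) and the projected 2020 world population of roughly 7.6 billion are assumptions here, not figures from the article:

```python
# Sanity-check the "about 5,200 GB per person" claim.
# Assumptions: 1 ZB = 10**12 GB (decimal definition), and a projected
# world population of ~7.6 billion; neither figure appears in the article.

ZB_IN_GB = 10**12            # decimal zettabyte; the binary zebibyte is 2**40 GB
total_gb = 40 * ZB_IN_GB     # 40 ZB of digital data predicted
population = 7.6e9           # assumed projected world population

per_person_gb = total_gb / population
print(round(per_person_gb))  # roughly 5,300 GB per person
```

Under those assumptions the result lands close to the article's 5,200 GB figure, suggesting the analysts used the decimal definition and a similar population estimate.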
Customers will need more data, more storage, faster hardware, larger and faster networks, tighter security, "real" service-oriented architectures, "bring your own device" solutions, converged infrastructure, and greater overall efficiency. Metadata tags will be the critical element in farming and correlating this data. And by 2020, while cloud spending is projected to rise from 5% to 40%, the cost of storage will likely plummet. Interesting times ahead….