Attempting to keep pace with never-ending demands for storage space in the enterprise is a daunting task. While the DAS and NAS technologies of yesterday have been largely replaced by SANs hosting virtualized infrastructure, one constant remains: the use of the traditional hardware refresh cycle as a means to address organizational storage needs.
Any IT professional responsible for storage planning will be familiar with this exercise. With each hardware refresh cycle, IT planners ask themselves…
- What is our current storage footprint?
- What is our expected rate of change over the life of the storage platform under consideration?
- How well-suited is the storage platform for expansion over its planned life?
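The first two questions above reduce to a simple compound-growth projection. As a hedged sketch (the function name, the 100 TB footprint, the 30% annual growth rate, and the 5-year platform life are all illustrative assumptions, not figures from any real deployment):

```python
def projected_capacity_tb(current_tb: float, annual_growth: float, years: int) -> float:
    """Project end-of-life storage need, assuming compound annual growth."""
    return current_tb * (1 + annual_growth) ** years

# Hypothetical example: 100 TB today, 30% annual data growth, 5-year refresh cycle
need_tb = projected_capacity_tb(100, 0.30, 5)
print(f"Projected need at end of platform life: {need_tb:.1f} TB")  # ~371.3 TB
```

Even this back-of-the-envelope model shows why the refresh-cycle approach strains under big data: a modest-sounding growth rate can more than triple the required footprint within a single platform's life.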
Until recently, this approach may have sufficed. Enter big data.
Big data is disrupting the hardware refresh cycle, and nowhere more so than in data storage.
In response, IT professionals and organizations alike need a more dynamically responsive approach to storage provisioning.
Some organizations have turned to cloud storage architectures to address this challenge head-on. With their elastic nature and scale-out capability, public, private, and hybrid cloud storage solutions are likely to become a cornerstone of organizational IT infrastructure.