The Evolution of Enterprise Storage Management
By Eduardo Rivera, IT Director
NetApp IT is a massive storage user, with more than 100 petabytes of data spread across over 450 storage controllers and more than 5,000 servers. Managing and optimizing this data is a huge undertaking, and shifts in strategy occasionally change how we approach it.
As a company that has been around for a while, we have also experienced the birth, growth, and influence of “the cloud”. For many enterprise IT organizations, often at the behest of ill-guided IT leadership, early cloud adoption meant forklifting data center assets and dumping them in someone else’s data center (aka “the cloud”). This approach quickly becomes unmanageable and financially unsustainable.
The tenets of cloud architecture are not the same as those of what most enterprise organizations have built on-premises, so applications that are simply forklifted land on architectures that do not always match what they were designed to do. Those early lessons have led to a more mature and efficient way of leveraging cloud resources: what we today call the hybrid cloud model, which balances on-premises and cloud resources so we can optimize for security, performance, and cost.
This approach allows organizations like NetApp IT to find the “sweet spot” of IT infrastructure needed to serve traditional and modern application workloads.
Recognizing and leveraging a hybrid cloud strategy for IT infrastructure is one thing, but we have also seen a change in consumer expectations regarding our IT infrastructure. The cloud revolution has influenced how consumers think IT resources should be presented and consumed. The expectation today is that everything should be available “as a service”: IT resource allocation should be easy, self-service, and immediate. This includes storage infrastructure, and it implies that these storage services should be available across the hybrid cloud model.
What does the hybrid cloud mean to storage infrastructure?
This really breaks down into two primary areas:
– How do we deliver storage, moving from a traditional environment to the hybrid cloud?
– What does the role of the storage engineer look like?
A big change for NetApp IT is a new customer type. Modern consumers of storage services want to use storage as a service, with self-service options and portals. They want a simplified experience where they can easily identify what they need, procure it, and access it quickly. On the other hand, our storage administrators still need to set rules and boundaries that determine how storage services are provided to end consumers.
However, the servicing of these storage assets happens through an orchestrator: middleware that ties the user request to the provisioning of the actual storage infrastructure. As storage administrators, we are charged with serving this orchestrated automation workflow, setting limits for it but ultimately letting it manage the minutiae of creating the assets the end user requests. In this scenario, the automation platform becomes the customer that storage administrators now have to serve.
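The pattern described above can be sketched in a few lines of Python. This is a minimal illustration, not NetApp IT's actual implementation: all limit values, field names, and function names are hypothetical. The idea is that the storage team owns the guardrails as data, and the orchestrator validates each self-service request against them before provisioning.

```python
# Guardrails the storage administrators own. The specific numbers and
# categories here are illustrative, not a real policy.
LIMITS = {
    "max_size_gib": 10240,                       # largest self-service volume
    "allowed_tiers": {"gold", "silver", "bronze"},
    "allowed_protocols": {"nfs", "iscsi"},
}

def validate_request(request: dict) -> list:
    """Return a list of policy violations; an empty list means the
    request may proceed to provisioning."""
    errors = []
    if request["size_gib"] > LIMITS["max_size_gib"]:
        errors.append("size %d GiB exceeds limit" % request["size_gib"])
    if request["tier"] not in LIMITS["allowed_tiers"]:
        errors.append("unknown tier %r" % request["tier"])
    if request["protocol"] not in LIMITS["allowed_protocols"]:
        errors.append("protocol %r not offered" % request["protocol"])
    return errors

def provision(request: dict) -> str:
    """Orchestrator entry point: enforce the guardrails, then hand off."""
    errors = validate_request(request)
    if errors:
        raise ValueError("; ".join(errors))
    # In a real workflow this step would call the storage platform's API
    # to create the volume; here we just report what would happen.
    return "provisioned %d GiB %s volume" % (request["size_gib"], request["tier"])
```

The administrator never touches the individual request; they only maintain `LIMITS`, and the automation handles the minutiae within those boundaries.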
For our storage team, this means leaning on automation. Previously, storage infrastructure was built by hand and monitored by humans. That is still true today, but future storage will be delivered through code, and integrating with other platforms, such as orchestrators, will be vital. Engineers must also be able to control and maintain storage through code, covering everything from implementing standards to running that code on an ongoing basis.
Build automation processes that work
This starts with understanding the stack. Storage isn’t on an island; it’s part of a robust, symbiotic environment that affects the entire IT ecosystem. Our storage automation program is a “perpetual effort to define all storage infrastructure deployment, configuration, and operational tasks via code.”
Or, in simpler terms, it’s how we see the future of storage management.
If there’s a way to automate a process, we should do so. Start with simple processes and use available tools. Avoid creating new processes that don’t work with the wider environment. Adopt an agile mindset and accept that this work will never be finished: it’s continuous improvement, where perfection is impossible.
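One example of a simple process worth automating first, sketched under stated assumptions: checking deployed volume settings against the team's published standard. The standard shown and the volume attributes are illustrative only; in practice the settings would be pulled from the storage platform's API rather than a literal.

```python
# Hypothetical configuration standard owned by the storage team.
STANDARD = {
    "snapshot_policy": "daily",
    "encryption": True,
    "export_policy": "default",
}

def find_drift(volume: dict) -> dict:
    """Compare one volume's settings to the standard and return only
    the fields that deviate, with expected vs. actual values."""
    return {
        key: {"expected": expected, "actual": volume.get(key)}
        for key, expected in STANDARD.items()
        if volume.get(key) != expected
    }

# Illustrative input; a real check would iterate over volumes fetched
# from the platform's API.
volume = {
    "name": "vol_app01",
    "snapshot_policy": "daily",
    "encryption": False,
    "export_policy": "default",
}

drift = find_drift(volume)  # flags the encryption mismatch
```

A check like this is small, uses only what is already available, and fits the continuous-improvement mindset: the standard evolves, and the code simply follows it.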