How to Build the Ideal Storage Solution: Part 2 – “It’s the software, stupid!”

In part one of this series I explained how a Coho Data storage solution uses the best in commodity hardware offerings to build the next generation of web-scale storage arrays. Now hardware is, of course, incredibly important to any storage solution, but it is far from the complete picture. Hardware will only take you so far, and that journey will be a boring one as it revolves around just speeds and feeds.

That was the approach used by storage providers back in the nineties: big, fast boxes were soon replaced by bigger, faster boxes, which would themselves be replaced over time. This was the kind of storage that I was taught we should buy when I first started in IT.

Speeds and feeds are great, but I need more…

My problems as an IT admin never seemed to be solved by a bigger and faster array. Sure, I could hold more data and process storage IO requests faster, but what about knowing who was creating all of that data in the first place? How about some insight into what department’s data was creating a potential hot spot, so that I could plan my next storage purchase accordingly? Even better, how about intelligently preventing hot spots in the first place? Why could I not find a storage solution capable of doing all of this?

Then one day the answer struck me like a thunderbolt. I was looking in the wrong place for the answers to my problems. I was never going to find a hardware-only solution that could do what I needed. It was like someone slapped me in the back of the head and said: “It’s the software, stupid!”

Luckily, I was not the only person who felt this way. After dealing with that incredibly wicked Y2K bug (code for “the stupidest reason I was ever forced to be on call during New Year’s Eve by a paranoid non-technical person,” by the way), we soon began to see arrays appear on the market with technologies like snapshotting and predictive capacity usage. I wish I could say that shortly after these technologies emerged we had storage solutions that could prevent hot spots and provide detailed insight into how particular collections of data were utilizing the storage system.

I wish I could say that, but instead I would have to wait over a decade before a storage company figured out how to combine scale-out storage with excellent performance. Luckily, I happen to work for that storage company today (Coho Data).

Enter the Salmon

Want to avoid getting stuck in a traffic jam? Do not drive into a traffic jam to begin with. That is one of the things that I love about having GPS navigation on my smartphone; I not only know where I am going, but I also know where the bottlenecks are on the road before I encounter them.

The architecture of a Coho Data storage solution works on a similar principle. The Software Defined Networking (SDN) controller for the 10GbE switches knows both the available capacity and the performance potential of each Microarray on the network. Just as GPS navigation can keep a driver on the open road to avoid traffic problems, the SDN controller can prevent data hot spots by ensuring that a “traffic jam” never starts in the first place.
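To make the idea concrete, here is a minimal Python sketch of this kind of headroom-aware placement. It is an illustration only, with hypothetical names like Microarray and choose_target; it is not Coho Data’s actual controller logic.

```python
# A minimal sketch (not Coho Data's implementation) of the idea:
# route each new write to the microarray with the most available
# performance headroom, so a "traffic jam" never forms.

from dataclasses import dataclass

@dataclass
class Microarray:
    name: str
    capacity_free_gb: float   # remaining usable capacity
    iops_headroom: float      # unused performance potential

def choose_target(microarrays: list[Microarray], write_size_gb: float) -> Microarray:
    """Pick a microarray that can absorb the write and has the most headroom."""
    candidates = [m for m in microarrays if m.capacity_free_gb >= write_size_gb]
    if not candidates:
        raise RuntimeError("no microarray has enough free capacity")
    # Prefer the node with the most unused performance, like a GPS
    # routing a driver onto the emptiest road.
    return max(candidates, key=lambda m: m.iops_headroom)

# Example: the controller steers a 10 GB write away from the busy node.
fleet = [
    Microarray("ma-1", capacity_free_gb=500, iops_headroom=2_000),
    Microarray("ma-2", capacity_free_gb=800, iops_headroom=45_000),
]
print(choose_target(fleet, 10).name)  # -> ma-2
```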

Furthermore, the SDN controller is not directing traditional blocks on the storage network; instead, it is directing multiple copies of the same data to ensure data protection across multiple Microarrays. Every VM hosted on a Coho Data solution is distributed across an object-based datastore in the form of multiple stripes of data. Each stripe is duplicated, so if an event like a hard drive failure occurs, the data is still available on a completely different Microarray. The data is immediately duplicated again to yet another Microarray, and all of this is possible because the SDN controller is coordinating traffic amongst all of the Microarrays.
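Below is a hedged Python sketch of that replicate-and-recover pattern. The functions place_replicas and handle_failure are hypothetical illustrations of the idea, not Coho Data’s API, and the two-copy policy shown is an assumption based on the description above.

```python
# Illustrative sketch: each stripe is written to two different
# microarrays, and when one node fails, its stripes are immediately
# copied again to a healthy third node. Names are hypothetical.

import random

def place_replicas(stripe_id: str, nodes: list[str]) -> list[str]:
    """Place two copies of a stripe on two distinct microarrays."""
    return random.sample(nodes, 2)

def handle_failure(placement: dict[str, list[str]], failed: str, nodes: list[str]) -> None:
    """Re-replicate every stripe that lost a copy on the failed node."""
    healthy = [n for n in nodes if n != failed]
    for stripe, replicas in placement.items():
        if failed in replicas:
            surviving = [n for n in replicas if n != failed]
            # Copy the surviving replica to a node not already holding it.
            new_node = next(n for n in healthy if n not in surviving)
            placement[stripe] = surviving + [new_node]

nodes = ["ma-1", "ma-2", "ma-3"]
placement = {f"stripe-{i}": place_replicas(f"stripe-{i}", nodes) for i in range(4)}
handle_failure(placement, "ma-2", nodes)
# Every stripe is back to two copies, none of them on the failed node.
assert all("ma-2" not in reps and len(reps) == 2 for reps in placement.values())
```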

What does it mean for you?

The ideal storage solution relies on software to intelligently deliver storage resources regardless of the hardware it runs on. With intelligent software, you as the customer can use commodity hardware to deploy storage in minutes, eliminate LUNs and volumes, and mix and match hardware as you scale your environment. Intelligent software reduces your storage management overhead while increasing your applications’ uptime.

Intelligent software is not just a technical advantage; it offers key business benefits as well. Leveraging commodity hardware, using software-defined networking to eliminate storage bottlenecks, and reducing the time spent managing storage are only possible with a well-designed software layer. Coho Data storage solutions give IT architects, administrators, and CIOs/CTOs the insight needed to focus on delivering highly available application performance. Managing a Coho Data solution lets you evaluate your storage performance at a glance instead of managing storage constantly. IT organizations can thus focus on deploying applications more quickly, scaling on demand, and reducing costs while mundane management tasks are handled automatically by the software.

Part 1 of this series explained how commodity hardware can be used to build the ideal storage solution. Part 2 described how intelligent software unleashes the true potential of that hardware without overwhelming the administrators of a storage solution. Be sure to check out “How To Build the Ideal Storage Solution Part 3: Bring Me Your Applications!” to learn how the ideal storage solution is the one that enables your applications to run at their best no matter what!

Interested in Coho Data? You should download our datasheet or get the independent analyst review by ESG here.
