Designing The World’s Largest
Private Clouds For SQL Server

Ben DeBow, founder and CEO of Fortified Data, has architected and deployed mission critical enterprise data platforms for over 20 years. He is the architect of the Core Health Framework, Fortified Data’s proprietary evaluation system for enabling data platforms to achieve the highest levels of scalability, availability and performance. In this interview, Ben shares best practices and lessons learned from designing and deploying some of the world’s largest private clouds for SQL Server.

Q: When you say world’s largest, give us some detail or an example.

When we think of most organizations, size is relative because every company has a large, mission critical system. So what is large to us? Large to us is hundreds of terabytes and thousands of instances supporting hundreds or thousands of business applications for a single organization. However, more organizations are consolidating their SQL Server deployments, resulting in more data on fewer, more powerful servers. In most cases, the environment ends up being bigger than before in terms of data.

Q: What business requirements are driving these huge deployments?

The projects are really being driven from the business and consumer. The consumer is used to applications being delivered quickly.  The business has to keep up with new features and capabilities. This is putting pressure on IT organizations to move faster so the business can get an idea from concept to consumer in weeks, not months or years. The other driver is technology. Virtualization and more powerful infrastructure at every layer is enabling organizations to consolidate while offering the business faster, more targeted services to build their applications on. If executed correctly, this is a win for the business and IT. The business is able to take an idea to the marketplace in weeks and IT is able to reduce their physical footprint and cost.

Q: What are some common misconceptions related to designing highly consolidated SQL Server private clouds that achieve optimal scalability, availability and performance?

Several come to mind. The first is that ‘if we consolidate, we will have fewer servers, and therefore less to manage’. Yes, during a consolidation project, we standardize and decommission unused databases. But when the consolidation project is over, there are just as many databases that have to be managed by the DBAs. Reducing the SQL Server instance count yields many efficiencies, but the DBAs have just as much responsibility and work after the consolidation project ends.

Another common misconception is about running mission critical SQL Servers on virtualized hardware. It can be done and is being done by the largest companies in the world. Several large global banks have stated that they are going 100% virtual for their SQL Server deployments. The key to success with these environments is rooted in the engineering and design of the platform. A virtualized database platform must support large amounts of data movement and storage, and it has to support the 5% of systems that your business classifies as mission critical. This is done by working with all of your technology teams, just as before, to design a service offering that can guarantee the performance, availability and scalability of resources up and down the stack. Only then can we begin to agree to support an SLA of 99.999% and a PLA that works for your business.
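For context on what a 99.999% (“five nines”) SLA actually commits you to, the allowed downtime can be computed directly. A minimal sketch, assuming a 365-day year and a hypothetical helper name:

```python
# Allowed downtime implied by an availability SLA, assuming a 365-day year.
def downtime_minutes_per_year(availability: float) -> float:
    minutes_per_year = 365 * 24 * 60  # 525,600 minutes
    return minutes_per_year * (1 - availability)

for label, sla in [("three nines", 0.999),
                   ("four nines", 0.9999),
                   ("five nines", 0.99999)]:
    print(f"{label}: {downtime_minutes_per_year(sla):.2f} minutes/year")
```

Five nines works out to roughly five minutes of downtime per year, which is why the engineering bar for these platforms is so high.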

Q: What role, if any, can the public cloud play in a private cloud solution?

Every design that we create today takes the public cloud into account. When we design on-prem solutions, we design them so they can be easily extended to the public cloud, since many organizations are already deploying some aspect of their environment there. Another important consideration in the design process is to design the service offerings, in both sizing and features, to take the public cloud’s capabilities into account. This enables the business to better understand what is available on-prem versus in the public cloud.

Q: How do you achieve optimal taste and tenderness for a set of baby back ribs?

As a SC Certified BBQ judge, my first question is whether or not you have ever had smoked baby back ribs from the South. If not, book a flight to Charlotte, order some southern BBQ and it will change your life. Now, how do I achieve the optimal taste and tenderness? Tenderness is relatively easy, and it starts with knowing your tools. I’ve had a Backwood smoker for the last 5 years and have really gotten to know how it cooks. You need to really know the temperature profile of your smoker before you can truly crank out tender ribs every time.

The other trick is to cook the ribs for 1.5 hours at 250 degrees and then remove them. Then wrap each slab of ribs in aluminum foil with some butter and brown sugar for another 1.5 hours. After that, remove them from the smoker, unwrap them and apply your favorite finishing sauce. But don’t eat them yet! Put them back in the smoker for another 30 minutes. After that, take them out, let them rest for 30 more minutes, and then they are finally finished.

Taste is a personal preference, and there are many recipes out there. My tastes depend on the season and my mood. One interesting story: my neighbor across the street is Indian, and we get together to create hybrid rubs using traditional seasonings and his own Indian seasonings. The rubs that we have created are unique and have a flavor profile truly their own.