I was recently meeting with SAP customers while traveling through Singapore and Bangkok. What I found on these travels was a growing market with unique challenges, and some brilliantly spicy food. I also found a customer base dealing with the same global questions of “HANA” and “Cloud”. It’s a statement of fact that IT must evolve: reduce the ongoing cost equation and replatform to faster, more flexible architectures in order to keep pace with the Line of Business (LOB). One of my old mentors from the ’90s had a tag line: “Speed Kills”. The sensibility of this statement has only become more relevant over the last 20 years.
Yes, HANA and Cloud are the two levers of change in the SAP customer base, and I find a constituency that is focused on getting this right. IT budgets aren’t what they were in the ’90s, and customers are dealing with the major realities of running mature global IT operations. I interpret their collective position as one of trying to solve a challenging puzzle.
As I prepared to present last week, I wanted to engage the audience. While waiting for my turn to speak, I came up with the “HANA Puzzle” concept below. The “HANA Puzzle” went over pretty well with the audience and I think it’s relevant for the broader SAP community, so I wanted to share it with you. Here’s a quick step through of my Whiteboard talk.
“Can you solve it?” I asked the audience. They trained their eyes on the seemingly arbitrary list of letters, yet found no hidden key. So, I began to explain to them the HANA puzzle.
The first letter is “A”, which stands for…
APPLIANCE – When HANA was first released, SAP limited infrastructure variability by requiring every deployment of HANA to be installed on a certified appliance. This ensured HANA had the appropriate compute horsepower required to run, and it simplified the deployment process for the customer. Even today there are many customers inclined to consider an appliance model for their deployment of HANA because of its initial simplicity. In reality, the appliance model was a contemporary of early HANA, when limits were welcomed, and it loses favor for mature deployments. Today, with HANA deployments moving into their second, third, and fourth steps of evolution, TDI has become the model of choice.
TAILORED DATA CENTER INTEGRATION (TDI) – TDI is the ability to install HANA on top of a customer’s IT landscape through a self-certification process. There are still some requirements for component validation, but the effect is a significant savings in overall TCO. I recommend this paper by Antonio Freitas on the mainstreaming of TDI for a full review of TDI’s impact.
Why is TDI a better solution for TCO? Simple: IT operations have been refined over multiple decades to optimize on a horizontal model. Key optimization techniques like capacity planning and load balancing are a function of maximizing shared resources. Most customers have found that they can run HANA successfully within their existing landscape, or optimize their infrastructure with new technology that maximizes across multiple axes, not just their HANA deployment. As important as cost, this approach also provides maximum flexibility for operations. Finally, using IT standards leverages the company’s existing skill sets.
All of these are key optimizations that TDI enables, but the single most important optimization TDI supports is our “V” in the puzzle. Here is a blog by SAP’s Bill Zang covering the impact of TDI and virtualization on the cost of systems operations.
VIRTUALIZATION – OK, I am guessing a few of you figured out the “V” in the puzzle was virtualization, because virtualization’s power to optimize is well known. If you are curious how that specifically impacts HANA, here’s a quick read on the basics of vSphere 5.5 support specific to HANA. I am comfortable saying that, today, virtualizing non-production HANA is common practice. The savings created by standing up and shutting down virtualized HANA development environments, and the improved model for HA and DR, alone justify including HANA in your non-production environments. However, some companies have ventured even further, using virtualization in production. Watch Bill Reid talk about his deployment of virtualized HANA in production for EMC IT.
Well, the puzzle is in the process of being solved. Can you guess what “P” is for?
PRIVATE CLOUD – It’s a short but steep leap from virtualization to private cloud. Private cloud adds the next level of application/DevOps functionality to the stack, which further abstracts and automates HANA away from the physical data center and into the cloud. Private cloud does this while providing the most cloud protection via hard-walled environments. There are many ways to deploy HANA on private cloud, including the market-leading solution from Virtustream called xStream Cloud Management software. This solution granularizes the environment into small compute chunks and optimizes the layout to minimize the HANA workload’s footprint. xStream then routinely monitors usage of each unit of compute. The system will further automate the starting and stopping of SAP environments, minimizing the amount of human interaction needed for HANA landscape operations. This is useful for customers who deploy “on-premise” and “off-premise”.
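To make the layout-optimization idea concrete, here is a minimal, generic sketch of packing workloads into fixed-size compute units with a first-fit-decreasing heuristic. The unit size, workload figures, and algorithm are illustrative assumptions of mine, not Virtustream’s actual implementation.

```python
# Generic illustration of minimizing a workload footprint by packing
# workloads into fixed-size compute units (first-fit decreasing).
# NOT xStream's actual algorithm -- just the general kind of layout
# optimization described above, with made-up numbers.

def pack(workload_gb, unit_gb):
    """Place each workload into the first unit with room; open new units as needed."""
    units = []  # each unit tracks remaining capacity and assigned workloads
    for w in sorted(workload_gb, reverse=True):  # biggest first
        for u in units:
            if u["free"] >= w:
                u["free"] -= w
                u["loads"].append(w)
                break
        else:
            units.append({"free": unit_gb - w, "loads": [w]})
    return units

# Example: HANA-ish memory demands (GB) packed into 512 GB compute units
units = pack([300, 200, 180, 120, 90, 60], unit_gb=512)
print(len(units), "units used")
```

With these sample demands, six workloads fit into two 512 GB units instead of one unit apiece, which is the footprint reduction the paragraph above is getting at.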
ON/OFF PREMISE – Let’s continue the conversation on xStream by applying its optimization to an off-premise environment. If you have contracted Virtustream for managed services or are using xStream software for hosted private cloud, then the product’s ability to turn small compute units called “MicroVMs” off and on translates into significant savings. By monitoring whether a MicroVM is on or off every five minutes, Virtustream minimizes its charges to actual consumption, only charging for compute units that are “on”. Add in the automated starting and stopping of SAP workloads, and a hosted private cloud can translate to 20, 30, 50% or more in savings over your existing deployment.
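To show how the consumption-based math works out, here is a minimal sketch in Python. Only the five-minute sampling interval comes from the description above; the hourly rate and the usage pattern are hypothetical.

```python
# Hedged sketch of consumption-based billing: charge only for the
# five-minute intervals in which a compute unit was "on".
# The rate and usage pattern below are hypothetical, not Virtustream pricing.

SAMPLE_MINUTES = 5      # usage is checked every 5 minutes
RATE_PER_HOUR = 0.12    # hypothetical $/hour for one "on" MicroVM

def monthly_charge(on_samples):
    """Bill for 'on' samples only; on_samples is a list of 1 (on) / 0 (off)."""
    on_hours = sum(on_samples) * SAMPLE_MINUTES / 60
    return on_hours * RATE_PER_HOUR

# A dev MicroVM running 10 hours per workday vs. one provisioned 24x7.
workday = [1] * 120 + [0] * 168        # 288 five-minute samples per day
month = workday * 22 + [0] * 288 * 8   # 22 workdays, 8 idle days
always_on = [1] * 288 * 30             # provisioned around the clock

savings = 1 - monthly_charge(month) / monthly_charge(always_on)
print(f"{savings:.0%} saved vs. always-on")
```

Under these assumed hours, paying only for consumption cuts the bill by roughly two thirds; actual savings obviously depend on how idle your landscape really is.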
SAP sees private cloud as a key catalyst to the success of HANA. SAP created a specification for private cloud called HANA Enterprise Cloud (HEC), which it provides through a short list of certified providers (including EMC/Virtustream).
Can you guess the “H” yet?
HYBRID CLOUD – Now we’re getting serious. Hybrid cloud is the next frontier of HANA and SAP computing. Only the most advanced SAP companies have begun to venture into the future of a hybrid cloud model. There is some ambiguity in the market as to what defines a hybrid cloud. Is a customer who has SuccessFactors SaaS and a hosted private cloud for HANA a hybrid cloud? Well, yes, probably; and by this definition hybrid is somewhat mainstreamed. However, when I mention an elite group of customers heading to the future, I’m talking about more advanced functionality: the ability to create elasticity by bursting workloads from on premise to off, or from one cloud location to another. This promises a huge step in further optimization, but there are natural roadblocks hindering progress. “How big is your data?” and “HANA is an in-memory platform” are two great examples. So today you cannot slice off an intra-workload within HANA and seamlessly float it to the cloud. However, think of the needs for elasticity in development, system migration, HA, or DR. Hybrid functionality can be really impactful to the operations of global businesses.
Let me tell you about one personal experience. Again I am going to use xStream Cloud Management software as an example. I recently worked with resources from Virtustream, EMC, and VCE to test a bundled solution putting xStream on a Vblock. The objective was to allow customers to run the cloud-optimizing software within their data centers and operationally communicate with other xStream-based clouds. We put this solution through its paces. Several scenarios, like “cloud site failure” and “system migration between sites”, were proved out. In our first few phases of testing we had amazing results. Check out this solution brief for more information.
PUBLIC CLOUD – The final “P” is for public cloud. For mission-critical systems, public cloud is less impactful than its sister cloud derivations, yet it can’t be overlooked when SAP customers are looking at overall optimization. Public cloud can provide a variety of offerings: SaaS applications like SuccessFactors, small online HANA development environments, offloading a company’s traditional landscape, or a tool for addressing big data requirements. Here’s a story about BlueFin’s leverage of public clouds for their SAP landscape. As companies plan their replatforming efforts they should consider public cloud as a tool to round out their overall strategy.
Well… We’ve solved the “HANA Puzzle”.
I called it the “HANA Puzzle” because for many companies it’s not a question of why HANA; they know that HANA is the future. Yet the “how” and “what” can be confusing because of the amount of evolution we’ve experienced in the last few years. I hope you see an “answer” in my solution to this puzzle. Everyone has to define their own journey, but there is tangible precedent in the market on which decisions will maximize both your operational flexibility and TCO.
For current and future EMC customers, I want to point out that the EMC Federation (including EMC, Virtustream, and VMware) provides the market with the hardware, software, and services to address each and every iteration and derivative of HANA you may choose, across the entire “puzzle”.
I hope this helps you solve your own path for HANA. Please feel free to share your story or ask for details on any of this as needed.
(As for the hot and spicy food… a few of my favorites were Rendang, Laksa and this spicy bamboo salad in Bangkok… Man, it doesn’t taste the same in the States… Loved it.)