Can I take Mission Critical Apps to the Cloud? – THE GOLDEN 7 CONSIDERATIONS


I have been in Minneapolis since Monday to participate in #SAPWeek. #SAPWeek is a unique experience where EMC brings its customers, partners, and experts from all over the globe into a concentrated, deep-dive whiteboard jam session for a few days. I was at the first one in Santa Clara in 2007 and have attended many each year since. Few things are as consistently rewarding as exploring the future of enterprise landscapes that run SAP, and that is #SAPWeek.

Right now, customers are re-platforming in droves. Here in the Midwest, the topics of interest ranged from “Is it time to go to x86/virtualization?” to “What is HANA TDI (and should I use it)?” to “Is the cloud real for mission critical?”

The quick answer to whether you should roll out HANA with an appliance or TDI is: TDI. You will save 20-30% or more in OPEX by getting off the appliance model. If someone is advising you otherwise, get new advisers. Check out my blog on “the HANA Puzzle” for more on that story.
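To make that 20-30% figure concrete, here is a minimal back-of-the-envelope sketch. The baseline appliance OPEX number below is a made-up illustration, not a figure from any specific customer; only the 20-30% range comes from the point above.

```python
# Illustrative only: estimate the annual OPEX savings range of moving
# from the appliance model to HANA TDI, assuming the 20-30% reduction
# cited above. The baseline figure is a hypothetical example.

def tdi_savings(appliance_opex: float, low: float = 0.20, high: float = 0.30) -> tuple[float, float]:
    """Return the (low, high) estimated annual savings from switching to TDI."""
    return appliance_opex * low, appliance_opex * high

if __name__ == "__main__":
    baseline = 1_000_000  # hypothetical annual appliance OPEX, in dollars
    lo, hi = tdi_savings(baseline)
    print(f"Estimated annual savings: ${lo:,.0f} - ${hi:,.0f}")
```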

As for the cloud, many customers were…well…shocked here in the heartland that “cloud” is a feasible option for mission critical apps like SAP HANA. I am here to tell you it is. There are many VERY LARGE companies aggressively adopting the cloud for traditional SAP and HANA. Since our alignment with Virtustream in 2014, our field teams have been very active responding to this market migration. (If you are not familiar with Virtustream, here’s a good level-set video for you.)

During the discussions this week, I referred to a slide I built with Christoph Streubert to help our customers navigate the question of whether they can get to the cloud. My laptop was not booting, and I promised to post the slide via a blog. (Commitment DONE.)

Golden 7 for Cloud

I think the average IT org can take this list and build out a tailored profile/gap analysis to begin answering the big questions of the cloud (a small sketch of that profile follows the list):

  • What – What workloads will I move to the cloud (or what environments in my landscape)?
  • When – When does it make sense based on my cost and risk profile?
  • Why Not – What about my environment hinders me from sending workloads to the cloud? Make sure you socialize this item. I am finding these “sacred relics” of the past are actually breaking down as you cross the lines of business. Cloud is compelling.
  • Who – Not all clouds are the same. Make sure your cloud partners are offering:
    • Your performance requirements/guarantees
    • Significant reductions in your long-term operational costs
    • Risk monitoring and management
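
As a concrete starting point, here is a minimal sketch of how an IT org might capture that profile/gap analysis in code. The categories mirror the list above; the Workload fields, the example entries, and the readiness thresholds are my own illustrative assumptions, not a prescribed template.

```python
# A minimal, illustrative workload profile for a cloud gap analysis.
# The fields, example data, and scoring rules are assumptions for discussion.

from dataclasses import dataclass, field

@dataclass
class Workload:
    name: str                     # What: the workload or environment (e.g., "ERP QA")
    target_quarter: str           # When: when the cost/risk profile makes sense
    blockers: list[str] = field(default_factory=list)  # Why Not: "sacred relics" to socialize
    provider_meets_sla: bool = False   # Who: performance requirements/guarantees
    opex_reduction_pct: float = 0.0    # Who: long-term operational cost reduction
    risk_managed: bool = False         # Who: risk monitoring and management

    def cloud_ready(self) -> bool:
        """Ready only if there are no open blockers and the candidate provider
        covers SLA, cost, and risk. The 20% threshold is arbitrary."""
        return (not self.blockers and self.provider_meets_sla
                and self.opex_reduction_pct >= 20 and self.risk_managed)

# Hypothetical landscape entries for illustration.
landscape = [
    Workload("SAP ECC sandbox", "Q3", [], True, 30.0, True),
    Workload("HANA production", "Q1 next year", ["data residency review"], True, 25.0, True),
]

for w in landscape:
    status = "ready" if w.cloud_ready() else "gap analysis needed"
    print(f"{w.name}: {status} (blockers: {w.blockers or 'none'})")
```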

As you dive into the “Golden 7” considerations, feel free to reach out for an interactive discussion on how to fill out your version of this story.

What will Dislodge the Pebble – Virtualizing Large DBs


2011 is the year for the question: when will large databases “virtualize” en masse? It’s somewhat a technology question, and it’s definitely a bit of a sizing/scale/performance question, but I want to make the case here that it’s also a people and risk question that must be answered.

There is much worthy discussion about whether the technology is ready and for whom. I have the opportunity to work with some of the largest companies in the world, and they each have unique requirements that are individually thrilling to help address. With that said, there are some great technologies available to attack “the middle” of the normal curve. The vSphere 5 release increased capacity limits for virtual core counts and the like, removing those fundamental limitations. I work with customers around products like Vblock from VCE, which adds significant horsepower and simplicity to the equation, working as an accelerant for change. Finally, I know of customers who have virtualized significant DBs, but these areas are not my point of focus for this blog. Instead I’d like to look at the human factors that will determine the pace of this transformation.

In a way, people are as much the database as the bits and the b-trees. People create the data, people write the software (well, we used to…), people govern the use of the data, people maintain the data and, ultimately, people spend the money the data is collected for in the first place. People are a huge part of the equation for virtualizing databases. Let me take a bit of Hollywood script-writing license and generalize this down to two people: the CIO and the DBA. It is unfair to paint with such a broad brush, but they will be the quintessential players in my story.

The DBA is a well-honed machine. The masters of this craft have been at it for 20 years, cutting their teeth on mainframe-to-open-systems migrations; they rode the rise of Oracle. Some started, left, and came back to DB2. Most have suffered countless big, audacious problems and have earned their place in the database hall of fame. Companies pay the big bucks for these superstars for one reason: both their technical experience and their risk-mitigation experience have trained them to KEEP THE DB RUNNING, and they do. Virtualization, to a DBA, is a scary thing. It goes against the grain and, hey, when you’re in your 40s, who wants to be reviewing logs at 3:00am on a Sunday morning… They have an aversion to such a big change, which will require “new experience” to keep the system stable. Additionally, many DBAs still suffer from performance and growth challenges that they feel are their primary job to address. Our companies have employed DBAs to bring these exact skills to the table, and thus many DBAs range from “cautious” to “concerned” about virtualization.

Now for the CIO. The CIO, though less connected to the details, has similar concerns about risk. Their success is tied to a highly performant and stable data center. With that said, they face an equally heavy burden to reduce costs each year as data grows, like a teenager, year in and year out. They have seen the virtues of virtualization and believe it brings better utilization, cost management, IT agility, and DR capability. CIOs are listening to both sides and reasoning through the answers to what, how, and when; the why is already covered.

Ok, so what you really want to know is: do I see a trend? I have statistical training from my graduate work, and I can safely say my sample size is ambiguous and my biases are undefined, but I will not let that stop me from sharing my reasonably crafted opinion. Yes, there are some trends. The trend in 2010 through the first half of 2011 was a peppering of important deployments across the industry that virtualized DBs. These provided the proverbial proof that the sky isn’t falling, and set the stage for the next level of demonstrable growth. The trend in the second half of 2011 will be lots of experts busy with the tasks of design, creation, and realization. The whiteboard work has significantly increased. Blueprinting, POCs, and other types of activity have notably increased. Messaging at Executive Briefings has increased (CIOs are listening). So when does the pebble dislodge and the deluge of large virtualized DBs flood the install base? If you carve off the most outrageously large and unique DBs from the conversation, I think you will see steady activity in the back half of 2011, and in 2012 the pebble will tumble.