What Happened in Vegas – SAP Insider Follow Up


Well, I’m back home from a trip to SAP Insider’s conference and I promised to answer a series of questions based on my conversations from the show. If you will allow me, I am going to change how I deliver on my earlier commitments. I found my questions morphed drastically as I engaged the attendees. Why? Well, frankly I stink at being a reporter, and in fairness, the show’s attendance changed enough this year that I found myself going in a different direction.

Go back four or five years and there were two shows, called “Admin” and “BI”. They were primarily attended by BASIS teams dealing with infrastructure issues. With SAP’s M&A strategy and messaging evolution, the event has changed to reflect the new SAP. The show has since branched out into “Cloud & Virtualization”, “Mobile”, and “Business Objects Bootcamp”. Let’s admit it, since 2009 the economy and general show attendance have fluctuated. I am sure that has something to do with the shows coming together in a co-located venue, allowing attendees to stay in a single track or blend their experience across multiple domains. This is not intended to be an Oracle Open World or SAPPHIRE level event; it’s not that big, and in my experience the second- and third-tier shows tend to be better attended on the east coast. So net-net, even with the bundling of multiple shows, the attendance was a little light this year. Additionally, a show that used to be a haven for BASIS teams seems to have changed in personnel and is now a different show entirely. I think this is important because it is a reflection of the wider changes in the SAP industry. I have a couple of insights about this that I will explain through a discussion of those who attended and those who didn’t.

There were 3 types of people present at the show and 2 types surprisingly absent:

1) PRESENT: Those who want to catch up on virtualization/Cloud.

If I generalize, the Cloud attendees were a more “matter of fact” group. They were either going to the cloud, had an upcoming RFP, or presented themselves as feeling they needed to “catch up” with everyone and get virtualized or to the cloud. This is an interesting dynamic given that at this same time last year Cloud was the fresh buzz and only those with iron guts were talking about venturing their mission-critical workloads to the cloud. This shows the pace of innovation in the current market, and ultimately that the cloud benefits have “held” while the scary risks have subsided. This adoption is also happening in the face of all the HANA interest, which today is not (practically) a cloud-ready product. So in a careless and casual way I’ll make the broad generalization that many if not most are taking a “get to the [private, hybrid, public] cloud now and pilot HANA in the lab” approach, with the assumption that HANA will somehow fit into their overall architecture as they mature their programs.

 

2) PRESENT: Those who want to deploy HANA or are active BOBJ users:

I arrive at my statistical certainties by the scientific process of counting questions asked and how many people attend various session topics. Given my sample size, time-of-day differences, and the deviation in how much coffee I have consumed at a given time, my prognostications are suspect at best. However, let’s not let truth stand in the way of knowledge. I would have to give the hands-down vote for “topic of highest interest” to HANA. HANA sessions were packed; most attendees seemed to be staging HANA pilots or were BOBJ users investigating how HANA would impact BOBJ over the next few years. The keynote, given by Steve Lucas, Senior Vice President and General Manager of Business Analytics at SAP, was all Big Data. He talked about open source, unstructured data, and Hadoop, and brought it all back to SAP’s data analytics stack, which is based primarily on HANA and remnants of the Sybase portfolio. SAP is Big Data obsessed, and so are its customers.

 

3) PRESENT: Those who are looking to deploy mobile apps that integrate with SAP

I must admit I am most impressed with SAP’s dive into the mobile market. Within just several months they have taken this global ERP company, added Big Data, and integrated a mobile BYOD story as if it had always been there. It’s very tight messaging, even if the technologies are still coming together in a few spots. I really enjoyed the well-attended sessions on the details of SUP. What’s “SUP”… “dunno, what’s up with you?” SUP is the Sybase Unwired Platform. This is the API and code set, added to SAP’s arsenal through the Sybase acquisition, that allows users to write platform-specific or platform-agnostic applications. As they pulled back the covers at the show, it’s not perfect yet, but it does provide a solution that much of the install base can use to close the “last mile” (aka the mobile user).
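For flavor only, here is a minimal sketch of the kind of thin, platform-agnostic client that “last mile” story implies: a small script pulling SAP-backed data through a mobile gateway. To be clear, this is not SUP code; the endpoint URL, token, and JSON field names are hypothetical stand-ins for whatever the middleware actually exposes, not the platform’s real API.

# Hypothetical sketch of a platform-agnostic mobile client pulling SAP-backed
# data through a middleware gateway. The URL, auth header, and field names are
# invented for illustration; they are NOT the actual Sybase Unwired Platform API.
import json
import urllib.request

GATEWAY_URL = "https://mobile-gateway.example.com/api/workorders"  # placeholder
API_TOKEN = "demo-token"                                           # placeholder

def fetch_open_work_orders():
    """Ask the (hypothetical) mobile gateway for open work orders."""
    request = urllib.request.Request(
        GATEWAY_URL + "?status=open",
        headers={
            "Authorization": "Bearer " + API_TOKEN,
            "Accept": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    for order in fetch_open_work_orders():
        # Field names are assumptions about what a backend might return.
        print(order.get("id"), order.get("description"), order.get("due_date"))

The point is simply that the heavy lifting stays behind the gateway; the device-side code can stay small, whichever platform it runs on.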

 

4) ABSENT: Those who’ve already begun executing on the Cloud

The surprising VOID of the show was the contingent of folks who are actively deployed or who are deploying a cloud solution. I personally/casually know many who just weren’t there. From those I talked to before the show, I had some perspective: they said they were deep in deployment and weren’t sending anyone. What I didn’t realize was how pervasive a phenomenon this was. I’m not saying there was no one chasing the cloud, but attendance was notably down, and my minor investigations into this presented a plain fact: customers “get it” and are “doing it”. Interestingly enough, this is exactly where EMC IT is in their multi-phased process to reach Cloud-enabled ITaaS. They too stayed home this year. Though I missed the constant discussion of Cloud, I realize a more important event has happened. In past blogs I have talked about Gladwell’s Tipping Point for the cloud. I think we’re there. Cloud for mission-critical workloads has moved into the mainstream.

 

5) ABSENT: Those who are making investments in Big Data other than HANA:

Another interesting attribute of this event was that the broad, heterogeneous, all-encompassing world of Big Data was seamlessly shrunk to nothing but BOBJ and HANA (with a token Hadoop thrown in). Ok, you may say that SAP has aggressively driven the competition away from the venue. Maybe, but that isn’t the vibe I get. Even those who come up to you in conversation do not seem to have a vocabulary or interest that extends much further than the SAP landscape. I attribute this to a couple of factors. One is that SAP is a big animal. It’s like a youthful trip to the amusement park: you couldn’t see the boundaries, it felt endless. I think many who build their careers on SAP live within a large ecosystem that can consume their focus without any need to test the boundaries.

 

Secondly, I think it is a function of how “new” data analytics/Big Data is to SAP. BOBJ is business information integration technology, and before Sybase there were few who considered SAP in the realm of database management. HANA is a fairly new concept, only appearing a few years ago. So the SAP user community is composed of people who haven’t traditionally been players in the precursors to Big Data, and thus possibly not broadly trained on the overall market. This is not to imply an inability to execute, only that this is really new and the SAP ecosystem is evolving. It will look notably different in a couple of years.

 

So those were my major insights from SAP Insider sessions in Vegas. I hope you pulled at least one nugget from the pan. Until next time, stay informed. Adios.


Already Tired of the Cloud?


Mentally I have a multi-tenancy problem. On one shoulder I have a hippie wannabe creative type that bangs on musical instruments, writes, paints, and focuses on the creative process. On the other, I have a little suit-wearing business analyst. This duality is sometimes maddening. The artist in me wants to poke a stick in my temple when I hear “the cloud” on yet another TV commercial, while the other part of me is excited to be part of the next phase of this technology cycle. Now that I have everyone questioning my mental state… let me provide a diversion and state that duality, diversity, and alternating needs are part of what is driving the Cloud movement. So let’s talk about what’s really happening and why everyone is forced to watch those crazy commercials.

“Value of the Cloud Today”
As I describe the value of “Cloud” to companies today, I don’t want to claim authorship of my categorizations. Many have provided inspiration; in an effort to recognize them all, I’ll point out Dave Vellante’s iterative work on “Stack Wars” as a good example. He makes many good points about the race to technical dominance among the big vendors today. With that said, I’ll continue on like it’s all my idea. For discussion purposes, I’ll over-simplify and say we have four operative categories of computing today:

1) Traditional Tech: For this discussion, I am referring to everything from 2009 back: client/server, mainframe, etc. In the years since we first used Hollerith’s tabulating desk to tally census data around the turn of the 20th century, we have worked to optimize individually owned computing environments, and we have spent trillions of dollars on our data centers in primarily a self-ownership model. For more info on cost, this blog has some of the details on IT spending (Information Technology Spending By Country, etherfire, brighthub.com, published Jul 31, 2010). Because of its dominance, we will not see traditional tech become extinct any time soon, but I do see notable changes in the following three categories.

2) Vertical Integration: Henry Ford was a master of vertical integration. He owned the rubber tree farms to make his tires, machined his own parts, etc. This is a very traditional model for the beginning of a technology curve (e.g., automobiles in the early 1900s). Interestingly enough, some companies are deploying this strategy today, in the later, more mature phase we are currently in. A good example is Oracle’s exa-N product lines. It’s an “all in one box”, single-vendor strategy. The focus is specialization, aimed at increasing performance for data analytics and potentially reducing costs through server reduction. There are other examples of “IT appliances” in the market; the company where I work has similar purpose-built appliances. These purpose-built “plug in and go” appliances will continue to gain ground because they are easy to consume, but they need to balance their specialization with their cloud integration capabilities. Why should you care if they connect to the cloud? Because history shows specialized/isolated assets are more expensive than communal assets. Appliances will likely become “cloud compliant”, or they will find themselves a tangent away from the cloud strategy and obscured from the primary plans of the business.

3) Virtual Integration: This technique leverages the features of virtualization to blur the lines of proprietary systems into an abstracted layer of logical technology components. With a virtually integrated environment come optimizations in utilization, improved flexibility, notable savings from consolidation, and other benefits. Not every virtualized environment is a “cloud”, but virtualization provides a medium to begin developing systems with cloud attributes. This still requires the individual companies to retain ownership or to bundle managed services as part of their efforts.

4) Hosted Service Levels and/or Functionality: Hosted environments provide a utility-type computing environment where the user can consume IT units without a great deal of day-to-day knowledge about what is happening “behind the curtain”. Hosted environments are not the sole domain of the Cloud, but they provide a scaled platform and a financial, transactional model to monetize or apply the cloud to IT. In the last decade hosted systems have grown tremendously, albeit not without many customers returning to in-sourced or on-premise systems. These reversals are a mix of growing pains in scaled execution and changing customer requirements. Though I’ll point out that change is a feature of clouds: clouds should allow IT units to “float” between providers.

Within some of these categories you can begin to see optimizations achieved by abstracting the physical equipment from the logical functionality, and by using services that abstract the consumer from the technical expertise. Both approaches optimize through consolidation. These changes mirror the evolution of electricity production (expertly outlined in “The Big Switch” by Nicholas Carr, a book and topic I plan to cover in a later blog). These building blocks provide the technical stage for a new business model we call the cloud. Today we have two cloud formats: “Public Cloud” and “Private Cloud”. A Public Cloud is where a consumer buys the service and uses it, paying per use or via a monthly or annual fee. Public Clouds generally imply multi-tenancy, meaning my stuff is next to your stuff and there is some basic security to protect us from each other.
A Private Cloud provides cloud-like functionality but is housed in a single instance, either owned by the consumer or specifically hosted for that consumer by a third party. Whether a consumer leverages a cloud publicly or in private, there are some basic benefits they can expect:

Utilization – Because your servers for, say, Exchange 2010 are now virtual, you don’t have to purchase each of them as a physical asset; they co-exist on a large physical server, and this inherently increases utilization (see the sketch after this list for a rough sense of the math).
On-Demand Capabilities

o Performance – Yes, VMs do add a bit of overhead, but once again you’re combining strengths. Imagine 100 cars on the highway where every car going downhill gives its horsepower to the cars going uphill. This increases the performance-per-dollar ratio.

o Capacity – Similarly, capacity needs fluctuate, and virtualization can level-set them against the overall pool of available assets.

Virtual Management – Today’s cloud technologies take the fundamentals of virtualization and add management capabilities that are critical to enabling the switch to the cloud.

o Separation of duties – the ability to delineate between users and functions: strong permission controls, performance controls for the virtual machines, etc.

o Simplification – For example, if a box crashed 100 miles away, you would have to requisition a new server, wait for it to ship, plug it in, and load it with new software. If that box were a VM, you could send the server configuration, like an email, to the remote host and enable it with a couple of button clicks. Cloud simplifies; cloud tools provide templates and wizards that increase the leverage of the administrator.

Owner to Consumer Switch – With all this new flexibility and simplification, companies can evaluate and execute a more fluid ownership model. Do they own this asset or do they ship it to a partner? Do they allow an integrator to host a system while it’s in development and move it back when it’s ready for production? Do they own the primary system but bolt on clouds from third parties? These are options that were non-existent, or at least much more difficult to execute, just a couple of years ago.
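To make the utilization point above a little more concrete, here is a minimal back-of-the-envelope sketch in Python. The server counts, core counts, and utilization figures are made-up assumptions for illustration, not measurements; the point is only how consolidating lightly used dedicated boxes onto shared virtualized hosts raises overall utilization.

import math

# Back-of-the-envelope consolidation math. All figures below are illustrative
# assumptions, not measurements from any real environment.
physical_servers = 10        # dedicated boxes, one workload each
cores_per_server = 8
avg_utilization = 0.12       # each dedicated box sits ~88% idle

# Actual work being done across all the dedicated boxes, in core-equivalents.
useful_cores = physical_servers * cores_per_server * avg_utilization

# Consolidate onto larger shared virtualized hosts, capped well below 100%
# to leave headroom for peaks and hypervisor overhead.
host_cores = 32
target_utilization = 0.60

hosts_needed = math.ceil(useful_cores / (host_cores * target_utilization))

print(f"Useful work: {useful_cores:.1f} core-equivalents")
print(f"{physical_servers} dedicated boxes -> {hosts_needed} shared host(s)")
print(f"Shared-host utilization at that load: "
      f"{useful_cores / (hosts_needed * host_cores):.0%}")

With these invented numbers, ten mostly idle boxes collapse onto a single larger host running around 30% busy with room to spare, which is the whole utilization and performance-per-dollar argument in miniature.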

“What’s next?”
Going forward, the Cloud will provide greater utilization and flexibility in how we consume IT units. Some would say I missed a cloud category when I mentioned only public and private clouds: “Hybrid Clouds”. I left it out because, to me, hybrid is simply how the cloud will be applied. “Hybrid” is a mash-up based on the customer’s financial and risk requirements. For me, the application of cloud will be like a cloud shopping mall where you have a diverse series of storefronts offering a variety of ownership models and service levels. Figuratively, you may choose to buy a pre-made suit for one need, and the thread and cloth for another. I must assume this will be good for the customer and our collective GDP. My business analyst personality likes the options for business going forward, and my hippie side likes the fluffy clouds painted on everyone’s logos these days.

What will Dislodge the Pebble – Virtualizing Large DBs


2011 is the year for the question: when will large databases “virtualize” en masse? It’s somewhat a technology question; it’s definitely a bit of a sizing/scale/performance question; but I want to make the case here that it’s also a people and risk question that must be answered.

There is much worthy discussion about whether the technology is ready and for whom. I have the opportunity to work with some of the largest companies in the world, and each has unique requirements that are individually thrilling to help address. With that said, there are some great technologies available to attack “the middle” of the normal curve. The vSphere 5 release increased capacity limits for virtual core counts and the like, removing those fundamental limitations. I work with customers around products like Vblock from VCE, which adds significant horsepower and simplicity to the equation, working as an accelerant for change. Finally, I know of customers who have virtualized significant DBs. But these areas are not my point of focus for this blog; instead I’d like to look at the human factors that will determine the pace of this transformation.

In a way, people are as much the database as the bits and the b-trees. People create the data, people write the software (well, we used to…), people govern the use of the data, people maintain the data and, ultimately, people spend the money the data is collected for in the first place. People are a huge part of the equation for virtualizing databases. Let me take a bit of Hollywood-script-writing license and generalize this down to two people: the CIO and the DBA. It’s unfair to paint with such a broad brush, but they will be the quintessential players in my story.

The DBA is a well-honed machine. The masters of this craft have been at it for 20 years, cutting their teeth on mainframe-to-open-systems migrations; they rode the rise of Oracle. Some started, left, and came back to DB2. Most have suffered countless big, audacious problems and have earned their place in the database hall of fame. Companies pay the big bucks for these superstars for one reason: both their technical experience and their risk-mitigation experience have trained them to KEEP THE DB RUNNING, and they do. Virtualization to a DBA is a scary thing. It goes against the grain and, hey, when you’re in your 40s who wants to be reviewing logs at 3:00 am on a Sunday morning… They have an aversion to such a big change, which will require “new experience” to keep the system stable. Additionally, many DBAs still face performance and growth challenges that they feel are their primary job to address. Our companies have employed DBAs to bring these exact skills to the table, and thus many DBAs range from “cautious” to “concerned” about virtualization.

Now for the CIO. The CIO, though less connected to the details, has similar concerns about risk. Their success is tied to a highly performant and stable data center. With that said, they face an equally heavy burden to reduce costs each year as data grows, like a teenager, year in and year out. They have seen the virtues of virtualization and believe it brings better utilization, cost management, IT agility, and DR capability. CIOs are listening to both sides and reasoning out the answers to what, how, and when; the why is already covered.

Ok, so what you really want to know is: do I see a trend? I had statistical training in my graduate work and can safely say my sample size is ambiguous and my biases are undefined, but I will not let that stop me from sharing my reasonably crafted opinion. Yes, there are some trends. The trend in 2010 through the first half of 2011 was a peppering of important deployments across the industry that virtualized DBs. These provided proverbial support to show the sky isn’t falling, and staging for the next level of demonstrable growth. The trend in the second half of 2011 will be lots of experts busy with the tasks of design, creation, and realization. The whiteboard work has significantly increased. Blueprinting, POCs, and other such activity have notably increased. Messaging at executive briefings has increased (CIOs are listening). So when does the pebble dislodge and the deluge of large virtualized DBs flood the install base? If you carve off the most outrageously large and unique DBs from the conversation, I think you will see steady activity in the back half of 2011, and in 2012 the pebble will tumble.