Storage Informer


Data Domain, NetApp And The IT Industry

by on Jul.21, 2009, under Storage


As I think about this latest acquisition, there are three major themes worth exploring.

The first theme has been covered widely already — the impact of data deduplication, why it’s hot, the value of differing approaches, why it needs to go everywhere in the stack, etc. 

No need to cover that one here.

The second theme is the impact to EMC, Data Domain and our mutual customers. 

I attempted to sketch out how Data Domain’s technology and offering could potentially create a surprising amount of scale and synergy when juxtaposed with EMC. 

There were some colorful comments on that one.

The third theme revolves around how this acquisition is emblematic of broader themes in the industry, and how a certain class of IT vendors will find themselves with hard choices to make going forward.  Although I’ll be using NetApp as an example, the discussion actually covers a fairly broad spectrum of well-known IT players.

And, yes, I’m wearing an asbestos suit …

Big Things Afoot In The IT Biz

No surprise, we’re in a period of rapid consolidation in the IT industry.  Being a large, successful IT vendor is now a game of scale and synergy — scale to reduce the costs of development, distribution and support, and synergies across the portfolio to increase the value of the overall offering.

For example, if one looks at what Oracle is doing, it’s pretty clear they’re thinking this way.  EMC’s transformation from a single product vendor (Symmetrix) to the broad portfolio of today reflects that thinking as well.  Cisco might fall into this camp.  And there are other examples, if you think about it.

If you get involved in corporate strategy, you have this “aha” moment when you fully grasp the amazing potential of scale and synergy.  However, realizing that potential is another thing altogether, as we’ll see in a moment.

Get Big Or Get Bought

Now, let’s look at the chessboard from another perspective.  Imagine you’re a mid-sized IT vendor, and you’ve become really successful with “your thing” — maybe it’s a particular technology, or a business model, or something else relatively innovative that’s gotten you to a certain size.

At some point, other people start to figure out what you’ve done, and start to offer alternatives that are just as good as — or sometimes better than — your unique gig.

You might think you can compete through size and strength, until these newer competitors get bought up by the bigger companies, and then the tables turn on you.

For example, Dell had a killer business model — until other vendors copied it.  In the late ’90s, EMC had a lock on high-end enterprise storage, until there were “good enough” alternatives from other vendors.  NetApp has a nice file system with some useful tricks, but has never been able to move much beyond this.

Lots of other examples in the industry when you look at it this way.

The key point here is simple: the things that fuel the initial phases of growth of any successful IT vendor have only a limited life span.  The rocket fuel doesn’t last forever.

So, What’s An IT Vendor To Do?

It’s easy — buy or get bought.  Either play the “scale and synergy” game, or sell yourself to someone who wants to play that game.

To play the “scale and synergy” M&A game, you’ll need three things:

(1) money, in the form of cash or equity
(2) a reasonable pool of attractive candidates to buy
(3) the ability to extract the “scale and synergy”

I am no financial engineer, so I don’t have much to say about capital structures, debt-to-equity ratios, dilution and so on — I can follow those conversations, but that’s about it.  To simplify the discussion, let’s assume that our moderately successful IT vendor has a pile of money to work with.

The problem specific to this class of mid-sized vendors right now is that the pool of acquisition targets is pretty shallow.  The larger and more interesting targets that have shipping products, happy customers, demonstrated growth potential, etc. are drawing the attention of the bigger players.

These bigger players not only have more money, but usually have demonstrated an ability to extract more value from an acquisition.  That tends to leave either very small companies that haven’t demonstrated success, or a few picked-over, shop-worn names that everyone else has already looked at and decided to pass on.

That’s the M&A target pool that NetApp and others in their category are facing right now.  Not a lot of attractive choices, and — if there is one — it will likely be acquired by a bigger player who’s more interested and can make a better offer.

Making An Acquisition Work

The only proven way to make M&A work is to invest in building a machine and doing it over and over and over again.  That’s the formula at Oracle, Cisco, EMC, Microsoft and the other big players.  We’re not perfect, and we all make mistakes, but each company has a core discipline and track record in making acquisitions work.

Most people aren’t aware that EMC was doing acquisitions way back in the 1990s.  I would offer that we weren’t very good at it back then, just like any mid-sized IT vendor in our category.  It wasn’t until we set off in a new strategic direction (thanks to Joe Tucci) that we invested enough effort to get really good at the whole thing.

By comparison, NetApp hasn’t been able to demonstrate any real success in making their acquisitions work.  The Spinnaker acquisition could hardly be called a success (it almost tore the company apart), Decru disappeared into obscurity, and Topio was withdrawn from the market.  One could argue that Onaro is enjoying a modicum of success, but does it really matter?

Hard Choices Ahead

It was interesting to hear Dan Warmenhoven publicly tick through the list of potential suitors for NetApp, and take each and every one of them off the table for one reason or another.  While I could debate his logic, it’s very clear that he and the rest of his management team are thinking long and hard about “Option A” — be acquired.

By comparison, “Option B” — buy a lot of stuff — isn’t working out so well.  There are bigger and more skilled players competing for the same acquisitions, and they can justify paying more simply because they have a track record of extracting value through scale and synergy.

Which leaves us with “Option C” — carrying on as before.  Continue to focus on a single product and technology (e.g. WAFL), continue to enhance it with new features such as dedupe and various forms of integration, incrementally tune the business model in terms of margin and channel mix, keep a brave face in public, and so on.  Workable, perhaps, but hardly exciting in the big scheme of things.

Unfortunately, much as NetApp got their start by offering “good enough”, there’s a host of players waiting in the wings to do the same thing to them: open-source file systems such as ZFS; small, aggressive players such as Isilon, Compellent and 3PAR; and even encroachment from the consumer/prosumer/SMB marketplace.

Not to mention big players like EMC that have enormous R&D budgets and are prepared to use them as a weapon.

Add to that the structural transition going on in the market from physical to virtual, and from enterprise IT to service providers (private cloud thinking in a nutshell), and you’ve got to ask yourself some hard strategic questions.

No Obvious Answer

Now, this discussion is nothing against their technology (every technology has strengths and weaknesses), nor a discrediting of their rabid fans (who I will undoubtedly hear from in their typical colorful fashion), nor even a critique of their conduct (I still have big issues with that one).  But in terms of The Big Game, it’s not clear what their next move will be.

And it’s not just NetApp — I could construct a rather long list of other IT technology companies that are roughly in the same position, facing the same set of uncomfortable choices.

Any thoughts?


IBM Expands Enterprise Storage Offerings to Help Businesses Manage Information

by on Jul.13, 2009, under Storage

IBM today announced enhancements to its information infrastructure portfolio of high-end enterprise storage products designed to help businesses manage the explosive growth in data and information.


You’ve already lost

by on May.29, 2009, under Storage

To the people who think FLASH as disk is a bad thing, you’ve already lost.

FLASH is already being treated as disk in everything from high end enterprise storage arrays and large servers to desktops, laptops, netbooks and consumer digital music players.

This battle of ideas is over.

It’s like the choice between a new way of thinking about 64-bit processing and simply extending the existing 32-bit paradigm: the market moved to the extensions of the existing 32-bit paradigm.

Instead of moving from the disk paradigm to a new one, the dominant disk paradigm, and all the processes which come with it, has simply been extended to embrace FLASH technology.

In the same way, the paradigm of RAM and its use on the motherboard will also be extended to embrace cheap and plentiful FLASH.

Flash based Tier 0 is fast disk in storage arrays because there’s an entire paradigm which expects a disk to appear to be the storage device.

Paradigms are powerful things.  So powerful that an idea formulated in the ’70s, that of putting files in folders or directories when using an operating system, is still with us today.  Even though I’d be hard pushed to find anyone in their teens who has ever worked in an office or used a filing cabinet, they’re still organising their music collections, movies and digital photos in folders.

Did you ever stop to wonder why?

Look at early automobiles.  Brass Era cars were called horseless carriages because they looked like carriages, but without horses.  The only technology they shared with carriages might have been seats, axles and wheels, but the fact that they were delivered as an extension of the dominant personal transportation paradigm of the time is why they succeeded.

It doesn’t matter how ludicrous you think something is, if it’s an extension of an existing paradigm it’ll stand a far greater chance of succeeding than if it isn’t.

Extending another paradigm, FLASH-based cache will be treated as slow RAM on computer motherboards.  I was going to say all motherboards, but I think it’ll be a while yet before we see it on every motherboard.  Anyone thinking of putting FLASH on a card and calling it gravy had better have a word with the motherboard manufacturers: you’re not designing anything they can’t, and you’re not currently shipping anything they won’t in greater volumes and for a fraction of the price.

FLASH succeeds in the market because it extends existing paradigms which have already been accepted.

It doesn’t attempt to supplant them.

