The S.O.L. Moment for BI Hobbyists

When the PC industry was young, hobbyists built their computers from kits. The best-known kit was the Altair 8800, but another was actually called SOL (perhaps a nod to the condition in which kit purchasers/PC hobbyists would find themselves…).

Nobody buys and builds computers from a kit anymore. We buy professional products built by people who design, assemble and test thousands of them every day. Yet the “build-it-yourself BI kit” is still the dominant way IT teams buy, assemble and deliver Business Intelligence today. That’s insane!

Since BI projects are built by hobbyists from dozens of building blocks, it’s no surprise that their requirements so closely match “hobbyist” PC requirements:

  • More Important Requirements: Performance, cost, expandability, upgradeability.
  • Less Important Requirements: Reliability, availability, service, ergonomics and usability.

In an ideal world, BI hobbyists would realize that assembling and operating a diverse set of technologies and products, usually sourced from a number of suppliers (or from a single supplier through multiple acquisitions), would be overwhelmingly complex. But the fact that the pieces come from a large company typically gives the buyer the illusion of completeness and unwarranted optimism about the chance of success.

And we don’t live in an ideal world anyway. That’s why BI projects are so often delivered without adequate “reliability, availability, service, ergonomics and usability”. And quite often they’re not delivered at all. Clearly, some BI hobbyists end up in the same position PC kit purchasers used to…

These are the “S.O.L. moments” when we get calls from prospects today (increasingly from Fortune 500 types). As much as I would love to see GoodData as a part of large-scale BI projects from the very beginning, I understand that we need to prove ourselves first. I am more than happy to come to the rescue.

But isn’t it obvious to the industry that business intelligence should no longer be built by hobbyists? BI buyers should focus on business value (metrics, dashboards…) and BI projects should be built by people who design, test and deliver at least hundreds of these every day…

BI at SaaS Speed

Winston Churchill once said that “difficulties mastered are opportunities won”. His quote is very applicable to the effort of building BI in the cloud. GoodData announced earlier today that May 2011 was our biggest month ever, so it is a good time to look at the difficulties and opportunities of Cloud BI in greater detail.

Business Intelligence is a huge opportunity. Even in its current, dysfunctional, on-premise form it is a $10B software industry. And on-premise BI is an extensive and expensive IT initiative that involves building a complete chain of data integration, data warehousing, dashboarding and visualization. On top of the IT effort comes the tricky business part: what to measure, what the right metrics are, how to present them and to whom. And it all has to happen at the speed of business, not at the speed of IT.

This IT/business dichotomy leads to an extremely low success rate for BI projects – as much as $7 billion annually is spent on BI undertakings that are considered failures. That’s right – $7 billion worth of BI software ends up sitting on the shelf every year!

By contrast, the SaaS model works best when the product is well defined, customer adoption is fast, satisfaction/loyalty is high and the cost of servicing the customer is low (for more information on SaaS metrics, please read “Top 10 Laws of Cloud Computing and SaaS” here). This means that traditional, slow-moving, complex and expensive BI will NEVER make it to the cloud. Numerous small and large companies have tried to host their traditional on-premise BI products in the cloud, but SaaS laws are called laws for a reason – these companies have either failed already or eventually will.

So what is GoodData doing differently to master the difficulties of Cloud BI?

1. Product Definition/Customer Adoption – in order to make customer adoption as quick as possible, we are building a set of BI applications. These apps are templates that contain not only connectors to standard data sources (such as Salesforce, Zendesk and Facebook) but also complete dashboards and reports that incorporate best practices in the form of metrics. Our Sales Analytics app helps you measure predicted revenue. Our Helpdesk Analytics app measures your backlog and resolution times. Our Marketing Analytics app teaches you how to calculate campaign ROI. We’re adding these applications on a weekly basis. You can see the full list of our apps here: http://www.gooddata.com/apps

2. Customer Loyalty – We deliver a complete, managed service to our customers. Our developers, ops and support personnel are making sure that every single data load goes as planned, all reports are loaded correctly and that there are no performance issues. We even publish our Operational & Service Performance here: http://www.gooddata.com/trust

3. Cost of Service – We’ve architected a very different platform that allows us to host a large number of customers at a relatively low cost. The platform is so different that we often have a hard time communicating it to the BI analyst community (concepts like REST APIs and stateless services are not part of normal BI nomenclature). And the flexibility built into the platform allows us to move at the pace of business and not the pace of IT: we deliver a new version of GoodData to our customers every two weeks and we make tons of changes to customer projects daily.
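For readers outside the REST world, a minimal sketch may help make “stateless services” concrete. This is purely illustrative (the handler, URI scheme and data are my own made-up examples, not GoodData’s actual API): a stateless handler derives everything it needs from the request itself, so any server behind a load balancer can answer any request, which is a big part of what keeps per-customer hosting costs low.

```python
# Illustrative sketch of a stateless service handler (hypothetical API).
# Every request carries all the context the server needs, so there is no
# server-side session and no affinity between a customer and a machine.

def handle_report_request(request: dict) -> dict:
    """Compute a report response purely from the request itself."""
    project = request["project_id"]
    report = request["report_id"]
    # A real service would read from shared storage here; we fake one row.
    rows = [{"metric": "revenue", "value": 42}]
    return {
        "uri": f"/projects/{project}/reports/{report}",  # hypothetical URI
        "rows": rows,
    }

# Because the handler keeps no state, two independent "servers" (plain
# function calls here) return identical answers for the same request:
req = {"project_id": "p1", "report_id": "r7"}
assert handle_report_request(req) == handle_report_request(req)
```

The design consequence is the one that matters for cost: adding capacity is just adding interchangeable instances, with no session migration or sticky routing.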

Even the fact that we know how many reports we served to our customers in May 2011 (over 1,000,000) sets us apart. While the old BI industry can only guess at the level of adoption and product usage (of shelfware), we actually know. But again, “difficulties mastered are opportunities won”!

With friends like Forrester and Gartner, IBM and SAP don’t need enemies…

The Innovator’s Dilemma by Clayton M. Christensen is my favorite business book – its main idea (disruptive technologies serve new customer groups and “low-end” markets first) was the guiding principle of all my startups. The best part is that even though everybody can read about the power of disruptive technologies, there is no defense against them. Vendors can’t help themselves. They study The Innovator’s Dilemma, pay Christensen to speak to their managers, but their existing customer base and “brand promise” prevent them from releasing products that are limited, incomplete or outright “crappy.” That’s what makes them disruptive. And industry analysts seem to be the only hi-tech constituency that has either never read Christensen, or is still in absolute denial about it. It makes sense: a book claiming that “technology supply may not equal market demand” is heresy for people who spend their lives focused primarily on the technology supply side.

Christensen argues that vendors no longer develop features to satisfy their users, but just to maintain the price points and maintenance charges (can you name a new Excel feature?). But in many cases the vendor decisions are driven more by industry analysts and their longer and longer feature-list questionnaires. The criteria for inclusion into the Gartner Magic Quadrants and Forrester Waves seem to be copied straight from Christensen’s chapter: “Performance oversupply and the evolution of product competition”. Analysts are the best supporters that startups can have: they are being paid by the incumbents to keep them on a path of “performance oversupply”, making them so vulnerable to young vendors “not approved” by the same analysts!

Forrester BI analyst Boris Evelson gives us a great example of this point in his blog about “Bottom Up And Top Down Approaches To Estimating Costs For A Single BI Report”. While Boris is a super-smart BI analyst, he somehow failed to observe that his price point of $2,000 to $20,000 per report opens a huge space for economic disruption of the BI market. Anybody interested in the power of disruptive technology in BI should listen to a recent GoodData webinar with Tina Babbi (VP of Sales and Services Operations at TriNet). Tina described how the economics of Cloud BI enabled her to shift TriNet’s sales organization “from anecdotal to analytical”. This would not be possible in the luxury-good version of BI, where each report costs thousands. Fortunately, Tina is paying less for a year of a “sales pipeline analytics” service delivered by GoodData than the established vendors would charge for a single report.

I hope Boris’ blog post will appear in one of the future editions of The Innovator’s Dilemma as a textbook example of how leading analysts failed to recognize that established products are being pushed aside by newer and cheaper products that, over time, get better and become a serious threat. And with friends like Forrester and Gartner, the incumbents don’t really need young and nimble enemies…

COSS BI: Open Source, Open Core or Openly Naked?

Peter Yared recently wrote a BusinessWeek guest blog post called “Failure of Commercial Open Source Software.” Not surprisingly, his post caused a lot of angry replies from people who work for COSS companies. “The emperor is not naked,” they argued.

I believe that the COSS emperor is openly naked. And the discussion shouldn’t be about whether COSS is a complete or a partial failure just because there are a few successful exits that Peter neglected to mention. At the end of the day, Peter’s comment that “selling software is miserable” is true. Every sales rep involved in selling COSS would agree (I’m interviewing many of them now). Selling COSS is no easier than selling any other form of software.

Any company using the word “open” should be able to explain the true cost of delivery (this is one of Peter’s points). And there is an obvious litmus test of the openness of COSS companies, one that I would call “open pricing”: COSS companies should openly publish their price lists and clearly mark what’s free and open and what’s paid and closed. Otherwise OSS is just a bait-and-switch into the familiar proprietary-software tactic of customer lock-in. This is what OSS was supposed to get rid of in the first place.

Let’s take a look at some of the COSS companies in the Business Intelligence space. The bait and switch is in full swing here:

Jaspersoft: https://www.jaspersoft.com/jaspersoft-business-intelligence-suite-0 Let us prepare a custom quote for you.

Pentaho: http://www.pentaho.com/products/buy_bi_suite.php Request a Quote

Talend: http://www.talend.com/store/talend-store-inquiries.php A Talend account manager will be in touch shortly to provide information and/or a detailed quote.

We announced GoodData pricing earlier today, and I would actually argue that we are a more open company than any of the companies listed above. Our customers know exactly what service they get and how much it will cost.

We stick to our company motto: GoodData = BI – BS. And there is a lot of BS going on in the COSS space. It may actually be its biggest failure.

 

Full disclosure: I have been a big believer in open source since we open-sourced NetBeans more than 10 years ago.

TDWI: Independence vs. Cash

A long time ago I came to the conclusion that “independent industry analyst” was an oxymoron. But the willingness to sell independence for cash reached a new low with TDWI’s New SaaS Business Intelligence Portal. Please visit the link and see if there is any trace of independence left…

Please Don’t Let the Cloud Ruin SaaS

Back in the good old days of enterprise software, we did not need to worry about our customers. We delivered bits on DVDs – it was up to the customers to struggle with installation, integration, management, customization and other aspects of software operations. We collected all the cash upfront and took another 25% in annual maintenance. Throwing software over the wall … that’s how we did it. Sometimes almost literally…

I now live in the SaaS world. My customers only pay us if we deliver a service level consistent with our SLAs. We are responsible for deployment, security, upgrades and so on. We operate software for our customers and we deliver it as a service.

But there now seems to be a new way to “throw software over the wall” again. Many software companies have repackaged their software as Amazon Machine Images (AMIs) and relabeled them as SaaS or Cloud Computing. It’s so simple, it’s so clever: Dear customer, here is the image of our database, server, analytical engine, ETL tool, integration bus, dashboard, etc. All you need to do is go to AWS, get an account and start those AMIs. Scaling, integration and upgrades are your worry again. Welcome back to the world of enterprise software…

AMI is the new DVD and this approach to cloud computing is the worst thing that could happen to SaaS. And SaaS in my vocabulary is still Software as a Service…

Bad economics are difficult to shake off

Terry Pratchett once wrote that “Gravity is a habit that is hard to shake off”. We could make a similar comment about the financials of SaaS BI companies. As much as startups in this field would like to shake off their bad economics, reality always catches up. We’re seeing one SaaS BI startup after another go out of business. Back in June it was LucidEra, and earlier this week Blink Logic ceased operations. But anybody who even briefly looked at Blink Logic’s finances (it was a public company) shouldn’t be surprised by this event.

Why do so many of the attempts to marry BI and SaaS fail? The problem is that SaaS BI sounds simple … simple enough to take an existing BI asset (integration engine, open source analytical engine, columnar database, dashboarding, even domain expertise & consulting) and just host it! All it takes is VMware or an AWS account, a web server and Flash or JavaScript. Some people call this a paradigm shift; I call it window dressing. LucidEra was essentially a restarted Broadbase, Blink Logic was once called DataJungle, PivotLink recently changed its name from SeaTab, Cloud9 Analytics has a secret history as Certive, and Success Metrics morphed into Birst. I could go on…

Why do SaaS BI companies have bad economics? It’s an attractive market – one of the last few open spaces in software. BI requires dealing with lots of data, lots of compute power and many users. SaaS + BI seems obvious. But truthfully, it’s such a difficult opportunity that it requires a new approach, yet everybody is taking shortcuts. SaaS BI isn’t just hosted BI, just as email is not just better faxing and wikis are not just simplified Microsoft Word. Some time ago I wrote a case study on how my former company, NetBeans, was able to successfully compete against giants like Symantec, Borland and IBM; that case study is very relevant to our SaaS BI discussion.

The SaaS BI paradigm shift needs to be truly transformational in order to be successful – something that will get BI above the 9% adoption flatline it’s been at for years. Not everybody gets this. One of the best analysts in this space, Boris Evelson, wrote a blog post earlier this week in which he focuses on the differentiation of SaaS BI startups. His first question is about VC backing: is the firm backed by a VC with a good track record in the information management space? But LucidEra was very well funded by leading VCs. The correct question Boris should have asked is: are the backers of the company funding innovation? Do they understand that it takes three years to become an overnight success?

At the end of the day, it’s about economics. At Good Data, our economics are simple – cloud computing, multitenancy and adherence to customer development. We’ve spent two years investing in innovation. That is what I tell my investors every day. And that is how we are going to avoid the startup death spiral.

Friends Don’t Let Friends Overpay for BI

Business Intelligence projects are famous for low success rates, high costs and time overruns. The economics of BI are visibly broken, and have been for years. Yet BI remains the #1 technology priority according to Gartner. We could paraphrase Lee Iacocca and say: people want economical Business Intelligence solutions and they will pay ANY price to get them.

Nobody argues with the need for more Business Intelligence; BI is one of the few remaining IT initiatives that can make companies more competitive. But only the largest companies can live with the costs or the high failure rates. BI is a luxury.

I believe that the bad economics of BI are rooted in the IT department/BI vendor duopoly on BI infrastructure. This post focuses on IT’s inability to deliver efficient BI projects; I will write about the BI industry in my next blog.

There are three fundamental reasons why IT departments in their current form fail to deliver economical BI solutions:

1) They don’t understand elastic scale

IT departments are good at scaling: adding more and more hardware and software. But scaling makes sense only for tasks that are highly predictable. Given the ad hoc nature of BI, we not only need to increase compute power when it is needed for complex queries, but we also need to be able to decrease it when it’s not needed, to keep costs down. Elastic is more important than scalable. And this is precisely why internal BI solutions will always be either too expensive or too slow for complex queries…
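The scalable-versus-elastic distinction can be sketched in a few lines. This toy autoscaler is purely illustrative (the sizing rule and all the numbers are my own assumptions, not any vendor’s): capacity follows the ad hoc query load in both directions, so the big compute is paid for only while a complex query burst is actually running.

```python
# Toy elasticity sketch: capacity tracks a spiky BI workload up AND down,
# instead of being provisioned for the peak around the clock.

def desired_workers(queued_queries: int, per_worker: int = 4,
                    min_workers: int = 1, max_workers: int = 100) -> int:
    """Workers needed for the current queue, clamped to a sane range."""
    needed = -(-queued_queries // per_worker)  # ceiling division
    return max(min_workers, min(max_workers, needed))

# A burst of complex queries arrives, then the system goes near-idle.
workload = [2, 40, 120, 8, 0]
capacity = [desired_workers(q) for q in workload]
print(capacity)  # [1, 10, 30, 2, 1]
```

A merely “scalable” shop would hold 30 workers permanently to survive the spike; the elastic version drops back to 1 the moment the queue drains, which is where the cost difference comes from.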

2) They try to control BI with a single version of the truth

While the volatility of the business environment is increasing, IT departments are trying to button up the business knowledge (data, metadata, processes) into a top-down, inflexible and lengthy process that is supposed to produce a single version of the truth. The problem is that the underlying business is changing so rapidly that by the time this is done, the resulting analysis and reports are no longer correct and the BI project becomes shelfware.

3) They cannot measure success of BI

“If you can’t measure it, it’s not worth doing!” is one of the selling points of BI, but it is difficult to measure the success of BI projects themselves. IT delivers on initiatives that are quantifiable (throughput, response time, performance, data sizes), and since data size is one of the few easily measured aspects of BI, it is the only metric where IT can claim success. This is why we so often read about terabyte and petabyte data warehouses. But they are a small portion of the BI market (2%), and they happen to be the places where data goes to die.

“Business-IT Chasm”: The Business Perspective

My plan for today was to write more about the “Business-IT chasm”, but I came across a great blog post written by Jorge Camoes that reveals the business perspective of this divide. There is nothing better than first-hand experience:

IT will try to change your project, naturally. Try to avoid the “security bomb” (their favorite). You know how poor their expensive BI toys are, and you should know what they can and can’t do with them. Minor concessions can earn you some points. When they tell you they can’t implement your core ideas be prepared to fake genuine surprise, compare costs (again) and emphatically say that their options clearly don’t meet the organization’s needs.

My suspicion that business has limited ability to influence the electrical engineers is demonstrated by this quote:

Pissing off the IT department is one of the most enjoyable games in corporate life, but be a gentleman and don’t make them look stupid. They don’t usually have a good sense of humour and take their quest to conquer the world very seriously. If you really want to implement the dashboard, don’t make it an island if you can avoid it (connect it to the tables in the IT infrastructure, instead of copy/pasting data).

Here is the link to the full post.

Leading Healthcare Information Provider Licenses Good Data for On Demand Business Intelligence

We made the following announcement earlier today and it is obviously a very important milestone for us. And I absolutely believe in what I said in the press release: “Intelimedix’s expertise, combined with Good Data’s on demand collaborative analytics, form an unbeatable combination for healthcare organizations,” said Roman Stanek, founder and CEO of Good Data Corp. “This is a great opportunity to show how solution providers benefit from incorporating Good Data into their offerings.”

Leading Healthcare Information Provider Licenses Good Data for On Demand Business Intelligence

CAMBRIDGE, Mass. – December 2, 2008 – Good Data Corporation, an emerging provider of on-demand (SaaS) collaborative business intelligence solutions, today announced its first customer agreement – with Intelimedix LLC, a leading supplier of business intelligence solutions for health insurers.

Good Data, which recently completed a $2 million initial round of funding from private investors, delivers a cloud-based platform for business intelligence projects. The company is launching a public beta of its hosted service in December 2008 that will offer data analysts in any company immediate and inexpensive access to the power of collaborative business intelligence.

Good Data helps Intelimedix enhance its core analytic service offerings. Intelimedix plans to integrate Good Data capabilities into core applications that run reporting and analysis tools for functions including payment integrity, fraud detection, benchmarking, and measuring operational efficiency.

“In the healthcare environment, users need to be able to access information quickly, efficiently and reliably to make strategic decisions that impact their business,” said David Robinson, Chief Technology Officer of Intelimedix. “Good Data’s technology will help us improve our analytical tools immeasurably. We see Good Data as an important strategic partner that will help us deliver more flexible, effective solutions to our customers.”

“Intelimedix’s expertise, combined with Good Data’s on demand collaborative analytics, form an unbeatable combination for healthcare organizations,” said Roman Stanek, founder and CEO of Good Data Corp. “This is a great opportunity to show how solution providers benefit from incorporating Good Data into their offerings.”

About Good Data
Good Data Corporation was founded with the mission to provide a platform for collaborative analytics. The company believes sharing and teamwork allows users to move past isolated reports and arrive at the true meaning of “business intelligence.” Development of the underlying technology began in 2002 and is currently in use in large insurance and retail corporations. Good Data is a privately held company with headquarters in Cambridge, Mass., and engineering operations in the Czech Republic.