Friday, December 18, 2009

Lock-in

Switching costs are the costs incurred when changing from one brand to another. Tangible switching costs are things like laying a new telephone line. Intangible costs are things like the inconvenience of switching to a new telephone number.

Vendor lock-in arises when a customer prefers a different product from the one he uses, but not enough to pay the switching costs. Many vendors of high tech products work to increase their own lock-in and decrease the lock-in of their competitors. In some cases, for example computer hardware, switching costs tend to decrease with time.

Brand specific training for users of computer software is a lock-in that tends to increase with time. Users do not like changing from software they are familiar with, and the longer they use the software and the more proficient they become, the less willing they are to change. With enterprise software, the cost and risk of switching from one product to another is a powerful force maintaining the status quo.

Lock-in is the key issue in software marketing. The usual marketing ideas that apply to selling consumer goods with high marginal costs, like fresh fish, simply do not apply. In fact the normal rules of supply and demand are so skewed in the IT market that they are almost impossible to recognize.

Competing commodity software packages immediately become freeware. The reason is that the high initial cost of creating software combined with the low marginal costs per customer encourages vendors to increase market share by price cutting. If there is no brake, the market ends up spiralling down to freeware. If a new standard is introduced into the market, and it succeeds in becoming a real standard, your package may become a commodity and you may find yourself having to give it away.

But by locking customers into a solution, software vendors can reap sizeable profits even if their products are more or less the same as the competition's. The reason is that the switching cost -- and not the license fees for the software itself, which tend to be a small part of the total cost of ownership -- is what keeps the customer paying.

Tuesday, December 08, 2009

Alea developers move to Jedox

There is an interesting story behind Jedox hiring the remaining Alea developers from Prague. As far as I know there are three left, but once there were thirty. MIS hotheadedly fired most of them after the disastrous Alea 4.0 project back in 2001 (or maybe 2002, I have forgotten).

Alea was a clone of TM1 developed by MIS GmbH from scratch in Prague back in the mid 90s. After squandering its IPO money, MIS ended up as part of Infor. Comshare was there too, and the two product lines started to compete internally and merge. In my opinion MIS had better technology than Comshare, but I am not a neutral observer.

One upshot of all this is that Alea was renamed Infor OLAP and development was moved to Ann Arbor. My guess is that this is one reason why Matthias Krämer went to Jedox, though I haven't asked him. He won't need much time to get to know the products.

The MIS front-end tools are still being developed in Prague. The Web application development tool was originally created by Intellicube, which Christian Raue (founder of Jedox) sold to MIS. The development team was partly made up of redundant Alea developers. I was later product manager for that product, now known as Application Studio. Unsurprisingly, the Jedox front-end tools are very similar to Application Studio.

Wednesday, November 11, 2009

PowerPivot as a database

A couple of people disagreed with my somewhat pessimistic analysis of PowerPivot's (Gemini) prospects. So I decided to jot down a few more details on how the product works as a database. I still stand by my original comments, but it is interesting to look at what the advantages of the product are as well.

PowerPivot is an interesting approach to dealing with the problem of data storage in Excel. Microsoft Excel users often store data in Excel as if it were a database. But Excel is not a good database for several reasons:
  • It fails to separate formats and raw data in an orderly way.
  • It does not provide an easy way to share data among users.
  • It has no straightforward means of applying calculations to more than a single cell.
As a result, there are lots of issues that come up in projects making heavy use of Excel. In particular performance, accuracy, reliability and versioning are compromised. Gemini deals with some of these issues.

Gemini addresses the issue of data storage in Excel in the most simple and obvious way. It provides a new method of storing data in an Excel file that is separate from the formatting and calculations in Excel. It does not, by default, store the data outside the Excel file. But although PowerPivot data is stored in the same file, it is kept separate from the rest of the Excel data.

To understand exactly what is going on we need to take a step back and look at how the latest Office products store data. Basically the new Office format is a zip file which contains several XML files. One XML file contains the spreadsheet itself. Other objects such as charts are stored separately in the same zip file as the spreadsheet file. (Actually this is a simplification. Each sheet is stored in a separate file, and there are other complications involving shared content.) The zip file has the format of a normal zip file even though it carries the extension .xlsx rather than .zip. But don't take my word for it -- you can see the exact internal format of Excel for yourself by simply opening any Excel 2007 file with a data compression program like WinRAR.
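
If you prefer a script to WinRAR, a few lines of Python make the same point. This is just a sketch -- the file name is a placeholder, and it assumes you have any Excel 2007 (.xlsx) file at hand:

    # List the parts inside an Excel 2007 file. An .xlsx is an ordinary
    # zip archive, so the standard zipfile module opens it directly.
    import zipfile

    with zipfile.ZipFile("report.xlsx") as xlsx:  # placeholder file name
        for part in xlsx.infolist():
            print(part.filename, part.file_size, "bytes")
        # The workbook itself is just another XML part in the archive:
        print(xlsx.read("xl/workbook.xml")[:200])

Typically you will see xl/workbook.xml, one xl/worksheets/sheetN.xml per sheet, and separate parts for charts and shared strings.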

In Office 2010 it will be possible to store additional data in another separate file in that same zip file and to access that data using the database interfaces provided by Excel. These database interfaces are the pivot table and the database spreadsheet formulas. From the outside the file won't look any different -- except maybe a little bigger. Or maybe a lot bigger. The much ballyhooed column compression only works in memory.

Office 2010 also addresses the issue of sharing data between users. I guess this option is really intended for Word more than for Excel, but it works across the entire product line. In any event an Excel sheet stored on a server will be available to multiple concurrent users with a local installation of Excel or in Sharepoint. The idea is that you create an Excel file and put it on a server where all your colleagues can access it, and the product allows you to edit the file simultaneously, even picking up changes made by others. This sounds like a rough and ready solution to the second problem listed above, although it does not provide any security.

The final issue that seems important to me is applying a calculation to a large number of elements. And here, too, PowerPivot does offer some respite. Calculated columns can be added to the table. This is a Good Thing because it saves the user from the difficult and error-prone procedure of copying a formula down the entire column. It is not a replacement for multidimensionality, because the "database explosion" effect means that you would need to define a lot of columns to imitate multidimensional aggregation. But plenty of Excel-based solutions get by without this so I guess it won't always be an issue.
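
The idea of a calculated column is easy to see by analogy. Here is a minimal sketch in Python with pandas -- not PowerPivot's own formula language, just an illustration, with invented column names -- showing a formula defined once for the whole table instead of copied down cell by cell:

    import pandas as pd

    # A small invented sales table standing in for a PowerPivot table.
    sales = pd.DataFrame({
        "units":      [12, 7, 30],
        "unit_price": [9.99, 24.50, 3.75],
    })

    # The calculated-column idea: the formula is stated once and applies
    # to every row -- nothing is copied down the column by hand.
    sales["revenue"] = sales["units"] * sales["unit_price"]
    print(sales)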

The upshot is that the product really does offer departmental users some relief from Excel chaos by treating the worst symptoms. And it does so without being disruptive, a key design consideration in view of the size of the Excel user base. So I can understand some people's enthusiasm.

Tuesday, September 01, 2009

The Limitations of Gemini

Information is still a bit sketchy on Gemini, but it is already fairly clear where Microsoft is going with it. The clarity comes partly from Microsoft's public statements and partly from a sober analysis of the constraints Microsoft's BI program is subject to.

Microsoft will be offering three ways to view Gemini data. The first way is the grid they have been presenting in their demos. Take a close look at that grid. They always start it in Excel, subliminally -- or not so subliminally -- suggesting that the data is in Excel, but that is not the case at all. In fact it is just a separate grid that floats above the Excel window. The Gemini grid has nothing to do with Excel and is pretty basic as a reporting tool.

The second and third ways of viewing Gemini data are not new at all. In fact they are just the same tools that are now used for viewing Analysis Services data -- Reporting Services and Excel pivot tables. Although both are adequate tools for some purposes, they tend not to score particularly well in customer satisfaction surveys. Perhaps more important, they do not offer much in the way of features that are specific to the data model of Gemini.

The marketing role of the front end is very different in QlikTech. QlikTech's main marketing argument is "associative analysis" which it claims is a revolution in OLAP. Whether or not you agree with that claim, there is no doubt that the product delivers a specific set of features to make good on it. Specifically, the selection behavior of the list objects with their characteristic green shading, automatic hiding of unassociated elements in other objects and so on are specialties of the product that visualize the concept of associative analysis.

Arguably the real reason for this is that the front end is out of the hands of the creators of Gemini. After all, Microsoft's BI front end is Excel. Excel has perhaps half a billion users, and most of them would just be confused by any radical change in the interface.

The data import tools for end users look pretty simple. This is an issue that often gets forgotten in BI sales presentations, because data import is not a topic that end users like to get involved with. It is also a key weakness in QlikView, which offers a few automatic options but quickly switches to a proprietary scripting language. So far Gemini looks even weaker. But strengthening these tools would mean competing with the tools SQL Server already offers.

So with Gemini Microsoft isn't offering anything new in the front end or data import. What this means is that Microsoft is betting on the in-memory feature itself to make Gemini attractive to the end user. Microsoft has very little in the way of analogous features for end users. Instead it has emphasized the issue of scalability, always a popular marketing claim but not the key issue in a self service environment.

So Gemini will have to sell itself on the technical merits of in-memory data management.

Wednesday, July 29, 2009

IBM acquires SPSS

Still digesting Cognos, IBM is now investing in high end analytics -- Reuters says it splashed out $1.2 bn for the data mining and predictive analytics specialist SPSS. That is a remarkable sum, considering that it's a 40% premium on the value of the stock and about 25 times the company's annual profit. The takeover is great news for veteran CEO Jack Noonan, who joined the company in 1992 and is now approaching retirement age. This is especially true considering that the success of the open source R language is squeezing proprietary data mining vendors.

This isn't IBM's first foray into data mining. In 2000 BARC published a study showing that IBM's Intelligent Miner was one of the best products on the market from a functional point of view, right up there with SAS Enterprise Miner. Ultimately, however, IBM never had much success with Intelligent Miner. IBM has been moving out of the market step by step and now only offers a few specialized solutions. And when IBM announced the SPSS deal, Intelligent Miner was only mentioned in passing as an execution engine for SPSS solutions.

Maybe the Cognos acquisition changed things for IBM. SPSS is a Cognos OEM partner, providing advanced analysis features to Cognos. Cognos is more focused on reporting and basic analysis, so the SPSS products show no real overlap. On the other hand Cognos has also made forays into the data mining business, including its purchase of Forethought, part of a once fashionable but now forgotten desktop data mining strategy.

Data mining and advanced analytics companies have carved out an interesting niche for themselves. Bringing data mining to the masses in the form of simplified data mining tools has not been a very successful idea, probably because the concepts are just too complex for the average business user. And BI companies have worked for decades to replace custom modeling with pre-built applications, but have had little success. But both SPSS and SAS successfully sell applications such as analytic CRM and vertical solutions for the financial industry.

SAS is strong on data management and advanced analysis. IBM now has assets in both areas. SAS is still stronger in specialized analytic applications, but SPSS has been successful in this area as well. Furthermore IBM is likely to emphasize pre-built applications -- as it has for Cognos.

SAP Business Objects is another company that will have to adjust to this acquisition. It had a very close relationship with SPSS, because SAP -- like IBM Cognos -- does not have much to offer in this market segment. SAP will have to look around for a new partner, perhaps KXEN or another smaller vendor.

Friday, July 10, 2009

Running BI on Chrome OS

The Google Chrome operating system would basically be a browser and some drivers to connect to the underlying hardware. It would be the realization of Marc Andreessen's dream from the 90s of reducing Windows to a device driver.

It only makes sense for a client. The idea would be to run programs with a Web client that don't require any local installation. That way it doesn't matter what the underlying system actually is.

Google has been pushing HTML 5 pretty hard, and Chrome OS is a good reason to. For example, HTML 5 has a tag to show videos, which means you no longer need a plugin. That means shifting the burden to the browser itself. The more the browser does, the more viable the Chrome OS idea is.

It's notable that a lot of BI tools would work on Chrome. For example, a lot of the sprawling Information Builders product line would work fine, as would MicroStrategy -- they run on DHTML. So would Business Objects WebI, which is Java based. I'm guessing Java will work. Cognos would have some issues because some of the studios require ActiveX.

And so on. I'm not attempting to do a complete survey here, just point out that BI vendors have already done quite a bit for Web support, and ultimately, a Web strategy is a Windows-free strategy.

Monday, June 15, 2009

Cognos and the midmarket

At the IBM event in Berlin last week (the "IOD") Cognos spent a good deal of time and energy briefing analysts on its midmarket strategy. To make a long story short, Cognos is planning to bring out a new product to address the midmarket.

One take is that it is at least partly a step towards alignment with the policies of IBM's service organization. IBM Services has had a mid-market offering for years now. And my impression is that Cognos is focusing a lot of energy on aligning itself towards IBM Services as a sales channel. The alignment towards IBM Services also shows in Cognos's increased focus on prepackaged business content.

However the midmarket idea is in line with what appears to be an internal shift at Cognos back towards the end user. Cognos 8 shifted the product line towards the needs of the enterprise. The shift back has been going on for some time, probably as a reaction to customers who prefer to stick with Cognos 7. "The pendulum is swinging back" was the message in Berlin, and recent versions of Cognos 8 BI have revived the PowerPlay product family.

Targeting the midmarket is not exactly the same thing as targeting the business user. Targeting business users can also mean creating departmental solutions in large companies. However there certainly is an overlap. The new product is intended to be easy to install and administer and to cover planning, reporting and analysis in one package. It will have Web access and an Excel front-end. I'm guessing it's also going to be relatively cheap.

Cognos is adamant that it is building a new purpose built solution, not simply bundling existing tools. But it also says the new product is "based on existing proven technology like IBM Cognos TM1 and IBM Cognos 8". It will be interesting to find out exactly what that means.

Issues they should be looking at:
  • Compatibility with existing products. Customers using TM1 might want to upgrade to this product. If they can't they might start worrying about the future of the existing product.
  • Organizational issues. Will IBM Cognos have a sales team willing to make the effort to sell this product?
  • Easy installation. Self service is a key goal in the project, and installation needs to be automatic.
  • Automatic metadata exchange. TM1 models need to be automatically visible in the Cognos 8 bits of the offering, even at the cost of flexibility.

Friday, April 17, 2009

Twitter's limited API

When I first looked at the Twitter API I was surprised to see how simple it is to use. The next thing I noticed is how little it can do. In particular, as a business intelligence guy I was struck by the lack of sophisticated query methods.

The API does not provide a way to do simple things like get a sorted list of the people you follow, which is probably OK. It's easy enough to do your own sort in your client. But what if you want to sort the people you follow by, say, the number of people THEY follow? To do that, you need to make an API call for each person you follow to find out who he follows, and then do the sort in the client. This is pretty much hopeless for analysis purposes, especially considering the API's calls-per-hour limits and the fact that there are lots of Tweeters following 20,000+ people.
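
To make the problem concrete, here is a rough Python sketch of that one-call-per-friend pattern. It is written against the v1 REST endpoints of the day, which have long since been retired, so treat the exact paths, parameters and field names as assumptions rather than gospel:

    import requests

    API = "http://api.twitter.com/1"  # 2009-era v1 base URL (long since retired)

    def friends_by_their_friend_counts(screen_name):
        # One call fetches the ids of everyone you follow...
        ids = requests.get(API + "/friends/ids.json",
                           params={"screen_name": screen_name}).json()
        counts = []
        for user_id in ids:
            # ...but one MORE call per friend is needed to learn how many
            # people THEY follow ("friends_count" in the user object).
            user = requests.get(API + "/users/show.json",
                                params={"user_id": user_id}).json()
            counts.append((user["screen_name"], user["friends_count"]))
        # The sort itself has to happen client-side.
        return sorted(counts, key=lambda pair: pair[1], reverse=True)

At the old limit of 150 calls per hour, someone following 20,000 people would need several days just to run this once.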

Twitter doesn't deliver some API features because it can't afford to. The "firehose" is the jargon for a real time feed of all tweets from everyone worldwide. Twitter has been promising this for some time but keeps delaying it. One reason may be that they are afraid there would be too many takers. I suspect the reason Google can offer so much storage to Gmail users is that no one uses it -- like a bank hoping there won't be a run. Using an add-on to store lots of data there is possible, but I think Google frowns on those shenanigans and will even block your account if you upload too fast... So as long as Twitter keeps growing at its current breakneck speed, there are some things the API won't offer because offering them could break Twitter's overstrained servers.

Another reason that Twitter might not offer some functions to their API is that they want to sell analyses as an added value service. Twitter still doesn't have a business model (except "Microsoft or Google") but analytics is an obvious option. In fact I think that providing analytics is the only real prospect that Twitter has.

In particular, Twitter has a mechanism for providing that favorite business intelligence feature -- the real time alert. For example, sighting a yellow headed blackbird in Connecticut is unusual, but the news needs to be immediate for an ornithologist to profit from it.
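
A toy sketch of what such an alert might look like, polling the public search endpoint of the day (search.twitter.com, also long gone; the URL and JSON field names are assumptions based on the old API):

    import time
    import requests

    SEARCH_URL = "http://search.twitter.com/search.json"  # 2009-era endpoint, now defunct
    QUERY = '"yellow-headed blackbird" Connecticut'

    seen = set()
    while True:
        results = requests.get(SEARCH_URL, params={"q": QUERY}).json()
        for tweet in results.get("results", []):
            if tweet["id"] not in seen:  # alert only on new matches
                seen.add(tweet["id"])
                print("ALERT:", tweet["from_user"], tweet["text"])
        time.sleep(60)  # poll once a minute; a real alert service would push instead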

Twitter is also about people, not just about information. It provides relatively detailed information about who knows who. In fact I see this as the key feature of the service, so I am surprised that so little effort is invested in suppressing spam.

But whatever the specific application, the ability to analyze Twitter's database is too valuable to give away, so I suspect Twitter will not do too much to make their API better for analytics in the near future.

Monday, April 13, 2009

Combining SaaS, Open Source and BI

The terms business intelligence (BI), software as a service (SaaS) and open source (sorry OS means operating system to me so no acronym) get juxtaposed a lot these days. Have a look at this as an example. The connection between SaaS (which kind of segues to cloud computing) and open source is twofold:
  1. An SaaS provider is a kind of license multiplier, so license fees are critical for him. That makes open source more attractive to him than to most others.
  2. Open source products are usually too technical for business users. But SaaS providers may shield their users from this complexity.
The upshot of all this is that open source may be very interesting to SaaS BI providers, even if it isn't very interesting to end users. There is plenty of chatter about this, but as usual a lot of the messages are aimed at the wrong recipient. It makes no sense to pitch open source SaaS to BI end users, because they don't care. But it makes a lot of sense to pitch open source BI to BI SaaS providers.

This whole thing is typical of the muddle surrounding BI, which is pretty technical but aimed at business users.

Thursday, April 09, 2009

Lyza, Gemini and QlikView

I've seen a recent rash of comparisons between Lyzasoft's Lyza and Microsoft's Gemini project. For example, here, here, here, here and here.

It's an interesting twist to the usual way you market a product. Lyzasoft seems to want to cash in on the buzz Microsoft is creating for its (competing) product. This in turn is connected to the buzz surrounding QlikTech.

Tuesday, April 07, 2009

SAS and business analytics

The discussion of SAS's new marketing campaign that differentiates between "business intelligence" and "business analytics" goes on. SAS has replied to some of the criticisms. To be honest, as I have already said, I have a lot of doubts on the subject.

On the other hand, I honestly don't care very much. At BARC a big part of our mission is helping customers find the right product. To do that, we make a big effort to help companies separate important information from less important information. When we advise customers about which product they should select, we never discuss the vendor's marketing material. We discuss the user's needs and the feature set of products that seem likely to fit those needs.

Nigel Pendse wrote a piece called "What's in a name?" some years ago. It's dated in all its details, but still rings true. It certainly is not a criticism of the products the vendors had on offer, or any recommendation pro or con of any of the products the vendors offer. That would have been a disservice to the vendors and to potential customers.

I don't always agree with the way vendors present their products, but in the end it's their business. And even when I like the way they present their products I don't recommend that anyone base a purchase decision on any marketing statement. The question is how well the product fits the users' needs.

I also wonder if SAS is overreacting. The company can hardly expect analysts not to react to a marketing message announcing the death of business intelligence. It is obviously intended to be provocative. It would be unrealistic to expect all the reactions it provoked to be positive. I don't think the best analysts are necessarily the ones that praise the vendors the most. As Abraham Lincoln said, knavery and flattery are blood relations.

Update: This post is partly based on something Peter Thomas twittered. I hadn't realized that he also had a new blog entry on the topic when I wrote it.

Saturday, April 04, 2009

Excel as a planning tool

In the course of my consultancy I often come across examples of companies doing their planning in Excel. For small-scale scenarios this is fine, but we sometimes see amazingly big and complex systems built on Excel. I do not consider this to be best practice.

I have twittered about this a time or two, and every time I get responses from people asking me what I mean. I think this is one of the great things about social media. As a (self-anointed) expert on BI I tend to think that this is an obvious point. In fact Twitter has reminded me that it is not.

What I am talking about is large sets of Excel spreadsheets, sometimes containing a good deal of complexity, which are sent around a company by email to collect planning data. Such systems often contain tens of thousands of Excel formulas and are sometimes augmented by BASIC code as well. In some cases the results are fed back into some transactional system, but not always.

So here are a few important points on this issue:
  • I am not at all critical of the idea of Excel addins. Addins are third party products that enhance Excel. In fact, far from being critical of this class of product, I like them quite a bit, and MIS, the company I once worked for and which now belongs to Infor, was one of the many vendors of useful products of this type. Recently all the big BI vendors have piled into this market. An in-depth discussion of this type of software is also the topic of one of the most popular documents at OLAPReport (login required).

  • I also do not criticize companies and people who create this kind of system. In many cases it is the only way they have to deal with the complexity they are presented with in the limited time the planning cycle allows. The ingenuity I have seen put into some of these systems is amazing.

Nevertheless, I think these systems are very problematic, and any company using them should invest time and energy reviewing them and attempting to find a good way to replace them. The reason is that they are expensive to maintain, limited in functionality -- particularly in the area of analysis and simulation -- and inevitably suffer from data quality issues.

Tuesday, March 31, 2009

Maybe Status.net could be a BI application

This post has gotten a lot of attention. There are a couple of interesting points in it.

I think the idea that social networks could deliver new kinds of information to BI systems is a good one.

I also agree that social networks could deliver BI content. In fact BI is often very collaborative.

But that won't be the future of BI.

For one thing, there is a lot of information out there that nobody really has in his head. BI tools are there to discover that information. Until that happens, nobody in your network is going to be able to tell you about it.

BI content is often created by power users who swap content back and forth or create reports and publish them to relatively passive recipients. It is a very social activity, and in some ways it fits social networks. On the other hand, social networks have some of the well known failings that knowledge workers struggle with now, especially versioning, data quality and other reliability issues. Until a social network finds a solution to those problems, it won't revolutionize BI distribution.

Monday, March 30, 2009

SaaS is irrelevant to BI
There is some talk in the market about whether SaaS BI is viable. For example, Forrester is calling it unproven. Others aver that SaaS is the future of BI.

My view is that SaaS is simply irrelevant to BI. The arguments pro and con seem more suitable to discussing the future of SaaS ERP than to SaaS BI.

Forrester, for example, says there is a lot of skepticism about real time data transfers. Fine, but real time data isn't much use for strategic decision making anyway. Its study explicitly compares BI to ERP and ECM, neither of which plays by the same rules as BI.

SmartBiz implies at least that SaaS will make implementation times faster but doesn't offer any evidence -- lots of BI products are Web based anyway. Contrary to Forrester it claims that SaaS is better for dealing with large data volumes.

I suspect that SaaS is largely irrelevant to BI. A lot of the arguments pro and con seem more applicable to ERP than BI anyway. It may be that some SaaS vendor is successful in the market in the coming years, but I don't think it will be because the vendor offers SaaS. It will be because it offers attractive BI features.
SAS business intelligence and business analytics
SAS's marketing people have started a discussion about the difference between business intelligence and business analytics -- whatever that means.

Here's a rather peppery take from Neil Raden.

James Taylor's not exactly new to the business, but he seems perplexed by SAS's marketing.

And Peter Thomas points out that SAS is attacking its own line of BI products.

Also talked to Michele Goetz about this, and she seems to be saying that SAS got it backwards -- BI is replacing business analytics!

That's how the buzzword bubble works. A vendor or analyst looking for attention simply invents a new term, or differentiates in a new way between terms, and a whole wave of discussions breaks loose. The upshot is that the company that started the discussion gets a lot of attention.

I think this is more legitimate for an analyst than for a vendor. After all, analysts are there to puzzle things out. Of course folks get carried away sometimes, and there are a lot more buzzwords out there than you really need to describe what's going on. But it's also true that new insights keep coming up and the market changes all the time.

It seems to me that a vendor would only have an excuse if
  1. It actually has some new product to sell.
  2. It decides to change its sales strategy.
Presumably SAS's goal was to get people talking about SAS, so that worked out pretty well. But changing your positioning is always a little uncomfortable for a vendor, because it leaves your customers asking what happened to the old positioning. Look at Microsoft's discomfort after it withdrew its planning tool. And in this case we get back to Peter Thomas's point -- if we take this seriously, what are we to think of SAS's BI products now?

Sunday, March 29, 2009

The new BI Survey is hitting the market
It's fatter than ever this time around, and like this guy says, the new edition serves up plenty of Nigel Pendse's trademark acuity.
Will the recession help business intelligence?
Well maybe it will, though I expect total IT outlays to decline. At least I would say that BI is a relatively recession proof business, because bad times tend to encourage people to think about where the money is going.

Anyway BusinessWeek is hyping the meme. http://bit.ly/QcEa