Database management system (DBMS) vendors and technologies
Business intelligence (BI) used to be characterized by speed and cost-effectiveness — short sales cycles, low-cost departmental purchases and deployments, evasion of IT departments’ stranglehold on data, and so on. That focus has blurred, as BI vendors have shifted their attention to analytic applications or enterprise-wide standardization sales. But increasingly I’m seeing signs that the pendulum has swung at least partway back. For example:
- Business Objects and Netezza have announced a mid-range BI appliance.
- Ingres is headed in the same direction.
- QlikTech is enjoying great growth for its fast-deploying BI technology.
- KXEN and Verix offer “easy” data mining technology.
- Search-based BI is trying to circumvent the data warehouse deployment process.
It’s about time.
|Categories: Analytic technologies, Business intelligence, Computing appliances, Data mining, DBMS vendors and technologies, Usability and UI||1 Comment|
Network World today posted my column predicting a rosy future for computing appliances. A lot of the supporting research has been posted in this blog recently; here’s what was a preliminary summary and survey of appliance vendor strategies.
Subsequent to submitting the column, I developed a simpler taxonomy of computing appliance types, namely:
Type 0: Custom hardware including proprietary ASICs or FPGAs.
Type 1: Custom assembly from off-the-shelf parts. In this model, the only unusual (but still off-the-shelf) parts are usually in the area of network acceleration (or occasionally encryption). Also, the box may be balanced differently than standard systems, in terms of compute power and/or reliability.
Type 2 (Virtual): We don’t need no stinkin’ custom hardware. In this model, the only “appliancy” features are in the areas of easy deployment, custom operating systems, and/or preconfigured hardware.
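For concreteness, the three-type taxonomy above could be sketched as a tiny classifier. The type definitions come from the taxonomy itself; the classification rule and all names in the code are my own illustrative assumptions, not anything any vendor ships:

```python
from enum import Enum

class ApplianceType(Enum):
    """The three computing-appliance hardware strategies described above."""
    TYPE_0 = "custom hardware, including proprietary ASICs or FPGAs"
    TYPE_1 = "custom assembly from off-the-shelf parts"
    TYPE_2 = "virtual: no custom hardware at all"

def classify(uses_custom_silicon: bool, uses_offtheshelf_accelerators: bool) -> ApplianceType:
    # Hypothetical rule following the taxonomy's definitions:
    # proprietary silicon trumps everything; otherwise unusual (but still
    # off-the-shelf) accelerators make it Type 1; otherwise it's "appliancy"
    # only in packaging and deployment, i.e. Type 2.
    if uses_custom_silicon:
        return ApplianceType.TYPE_0
    if uses_offtheshelf_accelerators:
        return ApplianceType.TYPE_1
    return ApplianceType.TYPE_2

# A box built from standard parts plus network-acceleration cards lands in Type 1.
print(classify(False, True))
```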
Here’s what I predict for each of them.
My recent flurry of research into computing appliances was spurred by a column I just submitted to Network World. In that column there’s a URL – pointing to this post – promising a guide to more details on that research. Thus, here’s a set of links to my posts of the past few months on computing appliances, both here and on DBMS2.
Half or more of the computing appliance vendors I’ve looked into follow very similar hardware strategies: They use mainly standard parts; they include uncommon but off-the-shelf networking (and sometimes encryption) accelerators; and they of course optimize the mix of those parts and general hardware architecture as well. (EDIT: I actually gave names to three strategies — even if they were just “Type 0”, “Type 1”, and “Type 2” — in this overview of data warehouse appliance vendors. And in another post I considered arguments about whether one would want a data warehouse appliance at all.) Examples I’ve posted about recently include – and I quote the forthcoming column – “DATAllegro and Teradata (data warehousing), Cast Iron Systems (data integration), Barracuda Networks (security/antispam), Blue Coat Systems (networking), and Juniper (security and networking).” (ANOTHER EDIT: But I think DATAllegro’s strategy has changed.)
By way of contrast, there’s also a group whose stance is more along “hardware/schmardware” lines. Sendio and Proofpoint (in most cases) don’t really do anything special at all in their boxes; what’s more, Proofpoint actually has significant software-only deployments over VMware’s virtualization layer. Kognitio and Greenplum think their software-only data warehouse offerings are appliance-equivalents too; indeed, Greenplum’s software is sold mainly bundled with Sun hardware (to the extent it’s sold at all), and Kognitio is hinting at an appliance-like offering for competitive reasons as well. Check Point Software plays both sides of the field; it offers its own kind of “virtual appliance,” but also gets many of its sales through appliance vendors. Its most interesting such partner, if not its biggest, is Crossbeam Systems, which in my opinion may very well represent the future of appliance technology.
Obviously, Oracle has the potential to be a titan in analytics. But it doesn’t have its act at all together yet.
And so I agree with a couple of comments on Stephen’s post, to the effect of “Well, gee, no wonder that Siebel’s BI tools look like they’ll be the surviving technology.”
EDIT: Mark Rittman offers a lot of screenshots of Oracle’s Siebel BI Suite. If you look at other posts on his blog, you’ll see Discoverer as well.
Below is an actual email I sent to my Computerworld editor, the incomparable Tommy Peterson.
So anyway, I visited Intersystems today, at the insistence of PR lady Rita Shoor, even though it seemed a phone call would have sufficed. Notwithstanding that this was a relatively longstanding meeting, Linda scheduled a dinner for us in Cambridge with my stepdaughter, which is basically good, because Intersystems is in Cambridge, but forgot about my meeting, and wound up scheduling the dinner for 9:30. Rescheduling ensued, but when I drove to Intersystems for a 2:30 meeting, it was still in flux. I was in an odd state anyway driving to the meeting, because I was already rather tired (my sleep schedule oddities), but psyched from having FINALLY posted the white paper online that represented my biggest writing project in almost a decade (because of the number of sponsors).
Despite several wrong turns at the tricky address of 1 Memorial Drive, I arrived in plenty of time, or even a bit early. I’d worn my hooded leather jacket due to the rain, but since I was in a parking garage, I decided to leave it in the car. “What can possibly go wrong that would make me need this jacket,” I thought, “except for a fire and building evacuation? And how likely is that??”
So I go upstairs to the meeting (after walking fruitlessly up many flights of stairs and then back down, in an error that seems common among newcomers to the building). But all is good, and there’s a very pleasant start to the meeting (as well there should be, given the GREAT column I wrote about them last year). Before long, however — you guessed it — there’s a fire alarm. After much noise and disruption, it turns out that it’s a REAL fire, and we evacuate, through the smell of smoke, which is stronger on the lower floors.
So I’m outside in a cold drizzle in my shirtsleeves. After a few minutes of stoic schmoozing, I’m reunited with the meeting folks, including Rita Shoor clomping over in 5-inch heels (her estimate) with somebody holding an umbrella over her. At my urgent suggestion, we decamp to continue the meeting in a restaurant, and they select the nearest one (with Rita commenting along the way about said heels). We’re evidently the first people to have this brilliant idea, and continue the meeting in quiet. But soon a flood of people has the same idea, and the place has techies hanging from the rafters, noisily. We continue the meeting over the din, but with some interruptions. We learn that notice has been given that it will be a substantial time before the fire department lets people back in (hence the exodus across the street). We further learn that the apparent cause of the evacuation is a fire in a red Toyota parked in the garage underneath the building, which concerns me, because I indeed arrived in a red Toyota. However, it is clarified that this car was on a different level of the garage than mine, so I relax, and we continue to discuss the glories of Ensemble.
A little while later a young man dashes in, wet from the rain, and inquires whether Curt Monash is present. I learn that one part of the prior information had been wrong; the fire had NOT been on a different level of the garage than the one I’d been parked on. In fact, it is my car that had burned up. More precisely, the engine compartment was burned, the sprinklers had suppressed it, the fire department had staved in the windows, everything was soaked, and the car was almost certainly totalled.
And that, Tommy, is why although you will get a column before I leave on my flight Monday, it may not be as long BEFORE Monday as you had requested, and as I had originally intended.
I’ve been writing this month about the three different paradigms used by the leading enterprise software vendors:
- Data/information-centric (IBM, Oracle)
- People-centric (Microsoft)
- Process-centric (SAP)
Well, in a recent announcement IBM set out to straddle the three categories, and a couple more to boot:
IBM has identified five entry points to enable customers to more easily approach and initiate an SOA project. These entry points include people-, process- and information-centric approaches as well as connectivity and the ability to reuse existing assets.
But a look at some of the detail from the announcements strongly suggests that the three paradigms haven’t truly become co-equal overnight.
For supporting a people-centric approach to SOA, WebSphere Portal version 6.0 integrates IBM Workplace and collaborative technologies, making it easier for users to build and deploy composite applications that can be tailored by industry, role or task. The new release takes advantage of AJAX to create a more responsive user environment.
Sounds like pretty basic stuff.
Additionally, the latest version provides a workflow builder that utilizes the process engine from WebSphere Process Server, open standards-based software powered by WebSphere Enterprise Service Bus (ESB) that helps simplify the integration of business processes.
Ditto, although I’d put that in the “process” rather than “people” category.
To improve business visibility and deliver a process-centric approach to SOA, IBM announces WebSphere Business Monitor version 6.0. This software provides an aerial view of the business and enables customers to proactively identify potential issues before they impact productivity. New features in WebSphere Business Monitor include business alerts, links to third party reports that combine real-time performance and historical analysis, and scorecards to track the status and metrics of projects.
Again, pretty basic.
For an information-centric approach to SOA, IBM is delivering industry-specific models to help clients successfully launch their SOA initiatives. The enhanced IBM Banking Information FrameWork and IBM Insurance Application Architecture models provide a set of critical processes, workflows, and activities to help organizations reengineer their business processes to implement strategic initiatives such as master data management.
Now, I’ve in no way been briefed on those, but off the top of my head that sounds more than just “basic” to me.
Data is still pre-eminent at IBM.
Speculation is rampant as to Oracle’s exact strategic goals in acquiring Innobase and Sleepycat, with more open source vendors rumored to be coming soon. Rather than try to add some nuances directly to the low-end/open-source/brand-extension/embrace-and-extend strategic discussion, I’d like to step back and say one thing:
Multi-DBMS product strategies work moderately well.
Admittedly, in the history of software there have been only a limited number of DBMS products that can be regarded as huge successes, and only in one case has more than one of them belonged to the same company. But even so, history is fairly encouraging toward whatever it is that Oracle is trying to do.
IBM. IBM has had two hugely successful DBMS product lines – IMS and DB2. Since IMS and DL/1 were separate products, and there are also two significantly different versions of DB2, it’s even fair to say that IBM has had four different rather successful DBMS. And that’s not even counting acquisitions.
Informix. Shortly before it imploded, Informix got a little carried away with a multi-product strategy. It didn’t help that by claiming all the products were on a single code line, they were saying something that A. Wasn’t true and B. Nobody would have cared about if it were true. Still, the Progress-like Informix/SE was a fundamentally different product from Informix’s Oracle-competitive high-end products, and both were viable businesses. Unfortunately for Informix, when it moved successfully into the high end it defocused on the low end, and went from being a powerful #2 in the VAR market to a real also-ran.
Sybase. Sybase was once a leader with what is now called Adaptive Server Enterprise, and continues to muddle through as a nontrivial also-ran. Meanwhile, Adaptive Server Anywhere is the leader in its niche. Like Informix, however, Sybase walked away from what had been a strength, namely the laptop/desktop/office OEM market, focusing instead on the pervasive computing/nontraditional computer market at the expense of what was once a strong business position (e.g., as the initial big platform for Siebel’s original Sales Force Automation products).
Oracle itself. The acquisition of RDB from Digital was a major success for Oracle, in that the technology really helped the main Oracle product while the legacy RDB business tootled along to pay for itself. I think the smaller TimesTen will be a big success as well.
I think Software AG is doing OK with a multi-DBMS strategy too, but I’m a bit foggy on the details. Progress has a few very impressive references and not much else from its recent DBMS-like product acquisitions, but I’m cautiously optimistic there. That leaves Microsoft pretty much as the only single-DBMS vendor around, and I’m sure there are folks in Redmond who, because of Analysis Services or Access or something, would even dispute that.
If Oracle pursues some kind of parallel product line open source DBMS strategy, there’s every reason to think it can pull it off with only moderate conflict and anti-synergy. At least, that’s what industry history seems to suggest.
And I have some thoughts as to why this is true. In no particular order, they are:
1. Developing DBMS is a hard skill – and one that’s transferable from project to project.
2. The same goes for a grab-bag of specific experience, tricks, algorithms, and so on.
3. Positioning of multiple DBMS products need not be in serious conflict. (Actually, companies do tend to screw that up a lot, which is why almost all the successes I outlined above are only partial. Maybe I’d better save a detailed discussion of that point for future postings.)
Marten Mickos, CEO of MySQL, is a quotable man this week. Oracle saw to that by acquiring Sleepycat, on the heels of its prior acquisition of Innobase. Basically, his message is rah-rah open source, he really truly can compete with Oracle on functionality, but of course as a practical matter Oracle probably is locking in its application customers to its DBMS, including customers from the Siebel and Peoplesoft acquisitions. That makes sense. It’s consistent with what I’ve been hearing from SAP. I now think that the quotes elsewhere suggesting he wasn’t serious about powering ERP software at all were misunderstandings. He just recognizes that the ERP software MySQL will power will largely be SAP’s.
As I’ve previously noted, the expectation is that MySQL will wind up getting share in SAP’s customer base. At least, the expectation is that their technology will be good enough to do so. The business reasons for SAP to favor this outcome are of course pretty obvious. Almost the only remaining question is whether SAP will back MySQL with great force, or whether it will divide its love between MySQL and its own inhouse DBMS product MaxDB.
The disclosures in this post have been updated in June, 2008.
I’m sometimes amazed at the breathless pseudo-naivete about pundits (analysts, bloggers, whatever) and compensation. The latest round was kicked off by a WSJ article about bloggers promoting FON. A couple of years ago, Computerworld editor Maryfran Johnson was viewed as a heroine for pointing out analyst firm conflicts of interest.
Personally, I’ve been an analyst for almost 30 years; I have a strong reputation for being independent and critical; and I get most of my revenue from vendors. So perhaps I’m in a good position to clarify some of the issues.
1. Good vendor relationships are an important factor in an analyst’s success. It’s not just revenue; you also need access to information. This is true whether you’re a stock analyst or an industry analyst.
Now, if you’re a good analyst, you can work around access problems. You can talk with customers, competitors, ex-employees, and other industry players. You may have relationships that transcend the company’s communication controls. (For example, it’s a firing offense at Oracle to have unsanctioned conversations with an analyst. And Oracle isn’t sanctioning a whole lot of conversations with me these days. But for a number of reasons, such as longstanding relationships with “untouchable” higher-ups, my information flow from inside the company is still pretty good.) Still, having access is better than not having access, and companies use that as a lever.
2. Analysts typically have more confidence in the companies that are their paying clients. I honestly call ‘em as I see ‘em, no matter who is or isn’t paying me. But some of my calls have to do with confidence. And who will I be more confident in? Company A, which has disclosed almost all its current activities and intermediate-term plans to me, and has given serious consideration to expensive advice it’s paid me for (and hopefully done something with the advice)? Or Company B, my relationship with which consists largely of being fed marketing pabulum, with only the occasional renegade going off the reservation and telling me what’s really going on? Obviously, it’s often Company A.
Gartner Group is no different from me in that regard.
3. There’s a reinforcement cycle that confuses questions of bias. Companies give money and attention to analysts who are positively inclined towards them. They buy consulting services from analysts whose worldviews are compatible with theirs. The resulting relationship, if it goes well, reinforces everybody’s positive opinions of each other.
Meanwhile, companies give cold shoulders to analysts who don’t like them. And that just reinforces analysts’ opinions too.
4. Experience teaches that the companies that most manipulate or hide from analysts have the most to hide. If a company feels good about its strategy, and is eager to listen and learn how to make it even better, it’s often pretty engaged with analysts. If there are some product weaknesses it would prefer not to have discovered, it may be more inclined to concentrate its efforts on only the big firms it must talk to, and cold-shoulder the others. There are exceptions, of course, based on factors such as marketing budgets or the cluefulness of the analyst relations staff. But a good analyst’s gut feel about who is or isn’t being forthright is often a pretty good indicator of how a company’s technology is doing. Indeed, I have had some famous successes in this regard over the decades (e.g., the Cullinet and Sybase stories, which I really need to write up at some point over on the Software Memories blog). And it’s not just me. David Ferris of Ferris Research led the way when he and I had a success of that kind together with respect to Critical Path, shortly before the management team was discovered to be criminally dishonest.
5. Being on advisory boards almost always involves compensation or the expectation of compensation. Anybody who asserts otherwise is dishonest or naive. But then, the only folks I’ve ever seen assert otherwise are Fabian Pascal and (sort of) Chris Date.
So here is some of my disclosure.
- SAP is currently my biggest customer. In various other years my biggest customer has been Oracle, Computer Associates, Microsoft (I think; if not, then close to it), AOL, or a predecessor of what is now the Progress DataDirect division. And that’s by no means a complete list.
- Every white paper and every webinar I do is “sponsored”; i.e., money changes hands. (There may be occasional exceptions to that rule in the future, but it’s usually the case.)
- The companies that are currently most seriously diminishing my opinion of them via the cold shoulder they give to various analysts (not just me) are Oracle and Cognos.
- For years, I have had exactly one investment research client — a portfolio manager whose identity you could probably guess by looking at the testimonials on www.monash.com.
- I cannot commit to promptly or completely disclosing who my consulting clients are. Sometimes they want to be served in confidence. However, I always have and in the future always will disclose any kind of relationship in which I am paid to promote companies in any way.
Borland is exiting the IDE business. Wow. On the one hand, I long ago figured out that IDEs weren’t a real business. On the other hand, this is Borland we’re talking about — the last holdout.
Three factors killed the IDE business, and a fourth drove a stake through its heart. None of these, IMO, is the rise of Eclipse; that’s a symptom of the problems, not a cause. Rather, I think the key factors are:
A. Vendors are paid well for run-time products, not development-time. Most categories of “platform software” actually have a major programmer-productivity aspect to their pitch. Microsoft Windows makes device connectivity easy. Application servers make Web connectivity, data integration, load-balancing, and/or failover easy (the reason for buying them has changed frequently). Database management systems are ultimately just big SQL interpreters. Similar stories could be told about other categories, including almost everything in analytics.
And those product categories are often big businesses, because vendor revenue depends on the number of end-users, not the number of developers. Thus, it’s often obvious that value far exceeds expense. By way of contrast, getting the same revenue from developer-based pricing might require tens of thousands of dollars per developer seat, and rightly or wrongly that kind of pricing is very hard to enforce.
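To make that asymmetry concrete, here is a back-of-envelope sketch. All of the numbers are hypothetical illustrations of the argument, not figures from the post:

```python
# Hypothetical figures contrasting run-time pricing (per end-user)
# with developer-seat pricing for the same total revenue.
end_users = 10_000              # people who use the deployed application
developers = 50                 # people who build and maintain it
runtime_price_per_user = 100    # dollars per end-user run-time seat

# Revenue scales with deployment size under run-time pricing...
runtime_revenue = end_users * runtime_price_per_user

# ...so matching it purely from developer seats requires a price
# per seat that is very hard to enforce in practice.
per_developer_equivalent = runtime_revenue / developers

print(f"run-time revenue: ${runtime_revenue:,}")
print(f"equivalent developer-seat price: ${per_developer_equivalent:,.0f}")
```

With these assumed numbers, $100 per end-user quietly adds up to $1,000,000, while extracting the same total from 50 developers would mean $20,000 per seat, which is exactly the kind of pricing the post says the market wouldn't bear.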
There was a period in industry history when technology made it natural to officially have run-time versions of developer tools. This was the “fourth-generation language” period, which arguably lasted from the early 1980s until the mid-1990s or so. But once Java came along, everybody wanted compiled code instead of proprietary interpreters. And that was that.
B. Price competition was brutal. Server-based development tools may have been expensive, but PC-based language products were very cheap. Microsoft’s first product ever was a Basic interpreter; I don’t know the price for sure, but I’m guessing it was a few hundred bucks. Way back in the early/mid-1980s, Borland started out by selling a $49 developer tool, namely a Turbo Pascal compiler. When Microsoft and Borland duked it out in the C++ market in the early/mid-1990s, huge amounts of software documentation showed up for what were rather low-priced products.
But the real killer was Visual Basic. VB conflated the IDE and “language” markets, and imposed language pricing on the development tools market. And that was that. What’s more, a significant fraction of the development tools market was held by the independent DBMS vendors (Oracle, Informix, et al. — and the same had been true in the prerelational era). They wanted account control, with lots of applications built on their DBMS to create lock-in and more server sales. And that was a higher priority for their tools businesses than making a profit. So when it became hard to hold the line against Microsoft tools pricing, Oracle et al. weren’t all that depressed about caving in.
C. Products are obsolete before they’re mature. Contributing strongly to the economic problems of the IDE business is that the products usually don’t do that good a job. Oh, in many ways they’re great, and programmers swear by them. But programmers also swear at them, because they commonly do only part of what is necessary. Generally, a new tool will be developed to help with a new need, such as relational DBMS access or GUI client/server interactions or three-tier processing or whatever. But these tools will often be weak at what came before; e.g., PowerBuilder and Visual Basic weren’t very good at industrial-strength scalability. By the time the shiny new tools mature to do a good job at the older requirements, some other platform shift comes along, with yet newer and shinier tools to handle the latest twists.
D. It’s all about collaboration. The latest requirements shift is from supporting individual developers to supporting teams. That makes almost everything about IDEs irrelevant, or at least a commodity. Borland, which has been telegraphing today’s shift for a while, may have made the point most clearly. But you hear similar things from Microsoft and Oracle.
So there you have it. I was perhaps earlier than most in figuring this out, having gotten a very painful education in the point by the commercial failure of my critically acclaimed application development tools opus in the mid-1990s. (It survived as an essential reference for the trade press for years. But not many users ever actually bought it; instead, they just bought Visual Basic and didn’t even consider the more sophisticated products I wrote the guide about.) But I think the whole IT world sees it now.