Sides of Subverting Open Source

Martin Schneider at The 451 Group commented on whether the collective “we” can be too jaded regarding some proprietary vendors’ apparent embrace of open source methods. This was in response to a piece by Dave Rosenberg and Matt Asay about subverting open source for the sake of certain marketing purposes. Rosenberg and Asay essentially say that Microsoft and SAP have a well-known history of speaking out against Free and open source software (FOSS) and its concepts.

Certainly, Microsoft and SAP have put effort and money into spreading fear, uncertainty, and doubt (FUD), and both have publicly made some very strange statements about or against FOSS. Yet recently, both have been putting some effort into releasing software in an open source manner or funding some open source development. Rosenberg and Asay seem to think there is an ulterior motive:

“Any outreach attempts from vendors who have worked for years to destroy open source should be taken with a grain of salt and a sharp eye cast on motivating factors.”

Or could this mean, as Schneider suggests, that these companies are beginning to join the community’s stance that open source “…is simply a better way to create and distribute software”? Rosenberg and Asay seem to take that into account by acknowledging that the project leaders for the open source initiatives within these companies probably are working in earnest. Still, I can’t help but lean toward a bigger picture: taken as a whole, there is something else, more involved, taking place.

It makes perfect sense, if you’re a proprietary vendor, to delve deeply into your FOSS competitors, and for several reasons. I believe there are serious reasons to be wary of such proprietary vendors’ forays into FOSS and, at the same time, to embrace those forays. Here is why.

First, any vendor has to know what it’s competing against. This is just standard good business practice; there is even an industry devoted to supporting this idea: competitive intelligence. What better way to understand the new models undoing your traditional strategy than to emulate them and find out how they work? The more you understand, the better you can build your products to compete and win. If the FOSS community innovates new technology, Microsoft wins by learning it and improving upon it for its own products, just as any good open source vendor would want to do (of course, an open source vendor would also participate by feeding those improvements back to the community).

Second, what about that oft-cited Machiavellian notion of keeping your friends close and your enemies closer? If Microsoft can successfully attract an open source development community into its fold (so to speak), it gains a very powerful tool: a foothold in the “enemy’s” camp, which allows it to anticipate developments and prepare its proprietary strategies accordingly.

Third, does it hurt the proprietary vendor in any way? They’ve got all their proprietary business and propaganda in full swing; everyone already knows about that. On the other hand, FOSS and Linux are gaining recognition. I’ll make an educated guess that FOSS and Linux are still not as well understood, in concept, by the majority of business decision-makers, much less the public in general. I think they still lack the massive public acceptance that most software vendors currently enjoy with their traditional proprietary business models. However, as that understanding and recognition grows in positive ways, it can only help companies like Microsoft and SAP to be able to show they’re just as involved in the leading edge of technology practices. It’s simply good PR. If Microsoft and SAP can manage this while maintaining their proprietary side, so much the better for them (from their perspective).

Fourth, let’s suppose there truly is an ulterior motive to subvert FOSS communities. In the shoes of a company like Microsoft, it makes sense to blur the lines of differentiation between your proprietary approach and real FOSS approaches (hence the shared-source initiative). The harder it is for critics, detractors, or enemies to clearly differentiate your approach from their own, the harder it will be for them to spotlight your weaknesses and their strengths, and thus the customer cannot act on clear information for his or her software selection decisions. Furthermore, if you actually do participate in some ways with the FOSS community, you may gain some supporters who will defend, in good conscience, your motives, and possibly even turn a blind eye toward some of your other, less savoury practices (this not only blurs more boundaries but again helps with grassroots PR, which is oh-so-important on the Internet).

Finally, I’d like to say that there is no clear side-versus-side here; we have to pay attention to the grey areas to really comprehend the situation. While I think we can see companies like Microsoft and SAP employing some intriguing strategies for subversion, and there are battles between models and methodologies, to a degree there is also some learning and adoption of new and better practices. Because of the co-opetitive nature of FOSS models, gradual adoption by proprietary vendors may even, unexpectedly, end up subverting those vendors’ own models. We are not too jaded when we remain constantly wary and suspect these companies of efforts to undermine FOSS, but we should, at the same time, cheer them on when they actually do participate in real FOSS processes.

Net Neutrality and Future Legacies

I’d like to comment quickly on the net neutrality issue. The Web has thus far been a system that, from the beginning, essentially anyone could access in the same manner. A few companies have a strong interest in changing that, though, by creating what I understand to be something like tiers of accessibility. Considering the life and social changes provoked by the new sorts of creative innovation the Web has fostered, I think changes limiting Net interoperation are incredibly bad ideas. A basic idea Tim Berners-Lee puts forward is

“Freedom of connection, with any application, to any party, is the fundamental social basis of the Internet, and, now, the society based on it.”

This may sound abstract to some, but Bob Frankston wrote an entertaining piece that illustrates the unsavoury results of losing such freedom. For a thorough and technical analysis, I find Daniel Weitzner’s text, The Neutral Internet: An Information Architecture for Open Societies, interesting.

The thing is, whatever starts taking place now, technologically or in government policy, is going to be around for a while. People will adapt to, install, and use software that is based on or otherwise enforces such technologies and policies. That means we have to imagine the consequences of a future saddled with the legacies we’re creating now. I hope we act to keep our liberty intact.

PeopleSoft Nuisance in North Dakota

A Computerworld article covers some of the problems (and ends with a few happier notes) surrounding a PeopleSoft (Oracle) ERP implementation taking place in North Dakota’s government and education sectors. Although the state agencies sound generally satisfied, the article focuses on the North Dakota University System’s unhappiness with the unexpected, massive cost and time overruns in getting its system implemented.

Why did they underestimate the costs, which ballooned from the extra time required for the (still) incomplete implementation? The article suggests the lesson to be learned is never to embark on a major project like this without employing a full-time project manager (which, surprisingly, it sounds like this implementation lacked from the start). But there is something else to learn from the article:

“The academic software modules, particularly a grants and contracts management application, also did not perform as expected and have required extensive customization, said Laura Glatt, vice chancellor of administrative affairs at the Bismarck-based university system.”

I wonder why they did not expect this. Perhaps their original RFI/RFP was not designed to request that information? Did they script some demonstration scenarios for the vendor to show them how the modules would accomplish the sort of functionality they needed? I’d think there could have been some way to prevent this issue; maybe the ghost of a full-time project manager would have thought of that during the selection and evaluation phases.

Blog News Feed Versus Newsletter Usage

The Wall Street Journal Online has a short and slightly thought-provoking interview with Jakob Nielsen concerning newsfeeds and blogging.

I think the news feed reader is taking the place of both some browsing activity and some e-mail activity. People ought to view blogging and news feeds not as the “extreme edge” mentioned in the interview but rather as a notable shift in the way people discover and retrieve information from web sites.

Lee Gomes (the interviewer) asked why Nielsen prefers an e-mail newsletter over a news feed. It brought up a few points on the focus of a newsletter, but Nielsen cautioned, “Unless a newsletter is very good, people will just say, ‘Oh no, more information.'” And I find that to be the case for me. There are a few newsletters I like reading, but the majority have too much garbage to wade through and simply clutter my e-mail inbox. I’m hesitant to subscribe to anyone’s newsletter now that I’m invariably offered the option during any web site registration. Over the years, site after site has reinforced the notion that once I subscribe, the subscription will balloon into unwanted mail and it will be difficult to remove myself from the lists. Even when that’s not the practice, there is that suspicion. Abuses have made that impression the general state of things.

Many years ago I attended a conference held by a local phone company, which was trying to convince its corporate clients to build corporate web sites (and hence would need fast Internet connections). The conference had a number of very informative sessions highlighting the benefits a web site could bring to a business. One of the points I recall was how much emphasis they put on having a clear and easy sign-up page for a company newsletter. They made the case that a well-designed newsletter would help a company stay in contact with its customers (which, of course, lends itself to all kinds of wonderful marketing activities).

Now I hear similar arguments for the business benefits of blogging. Except the nice thing about blogging, and hence blog news feeds, is that the subscribing user has complete control over whether s/he subscribes or not (unlike what happens when you release your e-mail address into the clutches of the unfamiliar internal machinations of a company you probably have little reason to trust).

One last thing. On the conversational aspect of blogs (which seems to be, at least in part, commentary on who actually is reading and using them), Nielsen comments that it works for fanatics “…who are engaged so much that they will go and check out these blogs all the time.” I’m inclined to agree for the moment, but it is a shortsighted viewpoint if that is where it ends. True, most people I know haven’t got a clue what a news feed reader is, much less a blog, though since I’ve been using these for a while, they’re familiar concepts and tools to me. However, all new technology tends to be that way. When I went to the conference I mentioned previously, an e-mail newsletter seemed like something only a small percentage of the population would ever use. That changed. Now that I regularly use a news feed reader to read articles, I use my web browser less frequently. That is a major shift in the way I access Web content.

Compiere Repots Itself for Growth

It seems that open source ERP provider Compiere is prepping itself for a lot of new growth. Today it announced (hot on the heels of bringing in Andre Boisvert as its Chairman of the Board and Chief Business Development Officer) that it will be moving its corporate headquarters to California’s Silicon Valley and, at the same time, that it has secured a nice little VC nest egg of $6 million (USD).

The company’s press release (linked above) has CEO Janke stating that “…the market’s demand for our product has outgrown our capability to scale the business accordingly.” Boisvert then mentions that Compiere is in a good position with its “…modern architecture at a time when many proprietary legacy ERP systems are approaching the end of their intended life cycle.” That sort of growth should keep Compiere’s model vibrant.

In April the company announced seven new implementation partners, which I think bodes well for its business model, a model largely based on second-level support and training (in other words, supporting its partners). The more partners implementing its software, the better. The company had a little over forty partners toward the end of 2004 and now claims more than seventy. Hopefully the additional funding truly will help it scale to meet those partners’ demands.

Open Source Database and OS Demand Stats

A few articles about open source database growth made the rounds recently. Mostly these discuss a rise in growth; for example, the EnterpriseDB survey notes

More than half of all survey respondents indicated that their respective companies had either already deployed an open source database or were more likely to deploy an open source database than any other open source application, including CRM, desktop productivity, and ERP. The survey was sponsored and administered by EnterpriseDB.

Gartner too published some stats on database growth, though of a slightly different nature.

“The combined category of open source database management systems vendors, which includes MySQL and Ingres, showed the strongest growth, although it was one of the smallest revenue bases,” said Colleen Graham, principal analyst at Gartner.

These are all interesting, so I thought I’d post a few of the stats TEC tracks about enterprise end-user demand. We find out what companies are looking for as requirements when implementing different enterprise systems (ERP, CRM, SCM, etc.). It might be valuable to compare these different sources and types of stats for an overall picture.

According to our tracking of about 3,000 different users, the following numbers signify the percentage of those users that selected each of these platforms as a technology requirement for their enterprise software selections (such as an ERP, CRM, or SCM system). Note that we ask about some other platforms too, but I’ve omitted those stats because they account for very small percentages.

DBMS (% of users)                                      Q1 2005   Q2 2005   Q1 2006
IBM DB2                                                    7.0       7.2       7.0
Microsoft SQL Server                                      35.9      37.6      36.4
MySQL                                                      8.9       9.6      12.7
Oracle                                                    20.4      21.0      20.6
PostgreSQL                                                 3.1       2.8       3.5
Hosted solution (not installed on a customer server)       0.5       0.6       3.4

Server (% of users)                                    Q1 2005   Q2 2005   Q1 2006
IBM iSeries (AS/400)                                       7.6       7.1       7.4
Linux (such as SUSE, Red Hat, or Debian)                  11.8      11.4      12.9
Unix (such as Solaris or AIX)                             13.3      12.7      11.5
Windows Server (such as NT/2000/XP)                       54.2      54.2      49.4
Hosted solution (not installed on a customer server)       0.7       1.7       5.4

It’s pretty clear that we have not seen great changes in demand for Oracle, Microsoft, and IBM systems, but MySQL demand certainly increased in 2006 over 2005, and PostgreSQL has been working its way up. I happen to know that so far in Q2 2006, the open source systems are set to surpass the previous quarters’ demand.

So while Gartner is calling attention to strong growth but small revenue bases, perhaps one could look at the direction demand is moving (based on the EnterpriseDB survey and TEC’s stats) and guess that the revenue base may be ready to change.

Verifying an RFI

Today, I had a conversation with a consulting firm that works with TEC’s decision support tools and knowledge bases (KBs) on enterprise software. In this case, the firm was engaged in an ERP selection project.

The consulting firm was asking me about the accuracy of the data (in our KB) regarding the functionality of some of the vendors they’d shortlisted. TEC researches and provides immense amounts of information on software products, so it is an incredibly tricky task for us to ensure that the data is accurate and timely. Considering the number of clients using our evaluation services for their projects, as well as the consultants using the same services for their clients, it amazes me when a software vendor either isn’t amenable to providing updated information about its products or, in a few cases, is less than truthful about its products’ capabilities. That’s what I want to talk about in this post. I had to answer this consultant honestly, without bias, and what I explained to him about the way one abnormally naughty vendor treated the RFI response process seemed to sour him slightly toward its product.

First, vendors usually respond in earnest to our RFI inquiries; it’s in their best interest. I wonder, though: if a few vendors respond dishonestly while knowing TEC exposes its analysis data to thousands of customers (who may very well become sales for the vendor), how well are these vendors responding to the inquiries they receive from individual clients that don’t have many resources for vetting information? I mean to say, if you’re working on a project to select some kind of enterprise software system, design your own custom RFI, and send it out to a bunch of vendors, how are you going to be sure that the responses are truly accurate? Even consultants won’t have expertise on every product out there.

It seems to me that until you get to a stage where you’ve already selected a few vendors to give scripted demonstrations, there isn’t much of a way to verify the accuracy of the responses; and how much time will have elapsed just to get to that point? I’m not suggesting that vendors are likely to act in bad faith; criteria are also commonly misunderstood. Even with a focussed team of subject matter experts, editors, and translators, we get inquiries from very knowledgeable and intelligent people who don’t understand the criteria we use for our data collection.

Here’s a way that fails. I once worked for a company that had a slick on-line decision support/analysis tool called Compariscope. Our analyst team would actually get copies of the software from the vendors and set up test environments. This ensured accuracy in the data, but it also meant the scope of the analyses was extremely limited, and because of the significant time required, we were always playing catch-up with the latest software releases. Perhaps it could have worked if we’d had hundreds of analysts, vast supplies of equipment, and vendors all willing to give us copies of their software (often they responded to requests for software as though we’d handed them a cleaver and asked them to cut off their left leg for lunch). That business model quickly evaporated. So installing and testing every type of enterprise software application is not a feasible methodology for an analyst firm, much less for the end-user company.

When I started working with TEC, we only covered discrete and process ERP systems, and at that point, we only provided data for about ten vendors. Our ERP analyst, PJ, could check the information and have a decent idea whether the vendor understood the RFI and made an earnest response. But a single person cannot verify every one of over 3,000 criteria, and as we grew and started providing information on more software vendors and more subject areas (SCM, CRM, etc.), it became quite difficult to make sure all of the data were accurate. Even with additional analysts, nobody in the world really knows what every product is capable of.

I’m curious to know if anyone reading this (consultants, people who have worked on their own selection projects, etc.) has come up with good methodologies for verifying the data gathered through your own RFI process before spending serious time in product demonstrations. Please respond with your thoughts. Here is what we came up with.

TEC demands RFI responses from an official at the vendor who is responsible for replying to client RFIs. Then we take a few steps to vet the vendor’s data…

1) Once we retrieve a completed RFI, we have a team of people give it a quick review, checking for obvious errors and such. If it passes that test, it moves on; if not, it goes back to the vendor for revision.

2) Our analysts review the information based on their own knowledge and experience, of course, but also check things like whether the RFI is internally consistent with itself (if you’re careful, there are some ways to structure an RFI to allow this), and run benchmarks using TEC decision analysis tools. Analysts also have to be constantly aware of what’s going on in the field so that they can check for consistency with known customer results, peer findings, news, conference announcements, and vendor sources such as collateral, other products, services, and initiatives.

3) I came up with a veridical comparison method that aggregates all our existing vendor responses to the criteria in a knowledge area (ERP, for example) and defines what the likely level of support would be for each criterion. This lets analysts flag criteria where a particular vendor deviates far from the expected range and understand what the next most likely levels of support are (a rough sketch of this idea follows the list). For example:

If we know that only two in thirty ERP vendors (at any tier) natively support a standard interface to CAD systems for direct data access, and we see a start-up vendor telling us this criterion is fully supported, our analysts know they’ve got to see the vendor demonstrate that. The reverse is true as well. Sometimes a vendor says it doesn’t support a criterion “out of the box,” but when we talk to the vendor, or it demonstrates how its system works, we realize the vendor simply misunderstood the criterion. That’s a great opportunity for us to learn how to clarify the criterion’s wording.

4) As I hinted above: the demonstration. All of these checks can go only so far. When an analyst actually sees the vendor demonstrate its capabilities, he or she can definitely verify the accuracy of an RFI.
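To make the veridical comparison idea in step 3 concrete, here is a minimal sketch in Python. The rating scale, vendor names, sample data, and threshold are all illustrative assumptions on my part, not a description of TEC’s actual implementation.

```python
# Hypothetical sketch of the "veridical comparison" from step 3: aggregate
# all vendors' responses per criterion, then flag any response that is rare
# relative to the group. The scale and data below are invented for illustration.
from collections import Counter

# Assumed rating scale: 0 = not supported, 1 = via third party,
# 2 = customization, 3 = supported out of the box.
responses = {
    "CAD interface": {"VendorA": 0, "VendorB": 0, "VendorC": 1,
                      "VendorD": 0, "StartUpX": 3},
    "Multi-currency": {"VendorA": 3, "VendorB": 3, "VendorC": 3,
                       "VendorD": 2, "StartUpX": 2},
}

def flag_outliers(responses, min_share=0.25):
    """Flag (criterion, vendor) pairs whose rating few other vendors share."""
    flags = []
    for criterion, by_vendor in responses.items():
        counts = Counter(by_vendor.values())
        total = len(by_vendor)
        for vendor, rating in by_vendor.items():
            if counts[rating] / total < min_share:
                # The most common answers stand in for the "expected range".
                expected = [r for r, _ in counts.most_common(2)]
                flags.append((criterion, vendor, rating, expected))
    return flags

for criterion, vendor, rating, expected in flag_outliers(responses):
    print(f"{vendor} claims level {rating} for '{criterion}'; most common "
          f"answers are {expected} - ask to see this demonstrated.")
```

In this toy data, the start-up claiming full CAD support gets flagged for demonstration, echoing the example in step 3.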

Finally, even with the checking, benchmarking, and reviewing, sometimes an error falls through the cracks among the thousands of criteria. Sometimes, admittedly, we are simply not quite fast enough. On occasion a consultant or VAR intimately familiar with a product alerts us to an error. Other times a customer already using a product tells us about an error in its ratings on our system.

Perhaps it would have been best if we’d discovered these errors first. But, in my opinion, this is one of the great strengths of having information like this accessed by so many people with different perspectives via the Internet. As FOSS communities and other collaborative projects like Wikipedia have demonstrated, a little effort from many people can go a long way toward a common goal. In our case, that goal is maintaining accurate information on enterprise software products. I’ll admit this cross-checking process is probably not very transparent to the public. Perhaps a more transparent and collaborative cross-checking process would further improve data accuracy.

E-mail Replacement Idea

In a previous post, I briefly commented on blogs as an e-mail replacement. It was an off-the-cuff remark, but I started thinking about it more. Perhaps it could end spam?

This afternoon one of my colleagues came by my desk and commented on the RSS reader I had open. She wondered if it was a nice-looking skin for Outlook (it wasn’t, but it did look nice because the reader was running on my Linux box rather than Windows). The comment struck me because, at first glance, the RSS reader does look like essentially the same thing as an e-mail application. To repeat: what if we all began blogging instead of e-mailing?

It takes no effort to imagine everyone having a blog and thus an RSS feed. Rather than sending response e-mails back and forth in conversation, trackbacks could accomplish the job. You’d only add a feed to your reader if it belonged to someone you wanted to communicate with (trackback with). Instead of e-mail addresses, we’d have feed subscriptions.

The reverse could be true as well: you could flag categories or posts within your feed as private or visible only to particular acquaintances (friends, colleagues, etc.), perhaps using XFN features or some kind of social networking system. Conversations and subjects could be tracked by tagging them instead of through the fungal-multiplication horror of e-mail folders.

Web sites already exist that let people aggregate feeds into custom pages, so that would take care of replacing web-based e-mail. A number of desktop applications act as interfaces for posting to a blog without logging in to the blog’s web interface; perhaps those tools could be combined with the readers. Finally, instead of sending an e-mail, everyone would just post to their blogs (a toy sketch of the idea follows).
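For fun, here is a minimal, purely hypothetical sketch of the model in Python (3.10+). The class names, visibility flags, and pull-based read function are my own assumptions about how such a system might hang together, not any existing blog or feed API.

```python
# Toy model of blogs-as-e-mail: posts live in the author's feed, readers pull
# only what they are allowed to see, and replies are trackback-style links.
# Everything here is invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Entry:
    author: str
    body: str
    tags: set = field(default_factory=set)    # tags replace e-mail folders
    visible_to: set | None = None             # None means a public post
    in_reply_to: "Entry | None" = None        # trackback-style reply link

@dataclass
class Feed:
    owner: str
    entries: list = field(default_factory=list)

    def post(self, body, tags=(), visible_to=None, in_reply_to=None):
        entry = Entry(self.owner, body, set(tags), visible_to, in_reply_to)
        self.entries.append(entry)
        return entry

def read(feed, reader):
    """Pull model: a reader sees only public or explicitly shared entries.
    Spam never arrives because nothing is pushed to you."""
    return [e for e in feed.entries
            if e.visible_to is None or reader in e.visible_to]

alice, bob = Feed("alice"), Feed("bob")
note = alice.post("Lunch Friday?", tags={"plans"}, visible_to={"bob"})
bob.post("Works for me.", tags={"plans"}, in_reply_to=note)
print([e.body for e in read(alice, "bob")])   # ['Lunch Friday?']
```

Getting from a toy like this to a real replacement would, of course, require the private-visibility and trackback pieces to be standardized across feed formats, which is exactly the hard part.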

What is appealing about this idea? Unless I’m missing something, it seems like a rather easy way to eradicate spam and, for some unfortunate OS users, decrease other contagions (unless of course you choose to subscribe to a spammer’s feed). The trick is, I think, getting enough people to blog in order to change the dominant communication method from e-mailing to blogging.

On the Subject of Learning, Tools for LMS Purchasers

Niall at NetDimensions Insights wrote two nice pieces pointing out a few ways that people seeking a learning management system (LMS) can use low-cost tools to compare the different offerings out there. He mentioned both the Brandon Hall feature comparison document and the LMS RFI templates that Technology Evaluation Centers offers; the latter is what caught my attention. Niall comments that

Use of this template does not mean that you do not need to perform a thorough analysis of your organization’s learning management requirements. Each organization will have a unique set of requirements and you should ensure that any additional requirements identified are added to the template.

That’s exactly it. Templates like those can be helpful for researching functional and support requirements, and ultimately for soliciting proposals, but any spreadsheet-esque comparison grid will only do so much when it comes to analysis.

In any case, I wanted to respond by pointing out another inexpensive tool a potential LMS purchaser could use: TEC’s online LMS evaluation tool. It has the same hierarchy of criteria as the spreadsheet, except it lets you prioritize those criteria based on your organization’s learning requirements (and can include additional custom criteria). Then it analyzes your priorities to figure out which vendors match your requirements (of course, it uses rated vendor responses to all those criteria to do this). I think this supports, at least in part, Niall’s recommendation for a thorough analysis.
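To illustrate what I mean by prioritized analysis, here is a generic sketch of weighted-criteria scoring in Python; the criteria, weights, and the 0–3 rating scale are assumptions for illustration, not TEC’s actual engine.

```python
# Generic weighted-criteria scoring sketch; all names and numbers below are
# invented. Each vendor's ratings are combined using the buyer's priorities.

# Buyer priorities per criterion (higher = more important to the organization).
weights = {"SCORM support": 1.0, "Virtual classroom": 0.6, "Reporting": 0.8}

# Rated vendor responses: 0 = unsupported ... 3 = supported out of the box.
vendors = {
    "LMS-A": {"SCORM support": 3, "Virtual classroom": 1, "Reporting": 2},
    "LMS-B": {"SCORM support": 2, "Virtual classroom": 3, "Reporting": 3},
}

def fit(ratings, weights):
    """Weighted average of ratings, normalized to a 0-100 'fit' percentage."""
    raw = sum(weights[c] * r for c, r in ratings.items())
    return 100 * raw / (3 * sum(weights.values()))  # 3 = maximum rating

for name in sorted(vendors, key=lambda v: -fit(vendors[v], weights)):
    print(f"{name}: {fit(vendors[name], weights):.1f}% fit")
```

Change the weights to match your organization’s learning requirements and the ranking can flip, which is the whole point of prioritizing criteria rather than simply counting supported features.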

The Start

I needed a place to post some thoughts on things taking place in the IT world, and I didn’t want my other sites to be mixed up with those issues, hence my newest blog. Blogs appear to be a communication necessity now (I sort of wonder if they may someday be able to replace e-mail; we could use them with private sections and private trackbacks). I’ve experimented with blogs in a number of ways over the past several years, but it wasn’t until I recently read Robert Scoble and Shel Israel’s book, Naked Conversations, that I got inspired to begin using an RSS newsreader (actually I’m using both akregator and liferea) to keep track of a lot of blogs. After religiously reading these blogs for the last few months, I can’t not be part of the conversation anymore.

The company I work for, TEC (which has blogging on its horizon too), is in the process of developing a new decision support knowledge base addressing health care information management systems (HCIMS). I’ve been working on its structure today and noticed the analyst used an interesting acronym, ADL, which stands for “activity of daily living.” I can’t help but wonder why the world needs such an acronym; can’t I just, err, live? It reminds me of an article I saw linked in a blog yesterday (unfortunately I’ve forgotten which blog), discussing a new wave of corporate buzzwords.

In any case, I’m sure ADL is an important technical term in the HCIMS industry (in a short bit I’ll probably know why). For the time being, I think I’ll just note that blogging has become a part of my ADL.