The Opposite Test

Friday February 17, 2006
Whether or not I've used Macs, I've always been a big Guy Kawasaki fanboy. Getting someone with such integrity and seemingly boundless enthusiasm to evangelize for them was one of the best things Apple ever did.

Since I am going to be "evangelizing" quite a bit while in Austin, I am thrilled to see that Mr. Kawasaki has recently started blogging and sharing his wisdom about evangelism - and that he's saying things I already agree with.

He's a big fan of top-ten lists. One thing stuck out for me in particular, because I just said something similar about open source project descriptions:
Apply the opposite test. How many times have you read a product description like this? “Our software is scalable, secure, easy-to-use, and fast?” Companies use these adjectives as if no other company claims its product is scalable, secure, easy-to-use, and fast. See if your competition uses the antonyms of the adjectives that you use to describe your product. If it doesn't, your description is useless. For example, I've never seen a company say that its product was limited, full of leaks, hard-to-use, and slow.
I wouldn't mind so much if people wrote such descriptions and then moved to substantiate them. Sometimes it's really important to have software that is scalable, easy-to-use, and fast. Sometimes you really do want a fast, clean, dark theme.

Presumably, in such a situation, your users know how fast, how clean, or how dark they want the theme to be - how many users it has to scale to, or what their training costs are going to be. Talk about that. Measure it, and write your advertising like a thesis you are going to have to defend. If you're writing such literature, even if sales isn't your job, you are in the role of a salesman, and your readers know it even if you don't. That means they are going to assume that every single word you say is a lie. Provide examples, show screenshots, and compare to other things that they might be familiar with.

Try to avoid graphs without meaningful numbers and units. Apple's Intel Core Duo site, for example, includes an "application performance" graph with bars that say "4.1x faster" but doesn't explain the benchmarking very well, and while there are 4 tests there is only one "baseline" bar. (Also, they say that they used a beta of the Cinebench software, which means the results aren't even comparable to anything that customers looking at the site could run themselves on their own hardware, even if the benchmarks were available, which they aren't. But I digress.)

I'm picking on Apple because I'm considering buying a MacBook this year, but the open source world has even more to learn about this than corporate marketroids do. Every open source database project claims to be efficient, but how efficient? At least Oracle provides benchmarks during sales pitches. Say I'm going to build a system where database efficiency really matters - what is the most efficient open source database? Even the Open Source Database Benchmark site doesn't list results: their Project Status page (which is admittedly ancient, but they are still the first Google hit for "open source database benchmarks") is only detailed enough to show that certain databases work with the benchmark, if you want to run it yourself.

When you're describing your open source project, think about your users, not about yourself. I think the temptation to call software "efficient" or "scalable" comes from the fact that programmers have to spend time doing optimizations and thinking about scalability. But even if that takes the bulk of your time, it's a base-level requirement, not something that will make your project better than its competition.

Sure, if you walk up to a database user and ask, "What would influence your choice of a database on future projects?" they might say something about efficiency. But think about what a database user actually does with it - how they experience the utility of the database, both during development and on a running service. They are not going to be benchmarking and tuning constantly; they are going to be debugging problems with the database when they inevitably get something wrong. People don't like to think about themselves making mistakes, or about the database failing, but I think you will find people responding more positively to a database that provides gobs of useful information about what's going on than to one that is 8% faster than its nearest competitor.

While it's not perfect, I am a big fan of SQLite, and I think that (in addition to being an excellent piece of technology) the "marketing message" on its website is very good. It begins by describing the database as "small, zero-configuration, self-contained", which, to most users, is more interesting than its performance characteristics. I happened to be concerned about both issues, and the site provides a long, detailed page on database performance which, despite being somewhat out of date, clearly indicates that SQLite is not slow.
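That "zero-configuration, self-contained" claim is exactly the kind that passes the opposite test, and it's easy to substantiate. As a minimal sketch - using Python's standard-library sqlite3 module, with a made-up table purely for illustration - the entire database lives in one file (or in memory), with no server process or configuration step at all:

```python
import sqlite3

# "Zero-configuration, self-contained": connecting IS the setup.
# An in-memory database here; a filename would persist to a single file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE projects (name TEXT, stars INTEGER)")
conn.execute("INSERT INTO projects VALUES (?, ?)", ("sqlite", 100))

rows = conn.execute("SELECT name, stars FROM projects").fetchall()
print(rows)  # [('sqlite', 100)]
conn.close()
```

Compare that to the install-a-server, create-a-user, edit-a-config-file ritual of most other databases, and the adjective actually tells a prospective user something.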

Since I have been thinking about these sorts of issues, I am starting to formulate a plan for Twisted's marketing and future directions, too, but that's enough blogging for one night. Watch this space...