Archive for August, 2008

The agony and the ecstasy of the first customer

Wednesday, August 27th, 2008

Ah, the first customer. No other single event is as important and nerve-wracking to an enterprise software company as getting your first customer up and running. Fail, and the ensuing reputation damage could kill your business. Succeed, and the reference will set you up for a much easier sales effort to the next one. Not to mention the effect on employee morale, investors both current and potential, and (heaven forbid) your revenue line.

You know there are a few things not quite finished in your product, but you think most of it is there, and you can manage around the rest. You know that you’ll learn some things from how the customer uses it, but exactly what is still unclear. You think you know how to deploy it, but some of the integration challenges are still opaque. As somebody said, you’re 90% of the way there, and the last 10% takes the other 90% of the time.

Choosing the right first customer is essential. Someone with a champion willing to work with you through the inevitable obstacles that will crop up. Someone with a problem hard enough that your solution delivers significant value, but easy enough that your alpha-quality solution doesn’t choke. Someone name-brand enough to provide a strong reference.

In my experience, setting expectations is the hardest part. You don’t actually have any idea how long it’s going to take to deploy, but the customer wants an estimate. You don’t know exactly what the main value propositions are, because you don’t know exactly how they’ll use it, but you had to say something to sell it to them. And you want to leave yourself enough wiggle room that when you finally do deliver something of value, you can declare success and say that was your plan all along. This is a balance beam that’s hard to walk.

Final piece of advice: listen to your customer. Don’t just listen politely, but really dig in and listen hard. Spend a lot of time with them, both in the office and out. Learn why they think you can help them, and understand their own motivations. Watch them in their daily work, both with your product and without, to understand what they do and how they see the world. These things will help you work through the challenges, deliver a product that they love, and give you the foundation of a successful business.

Scratching an itch and enhancing CellarTracker

Monday, August 25th, 2008

Once in a while I feel the need to write some actual code. I’m really a programmer at heart, and I find it incredibly satisfying to think through a problem and create a useful solution. For me it’s more interesting than crossword puzzles, and the end product is (sometimes) more valuable.

For a variety of obvious reasons, it’s not a good idea for me to get my fingers into Yieldex production code, so I end up scratching this particular itch with small side projects. This one took me a couple of hours, and hopefully it will be useful to enough people to justify the effort. In any case, I learned a lot, so it was worth it for me.

I’ve written about CellarTracker before. I’ve been a user for years; I love the site, and I love the business. But, I find some of the UI to be less than perfect. Here’s an example: when I get a wine offering in the mail, there are usually a number of different vineyard designations, and I get an allocation of a few bottles from each. I haven’t found a way in CellarTracker to enter 2 bottles each from 10 different designations without doing an incredible amount of clicking around and waiting for pages to refresh. So, I set out to solve this problem with a “bulk purchase” mechanism.

There are a number of different ways to tackle this problem, so the first thing was to decide on an approach. I’m still a bit 1999 when it comes to coding for the web, so my first idea was to set up a server-side program. Then I started thinking about the complexity of screen-scraping and parsing, and about robustness in the face of potential CellarTracker updates. Then there are the security issues of passing usernames and passwords around so my server could log in to CellarTracker. Finally, I realized I didn’t really want to be responsible for keeping a service up and running, so I almost bailed on the entire idea.

Then I remembered GreaseMonkey for Firefox. Cool – an opportunity to enter the 2000s in web programming, and polish up some JavaScript skills. And it got around all of the above problems in a neat client-side way. The only real issue is that it works only with Firefox, and for most people would require installing GreaseMonkey and then the script itself.

I started by installing GreaseMonkey and a couple of web development tools, notably the DOM Inspector, the JavaScript Shell bookmarklet, and, later, the Web Developer Toolbar. I read quickly through the Dive Into GreaseMonkey book and then just started coding. I was pretty excited that in only a few minutes I could build a script to automatically change a CellarTracker page upon load.
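If you’ve never seen one, here’s roughly the shape of a GreaseMonkey script that changes a page on load. This is only a sketch, not the actual script: the form ID and field names below are made up for illustration, where the real script has to match whatever the DOM Inspector shows on CellarTracker’s actual purchase page.

```javascript
// ==UserScript==
// @name         CellarTracker Bulk Purchase (sketch)
// @namespace    http://www.oxyfish.com/
// @description  Rework the CellarTracker purchase form for bulk entry
// @include      http://www.cellartracker.com/*
// ==/UserScript==

// GreaseMonkey runs this after the page loads, so the DOM is ready.
(function () {
    // Hypothetical ID -- the real script keys off CellarTracker's
    // actual markup, found with the DOM Inspector.
    var form = document.getElementById('PurchaseForm');
    if (!form) return; // some other CellarTracker page; do nothing

    // Add a row of inputs for each designation: quantity plus wine name.
    for (var i = 0; i < 10; i++) {
        var row = document.createElement('div');

        var qty = document.createElement('input');
        qty.type = 'text';
        qty.size = 3;
        qty.name = 'bulkQty' + i;

        var wine = document.createElement('input');
        wine.type = 'text';
        wine.size = 40;
        wine.name = 'bulkWine' + i;

        row.appendChild(qty);
        row.appendChild(wine);
        form.appendChild(row);
    }
})();
```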

After a couple of hours of experimentation, some heavy JavaScript Shell use, and a bit of DOM inspection, I had something up and running. I created a test account on CellarTracker and entered a bunch of purchases. Success! I finished up by spending a few minutes on data validation, trying to make it obvious when something goes wrong and how to fix it.
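The validation amounted to something like this sketch (the function, field names, and styling are illustrative, not lifted from the actual script):

```javascript
// Check one bulk-entry row before submitting, and make any failure
// obvious right next to the offending field. Names are hypothetical.
function validateRow(qtyInput, errorSpan) {
    var qty = parseInt(qtyInput.value, 10);
    if (isNaN(qty) || qty < 1) {
        errorSpan.textContent = 'Quantity must be a positive whole number';
        qtyInput.style.border = '2px solid red'; // flag the bad field
        return false;
    }
    errorSpan.textContent = '';
    qtyInput.style.border = '';
    return true;
}
```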

The best part about this project for me was learning one level deeper about how JavaScript and the DOM work. JavaScript is a much more powerful language than I remembered from 1999, and I now have a much better understanding of how Ajax and many modern web sites work. And it was fun.

The final step was posting it in the new/old OSS directory here on Oxyfish, and writing this entry. I also posted a note in the CellarTracker forum, in case anybody else wants to use it. I’m very happy with how it turned out, and am looking forward to the next wine offering in the mail, so I can start saving time.

Install the CellarTracker Bulk Purchase extension (requires Firefox and GreaseMonkey)

Good Analytics Starts with Good Data

Friday, August 22nd, 2008

Analytics have long been an important part of content management, with companies like Omniture and Quantcast building nice businesses helping media sites understand how people find and use their sites. In contrast, the analytics focused on the revenue side of the business, particularly display advertising, are surprisingly primitive.

One big reason for this is how much data is thrown away. Take DoubleClick DART for Publishers (DFP), for example. If you are a DFP user, you have access to some very basic information, like the delivery count for each ad yesterday. But if you want to know which zones of your site a run-of-site ad was delivered on, you’re out of luck; they don’t keep that around. If you want that kind of data, you need to turn on “data transfer” and analyze the logs yourself, which, despite their protestations of “it’s your data,” is an extra-cost service (more on that below).

And here’s a little-known fact about DoubleClick’s data transfer: it still doesn’t give you all the data. The DFP ad server strips out any publisher-specific tag attributes that aren’t used to target an ad. So, if you have your site divided into categories, with a key-value in the tag specifying the category, even with the detailed logs you still have no idea which categories a run-of-site ad ran on. They do have a workaround: duplicate your entire tag into a new parameter, “u=”, with a new set of delimiters, and then re-parse it on the other side. In summary: re-tag your entire site to get back “your data” that you are paying them to give you. The other ad servers are not much better.
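To make the “u=” workaround concrete, here’s a hypothetical before-and-after tag. The site name, key-values, and delimiters are all invented for illustration; the real syntax depends on how your DART tags are set up.

```
// Before: cat and pos are key-values that DFP drops from the logs
// unless they happen to be used for targeting
http://ad.doubleclick.net/adj/mysite/news;cat=sports;pos=top;sz=728x90;ord=1234567?

// After: the same key-values duplicated into u= with different
// delimiters (* and ! here, purely illustrative), so they survive
// into the data transfer logs and can be re-parsed downstream
http://ad.doubleclick.net/adj/mysite/news;cat=sports;pos=top;u=cat*sports!pos*top;sz=728x90;ord=1234567?
```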

If you ask your ad serving provider for the raw logs, they will claim that this is a lot of data, so they need to charge you for storage, bandwidth, etc. Let’s take a look at cost for a minute. If you are a pretty large site, you might generate 10GB of logs a day. For a reasonable comp, let’s look at what it would cost to use Amazon S3 for this service. 10GB of data transfer each day is $1/day to get it in, and $1.70/day to get it back out. Storing each day’s 10GB for 30 days adds about $1.50. So: about $150/mo, at the top end, to move and keep 30 days of data, with a healthy margin for Amazon. Remember that you pay Amazon extra for the flexibility of scaling up and down easily; a dedicated hosting center would almost certainly be cheaper. So if your ad serving provider is charging you much more than that (one DoubleClick customer I know was quoted $3,500/mo for about 100MB of data per day; they negotiated it down but are still paying way too much), you should push back. Memo to ad server companies: storage is a lot cheaper than it was 10 years ago.
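For anyone who wants to check my math, here it is as a quick script. The rates are Amazon’s S3 list prices as I understand them in mid-2008 (roughly $0.10/GB in, $0.17/GB out, $0.15/GB-month of storage); check the current price list rather than trusting these numbers.

```javascript
// Back-of-the-envelope S3 cost for 10GB/day of ad logs, kept for a
// rolling 30 days. Rates are approximate mid-2008 S3 list prices.
var gbPerDay     = 10;
var inPerGB      = 0.10;  // $/GB transferred in
var outPerGB     = 0.17;  // $/GB transferred back out
var storePerGBMo = 0.15;  // $/GB-month of storage

var inCost    = gbPerDay * 30 * inPerGB;       // $30/mo to upload
var outCost   = gbPerDay * 30 * outPerGB;      // $51/mo if you pull it all back out
var storeCost = gbPerDay * 30 * storePerGBMo;  // $45/mo for a full 30-day window

// Shows "total: $126/mo" -- call it $150 at the top end
alert('total: $' + Math.round(inCost + outCost + storeCost) + '/mo');
```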

Ad sales and operations groups are starting to realize that they need detailed analytics to do their jobs. The first step of that is good underlying data, and getting that is way too hard. This needs to change.

Two steps forward, one step back

Thursday, August 21st, 2008

We are riding the typical start-up roller coaster. Some days are absolutely great, others not so much. Here’s a good example.

I’m in a contract negotiation process with a major media company. Awesome – they want to use our product! And they’re willing to pay for it! Wow, okay, let’s do it. I send them a proposed contract. They get their lawyers involved. They decide they want to use their contract template instead. Hmm – then why did I spend the money to have my lawyer do the first draft? Their template is totally one-sided, but we work with it, mark it up and send it back. They send it back again. We finally put together a conference call with their business folks, their lawyer, my lawyer, and me. The call takes a while, but goes well: we hammer out all the substantive issues, and get close to signature.

Here comes the kicker. The next day, their lawyer goes on paternity leave. Hey, I know how that is; some things you can’t plan. The bummer is, the new lawyer they put on it looks at the contract, says “no, no, no” and shoves in a dozen more major redlines, completely changing the deal. Argh! Another round of back-and-forth markups, another major conference call, a late-night session with my lawyer to get it turned around, and finally we’re done again. It cost an extra week and a few more thousand dollars, but we got it done. Now, the real work begins.