OpenFiction [Blog]

Intentional traffic

Posted in Uncategorized by scarsonmsm on March 20, 2007

Just finished reading John Battelle’s The Search: How Google and Its Rivals Rewrote the Rules of Business and Transformed Our Culture. Thought-provoking in a number of ways, and I’m sure I’ll milk it for a few blog posts. It really is one of the better books I’ve read in a while, and a funny aside–the title doesn’t lend itself to searching very well. The Amazon book search I did on the title “Search” didn’t put it on the front page of returns; I had to go back and type in the author’s name.

Anyway, reading it right after reading the report, I found several passages that spoke to the issues raised regarding web surveys and what they represent. The first passage recounts the aha moment that Bill Gross had for his paid search engine GoTo.com, which presaged the Google AdWords system:

Put simply, it’s not the quantity of traffic, Gross realized, it’s the quality. Any business would be willing to pay a lot more than seven to ten cents a click for the right traffic. That realization became Gross’s eureka moment—a moment that, more than any other, spawned today’s Internet advertising economy. For every single online business (even, it turns out, portals) undifferentiated traffic is worth very little, but specific traffic, traffic with an intent to act in relation to a business’s goods or services [Battelle’s emphasis], is worth quite a lot. Gross realized that businesses will pay quite a bit to acquire the right kind of traffic. All he had to do was build an engine that created intentional traffic.

This notion of intentional traffic can be as important to survey results as it is to advertising. By and large, the people who answer a site survey are those who come with an intent to act in relation to the site’s content. Harley is absolutely right that you can’t take survey results and apply them to traffic overall, because a large part of almost any web site’s traffic is going to be non-intentional traffic driven largely (I’d guess) by inappropriate search results or simple curiosity generated through media coverage or blogosphere links. In the last MIT OCW evaluation report [9.0 MB!], I noted that 51% of the traffic to MIT OCW was what I call “one and dones”–single-page visits (page 20). Now, due to a quirk in how our site is instrumented, not all of these can be dismissed as unintentional traffic (we can’t afford to instrument all the PDFs on the site, so if a visitor comes to a Lecture Notes page and downloads four or five lecture notes, it still only registers as a single-page visit). I’d guess, though, from what I’ve seen of other site metrics, that this is fairly representative of traffic on other sites as well.
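
As a rough illustration of the adjustment I’m describing (and not a description of how our site is actually instrumented), here’s a minimal Python sketch that classifies hypothetical session logs: single-page visits with no downloads are treated as likely unintentional, while single-page visits that include file downloads get credited as intentional, since page tagging alone would miss them. The log format and field names are assumptions for the example.

```python
# Illustrative sketch only -- the log format and field names are hypothetical,
# not how MIT OCW's analytics are actually instrumented.
from collections import defaultdict

def classify_sessions(log_entries):
    """Group hit-level entries by session and flag likely intentional visits.

    Each entry is a dict like:
        {"session_id": "abc", "type": "page" or "download", "url": "..."}
    """
    sessions = defaultdict(lambda: {"pages": 0, "downloads": 0})
    for hit in log_entries:
        key = "downloads" if hit["type"] == "download" else "pages"
        sessions[hit["session_id"]][key] += 1

    counts = {"one_and_done": 0, "likely_intentional": 0}
    for s in sessions.values():
        # A single page view with no downloads looks like unintentional traffic;
        # a single page view plus PDF downloads is probably an intentional visit
        # that page tagging alone would undercount.
        if s["pages"] <= 1 and s["downloads"] == 0:
            counts["one_and_done"] += 1
        else:
            counts["likely_intentional"] += 1
    return counts

if __name__ == "__main__":
    sample = [
        {"session_id": "a", "type": "page", "url": "/courses/18-01/"},
        {"session_id": "b", "type": "page", "url": "/courses/6-001/lecture-notes/"},
        {"session_id": "b", "type": "download", "url": "/courses/6-001/lec1.pdf"},
    ]
    print(classify_sessions(sample))  # {'one_and_done': 1, 'likely_intentional': 1}
```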

There are other cuts at web metric data that can be helpful in defining intentional traffic as well. Returning visitors are another indicator. On page 14 of the above report, I note that about 40% of our survey respondents indicate they are new visitors, whereas our analytics would put this figure closer to 70%. This shows just how much bias there is in the responses, but it’s not necessarily all bad news. Yes, most first-time visitors won’t complete a survey, but on the other hand, most of their answers probably wouldn’t be particularly useful. The portion of first-time visitors who do take the time to complete the survey is an interesting population, because they represent, on some level, traffic with enough intention in relation to the content to invest in answering questions.
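
To make the size of that bias concrete, here’s a back-of-the-envelope sketch using the 40%/70% figures above. The post-stratification weighting shown is just one common way to adjust survey results toward the overall traffic mix, not a method we’ve formally applied.

```python
# Post-stratification weights: reweight survey respondents so the new/returning
# visitor mix matches what the analytics report for all traffic.
# Percentages come from the evaluation report; the weighting itself is illustrative.
survey_share = {"new": 0.40, "returning": 0.60}    # among survey respondents
traffic_share = {"new": 0.70, "returning": 0.30}   # among all visits (analytics)

weights = {group: round(traffic_share[group] / survey_share[group], 2)
           for group in survey_share}

print(weights)  # {'new': 1.75, 'returning': 0.5}
# Each new-visitor response counts 1.75x and each returning-visitor response
# counts 0.5x when estimating figures meant to describe all site traffic.
```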

One of my favorite new metrics coming off of our site is the number of zip downloads. A few months back, we set up a system that generates and makes available, for each course, zip packages of the online course content that can be downloaded to users’ local computers. We’ve been serving up somewhere between 200,000 and 300,000 of these a month this year, and they are as close as we’ll ever come to a “purchase” on our site. It’s the activity on the site that most directly indicates visitor intent. We also have Amazon links in portions of the site, and eventually, when these are rolled out to the entire site, I expect the number of related texts purchased to be an interesting measure of visitor intent.

All of which is a long way of ending up where I ended up in the last related post. No, survey results don’t represent overall site traffic, but the trick is to figure out the site metrics that can help identify what populations they do represent. They can, I think, provide a clear picture of the types of benefits a site produces and for whom, and, when carefully coupled with web metric data, produce some reasonable estimates of the volume of those benefits. To a certain extent, they can also indicate why visitors who might become repeat intentional traffic do not. I’d love to hear from anyone out there who’s tried to address these issues, and to learn more about the approaches they’ve taken.
