Category Archives: ad serving

Digital Channel Attribution…A Cheat Sheet

Last click, last touch, first touch, fractional…Adometry, TagMan, Visual IQ, Convertro, Omniture, C3 and ClearSaleing…there is no shortage of techniques and vendors claiming to have the right solution for digital marketers today.

Forrester recently released The Forrester Wave™: Cross-Channel Attribution Providers, Q2 2012 and got most of it right. However, as a recent and current practitioner of digital attribution analysis, I found it curious that they weighted market share over real-time data collection – strange!

In any case, if you are still struggling to understand the various attribution methods used in digital marketing channel analysis and want to avoid the painful learning curve, take a look at this simple, vendor-neutral, hype-free cheat sheet.

What do you think? Send your feedback!
Download Digital Attribution Cheat Sheet r3.1
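To make the cheat sheet's vocabulary concrete, here is a minimal sketch of how the common credit-assignment models differ. The touchpoint names and the even-split rule for "fractional" are illustrative assumptions on my part, not any vendor's actual methodology.

```python
# Hypothetical sketch of common digital attribution models.
# Touchpoint names and credit rules are illustrative, not vendor-specific.

def attribute(touchpoints, model="last_click"):
    """Split one conversion's credit across an ordered list of touchpoints."""
    n = len(touchpoints)
    if n == 0:
        return {}
    if model == "last_click":
        credit = [0.0] * n
        credit[-1] = 1.0          # all credit to the final touch
    elif model == "first_touch":
        credit = [0.0] * n
        credit[0] = 1.0           # all credit to the first touch
    elif model == "fractional":
        credit = [1.0 / n] * n    # even (linear) split across every touch
    else:
        raise ValueError(f"unknown model: {model}")
    return dict(zip(touchpoints, credit))

# A made-up conversion path: display ad seen, then search, then a paid click.
path = ["display_impression", "organic_search", "paid_search_click"]
print(attribute(path, "last_click"))
print(attribute(path, "fractional"))
```

The same path produces very different channel reports depending on the model chosen, which is exactly why the vendor landscape is so noisy.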

Early take on Post-impression Viewthrough: Lilypad White Paper

An oldie but goodie:

A white paper about Streams' Lilypad analytics tool that I wrote back in 1996. Oddly, the industry has become search-obsessed and has barely advanced with respect to post-impression and viewthrough tracking since then. DoubleClick for Agencies (DFA), Atlas and MediaPlex all measure viewthroughs; ComScore routinely provides research about this passive, asynchronous behavior.

RSS Advertising Part II – The Wild West meets the Measurement Crater

Measurement…The Wild West

As prefaced in Part I, identifying and reaching an online audience can be a challenge – especially if you are after the evasive Techfluentials.
The reality is that right now, measurement is The Wild West of the online advertising industry. As marketers (gold-seekers) demand more and more accountability for their spend, software and media vendors continue the cycle of launch, failure and/or consolidation in a mad scramble to sell pick-axes to those after online gold.

Choosing to measure in-feed RSS advertising with oversold site metrics tools just makes things even wilder. The reality is that there are serious technical limitations to consider when planning how to measure success in advertising to this target audience. Without proper planning, marketers are left with a measurement gap of epic proportions.

First, a quick primer to frame this classic situation. For a trained marketer, there are basically two different objectives of advertising, and your ad creative can typically do one of the following well:

  1. Branding. In other words, making an impression and/or changing perceptions. Often very important but very difficult to accurately measure. For this reason, measuring branding is more complex and almost always requires either a custom-study or syndicated (shared, usually generic) sample-based research. Right now, ad effectiveness services like Dynamic Logic, InsightExpress, Nielsen and ComScore are not yet working with RSS feed networks. This means that qualitative audience research requires a custom study that is considerate of the RSS user’s environment – this may incur additional costs.
  2. Response. In the other corner is the quantitative measurement of tangible results such as sales, leads, impressions, clicks, time spent and the like. While it is much easier to technically measure the entire universe of such activities, in-feed RSS advertising success is a double-edged sword. Clicks (response) may be huge and the clickthrough rates great (relative to banners); with so much industry hand-wringing over declining CTR, clearly having a bounty of clicks is a good problem to have these days!
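The relative-CTR point above is simple arithmetic, sketched below with fabricated numbers chosen only to show why in-feed rates can look dramatic next to banners.

```python
# CTR = clicks / impressions. All figures below are made up purely to
# illustrate the relative-CTR comparison, not real campaign data.

def ctr(clicks, impressions):
    return clicks / impressions

banner_ctr = ctr(200, 1_000_000)  # a large banner buy with few clicks
rss_ctr = ctr(500, 50_000)        # a small in-feed RSS buy with many clicks

print(f"banner: {banner_ctr:.2%}, in-feed RSS: {rss_ctr:.2%}")
# → banner: 0.02%, in-feed RSS: 1.00%
```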

The Measurement Crater

For the purposes of this series, let’s say that you just want to measure response as the primary success metric for an RSS ad campaign. Unfortunately, if you or your client are depending on a JavaScript-tag based landing page tool to measure consumer response, you will likely experience something akin to this:


What happened? Wondering where the clicks went? How many visits came from suchnsuch.com’s RSS feed? Did they buy? Did they come back? Curious as to why you can’t determine what they did after they landed? So am I.

Newsflash: JavaScript tag-based Site Metrics have Limitations

Online marketers primarily interested in measuring response from an RSS campaign just found one. While many enterprise site metrics vendors brag about their simple, “just add our tag to your footer” implementation (Omniture, Google Analytics, Coremetrics)…if only it were that easy to get usable information.

The harsh technical reality is that JS tag-based systems require the browser to execute their special tag when the landing page is rendered. That is very different from server-side site metrics tools, which by definition track every access. The main problem with relying exclusively on these tag-based approaches is that they cannot count accesses that originate from JS-disabled browsers or JS-incompatible applications altogether. In other words, these popular site metrics tools are essentially blind to browsers and any traffic (including robots and spiders) that do not execute JS; I’m not going to get into the cookie deletion argument either.
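One way to see what the JS tag misses is to go back to the raw server access log, which records every request whether or not a tag ever fired. The log lines and landing-page path below are invented examples in the Apache "combined" format, purely to sketch the idea.

```python
# Minimal sketch: JS-tag tools only see browsers that execute the tag,
# but the server access log records every hit. Log lines and the
# "/landing" path below are invented examples, not real traffic.
import re

LOG_LINE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+" (\d{3})')

def count_landing_hits(log_lines, landing_path="/landing"):
    """Count every successful landing-page request, JS-capable or not."""
    hits = 0
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group(1).startswith(landing_path) and m.group(2) == "200":
            hits += 1
    return hits

sample = [
    '1.2.3.4 - - [01/Jan/2008:10:00:00 -0500] "GET /landing?src=rss HTTP/1.1" 200 5120 "-" "NetNewsWire/3.1"',
    '5.6.7.8 - - [01/Jan/2008:10:00:05 -0500] "GET /landing HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
    '9.9.9.9 - - [01/Jan/2008:10:00:09 -0500] "GET /other HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]
print(count_landing_hits(sample))  # → 2 (the desktop feed reader is counted too)
```

Note that the first hit comes from a standalone feed reader that would never execute a JS tag, yet it shows up in the log count.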

Suffice it to say that with RSS advertising to Techfluentials, tracking non-JS accesses becomes your problem. To put it in marketing terms, here’s why:

  1. Techfluentials use standalone desktop RSS Feed Readers/Aggregators (non-browser applications).
  2. Techfluentials access the Web via mobile devices in a browser environment that is even less likely to execute JS tags (my Treo 755p uses Palm Blazer 4.5, which offers the disable option).
  3. Techfluentials ALSO deliberately/religiously disable JS in their browsers (not to mention deleting cookies).

In other words, your most valuable segment is missing from the numbers. What to do about it?

To Be Continued…

RSS Advertising Part III – Solutions to this Mess

Comparing Visits & Clicks…

In this business, we often get distracted by technologies on the way to the business ends they are supposed to help achieve. The purpose of this post is to outline a basic process that should shed some light on the very thorny issues involved when comparing numbers derived from agency-side ad servers like DART (aka DFA) and client-side site metrics tools like Omniture.

Complicating matters is a situation whereby landing page tracking by the ad server cannot be implemented. What to do?
  1. Determine the Purpose. If the marketing objective is performance-oriented, or post-click engagement is essential, then you are on the right track. However, if your campaign is focused on a branding objective, how important is it really to get numbers to match?
  2. Choose your Measures. Which measures make sense? Stéphane Hamel provides a good backgrounder on Instances vs. Visits. If you are in the paid display advertising/non-search business, then clicks and visits are most likely closest; unique clicks and visits/visitors are even better. If you are working with search, instances may make more sense…view-through offers even more insight on post-click engagement.
  3. Leverage Standards. The Interactive Advertising Bureau (IAB) and Web Analytics Association (WAA) are probably the most relevant industry groups working on measurement standards; terms and definitions vary.
  4. Manage Expectations. The reality is that numbers from different systems are unlikely to match; there are a variety of reasons but to make a long story short, they won’t match without serious integration and that has not yet happened. However, the recent announcement of a multi-faceted strategic alliance between agency holding group WPP & Omniture is very promising.
  5. Baseline. Given #1, the best alternative is to create a simple baseline, e.g. an average over a safe period of time such as a week or a month.
  6. Drop-off/Match. Breathe deep – acknowledge that the causes are most likely latency, clickthrough URL parameter coding, landing page tag placement, application filters and counting methods that may never be in sync.
  7. Test & Debug. Understand that campaign trafficking and set-up processes are rife with glitches…clients, creative shops, media agencies and publishers rarely have the staff or the luxury of training to make all of this work flawlessly all of the time. One could spend significant quality time on process engineering these tasks.
Seems simple enough, right? Easier said than done!
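Steps 5 and 6 above boil down to simple arithmetic once the two data pulls are in hand. Here is a minimal sketch of baselining a week of ad-server clicks against site-metrics visits and expressing the gap as a drop-off rate; the daily numbers are fabricated for illustration.

```python
# Sketch of steps 5-6: baseline ad-server clicks against site-metrics
# visits over a period, then express the gap as a drop-off rate.
# The daily figures below are fabricated, not real campaign data.

def dropoff(ad_server_clicks, site_visits):
    """Fraction of ad-server clicks that never surface as tracked visits."""
    clicks = sum(ad_server_clicks)
    visits = sum(site_visits)
    return (clicks - visits) / clicks

week_clicks = [120, 140, 110, 130, 150, 90, 80]  # e.g. DFA-reported clicks
week_visits = [95, 112, 88, 104, 120, 72, 64]    # e.g. Omniture-reported visits

print(f"drop-off: {dropoff(week_clicks, week_visits):.1%}")
# → drop-off: 20.1%
```

A stable drop-off rate over the baseline period is the realistic goal; a rate that jumps around is the cue to go debug trafficking, tags and URL coding per step 7.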