Tagging HTTP GET requests to audit click-through traffic from a thin Rev net client app

Sivakatirswami katir at hindu.org
Tue Dec 6 04:48:56 EST 2005

Goal: From a Rev remote client app deployed to fooNumber of  
desktops, be able to monitor click-throughs on buttons or links to  
external URLs.

e.g. Let's say, for discussion purposes, that one were to credit  
Runrev, Shafer Media or Altuit in one's credit boxes -- or any  
advertiser for that matter: "AyushHerbals.com, ZarposhImports.com"  
etc. -- and, in your rev thin client interface, you had links out to  
those advertisers' sites.


Now... given that

a) the web server that is serving up "http://www.ZarposhImports.com/"  
is not under one's control...
b) the IP of the GET request is coming from fooUser's ISP server

Is there a way by which the webmaster in charge of the httpd access  
logs for "http://www.ZarposhImports.com/" (or whatever) can determine,  
from the GET request entries in his httpd access logs, that the  
request came from my client app?

My first guess is, yes, possibly, if a) one could configure the HTTP  
headers in such a way as to make a customized user agent that is  
clearly recognized as one's app -- i.e. instead of the user agent  
being the standard "Mozilla" or "Internet Explorer 6.0" in the GET  
lines, the end of each line in the server's httpd access log would  
show "USER AGENT: VeryCoolRevolutionAppName" as the "browser" which  
made the GET request -- and b) there was actually "someone at home" at  
the advertiser's web site. i.e. many will not be savvy enough to even  
understand if I say "check the httpd access logs and you will see our  
client's accesses..."
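To make point (a) concrete, here is an illustrative sketch in Python (not the author's code -- the Rev client would do the equivalent, for instance by setting its httpHeaders before the GET); the agent string is the hypothetical name from the post:

```python
# Tag an HTTP GET with a custom User-Agent so the remote server's
# httpd access log records the app's name instead of a browser string.
import urllib.request

# Hypothetical agent string; any clearly recognizable name would do.
APP_USER_AGENT = "VeryCoolRevolutionAppName/1.0"

def build_tagged_request(url):
    """Build a GET request whose User-Agent identifies our client app."""
    return urllib.request.Request(url, headers={"User-Agent": APP_USER_AGENT})

# urllib.request.urlopen(build_tagged_request(...)) would perform the GET;
# the advertiser's access log would then show our agent string in the
# User-Agent field of the log line.
```

The server needs no changes at all for this to show up: Apache's default combined log format already records whatever User-Agent string the client sends.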

But, assuming the advertiser does in fact have a webmaster savvy  
enough to check these stats, and that they really do want to know  
(after all, they may be paying us big bucks...), my "worry" would  
be: does Apache do any kind of filtering in this context? i.e. will  
it "balk" at a non-standard user agent making the GET request? I  
doubt it, as I see all kinds of non-standard user agents accessing  
pages on our own sites... God only knows what they are... they don't  
look like standard browsers to me... Of course, I never mess with  
httpd.conf, and if there were some reason to block any of these I  
would not know how to do it anyway, but other webmasters may be more  
security conscious? I just don't know... I can't imagine a security  
issue that would require such a filter.
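For what it's worth, stock Apache does no user-agent filtering at all -- it serves the request and logs whatever agent string arrives. A webmaster who *did* want to block a particular agent would have to opt in explicitly with something like this (an Apache 2.x sketch, using the hypothetical agent name from above):

```apache
# Flag any request whose User-Agent contains the app's name...
SetEnvIf User-Agent "VeryCoolRevolutionAppName" blocked_ua
# ...and refuse those flagged requests for the whole site.
<Location />
    Order Allow,Deny
    Allow from all
    Deny from env=blocked_ua
</Location>
```

So unless someone has gone out of their way to add a rule like this, a custom agent string should sail straight through into the access log.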

Now, obviously there is another strategy, a simple one too: log  
these click-throughs on *our* server.
...Easy -- instead of linking direct to the external URL, we set up a  
Rev redirect CGI and pass the URL through our own server first.


I already handle 404s with a rev CGI that does just this kind of  
redirect... (if anyone would like to see my rev 404 redirect  
architecture I would be happy to share it... it's very simple, very  
efficient and easily re-configurable for any web site's needs). Better  
yet, is there somewhere to post this kind of thing for the community?

My redirect CGI reads the incoming query string, logs it and  
redirects the user's browser to the external site. This is trivial to  
set up: 20 minutes of work at most and it's done...
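A minimal sketch of such a redirect CGI, shown here in Python for illustration (the author's version is written in Rev; the log path is hypothetical): it appends the target URL and the requester's IP to a local log, then 302-redirects the browser on to the external site.

```python
# Click-through redirect CGI: log the hit locally, then bounce the
# visitor to the external URL carried in the query string.
import time

def handle_redirect(query_string, environ, log_path="clickthru.log"):
    """Log the click-through, then return the CGI redirect response."""
    target = query_string  # e.g. "http://www.ZarposhImports.com/"
    with open(log_path, "a") as log:
        log.write("%s\t%s\t%s\n" % (time.strftime("%Y-%m-%d %H:%M:%S"),
                                    environ.get("REMOTE_ADDR", "-"),
                                    target))
    # CGI output: Status plus Location header, blank line, no body needed.
    return "Status: 302 Found\r\nLocation: %s\r\n\r\n" % target
```

In a real CGI the query string and remote address would come from the environment (QUERY_STRING and REMOTE_ADDR); logging the IP as well keeps a little extra evidence per click.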

But then, it's just my word that "My logs show 10,000 users clicked  
on the link to your site last month from our client...." So, while  
my logs would be, *I* know, very accurate, there is no ironclad way  
to provide evidentiary proof that our numbers were not inflated... so  
the former solution is much better. Hmm... I guess one could do  
both and have the best of both worlds -- i.e. the special HTTP  
headers would be generated by my CGI and not the thin client, which  
just talks to our CGI... then I would have a log and so would the  
webmaster at fooAdvertiser.com
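One way that combined scheme could work -- an illustrative Python sketch, under the assumption that the thin client talks only to our CGI, which then makes the outbound GET itself with the tagged agent (all names here are hypothetical):

```python
# "Best of both worlds" flow: our server-side script appends to our own
# click-through log AND performs the outbound GET with the custom
# User-Agent, so the advertiser's httpd log records the same hit
# under our agent string.
import time
import urllib.request

APP_USER_AGENT = "VeryCoolRevolutionAppName/1.0"  # hypothetical agent name

def log_and_fetch(target_url, log_path="clickthru.log",
                  opener=urllib.request.urlopen):
    """Append to our click-through log, then fetch the target with the
    tagged agent; `opener` is injectable so the fetch can be stubbed."""
    with open(log_path, "a") as log:
        log.write("%s\t%s\n" % (time.strftime("%Y-%m-%d %H:%M:%S"),
                                target_url))
    req = urllib.request.Request(target_url,
                                 headers={"User-Agent": APP_USER_AGENT})
    return opener(req)  # both our log and the advertiser's now show the click
```

Since both logs are stamped independently, each side's count can be checked against the other's.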

Insights, anyone?

