
Backlinks (http_referrers)

A.P. Lawrence
Expert Author
Published: 2003-03-26


When a web page is accessed by a link from some other page, the address of that other page (the "referring page") is made available to the web server. We can pick that information up from logs or as the page is being displayed. For example, if we have Server Side Includes or PHP, we can pick up the referring page from an environment variable. Here's a snippet of Perl code that does that:
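A minimal sketch of that snippet, assuming the script runs under CGI or SSI, where the server sets the variable before the script runs:

```perl
#!/usr/bin/perl
# Under CGI or Server Side Includes, the web server puts the address
# of the referring page in the HTTP_REFERER environment variable.
my $frompage = $ENV{'HTTP_REFERER'} || "";
print "Referred from: $frompage\n" if $frompage;
```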

Yes, there's a missing R in HTTP_REFERER. Yes, that's wrong, but that's what the variable is called, so that's what you use.

That's all very interesting, and people have used this information to get an idea of their web popularity or to do cutesy things like "You came here from...". But there's a more important use for backlinks: they may be a source of more information.

Suppose that I write an article about using DVDRAM for backup. It will quickly be indexed by the major search engines, and people using those engines may get directed to my article. Other folks who have written about DVDRAM or Unix backup may notice it, and may link to it from their articles.

Now consider someone searching for information who gets directed to my page by Google. If all I have is my page, and it isn't quite what they are looking for, or if they need more information, it's back to Google for more. However, suppose I had previously harvested the HTTP_REFERER information for those other pages that linked to me and showed those links to the visitor? These would be excellent links to follow up on: if the referring page linked to my DVDRAM for Unix backup page, it may very well be strongly related to what they are looking for.

To accomplish that, I need a database that collects backlinks. That's pretty simple to do. Every one of my pages starts something like this:
<!--#include virtual="/cgi-bin/nlastmod.pl?/Reviews/dvdram.html" -->

The magic part is the "include virtual" that triggers a Server Side Include script called "nlastmod.pl".
That script does many things, including setting up a consistent style sheet and harvesting HTTP_REFERER information.

See Automagic Website for more details on Server Side Include scripting.

The harvesting uses a Perl DBM database to store the referrals. Here's the relevant code:

$thispage =~ s/index.html//;          # normalize index pages
foreach ($frompage) {
   next if ($thispage eq $frompage);  # refresh
   next if /=/;                       # mostly search engine pages
   next if /localhost/;               # somebody's personal links
   next if /aplawrence.com/;          # none of my own stuff
   next if not /http:/;
   dbmopen %backlink, "/usr/home/aplawren/www/data/backlinks/index", 440;
   # key is "counter|referring url", value is this page (assumed store
   # line, matching the split on "|" in the display code below)
   $backlink{++$count . "|$frompage"} = $thispage;
   dbmclose %backlink;
}

I deliberately don't store certain pages. I'm not interested (not here, anyway) in backlinks to articles within my own site.
That would be valuable too, but I think it needs to be kept separate from these, so those links aren't harvested here.

I also ignore any link that has "=" in it. That rejects some pages that are legitimate, but most such links are search
engine result pages or other transient links that I don't want to harvest. A typical search engine transient looks
something like this:

http://www.google.com/search?q=dvdram+unix+backup

With more thought and effort you could probably cut out the transients without affecting legitimate uses of "=", but I didn't bother.
I lose a few referrers that way, but I don't clutter things up with thousands of transients.
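One way to do that smarter filtering, for example, would be to match the hostnames of known search engines instead of rejecting every URL containing "=". A sketch, with an illustrative (not exhaustive) hostname list:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Return true if a referring URL looks like a transient search-result
# page, by checking the hostname against known engines. Legitimate
# URLs that merely contain "=" pass through. The engine list here is
# illustrative only.
sub is_transient {
    my ($url) = @_;
    return 1 if $url =~ m{^https?://[^/]*\b(google|yahoo|altavista|lycos)\.}i;
    return 0;
}

print is_transient("http://www.google.com/search?q=dvdram") ? "skip" : "keep", "\n";
print is_transient("http://example.com/article.php?id=7")   ? "skip" : "keep", "\n";
```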

Once the harvesting code has picked up the links, it's easy to display them. Here's code that displays all of them; I use this on my Referrals page.

$forced = shift @ARGV;
$thispage =~ s/index.html//;
print "Content-type: text/plain\n\n";
dbmopen %backlink, "/usr/home/aplawren/www/data/backlinks/index", undef;
while (($one, $two) = each %backlink) {
  ($a, $link) = split /\|/, $one;
  next if $link =~ /=/;
  next if $link =~ /localhost/;
  $backlist[$x++] = "<li><a href=\"$link\">$link</a> ($two)\n";
}
dbmclose %backlink;
if ($backlist[0]) {
  print "<p>Referring pages (off site)<ul> ";
  foreach (sort @backlist) {
    print;
  }
  print "</ul>\n";
  print "<br>$x";
}

On specific pages, the following code lists only those backlinks that are specific to that page:
print "Content-type: text/plain\n\n";
dbmopen %backlink, "/usr/home/aplawren/www/data/backlinks/index", undef;
while (($one, $two) = each %backlink) {
  next if $two ne $thispage;
  ($a, $link) = split /\|/, $one;
  next if $link =~ /=/;
  next if $link =~ /localhost/;
  $backlist .= "<li><a href=\"$link\">$link</a>\n";
}
dbmclose %backlink;
if ($backlist) {
  print "<p>Referring pages (off site)<ul> $backlist</ul>\n";
  print "<p>The links above may be useful in finding related material; however,
            they are generated automatically from http headers and therefore can be inaccurate";
  print "<p>These are referrals to this specific page.
            See <a href=\"/Links/referrals.html\">Referrals</a>
            for a more complete listing of referrer pages";
}

Note that, unlike the Referrals page, I don't bother to sort these.

You can see this code in action at the bottom of every article here. If there are no backlinks, nothing appears, but if someone has referenced the page (and someone has followed that link) it will automatically appear.
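The per-page display is wired in the same way as the header include shown earlier, via an SSI directive at the bottom of the page. Something like this (the script name here is hypothetical, not from the original):

```html
<!--#include virtual="/cgi-bin/pagelinks.pl?/Reviews/dvdram.html" -->
```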


About the Author:
A.P. Lawrence provides SCO Unix and Linux consulting services http://www.pcunix.com
