[nycphp-talk] Looking for ideas on how to allow spiders to crawl authenticated pages
Steven Samuel
steven at sohh.com
Mon Feb 24 22:19:06 EST 2003
Here's what I did to improve my site's search rankings.
I made an invisible link on my opening page (top left, right below the WIRE
image), and that link points to:
http://www.sohh.com/meta_crawl.html
That file contains all the links on my site that I want indexed by spiders.
And when I submit to search engines, I submit:
http://www.sohh.com/meta_crawl.html
I'm usually in the top 5 when people search for Hip-Hop.
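For reference, a crawl page like that is just a plain HTML file full of ordinary links for spiders to follow. The URLs below are made up for illustration; the actual contents of meta_crawl.html aren't shown in this thread:

```html
<!-- Hypothetical sketch of a crawl page: a bare list of links
     you want spiders to follow and index. URLs are invented. -->
<html>
<head><title>SOHH.com Site Links</title></head>
<body>
<a href="http://www.sohh.com/">Home</a>
<a href="http://www.sohh.com/news.html">News</a>
<a href="http://www.sohh.com/features.html">Features</a>
</body>
</html>
```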
Steven Samuel
SOHH.com
-----Original Message-----
From: DeWitt, Michael [mailto:mjdewitt at alexcommgrp.com]
Sent: Monday, February 24, 2003 9:47 PM
To: NYPHP Talk
Subject: RE: [nycphp-talk] Looking for ideas on how to allow spiders to
crawl authenticated pages
Chris,
This was along the lines of what I was thinking, possibly also using the
remote address in conjunction with the User-Agent to further limit access. I
don't know if that's feasible, though, since from what I've heard of Google,
their bots can come from anywhere.
Mike
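Mike's remote-address idea can be made workable without maintaining an IP list, using forward-confirmed reverse DNS: look up the PTR record for the connecting IP, check that the hostname falls under a Google-owned domain, and then resolve that hostname back to confirm it matches the original IP. This is a sketch, not anything from the original thread; the googlebot.com/google.com domain assumption is mine:

```php
<?php
// Sketch of "forward-confirmed reverse DNS" bot verification.
// Assumption: legitimate Googlebot IPs reverse-resolve to a hostname
// under googlebot.com or google.com that resolves back to the same IP.

function hostnameLooksLikeGooglebot(string $host): bool
{
    // Accept only hostnames ending in .googlebot.com or .google.com.
    return (bool) preg_match('/\.(googlebot|google)\.com$/i', $host);
}

function isVerifiedGooglebot(string $ip): bool
{
    $host = gethostbyaddr($ip);           // reverse (PTR) lookup
    if ($host === false || $host === $ip) {
        return false;                     // no PTR record at all
    }
    if (!hostnameLooksLikeGooglebot($host)) {
        return false;                     // wrong domain: likely a fake
    }
    // Forward-confirm: the claimed hostname must resolve back to the IP,
    // otherwise anyone controlling their own reverse DNS could spoof it.
    return gethostbyname($host) === $ip;
}
```

The forward confirmation is the important step: reverse DNS alone is spoofable by anyone who controls the PTR records for their own address block.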
> Sure, you can check the User-Agent header to see if it matches a known
> spider, but your authentication is effectively reduced to someone sending
> this header, and if you can find User-Agent strings for known spiders, so
> can an attacker.
>
> Chris
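To make the weakness Chris describes concrete, a User-Agent check amounts to a substring match on a header the client chooses. This is a sketch (the crawler tokens are my own picks, not from the thread), shown only to illustrate why it is not authentication:

```php
<?php
// Naive spider detection by User-Agent header (a sketch).
// Any HTTP client can send this exact header, so treating a match
// as authentication lets anyone who forges the header in.

function looksLikeKnownSpider(string $userAgent): bool
{
    // Case-insensitive substring match against a few crawler tokens.
    $spiderTokens = ['Googlebot', 'Slurp', 'ia_archiver'];
    foreach ($spiderTokens as $token) {
        if (stripos($userAgent, $token) !== false) {
            return true;
        }
    }
    return false;
}
```

A request with `User-Agent: Googlebot/2.1` passes this check whether it comes from Google or from `curl` on an attacker's machine, which is exactly the reduction Chris points out.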