bots.txt (was Re: [sldev] CAPTCHA to validate land sales.)
Joshua Bell
josh at lindenlab.com
Tue Oct 9 16:43:00 PDT 2007
Felix Wakmann posted some brainstorms about a "robots.txt" for SL here:
http://www.your2ndplace.com/node/417 in the context of allowing parcel
owners to communicate with the SLBrowser spider bots to limit what is
searched, just as robots.txt does for the web.
That link was pointed out to me during a panel discussion on in-world
search, and it might be a good place to start. We were discussing it in
the context of search (i.e. classic robots.txt), where the idea is to
limit content indexing; as such it applies not only to bots but also to
spidering tools that may work through (or as) APIs (e.g. the ones we're
working on as part of the search project to expose content as HTML).
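For anyone unfamiliar with how classic robots.txt rules are evaluated, Python's standard library ships a parser that shows the semantics a bots.txt would presumably mirror (the agent name "SLBrowser" and the paths below are just illustrative):

```python
import urllib.robotparser

# Parse a tiny robots.txt ruleset from in-memory lines rather than a URL.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A well-behaved crawler asks before fetching each path.
print(rp.can_fetch("SLBrowser", "/private/page"))  # blocked by the rule
print(rp.can_fetch("SLBrowser", "/public/page"))   # allowed by default
```

As on the web, enforcement is purely voluntary: the parser only tells a crawler what the owner asked for, which is exactly the situation described for bots below.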
dirk husemann wrote:
> how about introducing a "bots.txt" file? currently we have no way of
> telling a bot whether it is allowed on a property (or what it is allowed
> to do). this is not going to deter "ill-behaving" bots from just
> ignoring whatever "bots.txt" tells them to do (or not do) --- but at
> least a parcel owner can clearly express her wishes & rules.
>
> this could go into the parcel properties stuff and consist of a couple
> of rules:
>
> * bots (not) allowed on this parcel
> o bots (not) allowed above XXXm
> o bots (not) allowed below XXXm
> * bots (not) allowed to collect information
> o items for sale
> o texture information
> o prim information
> o avatars present
>
> well-behaving bots (e.g., bots owned by well-known companies) would
> honor those settings.
>
> cheers,
> dirk aka dr scofield
> --
> dr dirk husemann, pervasive computing, ibm zurich research lab
> --- hud at zurich.ibm.com --- +41 44 724 8573 --- SL: dr scofield
>
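To make the proposal concrete, dirk's rule list could be serialized in a
robots.txt-like key/value text format per parcel. The directive names
below (allow-bots, min-altitude, max-altitude, collect) are invented for
illustration; no such format exists today:

```python
# Hypothetical parser for a parcel-level "bots.txt" encoding the rules
# sketched above. All directive names are assumptions, not an SL feature.
from dataclasses import dataclass, field
from typing import Optional, Set

@dataclass
class ParcelBotPolicy:
    bots_allowed: bool = True
    min_altitude: Optional[float] = None   # bots allowed only above this height (m)
    max_altitude: Optional[float] = None   # bots allowed only below this height (m)
    collectible: Set[str] = field(default_factory=set)  # info bots may gather

def parse_bots_txt(text: str) -> ParcelBotPolicy:
    """Parse 'key: value' lines, ignoring blanks and '#' comments."""
    policy = ParcelBotPolicy()
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()
        if not line:
            continue
        key, _, value = line.partition(":")
        key, value = key.strip().lower(), value.strip()
        if key == "allow-bots":
            policy.bots_allowed = value.lower() == "yes"
        elif key == "min-altitude":
            policy.min_altitude = float(value)
        elif key == "max-altitude":
            policy.max_altitude = float(value)
        elif key == "collect":
            policy.collectible.update(v.strip() for v in value.split(","))
    return policy

policy = parse_bots_txt(
    "allow-bots: yes\n"
    "max-altitude: 300   # no spidering above the skybox line\n"
    "collect: items-for-sale, avatars-present\n"
)
print(policy.bots_allowed, policy.max_altitude, sorted(policy.collectible))
```

As with robots.txt, this would only express the owner's wishes; honoring
them remains up to the bot operator.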