
Re: digest 201



It was said recently...
>I would be interested in hearing about experiences or thoughts subscribers
>may have on the legality or ethics involved in sending a web crawler to
>visit and index the sites of publishers for whom we have site licences.
>
>The resulting database would be made available on our Intranet only; 
>i.e. only for the use of those for whom we have licenced access to the
>publishers. 

What's the problem? You are creating a resource that the publisher you are
doing business with should be providing anyway. Should your moral fibres
twitch, then settle, petal: there is no need to shake.

Your 'intent' is to provide a higher level of service and to get better use of
the monies expended with the publisher. From a management perspective it is a
good thing; from a publisher's perspective, what are they going to do?
Somehow ban all web bots? Rework their robots.txt file to exclude you?
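
If they did, a well-behaved crawler would notice before fetching anything.
As a minimal sketch, assuming a hypothetical publisher at
publisher.example.com and a crawler identifying itself as "IntranetIndexBot"
(both names invented for illustration), Python's standard urllib.robotparser
can do the check:

    from urllib import robotparser

    # Hypothetical publisher site and user-agent, for illustration only.
    ROBOTS_URL = "https://publisher.example.com/robots.txt"
    USER_AGENT = "IntranetIndexBot"

    rp = robotparser.RobotFileParser()
    rp.set_url(ROBOTS_URL)
    rp.read()  # fetch and parse the publisher's robots.txt

    page = "https://publisher.example.com/journals/contents.html"
    if rp.can_fetch(USER_AGENT, page):
        print("Allowed to crawl:", page)
    else:
        print("Excluded by robots.txt:", page)

Respecting that file, and identifying your bot honestly, goes a long way
towards keeping the exercise above board.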

If you still cannot sleep easy, then look at the exercise with an editorial
hat on. Compile the information and make it available, but also include a
'review' of the results in, say, a two-hundred-word essay on 'Why this is
interesting'.

The copyright fair-use rules allow much of this to happen without any
niceties being exchanged. Otherwise, how would theatre-goers, book readers,
film buffs and so on know how good (or otherwise) a creative work is, if we
did not have reviewers utilising the fair-use doctrine?

The information resource you create, if properly structured and managed,
could be licensed back to the publisher for their own benefit and use. You
might even strike a deal for everyone to have access to it, not just your
intranet.

I am going back to lurking now....

Bede